To reduce the number of requests per second, so as not to overload the server and avoid triggering any protection, select Configuration > Speed from the menu.

In the new window, simply decrease the maximum number of URLs per second, as shown in this screenshot:

It is also recommended to take a look at the robots.txt file to see whether it contains a crawl delay.

An often underestimated feature, the option to change the User Agent is very useful and relevant when you spot errors (type 429 in particular). Briefly, and to avoid any misunderstanding, the User Agent is simply the identifier of the crawler being used. With this in mind, it may be wise to act accordingly.

Exploration will take a little longer, but you will now be able to relaunch it and enjoy an error-free export. As a colleague often repeats, you will now get a complete, high-quality export.
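The speed setting described above amounts to capping requests per second. As a rough illustration of what such a cap does under the hood, here is a minimal rate-limiter sketch (the class name and the 2-requests-per-second value are illustrative, not taken from any particular tool):

```python
import time

class RateLimiter:
    """Caps outgoing requests at max_per_second, mirroring a
    'max URLs per second' setting in a crawler's speed configuration."""
    def __init__(self, max_per_second: float):
        self.min_interval = 1.0 / max_per_second
        self.last_request = 0.0

    def wait(self):
        # Sleep just long enough to respect the configured rate,
        # then record when this request was allowed through.
        elapsed = time.monotonic() - self.last_request
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_request = time.monotonic()

limiter = RateLimiter(max_per_second=2.0)
for url in ["page1", "page2", "page3"]:
    limiter.wait()
    # the actual HTTP fetch of `url` would go here
```

With `max_per_second=2.0`, consecutive fetches are spaced at least half a second apart, which is exactly the kind of politeness that keeps server-side protections from firing.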
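Checking robots.txt for a crawl delay can also be scripted. Python's standard-library `urllib.robotparser` exposes any `Crawl-delay` directive directly; the robots.txt content below is a made-up example (in practice you would fetch it from the site's `/robots.txt`):

```python
import urllib.robotparser

# Hypothetical robots.txt content for illustration.
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 5
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# crawl_delay() returns the delay in seconds for the given
# User Agent, or None if the file specifies none.
delay = parser.crawl_delay("MyCrawler")
print(delay)  # → 5
```

If a delay is declared, it is the value your crawl speed setting should respect.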
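A 429 response means "Too Many Requests", so besides changing the User Agent, backing off and retrying usually resolves it. The helper below is a generic sketch, not any tool's actual behaviour: `fetch` stands in for any callable that performs the request (a real one would also send a custom `User-Agent` header, e.g. `{"User-Agent": "MyBot/1.0"}` — a hypothetical value):

```python
import time

def fetch_with_retry(fetch, retries=3, backoff=1.0):
    """Retry `fetch` while it reports HTTP 429,
    doubling the wait between attempts."""
    for attempt in range(retries):
        status, body = fetch()
        if status != 429:
            return status, body
        # Exponential backoff: backoff, 2*backoff, 4*backoff, ...
        time.sleep(backoff * (2 ** attempt))
    return status, body

# Demo with a stand-in fetch: answers 429 twice, then succeeds.
attempts = {"n": 0}
def fake_fetch():
    attempts["n"] += 1
    return (429, "") if attempts["n"] < 3 else (200, "ok")

print(fetch_with_retry(fake_fetch, backoff=0.01))  # → (200, 'ok')
```

The exponential backoff is the design choice worth keeping: repeating requests at the same rate that triggered the 429 only prolongs the block.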