How to control the robot request rate to my site

If your server is overloaded and cannot keep up with the robot's download requests, use the Crawl-delay directive. It specifies the minimum amount of time (in seconds) between the search robot finishing the download of one page and starting the next. For compatibility with robots that don't fully comply with the robots.txt processing standard, add the Crawl-delay directive to the group that begins with the User-agent entry, immediately after the Disallow (Allow) directives.

The Yandex search robot supports fractional values for Crawl-delay, for example, 0.5. This doesn't mean that the robot will visit your site every half second, but it gives the robot more flexibility and can speed up the indexing of your site.

Examples:

User-agent: Yandex
Crawl-delay: 2 # specifies a 2-second timeout

User-agent: *
Disallow: /search
Crawl-delay: 4.5 # specifies a 4.5-second timeout
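
For illustration, below is a minimal sketch (in Python; not Yandex's actual parser) of how a crawler could read Crawl-delay from groups like the ones above and pace its downloads accordingly. The crawl_delay helper, the embedded sample content, and the fetched paths are hypothetical.

import time

ROBOTS_TXT = """\
User-agent: Yandex
Crawl-delay: 2

User-agent: *
Disallow: /search
Crawl-delay: 4.5
"""

def crawl_delay(robots_txt: str, agent: str) -> float:
    """Return the Crawl-delay for `agent`, falling back to the '*' group."""
    delays = {}             # user-agent -> delay in seconds
    group_agents = []       # agents named at the top of the current group
    seen_directive = False  # True once the group's rules have started
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop trailing comments
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if seen_directive:  # a new group begins after the previous rules
                group_agents, seen_directive = [], False
            group_agents.append(value)
        else:
            seen_directive = True
            if field == "crawl-delay":
                for a in group_agents:
                    delays[a] = float(value)  # fractional values are allowed
    return delays.get(agent, delays.get("*", 0.0))

# Wait at least the specified delay between one download and the next.
delay = crawl_delay(ROBOTS_TXT, "Yandex")  # -> 2.0
for path in ("/page1", "/page2"):
    print("fetching", path)
    time.sleep(delay)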