The site crawl rate

Yandex robots continuously index sites: they crawl them and download pages to the search database.

The site crawl rate is the number of requests per second that the robot sends to your site. This setting replaces the Crawl-delay directive that was previously used in the robots.txt file.
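For reference, the directive this setting replaces looked like this in robots.txt (the value is the minimum delay in seconds between requests; the number here is illustrative):

```
User-agent: Yandex
Crawl-delay: 2
```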

Note. You must configure the crawl rate for the site's primary domain and for each subdomain separately.
  1. Default setting
  2. Changing the crawl rate

Default setting

The optimal site crawl rate is calculated algorithmically so that the robot can load the maximum number of pages without overloading the server. That is why the Trust Yandex option in the Indexing → Crawl rate section of Yandex.Webmaster is enabled by default.

Changing the crawl rate

Note. Changing the crawl rate in the Yandex.Webmaster interface doesn't affect the number of robot requests for downloading the RSS feed used to form and update Turbo pages.

You may need to reduce the crawl rate if you notice a large number of robot requests to the server hosting your site. Such requests can increase the server response time and, as a result, slow down the loading of site pages. You can check these indicators in the Yandex.Metrica report.

Before changing the crawl rate for your site, find out which pages the robot requests most often.

  • Analyze the server logs. Contact the person responsible for the site or the hosting provider.
  • View the list of URLs on the Indexing → Crawl statistics page in Yandex.Webmaster (set the option to All pages).
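If you have shell access to the server logs, a quick way to see which URLs the Yandex robot requests most often is to filter the access log by its user agent and count requests per URL. This is a minimal sketch: the log format is assumed to be the common Nginx/Apache combined format, and the sample log below is fabricated for illustration (in practice you would point the pipeline at your real access log, e.g. /var/log/nginx/access.log).

```shell
# Fabricated sample of a combined-format access log, for illustration only
cat > sample_access.log <<'EOF'
203.0.113.5 - - [01/Jan/2024:00:00:01 +0000] "GET /search HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"
203.0.113.5 - - [01/Jan/2024:00:00:02 +0000] "GET /search HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"
203.0.113.5 - - [01/Jan/2024:00:00:03 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"
198.51.100.7 - - [01/Jan/2024:00:00:04 +0000] "GET /page1 HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Keep only Yandex robot requests, extract the request path (field 7 in the
# combined format), and count hits per URL; most-requested URLs appear first
grep "YandexBot" sample_access.log | awk '{print $7}' | sort | uniq -c | sort -rn
```

On the sample data, /search is requested twice by the robot and /page1 once; the non-Yandex request is excluded by the grep filter.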

If you find that the robot accesses service pages, prohibit their indexing in the robots.txt file using the Disallow directive. This will help reduce the number of unnecessary robot requests.
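For example, if the logs show that the robot frequently requests service pages such as site search results or a shopping cart (the paths below are illustrative, not taken from any specific site), the robots.txt rules could look like this:

```
User-agent: *
Disallow: /search/
Disallow: /cart/
```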

To check if the rules are correct, use the Robots.txt analysis tool.

To change the crawl rate:
  1. In Yandex.Webmaster, go to the Indexing → Crawl rate page.
  2. Turn on the Set manually option.
  3. Move the slider to the desired position. By default, it is set to the optimal crawl rate calculated for your site.
  4. Save the changes.