Critical issues
This section contains solutions to common problems from the “Critical” category identified during site diagnostics in Yandex Webmaster. Issues in this category may lead to the exclusion of individual pages or the entire site from search results.
Tip
Track and fix errors as soon as possible. You can configure notifications for website monitoring results.
Slow server response
This message means that the average page response time during a crawl by the search bot exceeded 3 seconds. This may be caused by a particularly slow server response for certain pages on the site; examples of such pages are listed in the message itself.
A long server response time can delay site indexing. If the server now responds quickly, you can ignore this message: it will disappear after the search bot crawls the site again.
To learn how to speed up page reindexing by the bot, see How can I add a site to search?. You can check the server response time using the Server response check tool.
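If you want to spot-check response times yourself before the next crawl, here is a minimal sketch using Python's standard library; the URLs are placeholders for the pages listed in the message.

```python
import time
import urllib.request

# Placeholder list; replace with the URLs from the Yandex Webmaster message.
urls = [
    "https://example.com/",
    "https://example.com/catalog",
]

for url in urls:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()  # read the body so transfer time is included
    elapsed = time.perf_counter() - start
    # The message is triggered when the average response time exceeds 3 seconds.
    flag = "SLOW" if elapsed > 3.0 else "ok"
    print(f"{flag}\t{elapsed:.2f}s\t{url}")
```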
Invalid SSL certificate settings
This message is displayed in the following cases:
- The certificate has expired.
- The certificate was issued for a different domain or not for all subdomains where it is used. For example, the certificate was issued for the domain example.com but is used for the domain www.example.com.
- The certificate from the certification authority is missing in users' browsers, or it has been revoked.
- A self-signed certificate is used.
If there are problems with the SSL certificate, the browser notifies the user about them, and the user may refuse to visit an insecure site.
To resolve the error, check the SSL certificate and the server settings. To do this, contact your hosting provider.
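Before contacting the provider, you can reproduce the browser's verdict yourself. Here is a rough sketch using Python's standard ssl module (the host name is a placeholder): an expired, mismatched, or self-signed certificate makes the handshake below fail.

```python
import socket
import ssl
from datetime import datetime, timezone

host = "www.example.com"  # placeholder; use the domain from the message

context = ssl.create_default_context()
# With server_hostname set, the handshake verifies the certificate chain and
# checks that the certificate matches the domain. An expired, mismatched, or
# self-signed certificate raises ssl.SSLError (or its subclass
# ssl.SSLCertVerificationError) here.
try:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as ssock:
            cert = ssock.getpeercert()
            not_after = datetime.strptime(
                cert["notAfter"], "%b %d %H:%M:%S %Y %Z"
            ).replace(tzinfo=timezone.utc)
            print("certificate OK, expires:", not_after.isoformat())
except ssl.SSLError as error:
    print("certificate problem:", error)
```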
The Yandex robot will discover any changes the next time it crawls the site. If it doesn't detect the problem, you'll stop seeing the message about it in Yandex Webmaster.
Duplicate pages with GET parameters were found
Duplicate pages are pages with the same content that are available at different URLs. URLs that differ only in GET parameters are also considered duplicates, because the Yandex robot treats them as different pages. Such pages are merged into a group of duplicates.
If your website has duplicate pages:
- The page you need may disappear from search results if the robot has selected another page from the group of duplicates. In some cases, pages are not grouped at all and participate in the search as separate documents that compete with each other, which may affect the website's ranking in search results.
- Depending on which page remains in the search, the address of the document may change. This can cause difficulties when viewing statistics in web analytics services.
- The indexing robot takes longer to crawl the website's pages, so data about the pages that are important to you reaches the search database more slowly. The robot can also create an additional load on your website.
Note
If you recently added Clean-param directives, prohibited crawling duplicates using robots.txt, or placed the rel="canonical" attribute, it may take a few days for these changes to be reflected in Site diagnostics in Yandex Webmaster and for the notification to stop displaying.
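For reference, the rel="canonical" attribute mentioned above is a link element in the head of each duplicate page that points to the address you want to keep in the search; the URLs below are placeholders.

```html
<!-- Placed on https://example.com/page?utm_source=link and other
     parameterized copies of the page: -->
<link rel="canonical" href="https://example.com/page"/>
```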
To tell Yandex which page should be included in the search and to get rid of duplicates, add the Clean-param directive to the robots.txt file so that the robot does not take the URL parameters into account.
Tip
The Clean-Param directive is intersectional, so it can be specified anywhere in the robots.txt file. If you define other directives specifically for the Yandex robot, list all rules intended for it in a single section; in that case, the User-agent: * string will be ignored.
Example of the Clean-param directive
```
# for addresses like: https://example.com/page?utm_source=link&utm_medium=cpc&utm_campaign=new
# robots.txt will contain:
User-agent: Yandex
Clean-param: utm_source&utm_medium&utm_campaign /page
# thus we tell the bot that it should keep the https://example.com/page address in the search

# to apply the directive to parameters on pages at any address, do not specify the address:
User-agent: Yandex
Clean-param: utm_source&utm_medium&utm_campaign
```
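To illustrate how the directive groups addresses, here is a small Python sketch (not part of Yandex Webmaster) that normalizes URLs the same way Clean-param tells the robot to, using the parameter names from the example above.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters listed in the Clean-param directive above (assumption: these
# are the only tracking parameters the site uses).
CLEAN_PARAMS = {"utm_source", "utm_medium", "utm_campaign"}

def normalize(url: str) -> str:
    """Drop ignored GET parameters, mirroring the Clean-param directive."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in CLEAN_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(normalize("https://example.com/page?utm_source=link&utm_medium=cpc&utm_campaign=new"))
print(normalize("https://example.com/page"))
```

Both calls print https://example.com/page: the parameterized address and the clean address fall into one group of duplicates, and the clean address is the one kept in the search.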