- Slow server response time
- Invalid SSL certificate settings
- Duplicate pages with GET parameters were found
Slow server response time
This message means that the average response time across all site pages exceeded three seconds when the search robot accessed them. A particularly slow response for certain pages on the site can also trigger it. If the server is now responding quickly, the message will disappear from Yandex.Webmaster within a few days.
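To see whether the server is currently responding slowly, you can time a request yourself. One common way is curl's write-out timing variables; the URL below is a placeholder:

```shell
# Print timing for a single request: DNS lookup, time to first byte, and total.
# Replace the placeholder URL with a page from your own site.
curl -o /dev/null -s \
  -w 'namelookup: %{time_namelookup}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n' \
  'https://example.com/'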
Invalid SSL certificate settings
This message is displayed in the following cases:
- Certificate expired.
- The certificate was issued for a different domain or not for all subdomains where it is used. For example, the certificate was issued for the domain example.com but is used for the domain www.example.com.
- The certificate authority's root certificate is missing from users' browsers, or the certificate has been revoked.
- A self-signed certificate is used.
If there are problems with the SSL certificate, the browser notifies the user about them. Users might avoid the site because it is not secure.
To fix this problem, check the SSL certificate and the server settings. You may need to contact your hosting provider.
The Yandex robot will discover any changes the next time it crawls the site. If it doesn't detect a problem, the message will stop appearing in Yandex.Webmaster.
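One way to check the certificate yourself is with the openssl command-line tool. The snippet below generates a throwaway self-signed certificate purely so there is a file to inspect; for a live site you would pipe `openssl s_client` into the same `openssl x509` inspection, as shown in the comment (example.com and the /tmp paths are placeholders):

```shell
# To inspect the certificate a live server presents, replace example.com
# with your domain and run:
#   openssl s_client -connect example.com:443 -servername example.com </dev/null 2>/dev/null \
#     | openssl x509 -noout -subject -issuer -dates
#
# The same x509 inspection works on any certificate file. Generate a
# throwaway self-signed certificate to demonstrate:
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=example.com" \
  -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem 2>/dev/null

# "subject" shows which domain the certificate covers; notBefore/notAfter
# show whether it has expired; "issuer" equal to "subject" indicates a
# self-signed certificate.
openssl x509 -in /tmp/demo-cert.pem -noout -subject -issuer -dates
```

A mismatch between the subject (or subjectAltName) and the domain, an expired notAfter date, or a self-signed issuer each corresponds to one of the cases listed above.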
Duplicate pages with GET parameters were found
Duplicate pages are pages that have the same content but are located at different URLs. URLs that differ only in their GET parameters can also be duplicates, since the Yandex robot treats each such URL as a separate page. Such pages are merged into a group of duplicates.
If your website has duplicate pages:
- The page you need may disappear from search results if the robot selects another page from the duplicate group.
- In some cases, if there are GET parameters, pages may not be grouped and may participate in the search as different documents. As a result, they compete with each other. This may impact the website's ranking in search results.
- Depending on which page remains in the search, the address of the document may change. This may affect, for example, the reliability of statistics in web analytics services.
- It takes the indexing robot longer to crawl the website's pages, which means the data about pages that are important to you is sent to the search database more slowly. The robot can also create an additional load on your website.
To tell Yandex which page should be included in the search and get rid of duplicates, add the Clean-param directive to the robots.txt file so that the robot doesn't take the URL parameters into account.
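For example, if pages differ only in tracking parameters, an entry like the following in robots.txt tells the Yandex robot to ignore those parameters (the parameter names and path here are examples; substitute the ones your site actually uses):

```
User-agent: Yandex
Clean-param: utm_source&utm_medium&utm_campaign /catalog/
```

The first argument lists the parameters to ignore, separated by `&`; the optional second argument restricts the rule to URLs whose path starts with the given prefix.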