How your site appears in search results depends not only on its design and content: incorrect server settings and errors that occur when a robot downloads pages also affect it.
Some errors caused by server configuration problems are described below.
For the robot to receive the site's pages properly, the HTTP headers sent to it must comply with the RFC 2616 standard. In particular, pay attention to how the robot processes the Last-Modified and Content-Encoding headers.
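RFC 2616 requires HTTP date headers such as Last-Modified to use the RFC 1123 format, always in GMT. As a minimal sketch (the function name `last_modified_header` is illustrative, not part of any standard API), Python's standard library can produce a compliant value:

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def last_modified_header(ts: datetime) -> str:
    """Format an aware datetime as a Last-Modified header value,
    e.g. "Sun, 06 Nov 1994 08:49:37 GMT" (RFC 1123, always GMT)."""
    return format_datetime(ts.astimezone(timezone.utc), usegmt=True)

# The canonical example date from the HTTP specification:
print(last_modified_header(datetime(1994, 11, 6, 8, 49, 37, tzinfo=timezone.utc)))
# Sun, 06 Nov 1994 08:49:37 GMT
```

Sending a malformed date here (for example, a local time zone instead of GMT) is exactly the kind of error that can make a robot mishandle conditional requests.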
If the server configuration blocks the IP addresses of the indexing robot, the site may disappear from search results. You can find out about this issue in the Yandex.Webmaster service: in your site list, the message "Your site is not being indexed. (Site indexing prevented by the settings of the server on which it is hosted.)" appears next to the affected site. Click the question mark next to the message for detailed information about the problem.
Our robot treats URLs that end with "/" and URLs that do not as different pages. If the content of the two pages is identical, it is better to set up a 301 redirect from one to the other. You can do this in the .htaccess file.
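As a sketch, assuming an Apache server with mod_rewrite enabled, a .htaccess rule like the following redirects URLs without a trailing slash to the version with one:

```apacheconf
# Redirect URLs without a trailing slash to the slash version (301),
# skipping requests that match a real file (images, CSS, and so on).
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

The opposite direction (stripping the trailing slash) works just as well; what matters is that only one of the two URL forms is canonical and the other permanently redirects to it.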
To make the robot index only useful site pages, make non-existent subdomains and pages inaccessible, or set them up to return the HTTP 404 error code.
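For example, assuming an nginx server, a catch-all block can return 404 for requests to any subdomain that is not explicitly configured:

```nginx
# Hypothetical catch-all: requests whose Host header matches no other
# server block land here and receive a 404 instead of real content.
server {
    listen 80 default_server;
    server_name _;
    return 404;
}
```

Without such a catch-all, a wildcard DNS record can make every random subdomain serve the main site's content, and the robot may index all of those duplicates.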