Search engine databases store an enormous number of pages from different sites, and some of them are of little use. Yandex and Google consider such pages low-value and often leave them out of the index.
Let’s figure out what these pages are, how to find them, and what you gain by handling them properly.
Low-value (low-information) pages: the concept
Low-value pages (in SEO slang, “zombie pages”) have little chance of being in demand among users. Their content may duplicate pages the crawler already knows, or they may contain little or no content at all. So, to reiterate, a zombie page is:
- an empty page with no content (or containing only site-wide template blocks such as the header and footer),
- a half-empty page (with very little content),
- a duplicate page (very similar to another one).
What’s wrong with these pages?
- They waste the site’s crawl budget.
- They cause keyword cannibalization.
- They can lower the site’s authority.
- They worsen behavioral factors.
What do we need to do?
- Find low-value or uninformative pages.
- Decide what to do with them.
- Close unnecessary pages from indexing.
- Work on the pages we need.
How to find zombie pages?
Go to Yandex.Webmaster and see which pages the search engine considers low-value: open “Indexing” → “Pages in Search” and select “Excluded pages”.
As the screenshot shows, the “Status” column gives the reason each page was excluded from the index. To make the pages easier to work with, you can export them from Yandex.Webmaster to a spreadsheet. If you work with Google products, this method won’t help, so let’s look at other options.
Another option is the Screaming Frog SEO Spider. If you do website promotion, you probably know it already; if not, install it and crawl your site.
After scanning, you can sort pages by Word Count (number of words) or Size (page size):
The assumption is that the smaller a page is, the less content and value it has. The same goes for word count.
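This word-count heuristic is easy to script as well. Here is a minimal Python sketch; the pages, URLs, and the 200-word threshold are illustrative assumptions, not values from the article:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of script/style tags."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html: str) -> int:
    """Count words in the visible text of an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.parts).split())

# Illustrative pages; in practice the HTML would come from a crawl export.
pages = {
    "/about": "<html><body><p>" + "word " * 500 + "</p></body></html>",
    "/filter/apply": "<html><body><p>No results found.</p></body></html>",
}

THIN_THRESHOLD = 200  # words; an assumed cut-off, tune it per site
thin = [url for url, html in pages.items() if word_count(html) < THIN_THRESHOLD]
print(thin)  # the half-empty candidates
```

In practice you would feed it HTML from a real crawl and tune the threshold to your site’s typical page length.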
It is also worth checking for duplicate title tags to find near-identical pages: go to the Title tab and look for duplicated meta tags, as shown in the screenshot.
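The same duplicate-title check can be done outside Screaming Frog. A small sketch, assuming you have a URL-to-title mapping from a crawl (the URLs and titles here are made up):

```python
from collections import defaultdict

# Illustrative crawl output: URL -> <title>; real data would come from a crawl export.
titles = {
    "/catalog/sofas": "Buy sofas online",
    "/catalog/sofas?page=2": "Buy sofas online",
    "/catalog/tables": "Buy tables online",
}

# Group URLs by title; any group with more than one URL is a duplicate candidate.
by_title = defaultdict(list)
for url, title in titles.items():
    by_title[title].append(url)

duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)
```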
A third way to find zombie pages is Yandex.Metrica: look for pages with poor behavioral factors, for example by sorting them by bounce rate. For me, though, the two methods above are usually enough.
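The Metrica approach boils down to sorting a report by bounce rate. A sketch with invented numbers (a real report would come from a Yandex.Metrica export):

```python
# Illustrative Metrica export: (url, bounce_rate_percent, visits); numbers are made up.
rows = [
    ("/blog/guide", 22.0, 1400),
    ("/filter/apply", 91.5, 30),
    ("/tag/misc", 88.0, 12),
]

# Sort worst bounce rate first; pages at the top are zombie-page candidates.
worst_first = sorted(rows, key=lambda r: r[1], reverse=True)
print(worst_first[0][0])  # "/filter/apply"
```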
Now let’s decide what to do with the pages we found.
- We optimize. This makes sense when you have the resources and optimizing the page can bring meaningful profit. I usually don’t bother optimizing low-value pages and move on to the next options.
- We merge. If we have identical or very similar pages, we glue them together with a 301 redirect from the duplicate to the main page.
- We close them from indexing or delete them. There are standard page types: internal search, sorting, pagination, filters. If filter pages or other standard page types keep dropping out of the index, it is worth blocking them in robots.txt with a Disallow rule, even if they already have canonical tags pointing elsewhere.
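Assuming the filter pages live under /filter/, as in the example URL below, the rule might look like this (adjust the path to your own URL structure):

```
Disallow: /filter/
```

This line goes inside the relevant User-agent block of your robots.txt.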
This applies to pages like site.ru/filter/prop_2049-is-ks-set/apply/ . One important point: before closing anything, check whether such pages receive traffic; otherwise you may lose it.
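That traffic check can also be scripted before you block anything. A sketch with made-up visit counts (the URLs and the 10-visit threshold are assumptions):

```python
# Illustrative traffic per URL (e.g. organic visits over 30 days); numbers are made up.
traffic = {
    "/filter/prop-a/apply/": 0,
    "/filter/prop-b/apply/": 340,
}

MIN_VISITS = 10  # assumed threshold: pages above it should not be blocked blindly
safe_to_block = [url for url, visits in traffic.items() if visits < MIN_VISITS]
print(safe_to_block)  # pages with next to no traffic, safe candidates for Disallow
```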
Dealing with low-value pages can bring good results at low cost, and the larger the site, the bigger the payoff from working on zombie pages. Either way, I advise you to try the tips from this article and evaluate the results.