Google Omits Millions of Sites from SERPS

[Moderator Note: Please don't link to images. Embed them.]

[Embedded screenshots: example SERPs showing the "omitted results" notice]


That's not my niche; it's just an example. I checked the head terms of my niche, and on pages 10-30 of the SERPs they all have a "click here to view omitted results similar to the ... that are displayed above" notice. It's weird. We think it's a content quality issue: low-quality sites that used to rank are now being omitted from the SERPs, and only highly relevant pages are appearing. But that's only a hunch. For one of my verticals, almost all commercial sites are removed, leaving only government sites. For this PS5 vertical, it removes a lot of other webpages out of the 35,000,000 indexed pages. What's going on? Anyone else seeing this too?

...how do you get webpages out of the omitted results section if they're in there? Things we've tried so far:
  1. Deleting all content on the page -- did nothing.
  2. Removing duplicate content, since the documentation for omitted results says it's for duplicate content -- did nothing.
  3. Rewriting the whole webpage to see if a totally different author, writing style, etc. would change it (fresh text for the LSA algorithm) -- did nothing.
What do I do? This has cost us roughly a 20% loss in revenue. And it's appearing on many, many SERPs, not just my keywords. Seems like this was a Google change that no one noticed.

@CCarter and @eliquid Since you two run rank trackers, did you guys notice that tons of SERPs now show "If you like, you can repeat the search with the omitted results included"? I checked and no one's talking about it but... it's everywhere. Might be important...
 
Honestly, not sure I understand.

I've been seeing this for years for all kinds of terms.

What's the issue here?

I'm not sure if this is somehow different from what's been going on for years, or if it's just new to you (not sarcasm).
 
From what I remember, you could get up to 1,000 results for a query back in the day. There would be omitted results, but you weren't limited to roughly 200 results per query the way you are now. This is from watching the results screen in ScrapeBox. To me, it makes no sense that, even for a query like "dog", Google is only showing 210 results out of the millions of indexed pages.

It seems like the omission filter has gotten a lot stronger lately.
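If anyone wants to reproduce that count, here's a minimal sketch of the idea (plain Python, nothing to do with ScrapeBox). It assumes the classic num, start, and filter URL parameters still behave the way they historically did, that the guessed result-link selector below still matches organic results, and that you have some way to fetch SERPs without getting blocked -- in practice you'd need a SERP API or proxies, since plain requests get captcha'd fast.

```python
# Rough sketch: count how many results Google actually serves for a query
# before it cuts off / shows the "omitted results" notice.
# Assumptions (mine, not from the thread): num/start/filter URL params still
# work as they historically did; "div.yuRUbf > a" still matches organic result
# links; and you have proxies or a SERP API so you don't get blocked.

import time
import requests
from bs4 import BeautifulSoup
from urllib.parse import quote_plus

HEADERS = {"User-Agent": "Mozilla/5.0"}  # plain requests get captcha'd fast

def count_visible_results(query: str, include_omitted: bool = False, max_pages: int = 10) -> int:
    total = 0
    for page in range(max_pages):
        url = (
            "https://www.google.com/search?q=" + quote_plus(query)
            + f"&num=100&start={page * 100}"
            + ("&filter=0" if include_omitted else "")
        )
        html = requests.get(url, headers=HEADERS, timeout=15).text
        soup = BeautifulSoup(html, "html.parser")
        links = soup.select("div.yuRUbf > a")  # organic result links (selector is a guess)
        if not links:
            break  # ran out of results, or got blocked / served a captcha
        total += len(links)
        time.sleep(5)  # be gentle; real runs need rotation and longer delays
    return total

if __name__ == "__main__":
    print("dog:", count_visible_results("dog"))
    print("dog (omitted included):", count_visible_results("dog", include_omitted=True))
```

Running it with and without &filter=0 would show how much of the gap between "millions indexed" and "~200 shown" is the omission filter versus Google just cutting the result set off.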
 
It seems to me that for at least a year if not more, Google has finally accepted that it can't crawl and index everything, and they might as well go ahead and start making choices (like Bing has done from the start).

This is why we get so many newbies wanting to know why their posts aren't indexed yet. The quality of the content may be fine, but Google needs convincing now. They need to do marketing or at least score some dofollow links that are indexable. The days of shoving every sitemap and RSS feed down Google's throat and getting results are gone.

This goes much further than it getting harder to get indexed, or not showing in the public facing index. They're [censored] the index as well.

And if you think about it, a huge portion of Google's existence is about high quality results, and the easiest way to maintain decent results is to block out low quality crap (usually from SEOs and spammers) in the first place. We've got longer and longer time delays, inverted results, bounces, and all that crap. But why not just simply not index it at all? That seems to be the direction they're going.

"We're not going to index and crawl everything frantically any more. It's not sustainable. You need to convince us you're worth indexing" aka quality content and page rank.
 

It might be content quality, but how does Google decide which webpages to filter into the omitted results? Internally, we're running a test. Here's the logic (a rough sketch of one way to script it is below):
  1. We know the omitted results notice appears for all keywords.
  2. Download the top 250 keywords by clicks over the last 16 months from GSC. Since those keywords received clicks, we know a URL on the site must have ranked for them.
  3. Check each keyword in Google.
  4. If our domain appears in the results, we know it was not omitted. Record the URL that appears.
  5. If our domain does not appear in the results, check the omitted results. Record the URL that appears there.
  6. #4 is a list of URL/keyword pairings that passed the omitted results filter. #5 is a list of URL/keyword pairings that got stopped by it.
What is the difference between #4 and #5? We have yet to find out. Anyone else want to try this too? You're more than welcome to.
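For anyone who wants to run the same test, here's a minimal sketch of steps 2-6 under a few assumptions of my own (not from the post above): the GSC export lives in a "gsc_keywords.csv" file with a "query" column, &filter=0 still repeats the search with omitted results included, and you can fetch SERPs without getting blocked (a SERP API or proxies in practice). The filename and DOMAIN value are hypothetical placeholders.

```python
# Rough sketch of the test above -- not the exact internal script, just the idea.
# Assumptions: "gsc_keywords.csv" (hypothetical name) is a GSC export with a
# "query" column; &filter=0 repeats the search with omitted results included;
# you have proxies / a SERP API so Google doesn't block you after a few requests.

import csv
import time
import requests
from urllib.parse import quote_plus

DOMAIN = "example.com"                      # hypothetical -- your own domain here
HEADERS = {"User-Agent": "Mozilla/5.0"}

def fetch_serp(query: str, include_omitted: bool) -> str:
    url = (
        "https://www.google.com/search?q=" + quote_plus(query)
        + "&num=100"
        + ("&filter=0" if include_omitted else "")
    )
    return requests.get(url, headers=HEADERS, timeout=15).text

def classify(query: str) -> str:
    normal = fetch_serp(query, include_omitted=False)
    if DOMAIN in normal:                    # crude check -- a real run should parse out
        return "passed_filter"              # the actual ranking URL (step 4)
    unfiltered = fetch_serp(query, include_omitted=True)
    if DOMAIN in unfiltered:
        return "omitted"                    # only appears once omitted results are included (step 5)
    return "not_found"                      # gone entirely -- lost the ranking, not just omitted

if __name__ == "__main__":
    with open("gsc_keywords.csv") as f, open("omission_test.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["keyword", "status"])
        for row in csv.DictReader(f):
            writer.writerow([row["query"], classify(row["query"])])
            time.sleep(5)                   # be gentle; real runs need rotation and longer delays
```

The "DOMAIN in html" check is deliberately crude; parsing the result links and recording which of your URLs actually shows up would give the URL/keyword pairings from #4 and #5 rather than just a pass/fail per keyword.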
 
Totally off-the-cuff theoretical thinking here, because I'm not actually checking at the moment, but I wonder how many of those omitted results are syndicated (copy-and-pasted) results that add zero value for Google or users?

"Omitted entries are very similar to the X already displayed." I'd be checking to see if my writers were just rewriting posts that already ranked on the front page -- same sets of headers and all, just "spun" the paragraphs manually. I could see Google moving a lot of that crap to omitted too. I suspect 90% of all SEO content is nothing more than that anyway, and then they try to win on links.

Might be one thread to pull on during your investigation: "Is my #5 pairing URL's content very similar to one of the top 10?" As in a rewrite, basically.
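A quick way to put a number on "very similar" is plain TF-IDF cosine similarity -- a crude stand-in for whatever duplicate detection Google actually runs, but enough to flag obvious rewrites. This sketch assumes you've already extracted the visible text of the omitted page and of the current top 10 into plain strings (e.g. with trafilatura or BeautifulSoup); the function name and the 0.7 cutoff are just illustrative.

```python
# Compare an omitted page's text against the pages that did make the front page.
# TF-IDF cosine similarity is a rough proxy for "is this basically a rewrite?".

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def max_similarity(omitted_text: str, top10_texts: list[str]) -> float:
    """Highest cosine similarity between the omitted page and any top-10 page."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([omitted_text] + top10_texts)
    sims = cosine_similarity(matrix[0:1], matrix[1:])[0]
    return float(sims.max())

# Usage (texts already extracted elsewhere):
# score = max_similarity(omitted_page_text, [page1_text, page2_text, ...])
# Anything scoring very high (say > 0.7 -- an arbitrary cutoff) is probably close
# enough to an existing result to count as a rewrite.
```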
 

That's what I was thinking too. For years, SEOs have been reading the top 10 results and rewriting them to rank in the top 10. The omission might be because those rewrites are very similar to what's already displayed. That's why one of our tests was to write a page totally anew, with a new writer, a new POV, and a new tone and voice. Nothing happened.

However, when we test the top 250 keywords, we'll see if a URL is omitted for one keyword and not omitted for another. That'll be good to know too.

The oddest thing is that, for a keyword like "india embassy", Google somehow omitted all results besides government ones. Embassy directories no longer rank unless they're run by the government. That's weird and, also, good for the user. So odd.
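Once the test output is collected, that per-keyword view is easy to pull out. This sketch assumes the results ended up in a CSV (the hypothetical "omission_test.csv" from the earlier sketch) with "keyword", "url", and "status" columns, where status is either "passed_filter" or "omitted" for each keyword checked.

```python
# Which URLs pass the filter for some keywords but get omitted for others?
# Assumes "omission_test.csv" (hypothetical) has keyword, url and status columns.

import pandas as pd

df = pd.read_csv("omission_test.csv")

# Count, per URL, how many keywords it passed for vs. was omitted for.
pivot = df.pivot_table(index="url", columns="status", values="keyword",
                       aggfunc="count", fill_value=0)

# URLs with non-zero counts in both columns are the interesting cases:
# omitted for some queries, visible for others (assumes both statuses occur).
mixed = pivot[(pivot["passed_filter"] > 0) & (pivot["omitted"] > 0)]
print(mixed.sort_values("omitted", ascending=False))
```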
 
That's what I was thinking too. For years, SEOs have been reading the top 10 results and rewriting them to rank in the top 10. The omission might be because those rewrites are very similar to what's already displayed. That's why one of our tests was to write a page totally anew, with a new writer, a new POV, and a new tone and voice. Nothing happened.
Well, that is depressing. It seems that if the ticket to indexation (not even talking about clicks here) is a bunch of links, and writing truly unique content doesn't waive that prerequisite, then the notion that just writing quality content will work is truly and utterly dead.

The oddest thing is that, for a keyword like "india embassy", Google somehow omitted all results besides government ones. Embassy directories no longer rank unless they're run by the government. That's weird and, also, good for the user. So odd.
Starting with COVID and now bleeding into the Russia/Ukraine conflict, U.S. media has been frothing over the opportunity to self-censor. Google was probably at the forefront with their Medic updates (it seems to be something of a personal crusade for Gary Illyes), and without any pushback from users, publishers, or advertisers, they have just continued marching down that road. So perhaps only .govs are authorities on embassies now; nothing else is "trustworthy" anymore. The same line of thought applies to COVID and the war: if it isn't the official, government-approved line, it's flagged at best and removed at worst.

Personally, I'm not a fan of censorship. Definitely not a fan if it's being applied indiscriminately to every fucking query.
 