Where Do Quality Raters Get Their Database of Webpages to Evaluate From?

I understand Google has quality raters digging through pages to assess them on E-E-A-T and give them a rating. And sites with low E-E-A-T scores get downgraded while others get upgraded.

But where do these raters get their database of sites from? Does Google maintain a master list of URLs that is distributed to these people? Or do they give them a list of queries and ask them to rate the pages ranking for those queries on E-E-A-T?

My gut feeling is that it's the latter, since the raters are also asked to assess search user intent by working through specific queries. So maybe the objective of these raters is to push down irrelevant and untrustworthy results that made it to the top, rather than to move up results that aren't ranking (which is perhaps taken care of by the algorithm).

Thinking aloud: if it's the latter, then small sites that get screwed over once will never stand a chance of coming back at all. Sites often lose 90% of their traffic when this happens, and regardless of how much you improve your website, there is nobody to rate you back to the top because you never show up for any of the queries.

 
In the REAL quality rater guidelines, the version they keep internally rather than the one they publish for the world, they give example URLs of what to look for: what looks good and what looks bad.

It is also not down to one quality rater to make a determination. Multiple raters are asked to rate the same URLs, and if, say, 28 out of 30 think a site doesn't meet the standard, then the site will drop. A senior rater can override decisions, but those overrides also get scrutinized, because they are smart about corruption.
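A minimal sketch of what that kind of consensus could look like, assuming a simple vote-share threshold and a senior-override flag. The field names and the 0.9 threshold are hypothetical illustrations of the mechanism described above, not anything from Google:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical consensus model: a URL only drops when enough raters flag it
# as below standard. The 0.9 threshold mirrors the "28 out of 30" example.
DEMOTION_THRESHOLD = 0.9


@dataclass
class RatingTask:
    url: str
    votes_below_standard: int   # raters who judged the page below standard
    votes_total: int
    senior_override: Optional[bool] = None  # True = keep, False = demote


def should_demote(task: RatingTask) -> bool:
    """Return True if consensus (or a senior override) demotes the URL."""
    if task.senior_override is not None:
        # A senior rater's call wins, but would itself be logged and audited.
        return not task.senior_override
    return task.votes_below_standard / task.votes_total >= DEMOTION_THRESHOLD


print(should_demote(RatingTask("https://example.com/a", 28, 30)))  # True
print(should_demote(RatingTask("https://example.com/b", 12, 30)))  # False
```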

Also, the quality raters work remotely. I know this because one of them came down to an office of ours, logged into his system, and explained to us what they did, before the world even knew what a quality rater was.

I recall THAT was the reason I made my WickedFire account: to report that Google had thousands of people rating the search results. Before then I was just a lurker. This was 2007/2008. Obviously no one believed me, but I stuck around WF anyway.
 
I actually signed up to do this in the past so I could see how it works and get all the actual internal documents, which are the same as the public ones but with more added on how to use the software and all the work-related stuff. I performed the tasks for about a week and then ghosted them. This was admittedly a decade ago or more, and they were already doing E-E-A-T-style stuff then.

Those documents don't really tell you anything the public ones don't already say.

It's outsourced to complete dunces and you get less than 30 seconds to make decisions. It's pretty absurd and grueling work.

99% of the "cases" I was assigned (automatically, of course) were either 1) "check this one page," which meant looking at the page, quickly looking around the site (about page, etc.), and trying to find reviews of them online too, or 2) here are two pages, one on the left and one on the right: which is better, more trustworthy, etc. All in less than 30 seconds.

Trusted, long-term raters with a fast pace and a high accuracy rate may be assigned more complex reviews. This was just my experience before I bailed.

What everyone needs to realize is that these quality raters aren't making decisions that affect your pages. They're there to confirm to Google that the algorithm is assessing quality the way a human would. You could say that what the raters say influences the algorithm, but I'm telling you, Google knows well enough which raters are just choosing random options (and those get fired). They're not making decisions; they're refining and confirming the algorithm based on how humans make "flash decisions" about quality and trust.
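To illustrate that confirmation role, here's a minimal sketch under two assumptions: rater labels get compared against the algorithm's own quality calls, and each rater's agreement with the majority is a cheap signal for spotting random clicking. The data shape, names, and the 0.6 cutoff are all hypothetical, not Google's actual pipeline:

```python
from collections import Counter

# Hypothetical data: several raters label the same tasks (task_id -> votes).
rater_labels = {
    "task1": {"r1": "high", "r2": "high", "r3": "low"},
    "task2": {"r1": "low", "r2": "low", "r3": "high"},
    "task3": {"r1": "high", "r2": "high", "r3": "low"},
}

# What the ranking algorithm would decide for the same tasks on its own.
algo_labels = {"task1": "high", "task2": "low", "task3": "high"}

# Human consensus per task = majority vote among raters.
consensus = {
    task: Counter(votes.values()).most_common(1)[0][0]
    for task, votes in rater_labels.items()
}

# Confirmation: how often does the algorithm agree with human consensus?
agreement = sum(algo_labels[t] == consensus[t] for t in consensus) / len(consensus)
print(f"algorithm vs. human consensus: {agreement:.0%}")

# Rater QC: a rater who rarely matches consensus looks like random clicking.
RANDOM_CUTOFF = 0.6  # illustrative threshold
raters = sorted({r for votes in rater_labels.values() for r in votes})
for rater in raters:
    rate = sum(
        votes[rater] == consensus[t] for t, votes in rater_labels.items()
    ) / len(rater_labels)
    if rate < RANDOM_CUTOFF:
        print(f"{rater}: {rate:.0%} agreement with consensus -> flag")
```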

"Why would they spend the money then?" It's literal peanuts outsourced to people who barely speak English.

The QRG (Quality Rater Guidelines) documents you can read publicly tell you what you need to know to bring your site into alignment with what Google wants it to look like... literally LOOK like, in order to be PERCEIVED as high quality.

Here are the documents for those interested.
 
This is right. Knowing this, you can create a big database of your own. Go be your own quality rater for a few days and build a database of a few thousand sites.

Then you can track all the obvious factors from the rater guidelines for each of those sites. When updates come around, you will be able to see which sites perform well and which don't.

It's not perfect, but it will give you insight into which E-E-A-T-related factors they are trying to adjust for, by comparing how the sites you rated poorly and the sites you rated well fare in the update.

Plus, if an update happens and there's no noticeable correlation between traffic changes and those factors, you know it isn't aimed at E-E-A-T factors, which narrows things down quite a lot.

Not a perfect method and you can't rely on it super heavily, but better than having absolutely zero insight and just guessing.
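As a rough sketch of what that tracking could look like: record a yes/no for each guideline factor per site, record the traffic change after an update, then check which factors line up with the winners and losers. Everything here (the factor names, the sample data) is a hypothetical placeholder, and point-biserial correlation is just one reasonable way to compare a binary factor against a continuous traffic change:

```python
import statistics

# Hypothetical database rows: guideline-factor checks (1 = present) plus the
# traffic change seen after an update. Real data would come from your own
# rating sessions and your rank tracker / analytics.
sites = [
    {"author_bio": 1, "about_page": 1, "cited_sources": 1, "traffic_change": +0.05},
    {"author_bio": 0, "about_page": 1, "cited_sources": 0, "traffic_change": -0.40},
    {"author_bio": 1, "about_page": 0, "cited_sources": 1, "traffic_change": -0.02},
    {"author_bio": 0, "about_page": 0, "cited_sources": 0, "traffic_change": -0.55},
    {"author_bio": 1, "about_page": 1, "cited_sources": 0, "traffic_change": +0.10},
]

FACTORS = ["author_bio", "about_page", "cited_sources"]


def point_biserial(xs, ys):
    """Correlation between a binary factor and a continuous traffic change."""
    if len(set(xs)) < 2:
        return 0.0  # factor never varies in this sample, nothing to learn
    mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / len(xs)
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys))


# A strong positive r suggests the update rewarded that factor; near-zero
# across the board suggests the update wasn't aimed at E-E-A-T factors.
for factor in FACTORS:
    xs = [s[factor] for s in sites]
    ys = [s["traffic_change"] for s in sites]
    print(f"{factor}: r = {point_biserial(xs, ys):+.2f}")
```

With a few thousand sites instead of five, a pattern (or the lack of one) starts to mean something.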
 