This honestly makes a lot of sense. I know you gotta take everything from the horse's mouth with a grain of salt, but this really lines up with what we see happen after almost every update. Interesting one; it could also be part of another split test of their algo. Here's a really interesting take from an AI engineer that I saved a few months ago about short-term ranking swings shortly after algo updates:
"From my experience working with AI developers running on large amounts of data and complex multi-variant models, my thought is this has very little to do with your content.
When Google indexes sites, they have a dynamic scoring system that continuously takes into account user response data along with the categorizations Google has already done on each piece of content on your site. Every time they update their algorithms and sub-algorithms, they need to re-run all the pages on all the sites that fall within the category of sites whose search results they were trying to improve.
For example, if they add another factor to one of their algo models - like how many scrolls and clicks somebody does, how many internal links a page has, or whether the page uses a specific code pattern - then all the pages on all the sites this applies to need to be re-run through the new algorithm. The reason is that you can't compare outputs from two different ranking models. So they basically wipe the old post-processing data previously used to rank your pages and re-run those pages over time with the new algorithm. If you had good content scores before, those get wiped and rebuilt from the new user experience data generated by the new algorithm. It takes time, and ideally you end up with the same or better rankings afterward.
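A toy sketch of that "wipe and re-score" idea (illustrative only - the factor names, weights, and scales are invented, not anything Google has published):

```python
# Two hypothetical ranking model versions. v2 adds an engagement
# factor and reweights the old ones, so its scores live on a
# different scale than v1's - the outputs are not comparable.

def score_v1(page):
    return 3.0 * page["backlinks"] + 1.5 * page["internal_links"]

def score_v2(page):
    return (2.0 * page["backlinks"]
            + 1.0 * page["internal_links"]
            + 4.0 * page["scroll_depth"])

pages = [
    {"backlinks": 50, "internal_links": 10, "scroll_depth": 0.2},
    {"backlinks": 5,  "internal_links": 40, "scroll_depth": 0.9},
]

# Ranking one page by its v1 score against another page's v2 score
# would be meaningless, so every page in scope gets its old score
# wiped and is re-run through v2 before rankings can be rebuilt.
old_scores = [score_v1(p) for p in pages]  # discarded on update
new_scores = [score_v2(p) for p in pages]  # rebuilt over time
```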
The pattern you are describing, where irrelevant / bad-content sites and large high-authority sites (e.g. Home Depot) are now outranking you, seems to be an artifact of the historical ranking data wipe. When Google wipes and has to reconstruct a portion of the ranking data, what's left is the data that hasn't changed. In this case it's probably the historical backlink ranking data that was left, which is now disproportionately important in the rankings because the relevance ranking data got wiped and hasn't been rebuilt yet. So the guys with tons of backlinks are winning temporarily.
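A sketch of that artifact: a composite score where the relevance component is reset to a neutral default while backlink authority survives the wipe (the weights and numbers are made up for illustration):

```python
# Hypothetical composite ranking score. When the relevance data is
# wiped, every site falls back to a neutral placeholder until new
# user response data rebuilds it - so backlink authority dominates.

NEUTRAL_RELEVANCE = 0.5  # assumed placeholder during the rebuild

def composite_score(site, relevance_wiped=False):
    relevance = NEUTRAL_RELEVANCE if relevance_wiped else site["relevance"]
    return 0.6 * relevance + 0.4 * site["backlink_authority"]

niche_site = {"relevance": 0.95, "backlink_authority": 0.30}
big_brand  = {"relevance": 0.40, "backlink_authority": 0.98}

# Before the wipe, the highly relevant niche site wins:
print(composite_score(niche_site) > composite_score(big_brand))  # True

# After the wipe, only the surviving backlink data differentiates
# them, so the big brand outranks until relevance is rebuilt:
print(composite_score(niche_site, relevance_wiped=True)
      > composite_score(big_brand, relevance_wiped=True))        # False
```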
Google also takes time to split test. They will apply the new algo to one population of sites and keep the old algo for the other group, then compare results to see which version of the algo "wins". Your site might be in the "new algo" group rather than the control group. They're probably using AI to design and run these tests on the fly, too.
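A minimal sketch of how deterministic bucketing for a split test like that could work (the hashing scheme and the 10% treatment share are assumptions for illustration, not anything Google has disclosed):

```python
import hashlib

def bucket(domain: str, treatment_share: float = 0.10) -> str:
    """Hash a site into a stable bucket so it consistently sees
    either the new algo (treatment) or the old one (control)."""
    digest = hashlib.sha256(domain.encode()).hexdigest()
    slot = int(digest, 16) % 10_000
    return "new_algo" if slot < treatment_share * 10_000 else "old_algo"

for d in ["example.com", "homedepot.com", "mynichesite.net"]:
    print(d, "->", bucket(d))
```

Because the bucket is derived from the domain itself, a site stays in the same group for the duration of the test, which is what makes the before/after comparison between populations valid.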
Google has been doing a massive development push on relevance and NLP in the last 3-4 years. Relevance-based algorithms are dramatically different from the old data-value-based algos. Now when they do an algo update they're not just shifting the weight they place on one or more known discrete ranking factors; they are transforming the entire ranking model in novel ways they don't even understand completely.
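For contrast, a toy side-by-side of an old-style discrete-factor score and an embedding-style relevance score (the vectors here are hand-picked toys; a real system would produce them with an NLP model):

```python
import math

def old_style_score(backlinks: int, keyword_hits: int) -> float:
    # old approach: a weighted sum of known discrete factors
    return 2.0 * backlinks + 1.0 * keyword_hits

def cosine(a, b):
    # relevance approach: semantic similarity between embeddings
    dot = sum(x * y for x, y in zip(a, b))
    norm = (math.sqrt(sum(x * x for x in a))
            * math.sqrt(sum(y * y for y in b)))
    return dot / norm

query_vec = (0.9, 0.1)  # toy 2-d "meaning" of the search query
page_vec  = (0.8, 0.3)  # toy embedding of the page's content

print(old_style_score(backlinks=120, keyword_hits=8))  # tunable weights
print(cosine(query_vec, page_vec))  # opaque learned representation
```

Retuning the first kind of model means nudging a couple of weights; retuning the second means retraining whatever produces the vectors, which is why updates now reshape the whole model rather than one factor.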
In a nutshell:
1) It's probably not your fault.
2) Google is probably not singling out a particular site or post model (unless they are explicit about it).
3) You probably lost rankings because a chunk of your historical data got wiped and needs to be rebuilt by Google over time.
4) You (and other good sites) should recover after a few months, once Google has a chance to rebuild your user response data.
5) You cannot prevent these things - you can only mitigate the damage by “doing all the things” on each site and diversifying across sites, revenue models, and niches."