Google News from SMX Next: Coati Updates, EAT, & Deprecated Algorithms

Ryuzaki

Moderator
SMX Next, a conference by Search Engine Land, has wrapped up, and Hyung-Jin Kim, the Vice President of Google Search, revealed some interesting information.

Panda Became Coati​

First, Panda is no more. It evolved into the Coati algorithm (a coati is a striped little raccoon-like animal). Panda was said to have been "consumed" into the core ranking algorithm and to be in "ever-flux," meaning it's always updating. I never fully agreed with that, and now we know why: some functions of Panda now run within the Coati algorithm instead. Coati is supposedly part of the core algorithm too, so what I've been attributing to Panda/Coati may not run as part of the core any more (I'm referring to some technical SEO data roll-outs regarding indexation quality scores).

EAT Applies to Every Query​

EAT (Expertise, Authoritativeness, Trustworthiness) applies to every niche, every query, every result, period. But we also learned that YMYL (Your Money or Your Life) queries have a more intense need for EAT, which confirms what we've long suspected: different niches may have different algorithms at play (or at least different weightings within the algorithms).

Deprecated Algorithms​

And finally we learned from a new help document that the following algorithm updates have been "retired":
  • Hummingbird (better natural language processing)
  • Mobile-friendly ranking system (the entire index is mobile-first now)
  • Page speed system (Core Web Vitals usurped it)
  • Panda (Coati improved upon it, and both are in the core algo)
  • Penguin (runs live in the core algo now)
  • Secure site system (probably runs live)
Calling these retired may be accurate, but it's probably more accurate to say they've been absorbed into larger algorithms.

Current Notable Ranking Systems​

That leaves us with the following that the help document lists as "notable ranking systems" they currently use:
  • BERT (Bidirectional Encoder Representations from Transformers)
  • Crisis Information systems (freshness updates during times of crisis)
  • Deduplication systems (stops unhelpful duplication from copycats & in featured snippets)
  • Exact match domain system (reduces the strength of EMDs)
  • Freshness system (for queries that deserve freshness like news)
  • Helpful content system (boosts original & helpful content for people by people)
  • Link analysis systems & PageRank (and people keep saying PageRank is gone)
  • Local news system (for Top Stories & Local News, tries to surface the original local sites)
  • MUM (Multitask Unified Model, used to understand & generate language for featured snippets, etc.)
  • Neural matching (concept comprehension & linking to pages that match concepts)
  • Original content system (tries to boost original posts over syndicated or cited copies)
  • Removal-based demotion system (legal removals & personal information removals)
  • Page experience system (Core Web Vitals, user metrics)
  • Passage ranking system (identifies parts of a page instead of the whole thing, used for jump links)
  • Product reviews system (identifies high quality product reviews)
  • RankBrain (used to match queries to concepts and not just their exact word usage, think synonyms)
  • Reliable information systems (boosts high authority, demotes low authority, used to promote fake news [I kid])
  • Site diversity system (doesn't allow any domain to have more than 2 SERP listings)
  • Spam detection systems (SpamBrain, removes spam sites from SERPs)
There they are with my explanations. There's a LOT to take in and consider just from this one guy spilling the beans. I hope you find it helpful. Any thoughts?
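As a side note, the site diversity system is easy to picture as a post-ranking filter. Here's a toy sketch (my own assumption about the general mechanic, not Google's actual implementation) that caps each domain at two listings:

```python
from collections import Counter
from urllib.parse import urlparse

def diversify(results, max_per_domain=2):
    """Toy 'site diversity' filter: keep at most `max_per_domain`
    listings per domain, preserving the original ranking order.
    This is an illustrative sketch, not Google's real system."""
    seen = Counter()
    kept = []
    for url in results:
        domain = urlparse(url).netloc
        seen[domain] += 1
        if seen[domain] <= max_per_domain:
            kept.append(url)
    return kept

serp = [
    "https://example.com/a",
    "https://example.com/b",
    "https://example.com/c",   # third listing from the same domain gets dropped
    "https://other.com/x",
]
print(diversify(serp))
```

The interesting question (raised below) is which surfaces this applies to, since map pack listings don't appear to count against the organic cap.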
 
  • BERT (Bidirectional Encoder Representations from Transformers)
  • Site diversity system (doesn't allow any domain to have more than 2 SERP listings)
Thank you for that awesome recap!

Questions about these two.
BERT... what is that? Transformers? Never heard of that one.

Site diversity: The only exception is when there's a map pack in the SERPs, right? Because I believe I've seen SERPs where a website is in the local pack AND in the organic listings below it.
 
@zelch (I just read up on this to explain it; it's not something I know deeply), BERT is a language model that takes in the context of an entire sentence at once (bidirectional), as opposed to understanding a sentence sequentially from left to right. The goal is to understand grammar and syntax well enough that it can predict what a missing word would be based on the overall context. It's even trained to predict whether one sentence would plausibly follow another.

It's used in stuff like sentiment analysis (is this a positive review or negative?), finding answers to questions (featured snippets), NERD (named entity recognition and disambiguation - explained in the crash course here).

Seems to be really involved in not only relevance but evaluating the depth and/or breadth of content, possibly in detecting AI content, and so forth.
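To make the "bidirectional" part concrete, here's a toy Python sketch (my own illustration, not BERT itself — real BERT uses transformer attention over subword tokens) showing why context on both sides of a masked word helps prediction:

```python
from collections import Counter

# Toy corpus for scoring candidate fillers for a masked word.
corpus = (
    "the quick brown fox jumps over the lazy dog . "
    "the quick brown fox sleeps under the old tree . "
    "the quick red fox jumps over the lazy dog ."
).split()

def predict_masked(left, right=None):
    """Guess the word at a masked position from its neighbors.

    With only `left` (unidirectional, left-to-right), every word that has
    ever followed `left` is a candidate. Also matching `right`
    (bidirectional) narrows the guess considerably.
    """
    counts = Counter()
    for i in range(1, len(corpus) - 1):
        if corpus[i - 1] == left and (right is None or corpus[i + 1] == right):
            counts[corpus[i]] += 1
    return counts.most_common()

# Left-only context "fox ___" is ambiguous: jumps or sleeps?
print(predict_masked("fox"))
# Adding the right-hand word "over" disambiguates it.
print(predict_masked("fox", "over"))
```

Obviously BERT does this with learned representations rather than raw counts, but the intuition is the same: the words on both sides of a gap constrain what can fill it.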

With the Site Diversity stuff, I've seen results duplicated in the map pack and in the organic listings. I think this is fairly common. I've not seen a site featured in the map pack and then listed twice in the organic listings, though. But they definitely de-duplicate sites from the organic listings if they have the featured snippet.
 