What's your take on Google's Search Algorithm right now?

Ryuzaki:
It's flat-out driving me nuts lately. I hit up two long-time friends yesterday (@stackcash & @Bloghue) to bitch about it and they shared most of the same sentiments.

The past two months have been non-stop updates that seem to have made the problems they started last year even worse. I'm pretty good at diagnosing whatever has changed with each slew of updates, and I feel like I'm pretty close this time too, only this time it seems extremely nonsensical. They seem to be heavily over-weighting:
  • Freshness
  • RankBrain
Across a ton of SERPs I'm seeing a lot of nonsense and low-quality results. The problem is twofold. First, freshness seems to be given an absurdly disproportionate weight, to the point that I'm seeing brand new sites with very little domain 'authority' and almost no links to the page destroying a lot of us old-timers who have fairly recently updated content and tons of links on very strong domains. And the content is trash. Second, I'm seeing some sites getting locked into the top 3 that are the trashiest EMDs and PMDs (exact and partial match domains), and I'm guessing that's due to RankBrain. They had top 3, they're getting superior user metrics because they're in the top 3, and now they'll stay there for that reason. Some of these sites are crap you and I have gotten thin-content manual penalties for in the past. Just blatant middleman affiliate money grabs.
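To illustrate that feedback loop, here's a dumb little toy simulation (purely illustrative, nothing like what Google actually runs, and the CTR numbers are made up): if clicks get handed out mostly by position and the ranker feeds those clicks back in, whoever starts in the top 3 stays in the top 3 no matter how trashy the page is.

Code:
import random

# Toy model: 10 pages, a position-biased CTR curve, and a ranker that
# re-orders pages by accumulated clicks. Page quality never enters into it.
CTR_BY_POSITION = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.02, 0.02, 0.01]

pages = [{"id": i, "clicks": 0} for i in range(10)]
random.shuffle(pages)  # whoever happens to start on top

for day in range(90):
    for position, page in enumerate(pages):
        # clicks depend almost entirely on position, not on the page itself
        page["clicks"] += int(1000 * CTR_BY_POSITION[position] * random.uniform(0.8, 1.2))
    # "user metrics" re-rank: the pages with the most engagement float up
    pages.sort(key=lambda p: p["clicks"], reverse=True)

print([p["id"] for p in pages[:3]])  # almost always the same pages that started on top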

It seems like links mean very little these days and even hurt you in some cases. Anchor text seems to mean very little. Age means little. Even domain authority is getting usurped by freshness. This could just be my vertical; I don't analyze others that hard, and I don't do a lot of Google searching otherwise in my free time.

I hate to say it but churning and burning looks like it has a real shot at coming back. I'm guessing the reason we're seeing so many core updates lately is because they know they've goofed and are trying to sort it out.

Google also seems to be doing stupid stuff like giving illogical preference to random sites and letting them skyrocket for no clear reasons, to the point where we'd be better off building in batches rather than focusing on one or two authority sites. Like... if you wanted to build an authority site in X niche, you're better off building 10 of them and seeing which one Google magically prefers and then dropping the others and focusing on that one.

There are a lot of valid thoughts out there that seem to be true, but then they're contradicted by ridiculousness like brand new sites hitting the top 3 in record time over long-term authority sites. I was giving a lot of thought to what @eliquid said about Google validating you by letting you have ultra long-tail traffic, then unleashing you on the next tier up with a bit higher search volume, and so on up the tiers. I still think that's true to a degree, but I'm also seeing the algorithm do some really stupid stuff. It's not necessarily a contradiction either; it could be an anomaly, but how long can this go on? I've been watching this get worse for at least a year, if not a bit longer.

Do you guys have any thoughts on what's going on with the algo right now? And in your own personal usage of Google, do you feel like the quality of the results has taken a drop or not?

My main project has been an uphill battle, whereas all of my typical white hat work - quality builds, content, and speed optimization - was skyrocketing me before. I'm holding the same overall traffic levels while losing more and more of the valuable traffic. It's driving me bonkers, especially because it's starting to look like factors outside our control are at play now, ones that seem outside Google's control at the moment too.
 
I agree on the freshness bit. I don't really agree on the new-domain bit; I'm seeing old, established domains rank for anything they want. Another gripe I have is the snippets plus a ton of other fluff (reviews and other metadata) that Google is showing that has no relevance to the search query but pushes down the actual ranking sites.
 
I just found and read this article by SearchEngineJournal that backs up nearly all of my suspicions, especially in explaining why my bros and I have been seeing the crappiest sites ranking in the top positions.

The basics are that Google has rolled out and is tweaking their new Neural Matching Algorithm. All of the traditional ranking factors are still at play and produce a ranking of posts for the search query, but that set is then re-ranked with zero emphasis on links, based only on an AI's understanding of relevance that isn't derived entirely from the query itself.

That absolutely explains what I'm seeing. Freshness gets some crap site and one of its posts into the top XX, which earns it a spot in the Neural Matching Algorithm's re-ranking of the qualified posts. And occasionally it decides the bad pages are worth being at the top.
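If that's roughly right, the mechanics would look something like this two-stage sketch. To be clear, this is my guess at the shape of it, not Google's actual code - score_traditional() and semantic_relevance() are stand-ins for signals we can't see, and the weights are made up. The point is only that stage two re-orders stage one's candidates without ever looking at links.

Code:
def score_traditional(page, query):
    # links, anchors, on-page, freshness, etc. - placeholder weights,
    # with freshness cranked up the way it seems to be right now
    return (2.0 * page["link_score"]
            + 1.0 * page["onpage_score"]
            + 3.0 * page["freshness"])

def semantic_relevance(page, query):
    # stand-in for a neural relevance model; here it's just term overlap
    query_terms = set(query.lower().split())
    page_terms = set(page["text"].lower().split())
    return len(query_terms & page_terms) / max(len(query_terms), 1)

def rank(pages, query, candidates=20):
    stage1 = sorted(pages, key=lambda p: score_traditional(p, query), reverse=True)
    shortlist = stage1[:candidates]  # freshness can buy a crap page a seat here
    # stage 2: re-rank the shortlist on modeled relevance alone - no links
    return sorted(shortlist, key=lambda p: semantic_relevance(p, query), reverse=True)

Once a page is in that shortlist, its link profile has already done all the work it's ever going to do.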

The point of this new algorithm is to try to understand context better using synonyms and n-grams. An example I mentioned before is to search "What's the movie with the guy in the bunny suit?" and see what happens.
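For the synonyms and n-grams part, here's a crude idea of what query expansion could look like. The synonym map is invented for the example, and the real thing is obviously a learned model rather than a lookup table.

Code:
SYNONYMS = {"movie": {"film", "flick"}, "guy": {"man", "character"}, "bunny": {"rabbit"}}

def ngrams(text, n=2):
    # break text into overlapping two-word phrases
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def expand(query):
    # add synonyms so pages that never use the query's exact words can still match
    terms = set(query.lower().split())
    for term in list(terms):
        terms |= SYNONYMS.get(term, set())
    return terms

print(expand("movie with the guy in the bunny suit"))
print(ngrams("the movie with the guy in the bunny suit"))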

To me, this appears to be the culmination of the Caffeine update (freshness) and Hummingbird & RankBrain (relationship vectors for queries, which is why the top 3 tend to get locked into place: they become a central node in this relationship building).

If you ask me, this new AI algo either hasn't dealt with enough data to shake itself out yet, or it needs to go back to the drawing board for a while. I think Caffeine & RankBrain are confusing it.

Something else to consider is Google reported a 30%+ YoY increase in ad clicks in the SERPs. I doubt they increased their traffic that much. Discombobulation and lower quality results may be temporarily pushing users to the ads and to refine their queries. But over the long haul this is a bad move because users will find an alternative altogether. Who knows if this plays a role in what's happening. Mayhap it is, mayhap it ain't.
 
I guess this explains many of the inconsistencies regarding traffic vs. rankings vs. earnings. Regular rankings aren't as important now; it used to be that if you ranked top 3 for a keyword, you could reliably count on a multiplier in long-tail traffic. With this, you can rank top 3 and get little long tail, or rank #30 and get a lot of long tail, all based on intent and contextual relevance.

Do I get it correctly?
 

I'd say so. But I wouldn't call them regular rankings, as this may be the new regular. What you just described is the new normal for me.
 
So @GarrettGraff hit me up and brought up a good point.

In July, Google updated their Search Quality Raters Guidelines to have heavy emphasis on "Content Creators." They're told to look for the content creator on the page and then try to search the net and see what other sites have to say about them. If the reputation is bad, then the page gets a low quality score, even if the content is amazing. So what happens if they can't find an author at all?

It says the "publication" itself matters, in terms of who controls it, but they want to know the creator for every page too. This all makes sense if you recall the few years when they showed authors' faces in the SERPs. Even though they removed that from the SERPs, I'd assume they still use the data.

They took the author stuff out of the SERPs because they were keeping some kind of Author Rank on each author, and it created an imbalance since the markup didn't get adopted widely enough. But that doesn't mean they can't still assign a score to pages based on whether an author exists or not.

Yesterday I looked at a lot of sites in my niche. Almost all of them had no authors being displayed, and almost all of them took a hit during this same time period I'm crying about. Some survived without authors too but that's always the case with these updates.

Any thoughts on this?
 
I wish it were that simple. I am convinced that there is some sort of author rank - convinced enough to use the same name on my guest posts - but it's just one piece of a large puzzle. I'd be very surprised if this alone was responsible for the recent changes in the SERPs.

I see different problems on every site I analyze, so saying any one thing is responsible seems like grasping at straws to me.
 
Tough luck if your name is John Smith.

It probably doesn't hurt to have author bios with links to LinkedIn, Google Plus etc, which then have links to guest posts elsewhere under the same name. I don't know, sometimes it feels like it would be difficult to make such huge calculations and connections even for Google.
 
@bernard, Yeah, Google realized, according to the stuff I read, that only certain authors matter. They can't keep tabs on everyone.

What I'm saying, with regard to author trust and quality ratings, is that simply having any author at all is now better than having none. It doesn't matter who, though there's probably a boost, as you say, if there's a LinkedIn, Twitter, etc., where it can easily be seen that they're legit. And there needs to be some form of author biography listing credentials and qualifications.
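If anyone wants to test this in a way machines can read rather than just a visible author box, schema.org Article/Person markup with sameAs links is the obvious route. Rough sketch below - the person, URLs, and credentials are placeholders, and whether Google actually rewards any of this is exactly what we'd be testing.

Code:
import json

# Hypothetical author markup for one article page, output as JSON-LD.
# The name, bio, and profile URLs below are made-up placeholders.
author_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example Post Title",
    "author": {
        "@type": "Person",
        "name": "Jane Doe",
        "description": "Registered dietitian with 10 years of clinical experience.",
        "sameAs": [
            "https://www.linkedin.com/in/example-jane-doe",
            "https://twitter.com/example_janedoe",
        ],
    },
}

print('<script type="application/ld+json">')
print(json.dumps(author_markup, indent=2))
print("</script>")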

@Calamari, I agree and I'm not saying this is the one and only thing. I think the Neural Matching Algorithm is a thing for sure. But if authorship is a thing, it's a thing within our control.

The problem here is we won't know unless some of us test it and even then we'll probably have to wait for the next big core update related to authorship to see if we magically bounce back (if we strike early enough while they're re-crawling for it). I'm going to test it.
 
I'm on this already, but since I'm using personas until next year, that makes it less legit. I'll definitely use G+ though, share my guest posts, interlink my LinkedIn, and stuff like that.

I'm thinking of buying PersonaName.com and doing a quick little blog there as well.
 
@bernard, Google removed all of the authorship markup related to Google+, and is actually getting ready to shut Google+ down and convert it to a kind of "everyone in our office chats on it internally" style of project. Heads up there, it might save you some time.
 
For what it's worth, my site took a minor hit back in August (I guess during the 'Medic' update?) with traffic declining 5 to 10% until the end of September, when it bounced back and then some. Compared to that lull, traffic is now up 33% with the recent algorithm changes, and I haven't done anything on my part other than post new articles and update some of the older content. This is on a 3-year-old domain.

I also don't utilize any sort of author persona. Every post is anonymous and it doesn't seem to make a difference in my case.

This makes me think they're evaluating on-page content differently and placing more weight on it compared to other factors (like links).
 
I'll let you in on a little secret.

I've been doing what you said in the original post - "you're better off building 10 of them and seeing which one Google magically prefers and then dropping the others and focusing on that one" - with one twist.

I don't drop the other sites. You never know when one of those will be the "preferred" mix of ingredients in Google's never-ending thirst for algo change. Could be in 2 months, could be in 24 months. You'd be surprised (unless you already know this) at what comes back up from the dead sometimes.

Plus, I like trying to take over the niche, instead of just a ranking.

Hope that helps

 
It's frustrating, to say the least, mainly because these last few months of updates haven't made any sense at all.

Crappy sites and articles beating some really well-optimized and knowledgeable sites and articles. Forums sitting in the top results for important keywords.

On some keywords there are even sites in the top 10 that look like they're from the '90s. They even have those animated GIFs in the sidebar. And they beat heavily researched and well-designed websites! Crazy.

The poor results are reflected in my Google Analytics as well; basically all of my sites' bounce rates have gone way, way up. One site that used to have a 50-60% bounce rate is now around 92%.

Another strange thing is that they have excluded a lot of websites. They limit the results to 4-6 pages and then write at the bottom of the page:

"In order to get the most relevant results we have omitted some results that are similar to the 90 already shown.
If you want, you can redo the search and include the omitted results."

As an SEO I am not happy with this, but I'm still fighting, of course. As a consumer of Google, I am disappointed that they are giving me crappy results nowadays.
 
Just a side note: I think some of the SEO rhetoric out there is a bit biased regarding this. For example, a lot of SEO folks equate "quality" with a high word count - they strive for word count over substance or user interest. I believe we are seeing Google pull back on these quantitative metrics as it develops a deeper understanding of content.

For example, you don't need a 10,000-word article on how to apply toothpaste to brush your teeth, but if word count is all you focus on, then you'll regard long content as "quality". I recently saw another person mention they added 100,000 words of content to their website in a month... Ehhh, that's the equivalent of a novel. I talk about this here: Which one is best for seo and traffic purposes

What's the point? 100,000 words of content is approximately 250 pages of a book.

Think about that for a moment, 250 pages.

To put that into perspective the King James Bible is about 783,000 words.

"A Game of Thrones" the first book in the series is 295,000 words.

Lord of the Rings: "The Fellowship of the Ring" (first book) is 186,000 words.

Stephen King's "The Gunslinger" is 55,000 words.

The book of Genesis (Bible) is 32,000 words.

To add 100,000 words of content to a website - most of which users would consider fluff - doesn't do you or your website visitors any service. I've looked up simple recipes and found web pages with 1,000-word intros about a childhood spent eating apple pies before I could scroll far enough down to find the damn pie recipe. You don't think Google can detect fluff?

So ask yourself: do you equate "quality" with "more words"? If so, isn't Google more likely to figure it out and make adjustments?

Google's end goal is to satisfy its users. Does your 10,000-word article on a question that could be answered in 250 words help the end user? If not, won't Google eventually penalize you for fluff?

I believe you'll have an easier time simply trying to convey a topic or subject within an article that helps your website visitors out, rather than shooting for an arbitrary word count, LSI terms, or whatever the latest metric the cool kids are using.
 
It was me who added 100,000 words, but that's for the entire site, over its entire lifetime :D

I do agree that Google has dialed back the "content boost" this time around. Not just length, though - content in general - opting instead for other authority factors like the trust and authority Ryu talks about, and if those are machine-learned, then who knows what patterns they're looking at.
 
It was me who added 100,000 words, but that's for the entire site, over its entire lifetime :D

Probably was - however, MULTIPLE people have this idea that exactly 100,000 words of new content should go up on their site in a month; even in the thread where I posted the original comment, that was the OP's goal. All one can do after so many years of seeing the same mistakes is shake your head and move on... until Google drops one of these algos that "adjusts the environment" - then it's time to speak.

At the end of the day, no matter how we feel personally about Google and their past behavior and mistakes, Google's goal is to satisfy its users. So if you skip all the quantitative measurements and steps and instead think of content ideas that help your users, you'll be aligning yourself with Google's interests - eventually.
 
Yes, I think you're right, but honestly, part of aligning with users is moving away from text toward other content types. No one wants to read 2,000 words on bedding; they want a video. They don't want to read a formula for VO2max if they can use a calculator. That's where I'm going with my fitness equipment site. I made a nice little "hands-on guarantee" badge, and we'll do videos and real live product reviews.
This is intimidating to a lot of people, I guess (me included).
 
It's very frustrating and annoying what Google does sometimes.

The recent G updates have focused on site quality and trust signals, although it's messed up at the moment.

I think we could learn a thing or 2 from 10beasts dot com.

It's baffling how, in spite of all the G updates in the last 2 years, they've been ranking for high-competition tech keywords despite clearly being nothing but a cookie grabber.

None of the G updates affected them. Are we missing something here?
 
For example, you don't need a 10,000-word article on how to apply toothpaste to brush your teeth, but if word count is all you focus on, then you'll regard long content as "quality".

In school we were required to write papers of specific word counts. As customers of writing services, we often have to specify how many words we would like on a topic. Writers are commonly paid by the word.

Never were we asked to cover a topic in as many words as necessary, to place an order for a well researched article that covers the topic in a concise, thorough manner, or to pay writers solely based on the quality of their research and final product, not the length.

Not to say this is a direct solution to the SEO issues raised, but in general there is plenty of conditioning that the average person endures that acts directly against logic. This provides a huge opportunity for those who are willing to do the work to ask 'why', to think for themselves and to test their own ideas to success.
 
If not, won't Google eventually penalize you for fluff?

That's an interesting thought. They've already targeted this kind of thing in the past, like the Farmer update that killed eZineArticles and the like. Then they targeted doorway pages too.

I think people tend to forget that the "thin content" penalty isn't just for short posts with no images and crap like that. It's specifically for not adding value.

There have been times where I've targeted keywords and not taken the time to check what was ranking in the SERPs, because I knew what the intent was. I matched the intent and have managed to rank in the top 5 now, but what I didn't match (and I paid for it by not ranking for months upon months) was the "style of solution." It turned out Google was favoring pages with 150-200 words instead of my 3,000. It was showing results with a definition and a paragraph about the phrase.

How do they determine this stuff? User metrics and pogo-sticking. But being different did end up sticking me in the top 5. Mine is the only post in there that goes into full depth on the topic, and I'm willing to bet that eventually flips the "style of solution" on that SERP as more people follow suit.
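As a side note, here's the simplest version I can picture of how a pogo-stick signal could even be computed from click logs. This is just my guess at the concept - the log below is invented, and Google's actual signals are anyone's guess.

Code:
# Toy pogo-stick rate: share of clicks on a result where the searcher came
# back to the SERP quickly and clicked something else instead.
clicks = [
    {"result": "example.com/post", "returned_after_s": 8,    "clicked_another": True},
    {"result": "example.com/post", "returned_after_s": None, "clicked_another": False},
    {"result": "example.com/post", "returned_after_s": 240,  "clicked_another": False},
]

def pogo_rate(log, threshold_s=30):
    pogos = sum(1 for c in log
                if c["returned_after_s"] is not None
                and c["returned_after_s"] <= threshold_s
                and c["clicked_another"])
    return pogos / len(log)

print(round(pogo_rate(clicks), 2))  # 0.33 - one quick bounce-and-requery out of three clicks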

One thing my main project is getting a reputation for is that it's casual but scholarly and in-depth. Every post goes hard to cover its topic completely and with real insight you don't get on the typical sites built by SEOs. That's adding value.

But if you're tossing up 10,000 words of nonsense just to have the longest content, that will be reflected in your user metrics and SERP metrics. Google is using that data to sort you out of the stack as it is. Will they eventually come up with some kind of "fluff" penalty? I could see it, and I could see it being page-by-page and eventually a domain-wide penalty too, especially as all of these AI algorithms come online with more and more natural language processing and neural networks.

Yes, I think you're right, but honestly, part of aligning with users is moving away from text toward other content types. No one wants to read 2,000 words on bedding; they want a video. They don't want to read a formula for VO2max if they can use a calculator. That's where I'm going with my fitness equipment site. I made a nice little "hands-on guarantee" badge, and we'll do videos and real live product reviews.
This is intimidating to a lot of people, I guess (me included).

This kind of approach is like what I've done. I could be miles down the road by now, but I go balls to the wall on every post. Ultimately, once you get bigger exposure, this kind of effort creates a reputation for you. I think it's the right path, and like CCarter always mentions, it's the kind of path that big money doesn't have the flexibility or patience to follow. It's a lot like the Backlinko site. Rather than pushing out a million posts, he keeps revamping his core posts to make them better and better.

I don't know that one or the other is better, but I do know that the kind of posts that really match user intent, rather than worrying about bots and spiders, make for ridiculously powerful link bait, which can then be pushed around the rest of the site to eat up a ton of search terms.

It's baffling how, in spite of all the G updates in the last 2 years, they've been ranking for high-competition tech keywords despite clearly being nothing but a cookie grabber.

No kidding. The new owner seems to have fixed all the craziness of the initial site. They got it unpenalized (they bought it and immediately were penalized, not a big surprise there). Not only have they cleaned up the link profile but it looks like the site was redesigned, the content re-written, and the presentation is pretty damn good at this point. Kind of seems like they slid into the top slots dirty and then cleaned up their act and managed to stay. It's definitely an interesting case study.
 
I track about 100 different websites in many different niches. Here are a couple observations that glaringly stick out to me. A lot of this is nothing new, but as you guys know they can turn the dials up and down on ranking signals at any given time.

1) When pointing links to a page, I'm seeing keywords lose rankings almost universally across all sets I am tracking. Seems as though they have put in a filter to decrease rankings with initial links (even one link), most likely to combat BH SEO. I've been seeing them do this for quite a few years, but what has changed, at least from what I am tracking, is the time it takes for them to pass the juice and increase rankings. That time frame seems to be somewhere in the 2-4 month range at this point (I think a lot depends on keyword competition and the type of keywords). Take into consideration that I use a lot of the same type of links (contextual, relevant, high quality), so this could be different with some diversity. I don't isolate the link building from the on-page changes, which tend to be done at the same time. On-page changes seem to take way less time to react and generally don't move in the opposite direction like links do. I think my next test should be to point some links at an older page, which has already been audited/optimized. (I've sketched a rough way to measure this after #2 below.)

2) Dated posts seem to be fluctuating wildly, especially if there are multiple articles from the same source. Also, the newer the post, the crazier it dances (top 3 to nowhere, to #55, back to #3). This obviously has to do with the freshness algo, but man, can it get frustrating when you're doing a suppression job and the shit won't anchor.
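If I wanted to put actual numbers on the delay in #1, the simplest check I can think of is comparing median tracked rank in windows before and after a link goes live. Rough sketch with made-up data - the window sizes just mirror the 2-4 month lag I described:

Code:
from statistics import median

# rank_history maps days-since-link (negative = before the link landed) to tracked position
rank_history = {-30: 18, -15: 17, 0: 18, 10: 24, 30: 26, 55: 23, 70: 15, 100: 12, 110: 11}

def window_median(history, lo, hi):
    vals = [rank for day, rank in history.items() if lo <= day < hi]
    return median(vals) if vals else None

print("before:  ", window_median(rank_history, -60, 0))    # 17.5
print("0-60d:   ", window_median(rank_history, 0, 60))     # 23.5 - the initial dip
print("60-120d: ", window_median(rank_history, 60, 120))   # 12.0 - the eventual lift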
 
It seems like links mean very little these days and even hurt you in some cases.

When pointing links to a page, I'm seeing keywords lose rankings almost universally across all sets I am tracking. Seems as though they have put in a filter to decrease rankings with initial links (even one link), most likely to combat BH SEO.

Yes! Links have always been able to hurt us in the form of a negative bounce meant to confuse spammers and get them to out themselves. But something different is definitely going on. It doesn't seem random anymore; it seems like a guaranteed negative reaction now, one that's not promised to eventually show a benefit, even after months. It looks like the algo is currently very skeptical of any page that attracts links, which goes back to me saying I'm seeing a lot of trash ranking right now from newer sites.

I'm guessing this will be rolled back, because without a doubt the SERPs have to be suffering in quality. But something is driving that 30% ad revenue growth this year, and that's likely what it is. A balance will be struck and we'll have to get back to sanity eventually.
 
I'm guessing this will be rolled back, because without a doubt the SERPs have to be suffering in quality. But something is driving that 30% ad revenue growth this year, and that's likely what it is. A balance will be struck and we'll have to get back to sanity eventually.

I don't like to be a conspiracy type guy, but I always try to follow the money. Google needs to consistently improve revenue (higher stock price). Knowing that Adwords is one of their main revenue sources, it never surprises me that they manipulate the SERPs, ad space, ad width/height, coloring/shading, etc... Not sure what exactly would be driving such high click rates to paid ads (can you link to your source?), but Tin Foil Hat John wouldn't be surprised if it has something to do with a cash grab. Rational John would agree though, it's probably a problem with the algo.

Here's a perfect example of what I saw 4-5 years ago in the real estate space. Google cleared out all of the local realtors' websites in favor of the top aggregator websites like Zillow, Realtor.com, etc. I know a lot of guys who lost almost all of their organic traffic and were forced into paid. I've seen this in quite a few niches - either wiping out the SERPs or recycling them. The question is, was this due to them weighting authority more heavily, or a cash grab? Either way, it makes for an interesting discussion.
 
Merkle put out their Q3 2018 report with the following information, which can be cross-referenced with other sites:

[Chart from Merkle's Q3 2018 paid search report]


I'm looking semi-stupid right about now, though, and that's what I get for shooting shit off the cuff. They did have a massive ad spend spike since this time last year that's been maintained and growing each quarter (this is year-over-year data), but why? My theory doesn't explain it, because clicks are only up maybe 3-4%, so the rest has to be explained by bidding pressure. I know they dropped exact match bidding at some point recently.
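Quick back-of-the-napkin on that, since spend is just clicks times CPC: if the ~30% growth figure being thrown around in this thread applies to spend/revenue while clicks are only up ~4%, the implied CPC growth is around 25%. These are the thread's rough numbers, not exact figures from the report.

Code:
# spend = clicks x CPC, so the gap between spend growth and click growth
# implies the CPC growth. Numbers are the rough ones from this thread.
spend_growth = 1.30   # ~30% YoY increase in spend
click_growth = 1.04   # clicks up only ~3-4%

implied_cpc_growth = spend_growth / click_growth - 1
print(f"Implied CPC growth: {implied_cpc_growth:.1%}")  # ~25%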

Anyways, they've definitely changed something about the response to new backlinks; that's something we've both noticed. I'm at a loss to explain it, unless it has to do with something simple that a lot of people skipped, like author boxes, suddenly being weighted a bit too high. I'm going to dig into the SERPs again, find examples of stuff ranking that doesn't traditionally deserve it, and start cross-referencing what they're doing versus what the old-school good sites didn't do. It'll pop out at us eventually.
 