Re-re-reclassification

I've been reading this post on BuSo, and it got me thinking.... https://www.buildersociety.com/threads/reclassifying-sites.7280/

My current task is to recover a 10-year-old ecommerce website. It's DR 38, yet most of the websites ranking higher for the queries I care about have a lower DR (some are complete garbage). My website used to do much better in the SERPs.

The differences I could pinpoint are:

1) most other websites are on Shopify, and mine isn't. Does Google favor Shopify stores nowadays?
2) mine has a blog with affiliate links

Did I ruin my store with an affiliate blog?
If so, which would be the best course of action:

1) relocate just the affiliate content to a new domain
2) move the entire blog from the /blog/ directory to a subdomain? (see the redirect sketch below)
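Whichever way I go, I know every old /blog/ URL would need a 301 to its new home so existing links don't break. Here's roughly the mapping I'd implement; the hostnames are placeholders, and in production the redirects would live in the server or CDN config, not app code:

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder hostnames, for illustration only.
OLD_HOST = "www.example.com"
NEW_HOST = "blog.example.com"

def blog_redirect_target(url: str) -> str | None:
    """Return the subdomain URL an old /blog/ URL should 301 to, else None."""
    parts = urlsplit(url)
    if parts.hostname != OLD_HOST or not parts.path.startswith("/blog/"):
        return None  # not an old blog URL, nothing to redirect
    # Drop the /blog prefix so posts live at the subdomain root.
    new_path = parts.path[len("/blog"):]
    return urlunsplit((parts.scheme, NEW_HOST, new_path, parts.query, ""))

print(blog_redirect_target("https://www.example.com/blog/best-widgets?page=2"))
# https://blog.example.com/best-widgets?page=2
```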

There is of course another option I'm already working on... Creating more stores! I feel like there are 10 new websites popping up every day in my niche, and they are just crappy. I can't believe they get any business, but who knows.
 
Dude, I registered an account here just to comment on that thread, but since I don't have 3 likes, I'll write my opinion here.
First of all, I met Mueller himself in person at Brighton SEO last fall. He specifically said that the problem is people who think that if they just make X more sites, they'll reach a financial goal. That's not what he wants, nor what Google wants. Don't do that. You're trying to poison Google and litter the Internet.
Secondly, IMO, LLMs have allowed Google to understand search intent and the meaning of a website very well. It doesn't know whether your site runs Shopify, Magento, or another e-commerce platform, and it doesn't need to. The issue is not whether your site is relevant to the query; it can figure out relevancy easily with LLMs.
The issue is quality. What Google is looking at is user behavior data from Chrome, the SERPs, Android devices, and so forth. If humans visit your site and it gives them a bad impression, and your time on site, bounce rate, pages per session, etc. are all bad, Google will know and assign your site a lower quality score than your competition's. This quality score is domain-wide and is the average of the last 13 months of user data.
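To put numbers on what I mean, here's a toy version of that domain-wide score: a trailing average over the last 13 monthly engagement snapshots. The metric names, weights, and caps are all my own invention for illustration; nobody outside Google knows whether such a formula exists:

```python
from statistics import fmean

def month_score(m: dict) -> float:
    """Blend one month's engagement signals into a 0..1 score (higher is better).
    Weights and caps are invented for illustration."""
    return (
        0.5 * min(m["avg_time_on_site_s"] / 180, 1.0)  # cap credit at 3 minutes
        + 0.3 * (1.0 - m["bounce_rate"])               # lower bounce is better
        + 0.2 * min(m["pages_per_session"] / 5, 1.0)   # cap credit at 5 pages
    )

def domain_quality(months: list[dict]) -> float:
    """Domain-wide score: average of the last 13 monthly scores."""
    return fmean(month_score(m) for m in months[-13:])

# One hypothetical monthly snapshot; a real series would hold 13 or more.
history = [{"avg_time_on_site_s": 95, "bounce_rate": 0.62, "pages_per_session": 2.1}]
print(round(domain_quality(history), 3))
```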
So, no, I don't think there are classifiers anywhere in Google. I just think your competitors have higher quality sites, and the user data shows it. Improve your PLPs, PDPs, and blog articles so that people stay on your site. That's all. No need to jump through so many technical hoops when it's all about good writing, descriptive media, and a helpful website.
You can try fooling Google all you want, but you can't fool your customers, readers, and people. People know spam when they see it.
 
I hope you eventually do something productive with those balls you've got, dude. You have no right to call my well-established, US-located physical business spam. Also, you totally missed the point, and I don't know what I did to trigger you so much. Go to the post I referenced in my opening comment and tell the guy who started a new website to outrank his old one that he's polluting the Internet.
 
I apologize if you're offended, sir, but I was only quoting John Mueller's comment from when I discussed this with him in person. He's the face of Google's webmaster hangouts, and he and Google are the ones you're having trouble with. This thread is about your problem with them, and I think you're still going to have a problem with them given your stated plan.
I don't know what you should do and can't help you.
 
I do believe Google has some classifiers; just look at the sites that were shadowbanned after March 23. Their traffic declined to nothing. If quality were the only thing that mattered, there's no way a handful of really good sites would get shadowbanned right after one specific algo update.

I have to point out that just because Google has LLMs doesn't mean it uses an LLM to understand your site. LLMs are expensive; Google cannot run one over every piece of every website. There must be some cheap way to understand a site in its early stage.

Don't tell me that every time a fresh site comes out, Google runs an LLM over it even when the site has no real user data. That would be expensive.

G may only bring in an LLM once your site has enough authority or user data that it already knows the site/page is worth understanding better.
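What I'm picturing is a tiered pipeline: near-free signals on every new site, and the expensive LLM pass only for sites that earn it. A speculative sketch; every field name and threshold below is invented:

```python
# Speculative sketch of a tiered evaluation pipeline. This is a guess at the
# shape of the idea, not Google's actual system.

def cheap_signals(site: dict) -> float:
    """Stage 1: heuristics that cost almost nothing at crawl time."""
    score = 0.0
    score += 0.4 if site["indexable_pages"] > 20 else 0.0
    score += 0.3 if site["referring_domains"] > 10 else 0.0
    score += 0.3 if site["age_days"] > 180 else 0.0
    return score

def worth_llm_pass(site: dict) -> bool:
    """Stage 2 gate: only spend LLM compute once a site shows real traction."""
    return cheap_signals(site) >= 0.6 and site["monthly_visits"] > 1_000

fresh_site = {"indexable_pages": 8, "referring_domains": 2,
              "age_days": 30, "monthly_visits": 50}
print(worth_llm_pass(fresh_site))  # False: stays in the cheap tier for now
```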
 