Penalty Time... and what to do next

So after making my thread about 2019 goals it seems that I was too slow in sacrificing a virgin or two to the SEO gods.

One of my sites got smashed by what seems to be some sort of algorithm update on 17th January.

Pre-smash organic traffic was cruising at around 45k sessions per day. Today as I type this we are sitting at 14k sessions per day: roughly a 69% decline. Ouch.

The site covers a general, broad topic and has nothing to do with Your Money or Your Life (YMYL) type content.

Some major kws have simply dropped from page 1 to position 100+.

I have had a well-known SEO pro look into it, and the analysis has been inconclusive to date. I had built some links (like 10-15 links) a year ago via guest posts, but that was it.

In recent months the site has attracted a ton of spam links. However, all remained OK until I created a disavow file and nuked them all (Aug '18). This seemed to be the turning point for a slow decline, which gathered pace until 17th January, when the site got smashed.

I have since deleted the whole disavow file.

I am still ranking in the number 1 spot for tons of kws, so the penalty does not appear to be sitewide, and there is no manual action listed in GSC.

My guess at this point is that the 10 main high-traffic kws which I have tanked for were over-optimized (on page), so I am in the process of rewriting and reformatting.

Anyone have any thoughts or questions?
 
In recent months the site has attracted a ton of spam links. However, all remained OK until I created a disavow file and nuked them all (Aug '18). This seemed to be the turning point for a slow decline, which gathered pace until 17th January, when the site got smashed.

I'm not convinced that the disavow file is what started the slow decline unless you were overly aggressive about what you disavowed.

Sounds more like a general quality problem to me. You can still rank in the #1 slot for lots of keywords even if your site is suffering from an algo penalty. Sometimes the algo penalties are a few different compounding problems, and other times it's one overarching problem.

I do think that deleting the disavow was a mistake, but if the tide starts to turn in the next few weeks and you did nothing else other than delete the file, I think that would be a very interesting thing to note.
 
I was super aggressive and disavowed like 95% of all domains pointing at my site.

Now that I have rolled it back to re-avow them we will see what happens... It can't really get worse lol.

On the 10 main kws - do you think I should rework the content on those existing URLs, or start fresh URLs on the site for those kws and 301 the old over-optimized URLs to the new?

It's not a competitive niche by any means, so I think on-site is the key to restoring the site to its former glories, since that was what got it there in the first place, before the SEO winds of change blew.
 
On the 10 main kws - do you think I should rework the content on those existing URLs, or start fresh URLs on the site for those kws and 301 the old over-optimized URLs to the new?

301'ing is fashionable, but in most cases it's not necessary at all. I would gently rework to de-optimize if they are over-optimized, with the intent of increasing the amount of time your visitors spend on the page in general.
 
Thanks @Calamari

It still leaves me amazed that there is not a tool that can analyse the top 10 SERP results and spit out kw densities and other key factors. I remember WebCEO could do this like 10 years ago lol.
 
Check your indexation with a site:domain.com search and see if the numbers align with your number of posts plus pages plus archives with pagination. Alternatively, you can look in Search Console if you have that set up.
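If you want to script the "expected" side of that comparison, here's a minimal Python sketch that counts the URLs in a sitemap (assuming a standard sitemap.xml; the domain is a placeholder). Compare the total against the site: search count:

```python
# Count the URLs a site *expects* to have indexed, per its sitemap.
# Hypothetical domain - swap in your own. Compare the result against
# a manual site:yourdomain.com search or the GSC Coverage report.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(url):
    with urllib.request.urlopen(url) as resp:
        root = ET.parse(resp).getroot()
    # A sitemap index nests child sitemaps; recurse into each one.
    if root.tag.endswith("sitemapindex"):
        return sum(count_sitemap_urls(loc.text)
                   for loc in root.findall("sm:sitemap/sm:loc", NS))
    return len(root.findall("sm:url", NS))

print("URLs in sitemap:", count_sitemap_urls(SITEMAP_URL))
```

A big gap in either direction (way more indexed than expected, or way less) is the kind of indexation signal I'm talking about.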

I'm positive that these recent biweekly August-to-January updates had a Panda aspect to them.

The new Search Console "Search Analytics" dashboard or whatever it's called has a killer Coverage report (I wrote about it there at length) that'll show you any severe indexation problems.

If you find that's an issue, please report back to help me confirm my thoughts and findings.
 
It still leaves me amazed that there is not a tool that can analyse the top 10 SERP results and spit out kw densities and other key factors. I remember WebCEO could do this like 10 years ago lol.

There are some broad-brush thresholds that I use as a rule when I tweak on-page SEO. If you want me to walk you through some of the specifics of my on-page process, feel free to PM me.
 
@Ryuzaki - I checked this out and here are the stats -

Pages / Posts on Site: 269
Site: Domain search results: 280

You can see the GSC results here -
[screenshot: GSC Coverage report]


Doesn't look like anything strange going on with errors etc.

@Calamari thanks dude - PM sent.
 
The internet is full of stories just like yours, @MrMedia. Lots of people getting whacked for no clear reason. Google themselves said there was nothing specific any site did wrong or can fix; it's just how the new algorithm shook out. Everyone thinks different things because so many types of sites got hit that had every type of problem.

I get why @Ryuzaki said it's Panda, because he's fixed or improved every possible issue but a Panda one and is still having problems. I read a lot of posts about the updates since August and saw a lot of sites with the same indexation problems being mentioned. But there's the E-A-T talk and other attempts to squeeze the discussion down to one core issue. I think this is one of the rare times Google is telling the truth. They changed so much at once, and probably none of it was targeted hard. But changing 100 ranking variables at once is going to cause some big shake-ups.

Fixing and improving everything and anything is probably the right approach. My sites did the up-and-down squirrelly dance but eventually leveled out right near where they were. But reading all of the horror stories made me take an objective look at my sites, and I see a lot of places I can improve, if only for the users' sake.

You mentioned your site covers a broad topic. That could be "the big thing" for you. Google said this update did have a big "relevancy" aspect to it.

How old is this site?
 
From my understanding, it analyses the SERPs and tells you what you are deficient in - not what the averages are.

I am happy to be corrected on this point if I am talking BS.
 
top 10 SERP results and spit out kw densities and other key factors.

Curious, what other factors are you looking for besides kw density? (I'm thinking people would want both the average and the standard deviation of each metric.) Off the top of my head: title and meta tags, header tags (h1-h6), total word count, kw density, a word cloud of the top 10 pages' content, the number of outbound and internal links, alt tags on images, and a brief analysis of the paragraph content? Perhaps the sentiment of the top 10's content?
 
Curious, what other factors are you looking for besides kw density? (I'm thinking people would want both the average and the standard deviation of each metric.) Off the top of my head: title and meta tags, header tags (h1-h6), total word count, kw density, a word cloud of the top 10 pages' content, the number of outbound and internal links, alt tags on images, and a brief analysis of the paragraph content? Perhaps the sentiment of the top 10's content?


All of the factors you mentioned. I'm just using kw density as a baseline factor that seems to be missing from most tools. SEMrush can do some of it, Ahrefs some of it... It sure would make things easier to have it all in one report.
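To make it concrete, here's a rough Python sketch of the kind of report I mean - mean and standard deviation of a few on-page metrics across the ranking pages. The URL list is a placeholder and the tag-stripping is deliberately crude; this is a sketch of the idea, not a real tool:

```python
# Rough sketch: average a few on-page metrics across a set of
# top-ranking pages. Not production code - the HTML handling is crude.
import re
import statistics
import urllib.request

def page_metrics(url, keyword):
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    text = re.sub(r"<[^>]+>", " ", html)            # crude tag stripping
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), " ".join(words)))
    return {
        "word_count": len(words),
        "kw_density": hits / max(len(words), 1),
        "h2_count": len(re.findall(r"<h2\b", html, re.I)),
    }

def serp_stats(urls, keyword):
    rows = [page_metrics(u, keyword) for u in urls]
    # mean and standard deviation per metric across all pages
    return {m: (statistics.mean(r[m] for r in rows),
                statistics.stdev(r[m] for r in rows))
            for m in rows[0]}

# top10 = ["https://example.com/a", "https://example.com/b", ...]
# print(serp_stats(top10, "blue widgets"))
```

Extend page_metrics with titles, link counts, alt tags, etc. and you've basically got the one-report tool I'm wishing for.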
 
Thanks @Calamari

It still leaves me amazed that there is not a tool that can analyse the top 10 SERP results and spit out kw densities and other key factors. I remember WebCEO could do this like 10 years ago lol.

Ryte has a module for TF-IDF (term frequency-inverse document frequency), which is a souped-up version of keyword density. I wrote about it a couple of years ago on my blog. They have changed since then (they used to be onpage.org), but the module is basically the same.

Also, check out my article in my sig on auditing pages. It deals with the same topics.
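If TF-IDF is new to anyone, the math behind it is simple enough to sketch in a few lines of Python (toy documents, obviously; real tools run this over the actual ranking pages):

```python
# Toy TF-IDF: how strongly a term characterises one document
# relative to the rest of the set. Made-up documents.
import math
from collections import Counter

docs = [
    "blue widgets are the best widgets".split(),
    "buy red widgets online today".split(),
    "a guide to blue paint and blue dye".split(),
]

def tf_idf(term, doc, corpus):
    tf = Counter(doc)[term] / len(doc)          # term frequency
    df = sum(1 for d in corpus if term in d)    # document frequency
    idf = math.log(len(corpus) / (1 + df)) + 1  # smoothed IDF variant
    return tf * idf

for i, d in enumerate(docs):
    print(f"doc {i}: tf-idf('blue') = {tf_idf('blue', d, docs):.3f}")
```

The IDF part is what discounts terms that appear in every document, which plain keyword density can't do.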
 
Ryte has a module for TF-IDF (term frequency-inverse document frequency), which is a souped-up version of keyword density. I wrote about it a couple of years ago on my blog. They have changed since then (they used to be onpage.org), but the module is basically the same.

Also, check out my article in my sig on auditing pages. It deals with the same topics.

Thanks. Tried Ryte, but it appears to be inaccurate.

One recommended kw was "blue widgets", which they said I had on the page 24 times. I checked, and it was on the page 86 times! Not great...
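For anyone who wants to sanity-check a tool the same way, a few lines of Python will count occurrences yourself (hypothetical URL and keyword). Worth noting that some tools only count visible body copy, while the raw HTML also includes nav, alt text, etc., which can explain part of a gap like this:

```python
# Quick sanity check of a tool's keyword count: count occurrences
# yourself. Hypothetical URL and keyword - swap in your own.
import re
import urllib.request

URL = "https://example.com/blue-widgets"
html = urllib.request.urlopen(URL).read().decode("utf-8", "ignore")

raw_count = len(re.findall(r"\bblue widgets\b", html, re.I))
text_only = re.sub(r"<[^>]+>", " ", html)   # strip tags crudely
body_count = len(re.findall(r"\bblue widgets\b", text_only, re.I))

print("raw HTML occurrences:", raw_count)
print("visible-text occurrences (rough):", body_count)
```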
 
Thanks. Tried Ryte, but it appears to be inaccurate.

One recommended kw was "blue widgets", which they said I had on the page 24 times. I checked, and it was on the page 86 times! Not great...

Yeah, they all have flaws. Matthew Woodward wrote a pretty decent case study I just read here. Pretty comprehensive.

As always, test, test, test and see what works best.
 
From my understanding, it analyses the SERPs and tells you what you are deficient in - not what the averages are.

I am happy to be corrected on this point if I am talking BS.

It does show what you are deficient in, and then recommends how much to update your content based on the average.

It measures over 570 factors and will hit 1,000 measurement points within 2019. Another cool thing about it: if you get hit by one of Google's weekly updates and you run the software on your site regularly, not only will you immediately know what to correct, but you can also see what the update targeted in the SERPs.
 
It does show what you are deficient in, and then recommends how much to update your content based on the average.

It measures over 570 factors and will hit 1,000 measurement points within 2019. Another cool thing about it: if you get hit by one of Google's weekly updates and you run the software on your site regularly, not only will you immediately know what to correct, but you can also see what the update targeted in the SERPs.

Oh shit, totally forgot about Cora. I usually pay my buddy for the reports, but I have a big project in the works, so I will be subscribing to the service.
 
Just got a report back (cost $5) from a vendor, and there are averages, @MrMedia - it measures against the Page 1 averages for each output, along with the Practical Max, so it looks like a decent solution:

[screenshot: report comparing the page against Page 1 averages]


Actually, it has averages for each page in the top 100 (you need to show the hidden columns):

[screenshot: per-page averages across the top 100 results]
 
UPDATE

The site continues to lose rankings and kws at a very fast rate. Currently doing 15k visits per day, down from a peak of 50k+ per day.

I have had a number of consultants look at it and offer their advice. Two have confirmed it is a combination of dupe content and spam attack links pointed at the site, which together triggered an algorithmic penalty.

The dupe content is the result of the main writer passing off content ripped from elsewhere as original. It is amazing that we ranked for so long on content that in some cases exists on 37 other websites... It was a major breakdown in our editorial process that let this slip through.

As we ranked, other sites then copied our "copied" content, and because the DA on the site is so low, they eventually started outranking us.

Two people were fired instantly; the writer responsible is gone and appropriate feedback was left on Upwork. She won't work again under that profile.

Side note - never take your eye off the ball, even with a team you trust and have worked with for 5.5 years...

This, combined with a ton of Chinese and porn links sent by competitors, seems to have tipped the site into oblivion.

The slow decline started in August and then exploded in mid-January, in line with major SERP updates.

I missed the signs of the coming storm it seems. Second lesson learnt.

ACTION PLAN

Using Cora, I have discovered all the problems with the on-site SEO. I have now briefed an excellent new writer to start replacing the main articles where we lost rankings and most of our traffic. In total, around 13 pages drove 80% of our total organic traffic.

QUESTION

@CCarter or anyone with more brain power in this area than me.

Would you -

1. Add the fresh and unique content to the existing URL.
2. Create a new URL on the same site, add the new content, and then 301 the old page to the new URL.
3. Start fresh - find an aged, related domain and start over, 301'ing all the old unique content from the old site and ditching the crap.

The metrics on this domain are weak (DA 11), but it still ranks for 65k kws and drives about 15k visits per day.
 
1. Add the fresh and unique content to the existing URL.
2. Create a new URL on the same site, add the new content, and then 301 the old page to the new URL.

These two options are effectively the same thing.

3. Start fresh - find an aged, related domain and start over, 301'ing all the old unique content from the old site and ditching the crap.

If you 301 the old domain to the new domain, like the homepage and the good articles, you'll still pass on any of the spam that was attached to those posts, and if there's a penalty associated with the duplicate content, then the penalty will pass on too.

If you're trying to escape the penalty without fixing it, you can't do the 301's.

404-ing all the crap content can help if it was the bulk of what was copied and spammed. You won't know immediately, though; Google often only releases penalties during big updates.

What you can do is go through all of your backlinks at a domain-level view in Ahrefs or something similar; any domain that sent a spammy link can be added to your disavow file like this: domain:spamsite.com. This will let you quickly kill off that spam, but Google has to recrawl those links before they'll apply the disavow, which essentially just turns the links nofollow.
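If the spam list is long, it's worth scripting. Here's a minimal sketch (made-up domain names; the real list would come from your Ahrefs export) that writes a domain-level disavow file in Google's format:

```python
# Write a domain-level disavow file in Google's format:
# "#" starts a comment, "domain:example.com" disavows every
# link from that domain. Upload via Search Console's disavow tool.
spam_domains = ["spamsite1.com", "spamsite2.net", "spamsite3.org"]  # placeholder list

with open("disavow.txt", "w") as f:
    f.write("# Domain-level disavow of spam links\n")
    for d in sorted(set(spam_domains)):
        f.write(f"domain:{d}\n")
```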

I think your best move, if you're itching to recover with a new domain, is to buy an aged and link-filled domain in the exact same niche and start fresh on it. Go after the same big keywords again there, and keep this site separate from the one that tanked. Then also try to recover the tanked site. If all goes well, you can end up with two sites in the same niche and dominate the winning keywords.

That will give you a path forward while you try to salvage the original, but keep you from spinning your wheels hoping you can recover, waiting and waiting and waiting. I've done that a bunch, and my advice is not to lose time over it by being too attached.
 
Thanks dude - always good to get a second insight into things. The new domain option seems like the best bet to me too.
 
Start fresh - find an aged, related domain and start over, 301'ing all the old unique content from the old site and ditching the crap.

What's to stop your competitors from spamming the new domain?

I personally would disavow the bad links and attempt to regain traction, since you've got good Google history from being in the SERPs.

You might regain traction just through the massive rewrites of the 13 pages. But how are you going to start a brand new domain AND recover at the same time?

If you think about it from a brand perspective, you stumbled and now have to recover (a small public relations problem) - and after recovering you'll be stronger than before.

If it were me, I'd create new content on the old pages, then do a marketing blitz with guest posts from industry experts (onto your website), do outreach and lots of traffic leaking, and create a holistic marketing campaign.

Did you have an email list (a growing user database) when times were good? If so, that's perfect, and you can start mailing that list for repeat business.

If not - well, I can't stress enough how controlling your own destiny means being able to talk to your customers/visitors without a buffer (a search engine or the Facebook platform) - an email list is the #1 way to do it. If you were grabbing 3-5% of visitors' emails daily, your list would be massive within a single year at 45K a day in visits (1,350 to 2,250 emails a day captured).
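Back-of-envelope, if you want to check the math:

```python
# List growth at the quoted capture rates and traffic level.
daily_visits = 45_000
for rate in (0.03, 0.05):
    per_day = daily_visits * rate
    print(f"{rate:.0%} capture: {per_day:,.0f}/day -> {per_day * 365:,.0f}/year")
```

That's roughly 490k to 820k addresses in a single year at the old traffic level.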

You need to flex the other marketing muscles and grow them, so that when search takes a hit you won't be at the mercy of Google and in panic mode. Create multiple channels of incoming traffic, so that one channel slowing down isn't detrimental.
 