How we overturned a Penguin penalty

On May 22nd 2013, Google unleashed its "Penguin 2.0" algorithm update, a change designed to punish anyone with links from what Google considers to be 'dodgy' websites, or links with excessive amounts of exact-match anchor text. In short, it decided to punish anyone who had done SEO pre-2005.

We were faced with one client who had done old-school SEO back in the days when Google rewarded websites for such activities, and who had long since stopped. However, the footprint was there - hundreds of old links, all with exact-match anchor text, that had been devalued for years - and they were now worth negative points.

Rankings suffered as a result. The client’s leading keyphrase dropped from number 3 to number 57, and the trophy keyword dropped out of the top 100 altogether. Punishment for crimes in a previous SEO life.

Today, though, those keywords are back, and rising almost every day. New keywords join them, and even the trophy keyword that had dropped out of the top 100 has resurfaced.

How we got them back

1) Redressing the balance

Penguin has made everyone suspicious of links, but I tend to follow Eric Ward’s advice: Penguin has made good links even stronger. When we build links, we look at three things:

  • Authority
  • Context
  • Trust

Our strategy had always been to ensure that the only links we obtained were from contextually relevant websites that published regular, insightful content and had an audience.

Given that this client had bad links, one way of tackling Penguin would be to play with the thresholds Google uses to determine whether a site is spammy. For example, if Google sees that 25% of your links are spammy, and the threshold is a theoretical 20%, you need to get that proportion down to 19% or lower by earning more high-quality links.
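To make that arithmetic concrete, here is a minimal sketch. The 20% threshold is purely theoretical, as above, and the link counts are made up.

```python
import math
from fractions import Fraction

def good_links_needed(spammy: int, total: int, threshold: Fraction) -> int:
    """Smallest number of new clean links x with spammy / (total + x) < threshold."""
    # spammy / (total + x) < threshold  <=>  total + x > spammy / threshold
    bound = Fraction(spammy) / threshold   # exact rational arithmetic, no float error
    return max(0, math.floor(bound) + 1 - total)

# 100 spammy links out of 400 (25%), against a theoretical 20% threshold:
print(good_links_needed(spammy=100, total=400, threshold=Fraction(1, 5)))  # -> 101
```

Note that 100 / (400 + 100) is exactly 20%, still on the threshold, which is why the answer is 101 rather than 100.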

However, high-quality links are much harder to get, and require more investment of time. Redressing the balance is always going to be hard when you’re faced with a client whose backlink profile stretches back into the days of directories and article marketing.

2) Contacting the owner of every site

Another reason to be disappointed in Google, then. Instead of focusing on “the good stuff” - i.e. writing new content, providing thought-leadership guest posts to contextually relevant bloggers, creating infographics, etc. - we had to roll up our sleeves and contact the webmaster of every single site to ask them to remove the link.

First of all, we had to determine what counted as a bad link. LinkDetox did much of the hard work: it highlighted the ‘spammy’ links and categorised them for us, so that we could see exactly why they might be harming our client. Equally, we could go through the list, discount links at a glance, and then export the list of sites.
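LinkDetox’s actual rules are proprietary; purely as an illustration of the kind of triage it automates, a toy version (file name, column names and heuristics all hypothetical) might look like this:

```python
import csv

# Hypothetical heuristics - LinkDetox's real classification is far more thorough.
SPAM_HINTS = ("directory", "articles", "links")

def classify(row: dict) -> str:
    """Label an exported backlink row as 'review' or 'keep' (toy rules only)."""
    domain = row["source_domain"].lower()
    if any(hint in domain for hint in SPAM_HINTS):
        return "review"                      # looks like a directory or article farm
    if row["anchor_text"].lower() == row["target_keyword"].lower():
        return "review"                      # exact-match anchor text
    return "keep"

with open("backlinks_export.csv") as f:      # hypothetical export file
    for row in csv.DictReader(f):
        print(row["source_domain"], classify(row))
```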

Secondly, we had to find the contact details for every single website. Sometimes that’s not possible - some websites have long since been abandoned. Some webmasters are irate at the very suggestion that their website is ‘low quality’ or ‘harmful’, and some are only too happy to help.

Some even took their whole website down. Wow.
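At this volume the outreach has to be semi-automated. A minimal sketch of the kind of mail-merge involved, assuming a CSV of domains, link URLs and contact addresses (every name here is hypothetical):

```python
import csv
import smtplib
from email.message import EmailMessage

TEMPLATE = """Hello,

Your site {domain} links to our client's page at {url}. We'd be grateful
if you could remove that link. Thank you for your time.
"""

def send_removal_requests(contacts_csv: str, smtp_host: str, sender: str) -> None:
    """Send one link-removal request per row of a domain,url,email CSV."""
    with smtplib.SMTP(smtp_host) as smtp, open(contacts_csv) as f:
        for row in csv.DictReader(f):
            msg = EmailMessage()
            msg["From"] = sender
            msg["To"] = row["email"]
            msg["Subject"] = f"Link removal request for {row['domain']}"
            msg.set_content(TEMPLATE.format(domain=row["domain"], url=row["url"]))
            smtp.send_message(msg)

send_removal_requests("contacts.csv", "smtp.example.com", "seo@example.com")
```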

3) Disavowing all the bad links

I don’t believe this has any effect, but Google suggests you do it, so we did. We exported all of our ‘spammy’ links and sent the list to Google, effectively saying that we don’t vouch for any of them.
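The disavow file itself is just plain text - ‘#’ for comments, one ‘domain:’ line per site (or a full URL per line) - uploaded by hand through Search Console’s disavow tool. A minimal sketch, with illustrative domains:

```python
def write_disavow_file(spammy_domains: list[str], path: str = "disavow.txt") -> None:
    """Write Google's disavow format: '#' comments, one 'domain:' line per site."""
    with open(path, "w") as f:
        f.write("# Links we do not vouch for\n")
        for domain in sorted(set(spammy_domains)):
            f.write(f"domain:{domain}\n")

# Illustrative domains only; the real list came from the LinkDetox export.
write_disavow_file(["spammy-directory.example", "article-farm.example"])
```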

If anything, this is Clever Little Design doing Google’s work, fetching a list of spammy links and saying “here you go, we’ve found these for you.” I believe that we could have overturned this algorithmic penalty without submitting a disavow file.

4) Contacting Google themselves

Now, there was no manual penalty, but there’s no harm in submitting a reconsideration request, if only to get someone at Google to look at the site and reply to you.

So we contacted Google, and we got a reply. It was a boilerplate reply, yes, but it stated that there was no manual penalty and suggested that perhaps we’d been doing something against Google’s guidelines - who knows, maybe we should have a look? Yes, it was a cut-and-paste job, but ironically, the rankings started to move upwards on the day that Google sent us that e-mail.

*****

This wasn’t a huge penalty. Overall, keywords dropped an average of 10 positions. Those with a high volume of exact-match anchor-text links dropped more; some actually increased. The recovery was protracted, slow and, at times, barely visible, as Positionly’s graph shows.

So the takeaway here is that the site wasn’t hit by a site-wide penalty; it was a targeted, algorithmic penalty based on:

  • a high volume of directory & article directory links to certain pages
  • a high volume of exact-match anchor text links to certain pages

To reverse the penalty - and the ranking drop - the most effective strategy was to have the links removed, reducing the overall number of backlinks and thereby increasing the percentage of strong, contextually relevant, authoritative editorial backlinks that we had worked hard to create.

For all its complexity, Google is actually quite simple - it’s playing with thresholds, and once you know what those thresholds represent, you can make some kind of progress. The shame is that you have to focus on removing the bad stuff when you could be creating more of the good stuff.

The positive? If you have some good stuff, then you stand a chance of recovering. If not, get cracking.