How to Recover from Google Algorithm Updates

Google updates its search system 10-15 times a week.  Bing updates its search system maybe 10-20 times per month.  Unfortunately, Bing is less open about how its search engine works, but we can be sure of one thing: Bing does not work exactly like Google.  Keep that in mind, because a widely held belief among search marketers is that “what works for Google works for Bing”, and that is just plain wrong.  Some of what works for Google works for Bing.  And some things that work for Bing won’t work for Google.  Worse yet, some things Google likes hurt your performance in Bing, and some things Bing likes hurt your performance in Google.

How to recover from page quality filters & penalties and link quality filters & penalties.

But let’s stick to the algorithm updates.  What exactly changes with an “algorithm update”?  Unfortunately, this is another topic where Web marketers remain confused.  Technically, an algorithm is just a set of rules that one or more programs (or routines or procedures within a program) follow or execute.  A large, complicated search system such as those run by Baidu, Bing, Google, and Yandex consists of hundreds or thousands of algorithms.  Some algorithms will be used in more than one task.  Some tasks will use many algorithms.

When marketers speak of “Google algorithm updates” they mean they see changes in Google’s search results.

Just because YOU see changes in Google’s search results doesn’t mean everyone else does.  And given how many changes Google makes to the search system every week you and I could see different types of changes that are completely unrelated to each other.  But here are a few examples of algorithmic changes that affect most or all of Google’s searchers:

  • Google Panda (the page quality filter introduced in February 2011)
  • Google Penguin (the link quality filter introduced in April 2012)
  • Google “Albert” (my name for the October 2017 “search location” change)

Many things that people call “algorithms” are not algorithms.  They are third-party euphemisms for “I think I see something”, nothing more.  If someone claims to have identified a Google update they are smoking crack.  If you believe their claims YOU are smoking crack.  You and I cannot see individual Google algorithm updates.  Your best bud does not see Google algorithm updates.  Whatever you see or think you see, it’s not a Google algorithm update.

We might occasionally see “tests”.  Google freely admits that it tests all sorts of things.  Most people who spot Google tests are identifying changes in the search results presentation.  Maybe Google is including some more data, maybe it’s dropping some data.  These tests don’t mean Google is rolling out a massive update to the search system.  These tests only imply that Google is collecting data about how proposed changes to the search system may work.

So How Do We Know an “Algorithm” Was Changed or Added to the Search System?

There is only one reliable method for knowing that an algorithm has been changed, added, or deleted: Google says so.  Anything else is just someone blowing crack smoke in your face.

Lacking any confirmation from Google about new or modified algorithms, we can only assume that whatever they are changing “under the hood” is affecting a lot of users.  That doesn’t mean those changes are affecting your site(s).

Some Examples of Non-algorithmic Things Google Can Change in the Search Results

Not to put too fine a point on it, but no computer algorithm is of much use to people without a few other things.  Every algorithm needs data, input channels from which it gets the data, and output channels where it sends whatever data it releases.  The search results are the output channels for Bing’s and Google’s data.  Their crawlers are input channels.  And, of course, the Web browser is an input channel through which the search engines receive queries and user actions.

Here are a few things that can change search results without requiring algorithm changes:

  • New content is added to the indexable Web
  • Old content is removed from the indexable Web
  • Indexable links are added to the Web
  • Indexable links are removed from the Web
  • Searchers begin using new queries
  • Searchers stop using old queries
  • Searchers begin clicking on new results for older queries
  • Searchers stop clicking on older results for older queries
  • New HTML markup is added to previously indexed content
  • HTML markup is removed from previously indexed content
  • Crawl queues are reprioritized or rebuilt (re-initialized)
  • Web servers or data centers go offline
  • Web servers or data centers are upgraded
  • Searchers begin using new devices
  • Searchers stop using old devices
  • Search engines add new types of listings (features) to search results
  • Search engines remove some types of listings from search results
  • Search engines reformat some types of listings in search results

You don’t need to change search algorithms to change the search results.  They are in constant flux because the Web is in constant flux, and the search engines have no control over that churn.

This lengthy preamble is important because most people who think in terms of “there was an algorithm change” are overlooking the constant change of non-algorithmic influences on search results.  A lot of things are changing every day, every hour.  All of these changes combine to create significant differences between last year’s search results and this year’s search results.  Yes, the search engines are adding, modifying, and removing algorithms but all this other activity makes it completely impossible for you and your favorite marketing bloggers to know what is changing.

Anyone who says, “X has changed in Google” without confirmation from Google is smoking crack and blowing smoke in your eyes.

And so now let’s take a look at what it takes to recover from a Google Algorithm Update, because that is why everyone is reading this article.

Step 1: Confirm There was an Algorithm Update

This is clearly not as easy as it once was.  For a while Google experimented with telling us on a monthly basis about all the changes they were making.  Much to their chagrin they found that Web marketers read way too much into those disclosures.

And for a few years Google told us when they made significant changes to the way the Google Panda algorithm and the Google Penguin algorithm work.  But, again, marketers read way too much into these disclosures.

So now Google may gradually confirm, in retrospect, that there was some sort of significant change, but most of the time they deny the wild guesses coming from search marketers.  Search engineers have no reason to lie about when major algorithms are released.  They don’t have to say anything at all.  If they don’t say anything, that might be a subtle confirmation of something, but it probably doesn’t mean what you think it should.  On the other hand, if the search engine guys are denying that anything major happened, then most likely nothing major happened.

That doesn’t mean several minor or moderate changes didn’t happen.  It just means there was nothing monumental on the scale of a Panda or Penguin algorithm release.

Key takeaway: Assume nothing until the search engine says something happened.

Believe it or not, Google still announces many changes to their search results.  You just don’t think of these changes as being significant because they have no effect on your search referral data.  For example, the Google Albert Update (see “Making search results more local and relevant”, Oct. 27, 2017) doesn’t really change your “rankings”.  It just changes where searcher queries are resolved.  For some marketers this means they’ll see changes in traffic but for most marketers the changes will probably be minimal.  Those people, if they don’t follow Google’s blog, won’t even be aware that an algorithm change occurred (unless they read news stories about it).

The “algorithm” that changed was the one that determines where the user’s query is processed.

Step 2: Confirm That YOUR Search Referral Traffic was Affected

Assuming we have confirmed that a real Bing or Google algorithm was added, modified, or deleted from the search system, marketers still make the mistake of looking at search visibility reports.  These “ranking reports” are useless fluff.  If you really want to track rankings then limit your rank reporting to whatever data Bing and Google show you in their Webmaster consoles.

Search visibility reports are horribly inaccurate.  If you are not pulling real search referral data from Webmaster dashboards, get out of the business.  That’s basic SEO.  Everyone should be doing it.

In other words, ignore every SEO blog and conference presentation that uses a chart that purports to show you “keyword rankings”, “rankings”, and search visibility.  You should be asking why people are not showing you real search referral data.  It’s not like they cannot get it.
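
In fact, you can pull that data programmatically.  Here is a minimal sketch using Google’s Search Console API, assuming you have already completed the OAuth setup and saved your credentials locally; the site URL, file name, and date range are placeholders:

```python
# Minimal sketch: pull real search referral data (clicks and impressions by
# query and date) from Google Search Console.  Assumes you have already gone
# through the OAuth flow and saved credentials; the site URL, file name, and
# dates below are placeholders.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2017-09-01",
        "endDate": "2017-11-30",
        "dimensions": ["date", "query"],
        "rowLimit": 5000,
    },
).execute()

for row in response.get("rows", []):
    date, query = row["keys"]
    print(date, query, row["clicks"], row["impressions"])
```

Bing Webmaster Tools exposes similar page traffic reports you can export.  Either way, chart the numbers the search engines give you, not a third-party estimate.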

If you are not seeing any changes in your search referral data (from the search engine dashboards, not from third-party SEO tools) then you’re good.  You don’t need to be upset.  If you think those “ranking reports” mean anything when they don’t agree with your search referral data, you’re smoking crack.

Key takeaway: Stop looking at search visibility reports for “keywords”.  Focus on search referral traffic reports.

But What If Your Search Referral Data DOES Change?

It still might not be connected to the confirmed, verified search algorithm update.  About half of the “I lost my traffic!” situations we investigate turn out to be normal seasonal downturns.

If you don’t have at least two years’ worth of search referral data (three years is better) you may mistake a normal seasonal change in traffic for the effects of a confirmed, verified search algorithm update.  You should proceed cautiously because you could seriously hurt your site’s search optimization for one or more search engines if you jump to the wrong conclusion.
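
One simple way to separate a seasonal dip from an update effect is to line up this year’s referral traffic against the same weeks last year.  Here is a rough sketch, assuming you have exported daily click counts from your Webmaster dashboard; the file and column names are hypothetical:

```python
# Rough sketch: compare weekly search referral clicks year over year to spot
# normal seasonal dips.  Assumes a CSV export with "date" and "clicks"
# columns (hypothetical names).
import pandas as pd

df = pd.read_csv("search_referrals.csv", parse_dates=["date"])
weekly = df.set_index("date")["clicks"].resample("W").sum()

# Line up each week against the same week 52 weeks earlier.
yoy = pd.DataFrame({
    "this_year": weekly,
    "last_year": weekly.shift(52),
})
yoy["pct_change"] = (yoy["this_year"] - yoy["last_year"]) / yoy["last_year"] * 100

# A drop that also happened at the same time last year is probably seasonal.
print(yoy.dropna().tail(12))
```

If the dip shows up in roughly the same weeks every year, you are probably looking at seasonality, not the algorithm.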

Sometimes marketers change their sites and panic when they think they see search algorithm updates.  If we have confirmed and verified that an algorithm update occurred, we still need to rule out any likely effects from recent changes to the site.  A quick audit is in order.  Make sure nothing has been broken in terms of navigation and page structure.  If page titles have been removed or modified, if page Hx section headings have been removed or modified, if internal links have been removed or modified in any way, all bets are off.
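
A quick, low-tech way to rule out self-inflicted damage is to spot-check a handful of important pages against what you expect to find there.  Here is a minimal sketch; the URLs are placeholders, and the output should be compared to a crawl or saved copy from before your traffic changed:

```python
# Minimal sketch: spot-check a few key pages for self-inflicted damage of the
# kind described above (missing titles, missing Hx headings, vanished internal
# links).  The URLs are placeholders; compare the output to a crawl or saved
# copy from before your traffic changed.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PAGES_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/important-article/",
]

for url in PAGES_TO_CHECK:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    headings = soup.find_all(["h1", "h2", "h3"])
    site = urlparse(url).netloc
    internal_links = [
        a["href"] for a in soup.find_all("a", href=True)
        if urlparse(a["href"]).netloc in ("", site)
    ]

    print(url)
    print("  title:", title or "** MISSING **")
    print("  headings:", len(headings))
    print("  internal links:", len(internal_links))
```

If titles, headings, or internal links have disappeared since your last known-good crawl, fix that before blaming the algorithm.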

Key takeaway: Confirm you didn’t shoot yourself in the foot before you decide the search engine just blew your site out of the water.

It’s easier to fix a broken site when you clearly see you broke it.

Step 3: Compare Notes with Other Sites that Have Been Affected

This is the nightmare scenario for all Web marketers.  Because we are an industry without any real standards, you cannot trust other Web marketers’ conclusions about anything.  You cannot trust their claims of seeing algorithm updates.  You cannot trust their claims of being affected by confirmed, verified algorithmic changes.

When a search engine really does modify its search system in a big way, you are almost completely on your own.  Whatever your best forum friends say about the algorithm will generally, typically, almost assuredly be wrong for 1-5 weeks.

Even so, you need to compare notes with EVERYONE who believes they are affected.

You are looking for common patterns

You’re not just looking for changes in search referral traffic.  The mere fact that other sites lost or gained search referral traffic immediately after a confirmed, verified search system update doesn’t tell you anything useful on its own.  What you need to know are things like:

  • What kind of search referral traffic did they lose?
  • How similar are their sites’ topics to your site’s topics?
  • How similar are their linking practices to your linking practices?
  • How similar are their site designs to your site’s design?

If you cannot find at least three sites very similar to your own that are clearly suffering from the latest confirmed, verified algorithm update you’re not going to get anything actionable from what other people are doing.  Everyone will be saying, “I changed this, I tried that” and they’ll complain and celebrate and add more confusion to the communal discussion.  Just following along with the crowd because some idiot says the update was X or Y only puts you further away from figuring out what you need to do.

Throughout past major updates there have always been “statistical outlier” sites that either come back quickly or don’t respond to any kind of attempted fix.  They may not have the problems you think they have, or they could have more problems than your site does.

You need to find sites that are in as similar a situation to your own as possible, and you need to be able to learn from their changes in search referral traffic.  No, you cannot rely on third-party SEO tools that estimate search traffic by keyword.

Step 4: Qualify the Changes Announced by the Search Engine

Major search algorithm changes fall into three main categories of behavior:

    1. They change the search user experience
    2. They change how content is assessed
    3. They change how links are assessed

If the search engine says it’s a “content quality change” and 99% of the Web marketers you follow on social media are talking about backlinks and disavow files then, I’m sorry to say, 99% of the people you follow on social media are idiots.  This is usually the way it goes.  Search engine says “X” and Web marketers say “Y”.  It will take the Web marketers 1-2 years to get their bearings.  Some of them will never figure it out because they read the same wrong ideas over and over again until they become convinced the misinformation is correct even though reality tells them otherwise.

If the search engine says it’s a “link quality change” then you need to compare your link acquisition practices to what the search engine is talking about.  Falling back on standard link building or link disavowing advice isn’t going to help you.  Through the years Google has shared many, many examples of the kinds of links they object to.  And yet when people share the kinds of links they disavow it turns out that most Web marketers are disavowing the wrong kinds of links.

When all else fails, focus on what little information the search engine has shared and tune out all the speculation and complaining that the Web marketing community is sharing.  If you cannot relate what you have done with your site and backlinks to what the search engine is telling you, go back to Step 1 above and start over.  Maybe this update has nothing to do with you and something else is going on.

Step 5: Design a “Shroud of Turin” Experiment

This is the hardest part of the process.  You have confirmed there is an update, your altered traffic data matches the pattern of other sites whose traffic changed, and you’ve been able to confirm those sites are similar to yours.  So now you have to start experimenting with changes.  Unless you’re very confident in guessing what broke (and your confidence should be backed by a history of success), you need to test your ideas in small experiments.

When scientists were given unprecedented access to the Shroud of Turin they tore very small strips of cloth from the hems of the shroud and conducted the majority of their experiments on those small strips.  This is what I mean by a “Shroud of Turin” experiment.  You want to do as little potential damage as possible while triggering as much positive change as possible.

Content Quality Experiments Look Like This

These are just a few examples of the kinds of content quality experiments that have worked for us in the past.

  • Change the sitewide navigation
  • Change the sitewide advertising
  • Change the page design / layout
  • Change the article structure
  • Rewrite articles in a new narrative style

There is no guarantee any of these kinds of experimental changes will work, but if a page quality filter is affecting your site’s ability to perform it won’t have anything to do with:

  • Pogosticking (users clicking from SERP to page and back)
  • Time on page
  • Length of content

Link Quality Experiments Look Like This

If you have been (directly or indirectly) buying links, publishing guest posts, using private blog networks, or setting up reciprocal link pages then you could try removing or disavowing those links.  If they are passing a negative value to your site then getting rid of spammy links may help you recover your search referral traffic in a few weeks or months.

If you are not actively “building” links then maybe someone targeted you for a negative SEO campaign.  However, if the links you think are bad are not like the kinds of links you would directly or indirectly buy, acquire through guest posts, get from private blog networks, etc. then you’re probably mistaking perfectly innocent automated links for a negative SEO attack.

Aggressive linking is accomplished through very simple, easy channels.  If you cannot easily replicate the links you find for a site then the chances are good that no one else can, either.  And in that case the links are almost certainly NOT spammy and NOT part of a negative SEO attack.

It is better to disavow links you believe may have been identified as spam and see what happens.  If your search referral traffic begins recovering you may be on the right track.  If not, then remove the disavows and try something else.
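
For reference, the file Google’s disavow tool accepts is plain text: one URL or domain: entry per line, with # for comments.  Here is a small sketch that generates one; the domains and URLs are placeholders, and a narrow first batch makes it easier to tell what actually changed:

```python
# Small sketch: build a disavow file in the plain-text format Google's disavow
# tool accepts (one URL or "domain:" entry per line, "#" for comments).
# The domains and URLs listed here are placeholders, not recommendations.
suspect_domains = [
    "spammy-blog-network-example.com",
    "paid-links-example.net",
]
suspect_urls = [
    "http://forum-example.com/thread-123#comment-456",
]

lines = ["# Disavow experiment, first narrow batch"]
lines += ["domain:" + d for d in suspect_domains]
lines += suspect_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```

Keep the first version of the file small so that, whether traffic recovers or keeps falling, you know which batch of links made the difference; you can always remove entries and resubmit, as described above.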

How to Recover from a Google Panda Penalty

It’s not really a “Google Panda penalty” but many people still think of it as a penalty.  We don’t have to split hairs here.  If you think it’s a “Google Panda penalty” then fine, it’s a Google Panda penalty.  In terms of effect, the Google Panda score or downgrade hurts your traffic as if it were a “manual action” type of penalty.  The difference is that with Panda you have to fix whatever is wrong; with a manual action you have to fix whatever is wrong and file a reconsideration request.

Assuming your site was hit by a Google Panda filter, how would you know?  It’s not like Google has released some tool that says, “This page received a negative score from the Panda algorithm”.  Many marketers have asked Google for such a tool.  And believe it or not, Google really does have an internal dashboard that shows their engineers which pages have been downgraded by Panda.  They may not know exactly why Panda disliked a page, but they know which pages were hurt, and in some cases they may even know what led to the downgrade.

Since you cannot see what Googlers can see you have to shoot in the dark.  But here are a few things that have generally worked for us in the past:

  • Remove page clutter (ads, widgets, things that break up or obscure real content)
  • Clean up sitewide navigation (less is better)
  • Remove PageRank sculpting (always a dumb idea, PR sculpting doesn’t help)
  • Improve page layout to present ideas more clearly and thoughtfully
  • Rewrite poorly written articles

I don’t like the last option because it’s the weakest approach.  Google’s quality algorithms reward many poorly written articles on Wiki sites so basic grammar and idiom tend NOT to be the issues with Panda.  They might be, but it’s hard to justify betting everything on a rewrite of your content.

We have found, however, that some “old school” content written prior to 2011 worked better after it was rewritten to provide more information, to highlight important points with more section headers, and to be more “readable” although not necessarily more “scannable”.

If you’re convinced a content quality filter is hurting your site, look at the top-performing sites in the queries you KNOW send traffic (from your own data); ask yourself how those sites present their information compared to yours.  What are they doing that is more user-friendly?

A high quality document is easy to read, provides good information, and doesn’t create unnecessary friction between the reader and the content.

How to Recover from a Google Penguin Penalty

The Penguin algorithm was developed to replace or complement Google’s manual link penalties.  In other words, after Google’s Web spam team manually delisted or penalized thousands of link building blogs from March 2012 to April 2012 (the so-called “Google Blogacalypse”), they apparently took what they learned from identifying all those blog networks and used it to create an algorithm that identified probable link spam.

The Google Penguin 1.0 Penalty was applied by the algorithm only to the home pages of sites.  We didn’t learn that detail until at least a year later, but at the time “home page backlinks” were still all the rage.  It makes sense that the Penguin 1.0 algorithm only looked at home pages for spammy links.  They were easy to identify and most people who relied on blog networks were counting on the links that were found on the home pages.

Subsequent iterations of Penguin looked deeper into the sites for those spammy links.  But Penguin continued to focus on “blog networks”.  That makes sense because most if not all of those spammy link building blogs were publishing their posts as independent pages.  The spammers who survived the Penguin 1.0 penalty probably had links that had already scrolled off the front pages of the blogs.  Their links had sunk deeper into the sites.

Penguin is now “baked into” Google’s core search system.  That means it’s running on autopilot.  Google’s Web spam team is confident that Penguin is going to spot the kinds of self-placed links that spammy Web marketers are most likely depending on.  At best that means the links won’t pass any positive value to your site.  At worst it means the links are passing enough negative value that your site is hurting.

The preferred solution for dealing with spammy self-placed links, especially those built by third parties like “SEO agencies”, is to disavow them.  But as I have mentioned before, Web marketers have proven to be really bad at choosing which links to disavow.

Your rule of thumb should be, “If I cannot create a link like that then I don’t need to disavow it.”

Spammy links are most likely to appear in:

  • Guest blog post biographies
  • Guest blog post bylines
  • Guest blog post body content
  • Blog comments
  • Forum comments

Some of these links are made by hand.  Some of these links are made by software.  But they all favor the same kinds of pages: the kinds of pages that YOU can create at will and where YOU have complete control over what is linked to and with what kind of anchor text.

Although building tens of thousands of comment links that use “rel=’nofollow’” might get you into trouble, if the backlinks you’re finding have the “nofollow” attribute then they are probably NOT hurting.  After all, most major social media sites where people regularly place their own links (sometimes thousands per day) use the “nofollow” attribute.
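
If you want to know which of the backlinks you’ve found are already nofollowed, you can check the linking pages directly.  A small sketch follows; the linking URLs and target domain are placeholders:

```python
# Small sketch: check whether the links pointing at your domain from a list of
# linking pages carry rel="nofollow".  The linking URLs and target domain are
# placeholders.
import requests
from bs4 import BeautifulSoup

TARGET_DOMAIN = "example.com"
LINKING_PAGES = [
    "http://forum-example.com/thread-42",
    "http://guest-post-example.com/some-article/",
]

for page in LINKING_PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        if TARGET_DOMAIN in a["href"]:
            rel = a.get("rel") or []  # BeautifulSoup returns rel as a list
            status = "nofollow" if "nofollow" in rel else "followed"
            print(page, "->", a["href"], "(" + status + ")")
```

Anything that comes back “nofollow” in a check like this falls into the “probably not hurting” bucket described above.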

A manual Web spam penalty may require that you physically remove the links or the linking content.  It will at the very least require that you file a reconsideration request.

Although the Penguin algorithm doesn’t really penalize your site (it simply tells the rest of Google’s system to ignore the link), if you were benefiting from these spammy links before then being hit by Penguin feels like a penalty.

Prior to Google Penguin 4.0 the negative value WAS passed to your site.  So people who are concerned about a Google Penguin penalty are almost certainly reading older SEO articles and presentations that don’t distinguish between pre-Penguin 4.0 behavior and post-Penguin 4.0 behavior.  The Google Penguin algorithm is now a link filter that prevents links from passing value to your site.

What is beautiful about Penguin 4.0 is that it allows you to create links again that can really send traffic to your site without worrying about whether you’re violating Google’s guidelines.  In other words, ask for or put a “rel=’nofollow’” attribute on links you create because if Penguin decides they are spammy it will treat them as if they are nofollowed.

It’s better to think in terms of which kinds of nofollowed links will send you traffic than in terms of which links will slip past the Penguin algorithm.

Concluding How to Recover from a Google Penalty

The basic process hasn’t really changed in 20 years.  If you’re violating a search engine’s guidelines, stop violating the guidelines.

Learning how to be more creative with your marketing is the only real long-term solution to getting around the limits that search engines place on our promotional efforts.

Just keep in mind that the social media companies are better at identifying and removing spam than they used to be, too.  All of the major social media platforms now function as search engines and, like Bing and Google, companies such as Twitter, Facebook, and Pinterest have learned the hard way that allowing Web marketers free rein harms the quality of their users’ experiences.

If you allow greed to shape your marketing priorities you will forever find yourself bemoaning the latest quality algorithm updates.  Just because someone discovers a method for acquiring links that hasn’t been outlawed doesn’t mean it won’t be.  The more people who pile onto a “great” marketing tip, the more likely it is to become tomorrow’s spam, today.
