Why Do Websites Lose Traffic Gradually After A Major Google Update?

What would cause a continued, gradual, week-by-week decline in search referral traffic after a major Google algorithm update has supposedly finished? This is a real question I found on the Web, asked by a frustrated Website owner who didn’t find a clear, definitive answer. I’m pretty sure if you ask a Googler, you’ll hear back the usual, uninformative “I don’t know; it could depend on a lot of things”. I think that’s a fair answer because it’s honest and omits nothing. It’s just not a very satisfying response, though, is it?

Big corporations like Amazon, Facebook, and Google, which integrate millions of people and Websites into their business processes, rarely spend much time looking into any one individual site’s issues. I’ve walked this path with Amazon. When my partner Randy Ray and I started Reflective Dynamics, Inc. a few years ago, we decided to merge our personal affiliate marketing projects into the corporate umbrella. We did that for several reasons, but the bottom line is that it simplified life for us. Except that we had to open up a lot of new affiliate accounts, including one with Amazon’s Associates program.

We had to join that program 3 times before we figured out why they could not verify us. It was a silly, simple technical problem but we had so many Websites we couldn’t tell which one was broken. Amazon sent a crawler to check the site and found it was blocked. We asked politely which site was blocked and received the same exact dashboard-generated automated response every time. Having worked in customer support roles in the past, I imagined some frustrated Amazon employee being forced to choose from three options on a screen, none of which would permit a personal reply. Worse, the employees who reviewed our first two applications probably didn’t know which site was blocking the crawler.

*=> Are you curious about what happened? You know you are. Amazon has a funky crawler that comes out of Amazon Web Services. We had to unblock all the spammy IP addresses from AWS to let the crawler come through, and then we had to block all the spammy IP addresses again. Amazon is very popular with people who run rogue crawlers and bad bots. So, once I realized I’d never see the crawler in action, I bit the bullet and endured a few hours of slow server performance while waiting for Amazon to verify our sites.
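For anyone who wants to avoid the blanket unblock-and-reblock dance, here is a minimal sketch (in Python) of one way to check whether a visiting IP address falls inside Amazon’s published AWS ranges before deciding whether to let it through. It relies on the publicly documented ip-ranges.json feed; the visitor IP in the example is a made-up placeholder, and this is not an official way to identify the Associates verification crawler.

```python
# A minimal sketch: check whether an IP address from your access log falls
# inside Amazon's published AWS ranges. The visitor IP below is hypothetical.
import json
import ipaddress
from urllib.request import urlopen

AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

def load_aws_networks():
    """Download and parse AWS's published IPv4 prefixes."""
    with urlopen(AWS_RANGES_URL) as response:
        data = json.load(response)
    return [ipaddress.ip_network(p["ip_prefix"]) for p in data["prefixes"]]

def is_aws_ip(address, networks):
    """Return True if the address falls inside any published AWS prefix."""
    ip = ipaddress.ip_address(address)
    return any(ip in net for net in networks)

if __name__ == "__main__":
    networks = load_aws_networks()
    visitor = "52.95.110.1"  # hypothetical visitor IP pulled from your access log
    print(visitor, "is from AWS" if is_aws_ip(visitor, networks) else "is not from AWS")
```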

So Google Changes Its Search System Like 3-5 Times A Day


Are you really losing Google traffic? Your analytics data isn’t telling you the entire story.

On some forums I drive certain people crazy by mentioning this fact every time there is an uptick in Web marketer chatter about supposed “major Google algorithm updates”. There are major Google algorithm updates (although there is no technical explanation for what that means). Google makes massive changes several times a year. For over a decade I have talked about Google’s Spring Cleaning and Fall Flurries, although they can release a major upgrade to the search system (or rewrite everything completely) at any time of the year. As a computer programmer who has participated in many large-scale projects, I understand why you get only a few big changes per year and lots of little ones all year long. It’s not easy to switch over to a whole new set of instructions (and data). The software people’s job is to make it look easy, but it never is.

Those incremental changes add up over time and they play important roles in determining whose traffic goes up and down. I remember an older, wiser gentleman named Barry telling a younger me one day, “It’s a recession if it happens to someone else; it’s a depression if it happens to you.” We were, in fact, talking about a recession that had just been announced (it was probably over by the time everyone heard about it). But Barry’s observation was as true of changes in search referral traffic as it was of real economic events. When you see other people online complaining about lost traffic, that’s just bad SEO on their part. When it’s you who lost the traffic, you did nothing wrong; it must have been a major Google update.

Someone else’s bad SEO = recession.

Your lost traffic due to major Google update = depression.

Assuming your site was directly affected by some major algorithmic update (as confirmed by Google in one of their general purpose “we made large-scale changes” announcements), it could also be affected by all these continuing daily changes just as it was before.

More likely, in my opinion, something else is happening when your traffic declines gradually. It could be something over which you have no control. It could be something you can easily resolve.

Not Every “Major Google Update” Really Is

Google doesn’t confirm every wild speculation about massive updates the Web marketing community produces. There are a small number of individuals who have made a career out of declaring major Google updates on the basis of, say, changes in search referral traffic for two Websites. Honestly, even if you could document lost traffic on 200 Websites, that’s a far cry from showing that Google changed things for the approximately 200 million Websites in their index.

But let’s not quibble over numbers. Instead, let’s talk about popular SEO plugins. I shall not mention any brand names because I know (thanks to my long memory) that several of the most popular SEO plugins have caused Websites to lose traffic. Earlier this year one very popular plugin had a very bad unhappy day and we’re still seeing people talk about how it tanked their Websites. Okay, mistakes happen. And sometimes it takes a while for everyone to realize exactly what did happen.

What we know for sure is that an unknown (but fairly large) number of Websites were hit hard by at least one software bug this year around the time that many people believed an unannounced Google update had been released. So that complicates things for everyone who consults in SEO, because when people ask for help diagnosing lost traffic one of the first things we experts all do is ask “when did this happen?” and then consult a timeline of known and suspected Google updates. Yes, I do it too. That’s just the prudent thing to do.

But I never assume that any changes in traffic must be due to a major Google change. In fact, many times people make changes to their Websites or start new linking campaigns around the times these (supposed) updates occur. How do you determine that it was Google and not you that tanked your traffic? There is no fast, simple, easy way to do that. You have to roll up your sleeves and look at everything before picking a plausible explanation.

*=> Plausible explanations are not facts. Once people decide they have a plausible explanation for something happening, they begin making critical decisions. If you’re just gabbing online about UFO stories, one plausible explanation is as good (or bad) as another. No harm done. But I’ve watched many frustrated Website owners make unnecessary, potentially damaging changes to their sites because they had accepted a plausible explanation as a fact.

Just because an explanation is plausible doesn’t mean it’s correct. In a room full of plausible explanations, you will have at most one correct explanation (and oftentimes none of them are correct). Being plausible doesn’t give an explanation any advantage over implausible explanations. And Occam’s Razor won’t help you out because most people don’t even know what Occam’s Razor is (it’s NOT “the simplest explanation is usually the correct one”).

A plausible explanation is a hypothesis. Treat it as such. Try to break it. Disprove it in any way you can. Once you’ve proven a plausible explanation wrong, you move on to the next. When you have exhausted all means of disproving the plausible explanation it’s still just a hypothesis. But at this point you can begin making some critical decisions.

The takeaway here is simple: Never assume it was a Google algorithm change.

Maybe there was an algorithm change, but that doesn’t mean it affected your site in a bad way.

Algorithmic Changes Produce “Aftershocks” In The Data

I usually refer to these things as “cascade events”. But a single change to any part of a search engine’s system could potentially have multiple residual and/or echo effects on an individual Website (or many Websites). Here are a couple of illustrative examples:

*=> Poisoned PageRank Spreads Across The Web

If you’ve been following Google’s evolution of PageRank for many years, then you know that they used to calculate all the PageRank at once and then upload it to the search system. Now they use a real-time methodology that doesn’t update every site’s PageRank immediately (or at least not in the same original “Google Dancey” way).

Since 2011 we have documented a number of cases where changes in Website backlink profiles took weeks or months to fully “shake out”. I’m not going to share the details of the experiments, but think of the telephone game, where you whisper a message into someone’s ear and then they whisper the message into the next person’s ear.

Gradually the message is passed down the line, but by the time it reaches the last person it has usually morphed into something ridiculous. PageRank flows from document to document in much the same way. It’s altered at each stage, either by filters that dampen the value, by different allocations of incoming PageRank, or by different allocations of outgoing PageRank. This happens continually across the index as Google updates it.

So the page that sent you X PageRank yesterday may no longer send you any PageRank. Maybe the link was dropped from the page. Maybe the page was expired from the index and hasn’t been recrawled yet. Maybe the page violated some guideline and the filters prevent it from passing PageRank to your site. Maybe you disavowed a good link.

There are many reasons why a link that was good and helpful yesterday may no longer help you today.

Whether Google uses a “negative PageRank” valuation is irrelevant. PageRank poison is whatever causes less PageRank-like value to flow to your page from the rest of the Web.
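To make the “telephone game” idea concrete, here is a toy sketch of PageRank-like value flowing through a tiny three-page link graph. It is nothing like Google’s production system; the graph, damping factor, and iteration count are illustrative assumptions. Notice that when the upstream page drops its link, the page two hops downstream loses value even though its own inbound link never changed.

```python
# A toy sketch (not Google's actual system) of PageRank-like value flowing
# through a link graph, and of how dropping one upstream link changes scores
# further downstream after more iterations.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "A": ["B"],   # an upstream page links to B
    "B": ["C"],   # B links to C (think of C as your page)
    "C": [],
}
before = pagerank(graph)

graph["A"] = []   # the upstream page drops its link to B
after = pagerank(graph)

# C's score drops even though the link from B to C never changed.
print("C before:", round(before["C"], 3), "C after:", round(after["C"], 3))
```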

*=> It’s Not You, It’s The Guy Linking To You

So if you lose PageRank, does that mean you did something wrong? Assuming you don’t create spammy links, you probably did nothing wrong. But your upstream PageRank sources could screw you out of a lot of value in many ways. Here are a few examples:

  • They add “nofollow” to all outbound links
  • They remove old outbound links
  • They remove all outbound links
  • They disavow their own backlink profiles
  • They receive a manual action
  • Their backlink profiles are algorithmically filtered
  • They sell their site and the new owners replace the links
  • The sites go offline

Are you exhausted? I could go on like this all day. There are so many ways your Website can lose PageRank-like value through no fault of your own that you will never be aware of just how much value comes and goes in your backlink profile.

There Could Be Other Types Of Aftershocks, Too

If you run a large site and Google rolls out a massive new update, it may take them weeks or months to recrawl and re-evaluate all the pages on your site. The “major update” could entail several changes that affect every page on your site. They could change how they compute PageRank-like value, how they decide which content to index, how they index the content, and how one page on your site helps or hurts other pages on your site.

Any small change, such as when they add a new document classifier to their system, could gradually change how your site performs in the search results. A document classifier is a computer program, an “algorithm”, that performs a specific task. Think of it as one little soldier in a long line of soldiers marching down the road. Every soldier has to pick up a rock and examine it. Some rocks are examined by many soldiers one after the other. Other rocks are examined by only a few soldiers. Each soldier is looking for something specific, although two or more soldiers could be looking for similar things.

A document classifier might set a flag that tells the search system to run another document classifier, to ignore a Web document or its contents, or to send an alert to a human. A document classifier might compute some sort of Information Retrieval score that is combined with other scores when the document is served up in a search result.
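Here is a highly simplified sketch of that “soldiers examining rocks” idea: a chain of small classifiers, each inspecting a document, setting flags, or adding a score, with one classifier able to trigger (or short-circuit) another. The classifier names, thresholds, and signals are invented for illustration; this is not Google’s pipeline.

```python
# A toy classifier chain: each "soldier" examines the document and either
# sets a flag, triggers a follow-up check, or records a score for later use.
# All names and thresholds here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Document:
    url: str
    text: str
    flags: set = field(default_factory=set)
    scores: dict = field(default_factory=dict)

def thin_content_classifier(doc):
    # Flags pages with very little text for a follow-up check.
    if len(doc.text.split()) < 50:
        doc.flags.add("needs_quality_review")

def keyword_stuffing_classifier(doc):
    # Only runs if an earlier classifier raised the flag.
    if "needs_quality_review" not in doc.flags:
        return
    words = doc.text.lower().split()
    if words and words.count(max(set(words), key=words.count)) / len(words) > 0.3:
        doc.flags.add("do_not_index")

def relevance_scorer(doc):
    # Computes a crude score that a ranking stage could combine with others.
    doc.scores["length_score"] = min(len(doc.text.split()) / 500.0, 1.0)

PIPELINE = [thin_content_classifier, keyword_stuffing_classifier, relevance_scorer]

doc = Document(url="https://example.com/page", text="buy cheap widgets " * 10)
for classifier in PIPELINE:
    classifier(doc)
print(doc.flags, doc.scores)
```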

It takes time for a search engine to process billions upon billions of Web documents. So even though “the change went live today”, it may be weeks or months before the change gets to that one critical page that you think is doing great.

In other words, while everyone is complaining about the latest (supposed) Google algorithm change, your lost traffic could be the result of the last algorithm change. You have no way of knowing for sure just by looking at when you lost your traffic.

How Do You Teach A Learning Algorithm, Anyway?

This is something that has confused people since 2011, when Google released the Panda algorithm. At the time Google engineers had been using machine learning systems for years, but Panda was the Great Mother of All Machine Learning Search Engine Updates. For several months Google engineers watched a very lengthy discussion where they asked people to share URLs for sites they (the Webmaster community, the site owners) felt had been unjustly hurt by the Panda algorithm.

After a few months Google ran the algorithm again and released an update. And then a month or two later they did it again. And for several years Google intermittently released updates to the Panda algorithm – or, technically, they rescored the Web (if I may put it that way for convenience).

So what changed? The algorithm itself was used to determine signals that could be used to separate the Web index into “high quality sites” and “low quality sites”. The signals had to be integrated into document classifiers. That would take time. And then the Web had to be recrawled and re-evaluated against the new signals. In this way, some sites that suffered early on were released from the Panda Prison and some sites that had escaped Google’s Judgment earlier were finally caught and punished.

The thing about learning algorithms is that they must analyze data. The classic learning algorithm model requires a “training set”, where human engineers divide the data into “desired” and “undesired”. The learning algorithm then looks for ways to match that training set as closely as possible.
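As a rough illustration of that classic setup, here is a minimal sketch using scikit-learn: human-labeled “desired” and “undesired” examples go in, and the algorithm fits a rule that matches those labels as closely as possible. The two toy signals and the labels are entirely made up; nothing here reflects signals Google has published.

```python
# A minimal supervised-learning sketch: humans label a training set as
# "desired" (1) or "undesired" (0), and the algorithm fits a rule to it.
# Requires scikit-learn; the features and labels are invented toy data.
from sklearn.linear_model import LogisticRegression

# Each row is a page described by two toy signals:
# [words_of_original_content, ads_above_the_fold]
training_pages = [
    [1200, 1], [900, 0], [1500, 2],   # labeled "desired" by human raters
    [150, 6],  [80, 4],  [200, 8],    # labeled "undesired"
]
labels = [1, 1, 1, 0, 0, 0]

model = LogisticRegression()
model.fit(training_pages, labels)

# The trained model is then applied to pages the raters never saw.
new_page = [[300, 5]]
print("desired" if model.predict(new_page)[0] == 1 else "undesired")
```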

The data that Google and other search engines using learning algorithms require comes from one place: the Web. So every time a Googler says “we’re using a new learning algorithm”, that means they ran a lot of data through a lot of hoops before settling on some solution. They don’t always explain what the specific problem was.

*=> The RankBrain Algorithm Doesn’t Judge Websites

Even today I still see Web marketers “explaining” how their sites were hit by RankBrain (or, worse, “how to optimize for RankBrain”). The RankBrain algorithm, according to Google when they announced it, evaluated queries people typed into the search engine. RankBrain compared those queries to previously entered (stored) queries for which results had already been found. About 30% of the time, RankBrain suggested substituting an older query for a new query and the query resolution system used RankBrain’s suggestion. Google doesn’t talk about RankBrain any more, so we don’t even know if they still use it.

But if that’s all RankBrain does, then why was it so important? I’d say it reduced their processing time by millions of CPU seconds every day. RankBrain probably reduced Google’s query resolution costs by suggesting that previously determined result sets be used. RankBrain made Google faster, although I would say they sacrificed a little bit of quality to achieve that performance improvement. Many times over the past few years I have had to rewrite queries with ridiculous exception rules because Google insists on showing me the same results over and over.
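To illustrate the query-substitution idea (and why it saves processing time), here is a toy sketch that compares an incoming query against previously resolved queries and reuses a cached result set when one looks close enough. The similarity measure, threshold, and cached URLs are my own illustrative choices, not anything Google has disclosed about RankBrain.

```python
# A toy query-substitution cache: reuse the result set of a sufficiently
# similar, previously resolved query instead of resolving from scratch.
# The similarity measure, threshold, and URLs are illustrative assumptions.
from difflib import SequenceMatcher

# Previously resolved queries and their cached result sets (hypothetical URLs).
resolved_queries = {
    "how to fix a leaking kitchen faucet": ["https://example.com/faucet-repair"],
    "best hiking boots for wide feet": ["https://example.com/boot-guide"],
}

def find_substitute(new_query, threshold=0.75):
    """Return a cached result set for a sufficiently similar older query, if any."""
    best_match, best_score = None, 0.0
    for old_query in resolved_queries:
        score = SequenceMatcher(None, new_query, old_query).ratio()
        if score > best_score:
            best_match, best_score = old_query, score
    if best_score >= threshold:
        return resolved_queries[best_match]
    return None  # fall back to resolving the query from scratch

print(find_substitute("how do i fix a leaky kitchen faucet"))
```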

*=> A Learning Algorithm Needs Time To Learn, But You Won’t See It

While it may be possible that Google is using real-time learning algorithms (I’ve read a few patent applications that suggest they have developed some real-time learning systems), they usually do the “learning” stuff offline, in the background, where you and I don’t see it. They’ll run experiments in the search results and collect data, but we’re most likely NOT privy to the algorithmic learning process.

Still, the Web changes every day. People add new content, delete content, and change content. That includes all the HTML code that is used to present pages to users and crawlers. That includes all the widgets and funky JavaScript code you run into. That includes everything: every image, every video, every detail.

Imagine how much data a learning algorithm must process just to learn something, and yet a substantial part of that data is constantly changing. You may change your site from “good to bad” or “bad to good” by some older algorithm’s measure and never know it.

Last Words About Why Traffic Continues To Drop Week After Week

It’s not so easy to explain something that is so frustratingly simple to document. All you have to do is look at an analytics chart. But I saved the best for last. Sometimes the search referrals don’t drop off. Sometimes your analytics code stops working. It’s always a good idea to compare your analytics data to at least two other sources.

Most people I know only use Google Analytics. You are like an Ood holding your brain in your hands when you do that. You’re very vulnerable to any single threat to your data collection. That could include a breakdown in a plugin’s functionality, or any random mistake you make when updating your site.

When you use server-side analytics (such as Google Search Console and Bing Webmaster Tools reports) you at least see consistent data. You will see a lot of data that Google Analytics doesn’t collect. I’m not a fan of Google Analytics, but it’s no worse than any other remotely hosted, third-party analytics package in this respect. If you don’t validate your analytics data you may be all in a panic over “lost traffic” that wasn’t really lost.
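If you want to automate that sanity check, here is a minimal sketch that compares daily pageview counts from two independent sources, say a Google Analytics export and counts derived from your raw server logs, and flags days where they diverge sharply. Sharp divergence usually points to broken tracking rather than a real traffic loss. The file names and CSV layout are assumptions.

```python
# A minimal sketch: compare daily pageview counts from two independent
# sources and flag days where they diverge by more than a tolerance.
# File names and the "date,pageviews" CSV layout are assumptions.
import csv

def load_daily_counts(path):
    """Expects rows of: date,pageviews (no header row)."""
    counts = {}
    with open(path, newline="") as handle:
        for date, pageviews in csv.reader(handle):
            counts[date] = int(pageviews)
    return counts

def find_discrepancies(source_a, source_b, tolerance=0.25):
    """Yield days where the two sources differ by more than the tolerance."""
    for date in sorted(set(source_a) & set(source_b)):
        a, b = source_a[date], source_b[date]
        if max(a, b) and abs(a - b) / max(a, b) > tolerance:
            yield date, a, b

analytics = load_daily_counts("google_analytics_daily.csv")   # hypothetical export
server_logs = load_daily_counts("server_log_daily.csv")       # hypothetical export

for date, a, b in find_discrepancies(analytics, server_logs):
    print(f"{date}: analytics={a} server_logs={b} -- check your tracking code")
```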

One question I frequently ask people who worry about lost rankings is, “How are your sales doing?” If you are not actually losing money (based on how much you believe you should be earning), then you probably don’t have to worry about whether Google just released a major update.

But it would be a good idea to audit your analytics data collection, just in case something broke somewhere.

References

If you’re looking for advice on how to diagnose a Website that is losing traffic, try these articles.

If, after reading these articles, you’re still feeling lost, my partner Randy Ray and I offer Website auditing and consulting.
