7 Ways You Do SEO Wrong

There are many discussions about search engine optimization on social media: you’ll find them on Facebook, LinkedIn, Twitter, and more traditional Web forums.  The most popular SEO bloggers are essentially running discussion forums in their comment sections, too.  It’s easy to find the worst misinformation being shared in the Web marketing community.  Virtually every popular idea that is shared over and over again is wrong.  These claims have been repeatedly debunked by numerous SEO bloggers and conference presenters, by search engine employees, and in some cases even by the news media.  And yet Web marketers continue to share bad advice openly, freely, and matter-of-factly.

7 Ways You Do SEO Wrong: Why popular, common SEO tips are wrong and what you should do instead.

What’s worse is that people who are well and widely respected will do their part to debunk some of the nonsense, only to turn around and share other nonsense.  It is hard to keep up with all the changes in digital marketing.  Every month I take some of SEO Theory’s old content offline because it’s outdated.  My autoshares have not gone without criticism.  But while I don’t excuse my shameless autosharing of older, outdated material, that’s not as bad as people who should know better continuing to spread false information today.

It isn’t that they haven’t heard the facts.  It is that they have rejected the facts in favor of SEO mythologies.  A “myth” is not necessarily untrue, but SEO mythologies are largely built on outdated information (because we constantly share it) and really bad analysis.  Bad analysis is the hallmark of the most popular SEO “thought leaders”.  They are good at presenting their ideas and terrible at vetting them.  If this industry had acceptable standards these people would be laughed off the stages of their own conferences.

Let’s take a look at some of the most egregious mistakes that a lot of you are making.

1. You Test Mobile PageSpeed via Desktop Connections

Everyone with any skin in the SEO game seems to have gotten the memo about how important mobile optimization is.  There are still plenty of Websites that are not mobile friendly, but that’s an entirely different issue.  It’s all but impossible to find a Web marketing meeting, conference, or presentation that doesn’t devote at least a passing statement (if not an entire presentation, panel, or track) to advising everyone to get up to speed on mobile optimization.  So we’re good on the preamble.  But the devil is in the details, and one of the details that appears to have eluded the majority of Web marketers is a very important one.  I say “majority” because so far I seem to be the only person talking about this.

You have to test your mobile pages at MOBILE SPEEDS.  What is a “mobile speed”?  It’s a broken 4G connection that downgrades to a broken 3G connection that downgrades to a broken 2G connection that downgrades to a broken 1G connection.

Over 80% of the mobile audience across the globe is still connecting to your mobile sites at no better than 3G connections.

So when you submit your URLs to “page speed” test tools, do they give you the option of testing at mobile speeds?  If not, then what speeds are you testing at?

Worse, do you just hit [CTRL] + [SHIFT] + [I] to see what your pages look like in mobile mode?  Are your developers NOT using multiple smartphones over cheap Wi-Fi connections to see what happens?

Simulation is killing your mobile testing.  You need to get out there in the real world where people actually use their smartphones and test your sites THERE.  At the very least, set your testing tools to ONLY test page speeds at 3G.  Yes, some people get 4G connections.

A steady 4G connection is a few seconds faster than a steady 3G connection.  That “3 second rule” you keep shooting for?  It’s 14 seconds on 4G.  It’s 19 seconds on 3G.

What You Should Do: There are several page speed testing tools that allow you to set specifications to 3G and other connection types.  I don’t want to recommend one over others.  Look at your preferred tools and see if they give you the option of clocking downloads at true mobile speeds, or if their reports at least provide that information.  Desktop download speeds of 3-5 seconds usually translate into 15-20 second 3G downloads, but don’t make assumptions.  Test at the right speeds.
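If you prefer to script the check yourself rather than rely on a third-party tool, the sketch below shows one way to time a page load over a throttled connection using Puppeteer and Chrome’s DevTools Protocol.  The URL, throughput, and latency values are placeholders for illustration, not an official “3G” profile, so adjust them to whatever connection you actually want to model.

```typescript
// Minimal sketch: time one page load over a throttled, 3G-like connection.
// Assumes Puppeteer (npm install puppeteer); the URL and numbers below are placeholders.
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();

  await client.send('Network.enable');
  await client.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 400,                          // added round-trip latency in ms
    downloadThroughput: (400 * 1024) / 8,  // ~400 Kbps down, in bytes per second
    uploadThroughput: (400 * 1024) / 8,    // ~400 Kbps up
  });

  const start = Date.now();
  await page.goto('https://example.com/', { waitUntil: 'load' });
  console.log(`Load event fired after ${((Date.now() - start) / 1000).toFixed(1)} s at ~3G`);

  await browser.close();
})();
```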

2. You “Speed Up” Web Pages by Slowing Down Client Connections

Too many people are still talking about using prefetch to “speed up” Websites.  Now, there are different kinds of prefetches.  DNS prefetching is not so bad unless you’re linking to 100 or more hosts on a page.  Every time you instruct a mobile browser to prefetch something, anything, you are using the mobile user’s bandwidth without their permission.  That costs people money.  Worse, it slows down their Internet connections.  Why are you DELIBERATELY slowing down your users’ connection speeds?

It doesn’t matter how many page speed testing tools tell you to implement prefetching.  They are wrong to do so.  Worse, page prefetches won’t work with HTTPS URLs so why are you adding useless bytes to the page?
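If you want a rough sense of how many resource hints (and how many distinct hosts) a page is already asking the browser to deal with, here is a crude sketch.  It uses naive regular expressions purely for illustration and assumes Node 18+ for the global fetch; a real audit should use a proper HTML parser.

```typescript
// Rough sketch: count resource-hint <link> tags and distinct hosts referenced in a page's HTML.
// Naive regexes for illustration only; use a real HTML parser for anything serious.
const pageUrl = 'https://example.com/'; // placeholder URL

async function auditResourceHints(url: string): Promise<void> {
  const html = await (await fetch(url)).text();

  const hintTags =
    html.match(/<link[^>]+rel=["'](?:dns-prefetch|preconnect|prefetch|prerender)["'][^>]*>/gi) ?? [];

  const hosts = new Set<string>();
  for (const candidate of html.match(/https?:\/\/[^/"'\s>]+/gi) ?? []) {
    try {
      hosts.add(new URL(candidate).hostname);
    } catch {
      // ignore strings that only look like URLs
    }
  }

  console.log(`Resource-hint <link> tags: ${hintTags.length}`);
  console.log(`Distinct hosts referenced: ${hosts.size}`);
}

auditResourceHints(pageUrl).catch(console.error);
```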

You live in a world where 4 kilobytes really matters.  Sure, your server has unlimited bandwidth but your mobile users do not.  Even if their cellular plans provide something called “unlimited bandwidth” you should know by now what happens when 21 people crowd into a 20-user Wi-Fi space.  There is no such thing as “unlimited bandwidth” in the mobile world.

Another problem that has only just recently surfaced (and so most of you haven’t even heard this once) is that HTTP/2 runs SLOWER over broken 4G and broken 3G connections than HTTP/1.1.  Well, that sucks, doesn’t it?  So you convinced your company to invest in HTTPS and HTTP/2 connections because they are so much faster on the desktop but meanwhile people standing in shopping centers around the world cannot download your huge pages.

When it comes to managing site speed there are two components to the speed measurement:

  1. Speed of server
  2. Speed of client (mobile device, desktop, etc.)

You can speed up the server.  You can only slow down the client.  That really sucks but it’s reality.  No matter how many grandiose SEO experts tell you otherwise, you cannot speed up the user’s mobile experience.  All you can do is TRANSMIT LESS DATA and hope the smart phone gets it all before the connection craps out.  Your goal as an SEO specialist is to figure out:

  • How to slow down clients less than other sites do
  • How NOT to suck over really slow connections

Technically, these responsibilities should fall under Web design and development, but the SEO community picked up the “put mobile first” banner, so it needs to understand what it is advising people to do.  You’re hurting the user experience with bad advice.
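Since “transmit less data” is the only lever you really have on the client side, it helps to know how many bytes a page actually pushes over the wire.  Here is a minimal sketch, again assuming Puppeteer and a placeholder URL, that sums the compressed transfer size of every response during one page load.

```typescript
// Minimal sketch: total on-the-wire bytes for a single page load.
// Assumes Puppeteer; the URL is a placeholder.
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();
  await client.send('Network.enable');

  let wireBytes = 0;
  client.on('Network.loadingFinished', (event) => {
    wireBytes += event.encodedDataLength; // compressed, as-transferred size of each response
  });

  await page.goto('https://example.com/', { waitUntil: 'networkidle0' });
  console.log(`Transferred over the wire: ${(wireBytes / 1024).toFixed(1)} KB`);

  await browser.close();
})();
```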

What You Should Do: Practice surfing the Web without the benefit of your in-home/office Wi-Fi network.  Watch how long it takes pages to load from a variety of sites when you are running on a 3G connection.  Most of the world will be stuck on 3G for several more years.  Then they will be stuck on 4G for years after that.  5G is coming but it will be a long time getting here.

3. You Mistake “Crawl Budget” for “Crawl Management”

Search engines set and manage “crawl budget”.  You just need to manage crawl on the Website.  These are two completely different concepts.  Even this week I saw at least one well-known, widely respected, knowledgeable SEO specialist talk about “crawl budget” on social media.

We should not be having this conversation AGAIN.

The SEO specialist has to manage crawl on the server, which has nothing to do with “crawl budget”.  The search engineers may call this “crawl cap” (as Shari Thurow likes to point out) or they may call it something else entirely.  But if they speak about “crawl budget” they are trying to talk to us in our terms.  And what they usually have to say is: you can’t do anything about crawl budget.

Using “rel=’nofollow’” attributes on internal links hurts crawl efficiency.  Some people think it improves crawl.  Nope, this is PageRank Sculpting, which is stupid and counter-productive.  PageRank Sculpting has always been stupid and counter-productive.  PageRank Sculpting is the reason why so many of you still believe that sub-folders (directories) work better than subdomains.

How should the SEO specialist manage crawl?  Here are a few suggestions:

  • Block folders that should not be indexed (or noindex the pages)
  • Implement 30x redirects as required
  • Replace 404 status URLs with useful, meaningful content that supports normal site navigation
  • Implement reasonable alternative site navigation structures
  • Limit unnecessary self-promotional links, especially in page body copy
  • Ensure that XML sitemap files are updated
  • Ensure that internal navigation is updated
  • Block SEO crawlers, rogue bots, and non-referring search engines (see the sketch after this list)
  • Block “business intelligence” crawlers (if you differentiate them from the previous group)
  • Vet RSS feed fetching tools (do they send traffic or facilitate social shares?)
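To make the “block SEO crawlers and rogue bots” suggestion concrete, here is a minimal application-level sketch assuming an Express server.  The user-agent substrings are illustrative examples, not an authoritative block list, and the same rule can just as easily live in your web server or firewall configuration instead.

```typescript
// Minimal sketch: refuse requests from crawlers you have decided not to serve.
// Assumes Express; the user-agent substrings below are examples, not a definitive list.
import express from 'express';

const BLOCKED_UA_SUBSTRINGS = ['AhrefsBot', 'SemrushBot', 'MJ12bot', 'DotBot'];

const app = express();

app.use((req, res, next) => {
  const ua = (req.headers['user-agent'] ?? '').toLowerCase();
  if (BLOCKED_UA_SUBSTRINGS.some((bot) => ua.includes(bot.toLowerCase()))) {
    res.status(403).send('Crawling not permitted.'); // stop spending server resources on this bot
    return;
  }
  next();
});

app.get('/', (_req, res) => {
  res.send('Hello');
});

app.listen(3000);
```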

You should not be crawling client Websites if you have access to back-end data.  You absolutely should not be crawling competitor Websites: that’s unethical.

Crawl budget is not crawl management.

You cannot manage crawl budget.

You can and should manage who crawls your server, when, and how.

What You Should Do: Focus on crawl management.  Ignore any SEO advice that uses the phrase “crawl budget”.  It’s clearly misinformed advice.

4. You Disavow Harmless Sites that Actually Help Your Site

The Bruce Clay Agency recently shared the top ten domains that people submit to their disavowed sites index.  None of those domains should be disavowed.  These bad choices in what to disavow reflect the erroneous thinking that has been applied to the whole question of what should be disavowed.

In fact, just this week Googler Gary Illyes addressed the very serious disavow problem but I believe he gave out bad advice.  He suggested that people should only disavow sites they don’t trust.

So let’s see a show of hands: How many of you distrust Websites you have never heard of, or Websites that link to your site thousands of times over?  In my experience most people would raise their hands.

It’s not that you should disavow sites you don’t trust.  You’re already doing that and you are disavowing the wrong sites.  Suspicious Websites don’t get your client sites penalized.

What kinds of links should you disavow?

  • Links you (or your client) deliberately placed to manipulate search results (targeted anchor text)
  • Links your (client’s) predecessor deliberately placed to manipulate search results
  • Links a competitor placed to hurt your site (but only if you can verify this)
  • Links you (or your client) paid for
  • Links your (client’s) predecessor paid for
  • Links a search engine spam team (not “Top Contributors”) says are hurting your site (via search dashboard message)

If you are not sure about who created the link and why, you probably don’t need to disavow it.  If some “expert” or “Top Contributor” in the Google support forums tells you such-and-such links are spammy, and you don’t know who created them or why, ignore the “expert” or Top Contributor.

It’s spam if it’s created by you or your client for your (client’s) SEO benefit.  Otherwise, it’s not spam.

These may be “low value” links but they are not “toxic” links.

Manipulative links are obvious to anyone who has created links for the sake of improving their rankings.  Manipulative links look exactly like the kinds of links you created.  They are found in exactly the kinds of places where you put your links.  They are placed on exactly the kinds of sites that you used to create the links to improve your search rankings.

If you’re in doubt about whether a link is bad for your (client) site, it’s probably a good, safe link.  If you don’t trust the site but you cannot put any links there yourself it’s most likely a safe site that should not be disavowed.

I’m not saying these links are going to do much for your (client) site.  I’m just saying you don’t have any reason to disavow them.  Worse, if they are helping your site you’re shooting yourself in the foot.

People create all sorts of crazy automated linking Websites.  That doesn’t make them spam.  If it did, we’d all be dealing with manual penalties; otherwise, the search engines should simply ignore the links.  And if the search engines are not ignoring these automated links, but also are not handing out spam penalties because of them, then you need to ignore them.

We get thousands of these links.  We have never been penalized.  We have never disavowed them.

What You Should Do: Create a checklist of page quality indicators that doesn’t include “length of article”, “domain authority”, “page authority”, etc.  The checklist should include items such as “why does this page exist?” and “what is the page supposed to be doing?”  What does the content on the page itself indicate about the page’s purpose?  Learn to judge page quality without SEO metrics.  They are usually wrong.

5. You Use Search Visibility Reports

There are few tools in the SEO industry that waste more time and money than search visibility reports.  These reports attempt to show you how many queries a given Website can be found in.  These reports do not in any way, to any degree, provide reliable estimates of search referral traffic.

You cannot use search visibility reports for competitive analysis.  That’s like asking the kids down the street to estimate how many eggs are in your refrigerator (and you don’t let them see your refrigerator).

You cannot use search visibility reports as substitutes for real traffic data.  While it would be great to have access to your competition’s Search Console and Bing Toolbox data, you don’t.  There is no tool out there that can tell you how much traffic those sites receive.

Note: If you subscribe to an SEO tool that collects referral data from real Websites, the tool vendor has a legal obligation to protect everyone’s data.  It should be careful about what it shows you based on real competitor search referral data.  If it is showing you exactly what your competitors share with it, don’t be stupid and blog about it.

Search visibility reports are based on wild guesses.  What’s worse, if they use any sort of crawling, those crawls are always coming from Web servers or compromised computers (their IP addresses look suspicious to the search engines).  And if the crawls don’t complete within an hour they are probably aggregating search results data from multiple index updates.

You just cannot trust search visibility reports in any capacity.  Although I used to talk about computing search visibility, I never meant for the industry to treat search visibility estimates as reliable substitutes for real data of any kind.

You really are wasting your time and money with search visibility reports.  They don’t tell you when Google rolls out an update, they don’t tell you when a Website has been penalized, and they don’t tell you anything about who is clicking on which search listings.  In fact, here is one of the worst flaws in the search visibility reporting industry: they treat all queries as absolute.

How well do you understand search results page performance?  Did you know that …

  • A Website may have more than 1 listing in a search result
  • A query may produce featured snippets sporadically
  • A query may produce different featured snippets throughout the day
  • A query may show paid listings sporadically
  • A query may be more popular at certain times of the day
  • A query may be more popular in some locations than others
  • A query may be more popular on some devices than others
  • A query may be more popular on some operating systems than others
  • A query may be more popular on some browsers than others
  • Most people do not click on the first result
  • Some lower results may receive more clicks than first results
  • Many people click on multiple listings in a search result
  • Many people change the query when they don’t like the results

Your beloved search visibility reports don’t take these kinds of disturbances in the Force into consideration.  The search results are not fixed points in time.  They are clay that is constantly being reshaped and molded.

What You Should Do: First, give up this dream of spying on other people’s traffic.  Unless you hack their search engine accounts (and you should NOT) you won’t see their traffic.  Second, limit your “spying” to looking for ideas to improve your content, keyword goals, and relationships with other sites.  Practice earning links because that’s the only long-term SEO link strategy that works.

6. You Use SEO Tool Metrics to Judge Link Quality

I roll my eyes every time I come across an online discussion or SEO case study that talks about “links from high DA / Trustflow / AR” sites.  I understand the intuitive desire to find link resources that Bing and Google think are “high quality” but none of these SEO tools can tell you if the linking pages are:

  • indexed in Bing or Google
  • passing positive value in Bing or Google
  • helping any of their link destinations to rank better

Since these tools cannot tell you what you need to know, why do you place so much value in them?  Maybe it makes you feel good to see lots of numbers beside these links, but if you don’t even know whether the search engines have indexed the links then what does it matter what their third-party metrics are?

I understand that people want to find good sources for links.  Link research tools will probably never go out of style.  But you place way too much value on these tools.  Because they cannot qualify the links the way they should be qualified, they substitute their own data for the search engine data you cannot get.

You should be suspicious of any attempt to use substitution data for any reason in search engine optimization.  That’s equivalent to collecting data from 10 million flu patients in 2010 to use as a basis for analyzing a cholera epidemic in 2020.  You may eventually find some overlap in the data but that doesn’t make the substitution valid.

When you use the wrong data to make a judgment you make the wrong judgment.  There is no acceptable way to rationalize that.

You can argue there are other reasons to use link research tools; I won’t dispute that.  But if you’re judging the quality of backlinks on the basis of what you find in SEO link research tools you are doing this wrong.

Link research tools may fill in some gaps if the search engines don’t show you all the links they know about (and they won’t tell you if there are more links in their indexes than they’ll report to you).  But you still need to vet the links you find in those backlink reports from other resources.  At the very least, check to see if they are indexed in the major search engines.
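If you want to spot-check whether linking pages are indexed, one low-tech approach is to build “site:” lookup URLs you can open by hand.  This is only a sketch with placeholder URLs; scraping the results programmatically would violate the search engines’ terms of service, so keep the actual checking manual.

```typescript
// Sketch: turn a list of backlink URLs into "site:" lookup links for manual spot-checking.
// The backlink URLs below are placeholders.
const backlinks: string[] = [
  'https://example.com/some-linking-page',
  'https://example.org/another-linking-page',
];

for (const url of backlinks) {
  const query = encodeURIComponent(`site:${url}`);
  console.log(`https://www.google.com/search?q=${query}`);
  console.log(`https://www.bing.com/search?q=${query}`);
}
```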

What You Should Do: Practice link research without the SEO tools.  If you don’t know what link spam looks like these tools are the worst way to learn about it.  And stop obsessing over “relevant links”.  Read this article about irrelevant links if they frighten you.

7. You Include More than 1 Day’s Worth of Data in Your Analyses

This relates specifically to how many of you use Google Search Console data that you download.  Especially for very busy Websites, the GSC data may be sampled and is already averaged or otherwise aggregated.  Analyzing samples, averages, and aggregated data produces very unreliable information.  Your analysis is flawed from the beginning.

If you download a single day’s worth of data at a time, however, you’ll get better results than if you download multiple days’ worth of data at a time.  There is no minimum acceptable threshold.  2 days’ worth of data is just as bad as 90 days’ worth of data.

Although it may save you time to widen your reporting window to 90 days, you’re just wasting your time.  You need to perform 90 downloads to get started and a daily download thereafter.

And you must do that for every type of data you collect from Google Search Console.
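Structurally, the routine looks like the sketch below: one request and one file per calendar day, never a combined date range.  The fetchSearchConsoleDay() helper is hypothetical (swap in whatever Search Console client you actually use, with the start and end dates both set to the same day).

```typescript
// Sketch of the "one day per download" routine.
// fetchSearchConsoleDay() is a hypothetical placeholder for your own Search Console client call.
import { writeFileSync } from 'node:fs';

async function fetchSearchConsoleDay(siteUrl: string, date: string): Promise<unknown[]> {
  // Hypothetical: request rows with startDate === endDate === `date` so they cover exactly one day.
  return [];
}

async function backfill(siteUrl: string, days: number): Promise<void> {
  for (let i = days; i >= 1; i--) {
    const date = new Date(Date.now() - i * 24 * 60 * 60 * 1000).toISOString().slice(0, 10);
    const rows = await fetchSearchConsoleDay(siteUrl, date);
    writeFileSync(`gsc-${date}.json`, JSON.stringify(rows, null, 2)); // one file per day
  }
}

backfill('https://example.com/', 90).catch(console.error);
```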

Furthermore, you cannot combine the days’ data if you are going to calculate anything, except for consecutive days where the estimated averages do not change.  For example, if you download 7 days’ worth of query data and the average rankings per query change every day, you cannot combine these 7 days’ data into a single set of numbers, not if you want to calculate something like Click Through Ratios.

On the other hand, if you download 7 days’ worth of data and the average rankings do not change from day to day for some of the queries then you can combine that data.  But you are still summing aggregated data on the basis of averages because the more clicks there are in your data the greater the chance that your “average position” really is an average of several positions.
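Here is a minimal sketch of that rule, assuming you already have one row per query per day (the field names are illustrative): combine nothing across days unless a query’s reported average position is identical on every day, and only then sum its clicks and impressions into a combined click-through ratio.

```typescript
// Minimal sketch: only combine a query's days when its average position never changes.
// Row shape mirrors a typical daily Search Console export; field names are illustrative.
interface DailyRow {
  date: string;       // YYYY-MM-DD
  query: string;
  clicks: number;
  impressions: number;
  position: number;   // average position reported for that single day
}

function combinableClickThroughRatios(rows: DailyRow[]): Map<string, number> {
  const byQuery = new Map<string, DailyRow[]>();
  for (const row of rows) {
    byQuery.set(row.query, [...(byQuery.get(row.query) ?? []), row]);
  }

  const ctr = new Map<string, number>();
  for (const [query, days] of byQuery) {
    const positionUnchanged = days.every((d) => d.position === days[0].position);
    if (!positionUnchanged) continue; // per the rule above: do not combine these days

    const clicks = days.reduce((sum, d) => sum + d.clicks, 0);
    const impressions = days.reduce((sum, d) => sum + d.impressions, 0);
    if (impressions > 0) ctr.set(query, clicks / impressions);
  }
  return ctr;
}
```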

There are other ways you misuse the data from Search Console, too.  It’s the only organic search referral data we have but it provides limited useful insight into what is happening in the search results.

The less precise your data is the more room for error you must allow in your analysis.  There is no formula you can apply to calculate the probable error.  You just have to be cautious about how you handle this data.  At the very least if you keep the daily totals separate then you can plot trend lines for query referrals, page referrals, and query-by-page referrals.

What You Should Do: Obviously I want you to use as few days’ data at a time as possible.  But on sites that receive less traffic you may be tempted to aggregate multiple days’ worth of data to “see what is going on”.  Technically, any SEO report is only valid if its data is captured during a single search index window (the time between changes to the index).  For Google that works out to about 3-18 hours, depending on when they release their index changes.  The more you aggregate data from different index windows the less reliable your analysis becomes.  With a small traffic site you cannot treat “average position” and “rankings” as usable data.  Just focus on the pages receiving traffic and the queries sending that traffic until you can build more traffic.

In Conclusion

Although there is a lot of pretty good information about what you should be doing for search engine optimization on the Web (and at conferences), the three categories of information where you are most likely to be misinformed by your favorite experts are:

  1. Page speed measurement
  2. Content quality
  3. Link quality

You need to push back on these kinds of posts and presentations.  If everyone really understands these topics so well, if they can explain what is going wrong so easily, then why do we still need these posts and presentations in 2017?  The Google Panda algorithm came out in 2011.  We’re still talking about “page quality” (actually, people are still confused and talk about “content quality”).  The Penguin algorithm was first released in 2012 and is now running on automatic with minimal human oversight.  We’re still talking about “link quality”.  These topics would have faded if people were sharing better information.  You’d get better guidance from all the “SEO 101” tutorials.  These topics would not be worth including in conferences.

Hence, you’re still being fed the wrong information.  Stop repeating it.  Question everything.

Honestly, we’ve shared some really detailed information about what real Web spam looks like in the SEO Theory Premium Newsletter.  We may share more in upcoming issues but now you don’t have to subscribe blindly hoping to see the right articles come along.  You can buy individual issues without subscribing today.

I recommend Volume 6, Issue 7 (February 17, 2017) because it includes the article “How Well Do We Vet Link Toxicity?”  You’ll also want to read Volume 6, Bonus Issue 2 (June 29, 2017) because it includes the article “Was There a June 25 Google Update?”  This article discusses spammy links in great detail.  Normally you would have to buy 8 issues to get one of the bonus issues but here is a special offer: You can buy that bonus issue directly and if you decide to buy 6 more issues you can get a 7th issue free.  That would technically keep you in compliance with our bonus issue policy (buy 1 regular, buy the bonus, buy 6 regular, get 1 regular free — in that order).

Another issue to check out is Volume 6, Issue 25 (June 2, 2017) because it includes the article “Link Value Tests”.  This is a more advanced article.  As you scan the list of back issues in the archive you’ll see several other articles about links.  Don’t go for the low-hanging fruit.  Be thoughtful about what you buy.  SEO is about far more than just links.

Why Not Just Share This Information Openly? The problem with outing spammy link practices in detail is that the link spammers learn from what we share and they change their methods.  We’re not trying to protect link spam here.  The search engines really do identify it very, very well.  We’re protecting the competitive advantage that our subscribers have over you.  If you’re not sure of what a spammy link looks like it’s because you’ve never built them and you’ve never been tutored by someone who has.  When you learn what kinds of links are really spam you have a much clearer idea of which links should be disavowed and why.  You’re far less likely to disavow links that are actually helping a site.  In today’s disavow-insane Web marketing world, that’s a HUGE competitive advantage.

The same goes for “quality content”.  Anyone who uses the phrase “quality content” really doesn’t know what it’s all about.  Content is not just words on the page (text).  It’s everything.  The search engines are not gauging the quality of your content, they are gauging the quality of your pages.  And, yes, we have discussed page quality extensively on this blog, where some of the better articles (in my opinion) are free to read.

