Myth 1: SEO is Voodoo or Snake Oil
There is a low bar to entry into the field of digital marketing, including and especially SEO. There are no real certification processes (because how would you certify something that changes every day?) and Google never publishes the algorithms, so there is no way to test an individual’s knowledge against what they contain.
Basically, when you hire an SEO provider it has to be based on trust.
This is why the myth that SEO is voodoo prevails. It prevails because bad practitioners did bad work and the client is left with no other way to explain their lack of results. In fact, it is often these bad practitioners who use the myth to explain their poor results.
That being said, SEO isn’t voodoo (or magic or “bovine feces”). Real SEO is the process of making sites adhere better to Google’s algorithms, for specific query strings, in order to increase relevant site traffic and/or company revenues.
These algorithms aren’t completely unknowable things.
While Google never publishes the details, informed SEO professionals have a good understanding of what will bring a site into compliance with those algorithms (or, in the case of black hat SEO, how to game them). They are, after all, based on math and processes governed by logic.
A trustworthy SEO professional lives and breathes algorithm changes, which can amount to multiple changes a day. They know why the algorithms do what they do as well as anyone not working at Google can.
This is the opposite of voodoo and magic. It is called earned knowledge. It is also very hard-earned knowledge.
When you pay an SEO pro, you aren’t paying for their time. You are paying for their knowledge and results. Prices are set accordingly.
Myth 2: Content Is All You Need
“Content is KING!”
You will find many articles that make this statement. While it is not completely untrue, content is less a king and more a valuable business partner to links, design, and usability.
Mostly, though, content and links are like the conjoined twins of the SEO world. You must have both. One will not work without the other (at least not well, and not for the long term).
Now, Google will tell you many long-tail queries rank without links. That is likely true. It is also likely that these long-tail queries are so unique that there is no competition for them, so links don’t play an active role the way they do in a competitive query.
If you’re trying to rank for The Walking Dead, you had better have links* or don’t expect anyone to find you.
*Good links. Not poor, $99 links bought from a link farm.
So while content is very important, content needs links. Just like links need content.
Bonus Tip: Content is not king. Content is special, but not king. Like peanut butter and jelly, you can have one without the other, but it isn’t as good. Add technical SEO to this duo and you have the triad that is the basis of all good core SEO.
Myth 3: Speed Isn’t That Important
Google said a while back that page speed is only a tie-breaker when all other factors are equal. This is one of those cases where I can say that this is not borne out in real-world testing.
Personally, I had a client increase their traffic by over 200,000 sessions a day when they cut their page load time by 50 percent during a likely Panda update. So while it is true that speed acts as a tie-breaker when all things are equal, it can also dramatically improve rankings when your site has a severe page speed issue.
Now when I say a page speed issue, I don’t mean you cut your 5-second site load time down to 2 seconds. I mean when you dramatically cut your page load, say a 22-second site load time down to 8 seconds, which is what happened in this case.
Know What is Being Measured
It is also important to know what Google is measuring when it evaluates page speed. While Google looks at overall speed, the issue it is most “critical” of is how long the DOM (Document Object Model) takes to load. The DOM items are the visible items on the page, excluding ads, if you have stacked your load right.
This means that if you can cut your DOM load from 22 seconds to 8 seconds as in the example, Google will likely reward you for the dramatic decrease in page load because you are now dramatically faster. This is an additional benefit of improving page speed unrelated to breaking a tie on a specific query result.
A faster site is much easier for Googlebot to crawl. When the site is not slowing the crawl down, more of your site is getting indexed either in number of pages or in depth of page crawl.
Note: The Google PageSpeed Insights tool only measures items in the DOM, so you could have a higher page speed score than another site but still perform more poorly in the rankings because your overall page load is too slow. Page speed is very important and will become even more so as we move into mobile-first. So never discount it.
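To make the DOM-load versus overall-load distinction concrete, here is a minimal sketch in Python. The field names mirror the browser’s Navigation Timing API, but the numbers are invented for illustration:

```python
# Hypothetical navigation-timing values in milliseconds since navigation
# start; field names mirror the browser's Navigation Timing API, values
# are made up for this example.
timing = {
    "navigationStart": 0,
    "domContentLoadedEventEnd": 8000,  # visible DOM items have loaded
    "loadEventEnd": 22000,             # everything, ads included, has loaded
}

dom_load_s = (timing["domContentLoadedEventEnd"] - timing["navigationStart"]) / 1000
full_load_s = (timing["loadEventEnd"] - timing["navigationStart"]) / 1000

# A page can score well on DOM load yet still be slow overall.
print(f"DOM load: {dom_load_s:.1f}s, full page load: {full_load_s:.1f}s")
```

A site like this could earn a decent PageSpeed score on its DOM items while its 22-second overall load still drags down crawling and rankings.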
Myth 4: Links Are Dead
I once had a call from a potential client that asked me if I could remove all his links.
“Remove all your links? May I ask why you would want to do that?” I asked.
“Because I heard links were bad and I need to remove them,” he told me.
“Did you buy the links or get them from some nefarious method?”
“No, they are all legit.”
“Then, sir, whether you use me or not, whatever you do, do not get rid of your links!”
True story.
Links aren’t dead.
Links aren’t close to dead.
If you have the best content in the world and no links, your site won’t get much visibility. Links and content are correlated with rankings. Great content still needs great links (or a lot of mediocre ones).
If you’re buying links for $99 and expecting to get to the top spots in Google, you’re barking up a very dead tree.
Remember, good links require topical relevancy and legitimacy. If it isn’t natural and it comes from an unrelated page or site, it probably won’t help much.
Bonus tip: Reciprocal linking died circa 2007, maybe earlier. Linking to your buddy and them linking to you won’t do you much good.
Myth 5: Keyword Density
There was a time when keyword density had some validity.
Really, if it did not work, why do you think all those people were stuffing white text on white backgrounds for ranking purposes? Then Google got smarter. It did away with keyword stuffing as a viable practice, and even people who had gotten good results from applying density testing to much subtler keyword placements could no longer count on knowing what keyword density would help.
In both cases, this no longer works.
While you can still put a word on the page too many times, there is no set range that makes a page rank. In fact, you can now find results where the keyword does not exist in the visible portion of the page. It might be in the links, in the image tagging, or somewhere else that is not part of the content; it might even be a similar, not exact, match. This is not typical, but it does exist.
Bottom line: placing a keyword X times per page is no longer something worth spending your time on. There are far better fish to fry.
Bonus Tip: It is better to make relevant content that you can link to internally and others can link to externally than to waste time on optimizing keyword counts. That being said, your title tag is still highly relevant. Spend some time adding your query set there. That might give you a boost.
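For illustration only, the keyword-density metric being debunked here is trivial to compute, which is part of why it was so widely abused. A Python sketch (the sample sentence is made up):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that exactly match `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Made-up example text: 3 of 7 words match, so density is about 0.43.
print(round(keyword_density("seo myths about seo density in seo", "seo"), 2))
```

No particular value of this number makes a page rank; it is shown only to make clear what the old practice was measuring.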
Myth 6: You Must Submit Your Site
At least twice a week I get an email from an SEO site submission company telling me I need to pay them to submit my site to the search engines.
Seriously? No, you do not.
Now, are there times when it is good to submit your site URLs? Sure, when you need the search engines to come back to the site to do things like pick up a new piece of content or re-evaluate a page. However, you never need to submit your site.
Google is advanced enough now – and especially with its status as a registrar – that it can find you within minutes, not only once the site is live, but often as soon as the domain is registered.
Now, if you’ve been live for a few weeks, have an inbound link to the site, and Google has not come by (as evidenced by your logs), it can’t hurt to submit it via Fetch and Render in Google Search Console. But never, ever pay someone to submit your site.
Bonus Tip: When in doubt just use Google’s URL submit form or “fetch and render/submit” in Google Search Console.
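Checking your logs for Googlebot visits, as mentioned above, can be sketched like this in Python. The log lines are invented samples, and a thorough check would also verify that the requesting IP reverse-resolves to googlebot.com:

```python
import re

GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)

def googlebot_hits(log_lines):
    """Return the access-log lines whose user agent mentions Googlebot."""
    return [line for line in log_lines if GOOGLEBOT.search(line)]

# Invented sample access-log lines for illustration.
sample = [
    '66.249.66.1 - - [10/Oct/2017:13:55:36] "GET / HTTP/1.1" 200 '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Oct/2017:13:56:01] "GET /blog HTTP/1.1" 200 '
    '"Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]
print(len(googlebot_hits(sample)))
```

If a scan like this turns up no Googlebot hits weeks after launch, that is when a manual fetch-and-render submission is worth the minute it takes.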
Myth 7: You Don’t Need a Sitemap
Sitemaps are not a nice-to-have add-on for sites today; they are a necessity. This gets even more important as we move to the mobile-first algorithms in 2018.
Why? When Google cannot easily crawl a portion of your site, the sitemap allows the crawler to better find these pages.
Bonus Tip: Google is going to have a harder time finding pages due to the reduced size of navigational elements in mobile-first indexing. Sitemaps – both XML and HTML – will be the best way for them to find all the pages on the site you want indexed and ranked.
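As a sketch, a minimal XML sitemap following the sitemaps.org protocol can be generated in a few lines of Python (the URLs are placeholders):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Placeholder URLs for illustration.
print(build_sitemap(["https://example.com/", "https://example.com/blog/"]))
```

Most CMS platforms generate this file for you; the point is only that the format is simple, and there is no excuse for a crawler-facing site not to have one.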
Myth 8: Query Must Have Freshness
QDF, or Query Deserves Freshness, most certainly applies to queries that need fresh results – for instance, news stories or the most recent Powerball numbers.
That does not mean you have to change every element on your homepage every day, or even very often.
While there are sites that absolutely must have fresh content on their main site pages on a daily or weekly basis, most do not.
Evergreen pages are evergreen for a reason. If you write an article on mobile-first indexing and that information has not changed, you do not need to change that page to give it “freshness”.
You do, however, need to have some fresh content on your site. So a good content strategy is how you address having fresh content without trying to meet some unnatural goal for daily content changes.
Bonus Tip: For smaller sites that have small teams or little money and do not need fresh content daily, you can invest in adding pages to the site only when needed while keeping an active blog presence. Adding 2-3 blog posts a week will keep the site relevant without the demands and costs of continually updating pages.
Myth 9: Because Big Brands Do It, It Must Be Good!
Remember your parents saying to you when you were little, “Would you jump off a bridge just because Johnny told you to?!” Same thing goes here.
There is a long history of sites copying bad website decisions from each other simply because they thought the other site knew something they didn’t.
Don’t be a lemming.
What one site does may work for them, or it may not. What if they tell you it is the best thing since sliced bread? Unless you’re looking at their metrics, don’t believe them. And even if it is the best thing for them, the chances of it being right for you are slim.
Why? Because you’re a different company. Your users have different queries and user intent. Just because Facebook and Twitter use infinite scroll doesn’t mean you should.
In fact, because big brands don’t suffer as much from user and Googlebot discontent when they get it wrong, they are more likely to get it wrong.
Don’t copy big brands. Find what works for your users and stick to that.
Bonus Tip: If you want to try something that you see on another site, find a section of yours that isn’t bringing in a lot of traffic and then A/B test the idea on your own pages. Your data will show you what works best for you. Never assume because a big brand does it, you will benefit from following their path.
Myth 10: Algorithm Devaluations = Penalties
Google has two types of site devaluations.
Penguin, Panda, Pirate, Pigeon, Layout etc. are all algorithms. Algorithms can giveth and they can taketh away. This means that not every site sees devaluations from the update of these processes. Many sites see positive results. This is called an “algorithmic change” not a penalty.
What are penalties then?
Penalties are manual actions you can find in Google Search Console. A penalty means Google took a look at your site, decided it was in violation of the Webmaster Guidelines, and devalued the site. You know this happened by checking your messages in Google Search Console. When it happens, they will tell you.
Penalties also require that you “submit a reconsideration request” to regain your site status and remove the penalty.
Algorithmic devaluations have no such consideration. You fix what you think went wrong. Then you wait to see if Google gives you back your rankings when that algorithm or set of algorithms comes back through and re-evaluates the site.
Myth 11: Duplicate Content Is a Penalty
There is NO duplicate content penalty!
There has never been a duplicate content penalty.
Google does have a duplicate content filter, which simply means that if there is more than one item of content that is the same, Google will not rank both for the same query. It will only rank one.
This makes perfect sense. Why would you want the results for a query to bring back the same content multiple times? If your content duplicates another page, it is simply easier to rewrite the piece than to try to guess which copy Google will choose to rank.
All that said, too much duplicate content can affect you with the Panda algorithm, but that is more about site quality rather than manual actions.
Bonus tip: The duplicate content filter applies to titles and meta descriptions as well. Make sure to make all your titles and descriptions unique.
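A quick way to audit for duplicate titles, per the bonus tip, sketched in Python with invented sample pages:

```python
from collections import Counter

def duplicate_titles(pages):
    """Return the set of titles used by more than one page."""
    counts = Counter(pages.values())
    return {title for title, n in counts.items() if n > 1}

# Invented sample mapping of URL path to title tag.
pages = {
    "/": "Acme Widgets - Home",
    "/red-widgets": "Acme Widgets",
    "/blue-widgets": "Acme Widgets",  # same title as /red-widgets
}
print(duplicate_titles(pages))
```

The same check applies to meta descriptions: swap in a path-to-description mapping and any repeated value is a candidate for a rewrite.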
Myth 12: Social Media Helps You Rank
Social media, done well, will get you exposure. That exposure can get you links and citations. Those links and citations can get you better rankings.
That doesn’t mean that social media postings are inherently helpful to getting you rankings.
Social media doesn’t give you links, but it encourages others to link to you. It also means that the social media post may escape its ecosystem and provide you a true site link. But don’t hold your breath.
Social media is about visibility.
Getting those people to share your content and link to or mention your site in a way that Google counts it as a “link”? That is SEO.
Myth 13: Buying Google Ads Helps with Organic Ranking
No. Just no. Investing in PPC won’t boost your organic search rankings.
These two divisions are in two separate buildings and not allowed to engage with each other about these things.
Personally, I have worked with sites that have had massive budgets in Google AdWords. Their site still lived and died in organic by the organic algorithms. They received no bonus placements from buying Ads.
Bonus Tip: What buying ads can do is promote brand validation. In user experiments, it has been shown that when a user sees an ad and the site in the organic rankings together, they believe it to have more authority. This can increase click-through rates.
Myth 14: Google Uses AI in All its Algorithms
No. Google doesn’t use AI in the live algorithms except for RankBrain.
Now, Google does use AI to train the algorithms and in ways internally we are not privy to. However, Google doesn’t use AI in terms of the live algorithms.
Why?
Very simply put, because if it breaks they would not know how to fix it. AI operates on a self-learning model.
If it were to break something on search and that broken piece hurt Google’s ability to make money there would be no easy way to fix it. More than 95 percent of Google’s revenue still comes from ads, so it would be extremely dangerous to allow AI to take over without oversight.
Myth 15: RankBrain
So much has been written about RankBrain that is simply incorrect it would be difficult to state it as one myth. So, in general, let’s just talk about what RankBrain is and isn’t.
RankBrain is a top ranking factor that you don’t optimize to meet.
What does that mean? Basically, when Google went from strings to things (i.e., entity search), it needed better ways to determine what a query meant to the user and how the words in the query set related to each other. By doing this analysis, Google could better match the user’s intent.
To this end, they developed a system of processes to determine relationships between entities. For those queries they understand, they bring back a standard SERP. Hopefully, one that best matches your intent as a user.
However, 15 percent of the queries Google sees every day are new. So Google needed a way to deal with entities whose relationship was unclear or unknown when trying to match user intent.
Enter RankBrain!
RankBrain is a machine-learning algorithm that tries to understand what you mean when Google is unsure. It uses entity match and known relationships to infer meaning/intent from those queries it doesn’t understand.
For instance, back when the drought in California was severe, if you looked up “water rights Las Vegas NV” (we share water), you would get back all sorts of information about water rights and the history of water rights in the Las Vegas area. However, if you put in a much lesser known area of Nevada, like Mesquite, Google wasn’t sure what you wanted to know.
Why? Because Google understands Las Vegas as a city (entity) in a geographic area (Clark County) and can associate that with water rights, a known topic of interest due to search data. It cannot, however, do the same for Mesquite.
Why? Because no one likely searched for water rights in Mesquite before or very often. The query intent was unknown.
To Google, Mesquite is a city in Nevada, but also a tree/charcoal/flavor/BBQ sauce, and it brought back all of these results, ignoring the qualifier “water rights” for all but one result. This is RankBrain.
Google is giving you a “kitchen sink.” Over time, if enough people search for that information or the training Google feeds it tells it differently, it will know that you specifically wanted x, not y.
RankBrain is about using AI to determine intent between entities with unknown or loosely formed relationships. So it is a ranking factor, but not really a ranking factor.
Bonus Tip: While there are a few niche cases where it might make sense to optimize for RankBrain, it really doesn’t for most. The query is a living dynamic result that is Google’s best guess at user intent. You would do far better to simply optimize the site properly than trying to gain from optimizing specifically for RankBrain.
Read the full story at Search Engine Journal