“Everything changes and nothing stands still.”
In this quote, Heraclitus of Ephesus was talking about life and the fundamental order of the cosmos.
But he might as well have been describing Google’s constant algorithm updates, even if he was a couple of millennia early.
See, Google makes roughly 500–600 changes to its algorithm each year.
Some are major, some are minor.
Some are confirmed, some are unconfirmed.
But Google is cranking them out left and right.
Even at the low end of 500 changes a year, that works out to roughly 1.37 changes every single day.
Google is basically tweaking its algorithm all the time.
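The arithmetic behind that figure is simple enough to sketch out. A quick back-of-the-envelope check (the 500–600 range is the commonly cited estimate; Google doesn’t publish exact counts):

```python
# Rough arithmetic behind the "changes per day" figure.
# The 500-600 range is the commonly cited estimate, not an official count.
low_estimate = 500    # changes per year, low end
high_estimate = 600   # changes per year, high end
days_per_year = 365

print(f"Low end:  {low_estimate / days_per_year:.2f} changes per day")   # ~1.37
print(f"High end: {high_estimate / days_per_year:.2f} changes per day")  # ~1.64
```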
Here’s a chart that highlights some of the more serious changes of the past six years:
Of course, only a fraction of algorithm changes create any real stir.
Most of you don’t even know they happen.
But this constant flux is enough to put a lot of marketers on edge.
Even if you reach a top ranking position for a great keyword, there’s no guarantee you’ll stay there.
A single update could send you plummeting into no man’s land.
I understand this can be a little unnerving.
But is this fear really justified?
Should you be worried about the next Google algorithm change?
In this post, I offer my input on this topic and explain how you can protect your site from incurring Google’s wrath.
The frequency of major updates
Okay, so we’ve established that Google is constantly making adjustments.
It’s how it continues to dominate the search engine market:
If it remained stagnant, a competitor would inevitably overtake it.
But what we really need to know is just how many updates are major.
By major I mean resulting in a serious shakeup where hundreds of thousands, or even millions, of sites are affected.
According to Link Assistant, there have been nine major updates since 2011.
They are as follows:
- Panda – 2/24/11
- Penguin – 4/24/12
- Pirate – 8/2012
- Hummingbird – 8/22/13
- Pigeon – 7/24/14
- Mobile-Friendly Update – 4/21/15
- RankBrain – 10/26/15
- Possum – 9/1/16
- Fred – 3/8/17
That averages out to 1.5 major updates per year over the last six years.
This isn’t to say minor updates can’t or won’t impact you, but there are only about 1.5 a year that are cause for any real concern.
Average traffic that comes from Google
Kissmetrics performed a study on over 18,000 small to medium e-commerce sites.
They found that “30.5 percent of all traffic was coming from organic searches on Google, Bing, Yahoo and other search engines.”
Considering that Google has 77.43% of the search market, this means that roughly 23.6% of small to medium e-commerce sites’ traffic comes from Google.
In other words, just under a quarter of all traffic comes from Google.
I would say that’s significant.
Of course, this isn’t true for every website.
But this is what you can expect on average.
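That 23.6% estimate is just the two published percentages multiplied together. A minimal sketch of the calculation:

```python
# Estimate the share of traffic coming from Google specifically,
# using the two figures cited above (Kissmetrics study + market-share stat).
organic_share = 0.305         # share of all traffic from organic search (all engines)
google_market_share = 0.7743  # Google's share of the search market

google_traffic_share = organic_share * google_market_share
print(f"~{google_traffic_share:.1%} of traffic comes from Google")  # ~23.6%
```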
Should you worry?
Now we know the frequency of major updates and how much traffic Google sends to the average website.
But is this cause for alarm?
If Google decides to unleash a major update and you get penalized, would it put you in a full-on crisis situation?
Well, it depends.
The way I look at it, there are three factors you need to examine to determine your risk level.
Factor #1 – Your Google traffic
If your site is an outlier, where you get only a small percentage of your traffic from Google (say less than 10%), even the most brutal of algorithm changes shouldn’t have a major impact.
But if Google is your bread and butter, and you count on it to consistently send highly-qualified leads to your site, you could definitely be in trouble if you’re adversely hit with a big update.
This could send your traffic volume and sales plummeting.
It could look something like this:
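To put rough numbers on that downside, here’s a back-of-the-envelope sketch. Every figure in it is a made-up assumption for illustration, not real data:

```python
# Hypothetical estimate of traffic lost to an algorithm penalty.
# All inputs below are illustrative assumptions, not real data.
monthly_visits = 50_000  # total monthly visits (assumed)
google_share = 0.60      # fraction of traffic that comes from Google (assumed)
penalty_drop = 0.50      # fraction of Google traffic lost to the update (assumed)

visits_lost = monthly_visits * google_share * penalty_drop
remaining = monthly_visits - visits_lost
print(f"Visits lost per month: {visits_lost:,.0f}")   # 15,000
print(f"Remaining traffic:     {remaining:,.0f}")     # 35,000
```

The takeaway: the more of your traffic Google controls, the bigger that first multiplier, and the harder a penalty hits.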
Factor #2 – User experience
At the end of the day, Google is interested in one thing: providing its users with the best experience possible.
If your website delivers a great user experience, you should be in pretty good shape.
No matter what Google throws at you, there should be a level of stability, and it’s unlikely that your rankings will see a dramatic drop.
Now, I realize that delivering “a great user experience” is a wide umbrella open to plenty of interpretation.
But here are a few key elements that heavily contribute to it.
As we all know, Matt Cutts loves great content.
This should be your top priority, above everything else.
You also don’t want to have any spammy or manipulative links or barrage visitors with obnoxious ads.
Next, there’s functionality, which includes:
- fast-loading pages
- intuitive navigation
- clean interface
- no disruptive popups
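If you already track page timings, even a crude script can surface the functionality problems above. Here’s a minimal sketch that flags slow pages; the page list, timings, and 3-second threshold are all made-up sample values:

```python
# Flag slow-loading pages against a target threshold.
# The page list, timings, and threshold below are made-up sample data.
SLOW_THRESHOLD_SECONDS = 3.0  # rule-of-thumb target; adjust to taste

page_load_times = {
    "/": 1.2,
    "/blog": 2.8,
    "/products": 4.5,
    "/contact": 0.9,
}

slow_pages = [page for page, secs in page_load_times.items()
              if secs > SLOW_THRESHOLD_SECONDS]
print("Pages needing attention:", slow_pages)  # ['/products']
```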
Factor #3 – “Schemey” SEO practices
It seems there’s always some “latest and greatest” SEO strategy popping up.
The promise is if you do X, you’ll be able to capitalize on some loophole and see a huge spike in your rankings.
While this approach may pay off initially, it often ends up hurting you in the long run.
I’m a firm believer in “big-picture SEO,” where you focus on quality, user experience, and fundamental SEO best practices rather than trying to game the system.
If you’re doing anything bordering on black-hat or even grey-hat SEO, it’s probably going to come back to haunt you.
But if you keep your nose clean and maintain your integrity, you should be good to go.
How to protect yourself
Here’s the deal.
The next big Google algorithm change is imminent.
It’s going to happen.
It’s not a matter of if but when.
So you need to make sure your site is protected when the next major update inevitably rolls out.
But how do you do this?
Well, you can never completely predict what Google’s going to do next (it’s about as secretive as the CIA), but there are several measures you can take to reduce your risk of a penalty.
Here’s what I suggest.
Diversify your traffic sources
First, don’t put all your eggs in one basket.
Digital marketing has evolved to a point where you now have a buffet of options to choose from.
Organic search traffic is huge, but there are plenty of other ways to generate high-quality traffic that’s primed to convert.
Here are just a few ideas:
- social media
- influencer marketing
- email newsletters
Monitor your link profile
The links pointing to your site can make or break you.
Recent research suggests that “high-quality backlinks account for 30% of your overall page score in Google.”
I can’t stress enough how important it is to keep tabs on which sites are linking to you.
Low quality, irrelevant or spammy sites can be the kiss of death.
One tool you can use to see who’s linking to you is SEMrush.
Just enter your site’s URL in the search bar:
Then click “Start now:”
Scroll down to the “Backlinks” section:
Click on “View full report” for more details:
You’ll then get a list that looks like this:
From there, you’ll want to browse through the list and check for anything questionable.
You can also use the Google Search Console for checking links, which you can learn about in this post.
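Once you export a backlink list from SEMrush or Search Console, even a simple script can help with the first pass of triage. This is a rough sketch under assumptions: the export rows and the “spammy anchor” blocklist below are illustrative, and real spam detection needs far more signals than anchor text alone:

```python
# First-pass triage of a backlink export for questionable domains.
# The export rows and the anchor-text blocklist are illustrative assumptions;
# adapt them to whatever fields your SEO tool actually exports.
backlinks = [
    {"domain": "industryblog.example.com", "anchor": "seo guide"},
    {"domain": "cheap-pills-now.example", "anchor": "buy viagra online"},
    {"domain": "news.example.org", "anchor": "marketing study"},
]

SPAMMY_ANCHOR_WORDS = {"viagra", "casino", "payday"}  # assumed blocklist

def looks_spammy(link):
    """Crude heuristic: flag links whose anchor text contains blocklisted words."""
    words = set(link["anchor"].lower().split())
    return bool(words & SPAMMY_ANCHOR_WORDS)

flagged = [link["domain"] for link in backlinks if looks_spammy(link)]
print("Review these domains:", flagged)  # ['cheap-pills-now.example']
```

Anything the script flags still needs a human look before you consider disavowing it.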
Create “future proof” content
As I mentioned before, epic content is what Google looks for when determining rankings.
If you can provide it, you’ll have a buffer against the impact of the next big algorithm change.
I realize this is easier said than done, but check out this post for 14 examples of truly epic content.
The basic recipe I use consists of the following:
- long-form content (at least 1,500 words)
- plenty of visuals
- plenty of data
- references to authoritative resources
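You can turn that recipe into a quick self-check before publishing. A minimal sketch, where the thresholds mirror the list above and the draft’s stats are sample numbers:

```python
# Quick self-check of a draft against the content "recipe" above.
# Thresholds mirror the list; the draft stats passed in are sample numbers.
MIN_WORDS = 1500

def content_checklist(word_count, visuals, data_points, references):
    """Return which recipe items the draft satisfies."""
    return {
        "long-form (>= 1500 words)": word_count >= MIN_WORDS,
        "has visuals": visuals > 0,
        "has data": data_points > 0,
        "cites authoritative sources": references > 0,
    }

report = content_checklist(word_count=2100, visuals=6, data_points=4, references=5)
for item, ok in report.items():
    print(("PASS" if ok else "MISS"), item)
```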
As long as your content hits its mark, there’s no need to live in perpetual anxiety about the next algorithm update.
For more on how to protect your site from Google penalties, I suggest reading this post from NeilPatel.com.
With so many brands heavily depending on Google for their traffic and ultimately sales, I see why so many people worry about algorithm changes.
The idea of your sales tanking because of an update is scary.
If you’re implementing the wrong approach and tactics, you’re putting yourself at risk, and there’s a strong likelihood your rankings will suffer at some point.
But if you understand Google’s logic and follow SEO best practices, there’s no reason to worry.
Sure, algorithm changes will come.
But you’ll be ready for them.
This way, you can keep things flowing and maintain a steady volume of traffic with minimal disruption.
What’s your experience with algorithm updates in the past? What do you do about algo updates now?