Virtually everyone who has run a website for more than a year has experienced a sudden, dramatic drop in traffic, usually accompanied by a drop in search ranking. When Google drives 80% or more of your traffic, maintaining your position is critical, and when you drop suddenly and precipitously, you have to wonder what went wrong. Did your site go down? Did you get hit by a negative SEO bomb? Did a change in your metadata trip spam filters? There are a ton of possible reasons, so here are the ten most common, and what you can do to fix each problem.
A link penalty is perhaps the most likely reason for anyone to be reading this article right now, on the heels of Penguin 4 and its permanent inclusion in the core search algorithm.
There are essentially two forms of link penalty. One is the manual action, and one is algorithmic. If you think your penalty has something to do with your backlinks, you can check which one it is very easily. Just log into your Google Search Console (Webmaster Tools) account and check the manual actions menu on the side. If you have something listed there, it’s a manual action, rather than an algorithmic penalty. If there’s no entry, skip to the next section.
So how do you fix a manual action for links? You need a tool of some sort to pull your backlink list. Google Search Console, Majestic, Moz, CognitiveSEO; there are dozens of options out there, free and paid, for you to pick through.
Pull your links and audit them to find any that are irrelevant or spammy. Once you have that list, start approaching webmasters and asking for links to be removed. Sometimes they will be happy to do so; other times they won't answer at all. Add any links that aren't removed to a disavow file and submit it through Google's disavow links tool.
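For the disavow step, the file Google's disavow tool accepts is just a plain text list: one URL or `domain:` entry per line, with `#` for comments. A minimal sketch, using hypothetical domains:

```
# Spammy directory; webmaster never responded to removal request
domain:spammy-directory-example.com

# Individual bad URLs can be listed one per line
http://link-farm-example.net/page1.html
http://link-farm-example.net/page2.html
```

Using the `domain:` form disavows every link from that site at once, which is usually what you want for obvious link farms.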
Penguin is an algorithm Google uses to penalize sites with poor quality backlinks, which can be acquired organically rather than through unnatural link building. It's harder to diagnose, because it's not really a penalty; it's an adjustment of where your ranking sits, based on Google's changing criteria or on new information they discover. "Recovering" from an algorithmic penalty isn't really recovering at all; it's taking steps to improve your SEO and seeing the benefits of doing so.
Penguin 4 is the most recent iteration of the algorithm, and with it, Google confirmed that it is being merged with the core Google algorithm, like Panda before it. This means there will be no future Penguin updates, as such; merely algorithmic updates in general. For more on those, skip to number 10.
To diagnose a Penguin penalty, see if your drop in rankings corresponds to a known Penguin update date. If so, all you have to do to fix it is the same link audit and removal/disavow process outlined in the previous step. They are, after all, more or less the same penalty.
On-page issues are a very broad category of errors that can crop up on a website over time. A sudden explosion of those errors can indicate something broken on your site, while a slow, gradual increase can simply reach critical mass and knock your site down a tier or three without any major change involved. In either case, you can simply go back to the Google Search Console and look at the errors found on your site.
Unfortunately, because of the broad nature of “on-page errors”, it’s difficult to diagnose the specific cause of this loss of ranking. What sort of errors count?
Essentially anything that could hamper the loading of the page or the user experience can count as an on-page error, so fix as much as you can.
If you have poor quality hosting, or if your web host doesn’t have the right kind of redundancies or resources, your server can stop responding to queries. If your hosting is extremely poor, your site can even go down when you exceed a certain level of bandwidth.
Google understands that, sometimes, outages happen. There’s a certain level of downtime that is almost impossible to avoid. Only massive sites like Amazon or Google themselves can hope to avoid it, and even they fail.
If Google attempts to crawl your site and finds it down, they’ll make a note of it and leave. Later, possibly hours or possibly days later, they will come back and attempt to crawl your site again. If your site is down again, they assume that you have been down the whole time, and will temporarily remove you from the search results until you can get your act together and get your site back up.
Sometimes outages are invisible if you’re not checking your site constantly, so it might be worth investing in an uptime monitoring tool or plugin of some sort. With it, you’ll receive a notification whenever your site goes down, and you can act to bring it back up immediately.
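If you'd rather not rely on a plugin, a few lines of scripting run on a schedule can do the same job. Here's a minimal sketch in Python; the URL is a placeholder, and the "alert" is just a print statement you'd swap for email or whatever notification channel you use:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def site_is_up(status_code):
    """Treat 2xx and 3xx responses as 'up'; 4xx/5xx as 'down'."""
    return 200 <= status_code < 400

def check_site(url, timeout=10):
    """Return True if the site responds with a healthy status code."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return site_is_up(resp.status)
    except HTTPError as e:
        return site_is_up(e.code)   # server answered, but with an error status
    except URLError:
        return False                # DNS failure, refused connection, timeout

if __name__ == "__main__":
    # Hypothetical URL -- point this at your own site and run it from cron
    if not check_site("https://www.example.com/"):
        print("ALERT: site appears to be down")
```

A HEAD request keeps the check lightweight, since you only care about the status code, not the page body.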
One sure-fire way to be removed from the rankings and have your traffic drop to nothing overnight is to have your site compromised. Google might throw a “malicious site detected” error, show a “this site may be compromised” warning, or flag some other similar issue. All of them are triggered by the hallmarks of a hacked site.
Often, when your site is hacked, you know about it. Alarms and monitors on your web hosting trigger and warn you. Excessive login attempts show up as email warnings. Your entire page is removed and replaced with a spam site or propaganda site.
Other times, the hacker is more subtle. They leave your site as-is, but add pages to it that they can use for their own purposes. These can be spotted by reviewing the changelogs or files in your hosting account.
The process of recovering from a hack is lengthy and involved, so I’m just going to link to a post about it here.
Sometimes your ranking as a whole isn’t hurt, but just your ranking for a single keyword. This can be an issue with reporting, since you’re using a single keyword to monitor the performance of your site as a whole, or it can be an issue with your ranking being driven down by an external force.
What might cause a single keyword drop? There are two primary causes. One is changing or removing content that ranked for that keyword. If you no longer have the keyword on your site, of course you’re no longer going to be relevant. This is why it’s often better to fix and improve old content than it is to simply delete it. The other possible cause is competition; a new competitor or new content from an established competitor can jump in and outrank you, driving your ranking down.
The solution to both issues is simple: create new, fresh, excellent content targeting that keyword to win your ranking back.
Robots.txt is a file every site has – even if it’s blank – that directs search engine crawlers and spiders around your site. If it’s empty or missing, the bots simply go about their business with no restraints and move on. However, you can add different types of directives aimed at all bots as a whole, certain classes of bots, or even specific bots, and you can limit their access to particular subfolders or pages on your site. You can read everything you could possibly want to know about the file here.
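As a sketch of what those directives look like (the paths are hypothetical), a typical robots.txt might read:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Rules for one specific bot
User-agent: Googlebot-Image
Disallow: /private-photos/

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group applies to the bots it names, and a `Disallow` path blocks everything under that prefix.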
The most common issue with robots.txt files is accidentally disallowing large chunks of your site, or even your site as a whole. The easiest way to check for a robots.txt issue is to simply remove the file – with a backup if it’s complex – and see if the problem improves. If it does, you’ve found your culprit; you can fix your robots.txt file, or you can just leave it blank. To be honest, most of the time you don’t need much beyond disallowing certain system folders. It’s when you try to use the file to hide spammy content that you run into issues.
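Rather than removing the file outright, you can also test it directly. Python's standard library ships a robots.txt parser; this sketch checks whether hypothetical key pages would be crawlable under a given set of rules:

```python
from urllib.robotparser import RobotFileParser

def crawlable(robots_lines, url, agent="Googlebot"):
    """Return True if `agent` may fetch `url` under these robots.txt rules."""
    rp = RobotFileParser()
    rp.parse(robots_lines)  # accepts the file's contents as a list of lines
    return rp.can_fetch(agent, url)

# Hypothetical rules: the stray "Disallow: /" blocks the entire site
rules = [
    "User-agent: *",
    "Disallow: /",
]

if __name__ == "__main__":
    for page in ("https://www.example.com/", "https://www.example.com/blog/post"):
        print(page, "->", "crawlable" if crawlable(rules, page) else "BLOCKED")
```

Run the check against your live robots.txt for your homepage and a few important landing pages; if any come back blocked, you've found the problem.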
Some black hat techniques fall under some of the other categories here. For instance, black hat link building, such as using private blog networks or spammy purchased backlinks for quick pump-and-dump value, will often trigger Penguin and link manual actions. Spammy keywords and overly optimized content will trigger content-based penalties. Duplicate content triggers Panda.
There are too many black hat techniques to list or to solve. Frankly, if you’re aware enough of SEO to know about Google penalties and to search for answers on a blog like this one, you’re also aware enough to know what you’re doing and whether or not it’s black, gray, or white hat. Identify your black hat techniques, figure out what rules they’re violating, put a plan in place to fix them, and remove the damage as best you can. If you’re lucky, you’ll get your value back the next time Google indexes your site.
Google knows that ads are necessary to keep the internet alive, but that doesn’t mean they have to like it. They put pretty strict rules in place about how many ads you can have floating around on your website; the AdSense policies spell out how many ad units you can have on a page, for example. They count all ads more or less equally, though self-created ads that don’t run in traditional ad spots and don’t advertise outside content can sometimes slip under the radar.
Google doesn’t like intrusive ads, like pop-ups and pop-overs. They put a minor penalty in place recently for those, but the major issue you have to watch is putting too many ads above the fold. Google long ago penalized this and added the penalty to the algorithm in their Top Heavy Update.
The solution to an ad-related penalty is always going to be easy to implement, but hard to stomach: remove some ads. That’s it! But since ads are your source of income – your livelihood – it’s tricky to figure out which to remove. Just trust that the boost in ranking and traffic from removing them will outweigh the lost revenue of an ad unit or two.
At the end of the day, Google can “penalize” a site at any time, simply because they update their algorithm. They have hundreds of individual factors, ranging from trivially minute to extremely important, which affect your ranking in corresponding ways.
Up above I linked to the Moz compendium of Google updates. Any time you lose some ranking, and you can’t think of anything that caused it, go ahead and check that list. While you’re at it, hit up the usual suspects, Moz.com and the other news aggregators. They’ll generally cover any algorithmic change or adjustment, with analysis of how major the change was and what it targeted, according to their reverse engineering and any statements or inferences made by Google.
At this point, recovering ranking and growing ranking become virtually inseparable. Your goal is to please Google, and to do so, you need to contort your site to fit their definition of a good site. Some actions will recover penalties, while others will simply improve metrics; all of them are good.