Google's algorithms are very different today than they were five, six, or seven years ago. The major paradigm shift came in 2011, and while there hasn't been an earth-shattering update since, the bar has been raised time and time again. What was satisfactory then barely clears the bar now, and what barely cleared the bar a few years ago wouldn't pass muster today.
For those of you who own older sites, particularly any site that had content on it before 2011, you’re left with a question. Is that old content holding you back? Should you delete it?
I’ve put together something of a decision tree for you, and broken it into two categories.
Before 2011, the internet was a very different place. Google relied heavily on keyword usage in text, and writing focused on using those keywords. If you were a pet supply business in Gary, Indiana, you would want to use "pet supplies in Gary Indiana" four or five times in your article. It didn't really matter what else the content was about; that keyword ensured that anyone searching for pet supplies in your area would see your website. From there they could visit your retail outlet and make their purchases.
This, naturally, led to a ton of abuse. Companies that weren't in Gary, or anywhere near it, could use the keyword to sell their online pet supplies. It was oppressive. Sometimes you would have to dig several pages deep into the search results to find something actually worthwhile, particularly for informational queries.
What happened in 2011 was the Google Panda update. Panda changed the algorithm to a much more natural way of identifying the topic of content, and simultaneously devalued abuse of exact-match keywords. Someone trying to spam keywords for locations all over the world would be outdone in each geographic area by businesses proven to be in those areas. Most importantly, though, quality of content was put far ahead of other factors.
Now, if you haven’t paid much attention to SEO over the last decade, it’s entirely possible that there’s a lot of old content on your site that was created by an old marketer who used techniques viable for the time, but which no longer work. When Panda happened, you would have seen what people call a Panda Penalty. Your site would be devalued due to that old content, and Google would have encouraged you to fix the problem. Of course, if you didn’t pay attention to SEO news and you didn’t have a marketer on your team, you could have missed all of this and just wondered why your site was less effective than it had been. Or you could have just blamed it on the recession and called it a day. Who knows!
I’m going to level with you here; virtually nothing published before 2011 is going to live up to modern content standards. Most of it won’t have many links and will likely have zero traffic all these years later. Your question is not whether it’s good enough to keep around, but rather whether it’s bad enough to remove.
You have two options for this content; you can delete it or you can leave it as it is.
Why would you want to delete the content?
Old content is really only useful for three things. First, it can be the destination for links. Second, it can be fluff of minor value but no detriment, making your site look larger. Third, some rare content can be potentially valuable if buffed up and recreated for a modern audience.
Any links old content has are probably not bringing you much direct value. Sure, if they’re old links from big sites like Moz or Forbes, they’d be valuable links. If they’re links from other small sites, though, there’s not any real link juice coming to you from those links. The actual value of old links comes from the broadening of your entire backlink profile. Those links are making your link profile look more natural, which is a benefit in terms of trust.
The same goes for content. Even if the old content isn't very good, as long as it's not actively breaking modern content rules, it's still fine to have. It makes your site look larger and gives it more potential keywords relevant to the site as a whole. That specific content is unlikely to actually rank or draw in users, but it shows Google a bit more of what your site is about. Think of it like a scared cat fluffing up its fur to look larger; with less fur, the cat looks smaller and the bluff is less effective.
I don't expect you to find any content worth buffing up in pre-2011 content, but it's possible that if it was ebook-quality flagship material back then, it could have some relevance now. You would know if you had that, though, so there's no reason for me to belabor the point.
If the content you find is spammy in nature, has no links, and doesn’t meet modern quality minimums, you can delete it without issue. That content is most likely to simply be holding you back. This is because of the Panda Penalty.
The Panda Penalty was not actually a penalty so much as a new organization of the web. If I line up five people in alphabetical order by first name, and then later decide to organize them by last name instead, the person in front might not be in front any more. I didn't penalize them; they lost their position due to a change in how I organized the line.
Google's rankings are similar, except with the caveat that they organize the line based on 200+ factors, many of which you can change. Also, the line includes millions of people. It's not a perfect metaphor; just look past it.
The point is, there’s nothing you can log into and see “Warning: Your Site Is Under The Effects Of A Panda Penalty.”
The only way you would be able to tell by this point is if you have had one consistent installation of Google Analytics since 2010 or so. If you have that much historical data, you can "go back in time" and look at your traffic and rankings before and after the release dates of the Panda algorithm updates.
If you didn’t have Google Analytics installed back then, though, you’re probably out of luck. You just have to experiment with removing bad content and see if it helps.
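If you do have that historical data, the before/after comparison described above is simple to automate. This is a minimal sketch under a few assumptions: the visit numbers are invented, the `traffic_shift` helper is my own name rather than anything from an analytics library, and the late-February 2011 Panda 1.0 rollout date is used purely as an illustrative anchor.

```python
from datetime import date, timedelta

def traffic_shift(daily_visits, update_date, window_days=30):
    """Compare average daily visits in equal windows before and after an
    algorithm update date. Returns the relative change (-0.4 = a 40% drop),
    or None when one side of the window has no data."""
    before = [v for d, v in daily_visits.items()
              if update_date - timedelta(days=window_days) <= d < update_date]
    after = [v for d, v in daily_visits.items()
             if update_date <= d < update_date + timedelta(days=window_days)]
    if not before or not after:
        return None  # not enough historical data to compare
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return (avg_after - avg_before) / avg_before

# Hypothetical analytics export: date -> organic visits, with a sharp
# drop starting the day Panda 1.0 rolled out (late February 2011)
visits = {date(2011, 2, 1) + timedelta(days=i): (1000 if i < 23 else 400)
          for i in range(46)}
change = traffic_shift(visits, date(2011, 2, 24), window_days=23)
```

A sustained negative `change` lining up with a known update date is the signal you're looking for; a dip that recovers within days is more likely ordinary noise.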
When it comes to content published after the great Panda upheaval, there’s a good chance that it at least meets the minimum standards of content at the time, which means it’s probably serviceable and not worth deleting today.
Once again, the factors you want to look at are primarily whether or not the old content has any valuable links, and whether or not it's spammy in nature. If you're the kind of person who didn't care about the details of SEO and just wrote whatever you wanted, then and now, chances are it's not super valuable content. However, it's also unlikely to be actual spam, because it's hard to accidentally write spam content.
So the types of content you might want to delete are keyword-stuffed posts written purely for search engines, thin pages with no links pointing at them, and anything that would read as spam to a modern visitor.
On the other hand, any content that still has valuable links, some valuable data or insights, or that is simply lengthy, original content is fine to keep. That kind of content is not holding you back. The content that hurts you is the kind of content that is, today, considered to be spam or very low quality.
Auditing your content is a tough business. The simplest way to see whether you have something to worry about from Panda or the subsequent updates is to use the Panguin tool. Panguin is essentially a hook for Google Analytics that displays your traffic graphs and compares them to the dates of known Google updates. You can then see whether you experienced dips in traffic when Google updated its requirements, which is a sign that you're losing value because of your content. Of course, as I mentioned above with the manual method, this only works if you actually have the historical data. If you didn't have Google Analytics installed back in the day, you're out of luck here.
If you want an individual record of each page on your site, another option is to use a tool like Moz. First, you need to crawl your entire site to get the complete list of blog URLs. Something like Screaming Frog will do wonderfully, for this and other purposes.
Once you have a list of all of your blog articles and site pages that might be worth auditing, you can send it through the Moz API. You will need API access to feed numerous articles through Open Site Explorer at once, and API access can be expensive, so make sure you're willing to do this before you get too far into the process. Then again, a site audit begins with a crawl regardless, so the effort is never wasted.
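Most metrics APIs, Moz's included, cap how many URLs you can submit per request, so the crawled list needs to be split into chunks before you feed it through. This sketch assumes a 50-URL limit purely for illustration (check your actual plan's quota), and the example URLs are made up:

```python
def batch(urls, size=50):
    """Split a crawled URL list into API-sized chunks.
    The 50-per-request limit is an assumption, not official Moz guidance."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# Hypothetical Screaming Frog export, one URL per row
crawled = [f"https://example.com/blog/post-{n}" for n in range(120)]
chunks = batch(crawled)
# Each chunk would then become one request to the metrics endpoint
```

Batching up front also makes it easy to estimate cost before you commit: number of chunks times the per-request price tells you what the audit will run you.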
Moz analyzes a number of metrics on their site, but the two you’re looking for are Page Authority and Spam Score.
Page Authority is a page-level insight about the quality of your page. A very low or nonexistent PA rating means that your page is probably not very good. Spam Score, meanwhile, is an analysis of 17 specific factors that are signs of spam. Good, high quality sites can have a couple of spam indicators and still be fine, so don’t consider anything less than a 5 to be truly detrimental.
Any page with a high Spam Score and a low PA is likely not worth keeping around. You can also see the number of incoming links in Open Site Explorer, which can help you determine whether there are any links you want to preserve.
If you want a more detailed guide for performing a content audit, we published one here not too long ago. Go ahead and check it out.