A website that isn't ranking at all is a problem, but it's not the problem I'm discussing today. What happens when your blog posts are ranking fine, but your homepage never seems to show up?
It's a surprisingly common problem with a few different possible causes. Generally it's not something you've done wrong, and it's not a penalty, though both of those can be factors in certain circumstances. I've put together the most common causes of a partial site ranking, and hopefully one of them fits the bill for you.
Probably the number one reason why a homepage won’t rank when other pieces of content do is a lack of strong branding. Homepages tend to rank on the power of their brand name and the quality of the content on them. If your home page doesn’t have much content – many non-blog sites fall into this trap – you have to rely entirely on your branding.
Take a look at a page like Technorati. They only have a couple hundred words of content and some images on their home page. They rank because they're an old site with a strong brand name; if they didn't have thousands of backlinks and had a generic name instead, that home page wouldn't have a prayer of ranking.
So, this is one example of a page that can have a hard time ranking. If your site doesn't have a critical mass of backlinks pushing every page on it to the top, and if the homepage doesn't have much content of its own, it needs extremely strong branding to show up at all. Even then, it's likely to rank for keywords relating to the branding itself, not to the content on the page. Look again at the Technorati page; what keyword would it rank for? There are a few marketing buzzwords, but nothing that targets a specific search query. All you get are unique keywords like the brand name and the name of the brand that bought them.
I mentioned this quite a bit above, but it’s really a problem on its own. Most blogs have a home page that is itself just the most recent couple of blog posts. Other sites have homepages that have navigation, content categories, and product links, but no really strong content of their own. These homepages have a harder time ranking.
Google really is focused almost entirely on content these days. How many people run Google searches and really want to find a home page? Unless you’re just looking for the name of a brand, chances are you either have a piece of information you want to find or a task you want to complete. Both are likely to be more relevant as sub-pages than as a home page.
If nothing else, I recommend that your home page have a solid description of what your brand is and what you do. Think of it like a smaller version of a more robust “about us” page. In fact, if you’ve experienced this non-ranking homepage issue, chances are your about page is ranking better, for exactly this reason.
If you can swing it, make your homepage dynamically load a description or an excerpt of the most recent blog post you’ve published. Blog homepages tend to rank for exactly this reason; they always have some minimum level of content from the blog on display.
If your homepage has a lot of content on it and still isn't ranking, it's possible that the format of the content is causing the issue. Google doesn't index the content of videos, because technology hasn't yet reached the point where it can casually interpret the content and context of a video at index time. It does this for YouTube on a very basic level, and the result is the flawed automatic captioning system.
Google also struggles with primarily image-based content. If your homepage is built out of images with text baked into them, in an attempt at unique typography, stop that. It's an old web 2.0 strategy that, while it looks neat, is completely impenetrable to Google. Plus, these days, you can use HTML5 and CSS3 in combination to create pretty much any design you want in a way that's readable by Google, at least as far as text is concerned.
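As a minimal sketch of that idea (the class name, wording, and font choice here are all placeholders, not anything from a real site), the kind of stylized heading that used to require an image can be rendered as real, crawlable text:

```html
<!-- Real text, styled with CSS, instead of text baked into an image.
     Everything here is illustrative: swap in your own copy and styles. -->
<h1 class="hero-title">Hand-Crafted Widgets Since 1998</h1>

<style>
  .hero-title {
    font-family: Georgia, serif;          /* decorative type without an image */
    font-size: 4rem;
    background: linear-gradient(90deg, #e66465, #9198e5);
    -webkit-background-clip: text;        /* gradient-filled lettering */
    -webkit-text-fill-color: transparent;
  }
</style>
```

The crawler sees the heading as ordinary text in the `<h1>`, while visitors see the styled version.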
Flash content is another big one. Thankfully, it's on its way out. Flash used to be the big way of animating content, but over the last few years it has been hit repeatedly with massive security holes, and as a result, many major web browsers have decided to stop loading Flash automatically, requiring permission to load those elements instead. It's a great security measure, but it hurts any site using Flash to serve content, and any ad network serving Flash-based ads.
There’s something known as the “Google dance” that happens when something changes, either on your website or on Google’s end in their algorithm. In either case, it’s a bit of a shake up as Google re-indexes your site and re-categorizes it according to their algorithm. Your ranking will rise and fall, sometimes significantly, while they try to figure out where you should rank in the new order of things.
If you have recently completed a redesign of your site, it’s possible that your site is in the middle of the Google dance. As far as Google can see, your old content has disappeared and your new content needs a place to rank. In this case, you might not show up in the rankings, but you will come back eventually.
One way to tell if this is the case, other than knowing that you recently did a major site redesign or overhaul of your homepage, is checking how long you’ve been low or nonexistent in the Google rankings. Generally, the so-called Google dance won’t last more than a few days, or a week at the most.
Google has a lot of usability rules and best practices in place to try to make sure web users get the best experience possible. In part, this is why so many sites have such similar designs these days. How often do you visit a site and know immediately how it's going to look? Logo in the corner, navigation up top, content in the middle, ads you can ignore along the side, footer with less important links, etc.
In part, the reason for this commonality is that a lot of sites are modeling themselves on top of common themes and designs, and those that aren’t are generally modeling themselves after big name sites. The big sites are in turn doing what works, and “what works” is determined by what Google promotes.
That's not to say that Google is directly dictating the design of the web. It is to say, however, that Google has rules in place about what makes a design good versus bad.
Some elements of a bad design include links that are hard to click, text that is too small or hard to read, the media-as-homepage issues from above, and issues with odd colors, fonts, and layouts. Sure, it might be innovative to have a webpage that scrolls horizontally rather than vertically, but it’s not going to do you any favors in terms of your audience. Leave that kind of experimentation to the artists.
This is one of those technical errors that could cause an issue, but rarely comes up. The only scenario where I’ve seen something like this happen is when you’re doing a redesign and testing it on your live server, so you put noindex in your pages so Google doesn’t index a hybrid or broken partial design. Then when you go live with the full thing, you forget to remove the noindex, and wonder why you’re not indexed.
There are a few different ways you might have noindexed your content. The first is with a meta tag on the page itself. Look at the top of your code, in the head section before the body. You're looking for a tag with meta name="robots", which is the directive telling various robots, like the Google search spiders, what they're supposed to do. If it has content="noindex" on it, search engines are being told not to index the page.
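For reference, the tag in question looks something like this when it's blocking indexation:

```html
<head>
  <!-- Tells all crawlers not to index this page -->
  <meta name="robots" content="noindex">

  <!-- A stricter variant: don't index the page, don't follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

If you find either form in your live homepage's source after a redesign, that's very likely your culprit; remove it and request reindexing.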
If no such meta directive exists, check another location. Some sites don't use on-page meta directives for robots; they use a separate file called robots.txt. This file is generally placed in the root directory of your website and controls which parts of the site crawlers are allowed to visit. For example, if you wanted to block all robot traffic to the site, you would have a user-agent: * line, which says the directive that follows applies to all robots. If you only wanted to block a specific robot, you would use user-agent: googlebot, or whatever the official name of the bot you're trying to block happens to be.
Below that, you would have the line "disallow: /". This blocks the bot from the path specified. In this case, since it's just a /, it means everything on the domain: given www.example.com/, every page on the site would be disallowed.
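You can sanity-check a robots.txt file like the one just described with Python's standard-library parser; the two rules here are the block-everything example from above, with www.example.com standing in for your domain:

```python
from urllib.robotparser import RobotFileParser

# The two lines described above: apply to all robots, block everything.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot is covered by the * wildcard, so nothing is fetchable.
print(parser.can_fetch("Googlebot", "https://www.example.com/"))           # False
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post"))  # False
```

Pointing the same check at your real robots.txt (via `parser.set_url(...)` and `parser.read()`) is a quick way to confirm whether a leftover disallow rule is what's keeping your homepage out of the index.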
Finally, you might have some kind of lingering blocks against user agents or site referrers in your .htaccess file. You can read all about how this works here.
Pretty much all of the above issues are only issues for small sites. Larger sites can get away with a lot simply on the power of the links pointing to them, the age and variety of their content, and long years of white hat SEO power. Smaller sites don't have that going for them, so they're in a more precarious position. It's easier to push over a tire than it is to push over a whole car, right?
Unfortunately, while this is the easiest issue of the bunch to diagnose, it's also the hardest to fix. If your site isn't ranking, you need more links and implied links from around the web. However, extended link building campaigns are just that: extended. You need to go about it over time; you can't just bombard the world with links and expect it to get you where you want to be.
How can you build enough links to give your homepage ranking and brand power? You can partake in influencer networking. You can start a guest posting campaign. You can work with more social marketing, or more paid marketing to the right audience. There are a lot of options, and frankly, you should be working on them all.
This is a relatively rare occurrence, but it's possible that having your homepage content stolen has made your page rank lower or not at all.
Now, generally, Google is resilient to this kind of attack. No one can come in and just copy my home page and drive my site down in the rankings, because Google knows mine came first. They go by the date the site was indexed, rather than the date it displays on the content, specifically to prevent abuse via backdating.
The problem comes up if you were deindexed for some reason, if you changed URLs and didn't implement a proper redirect, or if someone managed to copy your site before you were indexed in the first place. In all of these cases, it looks to Google as though your content was actually second in line, which means you may need to change your content to appear unique. You can report a scraper, but without some weight behind your site, there's no guarantee Google will act; if they still think you're the copycat, the report may go nowhere.