There’s a pervasive myth in SEO that Google uses exactly 200 factors to determine search ranking. The claim is misleading, but it makes for a great clickbait headline. After all, that’s why the big Backlinko guide is so popular; it purports to list all of them. This is, of course, a massive lie.
Now, I’m not going to tell you that Backlinko’s list is wrong, or in any way detrimental to use. It’s actually quite a useful tool; it’s just not necessarily what Google pays attention to when ranking a site. Many of the factors listed have numerous variations – Matt Cutts has mentioned up to 50 for some factors – and many of the factors listed are only correlated with ranking, not necessarily causes of it.
When traffic rises, so does your ranking – but is your traffic going up because of the higher ranking, or is the higher ranking coming from higher measured traffic? Does having more Facebook likes rank a page higher, or do higher-ranked pages simply have more content worth sharing and liking? That’s the question you have to ask about most factors on any such list.
I’ve gone ahead and compiled as many known actual ranking factors as I can find. All of these will have some effect on your site ranking, though some will be much more or much less valuable than others. It’s up to you to analyze your site, determine which changes will have the most effect for the least effort, and apply those first.
One thing to note is that some of these factors could be described as indexation factors rather than ranking factors. I include them because they’re important, though. If there’s something getting in the way of indexing part of your site – like issues with robots.txt – those pages aren’t indexed, and thus don’t provide value to your site. Fixing the issue will increase the indexed content on your site and, as long as it’s not spam content, will boost your rankings.
The important subtlety to this is that indexation factors can’t really be “improved” so much as they can be fixed. You can’t take a serviceable but messy robots.txt file, clean it up, and see a noticeable increase in ranking. If your robots.txt file is fine, you can’t finagle any more improvement out of it. On the other hand, an actual ranking factor like keyword relevance can be improved through tweaks to content, headlines, and metadata. You can usually squeeze a little more value out of your content by tweaking wording or updating information, though sometimes the effort outweighs the returns and other changes will have a bigger effect.
Content Relevance to Query
Obviously, this is one of the biggest ranking factors that exists. Every piece of content is potentially relevant to some queries, but is completely irrelevant to all others. You wouldn’t want turkey recipes to show up in your queries about Fantasy Football, no matter how good the recipe is.
This is the first thing many SEOs learn: ranking and relevance only apply to specific queries. Anyone wanting to be “number one on Google” needs to learn that the phrase is always, always followed by “for X search query.” The relevance of content comes down to an analysis of that content, both across your site as a whole and within individual pieces of content. This is why it’s so hard to get a general interest blog off the ground, whereas niche blogs can ride an elevator to the top in many cases.
Presence of Relevant Keywords
Keyword relevance is what determines content relevance. Google is smart, and getting smarter all the time, so exact match keywords are becoming less and less important. Writing the same piece of content three times with different synonymous keywords won’t help you; to Google they’re all basically the same thing.
This is also where I’m going to tell you that keyword density doesn’t matter, except in one special case, which is where you overdo it and end up looking like a spam site. I’ll cover that more in a spam section. For the most part, paying attention to keyword density is a waste of time; content relevance is more important. Three instances versus four doesn’t matter at all, unless the fourth is shoehorned into a spot where it doesn’t make sense, in which case it’s a negative.
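For the curious, keyword density is nothing more than occurrences of a keyword divided by total words. A quick sketch of the metric (a real crawler tokenizes far more carefully than this):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that are occurrences of `keyword`.
    A rough illustration of the metric, not how any search engine counts."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

sample = ("Turkey recipes are easy. A good turkey recipe starts with brining "
          "the turkey overnight and roasting it slowly.")
print(round(keyword_density(sample, "turkey"), 1))  # → 16.7
```

Even in this short sample, a density well over 10% reads naturally; the number itself tells you very little, which is exactly the point.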
Number of Valuable Links
Links are one of the most influential metrics to measure. There are only a few basic qualities to links, but they all matter: the total number of links pointing from other sites to yours; the number of different domains those links come from; the relative quality of the sites linking to you; the number and quality of links from your site out to other sites; and finally, the number of links from parts of your site to other parts of your site – internal links.
A lot of SEOs put extra emphasis on .gov and .edu TLDs, but these don’t actually have an effect. The only reason they are perceived as more valuable is because most of the time they are high quality sites. If you were to pay 400 college students from around the country to create simple sites on their .edu hosting and link to your site, you would find that those links are nowhere near as valuable as a single link from a college resource page.
Location of Links on Page
This is another link factor that is worth mentioning on its own. Links in a footer directed at your site aren’t very valuable. Neither are links in sidebars – which look like ads – or links in top navigation. Links in content – contextual links – tend to be much better. That’s because they are not part of the site design; they are intentionally chosen to be present on the page the user is viewing, making them more relevant and important.
Use of Google Analytics
Using Google Analytics and Google Webmaster Tools is not itself a ranking factor. However, you can consider it more of an anti-spam verification measure. Most black hat marketers running spam sites don’t want to give Google an inside look at their statistics, because when Google gets that look, they are more likely to be able to identify spam techniques. They will often use third party analytics tools instead. This means that by using Google Analytics, you are telling Google you are okay with the scrutiny. Regardless, this is a very minor factor, all things considered.
Local Search Relevance
Local search is a variation on search relevance, but there are additional factors that go into being locally relevant. One of them is using geographic keywords such as names of towns or streets in your content. Another is having Google Maps integrated on a location page. Another is having your address posted. You can optimize your local search presence and get yourself into carousel results as well, by implementing a basic Google+ page. Using Google+ is not generally a ranking factor, but it helps for tertiary search results for local businesses.
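One common way to make that address machine-readable is schema.org LocalBusiness markup, embedded in a `<script type="application/ld+json">` tag on your location page. A minimal sketch – every value here is a placeholder:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100"
}
```

This isn’t a ranking factor on its own so much as a way to remove ambiguity about who and where you are.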
Use of HTTPS Security
Late last year, Google decided to take steps towards making the web more secure, and one of those steps is encouraging webmasters around the world to implement SSL security on their sites. To do this, they decided to make a secure site into a ranking factor. By implementing site-wide SSL, you will very likely get a boost to your rankings. You just have to make sure you’re not running into one of the few issues that can come with site-wide security. At the very least, you need security on any page that requires a login or on any page that is used for the purchase process.
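As a sketch of what site-wide SSL looks like in practice, here is a minimal nginx configuration that permanently redirects all HTTP traffic to HTTPS. The domain and certificate paths are placeholders, and your server setup may differ:

```nginx
# Redirect all plain-HTTP requests to the HTTPS site
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}

# Serve the actual site over SSL
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;
    # ...the rest of your site configuration
}
```

The 301 redirect also sidesteps one of those site-wide security issues: without it, Google can index both the http:// and https:// versions of every page as duplicates.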
Not Overusing Ads
Ads are one thing that makes the web a worse place to browse, so Google is very keen to keep them at a reasonable level. The more ads you use on your page, the harder it will be to rank. Excessive use of ads will bring down your search ranking, particularly if the site content doesn’t back up the kinds of ads you’re running. This helps minimize the presence of thin affiliate sites and promote sites that have valuable content to share.
Page Load Speeds
Page loading speed is a ranking factor whose influence grows the worse it gets. Below one second, loading times aren’t terribly important; the difference between half a second and a third of a second is minimal. Above one second, though, every additional second hurts disproportionately more. If your site takes more than one second to load, work on it until it loads in under one second. If it already does, you don’t need to do much.
Branded vs. Exact Match Domains
Branded domains are on the rise these days. Exact match domains have the deck stacked against them, though they are not automatically penalized to an extreme level. Exact match domains tend to have some benefit in that links pointing to them will have a built-in keyword integration. However, they’re fighting against trends to maintain that value. Essentially, if you have an old EMD that you’re using, you’re fine. You don’t need to rebrand. However, if you’re starting a new site, don’t go for an EMD. You’ll have to work a lot harder to reach the level of value that you can with a branded domain.
Proper Use of Robots.txt
I mentioned this one in the intro. A good robots.txt file is good. A messy but readable robots.txt file is fine; it works, and it doesn’t break anything. Cleaning it up won’t improve your ranking. It’s only when you’re accidentally blocking Google from seeing your pages that you have issues with robots.txt.
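The classic way robots.txt breaks indexation is a single stray slash. The first block below tells every crawler to stay away from the entire site; the second is what was probably intended (the /admin/ path is just an example):

```text
# Accidentally blocks the whole site from every crawler:
User-agent: *
Disallow: /

# What was probably intended -- block only the admin area:
User-agent: *
Disallow: /admin/
```

Per the intro, fixing the first into the second restores indexation; fiddling further with an already-working file gains you nothing.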
Availability of Sitemap
This is a very minor ranking factor in most cases, as it helps Google keep up with new content and changes on your site. However, if your site has orphaned pages or poor internal linking, a sitemap can dramatically increase the indexation of your site, and thus your ranking. Again, it’s an indexation factor more than a ranking factor.
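A sitemap is just an XML file following the sitemaps.org protocol. A minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-03-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://example.com/orphaned-guide/</loc>
    <lastmod>2015-02-10</lastmod>
  </url>
</urlset>
```

You can point Google at it with a Sitemap: line in robots.txt or by submitting it in Webmaster Tools; either way, pages with no internal links pointing at them become discoverable.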
Age of Domain
The age of a domain is a minor ranking factor, and typically only matters as a milestone. Domains under six months old are quote-unquote “penalized” in that they have a minor handicap. Once they pass six months, or a year, or some similarly low milestone, they join the ranks of normal domains and Google lifts the “penalty” that isn’t really a penalty. It’s simply an indication that the site isn’t going to disappear and can be ranked more confidently.
Accumulation of Spam Factors
Very rarely will any one spam factor destroy your search ranking. A little bit of duplicate content won’t kill you. A few spam links won’t kill you. Having your WHOIS information protected isn’t by itself a negative, it’s just a little fishy. Overly high keyword density can hurt but won’t ruin you on its own unless it’s a constant problem. However, all of these factors combined can hurt you a lot more and earn you a spammer label or even complete deindexing.
Length of Content
A study performed on the top ranked content across various keywords indicates that the ideal length for content is around 2,500 words. Content much shorter than 1,000 words won’t have the space to cover a topic and provide enough value. Content that stretches much longer will cause issues with reader attention, and may be better presented as a detailed study or as an ebook for additional value. That said, content length is primarily a ranking factor in that too-short content tends to be spammy in nature and thin in value.
Originality of Content
Duplicate content is a bad thing. Duplicating a little bit of content – quotes in your posts, the occasional product description – isn’t going to destroy your site. However, widespread duplicate content will be very detrimental to your site as a whole.
There’s also the issue of originality in topic and information provided. If you’re just taking the #1 ranked post for your topic and writing the same thing without providing any additional value, you’re not going to do as well. You bring nothing new to the table, and the existing site has a head start in links and prestige.
Freshness of Content
Google likes fresh content, particularly when that content provides new value. This doesn’t have to be brand new posts, though. Fresh content can also be old evergreen posts that have been updated to be made factually accurate. That’s why major old guides that were published years ago can still rank higher than newer guides on the same subject; they have been updated to include the additional value necessary to succeed.
Absence of Code Errors
Code errors are one of the issues that can demote or derank your site, depending on the severity of the issues. Minor issues that make things display incorrectly or throw errors that are overlooked can be, well, overlooked and aren’t a huge detriment. Errors that break entire pages or lead to 404s will be much worse for your site.
Ease of Navigation on Site
This includes robust navigation in the top bar, plenty of internal links, related posts widgets, and breadcrumbs, among other things. Ideally, no page should be more than 3-4 clicks away from any other page on your site. The less effort it takes to get from point A to point B, the better. However, try to make sure your site operates on a logical structure as well, so the organization makes sense.
Availability of Contact Information
Google likes when you’re able to see the contact information for the owner of a site or business. An address is good for local businesses. A contact form is good, but a raw email address is often better. Phone numbers and other means of contact can’t hurt. Just make sure that any contact information posted for your business on other sites matches what you have on your own.
TrustRank and Trust History
TrustRank is an internal “benefit of the doubt” ranking that Google uses for various purposes. A site that has a long history of providing original content and good value will be given a bit of a pass if it posts a less than stellar piece of content. TrustRank is also used to help determine the origin of a piece of duplicate content; the site with the higher TrustRank will be given the source credit, assuming all else – publication date, etc. – is equal.
Mobile Compatibility
Mobile compatibility is a search ranking factor, though a very minor one for desktop search. However, a huge percentage of web users these days browse on mobile devices, and a lack of mobile optimization will be a huge detriment to your appearance in those search results.
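The baseline for mobile optimization is a responsive layout. A minimal sketch – the .sidebar class is a hypothetical example from an imagined theme:

```html
<!-- Tell mobile browsers not to render the page at desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Stack the sidebar under the main content on narrow screens */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

If your theme already does this, you’re most of the way there; Google’s mobile-friendly test will flag anything it still considers broken.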