For a long time, there has been an ongoing argument over whether or not PageRank is a valid metric. On one side of the issue were webmasters who still used it, and to some extent Google themselves. On the other side were a whole lot more webmasters.
In the early days of the web, when Google was first coming into play as a major search engine, one thing they did was release the Google Toolbar. You might remember this as one of many, many toolbars pushed by services all over the place, from Alexa to Yahoo. The Google Toolbar had a number of potentially useful features, including an ever-present search box, but its biggest new feature was the visibility of PageRank.
Now, PageRank has been an important metric for sites since day one. Though many people think the name refers to the rank of a page, it is in fact named after Larry Page, co-founder of Google. It has always been part of the Google algorithm, and in fact WAS the Google algorithm for a long time. Even today it's still a core part of the algorithm; it's just not public.
Back in the day, though, PageRank was invisible. It was something Google mentioned in research papers, not something the public could see. Then, when the toolbar came out, they made it visible to any web user or webmaster. All they had to do was enable the PageRank meter, and they could see a reading of how valuable and important Google considered the page they were visiting. Better pages had higher PageRank. Worse pages had lower or missing PageRank.
It was an imperfect system, of course. There have always been issues with toolbar PageRank, since it compressed everything onto a simple scale of 0-10. There's so much potential variation hidden within each number that it was a barely viable metric at best.
The problem, though, wasn't that Google made PageRank visible. It was that by doing so, they enabled an incredible push towards using it as a results metric. Marketers took over. They constantly pushed and prodded, testing and trying, looking for anything they could do that would increase PageRank.
The end result was the general reverse-engineering of PageRank in such a way that everyone could understand it. PageRank relied heavily on incoming links. More links, particularly links from higher-PageRank sites, meant a higher PageRank for your own site. This is where it all went wrong.
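The core idea wasn't even secret: Page and Brin's original paper describes a "random surfer" model where a page's rank is fed by the rank of the pages linking to it, split across their outgoing links. Here's a minimal power-iteration sketch of that public formulation on a toy three-page graph (standard 0.85 damping factor; Google's production system layers far more on top of this):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start uniform
    for _ in range(iterations):
        # Every page gets a base share, plus damped shares from its in-links.
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:
                # Dangling page: redistribute its rank evenly.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# "c" is linked from both "a" and "b", so it ends up with the highest rank.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

This is exactly why links became currency: every inbound link from a well-ranked page is a direct deposit into your own score.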
If you’re a webmaster, you can probably look through your spam inbox and find hundreds of messages asking for links or “guest posts” and other sorts of deals. These deals all have one thing in common; they come from someone who wants a link on your page. Why do they want a link from your page? Because your page has a high PageRank and by getting a link from you, the owner of the site you’re linking to gets value.
God help you if your site ever showed up on one of those lists.
With one move, Google turned their internal metric into a million-dollar business. Companies sprang up who knew how to boost PageRank – or claimed to know – and who made a ton of money in gaming that metric. It kicked off a major war between Google and spammers, an arms race of techniques to game the system and techniques to detect unnatural link building.
To get a better PageRank, you needed more and better links. To get those, you needed to, more often than not, pay for them. Like money in politics, this monetary investment meant the biggest sites had the power to stay big, and little sites had barely any means to get to the upper tiers.
Google has largely spent the last 15 years struggling to win the war they inadvertently started. They won a lawsuit filed by SearchKing, a firm that sold links for PageRank gains. They implemented NoFollow. They launched Panda and Penguin and all the other algorithm updates in an attempt to devalue manipulative links and boost the value of quality content.
Yet through it all, PageRank has still been there, updated on the toolbar, a beacon for marketers who fear change. Updates started tapering off, but the metric remained public, even as the data behind it grew months or years out of date. Five years ago, the toolbar was dropped from Firefox. Three years ago, there was a 10-month gap between PageRank data updates. Since the start of 2013, there have only been two updates: one in February of that year, and another in December.
Since then, the debate has raged. Is PageRank dead? Is Google ever going to update it again? Is it still valuable as a metric even with older data? Part of the problem here is that, when asked about it directly, Google spokesman Matt Cutts claimed they weren't going to retire PageRank because a lot of "casual web users" still used it. It was a less and less valuable metric for SEO professionals, but ordinary web users were still using it as a measure of trust in a site.
Of course, this is at odds with reality. The Google Toolbar doesn't work on Chrome, was dropped from Firefox, and is very hard to install on IE or Edge. It's simply unavailable to most people. And now, as you've noticed since you're reading this post, PageRank is well and truly dead.
As of March 7, 2016, PageRank is officially dead. Any tools, first- or third-party, that checked PageRank will cease to function. This applies to the external public PageRank display only. Internally, PageRank remains a core part of Google's algorithm, and it's still in use. You just can't see it, just like you can't see much of anything else.
Links still matter to your search ranking. You still have a ranking internally that judges the quality and the number of links coming in to your site. Penguin still filters them, and you still want a varied link profile. It’s all just no longer public.
Now, I know that as marketers, webmasters, and SEOs, we all want some nice little metric we can record to measure our efforts and report our successes. Search ranking is nice, but with the variability between keywords, it's not a truly objective metric. It requires a lot of context that is hard to present in a short report. PageRank was a limited metric that objectively graded a site based on links, and it worked well for that, despite the war it started.
There are other metrics you can use. Over the last few years, with PageRank never receiving an update, other companies jumped in to harvest data, analyze it, and present their own rankings. These are the alternatives you can use as authoritative measurements of your site quality and position.
Moz has a few major metrics it uses to help marketers understand different aspects of their sites. They have Page Authority, Domain Authority, MozRank and MozTrust.
You can find all of these metrics, and a lot more analytics data, by using Moz's Open Site Explorer. If you're a pro-level subscriber, you can also use the MozBar, an analytics toolbar available for various browsers, which can replicate the easy at-a-glance functionality of the Google Toolbar.
If you want something other than Moz – and it’s possible you don’t like them, don’t want to pay for their tools, or just want other data sources – there are other options as well.
Majestic is one of the top-tier link analytics sites on the web, and they have produced an innovative way of measuring link quality and presence. These are two interconnected metrics known as Trust Flow and Citation Flow. Every link to a site has a value between 0 and 100 for each of them, which positions it on a 100×100 grid. Majestic uses this grid to visually display the distribution of links to a site. It’s very neat to look at.
Citation Flow is a number that measures how influential a page is based on the sheer volume of links flowing into it. Each page linking to you has its own inbound links measured and compiled, and that one-degree-removed link count feeds into your Citation Flow.
Trust Flow is a similar metric, except it takes into account trust rather than just the number of links in a mathematical equation. Trustworthy sites tend to link to other trustworthy sites, and get links in return. The more trust that is circulating in a network when that network links to you, the better your link’s trust flow will be.
Each link is rated and placed on the chart, and the data from that chart is used to calculate the flow metrics your page would pass along if you linked out to another site. So each page has these flow metrics, and each incoming link has them too.
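Majestic's exact formulas are proprietary, so the following is only a toy illustration of the conceptual difference the two metrics capture: a citation-style score counts raw inbound links, while a trust-style score only flows outward from a hand-picked seed set of trusted pages, decaying with each hop. The site names here are hypothetical.

```python
def citation_score(links):
    """Citation-style: raw inbound-link counts, no quality judgment."""
    scores = {page: 0 for page in links}
    for outs in links.values():
        for target in outs:
            scores[target] = scores.get(target, 0) + 1
    return scores

def trust_score(links, seeds, decay=0.5, hops=4):
    """Trust-style: propagate outward from trusted seeds, halving per hop."""
    trust = {s: 1.0 for s in seeds}
    frontier = set(seeds)
    for hop in range(1, hops + 1):
        nxt = set()
        for page in frontier:
            for target in links.get(page, []):
                gained = decay ** hop
                if gained > trust.get(target, 0.0):
                    trust[target] = gained
                    nxt.add(target)
        frontier = nxt
    return trust

web = {  # hypothetical mini-web
    "seed_site": ["quality_blog"],
    "quality_blog": ["your_site"],
    "spam1": ["your_site"], "spam2": ["your_site"], "spam3": ["link_farm"],
    "link_farm": ["spam1", "spam2"],
}
cites = citation_score(web)
trust = trust_score(web, seeds={"seed_site"})
# your_site has three inbound links (high citation-style score), but only
# one of them sits on a trusted path: seed_site -> quality_blog -> your_site.
```

The takeaway is why the two numbers diverge: a link farm can inflate a citation-style count cheaply, but it can't manufacture a path back to the trusted seed set.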
Both metrics can be seen easily on a site-wide basis with Majestic’s free tool, and more data about them – on page level and link level – can be seen using paid versions of their tool.
Ahrefs is another of the top-tier link explorers in the industry today, competing with Majestic. It has several different tools, each with a bunch of metrics you can explore to use for your reporting. The site explorer is a general backlink profile study, which shows you the number of pages and backlinks coming in to your site, the traffic estimates, the ranking and rating for both URLs and domains as a whole, and a global ranking just for fun. It’s the most comparable to PageRank and works pretty well.
Other tools include the positions explorer, which helps look over your organic search rankings, and the content explorer, which shows you ranking and trending content in your niche, including your own positioning. There’s more data you can study, of course, but the main rankings for useful comparative analysis are contained in these tools.
Oncrawl is a smaller tool, and it's actually less useful for comparative analysis, because it's not an internet-wide tool. Rather than dealing with link value from all the sites in Google's index, Oncrawl focuses entirely on your own site. It uses a general database to determine the value of your pages, and analyzes your internal linking to find opportunities for distributing link juice internally in a more optimized manner.
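This kind of internal-link audit is easy to sketch yourself. The following is an illustrative toy, not Oncrawl's methodology: it counts inbound internal links per page and flags thinly linked pages that receive little link equity. The URL paths are hypothetical.

```python
# Map each page on the site to the internal pages it links to.
internal_links = {
    "/": ["/blog", "/about", "/pricing"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/pricing"],
    "/blog/post-2": [],
    "/about": [],
    "/pricing": [],
}

# Count inbound internal links for every page.
inbound = {page: 0 for page in internal_links}
for outs in internal_links.values():
    for target in outs:
        inbound[target] = inbound.get(target, 0) + 1

# Pages with one or fewer inbound links receive little internal link
# equity; adding links to them from strong pages spreads value better.
weak = sorted(p for p, n in inbound.items() if n <= 1 and p != "/")
```

On a real site you'd build `internal_links` from a crawl, but even this simple count surfaces the same class of opportunity these tools report: pages buried one link deep that your strongest pages never point at.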
There are a bunch of other tools out there that measure links as well, but the best are already on this list. If you find yourself needing something more, you might consider deeper analytics tools, and looking at metrics other than just links.