Two elements of web usage are frequent sources of controversy: bounce rate and click-through rate. Both are simple to understand on the surface, but each has complexity that many people don't realize exists. The real question, though, is whether they have an effect on your SEO. Let's examine them both.
First, I want to talk about bounce rate, because bounce rate is the trickier of the two to grasp fully.
At first blush, it seems easy to understand. A user performs a search and gets search results. They click a result. They quickly click away from the result. That site records a bounce visitor. It’s not great to have, right? The user is indicating that they didn’t really want to be on the site, so they left.
There’s more depth to it. For example, what happens if the user clicks to the site and then closes their browser? There’s no second point of reference, so it counts as a bounce. Now what happens if the user simply leaves the browser window open? There’s still no second click, no second reference trigger, so it counts as a bounce. That user might have a guide open for an hour, referencing it step by step, but because they didn’t click a second time, there’s no indication they did anything, so it counts as a bounce.
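To make that measurement rule concrete, here's a minimal sketch. It assumes a simplified analytics model (not any vendor's actual implementation) where a session is just a list of recorded hits: a visit with only one hit counts as a bounce, no matter how long the page stayed open.

```python
# Simplified model: a session "bounces" if it records only one
# interaction hit, regardless of time spent on the page.

def is_bounce(session_hits):
    """A session with a single recorded hit counts as a bounce."""
    return len(session_hits) == 1

def bounce_rate(sessions):
    """Share of sessions that bounced, as a percentage."""
    if not sessions:
        return 0.0
    bounces = sum(1 for hits in sessions if is_bounce(hits))
    return 100.0 * bounces / len(sessions)

# Three sessions: one pageview-only visit (a bounce, even if the tab
# stayed open for an hour while the user followed a guide), and two
# multi-hit visits.
sessions = [
    ["pageview"],
    ["pageview", "pageview"],
    ["pageview", "click", "pageview"],
]
print(bounce_rate(sessions))  # roughly 33.3
```

The point of the sketch is the first function: without a second hit, the tool has nothing to distinguish "left immediately" from "read for an hour and closed the tab."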
There’s also the case where you’re looking for a specific piece of information, like the routing number of your bank. You search it and find your bank’s page of routing numbers, and click it. You get your number and leave.
It’s a bounce, because you only viewed one web page, but you still got the value you wanted and you left satisfied. This is why bounce rate is not necessarily all bad, and it’s why a high bounce rate can’t automatically count against you. If Google penalized websites for providing quick and accurate information, the search results would be populated with two-stage pages that ask a question to draw in users and answer it only after a click. It would be inefficient and exploitative, not an environment Google wants to foster.
That’s all theory, but the reality is somewhat more practical.
How would Google measure your bounce rate? They would need to have some kind of code on your site. “They have Google Analytics,” you say, and that’s true; a lot of sites do use Google Analytics. However, plenty of other sites do not, and Google can’t give sites preferential treatment just because they use the GA suite. Sure, GA is one of the better analytics suites out there, but that doesn’t mean it should be required.
On top of that, even if everyone did use GA, Google would need to set standards and publish definitions. What counts as a bounce and what doesn’t? How can they tell the difference between a good and a bad bounce? It’s a problem Google has not been able to address, and by all accounts it’s firmly on the back burner, behind priorities like the AI they plan to have take over the world shortly.
There are a lot of different factors that can cause a high bounce rate, and only some of them are bad. If you have some of the bad ones, it’s probably worth fixing them if you can.
First, your site design might be turning people away. A cluttered visual design, usability issues, obnoxious Flash animations, far too many ads, or a pile of redirects and interstitials will all lead to a high bounce rate. The more time a user has to spend digging for your content, or fighting your design to view it, the more of them will simply leave and find a better resource. There’s far too much competition on the modern web to get away with a poor design.
Second, the content on your site might cause a bounce. This comes in two forms: good and bad. Poor quality or thin content will cause a bounce because users conclude the site holds no value and leave, and thin content will hurt your site in other ways besides. By contrast, great content that answers a question quickly and immediately can also cause a bounce, because users find what they’re looking for and leave. The solution to this problem is to surround your content with enticing links and CTAs to other parts of your site. It’s why internal links and related-post widgets are so popular: they draw the user to other parts of your site, keeping them from bouncing.
Third, actual technical issues can cause problems and lead to bounces as well. For example, the Forbes welcome screen still loads if you have an ad blocker installed, but the timed redirect and the button to continue do not, so the user can’t proceed and bounces. Other sites might have similar issues: a redirect pointing to a missing URL that 404s, broken scripts that cover the top half of the page in code, or images and CSS that fail to load, leaving the page looking like an unformatted text document. There are a ton of different issues that can drive users away.
Fourth, there might be a mismatch between keyword and content. If you use one keyword frequently enough to rank for that search, with content that is good but not targeted at that topic, the user might visit and find themselves unsatisfied. They’ll bounce, not because your content is bad, but because it’s not relevant. Ideally, this will be less and less of an issue as Google gets better at parsing meaning, but it can still happen so it’s good to pay attention to your keyword usage.
Fifth, an external site might link to you under the wrong context. A cursory glance at a page might make it seem more valuable and relevant than it is, and an overworked author might include the link without a proper review. A user clicks the link and visits the page, only to find that the page isn’t actually relevant or valuable, so they bounce. Thankfully, this tends to be relatively rare.
So, that’s bounce rate. There’s another half of the title question, however, and that’s click-through rate. Let’s talk about that, shall we?
Click-through rate is a somewhat more complex issue than bounce rate, because it’s something Google can in fact directly track. They serve your link, people either click on it or don’t, and Google can see exactly what percentage of users do. Organic click rates can be measured easily. Paid click rates are a different story, but Google also tries to keep money out of organic SEO, so it makes sense that they would only pay attention to organic data.
Moz did a pretty good study on click rates and their effect on SEO, so let’s take a look. You can read the post here. There’s a lot of discussion of Google’s RankBrain AI and its role in search ranking, but it’s not the main emphasis here. Larry Kim did some testing with organic and paid search data, using queries that don’t have a lot of incoming links or strong ranking data, which is about the most even playing field you can get for testing this kind of thing.
What he found was that in terms of paid search, long tail keywords didn’t perform as well as they did in organic search. Organic search results seemed to improve the higher the CTR, but it’s a chicken and egg problem. Does the site rank higher because it has a higher click rate, or does it have a higher click rate because it’s ranked higher?
It seems as though Google’s algorithms predict a certain CTR for a given page, with content and links and all the rest taken into account. If the page performs better than expected, it is given a higher search ranking. That means if a page is more attractive or more valuable than the average post for that search, the post will do better. Seems logical, right?
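That "beats the expectation" idea can be sketched in a few lines. The CTR-by-position numbers below are made up purely for illustration; this is not Google's actual model, just the shape of the comparison it appears to make.

```python
# Hypothetical average organic CTR for each ranking position.
# These figures are illustrative, not real Google data.
EXPECTED_CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def beats_expectation(position, clicks, impressions):
    """True if a page's actual CTR exceeds the expected CTR
    for its current ranking position."""
    actual = clicks / impressions
    expected = EXPECTED_CTR_BY_POSITION.get(position, 0.02)
    return actual > expected

# A page in position 3 earning 14 clicks from 100 impressions
# outperforms the ~10% expected for that slot, which (per the
# theory above) would nudge it upward.
print(beats_expectation(3, 14, 100))
```

The takeaway: two pages with identical CTRs can be judged differently, because what matters is the gap between actual and expected performance for the slot each occupies.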
So how can you ensure that you get a higher CTR, so you can rise in the ranking according to Google’s AI and their other search factors?
First, you need to figure out which pages on your site are holding you back. If you go to Google Search Console, you can download the performance data for each indexed page on your site. Sort through it and figure out which pages have low or below-average click-through rates. These are the pages holding your site down, and you need to either remove them or buff them up so they’re more attractive. The better they perform, the better your site as a whole will perform.
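As a sketch of that sorting step, here's one way to flag below-average pages from a CSV export. The column names "Page", "Clicks", and "Impressions" are assumptions; adjust them to match whatever headers your actual export uses.

```python
import csv

# Flag pages whose CTR falls below the site-wide average CTR,
# given a CSV export with (assumed) columns Page, Clicks, Impressions.

def low_ctr_pages(csv_path):
    with open(csv_path, newline="") as f:
        rows = [
            (r["Page"], int(r["Clicks"]), int(r["Impressions"]))
            for r in csv.DictReader(f)
            if int(r["Impressions"]) > 0
        ]
    total_clicks = sum(clicks for _, clicks, _ in rows)
    total_impressions = sum(impr for _, _, impr in rows)
    avg_ctr = total_clicks / total_impressions
    # Pages underperforming the site-wide average are audit candidates.
    return [page for page, clicks, impr in rows if clicks / impr < avg_ctr]
```

A page on this list isn't automatically bad, but it's where improvement (or removal) will move the needle most.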
Larry Kim recommends optimizing titles – primarily the meta title that shows up in search – for emotional impact. Compare a dry, data-heavy title with one that promises “awful” conclusions about the same topic: which of those posts would you rather click?
Personally, the second one seems a lot more interesting. The first implies a lot of boring data and not much analysis. The second promises conclusions that are “awful”, which might not be something I want to see. More importantly, though, it’s something I probably need to know. You don’t have to pick a negative emotion, though; positive emotions work just as well in most cases.
Second, you need to build brand awareness. People are a lot more likely to click through to sites they recognize. In SEO, for example, I’m a lot more likely to click on a search result from Moz, from Search Engine Journal, or from KissMetrics, than I am from Bob’s SEO Page or www.ultimateSEOprofessionalsOnline.biz.
Normally, building brand awareness is a tough and lengthy process, but you can speed it up significantly by identifying your core audience and targeting them with awareness-based ads. Simple, cheap website click ads on Facebook and through AdWords will go a long way towards building brand awareness. Just make sure not to go all-in on the low-quality ad networks that will put you on spam sites; that builds awareness, sure, but it associates your brand with sites you would rather not be associated with. It’s like setting up shop in a bad neighborhood.
Third, of course, you need excellent content. Rand Fishkin at Moz calls it 10x content, but it’s a concept I’ve espoused a lot on this site as one-up content. The concept is the same, regardless: content that is better than what exists already. Anything your competitors can do, you can do better.
The criteria Rand puts forth for 10x content, in short: a great user experience on any device; content that is high quality, trustworthy, useful, and interesting; considerably more scope and detail than other works on the topic; an emotional response like awe, surprise, or joy; a comprehensive answer to the searcher’s question; and delivery in a unique, remarkable style or medium.
Make great, compelling content and get people to click through to it. It will help your search ranking.