

Why bounce rate doesn’t matter (and when it might)

Bounce rate doesn't matter. Or, to be more precise, it doesn't matter in the way you think it does, because you're reading it wrong. Here's why...



Here’s an analogy. You are the new CEO of a supermarket chain, and you sit down for your first board meeting, where the Head of Fruit (yes, it’s a real job) informs you that 40% of your fruit is thrown away because it’s gone bad.

40%, you say. That’s crazy.

Yes, the Head of Fruit says, we’re trying to get it down. 

Of course, if your Head of Fruit knows what he’s doing, he’ll be looking at which fruits are going bad before sale and focusing his efforts on those fruits – perhaps looking at ways to sell them more quickly or package them better. 

It would be wrong to look at the top-level figure and think that you’ve got a problem with all your fruit.

And here we are, looking at website bounce rates and wondering what to do about them. Why do we take our website bounce rate metric so seriously? 

Mainly, it’s Google’s fault. Bounce rate is a very prominent metric within Analytics. Google think it’s so important that they’ve put it right up alongside pageviews and total visits.

But bounce rate doesn’t matter – at least, it doesn’t matter in the way you think it does. 

A single bounce rate metric – one for your whole site – is misleading and uninformative. A bounce rate of 90%, for instance, could be good. A bounce rate of 30% could be bad. And Google doesn’t care about your bounce rate. 

So why do we care at all? 

What actually is bounce rate anyway?

It’s a little murkier than you thought. A bounce is when someone visits a single page and then leaves. But the implication of the word ‘bounce’ is that the visit is short.

A ‘bounce’ can be a ten-minute visit or a ten-second visit. Google Analytics can’t tell you how long those people stayed on the page: it measures time on page as the gap between one pageview and the next, and a bounced visit never sends a second pageview, so single-page visits are always counted as zero seconds.

So not only is measuring an overall bounce rate the wrong thing to do, Google can’t even report accurately on how long a bounce lasted. With a reported time on site of ZERO seconds, it naturally looks bad.
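If it helps to see why, here’s a minimal sketch of those mechanics – a simplified model of how a hit-based tool like Universal Analytics works out time on page, not Google’s actual code. Each pageview’s time is the gap until the next hit, and a bounced visit never sends a second hit:

```python
from datetime import datetime

def time_on_page_seconds(pageview_timestamps):
    """Time on page per pageview, computed hit-to-hit: each pageview's
    duration is the gap until the next hit, and the last (or only)
    pageview has nothing to measure against, so it reports zero."""
    durations = [
        (nxt - current).total_seconds()
        for current, nxt in zip(pageview_timestamps, pageview_timestamps[1:])
    ]
    durations.append(0.0)  # exit page - or the only page, if the visit bounced
    return durations

# A bounce: the reader may have spent ten minutes on the article, but with
# only one hit there is no second timestamp, so it reports as zero seconds.
print(time_on_page_seconds([datetime(2024, 1, 1, 9, 0, 0)]))
# -> [0.0]

# A two-page visit: the first page gets a real duration, the exit page gets zero.
print(time_on_page_seconds([datetime(2024, 1, 1, 9, 0, 0),
                            datetime(2024, 1, 1, 9, 4, 30)]))
# -> [270.0, 0.0]
```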

A 90% bounce rate can be good 

You can only understand engagement if you segment your content properly. 

The Head of Fruit obviously knows that his 40% bad-fruit rate is being dragged up by a 90% bad-pear rate. Melons are running at 1%, so melons are fine.

Therefore, the 40% figure is meaningless.

Blog posts will always have a high bounce rate (they are the pears of your bounce rate fruit world) – especially if you are promoting on social media where the platform is what people naturally return to. 

If you’ve got a 90% bounce rate on a blog post, that means you’re actually moving 10% of readers through to another page – from a piece of content that wasn’t designed for that purpose.

That’s pretty good. 

And if 90% of your traffic is from blog posts, you’ll always have a high overall bounce rate metric. If 90% of sessions land on blog posts bouncing at 90% and the other 10% land on product pages bouncing at 30%, the sitewide figure is (0.9 × 90%) + (0.1 × 30%) = 84% – a number that tells you nothing about either group.

If you have a page that answers a user’s query, you’ve done your job. You ought to be proud of a 100% bounce rate in this case, because you’ve answered the query, fulfilled the user’s intent, and everyone is happy. 

A 30% bounce rate can be bad

Now, if those users haven’t had their query satisfied, they might start hopping around the site. They could spend a single second on each page, thinking “hm, no, not this one… not this one…” and disappear. Your bounce rate is very low – everyone is celebrating. 

But none of your users are satisfied. If they’re not converting, in what world does a 30% bounce rate appear good? It’s irrelevant, at best.

That bounce rate metric on the front page of your Google Analytics is hiding bad news – or hiding good news. 

What if only 20% of your visitors matter?

Let’s not pretend that everyone wants to buy from you, or cares about what you do. In fact, they may be landing on your site for entirely the wrong reason. 

Often, you can’t prevent this. Just like a supermarket can’t prevent people from wandering in, you can’t put a block on your home page saying “you’re not my kind of web visitor, go away.” 

Although it is kind of tempting. 

You need to find a way of profiling those visitors and measuring their engagement, not the engagement of people who are never going to buy from you. 

That’s not easy. 

You could create content groups within analytics to define the types of content that bring the right kinds of visitors, or you could create a filter that excludes blog traffic (which very rarely converts for anyone). You can analyse bounce rates through different sources of traffic, or via different landing pages. 
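As a rough illustration of that segmentation – the data, pages and column names below are made up rather than pulled from a real Analytics export – the same idea works on any session-level data you can get your hands on:

```python
import pandas as pd

# Hypothetical session-level export: one row per visit, recording the
# landing page, the traffic source and whether the visit bounced.
sessions = pd.DataFrame({
    "landing_page": ["/blog/post-a", "/blog/post-a", "/blog/post-b",
                     "/pricing", "/pricing", "/"],
    "source":       ["twitter", "twitter", "facebook",
                     "google", "google", "direct"],
    "bounced":      [True, True, True, False, True, False],
})

# Bounce rate by landing page and by traffic source - the segmented
# numbers that actually mean something.
by_page   = sessions.groupby("landing_page")["bounced"].mean()
by_source = sessions.groupby("source")["bounced"].mean()

# A crude 'filter' that excludes blog traffic altogether.
non_blog = sessions[~sessions["landing_page"].str.startswith("/blog/")]

print(by_page, by_source, sep="\n\n")
print("\nBounce rate excluding blog traffic:", non_blog["bounced"].mean())
```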

All of a sudden, bounce rate starts to mean something. 

So what should we actually measure? 

The bounce rate of certain landing pages is a good metric – and this means you have to dive into analytics to analyse the bounce rate of each individual page. 

But what matters more very much depends on each landing page. If you have a sign-up form, and people who complete it are sent to a thank-you page, then the bounce rate is simply the mirror of your conversion rate.

If you’re using a lightbox instead of a thank you page, you’ll have a 100% bounce rate, and it doesn’t matter. They’ve achieved what you wanted them to achieve. 

In the process, you’ve increased your overall bounce rate. 
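Here’s a quick back-of-the-envelope sketch of that ‘mirror’ relationship – invented numbers, and the simplifying assumption that every visitor either completes the form or leaves without viewing another page:

```python
visits = 1_000
conversions = 180  # completed sign-up forms

# With a separate thank-you page, converting means loading a second page,
# so a conversion is never a bounce: bounce rate mirrors conversion rate.
bounce_rate_with_thank_you_page = 1 - conversions / visits
print(bounce_rate_with_thank_you_page)  # 0.82, i.e. an 18% conversion rate

# With a lightbox, nobody ever loads a second page, so the reported bounce
# rate is 100% - even though exactly as many people converted.
bounce_rate_with_lightbox = 1.0
print(bounce_rate_with_lightbox)
```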

Every page has an aim – so group pages together according to their intent, and define the metrics that matter for that group of pages – it could be time on page, average session duration, pages per visit, conversion rate, total visits…
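One way to make that grouping concrete – the pages, numbers and intent groups below are entirely hypothetical, not a prescribed taxonomy:

```python
import pandas as pd

# Hypothetical page-level metrics, as you might pull them out of Analytics.
pages = pd.DataFrame({
    "page":             ["/blog/post-a", "/blog/post-b", "/pricing", "/signup", "/docs/setup"],
    "bounce_rate":      [0.92, 0.88, 0.45, 0.30, 0.75],
    "avg_time_on_page": [240, 180, 60, 45, 310],      # seconds
    "conversion_rate":  [0.01, 0.02, 0.08, 0.35, 0.03],
})

def intent_group(page: str) -> str:
    """Assign each page to a group based on what it is meant to achieve."""
    if page.startswith("/blog/"):
        return "inform"    # answer a query - judge by time on page, not bounce rate
    if page in ("/pricing", "/signup"):
        return "convert"   # move people to action - judge by conversion rate
    return "support"       # help existing users - bounce rate barely matters

pages["group"] = pages["page"].map(intent_group)

# Report each group side by side, then read only the metric that matches
# the group's aim.
print(pages.groupby("group")[["bounce_rate", "avg_time_on_page", "conversion_rate"]].mean())
```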

And when should you measure bounce rate? 

Some examples of where it makes sense: 

  • When you’ve redesigned a page, and you want to compare performance over a period of time
  • When you’ve developed a feature on your blog page which is designed to entice users to read further articles
  • When comparing similar pages with similar levels of visitor numbers

To stretch the Head of Fruit analogy even further, you can only compare apples with apples (see what I did there?).

The Head of Fruit should not be comparing the gone-off rate of pears with the gone-off rate of lemons, but he could be grouping together the lemons and the limes, which could have similar rates. 

But more than that, he should be learning from the gone-off rate. If the rate at which you’re throwing away a certain fruit is too high, ask yourself why customers aren’t picking that fruit, and play with the things you can actually change: price, packaging, display…

When you consider bounce rate in this light, you can see that it’s good to know, if you know what you’re looking for. 

Otherwise, it’s misleading and just plain wrong.