Bounce rate is NOT a ranking factor – SEO myths

Google has repeatedly said that bounce rate is not used in rankings. Neither the evidence nor simple logic supports the idea, but the myth persists.

There are two misconceptions wrapped up in one with bounce rate. First, that search engines use it as a ranking factor. Second, that it’s a reliable measure of user experience and page quality.

Where does this myth come from?

Google has said it wants to understand how users experience the pages it serves in its results. Many have speculated that it might use the bounce rate measured in Google Analytics. The thinking goes that a page with a low bounce rate probably offered a better experience: the user remained on the site, so they must have been satisfied with the content. Proponents also point to studies claiming to show a correlation between low bounce rates and good rankings.

Those who should know better continue to perpetuate it. For example, as of February 2024, popular SEO advice site Backlinko, now owned by Semrush, still includes on-page engagement metrics in its “complete list of ranking factors”. Other pages on both the Backlinko and Semrush sites also suggest that bounce rate is a ranking factor. (I’ve decided not to link to their pages because inaccuracies are plentiful. Google it if you want to.)

Of course, correlation does not equal causation, and this one doesn’t stand up to scrutiny. Let’s unpick it by looking first at why using data from Google Analytics isn’t desirable, then examining the metric itself.

Data isn’t universally available or reliable

Logically, there are some obvious reasons why it would be a terrible idea for Google to use any data, let alone bounce rate, from a site’s Google Analytics to help determine rankings:

  • Not every site uses Google Analytics. Some sites use other providers. Some have no analytics package at all. Estimates suggest about half of all websites use it, but that share is falling among the web’s top sites.
  • Google Analytics does not track every document. For example, Google search results frequently link to PDF documents. Google Analytics can only track clicks to PDFs from within the website. The document itself does not contain the tracking code needed to record direct entrances or views.
  • Google Analytics data is often not as reliable as people assume it is. Mistakes in tracking configuration happen, a lot. Website changes can cause it to break for extended periods. If Google wanted to use this data in its algorithms, it would need to discard data from sites with flawed tracking, which would mean auditing them all.
  • It’s open to trivially easy manipulation. If you want a low bounce rate, just have your tracking code fire a second page view (see the sketch after this list). That’s another reason why Google’s algorithm would first have to audit the data to ensure it’s robust.
  • Sites receive a lot of irrelevant traffic, and not all of it is human. One of the most common issues with Google Analytics data is the flood of bot traffic it records. And there’s no guarantee the owner has set filters to exclude internal traffic either. Google’s search algorithm is not there to serve great results to automated bots.
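
To show just how trivial that manipulation is, here’s a minimal sketch, assuming a GA4 property tracked with the standard gtag.js snippet (the timing and parameters are illustrative, not a recommendation). Firing a second page_view shortly after the real one is enough to stop most sessions counting as bounces, while telling you nothing about user experience.

```typescript
// Illustration only: how a site could artificially deflate its bounce rate.
// Doing this just corrupts your own analytics data.

// gtag.js exposes a global `gtag` function once the standard snippet has loaded.
declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>,
): void;

// A second page_view shortly after the real one means the session is no
// longer a single-interaction session, so it is not counted as a bounce.
window.setTimeout(() => {
  gtag('event', 'page_view', {
    page_title: document.title,
    page_location: window.location.href, // same URL, deliberately sent again
  });
}, 1000);
```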

Bounce rate is a poor measure of user experience

Some users bounce because they found exactly what they wanted straight away and left the site. They come back later, in a new session, and make a purchase. That’s a sign of good experience, but it increases the bounce rate.

Others can’t find what they wanted and have to use the site’s search bar. Or they land on the site and immediately have to dismiss a bunch of intrusive pop-ups. Those things damage user experience, but they lower the bounce rate.

Marketers want to keep people on their websites for longer and love to decrease bounce rate. Unfortunately, the easiest way to do this is to make the site more frustrating and time-consuming to use.

In fact, even the concept of bounce rate is fundamentally flawed. It suggests that a second action is required before the user has engaged with the content. The very act of visiting a page is a form of engagement.

Using bounce rate as a marketing KPI is bad practice because it’s so devoid of situational context. The same applies to the concept of Google using it for ranking. Indeed, Google initially left bounce rate out of GA4 entirely, only bringing it back (in altered form) after a backlash.
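
For reference, the GA4 version of the metric is no longer “single-page sessions”. As far as the public documentation goes, it’s simply the inverse of engagement rate: the share of sessions that were not “engaged”, where an engaged session roughly means one that lasted at least 10 seconds, recorded a key event, or had two or more page views. A rough sketch of that calculation, under those assumptions:

```typescript
// Rough sketch of how GA4 appears to derive bounce rate (not an official formula).
// A session counts as "engaged" if it lasts 10+ seconds, records a key event,
// or has two or more page/screen views; bounce rate is the share of sessions
// that are NOT engaged, i.e. 1 - engagement rate.

interface Session {
  durationSeconds: number;
  keyEvents: number; // "conversions" in older GA4 terminology
  pageViews: number;
}

function isEngaged(s: Session): boolean {
  return s.durationSeconds >= 10 || s.keyEvents > 0 || s.pageViews >= 2;
}

function bounceRate(sessions: Session[]): number {
  if (sessions.length === 0) return 0;
  const engaged = sessions.filter(isEngaged).length;
  return 1 - engaged / sessions.length;
}

// A 9-second, single-page, no-key-event visit still counts as a bounce.
console.log(bounceRate([{ durationSeconds: 9, keyEvents: 0, pageViews: 1 }])); // 1
```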

If not bounce rate, then what?

We have clear evidence that Google has more reliable ways to measure user satisfaction. It likely relies on a combination of methods across the areas below.

Search and click logs

Google doesn’t need to use a website’s Google Analytics metrics. It has its own datasets, which don’t rely on outside parties to create and maintain (it doesn’t even use Google Analytics on its own search results). This data comes in the form of search and click logs, which record how people use Google Search and how they interact with the results.

These logs tell Google what keywords people used, in what sequence, what they clicked, how they scrolled and so on. Exactly how Google uses this data, we cannot be sure. Many people conflate use of clicks with “click-through rate”, another messy metric. A click from a SERP to a specific result is just one type of click that Google could record.
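
Purely as an illustration (this is speculation, not Google’s actual schema), a single click-log entry might capture something like the following:

```typescript
// Hypothetical shape of a search/click log entry, for illustration only.
// Google's real logging is not public; these fields simply mirror the kinds of
// signals described above: the query, its place in a sequence, what was clicked,
// and coarse interaction data such as scrolling and time before returning.

interface ClickLogEntry {
  queryId: string;              // identifies the search within a session
  previousQueryId?: string;     // links query refinements into a sequence
  query: string;
  clickType: 'result' | 'ad' | 'feature' | 'none';
  resultUrl?: string;           // absent if nothing was clicked
  resultPosition?: number;      // rank of the clicked result on the page
  scrollDepthPercent?: number;  // how far down the results page the user scrolled
  secondsBeforeReturn?: number; // time before coming back to the results page
  timestamp: string;            // ISO 8601
}

const example: ClickLogEntry = {
  queryId: 'q-001',
  query: 'is bounce rate a ranking factor',
  clickType: 'result',
  resultUrl: 'https://example.com/bounce-rate-myth',
  resultPosition: 3,
  scrollDepthPercent: 40,
  secondsBeforeReturn: 95,
  timestamp: '2024-02-01T09:30:00Z',
};
```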

The use of click log metrics in the ranking algorithms probably explains why there are correlations with Google Analytics metrics. Details matter, so we still need to be clear that the latter are not part of the ranking algorithm.

Google Chrome user data

We know for certain that Google uses a form of this data in measuring the page experience ranking factor. The source is the Chrome User Experience Report (CrUX), a dataset that can be queried publicly via Google Cloud applications.
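
As a concrete example of that public access, here’s a minimal sketch that queries the CrUX API for an origin’s field data. It assumes the documented records:queryRecord endpoint and response shape; you’d need your own Google Cloud API key, and https://example.com is a placeholder.

```typescript
// Minimal sketch: fetch the 75th-percentile Largest Contentful Paint for an
// origin from the Chrome UX Report (CrUX) API. Assumes a valid API key and
// that the origin has enough traffic to appear in the dataset.

const API_KEY = 'YOUR_GOOGLE_CLOUD_API_KEY'; // placeholder

async function fetchCruxLcpP75(origin: string): Promise<number | undefined> {
  const response = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin, formFactor: 'PHONE' }),
    },
  );
  if (!response.ok) {
    throw new Error(`CrUX API error: ${response.status}`);
  }
  const data: any = await response.json();
  // p75 LCP in milliseconds, if present in the response.
  return data?.record?.metrics?.largest_contentful_paint?.percentiles?.p75;
}

fetchCruxLcpP75('https://example.com')
  .then((p75) => console.log('p75 LCP (ms):', p75))
  .catch((err) => console.error(err));
```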

How else Google might use this wealth of browsing data is a matter for research, for example through patent applications and research papers. What’s clear is that Chrome is another source of data that’s more reliable than Google Analytics: Google’s engineers control it and have better methods for ensuring the usage they’re recording comes from legitimate users.

The Search Quality Raters

To find out whether people think something is good or bad, and why, you really need to ask them. No amount of quantitative data can explain how people feel.

Google understands this. It adds context to its data through the qualitative feedback delivered by its Search Quality Rater Program. This feedback benchmarks the quality of its results and indicates whether proposed changes to algorithms will improve them.

Busy marketers rely too heavily on quantitative data to measure user experience because it’s easy; user research is more time-consuming. Unfortunately, this tends to get in the way of a proper understanding. As Jerry Muller wrote in his book The Tyranny of Metrics, “what is most easily measured is rarely what is most important, indeed sometimes not important at all”.

