Are your bounce rate numbers accurate?

The point of analytics is to measure customer behavior. If we’re measuring bounce rates, we want to know when customers bounced and why, and to strategize ways to keep them from being so dang bouncy. Colloquially, we define a bounce as any time a user navigates away from a website after viewing only one page. But if you manage one of the millions of websites using Google Analytics, your bounce rates may mean something different:

In [Google] Analytics, a bounce is calculated specifically as a session that triggers only a single request to the Analytics server, such as when a user opens a single page on your site and then exits without triggering any other requests to the Analytics server during that session.

— Google Analytics Help

In a standard Google Analytics setup, this mechanism should accurately calculate bounce rates because, typically, requests to the Analytics server only happen when the page loads. But if your page’s tracking structure is more complex – for example, you’re selling lots of different products via ecommerce – you might have a problem on your hands.
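
To make that concrete, here’s a minimal sketch, in analytics.js terms, of the standard setup described above (the property ID is a placeholder). The pageview on load is the only hit sent, so a one-page visit produces exactly one request to the Analytics server and registers as a bounce:

    // Minimal sketch of a standard analytics.js setup. 'UA-XXXXX-Y' is a
    // placeholder property ID; analytics.js itself is assumed to be loaded
    // by the usual snippet.
    declare const ga: (...args: unknown[]) => void;

    ga('create', 'UA-XXXXX-Y', 'auto');
    ga('send', 'pageview'); // the single request to the Analytics server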

Imagine the following scenario: You own a site that sells t-shirts (here I’m using Google’s Ibby’s T-Shirt Shop training tool as an example). When people get to your homepage, they’re presented with a grid of your different t-shirt options.

Above: Screenshot from https://enhancedecommerce.appspot.com/

As a savvy, strategic web marketer, you’re interested in understanding your site’s sales funnel. So you use Google’s Enhanced Ecommerce tracking method. This allows you to count, among other things, the number of times a user views any particular t-shirt on any particular page. Google calls this a product impression.

To count that impression in Google Analytics, you’ll need to fire a tag, and that tag may inadvertently trigger an additional request to the Analytics server.

Therein lies the problem: That extra call to the Analytics server, if not handled properly, can break Google’s bounce rate mechanism. Every time a user hits your homepage, the Analytics server gets called twice:

  1. On page load
  2. On the firing of the product impression tag

If the user stays on the page long enough for the product impression tag to fire but then leaves immediately after, Google will not count this as a bounce. This can be confusing, because the actual user behavior matches the colloquial definition of a bounce.
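
Here’s a rough sketch of what that problematic second hit might look like if the impression were sent through analytics.js with the Enhanced Ecommerce plugin. The product details below are made up for illustration; the point is that the impression rides along on an event hit, which Analytics counts as an interaction:

    // Hypothetical sketch of the problematic setup: an Enhanced Ecommerce
    // impression sent as a plain analytics.js event. Product details are
    // invented for illustration.
    declare const ga: (...args: unknown[]) => void;

    ga('require', 'ec'); // Enhanced Ecommerce plugin

    ga('ec:addImpression', {
      id: 'SKU-123',          // hypothetical product ID
      name: 'Ibby Crew Neck', // hypothetical product name
      list: 'Homepage Grid',
      position: 1,
    });

    // Impression data is sent along with the next hit. Sending it as an
    // event fires the second request described above, and an event counts
    // as an interaction by default.
    ga('send', 'event', 'Ecommerce', 'Product Impression');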

I learned this lesson the hard way. After spending nearly a month on a complex enhanced ecommerce setup for a client — working with their developers to ensure everything was installed correctly, troubleshooting rigorously and making sure everything worked properly — we launched, and their bounce rate trend shifted dramatically:

If you’ve been working on the internet for a while, you know those single-digit bounce rates at the end of the graph aren’t normal. They were the result of adding this new product impression tag without declaring it a non-interaction event (a small but important detail on which Bounteous has a great writeup).
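
Sticking with the hypothetical analytics.js event from the sketch above, the fix is to send the impression hit with the nonInteraction flag set so it no longer affects the bounce calculation (in Google Tag Manager, this corresponds to the tag’s Non-Interaction Hit setting):

    // Sketch of the fix: the same hypothetical impression event, now marked
    // as a non-interaction hit so single-page visits still count as bounces.
    declare const ga: (...args: unknown[]) => void;

    ga('send', 'event', 'Ecommerce', 'Product Impression', {
      nonInteraction: true,
    });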

When it comes to metrics, colloquial definitions don’t always line up with the realities of the tracking mechanism. This is a problem for the typical non-technical Google Analytics user who simply wants reliable website metrics for use in marketing strategy.

Other potentially misleading Google Analytics metrics

Bounce rates aren’t the only place where a default Google Analytics setup falls short of our human definitions. The New vs. Returning User report allows you to see the difference between first-timers and returning visitors, right? Well, again, kind of.

Google Analytics assigns each user a Client ID, a randomly generated string of numbers that identifies not a person but a particular device and browser combination. Typical users access websites from multiple devices (phone, personal laptop, work computer, etc.), and Google Analytics generates a different Client ID for each. So the same person visiting the site for the first time on their phone and then again on their laptop is counted as two new users. Data analysts are most likely aware of how difficult cross-device tracking is, but most clients and agencies I’ve worked with – especially those doing non-technical marketing work – take the “New vs Returning” moniker at face value. Teams making decisions based on this metric should be aware of this nuance.
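
If you’re curious where that Client ID lives, here’s a small sketch (assuming Universal Analytics, which stores it in the first-party _ga cookie). Because the cookie exists only in the browser that set it, the same person on a second device or browser gets a brand-new ID:

    // Hypothetical sketch: reading the Client ID that Universal Analytics
    // stores in the _ga cookie, e.g. "GA1.2.1234567890.1600000000". The
    // Client ID is the last two segments (random number . first-visit
    // timestamp).
    function getGaClientId(): string | undefined {
      const match = document.cookie.match(/(?:^|;\s*)_ga=GA\d+\.\d+\.(\d+\.\d+)/);
      return match?.[1];
    }

    console.log(getGaClientId()); // a different value on every device/browser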

Here’s another fun one: Sessions aren’t always sessions. Google Analytics ends sessions after 30 minutes of inactivity, at midnight local time, or when UTM parameters change. That last point is an especially tricky one. You typically see UTM parameters expressed after a question mark in tracking URLs:

www.example.com/?utm_source=source&utm_medium=medium&utm_campaign=campaign&utm_content=content
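
Pulled apart, those parameters look something like this (a minimal sketch using the browser’s built-in URL API; the values are just the placeholders from the example URL above):

    // Sketch: reading campaign parameters off a landing URL. These are the
    // values Analytics uses for session attribution when they're present.
    const landing = new URL(
      'https://www.example.com/?utm_source=source&utm_medium=medium&utm_campaign=campaign&utm_content=content'
    );

    const utms: Record<string, string | null> = {};
    for (const key of ['utm_source', 'utm_medium', 'utm_campaign', 'utm_content']) {
      utms[key] = landing.searchParams.get(key);
    }

    console.log(utms);
    // { utm_source: 'source', utm_medium: 'medium',
    //   utm_campaign: 'campaign', utm_content: 'content' }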

However, Google Analytics assigns source, medium, and campaign values to every session whether UTM parameters are declared in the URL or not. There’s a good chance you’ve signed up for an account on a website using a “Log in with Facebook” prompt. Clicking that prompt usually takes you to Facebook momentarily to complete registration before redirecting you back to the original site.

On a past project, a client noticed an abnormal trend in their Google Analytics: A large number of users sourced from Facebook were completing transactions specifically on their second session. Stranger still, the landing page for that second session sat in the middle of the checkout process.

It turned out the client had recently added Facebook Login near the beginning of the checkout process. Users browsed the site and filled their carts, but they weren’t asked to create or log into an account until checkout. Those who chose to log in with Facebook were briefly taken to Facebook, and the redirect back changed the session’s campaign parameters, ending the original session and starting a new one. Effectively, each visit was counted as two sessions, and Facebook was given all the credit for the sale.

Why this all matters

We make decisions based on metrics, and what we measure affects what we do. Our job as marketers is not only to understand what data we want but also to ensure we’re properly calibrating the mechanisms used to collect that data. Inaccurate data leads to fairy tale insights into the customer journey, and this leads to bad strategy. While we can’t change the mechanisms that make Google Analytics work, we can be aware of its nuances to make better decisions in tracking and strategy.

Patrick Brown is Director of Media & Analytics at Bradley and Montgomery (BaM), an independent creative agency that has worked with brands including JPMorgan Chase, Microsoft, and Xbox.
