
Google Analytics myths that just won’t die – part 1


Before we dive into what website analytics can be, it would be helpful to talk through a few things that it isn’t. Here are the first five myths in a two-part series that I’ve encountered more times than I care to count.

Myth #1: If a user views a single page in a session it’s a bounce.

We’ll lead off this series with a particularly pernicious myth that comes up time and time again. Why? Because even Google’s own analytics help documentation is incorrect.

“A bounce is a single-page session on your site.”

Except… when it’s not.

Yes, I just contradicted the Big G.

In actuality, a bounce is defined as any session that sends exactly one interaction hit to Google's servers before the session timeout expires (the default in Google Analytics is 30 minutes).

The hit types defined by Google include pageviews, events, and certain ecommerce and social interactions.

What this means is that a session can contain a single pageview, but if the user also performs an interaction event (say, playing a video that you track in GA), then it won't be a bounce!
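To make that concrete, here's a minimal sketch of what such an event hit might look like with the Universal Analytics analytics.js library (the "VideoPlayer" category and "play" action are illustrative names, not a required schema):

```ts
// Minimal sketch assuming the Universal Analytics (analytics.js) API.
declare function ga(command: string, ...fields: unknown[]): void;

function onVideoPlay(videoTitle: string): void {
  // An event is an interaction hit by default, so a session with one
  // pageview plus this hit will not be counted as a bounce.
  ga("send", "event", "VideoPlayer", "play", videoTitle);
}
```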

It gets even stranger, though. You can have a session without a pageview! If a user is already on your site and leaves a page open without interacting for more than 30 minutes, that session ends. But then, let's say they click to watch a video that you track via GA. That fires an event, but no pageview. Don't believe me? I set up a sample segment on Google's own demo analytics account. It's not exactly a lot of traffic, but it's there!

Tip: Content-heavy sites sometimes choose to fire an interaction event after a set period of time (say, 30 or 60 seconds). That way, even if a visitor only views a single page but spends a lot of time digesting the content, that won’t be seen as a negative experience in analytics.
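A minimal sketch of that timer approach, again assuming analytics.js (the event names and the 30-second threshold are placeholders to tune for your own content):

```ts
// Sketch of an "adjusted bounce rate" timer, assuming analytics.js.
declare function ga(command: string, ...fields: unknown[]): void;

const ENGAGEMENT_DELAY_MS = 30_000; // 30 seconds; tune for your content

setTimeout(() => {
  // Event hits count as interactions by default, so a reader who stays
  // this long on a single page is no longer recorded as a bounce.
  ga("send", "event", "Engagement", "time_on_page_30s");
}, ENGAGEMENT_DELAY_MS);
```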

Myth #2: My bounce rate went up. That’s terrible! (or) My bounce rate went down! Success!

Any analytics number in a vacuum is meaningless. Pages can have wildly different bounce rates and, without additional context, we can’t say whether that’s good or bad.

Perhaps you have a post on your site giving information about an upcoming event and it links to a registration signup page handled on a different site.

You send a lot of traffic to the page from your social accounts and maybe an email newsletter. The page could have a 90% bounce rate, but if you saw a lot of people clicking through to register for the event, perhaps it was a smashing success!

There’s no such thing as a “good” bounce rate. It depends on many variables. What is your market? What is the page and its purpose? What are the traffic sources that are leading into that page? Is your analytics implementation set up correctly? Are your events on the page properly configured as interaction or non-interaction events as needed? (psst: If you said “huh” at that last part, check out what Google has to say about that. If you’re still saying “huh,” drop me a line.)
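For reference, here's a quick sketch of that interaction vs. non-interaction distinction, again assuming analytics.js (the categories, actions, and labels are made up for illustration):

```ts
// Sketch of interaction vs. non-interaction events, assuming analytics.js.
declare function ga(command: string, ...fields: unknown[]): void;

// Counts as an interaction: a session containing this hit is not a bounce.
ga("send", "event", "Downloads", "click", "whitepaper.pdf");

// Does NOT count as an interaction: good for passive measurements
// (e.g., an element scrolling into view) that shouldn't affect bounce rate.
ga("send", "event", "Visibility", "banner_seen", { nonInteraction: true });
```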

Never stop investigating at the first data point. It's important to always understand the goal of the site and how that number fits into your larger strategy.

Myth #3: This report uses sampled data and therefore can’t be trusted.

Do any of you feel a rising panic whenever you see the dreaded orange sampling shield in GA?

First, to debunk this myth, let's run through a quick lesson about sampling. In some situations, the free version of Google Analytics may sample your reports. Google will begin sampling when both of the following conditions are met:

1) the time period in your report contains over 500,000 sessions AND

2) you're viewing something other than a standard precompiled report. Most standard reports in GA are unsampled at any date range unless you add something like a secondary dimension or segment. In those cases, Google doesn't have that data aggregated already, so to save computational resources it samples it.

OK. So you are looking at a sampled report. There are exceptions to every rule, but generally, you should be most concerned when you are looking at a data set that contains relatively low numbers (maybe you're looking at a conversion that typically happens only a couple of times per day) and the sample is based on a relatively small percentage of traffic. In those cases, the data might be dramatically skewed by sampling.

But there’s no need to panic if you’re looking at a report that has a sampling rate of 85% and is reporting out on hundreds of thousands of pageviews. The sampled data is going to be extremely close to the actual data in those cases and it will certainly be close enough to gain useful insights. Don’t get hung up on sampling fears in those situations!
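If you want a rough sanity check, here's a back-of-envelope sketch of the margin of error you'd expect at different sample sizes. It assumes simple random sampling with a finite population correction, which is a simplification, not Google's actual sampling algorithm:

```ts
// Rough sketch: 95% margin of error for a rate estimated from a sample,
// assuming simple random sampling with a finite population correction.
function marginOfError(rate: number, sampleSize: number, population: number): number {
  const fpc = Math.sqrt((population - sampleSize) / (population - 1));
  const standardError = Math.sqrt((rate * (1 - rate)) / sampleSize) * fpc;
  return 1.96 * standardError; // 95% confidence
}

// 85% sample of 500,000 sessions, estimating a 40% bounce rate:
console.log(marginOfError(0.4, 425_000, 500_000)); // ≈ 0.0006, i.e. ±0.06 points

// 10% sample of 10,000 sessions, estimating a rare 0.2% conversion rate:
console.log(marginOfError(0.002, 1_000, 10_000)); // ≈ 0.003, larger than the rate itself
```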

(Screenshots omitted: one sampling notice that's not worth worrying about, and one that is worth worrying about if you're looking at a small dataset.)

Myth #4: Campaign A had a conversion rate of 4% and Campaign B had a conversion rate of 2%. Campaign A was twice as effective.

Hold up. That’s quite a leap there. It’s always important to understand the context for the data you’re looking at.

What if the CPC of Campaign A was $20/click while the CPC of Campaign B was $4? You could be spending $5,000 for 10 leads with Campaign A or the same amount for 25 leads with Campaign B. Assuming the leads are of equal quality, would you rather have 10 or 25? I know where I’d be putting my money.
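Here's that arithmetic as a quick sketch, using the made-up figures from the example above:

```ts
// Worked version of the arithmetic above (figures are from the example,
// not real campaign data).
function leadsForBudget(budget: number, cpc: number, conversionRate: number): number {
  const clicks = budget / cpc; // how many clicks the budget buys
  return clicks * conversionRate;
}

console.log(leadsForBudget(5_000, 20, 0.04)); // Campaign A: 10 leads
console.log(leadsForBudget(5_000, 4, 0.02));  // Campaign B: 25 leads
```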

Although the first numbers you see in many reports can seem conclusive, there's often more to the story behind the scenes.

Myth #5: Correlation implies causation

OK. So this is a very old fallacy that pops up in many different fields. But it's still one I see rearing up repeatedly when reviewing analytics data with others. It usually goes something like this… "Our overall site conversion rate is 2%, but users who watch a video on our site convert at a rate of 6%. Ergo, let's pay for and produce a lot more videos and put them everywhere on our site!"

The problem is that the people who are engaged and browsing your site are already more likely to convert. What you haven’t shown is that the video influenced those conversions. Those users very well may have converted at the exact same rate (or perhaps even better) regardless of the video.

Tip: This would be a great thing to explore with an A/B test. That can help you find out whether having a video on a page actually increases your conversions.
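If you want to eyeball the result of such a test, here's a minimal sketch of a two-proportion z-test (the visitor and conversion counts are hypothetical; for a real decision you'd want a proper stats tool):

```ts
// Minimal sketch of a two-proportion z-test for an A/B test
// (page with video vs. page without). Assumes independent samples.
function zTest(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pA - pB) / se;
}

// Hypothetical counts: |z| > 1.96 suggests a real difference at ~95% confidence.
console.log(zTest(120, 2_000, 90, 2_000).toFixed(2)); // ≈ 2.13
```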

Stay tuned: next week, I'll share another collection of Google Analytics myths that just won't die.