© 2019 SourceMedia. All rights reserved.

4 governance warning signs when reviewing presented analytics

With the sales industry having existed in some form since time immemorial, the public consciousness is fully aware of the rhetorical tricks of marketers and sales staff. The patter, the flattery, the upselling — it’s all familiar enough that we broadly know what to expect when we deal with sales assistants.

But with time comes the development of new presentation tools and methods that complicate things, requiring savvy business people to learn how they work so they can guard against them. The customizable analytics suite (of which Google Analytics is the prime example) is a particularly fearsome tool, able as it is to produce compelling charts, graphs and percentages that can easily be re-framed to push whatever conclusion the presenter is seeking.

Let’s run through why it’s so important that you know how to interpret analytics data, why it’s dangerous to assume that presented data is representative, four warning signs that you’re dealing with sweetened data, and how you can generally guard against trickery.

Why you need to be able to interpret analytics data

In the business world, presentations are important and extremely common. The larger a business becomes, the more likely it is to explore the possibility of acquiring (or investing in) other assets and companies, or even consider merging with a competitor for mutual benefit. With each option will come some form of pitch and an exchange of performance metrics.

But budding entrepreneurs face their fair share of presentations too, in the form of case studies: whether you’re looking to buy a website, choose a hosting service, or sign up to some scalable SaaS systems, you’re going to need to make tough decisions based on performance. Case studies can give you the context you need to decide.

So it really doesn’t matter what exactly your professional circumstances are. If you aspire to great things in the business world, you must know how to parse analytics data and understand precisely how significant it really is. Sometimes you’ll have someone to help you with it, but not always — and there are inherent risks to trusting consultants, no matter how solid their reputations may be. In the end, you’ll need to offer some kind of useful input.

The dangers of assuming analytics data to be representative

In every kind of business negotiation, each party is trying to get the best possible deal for their side of the equation. That’s entirely understandable, but it leads to people taking liberties with the truth when the time comes to break out some performance data. If you’re in a presentation and the presenter brings up a chart showing smooth upward progress, they might say that it’s a chart of their sales, but you won’t actually know unless you take a very close look.


What if you trust the presenter, though? Surely then you can just take their word for it? Unfortunately, even if they are as trustworthy as you believe them to be, their competence might not be on the same level. Presenters don’t always create their own presentations, and even when they do, their ability to create presentations doesn’t guarantee that their work is even close to representative. They might genuinely feel that their chart shows incredible sales performance, all because they never read the fine print.

And when you simply shrug your shoulders and opt to believe the narrative you’re told, you’re assuming a great deal of risk. It could be that the business that looks so strong is actually a dud and will start dragging your operation down the moment you acquire it. Something like Google Data Studio can easily make the most nonsensical data set look professional, after all.

That aside, let’s go through 4 warning signs to look out for:

Warning sign #1: Suspiciously-round numbers

However round a business might make its pricing, performance stats very rarely happen to sum to neat round numbers. If you’re reviewing some analytics and you spot that every number happens to be curiously round, that’s a major red flag and cause to challenge the data. Auditors even treat clusters of round figures as an indicator of possible malfeasance.

Of course, rounded numbers don’t always stem from efforts at deceit, or even inaccuracy in the source data — sometimes they’re even rounded down just to neaten things up (perhaps thinking that boasting 100k Twitter followers sounds more impactful than boasting 100,317), but when you’re trying to make a decision about value, you need the data in a plain, unaltered state. It’s always better to stick to the legitimate figures.
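If you receive figures in a spreadsheet rather than a slide, you can even automate the check. Here's a minimal sketch, assuming a hypothetical report of metric names and values — treating any figure that ends in several zeros as worth questioning:

```python
# Flag suspiciously round figures in a report. The metric names and
# values below are hypothetical examples, not real data.

def is_suspiciously_round(value, min_trailing_zeros=2):
    """True if an integer ends in at least `min_trailing_zeros` zeros."""
    if value == 0:
        return False
    return value % (10 ** min_trailing_zeros) == 0

report = {
    "monthly_visits": 120000,
    "newsletter_signups": 4500,
    "conversion_events": 317,   # a "natural"-looking figure
}

flagged = [name for name, value in report.items()
           if is_suspiciously_round(value)]
print(flagged)  # monthly_visits and newsletter_signups look rounded
```

One or two round numbers prove nothing, of course — the warning sign is a report where nearly everything comes up round.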

Warning sign #2: Arbitrary metrics

Each business will have distinct KPIs — that’s perfectly normal, resulting from differences in goals, methods, and products and/or services. But unless a metric speaks for itself (as in the case of Net Profit, for instance), its presence must be clearly justified, or else give rise to the suspicion that it was included simply because it looked positive. You need to know which stats matter and why.

Using default Google Analytics metrics, for instance, I could likely assemble a positive-sounding full-page report about almost any website you care to mention, leaning heavily on myriad meaningless metrics that happen to sound good. If you see a performance report that prominently boasts of a 75% monthly rise in Twitter referrals, there’s a decent chance that it’s referring to a total of 7 Twitter referrals, up from 4 the previous month in a wholly insignificant and arbitrary “improvement.”
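The arithmetic behind that illusion is worth seeing plainly. A quick sketch, using the 4-to-7 referral figures from the example above:

```python
# How a tiny absolute change becomes a dramatic-sounding percentage.

def percent_change(before, after):
    """Relative change from `before` to `after`, as a percentage."""
    return (after - before) / before * 100

before, after = 4, 7  # Twitter referrals last month vs. this month
print(f"{percent_change(before, after):.0f}% rise")   # 75% rise
print(f"absolute gain: {after - before} referrals")   # just 3 extra visits
```

Whenever a percentage is quoted without the absolute numbers behind it, ask for them — the base may be too small for the percentage to mean anything.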

Warning sign #3: Unexplained timeframes

Since performance is to be demonstrated over time, it’s entirely standard to chart year-on-year or month-on-month metrics. Done transparently with the right metrics, it’s a great way of honestly and quickly showcasing the overall performance of a business, but it isn’t always done in such a thoughtful way — that’s where unexplained timeframes come in.

For instance, you might look at a report and see three highly-positive charts, only to look more closely and notice that they use radically different timeframes. One goes back two years month-on-month, but another dates back to the start of the year, while the third only covers the last 30 days. This is a worrying sign that timeframes have been chosen specifically to avoid showing negative results — and if you only get to see the positive results, you’ll leave with a very inaccurate idea of the value of the business.
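Spotting this by eye across a long report is tedious, but if you can get the date ranges behind each chart, the check is mechanical. A minimal sketch, with hypothetical chart names and date ranges:

```python
# Check that every chart in a report covers the same period.
# The chart names and date ranges are hypothetical examples.
from datetime import date

charts = {
    "sales":       (date(2017, 1, 1), date(2019, 1, 1)),  # two years back
    "traffic":     (date(2019, 1, 1), date(2019, 6, 1)),  # year to date
    "conversions": (date(2019, 5, 1), date(2019, 6, 1)),  # last 30 days
}

ranges = set(charts.values())
if len(ranges) > 1:
    print("Warning: charts use different timeframes:")
    for name, (start, end) in charts.items():
        print(f"  {name}: {start} to {end}")
```

A mismatch isn’t proof of deception, but it’s exactly the cue to ask why each chart got the window it did.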

Warning sign #4: Cherry-picked comparisons

To lend some additional context, it can be useful to provide industry comparisons in an analytics-heavy presentation. After all, a conversion rate outside of any context doesn’t mean that much — if you can’t relate it to the previous conversion rate of that system, you can relate it to the industry average, showing superior performance and making it easier to gauge value.

But what if you’re reading a report and you see that the average post length of a blog is 2.5k words, far above the average of the industry-leading competitor? You might well wonder why that specific thing merited a mention, and further inspection might reveal that every other metric falls short relative to the competitor. The presenter simply searched for a metric (any metric) that could allow a favorable comparison to a hit website, all in the hope that people would see it and draw wild conclusions about overall relative quality.

Always factor in the bias of the presenter

We’ve been through why you need to know how to interpret data, how it can be very damaging to assume that data is representative, and 4 warning signs to look out for when you’re reviewing a presentation: but your skepticism shouldn’t begin and end there. It isn’t always fun to be doubtful of everything you read, but it’s essential that you find a way to do it.

To that end, always factor in the bias of the presenter. With a vested interest in the consequences of the presentation, they’ll be eager for the data to come across in a particular way, so anticipate that and be ready to ask follow-up questions if needed. If you have to directly request an unfiltered view of the source analytics, do so.

Keep in mind that it can be quite hard to be dispassionate about business, particularly when you’re talking about the value of something you’ve put a lot of work into — if you were to try selling your business one day, you’d likely discover that for yourself. So don’t hold it against someone if they’re a little too eager to dismiss the negatives. It’s possible to be polite, assume good intentions, and remain skeptical about everything presented to you.

In the end, being too trusting opens you up to manipulation and exploitation, so don’t leave yourself vulnerable. Know what data really means, and never be fooled again.
