My previous post cautioned that while becoming data-driven is a laudable goal, we should be wary of becoming too driven by data, especially when we allow our understanding of a complex system to be driven by a single metric—and then incentivize performance based on that metric. Chip and Dan Heath called this the focusing illusion. “You don’t live in a one-variable world,” the Heaths explained. “In your complicated, squishy world, if you’re dreaming up an incentive plan, you’re almost certainly in the grips of a focusing illusion. You’re trying to maximize or optimize or minimize something.”

Of course, we do this with the best of intentions. We collect, report, and monitor data to use as key performance metrics for such things as maximizing sales, optimizing use of customer contact data, or minimizing data quality issues. While it is certainly easier to focus our attention on one metric, doing so ignores the other variables in a complex equation. More sales might conceal low profit margins or high operating costs. Emailing every customer on every marketing campaign may earn you a reputation as a spammer. And high-quality data unfortunately doesn’t guarantee high-quality business decisions.
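To make the sales example concrete, here is a minimal sketch with hypothetical quarterly figures (the numbers and field names are illustrative assumptions, not from the post). Tracking only total sales, the business looks like it is improving; adding a second variable, profit margin, tells a different story.

```python
# Hypothetical quarterly figures showing how a single metric
# (total sales) can conceal a falling profit margin.
quarters = {
    "Q1": {"sales": 100_000, "costs": 70_000},
    "Q2": {"sales": 130_000, "costs": 110_000},  # sales up 30%, but...
}

margins = {}
for name, q in quarters.items():
    profit = q["sales"] - q["costs"]
    margins[name] = profit / q["sales"]
    print(f"{name}: sales={q['sales']:,} profit={profit:,} margin={margins[name]:.0%}")
```

An incentive plan keyed to sales alone would reward Q2, even though the margin fell from 30% to roughly 15%.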

Furthermore, the focusing illusion also makes us more susceptible to what is known as the illusion-of-truth effect.

As David Eagleman described this cognitive bias, “you are more likely to believe that a statement is true if you have heard it before—whether or not it is actually true. In one study, subjects rated the validity of plausible sentences every two weeks. Without letting on, the experimenters snuck in some repeat sentences (both true and false ones) across the testing sessions. And they found a clear result: if subjects had heard a sentence in previous weeks, they were more likely to now rate it as true, even if they swore they had never heard it before. This is the case even when the experimenter tells the subjects that the sentences they are about to hear are false. Despite this, mere exposure to an idea is enough to boost its believability upon later contact. The illusion-of-truth effect highlights the potential danger for people who are repeatedly exposed to the same religious edicts or political slogans.”

The illusion-of-truth effect also helps explain why, although most organizations acknowledge the importance of data quality, they don’t believe that data quality issues occur very often. The data made available to end users in dashboards and reports often passes through many processes that cleanse or otherwise sanitize it before it reaches them. Therefore, even an organization that regularly evaluates its data quality is likely to rate itself as having excellent data quality, because it is repeatedly exposed to what appears to be high-quality data. That appearance, however, might be the result of what could be called the illusion-of-quality effect, caused by the excessive filtering and data cleansing performed by upstream processing.
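The illusion-of-quality effect can be sketched in a few lines. This is a hypothetical example under assumed names: a made-up `email` field, a simple completeness metric, and a cleansing step that silently drops bad records; real pipelines and quality metrics vary widely.

```python
# Hypothetical sketch: downstream cleansing masking upstream quality issues.
raw_records = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},  # missing at the source
    {"customer_id": 3, "email": ""},    # blank at the source
    {"customer_id": 4, "email": "d@example.com"},
]

def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

# Upstream view: the source data is only 50% complete.
upstream = completeness(raw_records, "email")

# A cleansing process silently drops the problem records before reporting.
cleansed = [r for r in raw_records if r.get("email")]

# Downstream view: dashboards and reports see 100% completeness.
downstream = completeness(cleansed, "email")

print(f"upstream completeness:   {upstream:.0%}")
print(f"downstream completeness: {downstream:.0%}")
```

End users who only ever see the downstream number would reasonably, and wrongly, rate their data quality as excellent.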

Is it possible some of the metrics your organization is relying on to support its data-driven business decisions could be the result of the focusing illusion oversimplifying a complex challenge? Furthermore, could key performance metrics indicating that your organization is currently doing well be affected by the illusion-of-truth effect or be the result of the illusion-of-quality effect?

Originally published at OCDQ Blog. Published with permission.
