I'm just getting around to reading Issue 2 (of 4 annual issues) of Big Data, by Mary Ann Liebert, Inc. Publishers. I like the publication quite a bit: it's found a nice data science niche between talking-head bloggers and the often too-stuffy academics.
Issue 2 has a nice interview with sociologist and data scientist Duncan Watts by founding editor Edd Dumbill, late of O'Reilly Media and formerly program chair for the O'Reilly Strata and Open Source conferences.
Two years ago, I wrote a two-part series on Watts' wonderful book “Everything Is Obvious: *Once You Know the Answer (How Common Sense Fails Us)”. Watts' thesis is that common sense is "exquisitely adapted to handling the kind of complexity that arises in everyday situations. But situations involving corporations, cultures, markets, nation-states, and global institutions exhibit a very different kind of complexity from everyday situations. And under these circumstances, common sense turns out to suffer from a number of errors that systematically mislead us. Yet because of the way we learn from experience, the failings of commonsense reasoning are rarely apparent to us. The paradox of common sense, therefore, is that even as it helps us make sense of the world, it can actively undermine our ability to understand it."
Indeed, current research in psychology and economics suggests that unbridled rationality is not the behavioral norm. Humans are very biased thinkers and often adopt simplifying (and irrational) heuristics such as framing, anchoring, availability and loss aversion to “guide” their behavior in common sense situations.
A seeming remedy to “common sense” for complex problems is to deploy “more systematic ways to go about gathering data, running experiments, gathering evidence, and making decisions, evidence-based decisions. I think that is where big data comes in... I think as we have more and more capability to test our intuitive hypotheses against data, we are going to start learning this lesson—that, in fact, we need to pay attention to the data; we need to run the experiments. We cannot trust our intuition just because our intuition is so persuasive to us.”
Common sense does well when “you get to run repeated trials of the same experiment over and over again...if the answer is yes, then I think that is an environment in which your intuition can be trained to work pretty well...That is how doctors are trained. They just see the same thing over and over again, and they learn that when they see X, Y, and Z, this is what they need to do.”
And this thinking works for many simple business decisions. For example, “if you do not know what particular design to use or the right way to present information or what the default setting should be, there is a certain amount of systematic field experimentation that can be done to learn the answer, rather than just guessing and going with it.” Moreover, it may be possible to solve disagreements “with evidence and experimentation rather than simply arguing about it and then doing whatever the winner thinks is right.”
Yet Watts is uncomfortable attempting to solve complicated business problems from experiential evidence alone. Business challenges are often much more idiosyncratic than can be handled with just analytics. “...problems that people in the business world have to grapple with are so complex, in terms of the underlying cause-and-effect relationships, that you cannot approach them in the same way or on the same timescale as you would a scientific problem.” What may work for predicting simple processes fails for the more complicated.
Even with the limitations of big data and traditional scientific methods in business, Watts is uncompromising that evidence-based skepticism should drive all discussion of strategy. “When someone says, ‘Oh, you know, this is how things work,’ the question should be, ‘Well, okay, what is the evidence?’ Right? Not just, ‘Oh, that is a plausible assertion. I am going to believe you, because it conforms with what I already think.’ But, ‘Okay, that is plausible, but there are five other things that are also plausible. Why is your plausible assertion more correct than any of these other alternative hypotheses?’ We just do not really think that way.”
But we should.