It’s a rare thing for a magazine editor to seek an interview with another magazine editor, but in the case of Chris Anderson, the editor in chief of Wired magazine, I didn’t hesitate. Anderson, who wrote the widely praised book The Long Tail, is now on a tangent that challenges the precepts of science by proposing that correlations among data are becoming more important than models, and that evidence of something being true can be more immediately useful than understanding why.


In a nutshell, “The End of Theory,” Anderson’s essay in July’s Wired, suggests that the vast amount of data and the computing muscle now available to statisticians have empowered them to provide valid observations in the absence of any underlying hypothesis, experiment or testing.


This goes an order of magnitude beyond Tom Davenport’s notion of analytic proficiency as competitive differentiation; it says we can assign confidence to certain truths in the absence of physical evidence. (In a twist, Davenport advocated the scientific method in his most recent column in BI Review/DM Review.)


One definition of agnosticism is belief in what can be seen, and we don’t usually put correlations in league with manifest objects, since correlations are relative, not absolute. Yet there are exceptions and precedents: we know of scientific theories that predicted subatomic particles before they were proven to exist, and it’s true that new discoveries occasionally come by accident. Science corrects its own stumbles as new discoveries allow. But with the onset of the data age, Anderson says we can skip some of that linear learning and jump straight to conclusions.


In other words, it’s Google. “Google’s philosophy is that we have an abundance of data and a shortage of useful conclusions to draw,” Anderson told me. “They made a business out of making data monetizable and said, ‘we’re not going to be experts on anything other than data analysis.’ ‘We’re not going to be experts on language or semantics or taxonomy or what people want or why they’re looking for it. We’re simply experts on being able to sift this data statistically so that the latent knowledge in the network is exposed.’”


Anderson believes the original PageRank algorithm behind Google deserves a Nobel Prize for its agnosticism toward subject matter.


“It says that the network is its own answer, that we don’t know whether this page is more relevant than the next page. What we do know is that the statistics of the network suggest that the crowd believes so. The people who know these domains have voted and we have measured their votes. You end up with a library without theory, without a Dewey Decimal System or taxonomy, a library without an understanding of what’s in it. You end up with this emergent structure that comes out of being completely oblivious to the reasons why that structure exists in the first place.”
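The idea Anderson describes — ranking pages purely from the link structure of the network, with no knowledge of what the pages contain — can be illustrated with a minimal power-iteration sketch of PageRank. The graph, damping factor and iteration count below are illustrative assumptions, not Google’s actual implementation:

```python
# Toy PageRank: each page's score is derived entirely from who links to it,
# never from its content. Graph and parameters are hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform guess
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:          # a link is a "vote" of rank
                    new[target] += share
            else:                                # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

graph = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
ranks = pagerank(graph)
```

In this toy graph, page C ends up ranked highest simply because the most "votes" flow into it — the algorithm never asks what any page is about, which is the content-agnosticism Anderson is praising.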


If I’d had time for a rebuttal, I imagine I’d have heard from Megan Burns at Forrester Research, who believes that statistics surrounding Web surfing are more about behavior than about attitude, and can end up being more inferential than illuminating. I’d add that, unlike a scientific proof, behavior on the Web is dynamic and subject to change, even if a measure of truth can be captured in a snapshot. (I also recall that I chose to forget the proofs of calculus as soon as I had my first scientific calculator.)


But you’d be wrong to put Google at the root of Anderson’s thesis. A physicist by training, Anderson builds his argument mostly around biological research (which he’ll expand upon in an upcoming co-authored story in the scientific journal Nature). The essay in Wired is followed by a half dozen examples of the trend, including the shotgun gene-sequencing project of J. Craig Venter that retraced the circumnavigation of Captain Cook to discover thousands of new species of bacteria and other life forms that have never been physically encountered.


For all the value we attach to data, I still found it somehow perverse that Anderson’s thesis casts trust in data as an article of faith, and that it makes Venter’s work sound existential. If you can prove the existence of a new species without being able to physically describe it, does it really exist?


I didn’t have a ready answer, so I asked if it didn’t feel like Huxley’s Brave New World to say, ‘we can believe things because they are so.’


“I know where you’re going and I think you’re probably right so I don’t want to over-generalize,” Anderson replied. “I’ve gotten many emails from biologists who don’t think what Craig Venter is doing is science. They say what he’s doing is observation.”


That got us into some tricky historical questions about what science actually is.


“What Darwin did for most of his career was observe,” Anderson continued. “Only later on did he come up with a theory to explain his observations. Only later did we come up with the scientific method with its hypothesis, synthesis and experiments and testable or falsifiable theories. In a sense, we have created a definition of science that excludes much of the world that could potentially have answers, or at least useful information for science, because we’ve constructed this method that is predicated on testable hypotheses.”


Scientific leaps have long been the irreversible measure of our advancement as a species, so I had to ask whether we should expect a qualitative backlash to the quantitative revolution.


“I don’t know the answer to that because I think we’re looking for new language,” he replied. “Google thinks statistically and most of us don’t. One reason this is such a difficult concept for the scientific community to absorb is because it really runs counter to our human instinct to embody and visualize things and make them tangible. Statistics explicitly says you can’t do that now and maybe you never will be able to. Statistics are blind to human dimensions.”


Anderson said this is analogous to why quality initiatives such as Six Sigma can be so difficult for businesses to implement, because they ask people to put aside their knowledge, put aside their gut, ignore their personal experience and let numbers be the judge. “Six Sigma is to some extent the Google philosophy applied to business practices.”


As much as I admire Chris Anderson’s talent, intellect and prose, I probably won’t be first in line for a “What Would Google Do?” bumper sticker (as he half-jokingly proposed), even in the most measurable age in history. And while I’ve respected the outcomes of some Six Sigma exercises, I’ve also chafed at their nose-to-the-grindstone blandness.


It’s always been true that we don’t review ideas in the same way we envision them, but if algorithms are destined to deliver answers that science has no ideas for, we may need to revisit the semantics of words such as “know” and “understand.” I can’t dismiss Anderson’s argument, but it does call for a response.


If you’d like your own comments to be considered for our next column, please submit them with attribution and permission for use by July 24, 2008, to my email below.


Read "The End of Theory" at


Next time: A qualitative response 
