I’m still reeling from the provocative but important new book, “Big Data: A Revolution That Will Transform How We Live, Work, and Think,” co-authored by Oxford professor Viktor Mayer-Schönberger and Economist editor Kenneth Cukier, which I blogged about twice a few weeks back.

As I noted then: “I must admit my traditional statistical grounding has taken a hit with Big Data. The notion that the core scientific method techniques of sampling, measurement error, and the experimental method’s cause and effect may well lose importance as central components of the analytics tool chest hasn’t quite registered with me yet, and maybe never will.” Could it be that statistical practice as we know it is on its deathbed, an artifact of the technology and computation limits of an earlier age, destined to be supplanted by new N=all, messy-data, correlation-only methodologies?
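To make the sampling-versus-N=all contrast concrete, here is a minimal sketch (the data are simulated and purely illustrative): the traditional statistician estimates a population quantity from a random sample and quotes a standard error, while the "Big Data" stance simply computes the quantity over every record and has no sampling error to worry about (though measurement error and bias can remain).

```python
import random
import statistics

random.seed(42)

# Hypothetical "N = all" dataset: one numeric value per record.
population = [random.gauss(100, 15) for _ in range(100_000)]

# Traditional approach: estimate the mean from a small random sample
# and quantify sampling uncertainty with a standard error.
sample = random.sample(population, 1_000)
sample_mean = statistics.mean(sample)
standard_error = statistics.stdev(sample) / len(sample) ** 0.5

# "N = all" approach: with every record in hand, compute the mean exactly.
# Sampling error vanishes; data quality issues do not.
population_mean = statistics.mean(population)

print(f"sample estimate: {sample_mean:.2f} +/- {1.96 * standard_error:.2f}")
print(f"full-population mean: {population_mean:.2f}")
```

The point of the sketch is that the sample estimate and its confidence interval exist only to stand in for a full-population computation that, in the book's telling, is now often feasible directly.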
