Economics and BI
I should have known. Less than two weeks after my Information Management blog series on Economics and BI, The Economist came out with its own edition on modern economic theory: "Where it went wrong – and how the crisis is changing it." And of course it's outstanding. I must admit I breathed a sigh of relief after reading the articles: the main topics – macroeconomics, the efficient-market hypothesis and behavioral economics – were the same as mine.
"The state of economics: The other-worldly philosophers" details the turmoil among macroeconomic camps that's currently playing out in political theaters, arguing, ironically, that the divisions may ultimately prove good for the discipline. Perhaps Nobel laureate Paul Krugman's "Dark Age of macroeconomics" will finally yield productive compromise between classical and Keynesian economists that includes consideration of the neglected financial system. In the end, it seems both sides must give on their beloved mathematical models. One radical but intriguing suggestion is for economists to scrap such onerous math for bottom-up computer "searching" heuristics that grind out thousands of simulations looking for economic patterns. A similar challenge was issued to mathematical statisticians by machine learning outsiders a number of years ago – much to the benefit of current data analysis.
Many economists feel the efficient-markets hypothesis (EMH) is even more culpable in the current crisis, having provided the academic foundation for a financial engineering discipline that started the bankers' "party" and created tremendous wealth for a few while purportedly making the financial system safer. Behavioral economist and Nudge author Richard Thaler opines that the woes of the last few years may actually benefit financial economics as behavioralists and humbled efficient marketers consolidate their disciplines. According to Thaler, the EMH has two components that have been severely tested by the current crisis, the "no-free lunch part and the price-is-right part, and if anything the first part has been strengthened as we have learned that some investment strategies are riskier than they look and it really is difficult to beat the market." But, as the article notes: "The idea that the market price is the right price, however, has been badly dented."
Perhaps as much as anything, it was 25 years of prosperity that caused economists to drop their guard, becoming complacent, unwilling to compromise, satisfied with their restrictive models, believing “this time is different.” Alas, as the black swan demonstrated, it isn't.

Statistical Learning and BI

I recently received an email on one of my R support lists from Trevor Hastie, Professor and Chairman of the top-rated Stanford University Statistics Department, announcing a new two-day seminar, Statistical Learning and Data Mining III. I attended the Data Mining II class in Boston last fall and learned a ton. While I won't be off to Austria in the fall, I'm certainly putting the spring 2010 seminar in Palo Alto on my calendar.
For those engaged in predictive modeling, a class from Hastie and Stanford colleague Rob Tibshirani every 18 months to keep up with the latest techniques in statistical learning is a prudent investment. Sitting at the juncture of statistical theory and machine learning, the instructors and their students are prodigious producers of ever more efficient and accurate predictive models. They also obsess over evaluating and validating their new methods using the latest computer simulation techniques. Their overall approach is a productive compromise between the rigid statistical purists, whose models seem more important than performance, and the machine learners, who seem more enamored with their arcane algorithms. The good news for practitioners is that all the latest methods are implemented in packages freely available with the R Project for Statistical Computing.

Predictive Analytics World

I also received an email from an R list announcing the second Predictive Analytics World, this October in Washington, D.C. I attended the first PAW last winter in San Francisco and was pleasantly surprised by the quality of a "Version 1" product. The keynote speakers were generally excellent and most of the case studies at least adequate.
Having enjoyed the bounty of the R platform for predictive models over the last seven years, I felt many of the analyses presented were a bit dated. Multiple and logistic regression, the techniques most popular in the case studies I attended, are certainly still pertinent and useful for hypothesis testing. But for flat-out prediction, statistical learning shrinkage methods like the lasso, along with bagging and boosting models, generally produce better results. Ironically, most of these methods are not yet supported by the large proprietary statistical software vendors. Those wishing to deploy the latest and greatest techniques are "forced" to use freely available open source R. For me, it's also comforting to know that the distributed software was coded by the same people who developed the theory.
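To make the contrast concrete, here's a minimal sketch in R of classical logistic regression next to the lasso, using the glmnet package from Hastie, Tibshirani and Friedman. The simulated data and all variable names are my own illustration, not from any PAW case study; the point is simply that the lasso shrinks irrelevant coefficients to exactly zero while cross-validation picks the penalty.

```r
# A minimal sketch, assuming the glmnet package is installed.
library(glmnet)

set.seed(1)
n <- 200; p <- 50
x <- matrix(rnorm(n * p), n, p)               # 50 candidate predictors
y <- rbinom(n, 1, plogis(x[, 1] - x[, 2]))    # only the first two matter

# Classical logistic regression estimates all 50 coefficients,
# relevant or not:
fit.glm <- glm(y ~ x, family = binomial)

# The lasso adds an L1 penalty that shrinks most coefficients
# to exactly zero, with the penalty weight lambda chosen by
# 10-fold cross-validation:
fit.lasso <- cv.glmnet(x, y, family = "binomial")
coef(fit.lasso, s = "lambda.min")             # a sparse coefficient vector
```

For pure prediction on data with many candidate variables, that built-in variable selection is exactly why the shrinkage methods tend to outperform an unpenalized fit.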
The speakers list is shaping up nicely. I'm glad to see a top academic on the docket to provide perspective on the state of the craft. Back in the winter, I said it would have been nice to have either Hastie or Tibshirani from nearby Stanford address the conference on the evolution of predictive modeling. Many of the speakers for the second conference are repeats from the first; I look forward to hearing their new perspectives.
