In the last week of January, I got an email from the R-help list announcing the release of the newest version of glmnet, an R package that fits lasso and elastic net regularization paths for squared-error, binomial and multinomial models via coordinate descent. Don't be ashamed if you find that description a bit abstruse: you're not alone! Suffice it to say that glmnet is a state-of-the-art modeling package that efficiently handles the prediction of both interval and categorical dependent variables.
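
For the curious, here is a minimal sketch of what a glmnet session looks like in R; the data below are simulated purely for illustration, and the variable names are my own:

    library(glmnet)

    set.seed(1)
    x <- matrix(rnorm(100 * 20), nrow = 100)   # 100 observations, 20 predictors
    y <- rnorm(100)                            # continuous (interval) response

    # family = "gaussian" fits squared-error loss; "binomial" and
    # "multinomial" handle categorical responses the same way.
    # alpha = 1 requests the lasso; values between 0 and 1 give the elastic net.
    fit <- glmnet(x, y, family = "gaussian", alpha = 1)
    plot(fit)                                  # coefficient paths across lambda

    cvfit <- cv.glmnet(x, y)                   # cross-validate to choose lambda
    coef(cvfit, s = "lambda.min")              # coefficients at the best lambda

A handful of lines gets you the entire regularization path, a cross-validated choice of penalty, and the fitted coefficients.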

The package’s maintainer is Trevor Hastie, co-author with Jerome Friedman and Rob Tibshirani of the accompanying arcane-sounding paper, “Regularization Paths for Generalized Linear Models via Coordinate Descent,” published last summer. Hastie, Friedman and Tibshirani are also eminent professors of statistics at Stanford University, the top-rated such department in the country. Last fall, I attended a statistical learning seminar with Hastie and Tibshirani where similar models were presented at a dizzying pace.
