The metaphor for the theme of Sam Savage's entertaining book, “The Flaw of Averages,” is a drunk attempting to cross a dangerous highway. The state of the drunk at his *average* position (i.e., the center line) is alive. However, the *average* state of the drunk, wherever he happens to be on the road at a given moment of measurement, is dead! The dead drunk embodies Savage's Strong Form of the Flaw of Averages.

“The Flaw of Averages” (FOA) takes on the flaws of basic statistical thinking in the conduct of business, many of which derive from misuse of traditional measures of central tendency such as mean, standard deviation and correlation. It also mixes in a healthy dose of uncertainty and risk management, using precepts from basic economics and portfolio theory. The author's antidote to the current mistakes with statistical thinking in business? The emerging field of Probability Management, or PM, which uses Monte Carlo simulation and probability/statistical techniques to redress the naïveté of much current point-based statistical thinking. Instead of simplistic answers such as the mean and standard deviation, which often provide inadequate information for decision makers (the Weak Form of the Flaw of Averages), PM promotes working with an entire probability distribution, which can be deployed more reliably.

About 80 percent of “Flaw of Averages” is devoted to identifying problems with the use of statistics in business, with special attention to applications of finance and supply chain. The remaining 20 percent articulates the probability management cure. The 370-page book is divided into 47 short chapters – perfect for an attention-challenged reader like me. The tone is light, irreverent and at times somewhat annoying – but effective. The remainder of my blog focuses on flaw of averages problems; next week's article will introduce the discipline of probability management.

FOA identifies “The Seven Deadly Sins of Averaging” that routinely trip up analysts. Most concern the contrast between *a function evaluated at an average value and the average value of the function*. The two are often different, and the difference can be quite consequential. In the end, the seven sins actually number eleven. Among them:

– **Why is everything behind schedule?** Planners often assume that each of many project tasks will be completed in average or less time. But with a task list of ten items to be completed in six months, “the chance that all ten come in on their average or sooner is the same as flipping ten heads in a row, so the chance of finishing by six months is less than one in a thousand.”
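The arithmetic behind that quote is worth making explicit. A quick sketch (my own illustration, not the book's code), assuming each task independently has a 50/50 chance of finishing by its "average" estimate:

```python
# If each of 10 independent tasks has a 50% chance of finishing by its
# average estimate, the chance that ALL 10 do is 0.5 raised to the 10th --
# the same as flipping ten heads in a row.
p_all_on_time = 0.5 ** 10
print(p_all_on_time)  # 1/1024, i.e. less than one in a thousand
```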

– **The egg basket.** This sin argues for the wisdom of portfolios. If a basket of 10 eggs is dropped with a 20 percent likelihood, all eggs are lost. If, on the other hand, there are 10 baskets, one for each egg, each facing that same 20 percent chance, a drop destroys just one egg.
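A small simulation (my own toy numbers, assuming each basket drops independently) makes the portfolio point concrete: the *average* loss is the same either way, but the chance of losing everything all but vanishes with ten baskets:

```python
import random

random.seed(1)

def one_basket():
    # All 10 eggs in one basket; the basket drops with 20% probability.
    return 10 if random.random() < 0.2 else 0

def ten_baskets():
    # One egg per basket; each basket drops independently with 20% probability.
    return sum(1 for _ in range(10) if random.random() < 0.2)

trials = 100_000
a = [one_basket() for _ in range(trials)]
b = [ten_baskets() for _ in range(trials)]

# Both strategies lose about 2 eggs on average...
print(sum(a) / trials, sum(b) / trials)
# ...but total loss is a 1-in-5 event in the first case and
# a roughly 1-in-10-million event (0.2**10) in the second.
print(max(a), sum(1 for x in b if x == 10))
```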

– **Simpson's Paradox.** This sin has tripped up analysts for years, and has to do with the whole behaving quite differently from the parts. Derek Jeter of the New York Yankees hit .250 in 1995, .314 in 1996 and .291 in 1997 for a combined .300. David Justice hit .253 in 1995, .321 in 1996 and .329 in 1997 for a combined .298. Justice's overall batting average is less than Jeter's for that period, even though he outhit Jeter in each of the three years. How can that be? The answer lies in the mix of at bats and hits that underlie the averages. Jeter's .250 in 1995 was based on just 48 at bats, while Justice's .253 derived from 411. On the other hand, Jeter's .314 in 1996 had a solid foundation of 582 at bats, while Justice's .321 had only 140. The combined averages are not simply the averages of the three yearly figures; rather, they are the sum of hits divided by the sum of at bats, so yearly averages built on many at bats count for more than those built on fewer. Jeter's better averages came when he had the most at bats.
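The paradox falls out of simple arithmetic on the underlying hit/at-bat counts. A sketch using the figures as this example is usually reported (the 1997 and Justice-1996 hit totals are from the standard telling of the story; the text above quotes only the averages and some at-bat counts):

```python
# (hits, at_bats) per season, 1995-1997, as commonly reported.
jeter   = [(12, 48), (183, 582), (190, 654)]
justice = [(104, 411), (45, 140), (163, 495)]

def combined(seasons):
    # The combined average is total hits over total at bats,
    # NOT the average of the three yearly averages.
    hits = sum(h for h, ab in seasons)
    at_bats = sum(ab for h, ab in seasons)
    return hits / at_bats

# Justice outhits Jeter in every single season...
for (jh, jab), (dh, dab) in zip(jeter, justice):
    assert dh / dab > jh / jab

# ...yet Jeter's combined average comes out higher.
print(round(combined(jeter), 3), round(combined(justice), 3))
```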

– **The Flaw of Extremes.** In a bottom-up budgeting exercise, if each of 10 VPs “sandbags” her budget by providing a 90 percent certain figure, the CEO's aggregate budget is not 90 percent sure but rather 99.998 percent sure: the sum of ten independently padded estimates is far harder to exceed than any single one. The combined budget is more stable and less prone to extreme behavior than its components. “The smaller the sample size, the greater the variability of the average of that sample.”
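Here is a minimal model of the budget example, assuming (my assumption; the book does not specify a distribution) that each VP's true cost is an independent normal. Each VP submits her 90th percentile, and the CEO sums the ten submissions:

```python
from math import erf, sqrt

mu, sigma, n = 100.0, 10.0, 10
z90 = 1.2816                    # 90th percentile of the standard normal
submission = mu + z90 * sigma   # each VP's "90 percent certain" figure

# The sum of n i.i.d. normals is normal with mean n*mu and
# standard deviation sigma*sqrt(n), so the chance the total
# comes in under the summed submissions is Phi(z90 * sqrt(n)).
z = (n * submission - n * mu) / (sigma * sqrt(n))
p = 0.5 * (1 + erf(z / sqrt(2)))
print(p)  # roughly 0.99997 -- far more certain than 90 percent
```

The aggregate is so much safer than its parts because ten independent paddings would all have to go wrong together for the total to be breached.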

– **Ignoring restrictions.** A supply vs. demand scenario where there's a restriction on capacity. In this situation, the average profit is less than the profit associated with average demand. This is the negative, or concave, case of Jensen's Inequality: downside without corresponding upside.

– **Ignoring options.** A supply vs. demand scenario where there's an option to halt production. In this case, the average profit is greater than the profit associated with average demand. This is the positive, or convex, case of Jensen's Inequality: upside without corresponding downside.

– **Taking credit for chance occurrences.** Taking credit for something – rejecting the null hypothesis – “that was in fact just the luck of the draw.” This is the topic of the entire splendid book “The Drunkard's Walk” by Leonard Mlodinow.

– **The Scholtes Revenue Fallacy.** Occurs when multiplying uncertain numbers such as margin and balance. “If the two uncertain numbers are inversely related, the average revenue is less than the revenue associated with the average uncertainties. If the two uncertain numbers are positively related, the average revenue is greater than the revenue associated with the average uncertainties.”
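The fallacy is just the covariance term that a plug-in-the-averages calculation drops: E[margin × balance] = E[margin] × E[balance] + Cov(margin, balance). A sketch with assumed toy numbers (positively related margin and balance):

```python
import random

random.seed(3)
n = 200_000
balance = [random.gauss(1000, 200) for _ in range(n)]
# Assumed positive relation: higher balances earn a slightly better margin.
margin = [0.02 + 0.00001 * (b - 1000) + random.gauss(0, 0.002) for b in balance]

avg_revenue = sum(m * b for m, b in zip(margin, balance)) / n
revenue_at_avgs = (sum(margin) / n) * (sum(balance) / n)

# With a positive relation, average revenue exceeds the revenue
# computed from the average margin and average balance.
print(avg_revenue, ">", revenue_at_avgs)
```

Flip the sign of the relation and the inequality flips, which is the inversely related half of the quote.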

Even before I first peeked inside “The Flaw of Averages,” I was thinking stock market and “black swans.” For it's with market returns that simple averages, in the absence of corresponding risk measures, are at their most flawed. And indeed it's the 80 pages on finance where “The Flaw of Averages” is at its teaching best. Finance is also the application that best gives credence to the nascent thinking on probability management.

Looking for a blunt introduction to the problems with retirement financial planning? Read Chapter 21. Wish to learn about Markowitz efficient frontiers? See Chapter 22. Want an introduction to Monte Carlo simulation of market performance? Read Chapter 23. Wish to be introduced to the Capital Asset Pricing Model and the concepts of beta and alpha? Read Chapter 24. Want an understandable lesson on call and put options? Review Chapter 25. Looking for a readable introduction to option pricing, hedging and the Long-Term Capital Management debacle? Read Chapter 26. Finally, the Real Finance discussion in Chapter 29, on the use of optimization and simulation to manage project portfolios for Shell Oil, is quite enlightening.

Supply chain management is another area where flaw of averages mistakes can trip up business. The point of departure for much thinking in SCM is the Newsboy Problem, where “If demand is less than the number of papers purchased, the excess will be wasted, whereas demand greater than the number of sales results in lost sales.” FOA promotes the use of resampling to estimate actual demand and cost, providing the analyst with a series of choices that “trade off the risk of high cost against average lower cost.” SCM problems get a great deal more complicated, of course, but the tools of resampling and Monte Carlo can play a big role in mitigating overall risk. And experience with the use of simulation to tackle SCM problems was foundational for probability management, demonstrating at an early stage “that a scenario library, independently created on one simulation system in one department, can be used on a different system in a different department.”
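The resampling approach FOA promotes for the Newsboy Problem can be sketched in a few lines. The numbers and candidate range below are my own hypothetical illustration, not the book's: score each candidate order quantity by repeatedly resampling historical demand, then pick the quantity with the best average profit:

```python
import random

random.seed(11)
history = [42, 55, 38, 61, 47, 52, 44, 58, 49, 50]   # hypothetical past demand
cost, price = 0.40, 1.00                             # per paper bought / sold

def avg_profit(order, draws=20_000):
    # Resample a demand scenario from history for each Monte Carlo draw;
    # unsold papers are wasted, demand beyond the order is a lost sale.
    total = 0.0
    for _ in range(draws):
        demand = random.choice(history)
        sold = min(order, demand)
        total += price * sold - cost * order
    return total / draws

# Sweep candidate order quantities and keep the best on average.
best = max(range(35, 65), key=avg_profit)
print(best, round(avg_profit(best), 2))
```

A real analysis would look at the whole profit distribution per candidate, not just the mean, to make the high-cost-risk trade-offs the book describes; the sketch shows only the resampling mechanic.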

Read Part 2 of the blog here, where I discuss Savage's probability management cure for today's flaw of averages ills.
