I’m not generally a fan of management books, maybe because I’m not a manager. So it’s probably just as well that I didn’t realize that The Flaw of Averages, by Sam Savage, was a management book before I started reading it. The highest praise I can give it is that I finished reading it — all the way through — which is something I don’t think I’ve ever done with a management book. Savage is a clear and gifted writer, which helps, and I’m interested in the subject matter, which also helps.
But there was something else which kept me reading: I was waiting for the other shoe to drop, and it never did. The basic thesis of The Flaw of Averages is not only true but mathematically provable: when you’re dealing with probability distributions rather than certainties, you can find yourself making all manner of horrible and costly errors if you try to boil those probability distributions down to a single number like an average. Instead, contemporary software, much of it based on Savage’s own research and development, allows you to create and manipulate those distributions directly, with much more useful results.
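The point is easy to demonstrate. Here's a toy sketch of my own (a hypothetical newsvendor-style example, not one of Savage's): profit depends nonlinearly on uncertain demand, and plugging the average demand into the profit formula overstates the profit you should actually expect.

```python
import random

random.seed(0)

# Hypothetical example: profit depends nonlinearly on uncertain demand.
# We stock 80 units; we can't sell more than we stock, so upside is capped
# while downside is not.
def profit(demand, stock=80, margin=10):
    return margin * min(demand, stock)

# Demand averages 80 units but is uncertain (uniform between 40 and 120).
demands = [random.uniform(40, 120) for _ in range(100_000)]

avg_demand = sum(demands) / len(demands)
profit_at_avg = profit(avg_demand)  # plug a single number (the average) into the plan
avg_profit = sum(profit(d) for d in demands) / len(demands)  # average over the whole distribution

print(profit_at_avg)  # about 800: the plan based on average demand looks fine
print(avg_profit)     # about 700: the profit you should actually expect is lower
```

The plan built on the single number is systematically too optimistic, because the good surprises are capped and the bad ones aren't. That asymmetry is invisible if all you ever look at is the average.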
Savage advocates that companies create a new position, the Chief Probability Officer, charged with coordinating the institutional knowledge about probability distributions. He writes, in what might be the nub of the whole book:
Managers at many levels are just bursting with probabilistic knowledge, which if properly channeled can be put to good use.
Of course, the question of how to put the knowledge of lower-level managers to good use is not confined to probability distributions: really, it’s the central question of all management theory. But substantially all of this book is about how best to manipulate probability distributions once you have them; there’s nothing at all on how to give them a smell test to see if they make any sense, or how to judge how accurate they are.
Most startlingly of all, there’s no discussion of what probability is. One of my favorite parts of Riccardo Rebonato’s magnificent book Plight of the Fortune Tellers is chapter 3, “thinking about probabilities”. He makes the hugely important distinction: on the one hand there’s frequentist probability, where you can run the same experiment thousands of times to see what different results occur. On the other hand there’s subjective probability: if I ask what the probability is that oil will hit $100 per barrel in the next five years, there’s no experiment you can run thousands of times to find out.
Many people, Savage included, love to run Monte Carlo simulations in order to try to reduce subjective probability to frequentist probability, but there’s a category error going on whenever that happens, which is one reason that financial instruments designed by running Monte Carlo simulations blew up so spectacularly during this financial crisis. Monte Carlo simulations are very bad at showing the risk of something unprecedented happening, but as Nassim Taleb loves to point out, it’s the unprecedented events — the black swans — which tend to be crucially important.
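To see the blind spot concretely, here’s a hypothetical sketch (my own illustration, not an example from the book): fit a normal distribution to a calm stretch of history, then run a Monte Carlo “stress test” by resampling from the fitted model. The simulation can only reproduce the kind of world it was fitted to, so it assigns essentially zero probability to a crash it has never seen.

```python
import random
import statistics

random.seed(1)

# Hypothetical history: ten years of daily returns from calm times only
# (normal, roughly 1% daily volatility, no crash anywhere in the sample).
history = [random.gauss(0, 0.01) for _ in range(2500)]

mu = statistics.mean(history)
sigma = statistics.stdev(history)

# A Monte Carlo "stress test" that resamples from the fitted normal:
# a million simulated days drawn from the model of the calm past.
simulated = [random.gauss(mu, sigma) for _ in range(1_000_000)]

# How many simulated days show a -6% crash (about six sigma)?
crashes = sum(1 for r in simulated if r < -0.06)
print(crashes)  # effectively zero: the model says such a day can't happen
```

A million simulated paths feel frequentist and rigorous, but every one of them is drawn from the same subjective assumption about the shape of the world. The simulation adds precision, not knowledge.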
On page 291 of his book, Savage prints an admittedly hypothetical distribution of future sea levels. He then goes on to explain why the distribution is oversimplified, and why we can’t trust it in its initial oversimplified form. But the fact is that his base-case scenario, the place where he starts his analysis, is a thin-tailed normal distribution, with the chance of sea levels rising in future being exactly the same as the chance that they will fall.
I just can’t believe that that kind of normal distribution is ever a useful place to start when thinking about something like climate change — the subject of the chapter at hand. The chances of sea levels falling from their current level are tiny — much lower than 50%. And the real distribution is very lumpy indeed: to a first approximation, either Greenland and the West Antarctic ice sheet melt into the sea, or they don’t. Tails don’t get much fatter than this one.
But this whole book reads as though it was written in what Taleb calls “mediocristan” as opposed to the real world of “extremistan”. Tails are thin; Black and Scholes and Merton and Markowitz are heroes; probability distributions can be modeled and tamed and understood on a seat-of-the-pants level.
It’s true that the world of The Flaw of Averages is better than the world we’re just emerging from, where things like value-at-risk and correlation were disastrously boiled down to single numbers. But I’m still not sure I want to live in Savage’s world: it seems to me to be lacking a healthy dose of fear of the unknown. Quite the opposite, in fact: large chunks of the book are devoted to the riches that can be struck by identifying “real options” and buying them on the cheap from people who, looking only at averages, might overlook a lot of option value.
My fear is that if Savage’s souped-up Excel spreadsheets catch on, the corporate world will fall into the overconfidence trap which did for the financial world during the Great Moderation. Savage’s statistical distributions are extremely powerful tools, both in terms of identifying profitable opportunities and in terms of avoiding massive potential downside. But if companies become particularly adept at avoiding crashes, then that’s a recipe for yet another Minsky bubble. The fewer corporate disasters we see, the more risk and leverage that companies will feel comfortable taking on, and the more likely it is that another system-wide crash will occur.
Savage’s techniques are very good at discovering existing correlations which might not be immediately visible to senior management. But they’re useless at discovering correlations which were never significant in the past but which suddenly and terrifyingly go to 1 in the future when a Black Swan arrives.
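The mechanism is simple enough to sketch (a hypothetical two-asset model of my own, not Savage’s): returns that are independent in calm times but share a common systemic shock in a crash. Measured on calm history, the correlation is near zero; in the crash regime it is near 1.

```python
import random
import statistics

random.seed(3)

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys))

# Hypothetical two-asset model: independent in calm times, but in a crash
# both are dragged down by the same systemic shock.
def daily_returns(in_crash):
    if in_crash:
        shock = random.gauss(-0.10, 0.02)  # one shared shock hits both assets
        return (shock + random.gauss(0, 0.005), shock + random.gauss(0, 0.005))
    return (random.gauss(0, 0.01), random.gauss(0, 0.01))

calm_days = [daily_returns(False) for _ in range(5000)]
crash_days = [daily_returns(True) for _ in range(5000)]

print(round(corr(*zip(*calm_days)), 2))   # near 0: history shows no correlation
print(round(corr(*zip(*crash_days)), 2))  # near 1: the crash regime is different
```

Any tool calibrated on the calm sample — however sophisticated — will report that these two assets diversify each other, right up until the day they don’t.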
If we all take Savage’s advice, we’ll weather most storms much better than we do right now. But I fear we’ll fare even worse in the event that a hurricane hits.