By Jeffrey P. Snider
I have to thank my colleague Joe Calhoun for passing along a very topical article written by Nassim Taleb and Gregory Treverton in Foreign Affairs. Taleb is, of course, well known for his "black swan," but his critique goes far deeper than that, to the failure of modern economics as little more than a study of statistics. Conventional statistics is, axiomatically, a study in exclusion: its very definitions are bounded by technological limitations on what can be observed and measured. Taleb's argument is that observation itself may be flawed because we have, as a species, yet to witness and catalogue every "unknown."
This latest article is Taleb's next step, which is really a systems approach to pretty much anything. Saying we live in a complex world is not just a cliché but an important technical distinction. A complex system is one in which the variables are innumerable, making prediction of their interactions nigh impossible. But economists will try, and they do so, as I said above, by excluding a great deal.
My interest here is what is really the criminal (figuratively, not legally, though some downstream impacts might lead in that direction as intentional neglect) ignorance of fractal geometry and chaos theory. Central banks, and even governments, have a vested interest in maintaining average occurrences, suppressing kurtosis as it were, and shooting for a steady state in which only small deviations are "permitted." As Minsky observed a generation ago, such a steady state is a false paradise: the longer it lasts, like a rubber band stretching, the worse the inevitable reversion to the mean (a mean which doesn't really exist).
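The rubber-band dynamic can be put in rough numerical terms. The following is my own toy sketch, not anything from Taleb's article: a process whose deviations are suppressed 99% of the time, with the stored stress released in rare large moves, shows far higher kurtosis (fatter tails) than an ordinary random process, even though its typical draw looks calmer.

```python
import random

random.seed(42)

def sample_suppressed(n):
    # Most draws are tightly controlled; rare draws release the stored deviation.
    out = []
    for _ in range(n):
        if random.random() < 0.99:
            out.append(random.gauss(0, 0.5))   # "permitted" small deviations
        else:
            out.append(random.gauss(0, 5.0))   # the stretched rubber band snaps
    return out

def kurtosis(xs):
    # Pearson's kurtosis: 3 for a normal distribution, larger means fatter tails.
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / n / var ** 2

steady = [random.gauss(0, 1) for _ in range(100_000)]
fragile = sample_suppressed(100_000)

print(round(kurtosis(steady), 1))   # near 3: thin tails
print(round(kurtosis(fragile), 1))  # far above 3: fat tails despite calmer typical draws
```

The "suppressed" process spends almost all its time inside a narrower band than the plain one, which is exactly what makes its rare excursions so disproportionately violent.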
To Taleb, that concept has been condensed as "fragility," which is really born of this counterproductive effort to narrow the bounds of "acceptable" outcomes.
Although centralization reduces deviations from the norm, making things appear to run more smoothly, it magnifies the consequences of those deviations that do occur. It concentrates turmoil in fewer but more severe episodes, which are disproportionately more harmful than cumulative small variations. In other words, centralization decreases local risks, such as provincial barons pocketing public funds, at the price of increasing systemic risks, such as disastrous national-level reforms. Accordingly, highly centralized states, such as the Soviet Union, are more fragile than decentralized ones, such as Switzerland, which is effectively composed of village-states.
Thus a "tail event" isn't really a low-probability outcome but rather the revelation of a frighteningly fragile state that was unexpected only because of the steady appearance of nominally "average" outcomes over a significant length of time. "Things" seem to work, whether in politics, economics, finance or even weather forecasting (and climate forecasting), until they suddenly go haywire. The "haywire" isn't really a "tail event" but rather a deficiency in statistics and operational theory that favors something like the "mean" or "average" event. Dynamic systems do not lend themselves easily to such shortcuts (which is why concepts like strange attractors strangely make sense even in this stylized world of modern "science").
The most significant aspect of chaos theory is sensitivity to initial conditions, which simply means that beyond some horizon, what conventional statistics treats as random error will irrevocably change the course of the whole system (paradigm shifts of this nature are almost always violent).
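The textbook illustration of that sensitivity (my sketch, not the article's) is the logistic map in its chaotic regime: two trajectories that begin one part in four hundred million apart track each other for a while and then diverge completely, so the initial "random error" rewrites the whole path.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4).
def logistic_trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000, 60)
b = logistic_trajectory(0.400000001, 60)  # initial "error" of 1e-9

# The gap between the two trajectories is tiny at first,
# then grows until the paths are effectively unrelated.
for n in (0, 10, 30, 50):
    print(n, abs(a[n] - b[n]))
```

The error roughly doubles each step, so no refinement of the initial measurement buys more than a few extra steps of predictability; that horizon, not the quality of the forecaster, is the binding constraint.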
Of course it was Benoit Mandelbrot who introduced the mathematics of fractal geometry and took to trying to better grasp the nature of "error" in forecasting and in complexity. What Mandelbrot understood, standing sharply against the trend in mathematics and statistics of his age, was that there was usefulness in the irregularity of nature. Just as Lorenz saw the dangers of overlooking it, Mandelbrot saw the benefits of defining it, even if only poorly or incompletely.
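Mandelbrot's point about structural irregularity can be demonstrated with a toy measurement, again my own sketch: the measured "length" of a rough, Brownian-style path keeps growing as the measuring stride shrinks, whereas a smooth curve's length would converge. The roughness is not noise to be smoothed away; it is there at every scale.

```python
import random

random.seed(1)

# A Brownian-style path: the running sum of many small random steps.
steps = [random.gauss(0, 1) for _ in range(2 ** 14)]
path = [0.0]
for s in steps:
    path.append(path[-1] + s)

def measured_length(path, stride):
    # "Ruler" of a given stride: sum of absolute moves between sampled points.
    pts = path[::stride]
    return sum(abs(pts[i + 1] - pts[i]) for i in range(len(pts) - 1))

# A finer ruler keeps finding more wiggles, so the measured length keeps rising,
# roughly doubling each time the stride shrinks by a factor of four.
for stride in (1024, 256, 64, 16, 4, 1):
    print(stride, round(measured_length(path, stride), 1))
```

This is the coastline behavior Mandelbrot made famous: for such a path there is no "true" length that finer measurement converges to, which is precisely why smoothing it into a differentiable curve discards the most important information.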
Modern statisticians, particularly economists, see the world as only smoothed contours. The reason for that is as simple as Dr. Lorenz's rounding error: the mathematics breaks down at a certain level of complexity. The modern tools of calculus in the statistical discipline assume smoothness in everything because there is no method to capture every last possible detail; they take small leaps of assumption and leave much behind in doing so.
That is not just a question for academics; there are very real practical implications. Again, central banks act, in systems terms, as agents trying to bind "everything" to some average condition, which they define as "normal" (really just acceptable) bounds of function. Deviation from that "normalcy" is treated as a flaw rather than a feature of dynamic reality (which is why central bankers so hate truly free markets, far preferring manipulated and curtailed "market" function). As Taleb and Treverton argue, the success of staying within those bounds is really failure. Systems that are not subjected to actual and serious deviations are, in Taleb's word, fragile.
The fifth marker of fragility takes the proposition that there is no stability without volatility a step further: it is the lack of a record of surviving big shocks. States that have experienced a worst-case scenario in the recent past (say, around the previous two decades) and recovered from it are likely to be more stable than those that haven't. In part, this marker is simply providing information: countries that sustain chaos without falling apart reveal something about their strength that could not be discovered otherwise. But this marker also involves the idea of "antifragility," the property of gaining from disorder. Shocks to a state are educational, causing them to experience post-traumatic growth.
The opposite condition to what Taleb and Treverton describe above is systems that fall apart under the slightest wrinkle. I think we are witnessing exactly this kind of "fragility" at this very moment. Again, central banks have been dedicated these past seven years to "normalcy," enforced financially with no shortage of power and direction. Janet Yellen herself took to calling the financial system totally "resilient," which is just her way of saying it functions within the FOMC's present and narrow tolerances. Instead, that appearance, in comparison to 2008, and even 2011, was what Taleb describes as fragility: the apparent steady state that artificially masks the unsteady nature of dynamic chaos.
Thus the introduction of one single word, "taper," in the middle of 2013 (though it likely started in the months before) produced a disproportionately massive response, which is the hallmark of chaotic and complex systems in a critical state. A small "error" led to a near-global meltdown, including economic disruptions and currency crises all over the place; the colloquial butterfly flapping its wings out of Ben Bernanke's mouth.
It has been said that chaos theory amounts to exactly that point, often called "exploding errors." I have no doubt that Ben Bernanke was totally surprised at how, and to what degree, the global financial system betrayed what the FOMC saw as very narrow and robust tolerances (Yellen's resiliency). That would certainly extend, in my estimation, to the last half of 2014 as well, considering that the credit market events of 2013 (that small "error") set off a chain reaction that has yet to fully run its course.
You would think the constant state of rolling crisis, both economic and financial, would by now have enforced some true resiliency, given that we have "survived" it all. But that, I think, is the problem with such heavy monetarism: it takes all the vital lessons of past "mistakes" and banishes them to mere history (after downplaying them) in favor of going right back and doing the exact same thing all over again. Central bank ideology amounts to calling the rat that keeps taking the electric shock, rather than the other path to the cheese, "average" simply because that is what happened in the recent past. In other words, central banks' notion of "normalcy" overrides the widespread recognition that what was previously taken as average or acceptable really was not; should we go back and do all the same things that led to the dot-com and housing bubbles because it looked like such a prosperous period?
The role of chaos theory and systems analysis is to observe, strictly, that doing the same thing will lead to the same result. It wasn't random chance that the bubbles collapsed, and it doesn't much matter whether central banks and monetary practitioners "learn" from the mistakes, because "mistake" is defined only within their own narrow boundaries. Thus we live in a condensed historical period where asset bubbles of epic proportions play out one after another without any pause. Tail risks indeed.