The Dow Jones Industrial Average fell 531 points (3.1%) on August 21st, 2015, in the middle of the first significant correction in the US markets in three years. A drop of 588 points (3.5%) followed during the very next trading session on Monday, August 24th, and after a short-term bounce, a drop of 470 points (2.8%) occurred on September 1st. Benoit Mandelbrot and Richard Hudson have shown in their interesting book "The Misbehavior of Markets" (2004, New York, Basic Books, 328 p.) that Modern Portfolio Theory (MPT) predicts there should have been about 58 days from 1916 to 2003 when the Dow moved more than 3.4%, but in fact there were 1,001 such days. MPT further predicts that there should have been only 6 days over the study interval when the Dow moved more than 4.5%, but in fact there were 366 such days. And swings in the Dow of more than 7.0% should occur only once every 300,000 years according to MPT, while in fact there were already 48 such moves by 2003. The crash of 1987, with its one-day drop in the Dow of more than 20%, would never be expected to occur even in hundreds of billions of years. Obviously, some of the assumptions underlying MPT are wrong, as I have pointed out previously. The markets are far more volatile than theory admits, and any asset allocation model used by investors should recognize this fact.
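To see where numbers like these come from, here is a minimal sketch that counts the days a pure Gaussian model would expect beyond each threshold. The daily standard deviation (about 1.1%) and the trading-day count (roughly 22,000 from 1916 to 2003) are rough assumptions of mine for illustration, not figures taken from Mandelbrot and Hudson:

```python
import math

def expected_extreme_days(threshold_pct, daily_sigma_pct, n_days):
    """Expected number of days with |move| > threshold under a normal model."""
    z = threshold_pct / daily_sigma_pct
    # Two-tailed probability of a move beyond z standard deviations
    p = math.erfc(z / math.sqrt(2.0))
    return n_days * p

# Assumed inputs: ~22,000 trading days (1916-2003) and a ~1.1% daily sigma.
for thresh in (3.4, 4.5, 7.0):
    days = expected_extreme_days(thresh, 1.1, 22000)
    print(f"Gaussian model expects about {days:.2f} days beyond {thresh}%")
```

Whatever reasonable sigma is plugged in, the model's expected counts fall orders of magnitude short of the 1,001, 366, and 48 days actually observed.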
In his important book "The Black Swan: The Impact of the Highly Improbable" (2007, New York, Random House, 366 p.), Nassim Taleb explains why theory rarely predicts, or even contemplates, large-scale, seemingly improbable events. He is very critical of accepted statistical theory in many fields, but especially in economics and finance. Examples of statistical theories he criticizes in finance include both MPT and the Black-Scholes-Merton (BSM) options pricing model, which together represent the mainstream of quantitative modeling theory as applied by many large institutions that manage money. His criticisms are based on the tendency of these models to under-estimate risk by huge (and fatal) amounts, as was shown for MPT alone in the discussion above. Major catastrophes or shocks to a system, like the 9/11 terrorist attacks or the 1987 stock market crash, are referred to by Taleb as "Black Swan" events. He calls them Black Swans because, by analogy, these surprising events are as unexpected as seeing a black swan for the first time when all you've ever seen previously were white swans.
Taleb defines Black Swan events as: 1) being outside the realm of regular expectations; 2) having an extreme impact; and 3) being explicable only in hindsight. Taleb presents the Long Term Capital Management (LTCM) financial crisis of 1998 as a Black Swan, and as another example of how wrong the risk estimates of financial theory can be. In this spectacular example, Myron Scholes and Robert C. Merton, both Nobel laureates in economics, attempted to apply their own theory (BSM options pricing) to the real world by participating as advisors to the new hedge fund LTCM. The fund used leverage of as much as 20:1 because their theories excluded even the possibility that large-scale events can happen frequently enough to wreck things. But when the Russians unexpectedly defaulted on their debt in the summer of 1998, the assumptions underlying LTCM's black box models were proven false and the firm went under, reportedly threatening to take the entire U.S. financial system with it. Greenspan's Federal Reserve and the biggest money-center banks intervened to save the day, but LTCM was gone along with many billions of dollars in investors' assets.
There are two main reasons why statistically-based theories like MPT and BSM options pricing habitually under-estimate the risk faced by investors. The first is that these theories use statistical approaches that are "Gaussian" in nature, meaning that an idealized "bell curve" is used to approximate reality because it permits rigorous mathematical treatment of the probabilities of any potential events. However, as Taleb points out, this statistical precision can often lead to circumstances where one is "precisely wrong", rather than the more acceptable outcome of being "approximately right". Thus the rigor of Gaussian statistics provides false comfort to practitioners, because the real world is not subject to easily modeled "bell-curve" distributions of variability. In finance, the largest one-day stock market loss (the 1987 crash) was more than 20%, a move of at least twenty (Gaussian) standard deviations from the mean.
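The absurdity of the 20-sigma claim under a Gaussian model is easy to check directly: compute the two-tailed probability of such a day and the implied average waiting time (a sketch; the 252 trading days per year is the usual convention, and the exact sigma count is an assumption from the text):

```python
import math

# Two-tailed Gaussian probability of a move of 20+ standard deviations,
# and the implied average waiting time between such days.
sigma_move = 20.0
p = math.erfc(sigma_move / math.sqrt(2.0))  # two-tailed tail probability

trading_days_per_year = 252
years_between = 1.0 / (p * trading_days_per_year)

print(f"P(|move| >= 20 sigma) = {p:.2e}")
print(f"Implied recurrence: one such day every {years_between:.2e} years")
```

The implied waiting time dwarfs the age of the universe many times over, which is exactly why a one-day 20% crash "should" never happen under the bell curve, and yet it did.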
Because of my background, I like to look to nature for analogies involving statistics. For example, the largest volcanic explosion known from the geologic record was not just five times as big as that of Mt. St. Helens in 1980 - it was 100 times bigger. If such an event occurred today it would imperil entire countries, and under-estimating that risk would be catastrophic. Such catastrophic volcanic eruptions are definitely possible in places like Indonesia or the Aleutian Islands. Likewise, the 2007-2009 credit crisis involved the re-pricing of assets inside highly leveraged (sound familiar?) hedge funds and Structured Investment Vehicles (SIVs), both of which used black box models to evaluate downside risk. Both got the actual risk wrong, and not by some small amount, but catastrophically. These mistakes ended up costing investors at least $250 billion, as estimated by the Royal Bank of Scotland and reported by The Wall Street Journal in 2008. But as we now know, the knock-on effects were much bigger, and the entire financial system eventually failed as a result of a whole range of cascading, unexpected extreme events.
The second main reason for the failure of (Gaussian) statistically-based theories to predict risk accurately, according to Taleb, is a variety of common errors arising directly from human nature. For example, there is a commonly-observed tendency in people to commit "confirmation error", i.e., to look for instances that confirm one's beliefs or models, and then, lo and behold, to find them while ignoring contrary evidence. There is also the "fallacy of silent evidence", which involves mistaking the absence of evidence for evidence of absence. As Taleb illustrated, think of the physician who tells you that you don't have cancer because there is no evidence of it. The absence of evidence is conflated to mean that there is evidence of absence - a mistake that could easily kill you.
Another common error is the "problem of induction", which involves going wrong when moving from specific information or data to a seemingly logical generalization. Taleb uses an analogy from the philosopher Bertrand Russell to explain this. Think of a turkey that is fed daily for 1,000 days. The turkey has come to love the farmer who feeds him, and is absolutely certain that when he is called each morning, food is on the way. But on Day 1,001, the farmer's call is followed by an ax blow, because it is Thanksgiving. For the turkey, this is a Black Swan. The turkey's naïve projection of the future based on past events was an error in inductive reasoning. This is the most common error investors make: remember that "past performance is no indication of future results".
Yet another common error is called the "narrative fallacy," which deals with our human need to fit a story or pattern to a series of disconnected facts. An example from nature would be the commonly held notion, prior to 2005, that no hurricane could destroy a modern city like New Orleans and kill over 1,000 people. This was believed because no such destruction had occurred in modern times, there was an extensive network of massive levees to protect people, and people could easily be evacuated by modern transportation. But a catastrophe such as the destruction of New Orleans by Hurricane Katrina was entirely foreseeable based on known geologic and oceanographic risk factors, and in fact I and many other geologists had known for 25 years before it actually happened that it might occur someday. Note that no one could predict exactly when it would happen, only that conditions were right for it if a major hurricane were to strike the coast east of the city. In my case, this awareness came from explicit discussions of the risk to New Orleans in grad school and from my training as a geologist. The situation only deteriorated in the intervening years, such that the fact pattern was the opposite of public perception. As a result, no one was ready for what happened, and a Black Swan was experienced by the entire region.
Finally, the so-called "Ludic fallacy" can cause trouble because it treats the messy uncertainty of complex real-world systems as if it were the well-defined randomness of games. Taleb gives the example of a casino in Nevada that built a very intricate gaming model for catching cheaters in order to prevent gambling losses. All went well with the model until an irreplaceable performer in the casino's floor show was attacked and maimed by a tiger, generating a loss of $100 million. They had even considered this possibility in advance, but felt it was too remote a risk (because they couldn't model it) to insure against.
Taleb has put his finger on something really important to investors: the combination of problems with statistically-driven theories and models, and human error in thinking about risk, has been a contributing factor in many disasters, and will be important in many more. It is absolutely critical that the true risks investors are exposed to be properly evaluated. But what is one to do if MPT and BSM options pricing models are inherently wrong, but there is no apparent replacement for them? What approach can be used to evaluate actual, real-world risks in a meaningful way?
Warren Buffett pointed out the logical way to deal with this on a single-stock basis some years ago, with the following example. Assume that a very good company's stock drops 30% due to some temporary circumstance, and yet long-term revenues, profit margins, free cash flow, dividends, and earnings appear to continue on an upward trend as before, with its economic moat and great management intact. What is the risk for this company's stock? According to academic theory (MPT), which is necessarily backward-looking, the risk (standard deviation) just went way up because the stock's price fell 30%. However, if it is still a great company, the risk of financial loss to investors going forward just went way down. This is because we can calculate a discounted present value (i.e., a "fair value") of all future returns using either estimated cash flows or estimated growth in assets, and this fair value will now be much higher than the currently low price. Comparison with the now-reduced price of our hypothetical stock would reveal a large gap representing what Graham and Buffett called a "margin of safety". This means that if we bought the stock at the reduced price, with a demonstrated fair value that is substantially higher, our downside risk should be reduced going forward, and the company's stock should return to a higher level over some reasonable period of time (though perhaps stretching to years).
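As a sketch of the arithmetic behind such a margin-of-safety comparison, the following discounts projected free cash flows plus a simple Gordon-growth terminal value. Every number here (per-share cash flow, growth, discount rate, the $50 price) is hypothetical and chosen purely for illustration; this is one common way to estimate fair value, not a formula from Buffett or Graham:

```python
def fair_value_per_share(fcf_per_share, growth_rate, discount_rate,
                         years=10, terminal_growth=0.02):
    """Discount projected free cash flows, then add a terminal value."""
    value = 0.0
    fcf = fcf_per_share
    for t in range(1, years + 1):
        fcf *= (1 + growth_rate)                 # grow next year's cash flow
        value += fcf / (1 + discount_rate) ** t  # discount it back to today
    # Gordon-growth terminal value for all years beyond the forecast window
    terminal = fcf * (1 + terminal_growth) / (discount_rate - terminal_growth)
    value += terminal / (1 + discount_rate) ** years
    return value

# Hypothetical inputs: $5/share of free cash flow growing 6%, discounted at 10%
fv = fair_value_per_share(fcf_per_share=5.0, growth_rate=0.06, discount_rate=0.10)
price = 50.0  # the beaten-down market price in our example
margin_of_safety = (fv - price) / fv
print(f"Fair value: ${fv:.2f}, margin of safety: {margin_of_safety:.0%}")
```

With these assumed inputs the estimated fair value sits well above the depressed price, and the gap between the two is the margin of safety discussed above.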
I believe that a prudent investor must prepare for both small-scale losses by applying margin of safety analysis to individual holdings, and also for large-scale losses generated by black swan events. One point that is critical to remember though, and which has been misunderstood by many commentators, is that by definition we can't really see a black swan coming. So what pundits mention as a potential black swan at a given moment is going to be less than helpful on average. Taleb has been very clear about this distinction. But if I remember right, Taleb was at one point involved in advising a hedge fund on trades insuring against potential black swans. This makes sense because Taleb was not predicting specific events as I understand it, but rather evaluating the potential for an event based on the fragility of the system (or certain parts of a system) and the catalysts that might be available to unbalance what is in effect a stable disequilibrium (see a detailed discussion of fragility in Taleb's book, Antifragile, 2012, New York, Random House, 519 p.). What can those of us who are everyday advisors and investors do about black swan risk? Several things come to mind. If I may use a perhaps tenuous metaphor, before a black swan occurs, there may be precursor problems, which I will call "wounded ducks." These are less catastrophic problems that are nevertheless useful as warning signs about the fragility of the system.
For example, before Lehman Brothers collapsed in September of 2008, we saw the demise of Bear Stearns in March of 2008. That event was a "wounded duck" in my view, or a warning about the financial system. I heeded that warning based on the sage advice of observers like David Rosenberg, John Mauldin, Gary Shilling, John Hussman, Lacy Hunt, and a very few others. Literally you could count the economists and market analysts who saw what was coming on just your fingers and toes. But many of the largest value investing shops doubled down on the banking industry after the Bear Stearns failure, because the stocks were so "cheap." Their subsequent losses, even after buying at deeply depressed prices, were disastrous.
Another example from that time period was the presence of an inverted yield curve many months before the recession actually started. This is a classic warning sign of a "wounded duck" economy that is likely headed south. The yield curve inversion did not lead to the crisis, but it was a great indicator that the system was fragile and degrading. There were in fact many indicators of fragility, and these were discussed in detail by the above handful of prognosticators all through 2007 and early 2008. So an advisor or investor could have put in place certain defensive measures (puts, long/short strategies, reduced equity positions, large bond and/or large cash positions, etc.) to be prepared somewhat for the unexpected. I know a number of people who did this, but the vast majority did not.
Because the prognosis for the next few months (2016) is somewhat in doubt, the risk of financial loss is increasing. There may be a few "wounded ducks" present as well. This does not mean a black swan event is coming - we can't know that. But conditions are starting to set up for more fragility. For example, yields are negative all the way out to seven years for German bunds, and throughout the EU and in Switzerland as well. Debt has doubled in the emerging economies over just the last seven years, and recently the dollar has risen 20%, putting big pressure on debtors. Student loans in the US have grown by trillions of dollars in just the last seven years, and default rates on them are climbing rapidly. Swap spreads in the US have been negative for a while, which is at least a little strange, and US corporate bond liquidity has been greatly reduced. High yield bond spreads are soaring, and distressed debt ratios are also soaring. Cash holdings at mutual funds in the US are at 15 year lows.
Market breadth has been very narrow for months and the market has been effectively flat for over a year. Profit margins may be in the process of mean reverting, and without buybacks, earnings would have been negative in the third quarter. The CBOE SKEW Index for options has reached all-time highs in recent months. Activity measures such as the ISM manufacturing index indicate a shrinking level of activity, many millions are underemployed, and housing data indicate rising prices in the midst of stagnant wage trends. I believe that taken together, these factors indicate a potentially high degree of fragility in the sense that Taleb uses the term, and that limited defensive measures would be prudent here. I don't know if a black swan is coming (of course), and I'm not predicting one, but I don't like the look of things with regard to the number of "wounded ducks" now in place.
Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.
I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
Additional disclosure: This article is intended to provide information to interested parties. As I have no knowledge of individual investor circumstances, goals, and/or portfolio concentration or diversification, readers are expected to complete their own due diligence before purchasing any stocks or other securities mentioned or recommended.