The Curious Case Of The Probability Weighting Function: Why Central Banks Hesitate To Make Any Change
Left tail and Right tail distributions
Decision making under uncertainty has been the core subject of Kahneman and Tversky. Their three decades of work formulating Prospect Theory, and later Kahneman's book "Thinking, Fast and Slow," brought new insights into how the mind chooses when information and data point to varying levels of uncertainty and outcomes follow a probabilistic distribution. The distributions themselves are rarely anywhere close to normal: life's events do not arrange themselves in the ordered form a normal distribution requires. Almost all real distributions are skewed, with either a left tail or a right.
Let me take Taleb's example of incomes in a population. It is naturally skewed with a long right tail: those earning a thousand times the average are very few, while the majority sit to the left of the average, which makes the average itself nearly meaningless, since hardly anyone actually earns anything close to it. But how do economists, marketers, government forecasters and data analysts deal with such data? Do they take the population as a full distribution of incomes, or do they pick a 'value of convenience' like the mean, median or mode to make the matter look simple? Can a policy targeted at the average income be of any consequence to the majority of the people, or only to a minuscule minority? Can a product targeted at the average earner find thousands of buyers if the average earner is a minuscule minority (in a town where Bill Gates lives, for example, the average earner would have a theoretical income that no one actually comes close to)?
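The Bill Gates example above can be made concrete with a few lines of Python. The numbers here are entirely hypothetical, chosen only to show how a single extreme earner drags the mean far away from what the typical resident earns, while the median stays put:

```python
# Hypothetical town of 1,001 earners: one extreme outlier among
# ordinary incomes. Illustrates why the mean is a misleading
# 'value of convenience' for a heavily right-skewed distribution.
import statistics

incomes = [40_000] * 600 + [60_000] * 350 + [150_000] * 50
incomes.append(10_000_000_000)  # one Bill-Gates-scale resident

mean = statistics.mean(incomes)
median = statistics.median(incomes)

print(f"mean   = {mean:,.0f}")    # pulled above $10 million by the outlier
print(f"median = {median:,.0f}")  # 40,000: still describes the typical earner
```

A policy or product aimed at the "mean earner" of this town would target an income level that literally nobody in it has.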
Policies are made with a target population in mind, and products are launched with average earners in mind; but when the distribution of a statistic is heavily skewed, how do we frame the correct response for the different segments within it? Take the framing of an insurance policy, where a risk event is plotted by frequency in a population: say we look at the distribution of age at death and use it to price the future. A multiplicity of factors shaped that target population, and shaped the past period from which the data come. For a place like Japan, the distribution would most likely be left-skewed, with deaths clustered at old ages and a tail stretching toward the young; for a conflict zone like Palestine, the mass shifts toward younger ages and the skew runs the other way. Periods in history can likewise have a strong bearing on the outcome. But can we use a prior event and its distribution to predict a future one? We can, using Bayesian conditional probability, where prior probabilities are used to frame posterior probabilities.
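The prior-to-posterior step can be sketched in a few lines. All the numbers below are hypothetical, chosen only to show the mechanics of Bayes' rule: a prior probability of a risk event is revised upward after a warning signal is observed.

```python
# Hypothetical illustration of Bayes' rule: updating the probability
# of a default event after observing a warning signal.
prior = 0.02                     # P(default), before any evidence
p_signal_given_default = 0.90    # P(signal | default)
p_signal_given_ok = 0.10         # P(signal | no default)

# P(signal), by the law of total probability
p_signal = (p_signal_given_default * prior
            + p_signal_given_ok * (1 - prior))

# Posterior: P(default | signal)
posterior = p_signal_given_default * prior / p_signal
print(round(posterior, 3))  # ~0.155: a 2% prior becomes ~15.5%
```

A rare event stays fairly unlikely even after a strong signal, because the false alarms from the much larger "no default" population dominate; this is exactly where unaided intuition tends to go wrong.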
The question that is becoming increasingly important concerns the framework Kahneman and Tversky used in their Prospect Theory, which had four pillars: 1) reference dependence, 2) loss aversion, 3) diminishing sensitivity, and 4) probability weighting. Of these four, the human predilection for avoiding losses being more potent than the pull of making gains is the most often quoted; yet probability weighting, the human tendency to overweight small tail probabilities while underweighting large ones, is a major imponderable in decision making under uncertainty. Kahneman and Tversky's research also showed that "the simplification of prospects can lead the individual to discard events of extremely low probability and to treat events of extremely high probability as if they were certain" ("Prospect Theory: An Analysis of Decision under Risk," Econometrica, 1979, pp. 263-291).
Let me try to elaborate on this in detail.
Probability weighting
Let us take the classic Russian roulette game with six players, a single bullet, and no re-spin between pulls. The probability of the first player surviving is 5/6. The probability of the second player surviving is the probability of the first player dying OR both of them surviving, which equals 1/6 + (5/6)(4/5) = 5/6 again. The probability of the third player surviving is the probability of the first player dying, plus the second player dying, plus all three surviving: 1/6 + 1/6 + 3/6 = 5/6. One can add these because the events are mutually exclusive. So, judged purely on the probability of survival, the choice between being the first player or the second should make no difference. But would one like to be the first player? That is the question. In a risk event like Russian roulette, one is tempted to think differently about the odds of dying than the probability estimates suggest, creating a probability distortion in the mind. Even an equally likely event can be assigned a different probability based on what precedes and what follows it. The reason is the probability weighting function, which Kahneman and Tversky found reflects the human mind's intrinsic tendency toward overweighting small probabilities, underweighting large probabilities, and subcertainty (i.e., the sum of the weights for complementary probabilities is less than one).
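The case-by-case sum above can be checked mechanically for every seat at the table. This is a minimal sketch using exact fractions, reproducing the text's decomposition (earlier player dies, or everyone so far survives):

```python
# Verify that every position in six-player, single-bullet Russian
# roulette (no re-spin) has the same survival probability, 5/6,
# using the mutually exclusive cases summed in the text.
from fractions import Fraction

def survival_probability(position, chambers=6):
    """Exact survival probability for the player at a 1-indexed position."""
    # Case 1: the bullet was in one of the earlier players' chambers,
    # so this player never faces a live round.
    p = Fraction(position - 1, chambers)
    # Case 2: everyone up to and including this player survives;
    # each successive pull faces one fewer chamber.
    survive_all = Fraction(1)
    for k in range(position):
        survive_all *= Fraction(chambers - k - 1, chambers - k)
    return p + survive_all

for pos in range(1, 7):
    print(pos, survival_probability(pos))  # every position prints 5/6
```

The telescoping product in case 2 collapses to (6 - position)/6, so the two cases always sum to 5/6; the formal symmetry is exact, which is precisely why the felt asymmetry of going first is a distortion.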
Kahneman and Tversky also noted that the probability weighting function may not be 'well behaved' near the endpoints 0 and 1. This matters most where gains and losses are not evenly distributed, or where the mind perceives aversion to losses as more important than making gains.
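The three properties cited above (overweighting of small probabilities, underweighting of large ones, and subcertainty) can be seen directly in the one-parameter weighting function Tversky and Kahneman proposed in their 1992 cumulative prospect theory paper, w(p) = p^g / (p^g + (1-p)^g)^(1/g), with g roughly 0.61 as their median estimate for gains:

```python
# Tversky & Kahneman's (1992) probability weighting function.
# g < 1 bends the curve into the familiar inverse-S shape.
def w(p, g=0.61):
    return p**g / (p**g + (1 - p)**g) ** (1 / g)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p={p:.2f}  w(p)={w(p):.3f}  w(p)+w(1-p)={w(p) + w(1 - p):.3f}")
# Small p: w(p) > p (overweighted). Large p: w(p) < p (underweighted).
# And w(p) + w(1-p) < 1 throughout: subcertainty.
```

Note also how steeply the curve moves near 0 and 1 relative to the flat middle, which is the 'not well behaved near the endpoints' feature the authors flagged.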
Let us examine a few real-life cases. Just before Lehman Brothers collapsed, the flow of credit was at its highest level. If we denote it by w, varying from 0 to 1, then w was very close to 1 just before the crisis struck:
Before Lehman Brothers collapsed: w ∈ (0,1) → 1
When Lehman Brothers collapsed and the financial world went through a shock, this same w for the flow of credit moved to the opposite extreme and tended toward 0:
After Lehman Brothers collapsed: w ∈ (0,1) → 0
How did such a dramatic change happen? How did the market swing from one extreme to the other in so short a span? It was mainly probability weights: the weights moved into a nonlinear distribution, with every bank manager raising his estimate of default from near zero to close to one. The first knee-jerk reaction was to cover the potential loss, which drained an enormous amount of liquidity from the market and stalled all interbank lending.
Thus a low-tail-risk event of default tempted a banker as solid as Goldman Sachs to approach Warren Buffett to bail it out for $5 billion; Buffett's estimate of the probability weights was just the opposite of what the Goldman Sachs executives had estimated. The same low-tail-risk event was given different probability weights by two sets of people, and Buffett arguably made a huge bonanza from the investment. It is also no longer a matter of conjecture that when interbank lending did restart, it continued to bask in the paleness of the 'crisis' syndrome and never reverted to its long-term mean position. This is one of the fundamental reasons why, in spite of the Federal Reserve's unprecedented monetary loosening, almost a trillion dollars sits parked as commercial banks' excess reserves on the liability side of the Fed balance sheet, making no meaningful contribution to goods and services.
Now let me take the case of a high-tail-risk event: a change in interest rates and its impact on equity markets versus bond markets. A positive delta change in interest rates works almost inversely on bond prices, while for equity markets the result depends on what the general economy would do in such a situation and on how the residual consumption expenditure would affect the performance of companies. Here we are in a conditional probability paradigm, where a posterior event must be assessed with the aid of a prior one. A further complication is that as interest rates harden and bond prices fall, the relative attractiveness of equity versus bonds becomes a dominant factor as well; the two work in tandem to determine the long-run outlook for equity prices. The real problem, however, is that in a dampened economy with a roughly constant money flow, when bonds and equity start to attract a higher proportion of the total, less is left for consumers to actually spend on consumption goods. So we have an unsustainable situation: bond and equity markets cannot forever stay disjointed from the rest of the economy and from consumers' spending habits, and a positive change in interest rates will not move those spending habits in a positive direction either. What does this leave us with? A constant dosage of money supply is needed to make this dysfunctional equation work, and only for a temporary period; it can never be a permanent affair.
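The inverse bond-price relation invoked above can be made concrete with the standard first-order (modified duration) approximation, ΔP/P ≈ -D · Δy. This is a minimal sketch with hypothetical numbers, not a pricing model:

```python
# First-order bond price sensitivity to a yield change:
# percentage price change ~ -(modified duration) * (yield change).
def price_change_pct(modified_duration, rate_change):
    """Approximate fractional price change for a small yield shift."""
    return -modified_duration * rate_change

# A bond with 7 years of modified duration facing a +0.5% (50 bp)
# rise in yields loses roughly 3.5% of its price:
print(price_change_pct(7.0, 0.005))  # -0.035
```

The sign is the whole point for the argument in the text: the bond side of the reaction to a rate change is nearly mechanical, while the equity and consumption sides depend on the conditional, weighted judgments that follow.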
But how does probability weighting enter here? Bond holders assess the interest rate shift and its timing with certain probability weights, which may or may not coincide with what equity holders are assessing, while general consumers, given the state of the economy, make their own judgment on the timing and quantum of the interest rate change. The Federal Reserve's forward guidance creates some room for these judgments to be influenced. The resultant is what we get as the 'reaction', metamorphosed into 'jobs created', 'income levels', or, at a very aggregate level, GDP itself; for each individual sector, the weights could be quite different.
Weighting the individual sectors, consumers and corporates, along with the imponderables that determine shifts in strategy, is a complex matter; even to put all the factors at play into a model would require, above all, nonlinear complex mathematics. It therefore suffices to say that when central banks try to simplify the impact of any change, what they actually intend is to make no change at all; it is the illusion of change that gyrates in prices as bets and counter-bets challenge the paradigm.
Procyon Mukherjee, 13th August, 2013