
How The Markets Ignore Conditional Probability And The Shrinking Sample Space

The tide of daily information comes in spurts, compounded by bloggers, news media, independent analysts, activists and a host of government bodies. Much of this information arrives sequentially, and as it unloads, the mind is often swayed more by the latest item than by earlier ones, even when the latest is less potent. Compounding the problem, good news for the market is plentiful while bad news is scarcer, so the relative weight of good versus bad is left to the availability heuristic, and the creeping nature of bias means judgment calls are made that leave a lot to be desired.

In areas where markets are sensitive to information that could influence business, trade, or positions in stocks, commodity futures or options, the cascade of sequential information has far-reaching consequences if the mind remains oblivious to the sample space that shrinks as new information arrives.

How does the mind react to new information?

Let me give some examples.

The first piece of information is that the NYSE is pointing to a lower opening. The normal reaction of a trader would therefore be to expect a rise during the day. Others might wait for the next set of data before taking a view.

The next piece of information is that the jobs report is expected to show better job additions than last month. Traders who already expected a rise during the day could move to a position where, once the actual report comes out, the timing of the rise seems more certain. Those who waited for more data could infer that a rise by the end of the session is now confirmed.

Amid the myriad positions the two groups could take on this sequential information, had one applied conditional probabilities, the problem would have looked as follows:

Probability of a lower opening: P(A)

Probability of the jobs report showing better job numbers: P(B), and probability of the report showing fewer job numbers: P(B'). Let us assume P(B) = 0.9 and P(B') = 0.1.

Probability of a lower opening (A) given that B has occurred: P(A/B); let us assume P(A/B) = 0.8 and P(A/B') = 0.9.

By the law of total probability: P(A) = P(A/B)*P(B) + P(A/B')*P(B') = 0.8*0.9 + 0.9*0.1 = 0.81
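The arithmetic above can be checked with a few lines of Python, using the same assumed values:

```python
# Law of total probability for the lower-opening example.
p_b, p_b_prime = 0.9, 0.1     # better vs. weaker jobs report
p_a_given_b = 0.8             # lower opening given a better report
p_a_given_b_prime = 0.9       # lower opening given a weaker report

p_a = p_a_given_b * p_b + p_a_given_b_prime * p_b_prime
print(round(p_a, 2))          # 0.81
```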

This problem gets compounded many times over as a series of information pours in during the day and the opening stock value becomes a dynamic statistic.

Does the human mind actually evaluate the impact of a shrinking sample space as new information arrives, making each evaluation an intuitive judgment grounded in Bayesian conditional probability?

Humans have a typical weakness in understanding conditional probabilities. To frame a typical example, let me take the Blue Cab / Yellow Cab problem:

A city has 80% Yellow Cabs and 20% Blue Cabs. Given that a Yellow Cab has a 25% chance of being involved in an accident and a Blue Cab a 50% chance, what is the probability that a cab that met with an accident is Yellow?

Using Bayes' theorem, the answer is a 66.7% chance that the cab is Yellow and a 33.3% chance that it is Blue. But our intuitive mind rarely gets close to this answer, simply because it is not attuned to the shrinking sample size. Without invoking Bayesian conditional probability, we can draw the following table to explain the solution:


Let us assume there are 100 cabs in the city. The counts then fill in as:

              Rash Driver         Normal Driver
              (Accident Prone)    (Accident Free)    Total
Yellow Cab    20                  60                 80
Blue Cab      10                  10                 20
Total         30                  70                 100

Out of the 100 cabs, only 30 are accident-prone, and 20 of those are Yellow. So within the shrunken sample of 30 accident-prone cabs, 20 are Yellow, making the probability 20/30, or 66.7%.

This problem of a shrinking sample size as additional information arrives is a problem of information rejection by the lazier part of the brain, which discards the new information and resorts to the availability heuristic for a quick answer.

What is the probability that a Blue Cab is accident-free? It is 10/70, since the sample size squeezes to the 70 accident-free cabs, of which ten are Blue.
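The cab arithmetic can be verified with a short Python sketch using the counts derived above:

```python
# Count-based view of the cab problem: 100 cabs, 80 Yellow, 20 Blue.
yellow, blue = 80, 20
yellow_rash = yellow * 0.25                    # 20 accident-prone Yellow cabs
blue_rash = blue * 0.50                        # 10 accident-prone Blue cabs

rash_total = yellow_rash + blue_rash           # sample space shrinks to 30
p_yellow_given_accident = yellow_rash / rash_total

normal_total = (yellow - yellow_rash) + (blue - blue_rash)   # 70 accident-free
p_blue_given_accident_free = (blue - blue_rash) / normal_total

print(round(p_yellow_given_accident, 3))       # 0.667
print(round(p_blue_given_accident_free, 3))    # 0.143
```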

New information has the potential of squeezing the sample size, while our mind is anchored to the original sample size. Let me take a more mundane example.

In a city with a predominantly male population (60%), the unemployment rate is 10% for males and 20% for females. If a person is found to be unemployed, what is the probability that the person is female?

Taking a population of 100, the distribution becomes:

            Unemployed    Employed    Total
Male        6             54          60
Female      8             32          40
Total       14            86          100

The answer to this question is 8/14, since the sample size shrinks to the 14 unemployed and 8 of them are female.

What is the probability that an employed person is female? Here the sample size shrinks to 86, making the answer 32 out of 86, or 37%. This is intuitively difficult to arrive at.
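The same shrinking-sample arithmetic in Python, with the population figures given above:

```python
# 100 residents: 60 male, 40 female; unemployment 10% (male), 20% (female).
males, females = 60, 40
unemployed_m = males * 0.10          # 6 unemployed males
unemployed_f = females * 0.20        # 8 unemployed females

unemployed_total = unemployed_m + unemployed_f           # sample shrinks to 14
p_female_given_unemployed = unemployed_f / unemployed_total          # 8/14

employed_total = (males + females) - unemployed_total    # sample shrinks to 86
p_female_given_employed = (females - unemployed_f) / employed_total  # 32/86

print(round(p_female_given_unemployed, 3))
print(round(p_female_given_employed, 3))
```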

Prior and posterior Probability Example

Let us assume that a test for a virus has a 98% chance of showing positive when someone who actually has the virus is tested, a 97% chance of showing negative for a person who does not have the virus, and that the incidence of the virus is 0.4% in a given population (meaning that in a population of 1,000, 4 persons are expected to have contracted the virus). What is the probability that a person who tests positive actually has the virus?

P(positive/virus) = 0.98

P(positive/no virus) = 1 - 0.97 = 0.03

P(Virus) = 0.004

P(Positive) = P(positive/virus)*P(virus) + P(positive/no virus)*P(no virus)

= 0.98*0.004 + 0.03*0.996 = 0.0338

Applying Bayes' theorem:

P(Virus/positive) = P(positive/virus)*P(virus) / P(Positive) = 0.00392/0.0338 ≈ 0.116

So the probability that a person who tests positive actually has the virus is only about 11.6%, far below the 98% accuracy of the test.
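Carrying the Bayes computation through in Python, with the values stated above, the posterior works out to about 11.6%:

```python
# Bayes' theorem for the virus-test example.
p_pos_given_virus = 0.98       # chance a carrier tests positive
p_neg_given_no_virus = 0.97    # chance a non-carrier tests negative
p_virus = 0.004                # incidence in the population

p_pos_given_no_virus = 1 - p_neg_given_no_virus
p_pos = p_pos_given_virus * p_virus + p_pos_given_no_virus * (1 - p_virus)
p_virus_given_pos = p_pos_given_virus * p_virus / p_pos

print(round(p_pos, 4))              # 0.0338
print(round(p_virus_given_pos, 3))  # 0.116
```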

This result is intuitively out of reach for the following reason:

The very low incidence means that only 4 persons in a population of 1,000 are expected to have the virus. Of those 4, nearly all (about 3.9) will test positive; but of the remaining 996 healthy persons, about 30 will also test positive, falsely. A positive result is therefore far more likely to have come from the large healthy group than from the tiny infected one. The human mind does not weigh these relative scales; it anchors on the test's 98% accuracy via the availability heuristic and substitutes it for the answer to the question 'what is the probability that a person who tests positive actually has the virus?'. The actual answer, about 11.6%, is dramatically lower than the test's accuracy, though still well above the 0.4% incidence.

The rising complexity with new information and the sequential nature of judgment calls

As information is sequentially received by different constituencies, the availability heuristic starts to act together with creeping bias, because the complexity cannot be tackled by the human mind. A sequence of heuristic judgments leads to an overall denouement that is at best a biased judgment resting on a single heuristic, one that ignores the information sequence and amounts to a chance judgment call mixed with experience or wisdom.

The exact opposite of this is an algorithmic approach that assigns a probability to each new event and applies conditional probabilities, computing the overall probability with Bayesian statistics.
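A minimal sketch of such an algorithmic approach is sequential Bayesian updating, where the posterior after each piece of information becomes the prior for the next. The likelihood values below (0.8/0.4 for the lower opening, 0.7/0.3 for the jobs report) are illustrative assumptions, not figures from the text:

```python
# Sequential Bayesian updating: posterior after each event becomes
# the prior for the next event.
def update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    num = p_e_given_h * prior
    return num / (num + p_e_given_not_h * (1 - prior))

p_rise = 0.5            # initial prior that the market rises by the close
events = [(0.8, 0.4),   # lower opening: likelihoods under rise / no-rise
          (0.7, 0.3)]   # stronger jobs report: likelihoods under rise / no-rise
for lik_h, lik_not_h in events:
    p_rise = update(p_rise, lik_h, lik_not_h)

print(round(p_rise, 3))  # 0.824
```

Each pass through the loop shrinks the effective sample space, exactly as in the cab and virus examples, rather than letting the latest headline overwrite everything before it.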

The real world acts on neither principle; we rarely see complete reliance on heuristics, nor complete dependence on computation. But drifting between the two extremes without method is even worse.

It is better to be aware that, faced with a string of information, one must be careful to assign a probability to each chance event and keep the mind tuned to the shrinking sample space as new information arrives. Discerning the good news from the bad is not difficult; the problem without a solution is the inherent bias in weighing the good against the bad, since the relative importance assigned to one could have far-reaching implications for the overall assessment.

A similar situation prevails when the annual planning exercise is done in a corporation. The bottom-up approach uses certain weights in arriving at the overall picture, although the working is more deterministic than probabilistic. When the top-down approach is used, with broad objectives such as sales growth or profit growth as pre-determined numbers, the overall planning exercise is mired in what has been described as the planning fallacy.

When the actual spate of events starts to unfurl during the year and the unknowns are replaced by new information, the planning errors need course correction as the year progresses. The problem is that course correction cannot be done with a deterministic mindset; the new information must be used with a conditional-probability mindset, which means the original planning model needs to be transformed into a more probabilistic model.

The reason this is difficult to achieve is embedded in the way markets behave. Markets are not comfortable with probabilistic models; when instant gratification is the currency, there is no room for a probabilistic objective function. It is almost like saying, 'the company has a 95% chance of success given that inflation stays below 2% and unemployment below 8%.' The market will turn a deaf ear to such statements, probably to its own peril.

Procyon Mukherjee

1st April 2013

Disclosure: I have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.