Nice paper by Glaeser and Sunstein attacking the glib and simplistic Surowiecki ‘Wisdom of Crowds’ idea.
“[W]e suggest that social learning is often best characterized by what we call Credulous Bayesianism. Unlike perfect Bayesians, Credulous Bayesians treat offered opinions as unbiased and independent and fail to adjust for the information sources and incentives of the opinions that they hear. There are four problems here. First, Credulous Bayesians will not adequately correct for the common sources of their neighbors’ opinions, even though common sources ensure that those opinions add little new information. Second, Credulous Bayesians will not adequately correct for the fact that their correspondents may not be a random sample of the population as a whole, even though a non-random sample may have significant biases. Third, Credulous Bayesians will not adequately correct for any tendency that individuals might have to skew their statements towards an expected social norm, even though peer pressure might be affecting public statements of view. Fourth, Credulous Bayesians will not fully compensate for the incentives that will cause some speakers to mislead, even though some speakers will offer biased statements in order to persuade people to engage in action that promotes the speakers’ interests…

In Section V of the paper, we assume that errors in private signals are correlated across individuals. Credulous Bayesians overestimate the extent to which these signals are independent. The first proposition of the paper shows that when individuals are Credulous Bayesians, their post-deliberation beliefs become more erroneous and they acquire more misplaced confidence in those erroneous beliefs. This proposition helps explain why socially formed beliefs, like those about religion, politics, and constitutional law (and sometimes science as well), can be quite strongly held, despite a lack of evidence and an abundance of other groups holding opposing beliefs.

Our second proposition shows that when individuals are Credulous Bayesians, accuracy may decline as group size increases. As group size increases, mistakes can become more numerous and more serious. After all, the essence of Credulous Bayesianism is that people misuse the information of their neighbors, so more neighbors means more errors. This finding suggests that in some settings individuals may be wiser as well as less extreme than crowds (compare Surowiecki, 2005; Page, 2006)…

A large body of research has discussed the human tendency to give statements that conform to an expected community norm. For group deliberation, the problem is that people may discount this tendency and think, wrongly, that public statements actually convey information. In Section VI, we model conformism by assuming that individuals’ statements reflect a combination of private information and an expectation of what individuals think that the group wants to hear. Credulous Bayesians fail fully to adjust for the fact that statements are skewed to the norm. The combination of conformism and Credulous Bayesianism creates error, tight homogeneity within groups, and greater heterogeneity across groups. If people utter politically correct statements, with the aim of avoiding the wrath of others, then Credulous Bayesianism could help explain both the blue state/red state phenomenon of ideological homogeneity within areas and heterogeneity across areas (Glaeser and Ward, 2006)…”
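To see the Section V mechanism concretely, here is a minimal simulation sketch. The setup is my own toy version, not the paper’s model: a normal prior on the true state, signals that all share one common error term, and a “credulous” listener who treats them as independent. All the parameter values are arbitrary.

```python
# Toy illustration (not the paper's model): n people each see a signal
# x_i = theta + c + e_i, where c is an error common to the whole group
# (a shared source) and e_i is idiosyncratic noise. A Credulous Bayesian
# treats the signals as independent; a sophisticated Bayesian knows they share c.
import numpy as np

rng = np.random.default_rng(0)
sigma0, sigma_c, sigma_e = 1.0, 2.0, 1.0   # prior sd, common-error sd, idiosyncratic sd
trials = 20_000

for n in (1, 3, 10, 30, 100):
    theta = rng.normal(0, sigma0, trials)                # true state, one per trial
    c = rng.normal(0, sigma_c, trials)                   # error shared by all n signals
    xbar = theta + c + rng.normal(0, sigma_e / np.sqrt(n), trials)  # mean of the n signals

    # Credulous Bayesian: treats the n signals as independent draws with
    # variance sigma_c^2 + sigma_e^2, so the weight on the group mean -> 1
    # and the self-assessed posterior variance -> 0 as n grows.
    s2 = sigma_c**2 + sigma_e**2
    w_cred = n * sigma0**2 / (s2 + n * sigma0**2)
    believed_var = 1.0 / (1.0 / sigma0**2 + n / s2)
    est_cred = w_cred * xbar

    # Sophisticated Bayesian: knows the group mean has variance
    # sigma_c^2 + sigma_e^2 / n around theta, so extra signals help only a little.
    v_bar = sigma_c**2 + sigma_e**2 / n
    w_soph = sigma0**2 / (sigma0**2 + v_bar)
    est_soph = w_soph * xbar

    print(f"n={n:3d}  credulous MSE={np.mean((est_cred - theta)**2):.2f} "
          f"(thinks its variance is {believed_var:.2f})   "
          f"sophisticated MSE={np.mean((est_soph - theta)**2):.2f}")
```

With these arbitrary numbers the credulous estimator’s mean squared error climbs from roughly 0.8 with one signal toward the common-error variance of 4 as the group grows, even as its self-assessed posterior variance shrinks toward zero; the sophisticated estimator’s error only drifts down toward a floor of about 0.8, because the common error never averages out. That is the “more erroneous and more confident” pattern of the first two propositions.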
“In Section VII, we assume that some individuals, like legal advocates or politicians, have incentives to report misleading information in their quest to change people’s decisions. “Polarization entrepreneurs,” in law and politics, might attempt to do exactly that. This claim is in a similar spirit to Mullainathan, Schwartzstein and Shleifer (2007), who examine the interaction between persuasion and categorical thinking. In this case, Credulous Bayesians fail fully to correct for the motives of those around them. The combination of incentive-created misstatements and Credulous Bayesianism always leads to less accurate assessment and can lead to bias as well. The degree of bias depends on the imbalance of resources or incentives across persuaders, not the persuasion per se.”
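And a similarly toy sketch of the Section VII point, again my own illustration rather than the authors’ model: honest speakers report the truth plus noise, “advocates” simply assert a fixed preferred position regardless of the truth, and a credulous listener averages every statement at face value. The names (`target`, `n_honest`) and parameter values are made up.

```python
# Toy illustration (not the paper's model): statements come from honest
# speakers (truth + noise) and from advocates who assert a fixed preferred
# value. A credulous listener averages everything; a discounting listener
# ignores the advocates.
import numpy as np

rng = np.random.default_rng(1)
sigma0, sigma_e, target = 1.0, 1.0, 3.0   # prior sd, honest noise sd, advocates' preferred value
n_honest, trials = 20, 50_000

def simulate(k_pro, k_con):
    theta = rng.normal(0, sigma0, trials)                               # true state
    honest = theta[:, None] + rng.normal(0, sigma_e, (trials, n_honest))
    pro = np.full((trials, k_pro), +target)    # advocates pushing one fixed position
    con = np.full((trials, k_con), -target)    # advocates pushing the opposite position
    statements = np.concatenate([honest, pro, con], axis=1)
    credulous = statements.mean(axis=1)        # face-value averaging of every statement
    discounting = honest.mean(axis=1)          # listener who recognizes and ignores advocates
    return theta, credulous, discounting

for k_pro, k_con in [(0, 0), (5, 5), (10, 0)]:
    theta, cred, disc = simulate(k_pro, k_con)
    print(f"pro={k_pro:2d} con={k_con:2d}  "
          f"credulous bias={np.mean(cred - theta):+.2f}, MSE={np.mean((cred - theta)**2):.3f}  "
          f"| discounting MSE={np.mean((disc - theta)**2):.3f}")
```

In this toy, balanced advocates (5 on each side) leave the credulous estimate unbiased but still make it noticeably less accurate than ignoring them, while a one-sided lobby of 10 shifts it by about their share of the conversation times their asserted position. That matches the quoted claim: persuasion always hurts accuracy, but the bias comes from the imbalance of persuaders, not from persuasion per se.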
http://www.economics.harvard.edu/faculty/glaeser/files/socproof19.pdf