An election is coming up, and pundits, commentators and assorted pooh-bahs are full of predictions.
"Expert Political Judgment," a recent book by University of California psychologist Philip Tetlock, suggests that we shouldn't be too impressed by these predictions. With some exceptions, they're not much better than those generated by a good pair of dice.
Tetlock convinced almost 300 experts to make quantifiable predictions about events that might or might not come to pass. The participants, all of them professionals paid for "commenting or offering advice on political and economic trends," made more than 80,000 predictions over the course of the study.
Although so frequently wrong, these unnamed experts, much like supermarket tabloid psychics, are rarely held accountable -- and, when they occasionally are, they offer the usual litany of excuses for their mistakes:
They were right in principle, but their timing was off.
A wholly improbable event caused their reasoned and reasonable prediction to fail.
If only their failed predictions were interpreted correctly, they would have been seen to be correct.
If some weasel word were taken one way rather than another, they'd be right.
Their prediction was wrong, but it nevertheless demonstrated a correct analysis.
Some of their predictions are only slightly disguised versions of empty forecasts, like "Things to Stay the Same Until They Change" or "Things to Change Eventually."
Many of the experts' mistakes result from common psychological foibles that afflict most people. One is the so-called Linda problem. A woman, Linda, is described as very bright, single, outspoken, a philosophy major and an anti-nuclear activist. Given this background, you are asked to say which of the following descriptions of her is more likely: (a) she's a bank teller, or (b) she's a bank teller who is an active feminist.
Most people will choose "b," because the feminist characterization fits comfortably with her background. Nevertheless, "a" is the more likely description, because for it to be true, only one condition needs to be met -- that she's a bank teller. For "b" to be true, however, two conditions must be met -- that she's a bank teller and that she's an active feminist. Satisfying two conditions can never be more likely than satisfying just one of them.
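The conjunction rule at work here can be checked numerically. In the sketch below the probabilities are purely illustrative assumptions (no one knows the "real" chance that someone with Linda's background becomes a bank teller); the point is only that, whatever numbers you pick, the two-condition description can never come out more frequent than the one-condition description:

```python
import random

random.seed(0)

# Illustrative, made-up probabilities: suppose 5% of women with
# Linda's background become bank tellers, and 80% of those tellers
# are also active feminists.
p_teller = 0.05
p_feminist_given_teller = 0.80

trials = 100_000
teller = 0                # count of "bank teller" (description a)
teller_and_feminist = 0   # count of "teller AND feminist" (description b)

for _ in range(trials):
    is_teller = random.random() < p_teller
    # She can only satisfy (b) if she already satisfies (a).
    is_both = is_teller and random.random() < p_feminist_given_teller
    teller += is_teller
    teller_and_feminist += is_both

# P(teller and feminist) = P(teller) * P(feminist | teller) <= P(teller),
# so the conjunction can never be more frequent than the single condition.
print(teller_and_feminist <= teller)  # True
```

Changing the assumed probabilities changes the counts, but the final comparison always prints True: multiplying a probability by another factor between 0 and 1 can only shrink it.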
In generating predictions, experts, precisely because they know so much or hold such strong preconceptions, tend to weave more elaborate scenarios than others do. Because of the mistaken intuition underlying the Linda problem, these elaborate scenarios can seem more plausible even though they are often less likely.
This tendency to be swayed by extraneous facts and unexamined preconceptions is just one common failing. Experts, like commentators, politicians and everybody else, often fall victim to confirmation bias. This is the tendency, once you've made even a tentative decision, to avidly search for evidence confirming the wisdom of your decision and largely ignore evidence disconfirming it.
Garden-variety instances of confirmation bias occur in areas as disparate as the stock market, personal relations and legal conflicts. A more extreme case is the prosecution of the Iraq War, a textbook example of, among other things, an almost delusional myopia and of the staggeringly grave consequences to which such myopia can lead.