Behavioral Puzzles in Business and Diplomacy

Aug. 3, 2003 -- "Why was I so stupid?" In their already classic book Judgment Under Uncertainty, psychologists Amos Tversky and Daniel Kahneman give at least a partial answer to this perennial question.

They describe some of the myriad ways in which we, subject at times to cognitive illusion, act irrationally. Kahneman received the 2002 Nobel Prize in economics for his work, and Tversky would have shared it had he not died in 1996. Applying many of the Tversky-Kahneman findings to business and the stock market, economist Richard Thaler and others have recently developed the new field of behavioral finance.

It's a burgeoning field with applications that transcend finance and extend to everyday transactions as well as to issues of war and peace. One of its most important insights is that how situations are framed affects our choices in specific ways.

Inconsistent Monetary Choices

Consider a situation in which an imagined benefactor gives $1,000 to everyone in a group and then offers each member the following choice. He promises either to a) give them an additional $500, or b) give them either nothing or an additional $1,000, depending on the outcome of a coin flip. Most people choose the sure additional $500, for a total of $1,500.

Contrast this with the choice made by people in a different group, who face a benefactor who provisionally gives everyone $2,000 and then offers each of them the following choice. He will either a) take back $500, or b) take back either nothing or $1,000, depending on the flip of a coin. In this case, in an attempt to avoid any loss, most people choose to flip the coin. The punchline is that the choices offered to the two groups are identical: a sure $1,500, or a coin flip to determine whether they end up with $1,000 or $2,000.
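To see the equivalence at a glance, here is a minimal Python sketch (the function names and the heads-means-the-better-outcome convention are mine; the dollar amounts come straight from the two scenarios) that computes a participant's final wealth under each framing:

```python
def group_one(choice, flip):
    # Start with $1,000; 'a' adds a sure $500,
    # 'b' adds $1,000 or nothing on a coin flip.
    base = 1000
    if choice == "a":
        return base + 500
    return base + (1000 if flip == "heads" else 0)

def group_two(choice, flip):
    # Start with $2,000; 'a' takes back a sure $500,
    # 'b' takes back nothing or $1,000 on the same coin flip.
    base = 2000
    if choice == "a":
        return base - 500
    return base - (0 if flip == "heads" else 1000)

# Under either framing the outcomes are identical:
# a sure $1,500, or a coin flip between $2,000 and $1,000.
for flip in ("heads", "tails"):
    assert group_one("a", flip) == group_two("a", flip) == 1500
    assert group_one("b", flip) == group_two("b", flip)
print("both framings yield the same final wealth")
```

Measured as final wealth rather than as gains or losses, the two problems are indistinguishable; only the framing differs.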

In another, more complicated study, subjects generally chose to receive $45 with a probability of 20 percent rather than $30 with a probability of 25 percent. This is reasonable, since the average gain in the first case is $9 (20 percent of $45), whereas the average gain in the second is only $7.50 (25 percent of $30).

Now let's frame the question a little differently. With a probability of 75 percent, the subject is eliminated at the first stage and receives nothing. If he or she reaches the second stage, however, there is a choice between receiving $30 for certain or $45 with a probability of 80 percent. This is the same problem as before: a choice between $30 with a probability of 25 percent and $45 with a probability of 20 percent (since 80 percent of 25 percent is 20 percent). Yet in this case the majority of subjects make the opposite choice and opt for the seemingly safer $30 option, influenced apparently by the idea of certainty.
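The arithmetic behind this equivalence fits in a few lines. A sketch (the variable names are mine; the probabilities and payoffs are those given above):

```python
# One-stage framing: expected value of each gamble.
ev_45 = 0.20 * 45   # $9.00
ev_30 = 0.25 * 30   # $7.50

# Two-stage framing: a 75 percent chance of elimination, then
# $30 for certain or $45 with probability 80 percent.
p_stage_two = 1 - 0.75        # 0.25 chance of reaching stage two
p_30 = p_stage_two * 1.00     # 25 percent overall
p_45 = p_stage_two * 0.80     # 20 percent overall

print(f"expected values: $45 gamble = {ev_45:.2f}, $30 gamble = {ev_30:.2f}")
print(f"overall chances: $30 -> {p_30:.0%}, $45 -> {p_45:.0%}")
```

The overall probabilities (25 percent and 20 percent) are exactly those of the one-stage version, so a consistent chooser would pick the $45 gamble in both framings.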

Many other scenarios support the proposition that people are greatly influenced by the way a choice is framed and are considerably more willing to take risks to avoid losses than they are to achieve gains. This may be part of the reason cover-ups tend to be worse than the original scandals.

Over-Confidence, Ignoring Alternatives

Although these matters are tricky and frequently counter-intuitive, people are rarely short of confidence in confronting them. This leads us to another general lesson that cognitive psychology teaches: People are too often certain of their decisions because they fail to look for conflicting evidence, distort the facts and their memories of them, ignore alternative views, and let their own explanatory schemes seduce them.

Regarding the last of these, consider an experiment in which subjects were told of two firemen, one successful and one not.

Half the subjects were told that the successful fireman was a risk-taker and that the unsuccessful one was not. The other half of the subjects were told that the successful one was not a risk-taker and that the unsuccessful one was. Afterward, they were informed that the firemen did not exist and that the experimenters had simply invented them.

Amazingly, they continued to be strongly influenced by whatever explanatory stories they had concocted for themselves. If they had been told that the risk-taking fireman was successful, they thought that prospective firemen should be chosen for their willingness to take risks; if not, then not. If asked to account for the connection between risk-taking or its absence and successful firefighting, the members of each group gave a cogent explanation consistent with the imaginary story originally told them. I leave it to the reader to judge the (ir)relevance of this to people's responses to the war in Iraq.

A famous study by psychologist Peter Wason neatly illustrates how we tend to look only for confirmation of our ideas, seldom for disconfirmation. Wason presented subjects with four cards showing the symbols A, D, 3, and 7 and told them that each card had a number on one side and a letter on the other. He then asked which of the four cards needed to be turned over to test the rule: any card with an A on one side has a 3 on the other. Which cards would you turn over? (The answer is below.)

These are just a few ways in which we systematically fall victim to psychological illusion.

Answer: Most subjects picked the A and 3 cards. The correct answer is the A and 7 cards, the only two that could falsify the rule. Whatever letter sits on the back of the 3 is consistent with the rule, but an A on the back of the 7 would violate it.
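For the mechanically minded, here is a short Python sketch (the encoding of the cards is my own invention) that enumerates the possible hidden faces and confirms which cards could falsify the rule:

```python
# Each visible face, paired with the faces that could be hidden behind it.
# Per the experimenter's setup, letters hide numbers and numbers hide letters.
cards = {
    "A": ["3", "7"],
    "D": ["3", "7"],
    "3": ["A", "D"],
    "7": ["A", "D"],
}

def rule_violated(face, hidden):
    # The rule fails exactly when an A is paired with a number other
    # than 3; here 7 is the only such number.
    pair = {face, hidden}
    return "A" in pair and "7" in pair

# A card is worth turning over iff some possible hidden face
# would violate the rule.
must_turn = [face for face, hiddens in cards.items()
             if any(rule_violated(face, h) for h in hiddens)]
print(must_turn)  # ['A', '7']
```

The 3 card feels relevant because it can confirm the rule, but confirmation is not the point; only the A and 7 cards can refute it.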

Professor of mathematics at Temple University and adjunct professor of journalism at Columbia University, John Allen Paulos is the author of several best-selling books, including Innumeracy and the just-released A Mathematician Plays the Stock Market. His Who’s Counting? column on ABCNEWS.com appears the first weekend of every month.