Faced With Evidence, We Still Get It Wrong

In decision-making, weak evidence may be worse than no evidence at all.

March 9, 2011 -- So you want your political party to take control of the state legislature, and you've just learned that the largest newspaper in the state has thrown its support behind your favorite candidate. Will that make you more or less likely to believe your party will succeed?

It may seem counterintuitive, but the answer is less likely, according to a new study from Brown University in Providence, R.I.

Why would evidence of growing support for your party make you less optimistic?

Because it's weak evidence, according to cognitive scientist Philip Fernbach, professor of psychology at Brown and lead author of a study in the journal Cognition.

Obviously, one endorsement in a political campaign with many candidates for many seats is not going to swing the entire election in one direction. The endorsement may be positive evidence, but it's not conclusive, and when that's the case, according to the researchers, the evidence may backfire.

Weak Evidence Can Lead People to Bad Decisions, Researchers Say

In five experiments involving participants across the country, people who were given weak evidence in support of a possible cause (more troops should lead to stability in Afghanistan, for example) were less likely to believe the war-torn country would eventually see stable days than people who were given no evidence at all.

Fernbach and two colleagues, Adam Darlow and Steven Sloman, say their research shows that weak evidence can lead someone to a bad decision, because humans tend to focus on whatever they have just learned or experienced.

"It turns out that if you give people some evidence that is positive but weak, then actually they focus too much on that piece of positive evidence" and are less likely to incorporate other facts in their decision, Fernbach said in a telephone interview. In all five experiments, participants were required to make a quick decision, so it was an "intuitive judgment that happens quickly, like when you are shopping."

But weak evidence probably influences long-term decisions as well, he added, and undoubtedly plays a role in everything from shopping to electing a president.

'Weak' Evidence Doesn't Raise Probability Very Much, Researcher Says

All five experiments followed a similar scenario in which one piece of evidence supported a possible conclusion -- for example, the federal government is going to give everyone who buys a hybrid or electric car $250, so one out of five drivers will be driving such a vehicle in 15 years. Some participants were given the evidence and others were not. And in some cases, participants were asked whether the evidence really enhanced the probability of success.

In nearly every case, participants who did not receive the evidence were more optimistic about the outcome than participants who actually saw the evidence.

Evidence is considered weak when it raises the probability of an outcome only slightly. "It doesn't raise the probability very much, but it shouldn't decrease it -- but that's exactly what we found," Fernbach said.
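
For readers who want the arithmetic behind that claim, here is a minimal sketch of the normative point. The numbers are illustrative assumptions, not figures from the study: under standard probability rules, evidence that even weakly favors an outcome should nudge its probability up a little, never down.

```python
# A minimal Bayesian sketch of the normative point. The prior and likelihoods
# below are made-up illustrative numbers, not data from the Brown study.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H | E) from a prior P(H) and the two likelihoods."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

prior = 0.30                           # belief in the outcome with no evidence
weak = posterior(prior, 0.55, 0.50)    # evidence only slightly more likely if the outcome is true

print(f"No evidence:   {prior:.3f}")   # 0.300
print(f"Weak evidence: {weak:.3f}")    # ~0.320 -- a small rise, never a drop
```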

Fernbach specializes in learning why we do the dumb things we do, or as he put it, "trying to understand irrationality in people's judgments and decisions."

Human Judgments Are Intuitive, Based on What We've Just Learned

The findings on the effect of weak evidence may seem counterintuitive, because any positive evidence -- even a weak piece of it -- would seem to boost our confidence in the outcome we hope for. But it doesn't, according to the research.

Of course, we all like to think we are more deliberative than that. But Fernbach argues that many of our judgments are intuitive, based to some degree on what we've just learned.

"This is potentially quite pervasive and important," he said, "judging the potential state of the world given that I know some piece of information. That is just so prevalent in our lives. Am I going to take an umbrella today, it's cloudy? Will my stocks go up? Will Libya have stability if we do a no-fly zone?"

Some of that probably calls for "deliberative thinking," as he put it, but the decision frequently rests on shaky ground. After all, who wants to lug around an umbrella just because it's cloudy? Clouds are weak evidence, so leave the umbrella behind.

And then, the deluge.

There is an upbeat part of this story, however. The scores of participants who took part in the research apparently had no trouble figuring out that the evidence was weak. That's the good news. The bad news is that they usually made the wrong decision.