Why millions don't trust the election results, despite no evidence of widespread fraud: Experts

Experts weighed in on why we believe disinformation, despite a lack of evidence.

November 22, 2020, 12:04 PM

Since Election Day, numerous false allegations of fraud have been published on social media and repeated elsewhere, most of which have been easily debunked, yet a large swath of the population still appears to believe them.

According to a recent poll, roughly three-quarters (77%) of Trump backers say former Vice President Joe Biden’s election win was due to fraud, despite there being no evidence to back this up.

So what is it about the human psyche that makes us so susceptible to disinformation?

“The short answer is that it has less to do with the content of the information and more to do with the social identity of the person,” Dannagal Young, political psychologist and associate professor at the University of Delaware, told ABC News. “What's driving some of these inclinations is about who these people feel they are, what groups they're associated with, who they identify as and who they identify with."

Exploiting divisions

As of Friday, Biden had nearly 80 million votes -- some 10 million more than the previous record set by Barack Obama in 2008. But President Trump also eclipsed Obama's record with nearly 74 million votes at last count. The record turnout was propelled in part by record mail-in voting due to the pandemic, which Trump and his allies have claimed, without evidence, is ripe for fraud.

Also fueling doubt is the fact that Trump appeared to lead in several key states on election night, only to see those leads reversed when mail-in ballots were counted. Trump has also been relentlessly attacking the ballot-counting procedures in several key states since Election Day.

Young argued that the political parties in the United States have become increasingly correlated with two distinct cultures defined by religious identity, racial identity and geographic location. As a result, it’s easier to create a false story that taps into those identities, making one side or the other more likely to believe it.

Add onto the political environment the fact that we’re living through a pandemic, when people are extremely anxious and uncertain about the future, and you have a perfect storm of conditions to sow disinformation, the experts said.

“If you just feel like things are out of control, that can be really debilitating. So people want to impose order on the world," said Young. So if someone offers a wild theory, even though it might not be logical, you’re more likely to believe it because it helps explain your situation and gives you a sense of control.

Accompanying the deepening divisions in the U.S. are anger and distrust of the other side.

Young said that if you can create a target and turn that fear into anger, that will give an extra incentive for someone to believe you. “It seems counterintuitive, but anger makes people feel optimistic because anger has a forward driving momentum."

'Cognitive misers'

Dr. David Rand, a cognitive scientist at M.I.T., acknowledged that people are more likely to believe something that aligns with how they see the world, but argued that there’s a much simpler reason why people fall for disinformation. People, he said, are "cognitive misers," meaning the brain will always look for the simplest solution to a problem, and that, especially on social media, people just don’t take the time to analyze the information properly.

“Our work suggests that if you ask people to stop and think about is this true, most people are actually pretty good at telling sort of like fake news from true news,” he told ABC News.

"The platforms, by design, are like built to focus your attention on things other than whether content is accurate or not," he added. First, users are scrolling so fast they don’t have time to engage their brains -- people are thinking about what will get them more likes and retweets, not necessarily whether what they post is true. “It makes you think about, 'How are people going to like this? What's it going to say about me?' Not 'Is it accurate?'"

Where the information is coming from

Where the information comes from also influences how likely you are to believe it. Rand explained that people are more likely to believe information from people that they trust and that they think are reliable. "You can have something that you find really surprising, doesn't fit with your previous beliefs at all," he said.

"But if it's from a source you really trust, then you think, 'OK, I guess I was wrong.' Whereas if it's from a source you think is sketchy, then you're like, more likely the source is wrong than everything I know about the world is wrong," Rand said.

Young also emphasized this point, saying it's particularly dangerous when elites spread disinformation, because readers are less likely to think critically if they believe someone they respect has already done that thinking for them. “This is why the rhetoric of elites like politicians or journalists or people we respect is so powerful because, again, their status serves as a cue," added Young.

As a result of media coverage, people are more aware of disinformation, but nobody seems to think they will ever be duped by it -- what Young calls the "third-person effect." “Everyone is susceptible to disinformation. We think that other people are, but we're not. It's such a logical fallacy because we can't all be right or it wouldn’t be a problem,” Young added.

Of course, if someone believes they’re immune to disinformation, it’s very difficult to change their mind once a false narrative has taken hold.

Research suggests that debunking a falsehood can actually have the opposite effect and help propagate the original claim if not done properly. Young suggests debunking be done using the "truth sandwich" technique, whereby you preface the falsehood with what is true, discuss the false allegation and then reiterate what is true.

Introducing a "speed bump" that forces people to think more about the information they consume -- such as the warning labels platforms like Twitter and Facebook have begun to place on false or misleading posts -- has been proven to lessen the spread of those posts, according to Rand's research.

“There are several papers now showing if you just put a warning on something when people first see it, it makes them less likely to believe it and less likely to share it, regardless of whether it aligns with their ideology or not,” said Rand.