Coronavirus misinformation on WhatsApp is going viral, despite steps to combat its spread

The messaging service boasts over 2 billion users worldwide.

March 24, 2020, 5:53 PM

LONDON -- As the novel coronavirus pandemic continues to upend modern life as we know it, it is only natural that we look for every source of information to see how best to protect ourselves and our loved ones. Never before has the accuracy and authenticity of that information been so important.

Yet, in the midst of the crisis, misinformation has been spreading virally. On WhatsApp, the social messaging network that boasts over 2 billion users worldwide, the spread of misinformation in many ways mimics how COVID-19 itself moves through societies, from individual to individual, group to group.

Unlike other social networks, though, WhatsApp messages are end-to-end encrypted, making them very difficult to trace and making it hard to tell where a message came from in the first place (only that it was "forwarded," for instance). At the same time, messages are pushed directly to your phone and typically come from a trusted friend or group of friends drawn from your phone contacts. The company said it is working to curb the spread of misinformation during the coronavirus pandemic, in part by partnering with health authorities.

Myths and conspiracy theories

Since being purchased by Facebook for $19 billion in 2014, WhatsApp has cemented itself as the world’s premier mobile messaging app. As of February 2020, it boasts over 2 billion users and has become the primary means of instant messaging outside the U.S., particularly in much of Europe and India, according to Wired.

Despite that scale, it is a “relatively intimate app,” as you need to know someone’s phone number to message them, and it is “largely used for direct contacts in Europe and the US,” according to Stephanie Hankey, the co-founder and executive director of Tactical Tech, an international NGO that explores the impact of technology on society.

“The messages are also encrypted,” she told ABC News. “This means researchers, journalists and WhatsApp itself cannot see the content of messages -- they can see who the users are, who they are sending things to and how often, but they do not know what they are saying. This makes it much more difficult to identify, flag or remove misinformation and to trace its source.”

PHOTO: A woman looks at social networking applications Facebook, Instagram, Snapchat, Whatsapp, Twitter, Messenger and Linkedin on a smartphone in Kuala Lumpur, March 22, 2018.
Manan Vatsyayana/AFP via Getty Images, FILE

Such a reach has made the messaging service integral to how large parts of the world communicate and receive information.

Amid the coronavirus pandemic, WhatsApp appears to have stepped up its efforts to ensure that its users can get verified information. The World Health Organization has launched a Health Alert in partnership with WhatsApp that can reach all of the service’s users with the latest news on the coronavirus, opening an individual conversation in which users can have their questions answered. WhatsApp has also issued extensive guidelines on how to use the service healthily and on how it is bringing people together during the crisis.

A number of other social media companies have partnered with health authorities to combat the spread of misinformation in recent months. "COVID-19 outbreak has seen a massive 'infodemic'—an over-abundance of information—some accurate and some not—that makes it hard for people to find trustworthy sources and reliable guidance when they need it," the WHO told ABC News earlier this month.

However, at times such as these the rumor mill in ordinary society goes into overdrive, and WhatsApp is no exception.

Last week, as rumors that London was going into lockdown reached fever pitch, users on the messaging app shared a picture of military vehicles emblazoned with English flags driving down a highway, apparently toward the capital. The image, seen by ABC News, spread widely, fueling fears that a military shutdown was imminent, even though the U.K. had not yet moved to close restaurants and bars to help prevent the spread of the virus.

However, a cursory inspection of the image showed that the vehicles were driving on the right-hand side of the road, while in the U.K. cars drive on the left. The picture was undated, and there were no other clues as to its origin or authenticity. So why was it so widely shared between users?

“Yes, there is an increase in misinformation at a time of crisis. Misinformation works because it plays into our hopes and fears and confirms our biases,” Hankey said. “What we have seen again with this crisis is that WhatsApp enables the circulation of false information presented as facts that play into our greatest fears.”

But it’s not just images being circulated on the app; “forwarded” written messages are one of the principal ways misinformation spreads between users. One such message, seen by ABC News, claimed to cite a source who had “all their information from a friend [of] theirs who works in the Royal media office” and said the U.K. would be going into lockdown for 15 days as of last Friday. The message claimed to have inside information that the “army are being deployed to London to support.”

This, of course, turned out to be false – but it demonstrates the level of anxiety users are experiencing, and what people will believe in times of national crisis.

"We think the most important step WhatsApp can take is to help connect people directly with public health officials providing crucial updates about coronavirus,” a WhatsApp spokesperson told ABC News. “The WHO announced today that over 10m people have used their WhatsApp Health Alert service since Friday, and we have also launched services in India, Indonesia, Singapore and Israel with more to come. Also compared to SMS, we provide labels to forwards and chain messages to help people know when they have received information that did not come from their immediate friends or loved ones. We also have built advanced machine learning to prevent mass messaging that helps us ban over 2 million accounts per month attempting to abuse the service."

False cures

Conspiracy theories about the coronavirus’s origins are another form of misinformation spreading on the app, but one of the most sinister is the “mass circulation of false cures,” Hankey said.

One such message, seen by ABC News, purports to have been forwarded from an internal email to staff at St George’s Hospital in London. It contains a mixture of true and highly damaging false information: it first says that symptoms of COVID-19, such as a “dry and rough cough,” separate it from the flu or the common cold, but then goes on to offer fake cures and false preventative measures.

PHOTO: An MTA conductor stands in a beam of light at Grand Central Terminal, which is sparsely populated during rush hour due to COVID-19 concerns, in New York City, March 20, 2020.
John Minchillo/AP

These myths include that “hot drinks… should be consumed abundantly during the day,” that people should avoid “drinking iced water,” and that they should keep their throats from getting dry by taking “a sip of water at least every 15 minutes.”

The WHO has published a “Myth Busters” page on its website, which says that fake measures such as these, and other false remedies such as “taking a hot bath,” have no effect on the spread of the virus. The most effective way to combat the spread, according to the WHO, is to wash your hands.

Another message seen by ABC News, claiming to be from an “NHS [National Health Service] friend,” said that ibuprofen had left four young people in Ireland hospitalized in intensive care and that use of the painkiller should be stopped immediately. The hospitalizations were determined to be fabricated.

The claim that ibuprofen could be damaging appears to have originated in theories considered by scientific journals such as The Lancet, but so far the evidence is thin, as ABC News reported last week.

The spread of fake cures online can have potentially disastrous effects, according to Hankey.

"These are equally, if not more, dangerous than conspiracy theories,” she told ABC News. “They work because they play into our hopes yet they make the work of health professionals extremely difficult and could cost lives.”

Going viral

There are two central features of WhatsApp that make it different from other social media networks when it comes to spreading misinformation.

“First, it is delivered straight into your hand and, unless you change the default settings, pushes messages to you. Its immediacy, and the fact that for many people it is mixed with important regular messages from friends, family and colleagues, means it has a more personal and direct form of reach,” according to Hankey.

In addition to the inherent credibility of messages from friends, WhatsApp is exceptional in the way misinformation spreads between users, almost mimicking the viral contagion that is dominating our current lives.

An estimated 90% of messages on the platform are sent one to one, and most groups contain fewer than 10 people, according to WhatsApp, but information still has the capacity to spread incredibly quickly.

“Much like the diagrams people may have seen of how coronavirus spreads -- one person passes it to 2 or 3 others and so on in a chain, that [e]ffect does not take long to continue until thousands have it, if not hundreds of thousands,” according to Hankey.

Among the world leaders urging individuals to stop sharing “unverified info” on WhatsApp is Irish Prime Minister Leo Varadkar, who last week tweeted: “I am urging everyone to please stop sharing unverified info on What's app groups. These messages are scaring and confusing people and causing real damage. Please get your info from official, trusted sources.”

WhatsApp has taken a number of steps in recent years to reduce the spread of misinformation, capping group sizes at 256 users, limiting the number of times a message can be “forwarded” to five, and flagging such messages as “forwarded” in the first place. WhatsApp has also donated $1 million to the International Fact Checking Network to address coronavirus misinformation and is currently working on new features to help users find out more about messages that have been forwarded multiple times.

Yet, with end-to-end encryption to ensure users’ privacy, there is a limit to how much Facebook and WhatsApp can do, according to Charlotte Jee, a journalist at the MIT Technology Review.

“The fact [is] it's end-to-end encrypted, so it's literally impossible for Facebook to moderate it like an open platform,” Jee told ABC News. “They can't see what people are sharing… But we badly need people to build up their defenses against misinformation and hone their ability to spot it."
