Could Meta ending fact-checking lead to rise in health misinformation?
Meta said it's moving towards community notes to moderate content.
Meta -- the company that operates Facebook, Instagram, Threads and WhatsApp -- announced on Tuesday it was ending third-party fact-checking.
Some social media policy experts and public health experts are worried that the end of fact-checking could lead to the spread of medical and science misinformation and disinformation. This is especially worrisome as the U.S. is in the throes of respiratory virus season and is fighting the spread of bird flu.
"There's going to be a rise in all kinds of disinformation, misinformation, from health to hate speech and everything in between," Megan Squire, deputy director for data analytics and open-source intelligence at the Southern Poverty Law Center, told ABC News. "[Health] is supposed to be a nonpartisan issue, and … we do see people trying to leverage health [misinformation], in particular, toward a political end, and that's a real shame."
"I'm hopeful, but I'm also concerned that this new structure that all the Meta properties are embarking on, it's just not going to end well," she added.
The social network giant said it was following in the footsteps of X, replacing the program with user-added community notes.
In a press release announcing the change, Joel Kaplan, chief global affairs officer for Meta, said that the choices about what was being fact-checked showed "biases and perspectives."
How fact-checking, community notes work
Meta started fact-checking in December 2016. The program works by having Meta staff identify hoaxes, or by using technology that detects posts likely to contain misinformation. The fact-checkers then conduct their own reporting to review and rate the accuracy of the posts.
If a piece of content is identified as false, it receives a warning label and the content's distribution is reduced so fewer people see it.
The fact-checkers put in place following Donald Trump's 2016 election win proved "too politically biased" and have destroyed "more trust than they've created," Meta CEO Mark Zuckerberg said in a video posted by the company.
By comparison, community notes work by a user adding context to a post that may be misleading. The note is then upvoted or downvoted by other users.
Zeve Sanderson, executive director of the NYU Center for Social Media & Politics, said after the 2016 election, there was immense pressure for social media platforms, including Meta, to commit resources to combating misinformation.
Following the election, most fact-checking targeted political misinformation, according to Sanderson. During the COVID-19 pandemic, the program was expanded to combat medical misinformation, he said.
Sanderson said a lot of claims went unchecked online because Meta did not have enough fact-checkers to review every post. Additionally, he said some people didn't trust fact-checkers.
"There were groups of people online who didn't trust fact checkers, who saw them as biased, often in a liberal direction," he told ABC News. "This crowd-sourced content moderation program … it's going to do different things well and different things poorly. We just don't know how this is actually going to work in practice."
Meta referred ABC News back to its Tuesday announcement in response to a request for comment on plans for its community notes or potential spread of misinformation.
Spread of misinformation during COVID-19
During the COVID-19 pandemic, millions were exposed to a deluge of information including news, research, public health guidance and fact sheets, which the World Health Organization referred to as an "infodemic."
People were also exposed to misinformation and disinformation about what treatments work against COVID-19, how much of a risk the virus poses to children and whether COVID-19 vaccines are effective.
A 2023 KFF survey found that most Americans were not sure if health information they had encountered was true or false.
A report from the U.S. Surgeon General in 2021 found that misinformation led to people rejecting masking and social distancing, using unproven treatment and rejecting COVID-19 vaccines.
Experts told ABC News that members of the general public often do not have enough health literacy to determine whether they should trust information they encounter online or on social media.
Squire said sometimes government agencies do not put out information in an "interesting" format, which may lead people to click on "entertaining" content from misinformation and disinformation peddlers.
"Some of these YouTube videos about health misinformation are a lot more entertaining. Their message just travels faster," she said. "When you're presenting scientific information -- I know this firsthand as a former college professor -- that's a struggle. You have to be pretty talented at it and, a lot of times, where the expertise lies is not necessarily where the most expedient, fun videos are and stuff."
How to combat health misinformation
Meta's change comes as the U.S. faces an increase in bird flu cases and health care providers continue treating patients falling ill with respiratory illnesses.
As of Jan. 8, there have been 66 human cases of bird flu reported in the U.S., according to data from the Centers for Disease Control and Prevention.
It's also flu season. As of the week ending Dec. 28, 2024, there have been at least 5.3 million illnesses, 63,000 hospitalizations and 2,700 deaths from flu so far this season, according to CDC estimates.
Meanwhile, health care professionals have been encouraging Americans to get their flu shot and other vaccines -- including COVID and RSV -- to protect themselves against serious disease.
Experts are worried that, with the change from fact-checking to community notes, misinformation could spread about the effectiveness of vaccines or how serious an illness is.
"I am concerned about the sheer amount of inaccurate information that's out there," Dr. Brian Southwell, a distinguished Fellow at nonprofit research institute RTI International and an adjunct faculty member at Duke University, told ABC News. "That's something that you know ought to bother all of us as we're trying to make good decisions. But there's a lot that could be done, even beyond, you know, the realm of social media to try to improve the information environments that are available for people."
Southwell said one thing that public health experts and federal health agencies can do is to get an idea of the questions that users are going to have about medical topics -- such as bird flu and seasonal flu -- and be ready with information to answer those questions online.
To guard against misinformation, the experts recommended paying attention to where information is coming from -- whether it's a respected source or someone you are unfamiliar with.
"There are various skills that are important, things like lateral reading, where rather than just evaluating the claim, you do research about the source of that claim and what you can find out about them to understand what some of their incentives or track record might be," Sanderson said.
"This is obviously something that, sadly, social media platforms are not designed in order to incentivize this sort of behavior, so the responsibility is thrust on users to sort of look out for themselves," he added.