Facebook and Google made clear on Monday that they will not broker advertisements for websites and mobile apps that the internet giants have determined display fake or misleading news. The announcements follow questions about whether either company played a role in the presidential election.
Both Facebook and Google, however, stopped short of agreeing to prevent the spread or appearance of such content in users’ newsfeeds or in search results on their own platforms.
The announcements come after months of widespread concern over the rapid increase of fake, wrong or misleading news articles on the internet.
Facebook and Google generate money primarily through advertising shown on their own websites and pages. But they also broker and manage advertisements for display on third-party websites and applications. Revenues generated through this arrangement are split between the tech companies and the owners of the websites or applications.
Now, Facebook and Google are clamping down on which third-party sites and apps they will allow to be part of this business model.
Facebook said the move to call out fake news sites was intended to clarify its existing policy.
Facebook said in a statement: “We do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news.”
“While implied, we have updated the policy to explicitly clarify that this applies to fake news,” the company added. “Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance.”
Google said it would now "restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the web property."
“We've been working on an update to our publisher policies and will start prohibiting Google ads from being placed on misrepresentative content, just as we disallow misrepresentation in our ads policies,” Google said in a statement.
In an interview with the BBC released on Tuesday, Google CEO Sundar Pichai was asked about fake news appearing in Google search results. He told the British broadcaster that his company was having “a learning moment” and would “definitely work to fix it.”
He acknowledged that fake news could have affected some people’s votes and possibly the election outcome in some areas, noting the narrow margins by which candidates won in certain places.
As the dissemination and consumption of news have shifted increasingly to online platforms, where the costs and barriers to entry are lower, websites offering fake or misleading news have proliferated, causing consternation among those in politics and the media.
On Monday, BuzzFeed reported that a “renegade” group of Facebook employees had “formed an unofficial task force to question the role their company played in promoting fake news in the lead-up to Donald Trump’s victory in the US election last week.”
Defending his company, Facebook CEO Mark Zuckerberg said on Saturday that “of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes.”
“Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other,” he added. “Identifying the ‘truth’ is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted.”
The tech CEO said that the company had to be careful about injecting itself into the filtering of content.
“An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual,” he said. “I believe we must be extremely cautious about becoming arbiters of truth ourselves.”