Facebook exec says company will make itself 'more transparent'
Top Facebook executive Nick Clegg appeared on ABC's "This Week" on Sunday.
Facebook will implement new tools to increase transparency and safety for users following Tuesday's explosive whistleblower hearing, Facebook Vice President of Global Affairs Nick Clegg said Sunday.
"We will, of course, seek to make ourselves ever more transparent so people can hold us to account," Clegg told ABC “This Week” anchor George Stephanopoulos.
“We understand that with success comes responsibility, comes criticism, comes scrutiny, comes responsibility, and that's why we're the first Silicon Valley company to set up an independent oversight board that independently adjudicates on these difficult content decisions,” Clegg added.
Facebook whistleblower and former employee Frances Haugen testified before a Senate subcommittee last week, accusing the social media giant of ignoring evidence that its content is harmful to young users and dangerous to democracy.
“I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolved these conflicts in favor of its own profits,” Haugen told senators. “The result has been more division, more harm, more lies, more threats and more combat.”
Facebook CEO Mark Zuckerberg denied Haugen’s claims in a statement following her scathing testimony. “At the most basic level, I think most of us just don't recognize the false picture of the company that is being painted,” Zuckerberg said.
Clegg said Facebook is working on giving users more control, acknowledging that they want to see “more friends, less politics.”
During Tuesday's hearing, Sen. Richard Blumenthal, D-Conn., said Facebook is “facing a Big Tobacco moment, a moment of reckoning.”
Clegg called the comparison “extremely misleading.”
“We can move on beyond the slogans, the soundbites, the simplistic caricatures and actually look at solutions and, of course, regulations,” Clegg said.
Stephanopoulos also pressed Clegg on Facebook’s efforts to create a safer environment for kids and teens.
“You also say Facebook's job is to mitigate the harm and amplify the good on social media. But even researchers and critics say you can be devoting more resources to positive interventions for teens. Is Facebook prepared to do more on that?” Stephanopoulos asked.
“Yes, we are,” Clegg responded. “We're now going to not only provide those new parental tools but we're going to introduce new measures ... [if our] systems see that a teen is dwelling on content that may be correlated with something that's not good for their well-being, we would nudge them to look at other content.”
He went on, “We're also going to introduce new tools, what we call 'take a break,' to really kind of urge teens to take a break from using Instagram if they appear to be doing so, you know, for long periods of time.”
Clegg defended Facebook’s algorithm, which ranks the content users see first and was a target of Haugen’s testimony. Haugen claimed the algorithm amplifies misinformation and incites violence.
Stephanopoulos asked why the company does not just remove the algorithm altogether and display content chronologically. Clegg said that would make things worse.
“If you were just to sort of across the board remove the algorithm, the first thing that would happen is that people would see more, not less, hate speech, more, not less, misinformation, more, not less, harmful content,” Clegg said. “Why? Because those algorithmic systems precisely are designed like a great sort of giant spam filter to identify and deprecate and downgrade bad content.”