Key takeaways from Facebook whistleblower Frances Haugen's Senate testimony
Facebook has attempted to discredit Haugen's knowledge of the company.
A former Facebook employee turned whistleblower testified before a Senate Commerce subcommittee on Tuesday -- alleging that Facebook executives showed blatant disregard after learning their platforms could have harmful effects on democracy and on the mental health of children.
"Facebook has not earned our blind faith," said former Facebook product manager turned whistleblower Frances Haugen in her opening statement before lawmakers. "There is a pattern of behavior that I saw [at] Facebook: Facebook choosing to prioritize its profits over people."
"You can declare moral bankruptcy, and we can figure out a fix [to] these things together because we solve problems together," Haugen said.
Minutes after her testimony, Facebook issued a statement attempting to discredit Haugen, stating that she worked for the company "for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question."
Although senators from both parties appeared to support her calls to regulate Facebook, how and when that might happen was unclear.
Here are some key takeaways:
Lawmaker: Facebook facing its 'Big Tobacco moment' in targeting of children
One of the matters in question: Haugen described how, she said, Facebook and Instagram target children as potential users.
"Facebook understands that if they want to continue to grow they have to find new users. They have to make sure that the next generation is just as engaged with Instagram as the current one, and the way they'll do that, making sure children establish habits before they have good self-regulation," Haugen said.
"They know that children bring their parents online -- so they understand the value of younger users for the long-term success of Facebook," she added.
With several comparisons to the tobacco industry, much of Haugen's testimony focused on the harmful consequences she said follow once children become addicted to Facebook's platforms.
Notably, around 2019, Facebook started using a revamped algorithm called "downstream MSI," which she said made a post more likely to appear in a user's News Feed if the algorithm calculated people were likely to share or comment on it as it passed down the chain of reshares.
This method has led some people, including children, to content promoting eating disorders, misinformation and hate-targeted posts, according to Haugen and internal company documents she said she submitted to the committee after leaking them to numerous media outlets.
"Facebook knows its engagement ranking on Instagram can lead children from very innocuous topics like healthy recipes [...] to anorexic content over a very short period of time," Haugen alleged. "Facebook knows they are leading young users to anorexia content."
Haugen claimed children are a targeted demographic for Facebook, referencing the company's recent project "Instagram Kids." The company paused the project after it came under public scrutiny.
"I would be sincerely surprised if they do not continue working on Instagram kids," Haugen speculated, adding Facebook intends " to make sure that the next generation is just as engaged with Instagram as the current one, and the way they'll do that, making sure children establish habits before they have good self-regulation."
Whistleblower: 'Buck stops with Mark'
Haugen detailed numerous incidents in which she said executives at Facebook, including CEO Mark Zuckerberg, were made directly aware of their platforms' potentially negative influence on the mental health of children.
Zuckerberg and other executives were at one point presented with "Project Daisy," a proposal to remove the number of likes from public Instagram posts. Studies found the project was not effective, yet Zuckerberg and others went forward with it to appease regulators and journalists, according to Haugen.
"It would get us positive points from the public," Haugen recalled. "That kind of duplicity is why we need to have more transparency and why, if we want to have a system that is coherent with democracy, we must have public oversight from Congress."
Zuckerberg, she said, was also presented with options to remove the downstream MSI algorithm in Myanmar, a country where Facebook has allegedly been used to incite violence and spread hate speech.
"Mark was presented with these options and chose to not remove downstream MSI in April of 2020," Haugen told the subcommittee.
Asked why Facebook wouldn't get rid of downstream MSI when data showed the system amplified hate speech, misinformation and violence-inciting content, Haugen claimed that employee bonuses are still tied to the system.
Lena Pietsch, Facebook's director of policy communications, released a statement following Haugen's testimony that sought to discredit her knowledge of the company while also calling for new internet regulations.
"It's time to begin to create standard rules for the internet," Pietsch said in a statement. "It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."
Lawmakers on the panel from both parties, operating in a normally divided Washington, were united in calling for Zuckerberg and other Facebook officials to testify before Congress as Haugen had. Zuckerberg has remained silent on Haugen's allegations for days, and multiple senators noted the billionaire's recent social media posts of him sailing with his wife.
Lawmakers signal more hearings, oversight to come
Haugen, while outlining what she said were Facebook's flaws, offered several solutions. Facebook could be required, for example, to make users click on a link before sharing it, a change that platforms like Twitter have found significantly reduces misinformation, she said.
She also called for oversight of advertising aimed at children -- a proposal senators appeared open to exploring as they weigh possible regulation of Facebook.
"I strongly encourage banning targeted advertisements to children," said Haugen. "And we need to have oversight in terms of [how] the algorithms will likely still learn the interests of kids and match ads to those kids."
"Facebook today [makes] approximately $40 billion a year in profit," she said at another point. "A lot of the changes that I'm talking about are not going to make Facebook an unprofitable company -- it just won't be a ludicrously profitable company like it is today."
After Haugen raised concerns about Facebook's staffing of its counterterrorism and foreign-influence teams -- signaling she was speaking with another congressional committee on that matter -- lawmakers on the subcommittee opened the door to holding another hearing.
"I believe Facebook's consistent understaffing of the counterespionage, information operations and counterterrorism teams is a national security issue, and I'm speaking to other parts of Congress about that," Haugen said.
Sen. Dan Sullivan, R-Alaska, followed up: "So, you're saying in essence that the platform, whether Facebook knows it or not, is being utilized by some of our adversaries in a way that helps push and promote their interests at the expense of America's?"
"Yes," she replied. "Facebook is very aware that this is happening on the platform, and I believe the fact that Congress doesn't get a report of exactly how many people are working on these things internally is unacceptable because you have a right to keep the American people safe."
ABC News' Zunaira Zaki, Mary Kathryn Burke and Libby Cathey contributed to this report.