Family of American terror victim asks Supreme Court to curb immunity for social media
The case is a landmark test for Section 230 of the Communications Decency Act.
NORWALK, California -- Seven years after Islamic State extremists murdered their daughter, the family of Nohemi Gonzalez, the only American killed in the 2015 Paris terror attacks, heads to the U.S. Supreme Court on Tuesday seeking to pin some responsibility for the tragedy on social media giant YouTube.
"If some changes can be done to prevent these terrorist people [from] keeping killing human beings, then that is a big thing," Beatrice Gonzalez, Nohemi Gonzalez's mother, told ABC News in the family's first interview about the case.
Beatrice Gonzalez alleges that Google's YouTube algorithms -- proprietary software that recommends video content to users -- effectively amplified Islamic State-produced materials in support of the extremists who killed her daughter, a 23-year-old college student who had been studying in France.
The family wants to bring a case against the company under the Anti-Terrorism Act but has been blocked from doing so because of a landmark federal law that has given sweeping legal immunity to social media companies for more than 25 years.
Section 230 of the Communications Decency Act of 1996 states that internet companies, including social media platforms, cannot be sued over third-party content uploaded by users -- such as photos, videos and commentary -- or for decisions site operators make to moderate, or filter, what appears online.
Oral arguments at the Supreme Court set for Tuesday in Gonzalez v. Google, the parent company of YouTube, will focus on the scope of that immunity, whether it covers algorithms, and whether Gonzalez should be able to pursue her claims in court.
"Hopefully this will change the laws and it'll be for the good by being more careful about the social media, so [other parents] never have the pain that we're feeling," said Nohemi Gonzalez's stepfather, Jose Hernandez.
The company has expressed sympathy to the Gonzalez family but strongly denies any connection to the attack.
YouTube says it bans terrorist content across its platform and that its algorithms help catch and remove violent extremist videos, noting that 95% of those removed last year were detected automatically -- most before they had received 10 views.
"Undercutting Section 230 would make it harder for websites to do this work," YouTube spokesperson Ivy Choi told ABC News. "Websites would either over-filter any conceivably controversial materials and creators, or shut their eyes to objectionable content like scams, fraud, harassment and obscenity to avoid liability -- making services far less useful, less open and less safe."
Lower courts have said Section 230 protects algorithms from liability claims, siding with Google.
For years, members of Congress from both parties have debated changes to Section 230 in order to promote greater transparency and accountability of internet companies.
President Joe Biden in a recent Wall Street Journal op-ed called for "fundamental reform" to the law, but there is not a political consensus on the way forward.
The Gonzalez case is the first time the nation's highest court will consider limits to immunity for internet companies.
"There are enormous amounts of money at stake if the platforms were to be held liable for every time a terrorist attack could in any way be tangentially traced to material that the platforms carried," said Michael Karanicolas, executive director of the Institute for Technology Law & Policy at UCLA.
Section 230 was passed by bipartisan majorities in Congress and has long been considered a cornerstone of the modern internet, protecting online platforms as spaces for creativity, innovation and open public debate.
The crucial 26 words in the statute say: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Internet companies "get to decide what to carry. They get to decide what not to carry," said Karanicolas. "And they get to decide how to design their algorithms -- to amplify certain types of content or to de-emphasize other types of content."
Subjecting those decisions to legal scrutiny could have major implications for how the internet functions, experts say.
"Large companies can maybe throw a battalion of lawyers at a problem and litigate their way forward, but new startups will simply not be able to get over that [financial burden]," said Matthew Schruers, president of the Computer and Communications Industry Association.
"No digital service wants their products to be used by bad actors. But to try to use liability here is actually going to produce a contrary result," Schruers added.
Advocates for overhauling Section 230 say the legal protection far exceeds what Congress intended when it passed the law early in the development of the modern internet, and that it insulates companies from accountability.
"When this statute was enacted in 1996, it was for the express purpose of protecting kids from seeing obscene material online and protecting companies who take obscene material offline to protect kids. And it's been turned on its head," said Matthew Bergman, an attorney and founder of the Social Media Victims Law Center, who represents hundreds of plaintiffs alleging harm from social media use.
Frances Haugen, the former Facebook insider who has warned Congress about the harms of internet companies' algorithms, said setting new limits on legal immunity could incentivize companies to improve their products.
"We have the tools, but all these things decrease usage. They make the companies a little less money," Haugen said. "So in a world where our business models are fueled by clicking on ads, there aren't independent market incentives for making products that help people be healthy and happy."
Haugen believes Section 230 immunity does not have to be all or nothing but says regulators need to update the law to reflect current internet use and the proliferation of documented psychological harms.
"The Supreme Court isn't really the right actor for dealing with this issue. You know, they can come in and do a very blunt judgment. They can't, for example, set up a new regulatory framework that might be a more effective way to govern the internet," Haugen said.
The tech industry agrees that lawmakers, not the high court, should be the final arbiters of internet policy and that changes to immunity protection are not in Americans' best interest.
But Bergman, the attorney for social media users claiming harm, and the Gonzalez family argue that the justices need to act under a plain reading of the law and permit the Gonzalez family to move forward with their suit against YouTube's parent company.
"It will certainly provide a more sensible opportunity for families to hold companies accountable," Bergman said. "All it will do is allow them to seek discovery and prove their case. Everyone is entitled to a defense, as are the social media companies, but it will simply kind of open the courthouse door."
Beatrice Gonzalez said she did not bring the case seeking financial compensation from Google and is instead seeking to enact a small change to the system in her daughter's memory.
"We want justice, but we're not angry," she said. "If we can do a little change in our community by knowing that it can be a bigger change in the world is what brings me peace in my heart."