Supreme Court takes case on content policing: Here's how a Section 230 ruling could impact social media
The ruling could dramatically change the posts users see, experts said.
The Supreme Court agreed this week to hear a challenge to a fundamental legal protection enjoyed by social media platforms like Facebook, Twitter and TikTok. The ruling could dramatically change how those platforms operate, even affecting search engines like Google, legal experts told ABC News.
The case concerns Section 230 of the 1996 Communications Decency Act, which protects social media platforms and other sites from legal liability that could result from content posted by users.
The law has drawn criticism from elected officials across the political spectrum. In a rare point of agreement, President Joe Biden and former President Donald Trump have both called for the repeal of Section 230 — but for different reasons.
Democrats typically argue that Section 230 allows platforms to evade accountability for permitting harmful or misleading content, saying the rule lets platforms off the hook for policing too little speech.
Republicans, meanwhile, take issue with what they consider big tech censorship, saying the legal protection allows platforms to police too much speech without facing consequences.
Some big tech companies, like Facebook and Google, have supported reforms to Section 230 that would raise the standard platforms must meet to qualify for immunity. But the companies largely support preserving the law in some form to protect them from legal liability tied to user-generated content.
The case, Gonzalez v. Google LLC, concerns a lawsuit brought by the family of Nohemi Gonzalez, an American woman who was killed in an ISIS terrorist attack in Paris in 2015. The lawsuit against Google, the parent company of YouTube, alleges that YouTube recommended ISIS recruitment videos to users.
The case centers on whether Section 230 protects online platforms from legal liability when it comes to their recommended content.
If the high court rules in favor of Google, it would formally extend legal immunity to the recommendation algorithms at the heart of many social media products and search engines. If it rules in favor of the plaintiffs, the decision could expose the platforms to a raft of new legal vulnerabilities and force major changes, legal experts told ABC News.
"The Supreme Court could make Section 230 a little more speech friendly or it could functionally eliminate it as a defense for services, which would radically reshape the internet," Eric Goldman, a Santa Clara University law professor who studies Section 230, told ABC News.
"The Supreme Court really does have the future of the internet in its hands," he added.
Google has urged the lower courts to dismiss the case, saying its operations are protected under Section 230. In its response to the Supreme Court petition, Google noted that YouTube's user rules prohibit material that promotes terrorism and that the platform employs moderators to review content around the clock. There is no evidence that any of the Paris attackers received recommendations for ISIS videos from YouTube, Google said in the brief.
Here are two major ways that social media platforms and other sites could change as a result of this case, according to experts:
Altered recommendation algorithms
The online tool at the heart of the case is the recommendation algorithm. Importantly, such algorithms are used not only by social media platforms like Facebook and Twitter but also by video sites like YouTube and search engines like Google, Goldman said.
A high court decision that eliminates legal protection for recommended content could significantly alter the type of posts that appear before users on Facebook's News Feed or Twitter's timeline, said Eugene Volokh, a professor of law at the University of California, Los Angeles.
"Sites would be a lot more cautious about those types of recommendations," Volokh told ABC News. "Whenever they see something that might be potentially dangerous for them, they'll exclude it from recommendations."
Posts that could concern social media sites after the ruling include libelous comments and instructions for committing criminal acts, not just the terrorist propaganda at issue in the Supreme Court case, he said.
For example, consider a post featuring a news story critical of the Church of Scientology, Volokh said. If the Church of Scientology writes a letter to a social media site warning that the news story is libelous, the site may stop recommending posts with the story out of caution, he added.
"The platforms might decide to recommend cat videos instead," Volokh said.
While such decisions could provide an advantage for well-off or litigious actors, the moves could also benefit the public interest, he added. "What if the story about Scientology really is libelous? It's possible," he said.
Online platforms may instead respond to the court's decision by shifting their recommendation algorithms in a different direction, ceding greater control to users as a way to lessen their own liability, said Adam Candeub, a professor at Michigan State University College of Law.
"If users could say that they're making a conscious effort to seek out messages rather than Facebook forcing them onto you," he told ABC News, then "Facebook isn't a speaker."
More professionally generated content
A recommendation algorithm made more risk-averse for fear of legal liability could surface a larger proportion of professionally made content, some experts said.
"If a company is deciding what to include in its news feed or a recommendations feed, then including a traditional mainstream news article is a pretty safe bet," said Volokh, of UCLA.
Goldman, of Santa Clara University, agreed. Twitter, he said, could prevent all users without blue verification checks from posting on the platform or prevent their posts from appearing on the timeline.
"It's inevitable that services will move away from user-generated content and toward a model like Netflix," he said. "It'll be professionally produced, it won't have the diversity it has, it won't give speech platforms to as many people and to compensate professional producers, it's more likely to be paywalled."
Other experts questioned how far such a shift would go. User-generated content would still make its way into recommendation algorithms and go viral, Volokh said, but after a court decision limiting Section 230, that content would more likely be innocuous than controversial.
"People haven't stopped selling cars just because they face liability for legal defects on cars," Volokh said. "They may buy insurance for facing risks to liability or may adjust to it being the cost of doing business."
Candeub, of Michigan State University, said the court ruling wouldn't affect the experience on social media for a typical user.
"I don't think it would change much, actually," he said. "Platforms already have tremendous ability to control how content is promoted. They will have to make wiser decisions and be held accountable for those decisions."
One solution, Volokh said, would allow social media platforms to preserve products centered on recommendations while policing them tightly: more employees.
"They may need to hire a lot more people," he said.