Potential mass layoffs at Twitter could cripple content moderation, some experts say
Prospective owner Elon Musk reportedly wants to cut 75% of its workforce.
Over the course of a monthslong bid to purchase Twitter, Elon Musk has signaled major changes at the platform. As Musk stands poised to acquire the company in the coming weeks, the transformation appears even more far-reaching than some anticipated.
In discussions about his plans for the company in recent months, Musk has talked about potential mass layoffs that would reduce the staff by 75%, The Washington Post reported.
While details of the potential layoffs remain limited, the move could compromise the platform's capacity to police false or harmful content, with ramifications that extend to social issues like election integrity, experts told ABC News.
The experience of a typical user could change significantly, they added, noting the possible rise of harassment and other forms of corrosive discourse.
Even if Musk's Twitter deal does not come to fruition, the company's current management has planned to cut its payroll by $800 million by the end of next year, the Post reported.
A spokesperson for Twitter declined to comment on the report of layoffs but confirmed to ABC News the existence of a company memo to employees obtained by Bloomberg News.
"We do not have any confirmation of the buyer's plans following close and recommend not following rumors or leaked documents but rather wait for facts from us and the buyer directly," the memo by Twitter General Counsel Sean Edgett said.
Representatives for Elon Musk did not immediately reply to ABC News' requests for comment.
Speaking at a Twitter all-hands meeting in June, Musk did not rule out downsizing if he were to acquire the company.
"Anyone who is a significant contributor should have nothing to worry about," Musk reportedly told employees, according to tweets from Bloomberg reporter Kurt Wagner.
Cuts to the content moderation workforce would align with statements Musk has made in recent months about his commitment to free speech, in which he has suggested that Twitter should permit all speech that stops short of violating the law, the experts said.
"Content moderation will be a lot harder without people doing content moderation," Zeve Sanderson, the executive director at New York University's Center for Social Media and Politics, told ABC News.
"If there is more harassment and other forms of toxic speech, if there is more misinformation and disinformation, then people's experience on the platform is going to be really different," he added.
However, the layoffs may never happen at all, or, if they do, may leave the content moderation workforce untouched, the experts said. Even a downsizing of content moderation employees could be offset in part by an expansion of automated content policing, they added.
Twitter currently imposes limits on a range of speech, including hate speech, targeted harassment and media that features graphic violence.
Typically, tech platforms police content through both automated systems and manual decisions made by individuals, Eric Goldman, a professor at Santa Clara University School of Law who studies content moderation, told ABC News. The automated content policing tackles obvious policy violations and flags difficult cases that employees weigh in on, he said.
Posts under consideration often require human judgment when context could dramatically alter their meaning, such as nuanced remarks or satire, Goldman added. For instance, an automated system might misinterpret a post that includes the word "murder" as a death threat when it is merely discussing issues related to the subject, he said.
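As a rough illustration of the hybrid approach Goldman describes, an automated pass might remove clear-cut violations, escalate ambiguous keyword matches to human reviewers, and leave everything else alone. The sketch below is hypothetical; the phrases, thresholds, and labels are illustrative assumptions, not Twitter's actual rules or systems.

```python
# Hypothetical sketch of hybrid content moderation: an automated pass
# handles clear-cut violations and escalates ambiguous posts to humans.
# Keywords and labels here are illustrative, not Twitter's real policies.

AUTO_REMOVE_PHRASES = {"explicit threat phrase"}   # obvious policy violations
AMBIGUOUS_KEYWORDS = {"murder", "kill"}            # context-dependent terms

def triage_post(text: str) -> str:
    """Return 'remove', 'human_review', or 'allow' for a single post."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in AUTO_REMOVE_PHRASES):
        return "remove"            # clear violation: automated removal
    if any(word in lowered for word in AMBIGUOUS_KEYWORDS):
        return "human_review"      # e.g., a post discussing murder rates
    return "allow"

if __name__ == "__main__":
    # A post about crime statistics trips the keyword filter but is not a threat,
    # so it gets routed to a human reviewer rather than removed automatically.
    print(triage_post("The city's murder rate fell last year."))  # human_review
```

With fewer people to handle that "human_review" queue, such a system is left either removing too much or letting context-dependent abuse through, which is the trade-off the experts describe.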
The potential reduction or elimination of the content moderation workforce would leave the platform vulnerable to forms of harassment or misinformation that evade automated policing, David Kaye, a professor at the University of California, Irvine School of Law, and chair of the board of directors of the Global Network Initiative, told ABC News.
"If Musk wants to advance a platform that is open to free speech, to public debate, he actually needs humans to moderate it," Kaye said. "Policy decisions need to decide the context that makes sense, so the platform doesn't become awash in racism, anti-Semitism, Islamophobia, harassment and all that."
The potential dearth of human content moderators also poses a threat to the platform's role in securing election integrity, since individuals help prevent systematic abuses of Twitter that attempt to spread disinformation, said Goldman, of Santa Clara University.
"You need people on the beat looking for the latest attempts at gaming the system," he said. "If you don't have those humans paying attention, then the bad guys can run amok."
To be sure, a shift toward a greater share of automated content moderation would align with a general trend in the industry as technology improves, said Sanderson, of NYU.
"Automation is getting better and more scalable," he said.
Further, a shift toward additional automated content policing could end up removing more content than Twitter currently does, if Musk decides to err on the side of content removal as a cost-cutting measure, rather than employ people to make tough calls, said Kaye, of the University of California.
But such an approach would contradict Musk's avowed commitment to an expansive notion of the type and volume of speech that should be permitted, Kaye added.
Ultimately, Musk could forgo the job cuts altogether, the experts said. In recent months, he has spoken publicly about many possible plans for the platform, some of which almost certainly will not come to pass.
Once news of the potential mass layoffs went public, however, the company likely suffered harm that cannot be undone, in the form of diminished morale and probable employee departures, Goldman said.
"Musk's threat to lay off 75% of Twitter's workers has already driven the stake into Twitter's heart," he said. "The damage is already done."
ABC News’ Patricio Chile contributed to this report.