Meta’s Oversight Board says viral video left on Facebook threatened LGBTQ+ people in Nigeria

Meta’s oversight board has expressed serious concern over the company’s failure to take down a viral graphic video showing two men bleeding after they were apparently beaten for allegedly being gay.

By CHINEDU ASADU, Associated Press
October 15, 2024, 12:18 PM

ABUJA, Nigeria -- Meta's oversight board expressed serious concern Tuesday over the company's failure to take down a viral graphic video showing two men bleeding after they were apparently beaten for allegedly being gay.

The video was posted in Nigeria, one of more than 30 of Africa’s 54 countries where homosexuality is criminalized by laws that garner broad public support despite constitutional guarantees of freedoms. Such laws are often used to target and illegally arrest people suspected of being gay, with abuses against them often ignored.

The report said the damage done by the video, which was viewed more than 3.6 million times between December 2023 and February this year, was “immediate and impossible to undo.”

The board said the content “shared and mocked violence and discrimination” and that, despite being reported multiple times and reviewed by three human moderators, it remained on Facebook for about five months while violating four different rules.

“With the video left up, the odds of someone identifying the men and of the post encouraging users to harm other LGBTQIA+ people in Nigeria increased,” the panel said. “Even after it was removed, the Board’s research shows there were still sequences of the same video remaining on Facebook.”

In the video, two men were seen bleeding as a crowd of people interrogated them about their identity and sexual orientation.

Meta could not immediately be reached for comment.

The company acknowledged two errors regarding the video, the panel said: its automated systems identified the language spoken in the video as English, when it was in fact Igbo, a language spoken in southeastern Nigeria “but not supported by Meta for content moderation at-scale,” and Meta’s human review teams misidentified the language as Swahili.

“This raises concerns about how content in unsupported languages is treated, the choice of languages the company supports for at-scale review and the accuracy of translations provided to reviewers working across multiple languages,” the panel said.

In its report, the board recommended that Meta update its Coordinating Harm and Promoting Crime Community Standard to include clear examples of “outing-risk groups,” assess how accurately it enforces the prohibition on exposing the identity or locations of people alleged to belong to such groups, and ensure its language detection systems identify content in unsupported languages and provide accurate translations to reviewers when routing content for review.