Facebook employees questioned apparent restrictions on Palestinian activist's account: Documents

A congressional staffer provided the redacted documents to news outlets.

October 29, 2021, 5:07 AM

Earlier this year, multiple Facebook employees questioned the apparent restrictions on well-known Palestinian activist Mohammed El-Kurd's Instagram account, according to internal Facebook documents shared with ABC News and a group of other news organizations.

The document, titled "Concerns with added restrictions/demotions on content pertaining to Palestine," shows concern among some employees over content moderation decisions during the May escalation of violence in Gaza and the West Bank.

The documents were disclosed to the U.S. Securities and Exchange Commission by Facebook whistleblower Frances Haugen, a former employee, and provided to Congress in redacted form by Haugen's legal counsel. They were provided to ABC News by a congressional staffer.

Last month, Facebook's independent Oversight Board called for an investigation into whether the company's moderation disproportionately targeted Palestinians.

The document also points to frustration among employees who, at the time, were unable to pin down exactly why the activist's online reach was being limited.

PHOTO: Smoke rises from a 14-story Palestinian building called "Ash-Shuruq" as Israeli fighter jets continue to pound the Omar Al-Mukhtar neighborhood in the Gaza Strip, May 12, 2021. (Anadolu Agency via Getty Images, FILE)

In the post, the Facebook employee, whose name was redacted, warned that the Instagram Stories of El-Kurd, a prominent activist in the Sheikh Jarrah area of East Jerusalem, were apparently being "demoted" in error. Demotion refers to the practice of limiting the reach of a post judged to violate Facebook's rules.

And El-Kurd's account wasn't the only one facing apparent restrictions, according to the document's author.

"Can we investigate the reasons why posts and stories pertaining to Palestine lately have had limited reach and engagement, especially when more people than ever from around the world are watching the situation unfold?" the author wrote.

While the employee's post is not dated, it includes an unredacted link to a May 12 tweet by El-Kurd, which includes a photo of an Instagram error message.

"I keep getting messages like this one. My Instagram story views went down from 150k to like 50k. So much of what I post is disappearing. Why are you silencing Palestinians?" his tweet read.

At the time of El-Kurd's May 12 tweet, violence had already broken out over the forced evictions of Palestinians in East Jerusalem. In the resulting crisis, according to the United Nations' Office of the High Commissioner for Human Rights, around 245 Palestinians, including 63 children, were "seemingly killed by Israeli Defense Forces." Rocket attacks by Palestinian armed groups resulted in 13 deaths in Israel, including two children, according to Human Rights Watch.

A Facebook spokesperson noted that in May, Instagram experienced a technical glitch affecting the Stories of millions of users, including many Palestinians. The issue was quickly fixed, the company said. Facebook also acknowledged reports that users felt Stories about the conflict were reaching fewer people than expected, which the company attributed to a change in how Stories are prioritized that favored original posts over re-shares. That change was later reversed, the company said Thursday.

PHOTO: The letter 'f' stands in a corner of the Facebook offices in Cambridge, Mass., Dec. 10, 2014. (Boston Globe via Getty Images, FILE)

According to the internal document about El-Kurd, the activist had previously been the subject of "false positives," the mistaken removal or limiting of a piece of content.

This ran counter to a new effort within Facebook, according to the document.

"There have been false positive[s] reported against his account in the past and now that we (FB) have taken a stance to minimize our over-enforcing on content from Palestine -- due to the necessity of allowing folks on the ground to share what's going on -- there should be no reason his content is getting removed or restricted," the document read.

A follow-up comment added to the undated post points to confusion and delays in resolving the problem.

"I'd really like to understand what exactly is breaking down here and why. What is being done to fix it given that this is an issue that was brought up a week ago?" the unidentified commenter wrote.

Another commenter chimed in, reporting that they had investigated the issue and had not found any restrictions put in place by the "Inauthentic Behavior" team. Inauthentic Behavior is Facebook's internal term for a range of violations, including the use of false identities and the artificial boosting of a post's popularity.

As employees continued to look for a cause of the possible crackdown on El-Kurd's account, other comments expressed frustration.

"Also getting reports about this from friends and the conversations are harder and harder as days pass without a root cause being found and tackled internally," another comment read.

It is not clear from the document whether a cause was ever found.

“We’re sorry to anyone who felt they couldn’t bring attention to important events," Facebook spokesperson Drew Pusateri said in a statement to ABC News Thursday.

El-Kurd has not responded to ABC News' request for comment.

In the wake of that crisis, nearly 200 Facebook employees signed an open letter calling on Facebook to address claims of censorship against pro-Palestinian voices on the platform, according to a report by the Financial Times.

Facebook's Oversight Board called for an independent review into "allegations that Facebook has disproportionately removed or demoted content from Palestinian users and content in Arabic," in a Sept. 14 statement. The board also called for a probe into whether Facebook was "not doing enough to remove content that incites violence against Israeli civilians."

The Oversight Board said Facebook had wrongfully taken down a post that mentioned a Palestinian militant group but did not contain any incitement to violence.

An Oct. 25 report by The Associated Press, based on other internal Facebook documents, pointed to broader issues within Facebook around the moderation of Arabic-language posts and accounts, including the deletion of profiles belonging to Palestinian journalists and activists and the automated red-flagging of mere mentions of militant groups.

Former Facebook employees told the AP that multiple governments threaten the company with regulation and fines in an effort to pressure it. A former regional policy head for Facebook, who left the company in 2017, also told the AP that Israeli officials "flood" the company with take-down orders for Palestinian accounts and posts.

In response to the AP, Facebook rejected the idea that it removes content "solely under pressure from the Israeli government" and said of its Arabic-language moderation, "We still have more work to do. ... We conduct research to better understand this complexity and identify how we can improve."
