Facebook to remove all accounts 'representing' QAnon

The move is a major step up from only removing QAnon posts promoting violence.

October 6, 2020, 6:11 PM

Facebook announced Tuesday it was removing any accounts "representing" QAnon across all of its platforms in a major, more-proactive step to stop the spread of the baseless conspiracy theory online.

"Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content," the company said in a blogpost Tuesday.

This is an update from the company's previous policy, under which content associated with QAnon was removed only when it discussed potential violence.

The company added Tuesday that it will also "continue to disable the profiles of admins who manage Pages and Groups removed for violating this policy, as we began doing in August."

QAnon, a baseless conspiracy theory that paints President Donald Trump as the hero attempting to take down a global Satanic pedophilic and cannibalistic syndicate, began as a fringe online movement but has grown in mainstream popularity in recent months -- with many placing the blame for this growth on social media.

PHOTO: A person wears a QAnon sweatshirt during a pro-Trump rally on Oct. 3, 2020, in the borough of Staten Island in New York City.
Stephanie Keith/Getty Images

The QAnon movement has been blamed for numerous acts of violence and has been deemed a potential domestic terror threat by the FBI.

Facebook said the decision to remove not just posts that support violence, but all QAnon pages came after it found the latter content was tied to "different forms" of harm.

"For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public," the company said.

The company added that it will start enforcing the new policy and removing content Tuesday, but expects the work to take weeks.

Facebook saw 651% growth in QAnon pages and groups from March to August, according to researcher Marc-André Argentino, a Ph.D. candidate at Concordia University in Montreal, Canada, who studies the nexus between technology and extremist groups.

QAnon groups have regularly changed their strategies for spreading messages, including avoiding use of the letter "Q," which could make Facebook's enforcement job harder.

"QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another," Facebook said in it statement. "We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement."

ABC News' Chris Francescani and Evan McMurry contributed to this report.