Facebook Bans QAnon, Following Earlier Action To Limit Its Social Spread

After Twitter took steps to limit the ability of QAnon-themed posts to spread on its social platform, Facebook implemented an outright ban on posts related to the conspiracy-centered group.

“Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” the company said in a blog post Tuesday.

The ban goes beyond measures put in place in August, which removed QAnon material that discussed potential violence and limited its reach across the company’s mammoth global networks. Twitter had earlier imposed restrictions on QAnon-themed tweets.

QAnon, a loose collection of conspiracy theories from the dark corners of the internet, posits that powerful, elite Democrats and other prominent figures are operating a child sex-trafficking ring. Purportedly fueled by regular information drops from a shadowy government insider nicknamed “Q,” the rabidly pro-Donald Trump group, as inchoate as it is, has fueled real-world violence and helped propel candidates into the U.S. Congress.

Trump himself has retweeted related posts on Twitter, one of which was taken down recently by the tech giant. Asked about support from the group during a White House press briefing in August, Trump said, “I don’t know much about the movement other than I understand they like me very much, which I appreciate.”

The related “Pizzagate” conspiracy theory, a precursor later absorbed into QAnon lore, resulted in a shooting at a Washington, D.C., pizzeria in 2016, though no one was hurt in that incident. Other episodes have resulted in injuries and deaths. Given that QAnon had been able to gain traction for years on Facebook, the ban was criticized by some observers Tuesday as long overdue.

Facebook doesn’t quite see it that way. “We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” the blog post added. “For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the West Coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public.”

Often, quick changes are made to QAnon messaging, Facebook said, “and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”
