YouTube bans videos promoting conspiracy theories like QAnon that target individuals

Jonathan Shieber
NEW YORK, NY - OCTOBER 03: A person wears a QAnon sweatshirt during a pro-Trump rally on October 3, 2020 in the borough of Staten Island in New York City. The event, which was organized weeks ago, encouraged people to vote Republican and to pray for the health of President Trump who fell ill with Covid-19. (Photo by Stephanie Keith/Getty Images)

YouTube today joined social media platforms like Facebook and Twitter in taking more direct action to prohibit the distribution of conspiracy theories like QAnon.

The company announced that it is expanding its hate and harassment policies to ban videos "that [target] an individual or group with conspiracy theories that have been used to justify real-world violence," according to a statement.

YouTube specifically pointed to videos that harass or threaten someone by claiming they are complicit in the false conspiracy theories promulgated by adherents to QAnon.

YouTube isn't going as far as either of the other major social media outlets in establishing an outright ban on videos or articles that promote the outlandish conspiracies, instead focusing on the material that targets individuals.

"As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up," the company said in a statement. "We will begin enforcing this updated policy today, and will ramp up in the weeks to come."

It's the latest step in social media platforms' efforts to combat the spread of disinformation and conspiracy theories that are increasingly linked to violence and terrorism in the real world.

In 2019, the FBI for the first time identified fringe conspiracy theories like QAnon as a domestic terrorist threat. Adherents of QAnon falsely claim that famous celebrities and Democratic politicians are part of a secret, Satanic, child-molesting cabal plotting to undermine Donald Trump.

In July, Twitter banned 7,000 accounts associated with the conspiracy theory, and last week Facebook announced a ban on the distribution of QAnon-related materials and propaganda across its platforms.

These actions by the social media platforms may be too little, too late, considering how widely the conspiracy theories have spread and the damage they have already done, as in the attack on a pizza parlor in Washington, DC that landed the gunman in prison.

The recent steps at YouTube followed earlier efforts to stem the distribution of conspiracy theories by making changes to its recommendation algorithm to avoid promoting conspiracy-related materials.

However, as TechCrunch noted previously, it was over the course of 2018 and 2019 that QAnon conspiracies really took root, and QAnon is now a shockingly mainstream political belief system with its own Congressional candidates.

YouTube has touted a 70% drop in views coming from its search and discovery systems. The company said that when it looked at QAnon content specifically, the number of views coming from non-subscribed recommendations dropped by more than 80% since January 2019.

YouTube noted that it may take additional steps going forward as it looks to combat conspiracy theories that lead to real-world violence.

"Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue to adapt our policies to stay current and remain committed to taking the steps needed to live up to this responsibility," the company said.