'A long way to go:' Why TikTok still has a QAnon and COVID-19 vaccine conspiracy theory problem

Jessica Guynn, USA TODAY

A new wave of videos is spreading QAnon and COVID-19 vaccine conspiracies despite TikTok’s efforts to scrub them from the platform.

Earlier this month, Advance Democracy, a research organization that studies disinformation and extremism, found that 10 TikTok accounts were using the hashtag #GlobalRebellion4Freedom to push bogus claims.

TikTok took down the 10 accounts in early April after being alerted by USA TODAY.

New research out Wednesday from Advance Democracy shows that a previously inactive account then posted nine videos using the same hashtag to pump out the same content in an effort to evade the crackdown.

Views of the new videos between April 6 and April 13 surpassed the number of views before the April 2 takedown, according to the research shared exclusively with USA TODAY.

As of Tuesday, the new videos had received 1.3 million views, compared with 1.1 million views for the previous set of videos between March 5 and March 26, according to Daniel Jones, president of Advance Democracy.

TikTok removed the account for violating its disinformation policy after being contacted by USA TODAY.

"We will start redirecting #GlobalRebellion4Freedom to our Community Guidelines," the company said.

Chloe Colliver, head of digital policy at the Institute for Strategic Dialogue, says TikTok "still has a long way to go to improve the reality of safety on its platform."

"TikTok’s efforts to deal with COVID-19 disinformation, as well as extremist content or conspiracy theories on the platform, are characterized by a serious enforcement gap," Colliver said. "There is a vast difference between what the platform claims to be taking action on and the evidence of how it implements its policies in practice."

A sign supporting QAnon at a rally in Olympia, Washington in May 2020.

This pattern mirrors what has happened at Facebook, Twitter and Google's YouTube, Colliver said. The nation's leading social media companies have been roundly criticized for reacting too little and too late to combat dangerous and inflammatory content flooding their platforms, particularly during the pandemic and the presidential election.

Concern about the spread of falsehoods, hoaxes and conspiracy theories on TikTok has grown along with the platform. Best known for its short-form viral videos, TikTok is popular with kids and teens and has at least 100 million users in the U.S.

In response to pressure from policymakers or the media, TikTok has cracked down on potentially harmful content, sometimes more quickly than other platforms, banning hashtags or topics, Colliver said.

Starting last year, TikTok took a series of steps to curb QAnon. In December, TikTok said it would remove misinformation about COVID-19 and vaccines.

FBI Director Christopher Wray told a Senate committee hearing Wednesday that at least five self-identified QAnon followers were arrested in connection with the Jan. 6 attack on the Capitol. He said social media played a key role in helping violent domestic extremists spread their messages.

“Social media has become, in many ways, the key amplifier to domestic violent extremism, just as it has for malign foreign influence,” Wray said in a worldwide-threats hearing held by the Senate Intelligence Committee.

"Blanket approaches" to extremist content can be ineffective and risky, Colliver said.

"The bluntness of some of these efforts risks capturing irrelevant content in the takedown net – sometimes including content intended to fact-check or challenge disinformation on the platform," she said.

The approach also misses "swathes of relevant disinformation or conspiracy content that might use just slightly different language or terminology."

A teenager presenting a smartphone with the TikTok logo

The latest QAnon and COVID-19 vaccine conspiracy videos on TikTok advanced debunked theories that members of the U.S. government are trying to “overthrow our constitutional government,” and that Bill Gates and Dr. Anthony Fauci caused COVID-19 to profit from it.

Two of the nine videos accused social media platforms, including TikTok, Twitter and Facebook, of colluding with the media to censor speech and enforce a “fascist dictatorship.” Advance Democracy said the videos were likely posted in response to the previous TikTok takedown.

The anonymous account uncovered by Advance Democracy posted 32 other videos between April 6 and April 13 that were not tagged with #GlobalRebellion4Freedom but also promoted QAnon and COVID-19 vaccine conspiracies, including a video claiming that a small group of elites “rules over” 99% of the population, Advance Democracy found. That video was originally posted by one of the 10 accounts TikTok removed.

This article originally appeared on USA TODAY: Why TikTok still has a QAnon and COVID vaccine conspiracy problem