Despite Announced Crackdown, YouTube Can’t Get Anti-Vaxxers Off Its Platform

In this photo illustration the YouTube logo seen displayed - Credit: Rafael Henrique/SOPA Images/LightRocket/Getty Images

On Wednesday, September 29th, after facing years of criticism that it was not doing enough to remove health-related misinformation, YouTube updated its policies to formally ban all vaccine-related misinformation from its platform. YouTube also announced it would be removing members of the “Disinformation Dozen” — the influencers deemed by the Center for Countering Digital Hate to be most responsible for propagating such misinformation — including Robert F. Kennedy, Jr., Sherri Tenpenny, and Joseph Mercola.

Despite having implemented this ban, however, YouTube appears to be struggling to enforce it. According to a Media Matters investigation, the platform has allowed at least 50 videos featuring these influencers to remain up, which have garnered nearly 4.9 million views in total; the investigation also found that some of these videos ran with ads, indicating that YouTube was profiting off such content.


According to the investigation, many of the videos still up on the platform were posted by other accounts, which nonetheless allow the banned influencers to continue broadcasting their anti-vaccine views. For instance, one video by YouTuber Mikhaila Peterson, daughter of intellectual dark web pundit Jordan Peterson, features an interview with Mercola, an osteopathic physician who has been referred to by the New York Times as a “pioneer of the anti-vaccine movement.” That video with Peterson is still running with ads and has more than 776,000 views. Media Matters also found several videos featuring prominent anti-vax influencer Robert F. Kennedy, Jr. still on the platform.

YouTube has long gotten criticism for failing to curb misinformation, particularly via its algorithm, which critics and researchers say promotes and suggests increasingly extreme and sensationalized content based on viewers’ interests. (YouTube has stated it has implemented sweeping changes to its algorithm to combat this, and denies that the platform serves as a radicalization vector.) It has also been under pressure to remove influencers like RFK Jr. (who was removed from Instagram in February) from its platform.

Last month, after facing years of criticism that it was not doing enough to fight the spread of conspiracy theories and misinformation on its platform, YouTube announced it would be removing content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines.” But if the Media Matters report is any indication, that may not be enough.
