Facebook’s tougher stance on vaccine misinformation is probably temporary

The company's policies could change again after the pandemic.


When Facebook announced this week that it was widening its efforts to curb misinformation about vaccines, it seemed like a major shift for the company.

Under the changes, Facebook won’t just remove erroneous posts about coronavirus vaccines, but also more “general” misinformation about all vaccines. That includes claims that vaccines cause autism, are full of toxins, or are part of a government conspiracy. In other words: exactly the kind of misinformation that has been most pervasive and has long vexed public health advocates.

Previously, the company had committed only to removing a subset of false information about COVID-19 vaccines. Those rules didn’t cover false claims about other vaccines, which have spread widely on Facebook and Instagram for years. Rather than removing such content, the company preferred to rely on fact-checking and algorithmic tweaks to make it less prominent.

Prior to the coronavirus pandemic, Facebook had only removed vaccine misinformation in two instances: in response to a deadly measles outbreak in Samoa and a polio resurgence in Pakistan. As recently as September, Mark Zuckerberg suggested Facebook wouldn’t target anti-vaccine misinformation more broadly. “If someone is pointing out a case where a vaccine caused harm or that they're worried about it — you know, that's a difficult thing to say from my perspective that you shouldn't be allowed to express at all," he told Axios.

So Monday’s policy update was a notable change for the company. “Today, we are expanding our efforts to remove false claims on Facebook and Instagram about COVID-19, COVID-19 vaccines and vaccines in general during the pandemic,” Facebook wrote in an update. Much of the initial coverage focused on the fact that Facebook would now act on claims about “vaccines in general.” But those last three words, “during the pandemic,” are just as important: they leave the door open for Facebook to walk back some of these changes once the pandemic is over.

On Thursday, Facebook’s VP of content policy, Monika Bickert, confirmed as much, saying that Facebook was “looking at it through the lens of the pandemic” and that “after the pandemic is over, we will continue to talk to health authorities and make sure that we are striking the right approach going forward consistent with our misinformation policies and principles.”

Facebook’s help center article on the topic is even more explicit (emphasis added). “Similarly, for the duration of the COVID public health emergency, we remove content that repeats other false health information, primarily about vaccines, that are widely debunked by leading health organizations such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC). The goal of this policy is to combat misinformation about vaccinations and diseases, which if believed could result in reduced vaccinations and harm public health and safety.”

Again, the thinking seems to be that once the pandemic ends, the issue of vaccine misinformation may become less urgent. But that would be a mistake, says Imran Ahmed, CEO of the Center for Countering Digital Hate (CCDH), a nonprofit organization that has urged social media companies to crack down on vaccine misinformation.

“Such a solution would be short-sighted at best, and utterly cynical and self-serving in reality,” Ahmed told Engadget. “We have a crisis, and that crisis was caused by the contagion of misinformation. We've seen the perfect storm of what happens in an environment in which social media, in its modern form, runs riot in a community that can't really talk to each other face to face.”

To be clear, Facebook’s current efforts could continue to have an impact long after the pandemic. The company has already banned Robert F. Kennedy Jr. from Instagram, for example. Kennedy has long been one of the most influential purveyors of anti-vaccine propaganda, with more than a million followers on social media, according to a recent report from the CCDH.

But Kennedy’s Facebook page remains active, as do the Instagram accounts of other prominent anti-vaccine activists. (Facebook officials have said it will take time to fully enforce the new policies.)

Ahmed notes that Facebook has made similar promises in the past, only to fall short. “The problem has never been which policies they have, the problem has been their will to act on those policies in a fair and consistent way. 2020 proved without dispute, when it comes to election misinformation, hate, COVID, or vaccines, that these things take lives.”
