As the global pandemic spread, Facebook staffers warned Mark Zuckerberg that misinformation and anti-vaccination posts were running rampant on the platform. Yet the company was failing to remove and moderate misleading and false posts related to COVID-19, according to an investigative series from The Wall Street Journal.
Zuckerberg had said Facebook would push 50 million people toward COVID vaccines, yet the company's own research revealed that anti-vaccine sentiment was spreading in comments, ranging from personal objections to conspiracy theories. For months, Facebook struggled to catch up with these posts, which were flooding its platform and undermining vaccination efforts globally.
Another internal memo showed that 41% of comments on English-language vaccine-related posts discouraged vaccination. Users were exposed to comments on vaccine-related posts 775 million times per day, according to the documents. In response, around June 2020, Facebook changed how it ranked health content in an effort to reduce views of what it deemed "health misinfo."
In May 2020, a video called "Plandemic" propagated false information, including the claim that masks would worsen the coronavirus. It spread widely on Facebook and gained significant promotion before the company took it down.
In August, advocacy group Avaaz reported that the top 10 accounts producing health misinformation drew four times as many estimated views on Facebook as the top 10 authoritative sources. The group urged Facebook to take stronger measures against the networks spreading COVID misinformation.
This year, as vaccine rollouts began, anti-vaccine individuals and groups took to Facebook. Of some 150,000 posts in Facebook groups later disabled for COVID misinformation, half were produced by just 5% of posters. Around 1,400 users were responsible for inviting half of the groups' incoming members, according to the report.
In August, Facebook said it had removed 20 million items that violated its COVID policy. Since The Journal's articles were published, lawmakers have proposed an investigation into Facebook's internal research on Instagram's impact on teen users. Some Facebook officials are also concerned that the company's leaders could face further questioning from the government regarding their past statements and testimony.