Facebook says it will penalize users who repeatedly share misinformation. The company introduced new warnings that will notify users that repeatedly sharing false claims could result in “their posts moved lower down in News Feed so other people are less likely to see them.”
Until now, the company’s policy has been to down-rank individual posts that are debunked by fact checkers. But posts can go viral long before they are reviewed by fact checkers, and there was little incentive for users to not share these posts in the first place. With the change, Facebook says it will warn users about the consequences of repeatedly sharing misinformation.
Pages that are considered repeat offenders will display pop-up warnings when new users try to follow them, and individuals who consistently share misinformation will receive notifications that their posts may be less visible in News Feed as a result. The notifications will also link to the fact check for the post in question and give users the opportunity to delete the post.
The update comes after a year when Facebook has struggled to control viral misinformation about the coronavirus pandemic, the presidential election and COVID-19 vaccines. "Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps," the company wrote in a blog post.
Facebook didn’t indicate how many posts it would take to trigger the reduction in News Feed, but the company has used a similar “strike” system for pages that share misinformation. (That policy has been a source of controversy, with reports that Facebook officials removed “strikes” from popular pages last year.)
Researchers who study misinformation have pointed out that it’s often the same individuals behind the most viral false claims. For example, a recent report from the Center for Countering Digital Hate found that the majority of anti-vaccine misinformation was linked to just 12 individuals.