Since people first started complaining about "fake news" on Facebook, the phrase has evolved—from a useful label for false information masquerading as traditional news, to a term that means basically nothing, wielded by President Donald Trump against stories he doesn't like, and by drunk people in bars shouting about news and/or sports results they dispute.
But the original problem still genuinely exists. And Facebook has finally rolled out its long-awaited first step toward cutting away at it.
Spotted on Twitter on Friday night, the tool flags links to sites known to produce misinformation, citing third-party fact-checking organizations like Snopes and Politifact—the kind of sites that Trump supporters also like to dispute.
Facebook is flagging links to fake sites now, looks like: pic.twitter.com/N7xaWDkdYA
— Anna Merlan (@annamerlan) March 3, 2017
Facebook started testing related features in December, when it promised updates similar to what debuted this week. That solution put the onus on Facebook users, not Facebook itself, to identify false stories, with third-party fact-checkers required to agree to a fact-checking code of ethics.
Now the tool appears to have been made available to more users. Facebook has added a section on "disputed" news to its help tools, where users can see why stories were marked as disputed.
Facebook also added information about how to flag a story as fake:
As Facebook notes, the tool isn't available to every user yet. But once it is, get ready to see some "disputed news" in your News Feed—and then be ready for the disputes over that.