Facebook failed America this year — now it should kill the News Feed


Every employee at Facebook should be ashamed of what their product became this year: a tangled mess of bizarre falsehoods and outdated information used as ammo to help people scream at one another. 

It has failed our bitterly divided country through its News Feed, which has weighed legitimate, reported information from good news organizations against propagandistic junk written by trolls and found there is no difference. Posted to your News Feed, a Pulitzer-winning New York Times report looks just the same as a blog post from The Right Stuff, an anti-Semitic content factory of the so-called "alt-right."


And it has failed, in a very basic way, to keep anyone legitimately informed through its "Trending" topics feature, which it stripped of paid journalists earlier this year. Trending is designed to show you the latest news according to how many people are posting about a topic on Facebook. On Wednesday afternoon, many hours after Donald Trump was elected president, the latest political news the social network had for me was a useless tidbit about our new commander-in-chief monitoring his wife's vote.

Putting all of this very simply: Facebook sucks as a news enterprise. It was especially bad this year when misinformation became the hallmark of a successful presidential campaign that split the nation in half. To save face, the social network should dismantle its News Feed and start over with something that either treats media differently or doesn't allow it at all.

Lies, lies, lies

There's an argument to be made that Facebook, with the lies it allows to spread on its News Feed, is responsible for the election of Donald Trump. Max Read made that case Wednesday on Select/All, a New York Magazine blog.

To reiterate: Millions of voters, nine months of lies on Facebook. You may not feel President-elect Trump is a problem, but let's not pretend his ascendance hasn't been destructive. The majority of Americans (Clinton did win the popular vote, after all) now feel anxious about the path forward on issues critically important to how we will live and breathe every day: LGBTQ rights, climate change and immigration among them.

And we are furious with one another. I told a complete stranger to "shuuuuuttt uuuuppp" (yep) on a status posted by an acquaintance Wednesday (yep!) simply because he commented with information I knew to be false — probably because he read it on Facebook. (My acquaintance wisely deleted both comments.)

For crying out loud, a completely fabricated story about Hillary Clinton and Huma Abedin running a child sex ring was allowed to spread on the platform last month like any old news. Just one version of that post, which was aggregated and republished by numerous outlets, was shared to a potential audience of more than 2 million people, according to CrowdTangle, a data firm that measures how information spreads online.

As an anecdotal bit of evidence to back up the numbers, a profile of a Trump supporter published in The Washington Post last month showed how some people feast on complete Facebook bunk designed to stoke fears and compel further shares, serving an outlet's bottom line as millions of readers flood its website and rack up potential ad impressions.

"One of the things that has made Facebook so problematic is that everything is being juxtaposed," Finn Brunton, an assistant professor at New York University who specializes in the relationships between society, culture and technology, explained to me when I rang him up Wednesday.

"All of the major news orgs are relying on a huge amount of viral Facebook traffic, in the same way that those [bogus sites] are relying on a massive amount of viral Facebook traffic," he added. "So you end up with a News Feed where a huge amount of things are juxtaposed."


Because Facebook's News Feed holds so much value — it is the single greatest source of traffic via social media for most, if not all, major websites — outlets learn how to game its system. People make careers out of this, and it's all possible because everything is watered down to a stupid little white box when you post on the social network.

We're about to get so nerdy here, but bear with us

Format matters a lot. Facebook is a reasonably pleasant app to use, because it has a deliberate, clean design, but a lot of weird problems come up when you force disparate media — lots of square, triangular and round pegs — to fit through the same hole.

"By virtue of writing a good article about something, an article that's researched and thought-through and posted on Facebook, you're giving credence to all of these other junk articles," Brunton explained. 

In other words, a bit of dissonance happens when you put a great piece of journalism next to something that, well, isn't. Say you bought a biology textbook that had a full-page ad for Scientology in the evolution chapter — you'd be kind of like, "what?"

"In the old days on the web, you would stumble across some horror show of an anti-Semitic website," Brunton continued. "Along with your existing critical sense and your ability to evaluate information, it was a site that was clearly run by hateful people. You could see it in the design. You could see it in the things they linked to. When something is circulated on Facebook, it's just circulated on Facebook. It has that same staid, infrastructural presentation."

On Facebook, articles are less journalistic works than little bits of content that fill the News Feed, hopefully hooking you enough to keep you coming back to the app time and time again, and maybe sharing some content of your own.

"Anecdotal activity, researched news, and complete bullshit. It's all at the same level," Brunton said.

Kill the News Feed

Journalists like me have thrashed against the monolithic social media force for months, insisting it should take some editorial responsibility for how it distributes content to its 1.18 billion daily users. I'm tired of expecting responsibility from a Silicon Valley company worth several hundred billion dollars that has a wannabe vampire on its board, and I'm quite honestly exhausted by one of the most hateful and convoluted presidential campaigns history has ever witnessed. The time for ethics from Facebook has passed.

Instead, the company should focus on the fact that its News Feed product is really bad. And like many bad Facebook products, the company should just kill it.

That's easier said than done, of course — the News Feed is only the spine of the entire Facebook product as we know it. But we know what the stakes are now: Bad information leads to destructive action.

Facebook should shave off the little calloused "Trending" nub, which does nothing for anyone, and then get to work reshaping how people share media on the platform. It's a ridiculously knotted problem: Writers rely on Facebook for traffic, and traffic means advertising revenue and food on our tables. It's hard to imagine modern media without the social network as a distribution platform. But the status quo isn't working — it hasn't worked for a long time.

There are potential "outs" for media to actually work well on Facebook. Maybe customized branding becomes more prominent. Maybe Facebook even offers a way for publishers to charge for content — a paid magazine, at least, suggests premium quality. Maybe it retools its algorithms to get smarter about legitimate content, uses fact-checking software, or even (re-)hires human journalists.

You'd think the possibilities would be endless for a company that has developed a working prototype of a drone that flies around large swaths of planet Earth, beaming internet access via lasers. For whatever reason, truthful news media obviously hasn't been a priority for Facebook despite promises to the contrary. We've seen the consequences. Now let's fix the problem.