Why Online Comments Suck (and How to Fix Them)

Decades of upgrades to the Internet have yet to fix one nagging problem: the people on it.

The recent drama at Reddit is just the latest example. The managers of that expansive archipelago of forums have apparently just realized that the site provides a platform for some singularly hateful people. And yet they don’t seem sure what to do about it, beyond hiding some of the site’s uglier neighborhoods from public view.

But let’s be fair to co-founder and now chief executive Steve Huffman (whose predecessor Ellen Pao resigned under fire after removing a community leader and banning some hate-mongering “subreddits”): People have been banging their heads against this problem for a long time.

We’ve Seen This Movie Before

Last Thursday, Huffman said (via Reddit post, of course) that the site would bury but not ban “content that violates a common sense of decency.” The most common response to that was: What about your commitment to free speech?

But Reddit is neither the government nor the Internet at large. It’s a for-profit firm that raised $50 million in financing last fall and sells ads to name-brand companies.

For any company in that kind of business, unsupervised comments represent a risk. When some people inevitably act like jerks online, others may flee, and advertisers may not stick around either.

Twitter, for example, has proudly labeled itself “the free speech wing of the free speech party.” But after realizing that it was enabling online harassment, the service has had to come up with better tools for fighting abuse. The Islamic State’s use of tweets to broadcast mass-murder multimedia led Twitter to ban the promotion of terrorism.

Nonprofit forums can also collapse from trolling. The distributed system of message boards called Usenet foundered in the late ’90s largely because some of the unmoderated “newsgroups” became ungovernable.

“What would happen in those groups was lots of trolling, lots of abusive material published,” said Purdue University computer-science professor Gene Spafford, a pioneering Usenet organizer. “The majority of the people who wanted useful distribution mechanisms left, because the trolls just polluted it.”

Rules on the Books Aren’t Enough

Ellen Pao (Photo: Reuters)

Reddit reminds me of Usenet in good and bad ways. Like newsgroups at their best, it can be a never-ending source of help, humor, and humanity. It has informed my coverage and given me useful feedback.

But it also harbors a nasty undercurrent of hate in forums proudly devoted to sexism, racism, anti-Semitism, and worse. Reddit’s seedy side also figured in last year’s “celebgate” sharing of stolen nude photos.

Reddit has its rules, but the company puts much of the responsibility for policing users on the moderators of subreddits. And if those administrators themselves are bigots (there’s a fascinating study of Reddit’s spectrum of attitudes), you have a nasty feedback loop.

Reddit, like some other online forums, does offer one incentive for civility that Usenet lacked: Your account can be anonymous, but anybody can see the numeric “karma” score you earn when other Redditors rate your input.

That basic level of accountability — more than a ban on anonymity — is what can steer an online community in the right direction. But such technical decisions are often delayed until it’s damage-control time.
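To make that concrete, here’s a toy sketch of how a karma-style tally might work; this is a hypothetical model for illustration, not Reddit’s actual scoring code. The account stays pseudonymous, but its running score is public:

    # Toy model of a karma-style reputation score (hypothetical;
    # not Reddit's actual algorithm). The account is pseudonymous,
    # but its vote tally is visible to everyone.
    from dataclasses import dataclass

    @dataclass
    class Account:
        name: str
        upvotes: int = 0
        downvotes: int = 0

        def rate(self, delta: int) -> None:
            # Record one reader's up (+) or down (-) vote.
            if delta > 0:
                self.upvotes += 1
            else:
                self.downvotes += 1

        @property
        def karma(self) -> int:
            # The public score shown next to the account name.
            return self.upvotes - self.downvotes

    user = Account("throwaway123")
    for vote in (+1, +1, -1):
        user.rate(vote)
    print(user.name, user.karma)  # throwaway123 1

Even a number that simple creates a track record: a fresh, zero-karma account reads differently from one that has earned the community’s trust.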

Journalist Sarah Jeong makes this point in her just-released book The Internet of Garbage, comparing the massive investment companies have made in fighting spam with the little they’ve done to combat trolling. Often, it’s left to outsourced employees to process abuse reports.

How Do We Fix This?

That won’t do the job. You need committed humans who feel ownership of the rules and who show up in the forums — which can cost money. As entrepreneur and writer Anil Dash pithily summarized it in a post four years ago, “If your website’s full of assholes, it’s your fault.”

And there’s no one-size-fits-all comment-moderation system ready for Reddit or anybody else to install. As Jeong told me, “What works for Facebook may not work for Twitter, may not work for Reddit, and so on.”

(FYI: At Yahoo Tech, writers and editors can promote or demote comments, and those we reply to also move up.)
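As a rough illustration of that kind of ranking (hypothetical code, not the actual Yahoo Tech system), a comment’s position could combine reader votes with a staff promote or demote and a bump for comments a staffer replied to:

    # Hypothetical sketch of staff-assisted comment ranking, loosely
    # modeled on the scheme described above; not Yahoo's real code.
    def rank_comments(comments):
        """Sort comments so promoted and replied-to ones float up."""
        def score(c):
            s = c["votes"]                      # ordinary reader votes
            s += 10 * c.get("staff_boost", 0)   # editor promote (+1) / demote (-1)
            if c.get("staff_replied"):          # a writer or editor replied
                s += 5
            return s
        return sorted(comments, key=score, reverse=True)

    comments = [
        {"text": "First!", "votes": 2},
        {"text": "Thoughtful question", "votes": 1, "staff_replied": True},
        {"text": "Spam link", "votes": 0, "staff_boost": -1},
    ]
    for c in rank_comments(comments):
        print(c["text"])

The weights here are arbitrary; the point is that a small, visible thumb on the scale from the people who own the forum’s rules can shape which voices readers see first.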

But some of the people most experienced with Web forums, like Dash, remain convinced this is a solvable problem with settled principles.

My friend Esther Schindler, a prolific Redditor who has been active in online communities since running a CompuServe forum in 1990, compared the moderator’s job to two core tasks of running a bar: “The barkeep makes everyone feel welcome; the bouncer ensures that people play by the rules.”

That enforcement should be visible and understandable. “If the moderator steps in publicly (but never meanly), the community will (a) recognize that this is a safe place and (b) feel empowered to apply the rules when you aren’t around.”

Greg Barber, digital news projects editor at the Washington Post, made the same point, saying that clear, consistent moderation can convey a site’s culture to its members. “The best communities I’ve seen at the Post are ones that have evolved organically, with community members playing a part in setting and maintaining the tone.”

Barber also works on the Coral Project, an effort by the Post, the New York Times, and Knight-Mozilla OpenNews to develop better online-community systems. They recently launched a Tumblr blog highlighting insightful feedback from readers across the Web.

But it may take years of further work before this blog’s title stops instilling an instant sense of dread in many of you: “Do Read the Comments.”

Email Rob at rob@robpegoraro.com; follow him on Twitter at @robpegoraro.