The Question of Liability Looms Over Tech Companies Following Tragedies

A growing list of tech companies, including GoDaddy and PayPal, have turned their backs on Gab after one of the social media network’s users was arrested in connection with the Oct. 27 mass shooting at a Pittsburgh synagogue that left 11 worshipers dead.

But should those companies have acted sooner? Should they have acted at all? And should social media sites and other businesses be held legally liable for playing a role in atrocities in the same way that some want to make gun-makers answer for mass shootings?

“It’s tricky,” said John Carroll, a media analyst for NPR’s “Here & Now.” “This whole benign we’re-just-the-empty-vessel argument is having less and less of an impact for these tech platforms. There are just too many of these types of incidents, and there’s an increasing impression among the public that these tech companies either can’t or won’t regulate themselves.”

The outcry over Gab, which caters to the alt-right and describes itself as a defender of “individual liberty and free expression online,” could give traction to an effort to repeal Section 230 of the Communications Decency Act of 1996. The law shields social networking websites and other online service providers from being held liable for what users post on their sites.

“It basically says they’re not responsible for the content on their platform,” Carroll said. “But they are responsible for removing it in certain cases, including copyright infringement. They’ve been in a reactive situation for a long time and the public wants them to be proactive now.”

A spokesman for PayPal wrote in an email that the company had been “closely monitoring Gab and was in the process of canceling the site’s account before the tragic events occurred.” A GoDaddy spokeswoman wrote that the company told Gab to move its domain to another registrar after receiving complaints about the site “over the weekend,” when the shooting occurred. Both companies declined interview requests.

At least four other tech companies—Joyent, Stripe, Medium and, most recently, Pusher—have parted ways with Gab since the shooting. And Gab has dared Twitter to follow suit.

“Show the world you are a publisher and deserve to have Section 230 revoked,” Gab wrote in the taunting tweet. Gab asserted that its account, which remained active Tuesday, could not violate Twitter’s terms of service because it was “newsworthy.”

In the absence of government oversight, social media users are going to keep pressuring tech companies to take responsibility for the content on their platforms, said David Karpf, an associate professor of media and public affairs at George Washington University.

Karpf noted that GoDaddy, which he said is known more for its racy Super Bowl ads than its progressivism, must have been “feeling tremendous pressure from users” when it gave Gab the boot.

And that’s not entirely fair, according to Gad Allon, a professor of operations, information and decisions at the Wharton School of the University of Pennsylvania. He said tech companies are too often held to a higher standard than other service providers with a more solid footing in the offline world.

“We might not like what we see, but take that same behavior and ask if it can be done in any venue that is not online,” he said. “Can I lease an office space to a firm like Gab? The answer most likely would be yes. You cannot decline someone just because you don’t like the area they deal with. Should credit card providers decline to work with them? What about Verizon or Spectrum?”

Of course, tech companies can use service agreements to close the curtains on users who perpetuate hate or violence, and many of them do. Last year, Google cited its policy against hate speech when it banned Gab from offering its Android app in the Google Play Store. Gab responded by filing a lawsuit, which is ongoing.

“One perspective on this would be that tech platforms are going to have to come up with a kind of Good Housekeeping seal of approval,” Carroll said. “Whether it’s the industry or the government that does that, I think it’s in the best interest of these tech companies.”

But Allon’s not so sure. To him, the debate should hinge on whether the activity in question is illegal.

“What are the expectations that we have of the First Amendment and these social networks?” he asked. “Do we want Mark Zuckerberg to police speech on Facebook?”