A mass shooting in Christchurch, New Zealand, in March was streamed live on Facebook; another, in El Paso, Texas, last month, involved a suspect who posted a racist diatribe online beforehand. The incidents brought scrutiny to the inflammatory role of hate speech online, especially on popular platforms like Google (GOOG, GOOGL), Twitter (TWTR), and Facebook (FB).
On Wednesday, U.S. senators are questioning officials from those three companies at a hearing on extremism online, the latest in a string of Capitol Hill inquiries focused on Big Tech.
Among those testifying on Wednesday is Monika Bickert, head of global policy management at Facebook, where she oversees the rules that determine what the site’s 2.4 billion monthly active users are allowed to post.
In a recent exclusive interview at Facebook’s Menlo Park headquarters, three executives who oversee content on the platform, including Bickert, addressed hot-button issues like hate speech, misinformation, and the platform’s alleged anti-conservative bias.
The execs — John DeVine, VP of Global Operations; Guy Rosen, VP of Integrity; and Bickert — spoke at length about the efforts the company has made to identify and remove harmful content.
The site’s hate speech guidelines, which the company made public last April, rely on blunt rules, Bickert said.
“We write rules that are very objective,” Bickert says. “One of those rules is, if you are going to talk about people by…[a] characteristic, like race, religion, or gender. If you're going to say all these people, so all people of this religion are scum, or these people don't belong on our planet, or whatever the situation is, that's something that we would classify as hate speech and remove it.”
She acknowledged that the rules on the platform cannot always accommodate the nuance intended by a given statement.
“In terms of hate speech, look, one of the challenges here is it's really hard for us to know the context of why a specific person says a specific thing,” she says.
“Sometimes people will use coded language,” she adds. “Sometimes, I may say something to you and it may sound like a joke to me, but you may feel bullied by it. So trying to get those lines right is one of the challenges that we will always have.”
Public inquiries into Big Tech are underway in both chambers of Congress. In June, the U.S. House Judiciary Committee held the first in a series of antitrust hearings on whether and how to address the concentration of market power in Big Tech.
The hearing on Wednesday of the Senate Committee on Commerce, Science, and Transportation is titled, “Mass Violence, Extremism, and Digital Responsibility.”
“In light of recent incidents of mass violence, this hearing will examine the proliferation of extremism online and explore the effectiveness of industry efforts to remove violent content from online platforms,” says a description on the committee’s website. “Witnesses will discuss how technology companies are working with law enforcement when violent or threatening content is identified and the processes for removal of such content.”
In addition to Bickert, testimony will be heard from Nick Pickles, Public Policy Director at Twitter; Derek Slater, Global Director of Information Policy at Google; and George Selim, Senior Vice President of Programs at the Anti-Defamation League.
Andy Serwer is editor-in-chief of Yahoo Finance. Follow him on Twitter: @serwer.