What people misunderstand about Facebook's abuse reporting system

Facebook CEO Mark Zuckerberg makes the keynote speech at F8, Facebook’s developer conference, Tuesday, May 1, 2018, in San Jose, Calif. (AP Photo/Marcio Jose Sanchez)

The abuse-reporting process on social media websites is often a black box, with tech companies offering limited visibility into how it works.

But at the Collision tech conference in New Orleans, Karuna Nain, global safety manager at Facebook, shared a few insights into how the system works and the biggest misconceptions about it, along with a few tips.

Nain stressed that Facebook does in fact pay attention to abuse reports, and said that, generally, if a company has gone to the trouble of adding a flagging feature, it probably takes it seriously. On Facebook, at least, every report is reviewed by teams working in more than 40 languages, who compare the reported content or account against the company’s community standards. The team doesn’t just look at the rule a reporting user cites, but at all the rules.

However, it pays to be specific, Nain noted.

“I always tell people, report exactly what they want [community managers] to see,” she said. “Report the content, not the whole profile.”

The biggest misconception about Facebook’s reporting system, Nain added, is that people think filing more reports about abusive content makes it more likely to be taken down or addressed.

“It doesn’t work that way,” she said. “We treat every report the same. We ask you a series of questions and pay attention to them.”

Nain added that sometimes Facebook does make mistakes, which suggests that more reports would at least give the abuse reviewers another chance to remove abusive content, should it slip by an initial review.

During the same panel, Catherine Teitelbaum, Kik’s head of trust and safety, pointed to a study suggesting that reporting and blocking abuse have positive effects.

According to Teitelbaum, the Kik study found that kids who are resilient to online bullying often make frequent and effective use of the blocking features built into the platforms they use.

Giving people controls, such as blocking and specifying what content they want to see, is part of the key for Facebook, Nain added, because a single set of community standards cannot fit an online community of more than two billion people. (What Norway considers PG could be scandalous in India, Nain said.)

The way to govern such a diverse community, she said, was to not just have policies that define what users can share but to also give people tools to control their own experience.

Ethan Wolff-Mann is a writer at Yahoo Finance. Follow him on Twitter @ewolffmann. Confidential tip line: FinanceTips[at]oath[.com].
