Facebook says it mistakenly asked users for views on grooming

In today's episode of 'wtf was the tech industry thinking', Facebook has been caught asking users if they think it's okay for an adult man to ask a 14-year-old girl for "sexual pictures" in a private chat.

The Guardian reported that Facebook ran the survey on Sunday asking a portion of its users how they thought it should handle grooming behavior.

One question received by a Facebook user who was sent the survey read: "In thinking about an ideal world where you could set Facebook's policies, how would you handle the following: a private message in which an adult man asks a 14 year old girl for sexual pictures."

Facebook offered four multiple-choice responses that users could select, ranging from approving of such content being allowed on Facebook, to saying it should not be allowed, to stating they had no preference.

We reached out to Facebook to ask about its intentions with the survey, and also how many users received it, in which countries, and what their gender breakdown was.

A Facebook spokesperson emailed us the following statement in response:

We sometimes ask for feedback from people about our community standards and the types of content they would find most concerning on Facebook. We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey. We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.

The company declined to answer any specific questions, though we understand the survey was sent to thousands, not millions, of Facebook's 2.1 billion global users.

It's also unclear whether the company links any of the information it gathers from product surveys like these to individual Facebook users' profiles for ad targeting purposes. We've asked Facebook and will update this post if it provides clarification of how else it might use this kind of user-generated data.

Facebook's handling of child protection issues has sporadically attracted criticism -- including a year ago, after a BBC investigation found it was failing to remove reported child exploitation imagery. Though it's hardly the only social media firm taking flak on that front.

In May last year a UK children's charity also called for Facebook to be independently regulated, urging a regime of penalties to enforce compliance.

Since then there have also been wider calls for social media firms to clean up their act over a range of 'toxic' content.

So quite what Facebook's staffers were thinking when they framed this particular question is hard to fathom.

The law in the UK is unequivocal that it's illegal for adults to solicit sexual images from 14-year-old children -- yet the survey was apparently running in the UK.

According to the Guardian, another question asked who should decide the rules around whether or not the adult man should be allowed to ask for such pictures -- with responses ranging from Facebook deciding the rules on its own; to getting expert advice but still deciding itself; to experts telling Facebook what to do; and finally to users deciding the rules by voting and telling Facebook.

The survey also asked how users thought it should respond to content glorifying extremism, and to rank how important they felt it was that Facebook's policies were developed in a transparent manner; were fair; took into account different cultural norms; and achieved "the 'right outcome'", according to the newspaper.

Responding to its digital editor, Jonathan Haynes, after he flagged the issue on Twitter, Facebook's VP of product, Guy Rosen, claimed the question about adult men asking for sexual imagery of underage girls was included in the survey by "mistake".

"[T]his kind of activity is and will always be completely unacceptable on FB," Rosen wrote. "We regularly work with authorities if identified."

Last summer Facebook kicked off a community feedback initiative asking for views on a range of so-called "hard questions" -- though it did not explicitly list 'pedophilia' among the issues it was putting up for public debate at the time.

(But one of its 'hard questions' asked: "How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?" -- so perhaps that's where this error crept in.)

This January, in the face of sustained criticism about how its user-generated content platform enables the spread of disinformation, Facebook also said it would be asking users which news sources they trust in an effort to engineer a workaround for the existential problem of weaponized fake news.

That response has itself been pilloried, though -- as likely to further exacerbate the filter bubble problem of social media users being algorithmically stewed inside a feed of only their own views.

So the fact Facebook is continuing to poll users on how it should respond to wider content moderation issues suggests it's at least toying with the idea of doubling down on a populist approach to policy setting -- whereby it utilizes crowdsourced majority opinions as a stand-in for locally (and thereby contextually) sensitive editorial responsibility.

But when it comes to pedophilia the law is clear. Certainly in the vast majority of markets where Facebook operates.

So even if this ethical revisionism was a "mistake", as claimed, and someone at Facebook wrote a question into the survey that they really shouldn't have, it's a very bad look for a company that's struggling to reset its reputation as the purveyor of a broken product.

Asked for comment on the survey, UK MP Yvette Cooper, who is also chair of the Home Affairs Select Committee -- which has been highly critical of social media content moderation failures -- condemned Facebook's action, telling the Guardian: “This is a stupid and irresponsible survey. Adult men asking 14-year-olds to send sexual images is not only against the law, it is completely wrong and an appalling abuse and exploitation of children."

"I cannot imagine that Facebook executives ever want it on their platform but they also should not send out surveys that suggest they might tolerate it or suggest to Facebook users that this might ever be acceptable,” she added.

The approach also reinforces the notion that Facebook is much more comfortable trying to engineer a moral compass (via crowdsourcing views and thus offloading responsibility for potentially controversial positions onto its users) than operating with any innate sense of ethics and/or civic mission of its own.

On the contrary, instead of facing up to wider societal responsibilities -- as the most massive media company the world has ever known -- in this survey Facebook appears to be flirting with advocating shifts to existing legal frameworks that would deform ethical and moral norms.

If that's what Zuck meant by 'fixing' Facebook he really needs to go back to the drawing board.