Facebook board that will eventually decide Trump’s fate overturns several content decisions

LONDON — An independent group tasked with evaluating how Facebook handles online content on Thursday reversed the social media giant’s decision to delete content in four out of five test cases — its first rulings on the company's content decisions.

The cases — which dealt with hate speech, Covid-19 disinformation and other content that may have broken the tech giant’s digital rules — provide the first window into how the board may weigh Facebook's decision to lock former President Donald Trump’s account in the wake of the Capitol riot.

The so-called Oversight Board, a collection of legal and human rights experts whose decisions are binding on how the company treats potentially divisive online content on its global platform, made its rulings as tensions continue over the role social media companies play in fomenting unrest online.

The board said it will start allowing people to submit public comments for the Trump review on Friday. A decision in that case will be announced by April.

“We think, after careful consideration, that first of all there were difficult cases, but we don’t think [Facebook] got it right,” said Helle Thorning-Schmidt, the former Danish prime minister and co-chair of the Oversight Board. “We’re saying to Facebook that they need to be better at telling users why their content is getting removed.”

Among the posts Facebook deleted were one quoting Nazi propagandist Joseph Goebbels, one showing women’s breasts and another it said incited hate speech against Muslims.

In its first round of decisions made public Thursday, the Oversight Board spent almost two months reviewing a series of Facebook posts the company had initially removed for breaking its content rules.

When asked if Thursday’s decisions could be seen as a precedent for the upcoming ruling on Trump’s Facebook account, Thorning-Schmidt said no. “You can't read anything into that,” she said.

The group has the power to determine whether such deletions were justified or unfairly restricted people’s freedom of speech, but the experts are not able to review Facebook posts that remain online.

That will change in the next couple of months, Thorning-Schmidt added, and the group will be given the power to adjudicate on posts that Facebook has not removed.

The board is run separately from the company, but its $130 million budget is provided by the tech giant. Online users or the company can ask the body to review cases, and more than 150,000 referrals have been submitted since October. The group plans to announce its next round of cases on Friday.

Myanmar, Covid drugs

In one ruling, the board said that a post from a user in Myanmar that appeared to criticize Muslims, which Facebook had removed for breaching its hate speech standards, should be reinstated; while the comments could be seen as offensive, they did not meet Facebook’s own definition of hate speech.

In another, the group said that a deleted Facebook post from France which criticized local officials’ failure to use hydroxychloroquine, an antimalarial drug, to treat Covid-19 — a debunked claim that remains widely popular across the country — should also be returned to the social media platform because it did not pose an imminent harm to people’s lives.

A third decision ordered Facebook to reinstate an Instagram post from Brazil that included female nipples as part of a breast cancer awareness campaign, which the company’s automated content moderation system had initially removed for running afoul of the photo-sharing app’s nudity policy. Facebook eventually reposted the image on Instagram, but outside experts criticized the company for failing to apply sufficient human oversight to such automated decisions.

“Everyone can see that these are not easy cases and it has been difficult to come to a final decision,” said Thorning-Schmidt, adding that not all of the rulings were backed universally by the group’s members.

The only case in which the Oversight Board agreed with Facebook’s decision to remove a post involved a Russian-language attack on Azerbaijanis that the experts agreed had broken the company’s hate speech standards.

Despite the outside group’s willingness to overturn Facebook’s handling of potentially dubious posts across its platform, not everyone has welcomed the increased oversight.

Damian Collins, a British lawmaker and co-founder of The Real Facebook Oversight Board, a campaigning group critical of its namesake, said that the body’s inability to review Facebook’s wider content moderation policies, and its failure to rule on potentially harmful posts that remained on the platform, made its work mostly toothless.

“These types of decisions should not be left to Facebook,” he said. “The decision to remove content or not should be in the hands of a government or politically-elected figures.”