Exclusive: An in-depth look at Facebook's content police

President Donald Trump and Massachusetts Democratic Senator Elizabeth Warren don’t agree on much — but both have condemned Facebook (FB) for how it polices posts and ads on its site. So have concerned parents, human rights advocates, United Nations investigators, and U.S. House Speaker Nancy Pelosi.

Concerns about misinformation on Facebook reached a fever pitch after the 2016 presidential election, the outcome of which some have attributed to a Russian disinformation campaign on the platform. When revelations surfaced last year that Cambridge Analytica, a consulting firm hired by the Trump campaign, had harvested the data of 50 million Facebook users, distrust of the company worsened.

In an exclusive interview at Facebook’s Menlo Park headquarters with the three executives who oversee content at Facebook, the company responded to its detractors.

The execs — Monika Bickert, Head of Global Policy Management; John DeVine, VP of Global Operations; and Guy Rosen, VP of Integrity — spoke at length about the company’s efforts to moderate content and adjudicate hot-button issues like hate speech, misinformation, and hacking.

With 2.4 billion monthly users spread globally across the company’s four primary platforms — Facebook, Instagram, WhatsApp, and Messenger — it’s no small task.

Zuckerberg and Sandberg are heavily involved with content issues

Just how hands-on are CEO Mark Zuckerberg and COO Sheryl Sandberg when it comes to sensitive content issues at Facebook? According to top executives at the social media giant, they are “incredibly involved.”

That’s important because Facebook is under fire from regulators on several fronts, including possible antitrust violations, freedom of speech, data security, and national security. The attention of its two top executives reflects the social media giant’s priorities and shapes how it responds to myriad criticisms and challenges.

The three executives who spoke to Yahoo Finance describe the process of navigating sensitive content issues as “nuanced,” one in which intelligent, well-intentioned participants often disagree. They, and others at the company, spoke of Zuckerberg and Sandberg’s stepped-up involvement in that process.

“Any time that we're dealing with something that is close to the line or it's something where it's not really clear how the policies apply or it's something that's particularly important, we will, at the very least, send an email up to Mark and Sheryl so that they know what's going on,” Bickert told me. “Very often, we will end up having a back-and-forth with them about why we're making the decision we're making, and make sure they're OK with it.”

Facebook CEO Mark Zuckerberg speaks during preparation for the Facebook Communities Summit, in Chicago. (AP Photo/Nam Y. Huh)

With the president, as well as high-level Democrats, highly attuned to the vagaries of social media, it’s understandable that Zuckerberg and Sandberg would want to be kept in the loop when these issues crop up. It might also make it difficult, however, for them to later claim they were out of the loop on any content issue at the company.

“The leadership is very involved,” says DeVine, who runs the global operations of content management, including oversight of the 15,000 full-time employees, part-time employees, and contractors who now monitor content. “At a minimum, on a weekly basis, we're all sitting down, the three of us, as well as a group of other people and Sheryl and Mark and going over some of our most important topics that week, to check to see, are we getting it right.”

“Mark [is] incredibly involved in...the deepest, hardest, especially product issues, that we're looking at right now,” says DeVine. “The involvement is very deep.”

[Read more about Mark Zuckerberg and Sheryl Sandberg’s involvement]

'We do not' have an anti-conservative bias

When asked whether the company has an anti-conservative bias, Bickert explicitly told Yahoo Finance, “We do not.”

The topic of supposed anti-conservative bias at tech giants has been a favorite of President Donald Trump, who in July accused Facebook, Google, and Twitter of “terrible bias” against him and his supporters. Earlier, in April, Republican senators claimed in a hearing that the same tech giants muzzle conservative voices.

When asked how users can be sure there’s no anti-conservative bias, Bickert pointed to the fact that back in April 2018 Facebook published the internal guidelines it uses to decide whether to remove posts.

“When John's [DeVine’s] team members are looking at a piece of content and making a decision, we want people to understand that they're not applying their own subjective beliefs about what they think should be on the site,” Bickert said. “They have very granular rules that they have to apply. And those rules are now public for people to see.”

Still, in August, Facebook released the findings of a report it commissioned on anti-conservative bias, which concluded that “there is still significant work to be done” to satisfy conservatives who believe the social network discriminates against them.

The investigation was conducted by former Republican Senator Jon Kyl and his legal team at Covington & Burling LLP.

[Read more on Facebook’s position on bias]

Facebook doesn't want to decide 'what's true and what's false’

Anti-vax myths, distorted Nancy Pelosi videos, a conspiracy theory that a recent mass shooter was a supporter of presidential candidate Beto O’Rourke — misinformation abounds on Facebook. The three executives said the company has made progress addressing false posts but still struggles to identify them, especially in the most high-stakes regions where misinformation can lead to deadly violence.

“We don't want to be in the position of determining what is true and what is false for the world,” says Bickert. “We don't think we can do it effectively.”

“We hear from people that they don't necessarily want a private company making that decision,” she adds.

Reluctant to judge veracity on its platform, Facebook partners with fact-checking organizations that vet posts, an arrangement that began after the 2016 presidential election. But Bickert acknowledged that the company often lacks such partnerships in violence-prone regions.

“The sad reality is, in the places in the world where you are most likely to have on the ground violence, those are often the same places where it's hard to have a fact-checking partner, or even a safety organization, tell us what the real situation is on the ground,” she says.

Facebook Head of Policy Management Monika Bickert. (Photo by Chip Somodevilla/Getty Images)

The Mueller Report, released in April, detailed Russia-operated Facebook Groups like “United Muslims of America” and “Being Patriotic” that each had hundreds of thousands of followers.

“There's always going to be continued challenges,” Rosen says. “And it is our responsibility to make sure that we are ahead of them and that we are anticipating what are the next kind of challenges that bad actors are going to try to spring on us.”

[Read more about Bickert’s thoughts on misinformation]

Election protections have ‘evolved a lot’ since 2016

With the 2020 presidential election rapidly approaching, Facebook has a lot to prove this time around.

In the fall of 2017, Facebook disclosed that Russian operatives trying to influence U.S. politics had put up 80,000 posts that may have reached roughly 126 million Americans. Though Facebook wasn’t the only social media company targeted, it was certainly one of the largest.

Facebook’s approach to election protections has changed since the 2016 presidential election, Rosen told Yahoo Finance.

“If you think about how we approach elections broadly, not just 2016, how we think about the work we do going forward,” he said, “it's evolved a lot in the past few years.”

A flag of the United States is shown between monitors as workers sit at their desks during a demonstration in the war room, where Facebook monitors election-related content on the platform, in Menlo Park, Calif., Wednesday, Oct. 17, 2018. (AP Photo/Jeff Chiu)

‘New types of threats’

Rosen said the company has built a broad line of defense. “A lot of what we have done and built in the past few years is trying to address holistically all the range of problems that we see across elections,” Rosen said. “And we've implemented this, not just for 2020, but for the 2018 midterms, for India and Indonesia and the EU parliament, who had really big elections earlier this year.”

In 2016, Rosen explained, the company’s biggest concerns were account hacking and leaked information, but that has since changed. “I think all of us and the world have learned that there are new types of threats, there are governments that are looking to influence public opinion, to distribute information,” he said.

In May, Facebook released a post detailing the company’s approach to fake accounts as part of its “Hard Questions” series, which addresses the impact of Facebook’s products on society.

[Read more on election protections at Facebook]

Facebook execs call for government ‘regulatory standard’ for content

The three executives called for government regulation of posts that appear on the site, saying the company has helped public officials better understand the platform and how to write rules for it.

But the executives did not provide specifics about the potential rules or how they may be enforced on a platform with 2.4 billion monthly active users worldwide.

“We think there does need to be a regulatory standard for how to deal with content,” says Rosen.

“We sit down with governments and we share, this is how our systems work, this is how content is reported, how it's reviewed,” he adds.

Facebook's CEO Mark Zuckerberg, left, meets with French President Emmanuel Macron at the Elysee Palace after the "Tech for Good" summit, in Paris, Wednesday, May 23, 2018. (Christophe Petit Tesson/Pool via AP)

‘How do you measure what is effective?’

Zuckerberg first called for government regulation of content in a March op-ed in the Washington Post, urging “third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards.”

Since the site’s users span the globe, potential regulation raises questions of how to make rules and enforce them, Rosen says.

“The world is trying to figure out what that looks like,” Rosen says. “How do you set benchmarks? How do you measure what is effective?”

“For us, it’s making sure that we're out there, and we're opening up, and we're sharing everything we've learned, and what we think sets up the right kind of measures and incentives, is exceptionally important,” he adds.

[Read more on Facebook’s call for a regulatory standard]

Facebook to appoint members to its oversight board by the end of 2019

Facebook plans to appoint members to its content oversight board later this year.

“We will be releasing a charter that will explain, basically, how the board is going to work,” Bickert said. “We are still in the process of figuring out some of the specifics and exactly who will be on it, but this is something that we are hoping to have people in place for the board by the end of this year.”

The oversight board will rule on whether certain types of content are allowed on Facebook, as the company grapples with how to handle controversial user posts.

“We are creating a council, an oversight board, to which people can appeal,” Bickert added. “If they go through a Facebook appeal and they don't like our decision, they can appeal and ask this board to take a look and make a decision about whether or not that piece of content should be on Facebook. And that decision will be binding.”

Facebook has held events in various cities across the world throughout 2019 “with hundreds of folks that are experts in freedom of expression and internet and governance and digital rights,” said Rosen.

The company is also taking steps to ensure the board remains independent in its decision making, especially as it relates to how board members are compensated.

“We need to create some sort of independent entity that we pre-fund, because we do need to make sure the board has the facility to operate and to compensate people for their time,” Rosen added. “But we don't want to be involved in any of those decisions on an ongoing basis, because we need to make sure that they can be truly independent and they can overrule us.”

[Read more on Facebook’s plans for content oversight]

The following reporters contributed to this report: Max Zahn, Heidi Chung, Scott Gamm, and Adriana Belmonte.

Andy Serwer is editor-in-chief of Yahoo Finance. Follow him on Twitter: @serwer.
