Inside Facebook: The Content Police (Transcript)

ANDY SERWER: Hello, everyone. I'm Andy Serwer. And welcome to "Yahoo Finance presents Inside Facebook-- The Content Police." I'm here in Menlo Park, California, at Facebook headquarters to delve into a subject that is top of mind for Mark Zuckerberg and Sheryl Sandberg, and that is moderating content on the social media giant's platform.

How does the company handle hate speech, hacking, and misinformation? This was the same thing with the Nancy Pelosi video, that you leave it up, but you mark it, because you want people to have a conversation about whether it's authentic or not.

MONIKA BICKERT: Well, it's a little more nuanced than that.

ANDY SERWER: OK. OK, explain that to me. Because that nuance is lost on people, I think.

MONIKA BICKERT: We don't want to be in the position of determining what is true and what is false for the world.

ANDY SERWER: Donald Trump has criticized Facebook for having an anti-conservative bias. Does Facebook have an anti-conservative bias?

MONIKA BICKERT: We do not.

ANDY SERWER: Mark Zuckerberg and Sheryl Sandberg, how often do you guys talk to them about your work, and how often do things rise to their level?

MONIKA BICKERT: Very often.

JOHN DEVINE: Once or twice.

ANDY SERWER: OK, all the time.

GUY ROSEN: When we were starting to scale up this operation, and just as the process was kicking off, I get a note from Mark. And he says, hey, before I even start it with anyone else, how much do you guys need?

ANDY SERWER: I'm joined by the three top executives who oversee this area, Monika Bickert, head of global policy management, John DeVine, VP of global operations, and Guy Rosen, VP of integrity. I want to get back to what that title means, Guy, in particular. Monika, let's start with you-- actually, all three of you. And tell us a little bit about what your roles are.

MONIKA BICKERT: Sure. I'm Monika Bickert. And my team is responsible for the content policies.

Those are basically the public rules that tell people what you can and cannot post on Facebook. Before I came to the company, I spent more than a decade as a federal criminal prosecutor for the Justice Department. So I was already working on safety issues. And when I came into the private sector and joined Facebook, for me, a lot of that was about finding another way that I could still work on the safety issues that have been really the driving force of a lot of my career.

When I joined the company seven years ago, just to kind of give you a sense of how things have changed, we're talking about a handful of people on my team working on this. Whereas now, if you fast forward, we've got more than 100 people in 11 offices around the globe with backgrounds in everything from law enforcement to safety NGOs to academia and so forth. But a lot of our challenges, day to day, are, how do we write a set of rules that are globally applicable across more than 2 billion people, and how do we also make sure that we can actually enforce those rules, and that they're not just sort of lines on a paper?

ANDY SERWER: And these guys are your partners in that. And John, you've got a lot of people reporting up to you. What do you do exactly?

JOHN DEVINE: Great. It's good to see you again, Andy. Welcome to Facebook.

ANDY SERWER: Thank you.

JOHN DEVINE: Well, we work closely with both of these leaders. And as Monika said, I run a global operations team. And so there's just people around the world, around the globe.

And our job-- a big part of that job-- is content moderation. And so we're taking those policies that Monika described and just making sure that the content that people are posting on Facebook every day meets those policies. So it's a tremendously challenging task.

There's an enormity and a scale to it. But it's something we take very seriously. And we rely a lot on those policies. And we rely a lot on the tools and the product that Guy builds.

ANDY SERWER: All right, Guy, so the integrity title, tell us what that is. I mean, you're a product guy, essentially, right?

GUY ROSEN: Right, right. So I lead the teams that work on the product and engineering side of what we call safety and integrity. That means all things related to abusive content. So it can be anything from fake accounts to hate speech to spam on the site, and including things like electronic security.

Now, these teams have been around for a very long time. Actually, one of the very first teams that did this was called site integrity. And we kind of owe the name to that original team back in the day.

And back then, though, the teams were pretty small. And the technology was very reactive. We needed people to report content. We needed reviewers to review content in order for us to take it down.

Now, fast forward. In the past couple of years in particular, as a company, we've massively invested in the space. I mean, across our teams, there's now over 30,000 people working on safety and security, all included. And the technology has also come a long way. And we can now be much more proactive, getting to bad content before people report it to us, sometimes before people even see it on the site.

Now, there's a lot of progress we've made. In many areas, most of the content we take down is detected proactively before people report it to us. But there's also a lot of work we still have to do. And what we've learned, especially working together, this crew, in the past few years, is, to do that work, we need to work across these three disciplines really, really closely as we build out these efforts.

ANDY SERWER: You said most of it's taken down proactively. I understood that, actually, a lot of it was surfaced by users.

GUY ROSEN: So we actually publish, even, numbers these days. So one of the things we've done over the past few years is try to take a very methodical approach to how we approach our work on content moderation. And that means measuring how much content is being taken down, how much of it is detected by our systems versus how much is reported by people, and how much we leave up, which is actually the piece that's the most important. How much of it is missed by our systems and by our processes? And that's what we really want to focus on.

And in the report that we now issue-- we've issued it twice a year. I think we've now issued three of these reports-- it shows that, for a number of areas-- let's take something like graphic violence-- around 99% of the content we take down is detected by our systems before people report it to us. Same for areas like nudity and fake accounts. Even hate speech, which is really hard, because a system has to understand language and nuance-- we went from very little proactive detection just a year and a half ago to, now, about 2/3 of the hate speech we take down being detected by our systems.
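
For readers who want to see the shape of that metric, here is a minimal Python sketch of a proactive-detection rate like the one Guy cites; the field names and sample data are hypothetical, not Facebook's actual schema.

def proactive_rate(actioned_items):
    """actioned_items: iterable of dicts with a 'detection' field set to either
    'proactive' (system-flagged) or 'user_report' (flagged by a person)."""
    total = 0
    proactive = 0
    for item in actioned_items:
        total += 1
        if item["detection"] == "proactive":
            proactive += 1
    return proactive / total if total else 0.0

sample = [
    {"id": 1, "detection": "proactive"},
    {"id": 2, "detection": "proactive"},
    {"id": 3, "detection": "user_report"},
]
print(f"proactive rate: {proactive_rate(sample):.0%}")  # prints "proactive rate: 67%"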

ANDY SERWER: One of the problems, I guess, is that people remember the stuff that gets through, and they don't remember your successes, I guess, by definition, in a way. Go ahead, Monika.

MONIKA BICKERT: Well, one of the things that's interesting about how these teams have to work together for hate speech-- I mean, one of the challenges is writing the right rule. Because people see speech so differently. Then when we have Guy's team detecting this content, they can often tell what might violate. And they're very good at that.

But then, ultimately, we often need a person on John's team to look at it to make that determination. Let's say we're talking about the use of a racial slur. Well, somebody might use that to attack somebody else, in which case we would remove it from the site.

Somebody else could say, this morning, when I was riding the bus, I heard somebody use this word. It was really upsetting. And that's the sort of content we would want to leave on the site. And so the technology has been instrumental in helping us to find this content. And the people are instrumental in helping us understand it.

ANDY SERWER: Right. I mean, ultimately, you do need people to weigh in and oversee some of the machine learning and the algorithms. Of course, we'll get into that in a minute. I want to ask you about the content standards forum where you set these rules. How does that work, Monika?

MONIKA BICKERT: Well, one thing we have to recognize is that, as speech continues to change, and as we have new events in the world, we need to make sure that we are refining our policies to stay current. So every other Tuesday, we gather around the world on a video conference call-- and I'm talking, you know, more than 70 people on this call from different teams across the company-- lawyers, and engineers, and operations specialists, and safety specialists. And we come together to discuss specific refinements that we are considering for the policies.

You know, to give you an example, it might be, should we change the way that we are thinking about this type of hate speech? Or what should we do with images of fetuses or images of capital punishment? And many times, we will have groups outside of the company that feel very strongly about this issue on both sides.

So a big part of the process for us when we come to those meetings is speaking to stakeholders beforehand from around the world, and then presenting those findings in the meeting and saying, you know, here's how much of this speech we're seeing on Facebook, here's how people are feeling about this in the safety community and the freedom of expression community around the world, and here are our options for dealing with it, and then coming up with a decision that makes sense for the entire global community.

ANDY SERWER: John, and you oversee these content moderators. And there's thousands of them now-- maybe 15,000 full-time employees and contractors. Can you talk a little bit about that network of people around the globe and how they make decisions?

I mean, some people have complained. You're asking people like that to make decisions where they really almost need a law degree and to make a decision very quickly. So what is that like?

JOHN DEVINE: Well, it's hard. It's hard at scale. I joke that Guy has the easy job. He takes a lot of easy decisions. We get the hard ones.

But we're proud of that. You know, and you need human beings. We want human beings to be part of that decision process.

What is it like? Well, it involves working with partners and vendors around the globe. So it's, you know, just getting the people there to start building great policies.

We take those policies and are constantly refining them, learning how the reviewers are reading the policies, making them translatable in different languages. We operate in 50-plus languages around the world. So it's a Herculean task, just in terms of language and culture and context.

And so it's an iterative process. And it's grown massively in the past couple of years. I mean, a couple of years ago, this was really nothing like what I'm describing now.

We were mostly reactive. This was very small. It's gotten very big very fast.

One of the things I'm very proud of here is the way we're, I think, truly a learning organization. We do an incident review meeting every Friday, where we take-- and there's plenty of them. Because like you said, probability is, you get 99 right. You know, there's always going to be one you can look at.

And every week, we unpack that one. And we look at the policy. We look at the protocols, you know, the training material.

We look at the enforcement, the quality and the training that I'm giving to those people. And we look at the policy and we say-- you know, and the product-- what could we have done better in these cases? And we're constantly iterating that process.

But I think we're building something new here at Facebook. And it's also an industry challenge. This is an industry challenge. So we're working very closely with our peers and our vendors to get that right.

ANDY SERWER: Do you ever consider, John, the adverse effects on user traffic when you're deciding whether or not to implement a new rule? Or maybe that's more for you, Monika. Guy, you're shaking your head.

GUY ROSEN: No.

ANDY SERWER: You don't?

GUY ROSEN: This is just important. As we think about the company, if you even think about how Mark talked about the priorities for the company, how the company's performance is measured, which ultimately rolls into compensation for all of our employees, the first pillar in that is to make progress on the social issues that affect the internet and our company. And that is a first-level citizen for us, because we know we have to get this stuff right. It is a commitment. And I've heard it straight from the top.

MONIKA BICKERT: The reality is that, long-term, if you want people to come to Facebook and have this place to share, then it has to be a safe place. And we've seen-- you know, history has sort of taught that lesson. And when you look at the internet, if there are platforms that are not considered safe, then people don't want to use them. So getting this right is actually fundamental to building this community we're trying to build.

ANDY SERWER: I want to dive into some specific incidents that are high profile that have vexed you guys or that you guys have had to address, some successfully maybe, some more successfully than others. Let's start with China and the protests in Hong Kong. There were stories that talked about how the Chinese government had used Facebook and other social media platforms to put out news that supported their side of the situation in Hong Kong, and surreptitiously, without disclosing that it was the Chinese government. How did you guys discover that?

MONIKA BICKERT: Well, one of the things that's been important for us in the past few years has been developing partnerships with academics and experts who are looking at this sort of behavior, and also with other companies. And so it was through those partnerships here that we became aware of this behavior. And I would say this is sort of a trend that we've seen in the past few years of different stakeholders, these academics and other companies, working together to try and spot when people are behaving in what I'll call an inauthentic way.

And by that, I mean you might have a state actor, or you might have-- you know, or a political party, or somebody else who just cares about a political outcome trying to use social media services, deceiving people about who they are to try to push a certain political message. So over the past few years, we've built up a team across all of our organizations that's really focused on understanding how to define this sort of inauthentic behavior and how to tackle it. Every time that we are dismantling one of these operations, we're putting out a blog post describing it.

And that's actually what we did in the recent situation in Hong Kong. We put out a post saying, here's what we've identified, and here's what we're removing.

GUY ROSEN: One way to think about it is, it's like searching for a needle in a haystack, especially the sophisticated bad actors who are very good at this. They're constantly refining how they try to evade the different systems. And so the first part of it is the system that we have built at scale to try to tackle fake accounts. We take down billions of fake accounts every year. And then we have teams--

ANDY SERWER: Billions?

GUY ROSEN: Billions of fake accounts every year. Then we have teams who are searching for the needle, for the really sophisticated actors. These are expert security investigators who are looking for the most sophisticated ones, whether it's the Chinese example, or Russia, or Iran, or different takedowns that we've had in the past year.

And they're able to really find and spot, what is the next trend? What is the next kind of evolution in this arms race? How do we find and take down that bad activity?

ANDY SERWER: So with China, you said it was a partner that helped you identify this.

MONIKA BICKERT: Well, often, what we see is that--

ANDY SERWER: What kind of partner is that?

MONIKA BICKERT: --we may have a piece of the puzzle, and other companies, or academics, or security companies will have other pieces of the puzzle. So sometimes, this is about saying, we're seeing this behavior. Are you seeing this behavior? And trying to kind of match those puzzle pieces together.

And like I said, we've done this not just with this most recent takedown, but also around the world. We've worked with these companies and other industry partners to identify these groups and remove them. It is a challenge.

Because what some people would describe as inauthentic behavior, others would say, well, that's politics, especially if you're talking within a specific country. You've got an election, and it's in a certain country, and you've got one political party that's campaigning. And as part of that, they might try to appeal to other people, and they might say something about candidates using a different voice. You know, where do you draw the line on that?

ANDY SERWER: Right. I mean, just another drill-down point with China, that Twitter took down the state media posts, and you guys decided not to. Can you guys-- do you know about that? Can you address that at all?

MONIKA BICKERT: I don't know if there are specific posts you're referring to. But we do allow media to use our site. What's important to us is that people are being authentic.

ANDY SERWER: Right, OK. I want to go through some other examples. The Odessa shooter, with Beto O'Rourke's campaign manager saying that Facebook allowed a post that linked Beto O'Rourke to the shooter, saying that the shooter had a Beto O'Rourke campaign sticker on his truck.

Of course, he wasn't driving a truck, and he didn't have one. And that post got 34,000 likes, apparently, on Facebook. When did you see that?

Do you let that stay up? I mean, this is one of those things. Is that something you would keep up? Do you know about that one specifically?

MONIKA BICKERT: That specific post, you may be able to address. But we should probably talk more broadly about what we do with misinformation.

GUY ROSEN: I can talk broadly about misinformation. Like, the program that we have, the goal is to tackle viral misinformation. We don't want these things gaining a large amount of traction across our platforms.

And the challenge and the thing that we are really focused and working on is, how do you get to them quickly enough so that we can work with third party fact-checkers, for example, to check and say that this is actually something that didn't happen? And what we do is, then, we demote that content so that it doesn't get as much distribution on our platforms.
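
As a rough illustration of the demotion step Guy describes-- rate as false, then reduce distribution rather than remove-- here is a hedged Python sketch; the rating labels and multipliers are invented for the example and are not Facebook's actual values.

from typing import Optional

# Hypothetical rating labels and demotion multipliers, invented for illustration.
DEMOTION_MULTIPLIERS = {
    "false": 0.2,            # heavy demotion for content rated false
    "partly_false": 0.5,
    "missing_context": 0.7,
}

def adjusted_rank_score(base_score: float, fact_check_rating: Optional[str]) -> float:
    """Scale a post's base ranking score down according to its fact-check rating,
    so rated content gets less distribution instead of being removed."""
    if fact_check_rating is None:
        return base_score
    return base_score * DEMOTION_MULTIPLIERS.get(fact_check_rating, 1.0)

print(adjusted_rank_score(100.0, "false"))  # 20.0 -> far less distribution
print(adjusted_rank_score(100.0, None))     # 100.0 -> unrated content unaffected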

JOHN DEVINE: People should be able to say and post things to each other, but we don't want it to go viral.

ANDY SERWER: Right. And I know this was the same thing with the Nancy Pelosi video, that you leave it up, but you mark it, because you want people to have a conversation about whether it's authentic or not.

MONIKA BICKERT: Well, it's a little more nuanced than that.

ANDY SERWER: OK. OK, explain that to me. Because that nuance is lost on people, I think.

MONIKA BICKERT: You know, so I'll start by saying--

ANDY SERWER: Some people.

MONIKA BICKERT: --that if you asked people around the world what they think a private company's role should be in determining what is true and false, you will get very different answers. And misinformation doesn't always look the same.

Sometimes, it is by fake accounts. In fact, we often see fake accounts will try to engage in bad behavior, whether it's spam or sharing misinformation. And so we've gotten really aggressive at tackling that through Guy's team.

Sometimes, it is through-- this is about financial incentives. Most sharing of misinformation is about financial incentives. So sometimes this is about finding ways that we can disrupt those financial incentives.

And then sometimes, you get into these little grey areas, where people-- maybe it's being shared by somebody with a real account and a real name. Maybe they don't even know that it's false. And trying to figure out the right response there is something where we are talking to a lot of people and trying to get it right.

So first, what we are doing is working with third party fact-checking organizations. And that's because we don't want to be in the position of determining what is true and what is false for the world. We don't think we can do it effectively, and we hear from people that they don't necessarily want a private company making that decision. The fact-checking organizations that we use are those certified by Poynter as meeting certain standards. And when they rate something as false, that's when we put related articles by the fact-checkers giving people the true story next to it.

ANDY SERWER: OK, can something be too false?

MONIKA BICKERT: What do you mean?

ANDY SERWER: I mean, there's a spectrum, right? In other words, over here, there is information that will cause violence against people. You take that down, right?

MONIKA BICKERT: If we have misinformation where a safety partner is able to confirm that it can contribute to imminent or ongoing violence on the ground, then we will remove it.

ANDY SERWER: So there are these grey areas where you have to adjudicate. I mean, that's what your people do. That's what your policies do, correct?

MONIKA BICKERT: Even there, we recognize that we are not in the best position. I mean, the sad reality is, in the places in the world where you are most likely to have on-the-ground violence, those are often the same places where it's hard to have a fact-checking partner or even a safety organization tell us what the real situation is on the ground. We are often not in a great position to make that determination. So we're aggressively reaching out to these fact-checking organizations and safety partners to get that confirmation so that, based on that, we can remove the content from the site.

ANDY SERWER: I mean, you could have violence in the United States-- Ferguson, Baltimore-- with false information here as well, though. It's possible, right?

MONIKA BICKERT: You absolutely could. And that's why we partner with fact-checking organizations here as well.

ANDY SERWER: So with all of these things, you guys-- say, anti-vaxxers, Holocaust deniers, when is it acceptable to use racial slurs, Pizzagate-- all these things that people can spend, you know, months on one issue, you have to address all of them. That's just the United States. Those are just some of the things I talk about in the US, which is only 13% of your content.

So I mean, isn't this ultimately like boiling the ocean-- do you feel like that-- or Sisyphean, even, that it's just such a difficult task? I want to ask you, John, because you're overseeing this global operation. Does it feel Sisyphean?

JOHN DEVINE: Well, it's a task that we think is important. Ultimately, as Monika and Guy said, people come to Facebook because they want authenticity and they want those places safe. I would look at the last two years and point to the progress that has been made by this company in writing policies that are effective, building products and tools that are effective, and employing human beings to make good decisions at scale-- that's what we're trying to do.

It's never going to be perfect. But at scale, it's a remarkable achievement. And so it's something I think all three of us are very proud to be a part of.

So it's something-- you know, is it impossible to do? It's going to be impossible to probably get it perfect. But we know, everyday, we're making progress against problems, against adversarial behavior. And so we feel really good about it.

ANDY SERWER: John, you're a real metrics guy. Excuse me one second. So do you have KPIs? Do you have, OK, we had this number of incidents, and now-- But your pie is growing. But you're solving more problems even as the pie is growing.

JOHN DEVINE: Well, I think the most important metrics-- and one of things we've realized as a company is, given that this is a relatively difficult, if not impossible, task, the more transparent we can be about our policies and about our enforcement, the better. And so twice a year, we put together this community standards enforcement report. Guy was referring to some numbers from that.

I think that's a remarkable achievement. First of all, by the way, just measuring things that happen at such a small scale is very, very difficult. We have a lot of human beings just trying to measure something that's happening in, you know, maybe one in a million incidents.

But what we've committed to doing is getting more and more transparent about those metrics, about the prevalence of content on the platform, about the rate at which we're finding these things proactively. And I think that's all part of us trying to engender the trust that this platform and its users deserve.

GUY ROSEN: One of the things I think we've learned is, with any product you build at the scale that Facebook and Instagram and our services operate at, you have to have these objective measures that let you know if you're making progress and let you know where you have the biggest gaps. And especially in the past couple of years, as we've tried to really invest in this space, building out those KPIs, which there's no playbook for in the world-- people know that, if you build a new app, you measure downloads or users.

ANDY SERWER: Key performance indicators, we should say. I started the jargon. I apologize. It was my fault. Key performance indicators.

GUY ROSEN: Key performance indicators. People know, if you're building a new app, for example, then you want to measure downloads or active users. But if you're working on content and keeping people safe, what measures do you use?

And so we have spent a lot of time trying to understand how we measure these issues, particularly so that we can be proactive and focus on the things that are the most important to over 2 billion people around the world, and not just the ones that maybe are in the headlines, not just the ones that people are, you know, getting called about. Because that's what actually is going to have the biggest impact to the most people.

And so we've developed those metrics. We use them internally. We look at them everyday and every week. And literally, the same numbers in my internal dashboard are the ones that we push out twice a year in our enforcement report.

ANDY SERWER: We talked, Guy, earlier this year about this being like crime, that it cannot be eradicated completely. So what is a reasonable crime rate?

GUY ROSEN: I think society is grappling with that, and has been grappling with that for centuries.

ANDY SERWER: I mean, in terms of the Facebook platform. I'm not asking about crime. It was a metaphorical question.

GUY ROSEN: Look, we publish the prevalence rates, which is, how often is someone likely to encounter a piece of violating content on our site? That's the number we need to keep pushing down. We, as is the rest of the industry, are early in this work. And I think we're going to keep pushing that down and seeing how far down we can get. Ultimately, we don't want people to see any of this violating content on our site at all.
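
As a toy illustration of a views-based prevalence estimate like the one Guy refers to, here is a short Python sketch; the sampling, labels, and rates are simulated and purely illustrative.

import random

def estimate_prevalence(is_violating_view, sample_size=10000):
    """Sample content views and report the share labeled as violating.
    is_violating_view: function mapping a sampled view ID to True/False."""
    sampled_views = [random.randrange(1_000_000) for _ in range(sample_size)]
    violating = sum(1 for view_id in sampled_views if is_violating_view(view_id))
    return violating / sample_size

# Simulate a platform where roughly 0.03% of content views are of violating content.
prevalence = estimate_prevalence(lambda view_id: random.random() < 0.0003)
print(f"estimated prevalence: {prevalence:.4%}")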

MONIKA BICKERT: It's also something where we have to be-- we do want the metrics, but we also have to make sure that we are just keeping focused on the overall goal. You look at something like terror propaganda. Now, the prevalence-- how common it actually is on the site--

ANDY SERWER: What is that, actually?

MONIKA BICKERT: Terrorist propaganda.

ANDY SERWER: Oh, terrorist propaganda. I'm sorry.

MONIKA BICKERT: Yeah, somebody sharing propaganda from a terror group. You look at the metrics that Guy's team produces about how much of this content is on the site. It is very, very low, well under half of 1%.

At the same time, if we just index on that, then that doesn't train us to go looking for this content as hard as we can. You know, we want to make sure that we are working with all of the people across these teams to make sure that we are thinking about these problems holistically. It's not just about proactively identifying. It's about making it easy for people to report. It's about, you know, reviewing it accurately. It's definitely a partnership.

GUY ROSEN: And just to correct the number, it's under 0.03%.

MONIKA BICKERT: He's good with the percentages.

ANDY SERWER: OK, yeah, good.

JOHN DEVINE: I would also-- you asked a question like, how good is good enough, or is there a clearer definition of truth? You get down to the finest percent, and it's so hard. You know, I could share pieces of content that we review, and the four of us could look at it, and two of us would say it's violating, and two of us would say it wasn't.

And the same is true of a room of 100 people. You know, you get to a point where there is some subjectivity. But the good thing is, I think we're homing in more and more on that very fine percentage.

ANDY SERWER: But you just-- in a way, you guys are saying, well, it shouldn't be up to a private company to adjudicate this stuff. But in a way, aren't you adjudicating this stuff?

MONIKA BICKERT: No, I was saying that about misinformation. So there is a big difference to us between somebody who is, you know, sharing terror propaganda or sharing an image of child sexual exploitation. Those things we have clear rules about. And we say, do not share this content, and we will remove it.

When it comes to misinformation, it is very difficult to determine often what is true and what is not. And sometimes, the person who is sharing the content, even if it is completely false, they may not know that. And so taking the step of removing content and saying, you know, you've misbehaved on Facebook, you can't share content for a day, or something like that, is not the right response.

ANDY SERWER: All right, two examples that I want to get into-- Facebook Live and the woman who shot a man on Facebook Live. In late August, she pled guilty to shooting someone on Facebook Live. So how do you guys police that, essentially-- Facebook Live?

GUY ROSEN: When we build any product across the company, we try to think about, what are the abusive use cases, what are the bad things that could happen, and what protections we can put in place. And Facebook Live is absolutely part of that. There are limitations to what artificial intelligence is able to do in real time.

ANDY SERWER: Real time?

GUY ROSEN: And it's something that we're constantly working to improve. Now, with Facebook Live, some of the key things we've done in the last year are improve how easy it is to report things to us and make sure that we are accelerating. When something is reported and it is broadcasting live, that goes to the top of the queue, so that John's team can look at it fast.

Because we know, if something is a live broadcast, there could be some real-time harm happening. There's an opportunity for us to actually help someone who might be in need.

JOHN DEVINE: Which we do a lot.

MONIKA BICKERT: Yes.
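
To make the queueing idea concrete, here is a minimal Python sketch of a review queue in which reports on live broadcasts are pulled first; the priority values and field names are illustrative, not Facebook's actual system.

import heapq
import itertools

LIVE_PRIORITY, DEFAULT_PRIORITY = 0, 10   # lower number = reviewed sooner
_counter = itertools.count()              # tie-breaker keeps FIFO order within a priority
review_queue = []

def enqueue_report(report_id, is_live):
    priority = LIVE_PRIORITY if is_live else DEFAULT_PRIORITY
    heapq.heappush(review_queue, (priority, next(_counter), report_id))

def next_report():
    return heapq.heappop(review_queue)[2]

enqueue_report("post-123", is_live=False)
enqueue_report("live-456", is_live=True)
print(next_report())  # prints "live-456": the live report jumps the queue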

ANDY SERWER: OK, I'm going to ask you kind of a silly one. But this is real. And this happened to a friend of mine.

He put a picture up on Facebook. He was with his wife. He put a picture up on Facebook of him kissing a lobster.

He said, can you believe I'm kissing this lobster on date night? Men are pigs. It was flagged for hate speech and removed. He appealed and didn't hear back.

And you know, this gets to the volume. Because of course, if this rose to your attention, you would say, oh, that's silly, I guess. Would you say, that's silly, leave it up, first of all?

MONIKA BICKERT: Actually, so first of all, to address the appeals, he should hear back on the appeal. And that's something that we have built out in the past year and a half. And anytime somebody disagrees with one of our decisions, they can ask us to revisit it. And John's team does revisit it, and we get back to them.

In terms of hate speech, look, one of the challenges here is, it's really hard for us to know the context of why a specific person says a specific thing. So we write rules that are very objective. And one of those rules is, if you are going to talk about people by what we call a protected characteristic, like race, religion, or gender, if you're going to say all these people-- so all people of this religion are scum, or, you know, these people don't belong on our planet, or whatever the situation is, that's something that we would classify as hate speech and remove it. Now, there may be times that people are using it--

ANDY SERWER: He was joking.

MONIKA BICKERT: There may be times that people are using it as jokes. As John can tell you, determining--

JOHN DEVINE: Satire is very difficult.

MONIKA BICKERT: Yeah, determining what is humor--

JOHN DEVINE: Satire is very difficult, yeah.

MONIKA BICKERT: And another thing when it comes to-- it's not just hate speech. It's also bullying and harassment. Sometimes, people will use coded language. Sometimes, I may say something to you, and it may sound like a joke to me, but you may feel bullied by it. So trying to get those lines right is one of the challenges that we will always have.

JOHN DEVINE: Sometimes, it's self-referential. Sometimes, people are reclaiming those terms that others would find hateful and, you know, embracing them with pride. That's the distinction that requires human judgment and context. And you can't get it right every time. But that's exactly what our pursuit is.

ANDY SERWER: I mean, there are innocent people, like my friend here. But then there are also people who are working proactively to try to circumvent the rules and your policies-- the bad guys, right? I mean, how do you-- and that's sort of like-- in a way, it's akin to law enforcement. Because the bad guys are working it, and you guys are there. And by definition, a little bit, you're reactive, right?

MONIKA BICKERT: It's one of the reasons-- well, in some ways. It's one of the reasons the feedback loop between these teams is so important. Just a quick example-- regulated goods. We don't allow-- you know, if you wanted to sell a gun, you couldn't come to Facebook and say, I'm going to sell this gun.

Well, one of the things our review teams flagged was, we are sometimes seeing people say, here's a can of Coke, and it costs $500. And then there is a picture of a gun in the background. That's the sort of thing that John's team or Guy's team will recognize and flag to us so that we can evolve the policy.

ANDY SERWER: Right. What are some of the most important new content moderation policies that you've adopted? And this, by the way, pertains to all the different platforms at Facebook, right, like Instagram and WhatsApp, right?

MONIKA BICKERT: A lot of what we've been trying to do is make sure that we are tackling misinformation in new ways, and also that we are getting it right with hate speech. Hate speech, like I said earlier, it's really contextual. And people around the world see it very differently.

So those are two areas where not only is the feedback here important, but we also talk to groups outside of Facebook all the time. You know, the content standards forum that we talked about-- every two weeks, we get together-- part of that process requires us to talk to groups from around the world to understand exactly what they are seeing. How is this sort of hate speech looking on the ground in Kenya versus in South Korea, so that we can make the right decision and give the right guidance to our teams?

ANDY SERWER: One thing we haven't talked about is how involved Mark Zuckerberg and Sheryl Sandberg are with the work that you do. So I'll ask you, how often do you guys talk to them about your work, and how often do things rise to their level?

MONIKA BICKERT: Very often.

JOHN DEVINE: Once or twice.

ANDY SERWER: OK, all the time? OK, go ahead.

JOHN DEVINE: Probably 50 times a year or so.

GUY ROSEN: No, this is really important. You know, one of the days I really remember is back-- this is now almost two years ago, when we were starting to scale up this operation. We were going through a planning process for what the sizes of different teams would be. This was for 2018.

And the way the process typically works at a company like Facebook is, different teams have different sort of asks of, hey, if we had this number of people, we could do these things. And at Facebook, ultimately, you know, Mark and Sheryl and the team sit down and sort of try to allocate, you know, where are the engineers going to go? And just as the process was kicking off, I get a note from Mark. And he says, hey, before I even start it with anyone else, how much do you guys need?

And that, for me, was really that point when it was clear this is going to be such a big growth area. This is such an important thing for us as a company. And it resulted in all this growth that we're seeing.

ANDY SERWER: When the CEO says, what do you need, that's a good start, I guess, right?

GUY ROSEN: It's a good start.

ANDY SERWER: What about things on an ongoing basis? What things have risen to their level? How do you make that call?

MONIKA BICKERT: On my team, any time that we're dealing with something-- and this is in coordination with John's team. But any time we're dealing with something that is close to the line, or it's something where it's not really clear how the policies apply, or it's something that's particularly important, we will, at the very least, send an email up to Mark and Sheryl so that they know what's going on. Very often, we will end up having a back-and-forth with them about why we're making the decision we're making and make sure they're OK with it.

ANDY SERWER: Did the Nancy Pelosi thing rise to their level, for instance?

MONIKA BICKERT: With anything that is, you know, very big that a lot of people are talking about, we will absolutely loop them in.

ANDY SERWER: Loop them in.

MONIKA BICKERT: Yeah.

ANDY SERWER: OK.

JOHN DEVINE: I would say, at a minimum, on a weekly basis, we're all sitting down, the three of us, as well as a group of other people and Sheryl and Mark, and going over some of our most important topics that week to check to see, are we getting it right? If we're not getting it right, why are we not getting it right? Mark is incredibly involved in some of, I think, the deepest, hardest, especially product, issues that we're looking at right now.

I guess I won't go into details. But the involvement is very deep. And I think that leadership is-- you know, for me, I joined the company a year ago. I mean, you come into the company-- I'm sure, from the outside, people wonder, how does it work? How are the people? The leadership is very involved. And I think, you know, just the people-- the three of us sitting here, it's a team that I feel personally incredibly impressed by, in terms of the values, the intellect, and the intent, you know, to do these very hard jobs and get them right.

ANDY SERWER: Last spring, Guy, you and Monika had a dinner with Mark with some academics to discuss problems on the platform-- or issues, maybe opportunities. What were some of the things that the academics raised, some questions?

MONIKA BICKERT: I don't think we can go into specifically what they shared, because that was at a private dinner for them. But I will say that sort of interaction is really common. I mean, we have-- for my team, in the course of a week, we are sitting down with stakeholders like that many times and getting feedback on the policies.

GUY ROSEN: One of the biggest things that we've learned, I think, over the past couple of years is, it's not enough to just do this work. We need to be really open about it. We need to be willing to sit down with academics, with experts around the world, to hear what they think we should be doing, what they think we're missing, where we can improve, what this might look like in five or 10 years. Because ultimately, this is bigger than just what one company or one team needs to do. This is something that, across society, we're all grappling with. So it's critically important for us to get these external perspectives.

JOHN DEVINE: I would agree on that. I would just say this feels like a civic endeavor here. Monika talked about her background. And you know, for me, I spent 12 years in the military. This feels like a civic endeavor that goes well beyond the walls of this building, and something that, you know, we take very seriously.

ANDY SERWER: Well, I'm glad you said that, John, because it provides a nice segue again into politics-- what a great topic-- and the elections and politicians, which, of course, is also something that you guys are very much connected to. So I'm sorry to bring back the Nancy Pelosi thing. But so you would put warnings on any doctored pictures of any politician or not?

GUY ROSEN: The goal of our misinformation policy more broadly is to find the things that are not what they make themselves out to be, that are false, and ensure that we put the right warnings on them, and we make sure that they don't go viral on Facebook.

ANDY SERWER: Right. OK, Donald Trump has criticized Facebook for having an anti-conservative bias. And I think there was a Facebook-commissioned study that, to an extent, backed up that claim. Some of the third party fact-checking groups are considered left-leaning by conservatives, and the site has removed pro-life ads. Is Donald Trump right?

MONIKA BICKERT: So let me just say that, with the Kyl report, which is what I think you're probably referring to, that was the result of a back-and-forth, and us sort of showing them what we're doing, and us saying, we would love to hear areas where we think we can get better. So when it comes to pro-life ads, that was specifically about a policy that we've had for a long time saying that we would not allow ads that showed people, basically, with medical tubes in their body. And a reason for that is because, when it comes to ads, where we are actively putting content in front of people that they didn't necessarily choose to see, we want to make sure that they're not being unnecessarily upset.

This policy we've had on ads for a long time. We don't allow people to share images in an ad that show somebody with medical tubes in their body. And that's because if somebody's seeing an ad on Facebook, we don't want them to be surprised or unnecessarily upset.

But what we were hearing from pro-life groups was that this was a really important part of them getting out their political message. So we refined that policy, in part because of some of that input. Now we're looking to see, how was the ad more broadly intended, and are there signs of distress? You know, if not, if this is something where they're saying, you know, this baby survived and is living, and it's a wonderful thing we should celebrate, and there happens to be a medical tube in that body, then that's something we'll allow.

ANDY SERWER: OK, what about the anti-conservative bias, Monika? Does Facebook have an anti-conservative bias?

MONIKA BICKERT: We do not. One of the things that you will--

ANDY SERWER: How do I know that? I mean, that's proving a negative, right?

MONIKA BICKERT: One of the things-- well, I'll tell you one way. One thing that you could do-- if you look at our community standards, we took the step a year and a half ago of publishing the details on how we define things. A big reason we did that was because we were hearing from different groups-- not just conservatives-- I mean, groups all over the world with different stances-- saying, I need to know more about how you're making these decisions. Because how do I know that they're not being made unfairly?

So we published the details of these policies. We talked to more than 100 groups before we did it to make sure that the way we were articulating our policies was sufficiently detailed and easy for people to understand. And when John's team members are looking at a piece of content and making a decision, we want people to understand, they're not applying their own subjective beliefs about what they think should be on the site. They have very granular rules that they have to apply. And those rules are now public for people to see.

ANDY SERWER: Why does Ted Cruz think that you guys have an anti-conservative bias, Guy?

GUY ROSEN: I think the point here is, as I said, we need to be open about how we do this work. These are big systems that process a lot of information. And people from the outside don't quite understand.

And it seems it can be concerning. And that's why there's been so much effort into publishing the details of the guidelines that our reviewers use, into publishing the numbers of the takedowns we make, the numbers of the appeals that we get and how we correct them, so that we can help to bring in more people. And we've even had people-- whether it's journalists or academics-- come and join some of our meetings, so that we can get more of that word out and create more transparency into how the systems work and where mistakes are made-- and mistakes will be made-- and making sure that there's just a better understanding of how this works at scale.

JOHN DEVINE: I would also say it's not just conservatives who might have a hypothesis about what happens here and how we make decisions. It's hardly only conservatives. I mean, we work with brand advertisers who worry about brand safety. We've heard supposition that we don't care about terrorism, we don't care about crime. All of these things we do care a lot about. Any one person can find an example of a decision or a piece of content that can validate their hypothesis.

ANDY SERWER: But that's an important point. But that speaks to just the incredible difficulty that your job presents you with. And is it even possible to do it?

GUY ROSEN: I think, broadly, there's two kinds of problems we deal with. There's the ones where I think people by and large agree on what should be done. And then we should be measured on, are we doing the job well enough? Are we getting rid of terrorism, of content that's abusive to children on the site? And then there's a big class of problems where reasonable people will disagree as to the outcome.

ANDY SERWER: How do you-- yeah, but so you want to have-- so your audience is 2 billion people. And how is that possible to do?

GUY ROSEN: That's why we need to be open about it and make sure that we're telling people why something is the way it is, what happened along the way, whether it's an individual who uses our service, giving them an option to appeal, explaining why we took something down, which, two years ago, we didn't. We just said, we took down your content. We didn't even say, was it for hate speech, or for nudity, or for spam? And so people would jump to whatever conclusion they happened to have. Making sure that we're transparent both to people, but then also to all the experts in the space.

ANDY SERWER: Is there one global standard, or are there standards around the globe?

MONIKA BICKERT: There's one set of global standards. And that's because this is a community where we want people to be able to share content across borders. But of course, that creates challenges.

I will say they're not-- so I agree very much with Guy. I think that, if you think about terrorism or child exploitation, everyone sort of agrees. When you think about misinformation and hate speech, which are the two areas I would point to as being particularly challenging, a lot of people have reasonable beliefs that differ from one another. It is very hard for us to draw those lines.

ANDY SERWER: But don't you have different standards in countries like Saudi Arabia or places like Myanmar than you do in the United States?

MONIKA BICKERT: We have one set of global policies with the caveat that if a-- well, two caveats. One is, if a country asks us to restrict something because it is illegal speech there, then there is a separate process. That's outside of our community standards.

That's a process where we will look at their legal request. We have our legal team helping us with that. And we may restrict the content in that jurisdiction only. It would still be visible everywhere else.

The other caveat is that, even though our policies are global, we try to make sure that we have the local context to implement them right. So Myanmar-- hate speech looks different there. The words that people use look different there.

And so we have one hate speech policy. It's global. You can read about it. When we are working with our operations partners to define exactly what they should be looking for, we have Myanmar-specific context for them.
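
As a hedged sketch of the structure Monika describes-- one global policy, with market-specific context layered on top for reviewers-- here is a small Python example; the locales, terms, and field names are placeholders, not real policy data.

# One global rule, plus optional market-specific context for reviewers.
GLOBAL_POLICY = {
    "policy": "hate_speech",
    "rule": "attacks on people based on a protected characteristic are removed",
}

LOCAL_CONTEXT = {
    "mm": {"flagged_terms": ["<local term placeholder>"], "notes": "Myanmar-specific guidance"},
    "ke": {"flagged_terms": ["<local term placeholder>"], "notes": "Kenya-specific guidance"},
}

def reviewer_guidance(locale):
    """Combine the single global rule with whatever local context exists for a market."""
    return {**GLOBAL_POLICY, "local_context": LOCAL_CONTEXT.get(locale, {})}

print(reviewer_guidance("mm")["local_context"]["notes"])  # Myanmar-specific guidance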

ANDY SERWER: Would you ever feel that you were presented with a request by the Saudi government, for instance, that would violate either American laws or American values?

MONIKA BICKERT: For us, it's about three things. One is making sure that the legal process is correct. Two is making sure that the content actually is illegal. So we will actually talk to our own outside counsel and say, this government is saying this is illegal. Is it illegal?

And then the third thing is making sure that this is consistent with international norms and human rights. And if we think that something's not consistent, we will push back. And we do have instances where we've said to governments, we are not willing to block this speech. We publish in our community standards-- or in our government request report, we publish numbers on when we have restricted content in countries based on legal requests.

ANDY SERWER: Does your model or mission sort of fly in the face of sort of the zeitgeist of our time, meaning that you guys are trying to create a global community while, at the same time, we're seeing this rise in nationalism and tribalism? Does it ever feel like you're swimming against the tide that way?

MONIKA BICKERT: I mean, one of the things that we see about Facebook is that it's actually very positive for most people. And so, look, our jobs are really focused on, day to day, looking at abuse and trying to make sure we're removing it. But when I talk to people in the community about how they use Facebook, this is a place where people do build very positive communities. I know it's been that for me personally. And so we don't want to lose sight of the fact that most of the experience of most of the people on the site is that.

ANDY SERWER: Right, OK. Let's talk about the election of 2016 and the election in 2020. Obviously, you guys have spent a lot of time unpacking, if you will, what happened in 2016. What went wrong?

GUY ROSEN: So if you think about how we approach elections broadly, not just in 2016, how we think about the work we do going forward, it's evolved a lot in the past few years. So 2016, not just us, but the security, the intelligence community, everyone was focused on accounts being hacked, content being leaked, things of that nature. And that's what everyone was looking for.

And I think all of us and the world have learned that there are new types of threats. There are governments that are looking to influence public opinion by distributing information. And a lot of what we have done and built in the past few years is trying to address, holistically, the whole range of problems that we see across elections.

And we've implemented this not just for 2020, but for the 2018 midterms, for India and Indonesia and the EU parliament, which had really big elections earlier this year. That's hundreds of millions of people who use Facebook who went to the polls. And we felt we had to make sure that we were putting our best foot forward and that we were building those defenses.

At a high level, all the things that we think about are, first and foremost, fake accounts. Like, Facebook is really about authenticity and making sure that we are building the defenses that can take down fake accounts at scale and find a needle in a haystack-- the sophisticated actors who may be trying to distribute content and aren't who they say they are. It's about bringing transparency to advertising.

If you want to run an ad that talks about a political or social issue, for example, here in the US, we need to make sure that you are who you say you are. We ask for ID. We will verify and we will show who is actually running the ad. In the same way, on TV, you may say--

ANDY SERWER: These are changes that you guys have made subsequent.

GUY ROSEN: These are changes that we have made. Then we have our work on viral misinformation, which we've spoken about. And perhaps most importantly is the collaboration across the industry and the law enforcement and security community.

Because the bad actors on the other side, they're not looking at one platform. They're looking at everything. And we need to make sure that we are all talking and that there is information that we as a private company don't have and don't know. We need to make sure that we are working together to make sure that we're ready for that.

ANDY SERWER: So will misinformation be less of a problem in the 2020 election on Facebook than it was in 2016?

GUY ROSEN: We've already made a lot of progress on misinformation. And I think we've seen in the past year that that work has borne fruit. There are always going to be continued challenges. And it is our responsibility to make sure that we are ahead of them and that we are anticipating, what are the next kind of challenges that bad actors are going to try to spring on us?

ANDY SERWER: I was going to say-- I mean, you have this confirmed organization label. Won't people just try to get around that?

MONIKA BICKERT: People will try to get-- any security measure you put in place, people will try to get around it. I mean, that is a big part of all of our jobs, is understanding what's next.

GUY ROSEN: And it's very difficult. You know, in a presidential US election, it might seem very simple to figure out who's paying for an ad, who's running an ad, and to make that transparent.

It gets more and more difficult as you go deeper into the elections process. And certainly, you go to other countries, and the infrastructure to certify somebody as a kind of valid advertiser or connected to a campaign, it becomes more difficult. But just like we said before, the fact that it's hard isn't preventing us from doing it.

You know, in each of these elections-- there's been, as I mentioned, India, Indonesia this past year, Canada, Australia. Each of these elections has been a process by which we are continuing to refine and improve our toolset as well as our processes.

ANDY SERWER: John, there was a kind of high profile article that said conditions for content moderators were bad. There were problems. People were engaging in bad behavior. Is that true?

JOHN DEVINE: First of all, we take those articles and those concerns very seriously. As I've said, we've dramatically increased the scale of reviewers we have globally. And so any one reviewer who's having a difficult or bad experience as a worker, we take that as something that is essential for us to get right.

Here's what I would say, is, we talk to-- I spend a lot of time with our vendors, with our reviewers. That's not the experience at large at all.

They are very proud of the work that they do. I talked before about this kind of, you know, almost like a first responder type of culture, you know, civil servant type of culture. They're very proud of the work that they do.

They work-- we have made a real point to give them support. That means counseling. Any of our reviewers can talk to a counselor.

We have counselors on-site during every shift. They can talk to counselors by the phone. We give them training.

Resiliency is a very personal thing. We all have different standards for what we need and different exercises that help us feel healthy. We've addressed our incentive systems.

But by and large, the thing that creates stress for these workers is the pressure of getting it right. All the questions you're asking us, that just cascades down to the front line workers. And because we're asking them to get these decisions right at scale, they feel a lot of pressure to hit their quality scores.

We check their work. We check the checkers. And so what have we done this year?

In addition to putting in place all of these support mechanisms, training, et cetera, changing the incentive, you know, we're looking at a balanced scorecard, where we're helping them with resiliency. We're balancing quality and effectiveness along with well-being.

ANDY SERWER: And you're still scaling up. Are you adding more people? How many people would you be adding?

JOHN DEVINE: Well, it's not a public number. But we've grown significantly in the past two years. And we are going to continue to grow as needed to support the proactive approach that Guy described. Moving from reacting to proactively finding these things just requires people to be involved.

ANDY SERWER: Going back to the-- go ahead, Guy. I'm sorry.

GUY ROSEN: Just to add, the well-being of our reviewers is also something that we think about on the engineering side. Because there are lots of things we can do in the tools themselves. So if someone is reviewing a photo, for example, that has some violence in it-- maybe it's a very graphic photo-- there are little things like, maybe we can make it black and white.

If it's a video, you can mute the audio. You could blur a face. You could even blur the whole photo.

Something we all see, even when we sit in some of these meetings and we go over some of these problematic cases, is, you put something up on the screen, everyone sees it, and then you have a discussion for a few minutes. Now, our reviewers go through that all the time. They may see a photo, and then they need to go and consult the policy and make sure they're getting it right.

When you're doing that, you can blur the photo in the meantime. It doesn't need to be staring you in the face. And I think those are really important things that our teams are working on to make sure that John's team has the tools they need.
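[Editor's note: to make the kind of tooling Guy describes concrete, here is a minimal sketch, assuming a reviewer-side tool built on the open-source Pillow imaging library. The function name, file paths, and blur radius are hypothetical illustrations, not Facebook's actual implementation.]

    # Hypothetical sketch of the image softening described above: convert a
    # reported photo to black and white and blur it, so a reviewer can consult
    # policy without the image staring them in the face. Not Facebook's code.
    from PIL import Image, ImageFilter

    def soften_for_review(path: str, blur_radius: int = 12) -> Image.Image:
        """Return a grayscale, blurred copy of the image at `path`."""
        img = Image.open(path)
        gray = img.convert("L")  # black and white
        return gray.filter(ImageFilter.GaussianBlur(blur_radius))  # blur the whole photo

    if __name__ == "__main__":
        # "reported_photo.jpg" is a placeholder path used only for illustration.
        soften_for_review("reported_photo.jpg").save("reported_photo_review.jpg")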

ANDY SERWER: So that's a product feature for the people who work for Facebook. My understanding is, you have a new feature for users, your audience-- a vaccine pop-up. Is this something you guys just announced, in terms of being able to block things that people consider to be misinformation? Am I correct in this?

MONIKA BICKERT: So we didn't just-- we made an announcement months ago. And now we are executing on that.

ANDY SERWER: Executing.

MONIKA BICKERT: We're rolling out some of the features that we announced earlier. But yes, when it comes to specific hoaxes that pertain to vaccines, we have now worked with the Centers for Disease Control and Prevention and the World Health Organization so that, in the specific situations where somebody might encounter one of those hoaxes, we will actually put information from the CDC and the WHO right there for them to see.

ANDY SERWER: Right. What about the notion that your business model, obviously, is based on engagement, and that, as a lot of people have suggested, emotional posts make people engage more and also create anxiety? And then it gets back to the Like button, which you are testing changes to in terms of reducing that. Is that something that's important to you guys?

MONIKA BICKERT: Again, for the health of the company, we have to focus on the long term here. And we want people to be in a place where they actually feel comfortable connecting. And that overwhelmingly just means they've got to be safe.

A big part of that is control. So if you come to Facebook, we want you to know that you can have the experience that you want to have. You can interact with a small group or a big group. You can post and have only your best friend see it. Or you can post publicly.

You can block people. You can unfollow people. Understanding those controls is critical. And we actually-- this is not our job, but we have a team that is focused on making sure that those controls are easy for people to understand, and, every once in a while, saying, you know, knock, knock, and kind of putting that up in front of them again so they can check their settings.

ANDY SERWER: Similar with this Off-Facebook Activity feature?

GUY ROSEN: It's all about control and it's all about transparency. People need to understand what information we have, how the system works, and how to control the things they see in their feed. Those things are what's really important to people.

ANDY SERWER: OK. But not too many features or too much functionality, which makes it difficult to use.

GUY ROSEN: There's a balance here. This is exactly where-- as you build product, and as someone who's sitting and trying to build a bunch of these products, it is about striking the balance between giving that transparency, but layering it so that people can see just the controls and transparency that they want. But I think, overwhelmingly, what we've heard in the past years is, people want more control. People want more transparency. They want a better opportunity to understand.

And ultimately, I think that helps hold us accountable. And it helps people keep us in check. And I think that's really important.

ANDY SERWER: Maybe a little out of your purview, but I want to ask about integrating Instagram more into the Facebook ecosystem. Is that something that's happening? And why are you guys doing that?

MONIKA BICKERT: The policies that we have and that we're working to enforce do generally apply across Instagram as well, with a couple of exceptions. For instance, on Facebook, we interpret authenticity differently. You have to use your real name.

On Instagram, you still have to be authentic, meaning you can't try to pretend to be somebody else. But we don't require that you use your real name. But generally, the two products have the same policies.

There are some differences in the way that people use Instagram and Facebook. And so, you know, any time that we're coming up with a policy, we have to be really mindful of how that is going to look for Guy's team as they are trying to identify this sort of content, or how it's going to look for John's team as they are trying to understand, you know, whether this is impersonation, or what the context is for making their decisions.

ANDY SERWER: Is Europe particularly difficult for you guys to do business in, given that regulators seem to be much more keen on regulating your company and Google and others?

MONIKA BICKERT: No. I would say that, in terms of the challenges we face in crafting the policies, there is not one area of the world that stands out to me. Actually, one of the hardest things for us is when there are changing circumstances.

You have an area that's in conflict. You have an area where people are using language in a new way. You have an area where there are many different languages.

And sometimes, that makes a place like the southern Philippines or India challenging. You just have so many languages. Those are the areas where it tends to be tougher for the three of us to make sure we are doing the right thing.

ANDY SERWER: Do you guys feel like you're under siege, attacked by Washington, the media?

MONIKA BICKERT: Our job is-- I mean, we want to be in these jobs. Our job is to make sure that we are keeping the community safe. And look, as you've heard from us, we know this is never going to be done. It's never going to be perfect. But we are passionate about this. And it is-- I mean, it's why I'm here. I wouldn't want to be--

ANDY SERWER: Is it fun?

MONIKA BICKERT: I would not--

ANDY SERWER: Is your job fun?

MONIKA BICKERT: The job is fun. The job is interesting. The job is also very difficult. And it's new every day. But I think that if we didn't have this passion for it, then we wouldn't be here.

GUY ROSEN: I think people care about the work we do. And that's actually-- that's inspiring. This stuff really matters.

And even as I think about the people we've hired as we've grown in these spaces over the past couple of years, all the attention and all the headlines and all the criticism has actually helped us get some of the best people. We're talking about, you know, engineers or data scientists who can work on some of the toughest problems that society has grappled with around harmful content, around misinformation, around election security. And they're drawn to come to work and try to address some of these really thorny challenges, because it's just really important.

JOHN DEVINE: I totally agree with this. I think everybody that's here-- certainly for me, the reason we're excited about this job is because it matters. I mean, there are a few billion people who use this platform. They use the platform because it's an important tool in their daily lives. And they get real, real benefit from it.

And so the work that we're doing here really matters. And the idea of being under siege-- you know, that's because people care. And that, to us, is a source of gratification.

You know, again, I joined a year ago. And it's just been heartening to see-- initially, people would ask, why go there? But we've managed to attract, I think, some great talent across the board. Because this is a task that matters. And it attracts, I think, people who want to do good.

ANDY SERWER: What is your job like? In other words, if I work in an aluminum factory, it's like working in a steel factory, only we use a lot more electricity. Now, I know your platform is singular; it's never been done before. But you've all had different jobs, at least some of you.

What's it like? What is this job like? It's like doing what?

MONIKA BICKERT: Well, there are certainly parallels to my former job. At a high level--

ANDY SERWER: As a?

MONIKA BICKERT: As a criminal prosecutor. At a high level, I would say, a lot of people on my team went from jobs in safety-- often in government, law enforcement, or nonprofits-- to jobs in safety here. So for a lot of them, this is sort of more of the same.

I think one of the differences is the enormous scale. When I was a criminal prosecutor, any given case was a handful of people that I was worried about. Now, every day, I am thinking about what issues may emerge across this community of 2 billion people around the world, and how do we make sure we're doing the right thing? So it is a very broad scale. But at a high level, it's the same.

JOHN DEVINE: I feel a lot of analogies, as we've mentioned, to government and service. For me, personally, the dynamic problem-solving and the technology make it an incredible place to be. The evolution of machine learning and AI going hand-in-hand with human beings is a fascinating threshold to live and work at every day. But the most important thing is, it's for a good purpose.

ANDY SERWER: I mean, do you think the government would like to hear that you're kind of like the government? And Guy, you're working with the government maybe sometimes, in terms of election security and having those conversations. What does the government say about Facebook?

GUY ROSEN: I think we're just trying to work with all the right players across the ecosystem, whether it's the industry or the government, to make sure that we are sharing this information. The way that you operate a system at this scale, the way that you think about how you operate on content and how you keep people safe, is, I think, something that all of the companies that operate in this space need to learn from each other.

I think that we sit down with governments and we share, this is how our systems work. This is how content is reported, how it's reviewed, how our systems work. I've personally spent time with governments helping to walk through some of those systems so that people can understand it.

Because governments have also talked about regulation. We've talked about regulation. We think there does need to be a regulatory standard for how to deal with content, right?

And the world is trying to figure out what that looks like. How do you set benchmarks? How do you measure what is effective? And for us, making sure that we're out there, and we're opening up, and we're sharing everything we've learned and what we think sets up the right kind of measures and incentives is exceptionally important.

ANDY SERWER: We're running out of time. But you know, there's GDPR, and then there's the state law here in California, and other states are proposing those kinds of regulations. Are you looking into that?

MONIKA BICKERT: Absolutely. I mean, we think that-- like our founder has said, we think that regulation is an important component and one that we want to be a part of. I will say, in terms of, you know, how we work with governments and are we like governments, there's a real big difference between us and a government. And that is, you don't have to be on Facebook.

You know, if you're a citizen of a country, you may not have a choice about that. You always have a choice about whether you want to use Facebook. That is something that is in our minds all the time. We want to make sure this is a place where people want to be. That is--

ANDY SERWER: A lot of people say that, you know, they sort of don't have a choice, because in places like Myanmar, that Facebook sort of was the internet.

MONIKA BICKERT: Around the world, we do see that Facebook is more important in certain places. And that's something we take really seriously. I will say-- like, you look around the US. The average American has more than seven social media or communications apps on his or her phone. So this is a place where we know that people do have a lot of choice. We want our community to be one where everybody feels welcome and where people want to come and share themselves.

ANDY SERWER: So I want to ask you about your external oversight board. This is something that Mark announced-- I guess it was last year-- in terms of an independent board overseeing content policies, I guess in specific proportions.

MONIKA BICKERT: In specific decisions, yes.

ANDY SERWER: OK. Can you talk a little bit about that and about where that stands right now, Monika?

MONIKA BICKERT: Sure. One thing that we heard from people over the years was that they wanted more insight into why we were making decisions. And so the first step that we took-- and this was about a year and a half ago-- is, we launched appeals so that, if people disagreed with a specific decision, they could ask us to revisit it.

We also built out more detailed messaging about why we took the actions we took. And we started publishing reports and blog posts and other things to give people information. But we also want to make sure that it's not just us, that people know that there is some sort of oversight.

And so we are creating a council, an oversight board to which people can appeal. If they go through Facebook appeal and they don't like our decision, they can appeal and ask this board to take a look and make a decision about whether or not that piece of content should be on Facebook. And that decision will be binding.

One thing we've been hearing-- I mean, this has been a very consultative process that Guy can speak to. But one thing we're hearing is that it's really important how we do this, how transparent it is, and how it works. I don't know if you want to elaborate.

GUY ROSEN: Yeah. As we build out this mechanism, there's a lot of questions we had. Who should be on this board? How does it select cases?

ANDY SERWER: Are they paid by Facebook? How about that? Are they paid by Facebook?

GUY ROSEN: Who pays them and so forth? Those are all those questions we had going into this. And we resolved to take a very consultative approach.

And we've spent, really, all of this year on it. We've organized six big workshops and over 20 roundtables in different places around the world, with hundreds of folks who are experts in freedom of expression, the internet, governance, and digital rights. And we published a draft charter in January, which said, this is a potential outline, but these are really open questions.

And we spent time understanding, what are the different nuances? What are different ideas that people have? And one of the questions that came up was, how do we make sure this board is independent? How do you make sure that it can actually exercise independent judgment?

How do we make sure it isn't a rubber stamp-- that the decisions are binding on us, and that Facebook can't just go and remove a member because we might not agree with a decision? So for example, as a result of that consultation process, one of the things we've realized is that we need to create some sort of independent entity that we pre-fund. Because we do need to make sure the board has the facility to operate and to compensate people for their time. But we don't want to be involved in any of those decisions on an ongoing basis. Because we need to make sure that they can be truly independent and that they can overrule us.

ANDY SERWER: So it's a pre-funded pool of money to pay for them?

GUY ROSEN: That is one of the ideas that we have been working through as we've gone through this consultation process.

MONIKA BICKERT: One of the other things we've been hearing is that it's really important to people that the decisions are transparent. And that's important for people out in the community to see. It's also really important for us.

Because you know, I've spoken about how we talk to groups around the world to try to make sure our policy lines are in the right place. This is one more vehicle for getting some external feedback on-- you know, you can write a policy line that sounds great. But then sometimes, when it's applied to specific pieces of content, it seems like it's completely wrong. And this is a way that we will hopefully get some of that feedback that can help us build better policies.

ANDY SERWER: So you do have a charter that you're releasing now?

MONIKA BICKERT: We will be releasing a charter that will explain basically how the board is going to work. We are still in the process of figuring out some of the specifics and exactly who will be on it. But we are working toward having people in place on the board.

ANDY SERWER: OK. Any idea how big the board will be?

MONIKA BICKERT: We're still working through some of those details.

ANDY SERWER: OK, great. All right, Monika, John, and Guy, thank you so much for your time, all of you.

GUY ROSEN: Thank you.

MONIKA BICKERT: Thank you.

JOHN DEVINE: Thank you.