Mark Zuckerberg: Facebook Will Hire 3,000 Staffers to Review Violent Content, Hate Speech

Facebook CEO Mark Zuckerberg, after recent incidents in which people have broadcast murders and other graphic content on the social service, said the company will increase the size of its staff that reviews flagged content by 66% in the next year.

In a Facebook post Wednesday, Zuckerberg said the company will add 3,000 people to its community operations team around the world — in addition to the 4,500 it employs today — to review “the millions of reports we get every week, and improve the process for doing it quickly.”

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later,” Zuckerberg wrote. “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

In one of the most disturbing events involving Facebook video recently, a Cleveland man on Easter Sunday (April 16) uploaded a clip showing himself shooting and killing a random 74-year-old man on the street; two days later police found the suspect, Steve Stephens, dead in his car after a multi-state manhunt. And last week, a man in Thailand used Facebook Live to broadcast himself killing his 11-month-old daughter before committing suicide.

After the Cleveland homicide video, Facebook said it was “reviewing our reporting flows” to be sure people can report videos and other material that violates its standards as easily and quickly as possible.

Clearly, the company concluded that process improvements and better tools alone wouldn’t be enough, prompting Zuckerberg’s announcement of the additional hires to address the problem.

Zuckerberg said the additional reviewers will help Facebook not only respond to violent acts like murders and suicides but also improve its ability to remove other content the service forbids, including hate speech and child exploitation.

At the same time, Facebook is continuing to build tools that make it simpler to report problems and that shorten the time it takes reviewing teams to determine which posts violate Facebook’s standards, Zuckerberg said. The company is also working to make it easier for its staff to contact law enforcement if someone needs help.

According to Zuckerberg, Facebook last week received a report about a user who was broadcasting video on the service while considering suicide, and the company was able to contact police in time to prevent him from harming himself. “In other cases, we weren’t so fortunate,” Zuckerberg wrote.
