Facebook loves video. Facebook also loves money. And video is bringing in money.
But Facebook is finally starting to realize that not every video is worth hosting, no matter how many views it gets.
On Monday, a day after a video showing a murder in Cleveland went viral on Facebook, the company issued an apology for its failure to respond more quickly to reports of the crime.
"We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better," Juston Osofsky, Facebook's VP of Global Operations, wrote in a blog post.
Facebook's recap of the incident:
Various initial news stories had misreported which videos were posted when, and which were livestreamed on Facebook. In response, Facebook issued a timeline clarifying what happened and when.
To be sure, Facebook isn't to blame for the killing itself, or for the fact that people saw it on Facebook. The site lets its users, all nearly 2 billion of them, report a post for violating Facebook's Community Standards, which condemn violence.
But no one reported it quickly, according to Facebook's blog post: "We did not receive a report about the first video, and we only received a report about the second video—containing the shooting—more than an hour and 45 minutes after it was posted. We received reports about the third video, containing the man’s live confession, only after it had ended," the blog post reads.
Facebook does have a team of human moderators who actively monitor live videos once they reach a certain viewership threshold, the company told Mashable earlier this year. In this case, though, that threshold apparently wasn't reached in time.
Still, Facebook said it will do better. "As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible," the blog post reads.
That effort to do better also includes bringing more artificial intelligence into the monitoring of Facebook videos, since the site receives so much content every minute that human moderators cannot keep up, or so the company says. Whether those moderators' A.I. counterparts will fare any better remains to be seen.