YouTube Says Updated Hate Speech Policy Resulted in Removal of Five Times as Many Videos in Q2


YouTube said its hate-speech crackdown resulted in a spike in videos and channels purged during the second quarter of 2019 — although critics say hateful content continues to be a problem on the video platform.

In June, the Google-owned video giant announced an update to its hate-speech policy that banned videos “alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” It also explicitly prohibited videos espousing conspiracy theories denying that certain violent events took place, like the Holocaust.


In a blog post Tuesday, YouTube said the policy change led to a fivefold increase in the number of videos (111,185) and channels (17,818) removed for hate-speech violations in Q2 compared with typical previous quarters. It also said total comments removed in the quarter nearly doubled, to about 538 million, in part as a result of the updated hate-speech policy. YouTube said the surge in removals was partly due to taking down older comments, videos and channels that had previously been permitted.

Even with the change, however, a “significant number” of YouTube channels continue to “disseminate anti-Semitic and white supremacist content,” according to an analysis by the Anti-Defamation League’s Center on Extremism released last month.

In its latest blog post, YouTube reiterated that when it updates content policies, it strives to balance free expression with “protecting and promoting a vibrant community.” The platform’s dedicated policy development team “systematically reviews all of our policies to ensure that they are current, keep our community safe, and do not stifle YouTube’s openness,” YouTube said.

YouTube claims hate speech represents a very small portion of videos published on the platform. The company said it pulled down almost 30,000 videos for hate speech over the past month; altogether, those videos generated “just 3% of the views that knitting videos did over the same time period,” according to YouTube.

To be sure, spam remains the biggest category — by far — of content violations on YouTube. In Q2, YouTube deleted about 6 million videos for violations of its spam policy. It also removed 3.7 million channels for spam, up more than 50% from prior quarters, with YouTube citing improvements in its automated spam-detection systems.

Over time, as Google’s team of 10,000-plus content reviewers views and removes more content under the new policies, YouTube’s machine detection will also improve. That said, “Machines also can help to flag hate speech and other violative content, but these categories are highly dependent on context and highlight the importance of human review to make nuanced decisions,” YouTube explained in the blog post. According to YouTube, more than 80% of videos automatically detected as violating its policies in Q2 2019 were removed before they received a single view.

YouTube says that since the beginning of 2018, it has made 48 updates to its enforcement guidelines and policies. And it is continuing to review those rules: For example, the video service said it is working to update its policy on harassment by YouTubers against other creators, a change that follows an outcry over a right-wing vlogger’s chronic bullying of a gay Hispanic video journalist.
