Google Refuses to Remove Anti-Muslim YouTube Video

Samantha Murphy
September 15, 2012
White House officials on Friday asked YouTube to review an anti-Muslim video that has been cited as fueling violent protests worldwide and as the alleged catalyst behind recent attacks on the U.S. embassy in Libya -- but according to The New York Times, the Google-owned site has no intention of taking it down.

Google told the publication that the "Innocence of Muslims" video does not violate YouTube's terms of service regarding hate speech because it is focused on the Muslim religion and not the people who practice it. Although Google temporarily blocked the clip in Egypt and Libya because of the violence in those countries, it remains accessible in most of the world.

The 14-minute video -- a trailer for an upcoming "Innocence of Muslims" film created by an Israeli-American director -- upset the Muslim community by insulting the religion's Prophet Muhammad. It has since been credited with provoking the attack on the U.S. consulate in Libya, which resulted in the deaths of four U.S. citizens, including a U.S. ambassador, and with igniting other protests across the globe.

Google's decision not to comply with the White House's request is in line with a company policy from 2007, which said Google would consider laws, local policies and culture when deciding whether to remove or restrict a video.

“One type of content, while legal everywhere, may be almost universally unacceptable in one region, yet viewed as perfectly fine in another,” Rachel Whetstone, senior VP for communications and public policy at Google, said in the policy. "We are passionate about our users, so we try to take into account local cultures and needs.”

YouTube’s Community Guidelines “encourage free speech” and “defend everyone’s right to express unpopular points of view.” However, the site does not allow “hate speech,” which the company defines as “speech which attacks or demeans a group based on race or ethnic origin, religion, disability, gender, age, veteran status, and sexual orientation/gender identity.”

The Obama administration initially “reached out to YouTube to call the video to their attention and ask them to review whether it violates their terms of use,” Tommy Vietor, spokesman for the National Security Council, told the Washington Post.

This story originally published on Mashable here.