Apple Suspends Parler From App Store, Following Google’s Move: “There Is No Place On Our Platform For Threats Of Violence And Illegal Activity” – Update

UPDATED, 5:20 PM: Apple has removed Parler from its App Store, less than a day after it gave the far-right alternative to Twitter 24 hours to “remove all objectionable content.”

“We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity. Parler has not taken adequate measures to address the proliferation of these threats to people’s safety. We have suspended Parler from the App Store until they resolve these issues,” Apple said in a statement to Deadline.


The Parler suspension also comes after Google banned the app from its Google Play Store.

See Apple App Store’s letter to Parler below.

“To the developers of the Parler app,

Thank you for your response regarding dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.

Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines.

In your response, you referenced that Parler has been taking this content “very seriously for weeks.” However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 – Safety – Objectionable Content.

Your response also references a moderation plan “for the time being,” which does not meet the ongoing requirements in Guideline 1.2 – Safety – User Generated content. While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary “task force” is not a sufficient response given the widespread proliferation of harmful content.

For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.

Regards,
App Review Board”

PREVIOUSLY, January 8: Google said today that it has banned the Parler app from its Google Play Store “in order to protect user safety.” Separately today, BuzzFeed obtained an email from Apple to Parler brass that gave them 24 hours to “remove all objectionable content” and submit a “requested moderation improvement plan” or face being axed from the App Store.

Founded in 2018, Parler is described as a far-right alternative to Twitter whose users include people who have been banned by the social networking giant. Numerous posts on the app, which bills itself as “the world’s town square,” advocated for violence and promoted participation in Wednesday’s attack on the U.S. Capitol.

A spokesperson for Alphabet-owned Google issued this statement:

“In order to protect user safety on Google Play, our longstanding policies require that apps displaying user-generated content have moderation policies and enforcement that removes egregious content like posts that incite violence. All developers agree to these terms and we have reminded Parler of this clear policy in recent months. We’re aware of continued posting in the Parler app that seeks to incite ongoing violence in the U.S. We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content. In light of this ongoing and urgent public safety threat, we are suspending the app’s listings from the Play Store until it addresses these issues.”

Apple told Parler executives in the letter that the tech goliath has “received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities.”

Apple added that Parler is required to submit a “moderation improvement plan” within 24 hours of the date of the message, which was Friday morning.

On Thursday, Parler CEO John Matze blasted Twitter for suspending President Donald Trump’s account. In a post on his site, Matze wrote: “The cowardly authoritarians at Twitter booted president Trump from Twitter for 12 hours. What a bunch of cowards.”
