Sex helped build Snapchat, and then sullied its corporate image

Snapchat has been trying to distance itself from sex for years. Now, the need to clean up its platform has become a huge business imperative: 96% of the company’s revenue in 2016 came from advertising, according to its recent public filing. But those advertisers are beginning to realize that their carefully created, million-dollar material risks being seen immediately before or after pornography and risqué selfies.

GroupM, the media investing arm of WPP, a huge advertising firm, recently warned its clients that ads running on Snapchat’s “Stories” feature might be seen by users right before or after “explicit adult content,” the New York Times reported (paywall). That’s because what users see on Stories is a succession of content from all the accounts they follow—so if someone follows an account that decides to post sexual content, there’s little the advertiser, the user, or indeed Snapchat can do to stop it (though users do have to click each story to make it play).

Snapchat has been trying to scrub its platform—and its reputation—clean of sexual content and association for a long time. Anyone aged 13 or above can create an account, and users skew young, with the platform particularly popular among 18- to 25-year-olds. The app’s original selling point was that images disappeared a few seconds after being sent, and the freedom of sending ephemeral content meant it quickly became associated with “sexting.”

As the app developed video features and the ability to stitch clips together into longer Stories that could be watched more than once, a more organized business of pornographic performance began to establish itself. Early accounts that featured nude videos and other adult content gathered hundreds of thousands of followers, arguably helping to establish the brand.

The company doesn’t want to be associated with this sexual content, a risk it alludes to in its 2016 public filing. Its community guidelines and terms of service say that porn isn’t allowed. But since there are few barriers to setting up an account, the company has always had to be reactive, removing users who contravene its rules rather than preventing such accounts from being created in the first place. In 2014 it started a serious deletion program. But, clearly, people are still seeing explicit content on the platform.

Snapchat said it has tools in place to keep users and advertisers safe from seeing or being associated with content they don’t explicitly choose. These include teams dedicated to safety and abuse, which review reports and seek out violations using both humans and technology. A spokesman for the company also said that the GroupM memo raised a specific concern about one user who, while she did post explicit content elsewhere on the internet, was not publishing sexual material on Snapchat.

The company also plans to roll out an in-app abuse-reporting tool soon, which is currently in beta testing in Australia.

The issue is one of control, for all parties concerned. The companies behind the tech tools we take for granted can’t always predict how they’ll be used—as evidenced by Google’s troubles with advertisers upset that their ads appeared next to hateful content; by Twitter’s battles with racist and misogynist trolling and its use by would-be terrorists; and by Facebook’s recent experience of its Live feature being used to film real-time attacks and suicides.

People using the platforms often do so at their own risk, and parents trying to police their kids’ use are finding there are few adequate tools at their disposal. Advertisers are having to get used to media that is no longer highly curated and controlled—or get off those platforms altogether.
