This creepy Facebook stalking app was a hoax—but it should still scare the hell out of you

Earlier this week, the sketchy-looking app Facezam caught the internet's attention and whipped up some controversy. It was all just a hoax engineered by a marketing company looking to score some viral gold. But that doesn't mean you should stop watching this story.

The app was touted as "Shazam for faces," with its "makers" claiming it would let users upload photos of strangers, then leverage Facebook's repository of profile pictures to identify the people in those photos.


Facezam raised red flags immediately, and its scant marketing materials only made things worse. They declared that "privacy is over" and used a sexualized image of a woman to show how the bogus product supposedly worked: just take a picture of an innocent person, let the app find their profile on Facebook, and "the rest is up to you."

It was a skeezy play, but it raised legitimate questions about online privacy and how the data internet giants like Facebook have collected could be abused.

How they pulled off the hoax

The team behind Facezam spread the word about the fake app by contacting reporters directly and offering email interviews with the sham company's founder and CEO, "Jack Kenyon," for more insight into the app ahead of its supposed iOS rollout.

The Kenyon character struck an unapologetic stance in several interviews.

"Facezam could be the end of our anonymous societies," he told The Telegraph. "Users will be able to identify anyone within a matter of seconds, which means privacy will no longer exist in public society." 

One of the pictures provided by the Facezam team to reporters.

Image: facezam

Mashable was contacted directly with an offer to interview Kenyon, along with a request for coverage that "didn't go for the creepy angle" taken by The Telegraph. After we responded with some questions about the legal and ethical implications of what Facezam was offering its users, Kenyon fell off the map. A follow-up request for more proof that the app actually existed also went unanswered.

Things got sketchier. There was no proof that Kenyon, CEO and founder of Facezam, even existed. A cursory Google search turned up nothing other than his statements to reporters about Facezam from that morning. We later confirmed via DM that two reporters who quoted Kenyon in their Facezam coverage had only corresponded with him through email after receiving a pitch to cover the app.

Kenyon gave no insight into how the app's tech worked, but claimed it could search through "billions" of Facebook profiles. (At last official count, Facebook had just over 1.86 billion monthly active users.) The Facezam Twitter handle only started tweeting in response to media coverage, and its website provided no details about its terms of service. Its domain was also registered anonymously.

Facebook wasn't having it  

Facezam is BS, but could an app like it actually exist? Facebook wants you to believe the answer is no.

"People trust us to protect their privacy and keep their information safe," a Facebook spokesperson said via email when asked about Facezam—before the revelation that it was a hoax. "This activity violates our terms and we've reached out to the developer to ensure they bring their app into compliance."

It all comes down to how developers are allowed to use Facebook's data in their apps. According to Facebook's API terms, app-makers are barred from collecting or "scraping" user content through automated means like Facezam's hypothetical facial recognition system without Facebook's permission. 

The reps made it clear that Facebook hadn't given Facezam (or anything like it) the go-ahead to access that data, and never would—and that if someone tried, defenses are in place to keep API users in check and even blacklist them from the API entirely.

We eventually touched base with Kenyon via email, and he explained that things got out of hand.

"We were intending to end the hoax on the designated launch date, March 21st," he wrote. "However, the site went viral far [sic] quicker than we expected, and some people were not happy. After five hours in the viral spotlight, Facebook's legal team contacted us. Things had got serious quickly. We explained that Facezam was a hoax, so we weren't actually scraping data of Facebook, which is very very very illegal. This made them a lot happier."

This isn't the first time Facebook has had to stamp out an app that looked to abuse its facial recognition data. The social media giant blocked the Google Glass-affiliated NameTag from performing a similar ID function back in 2014.

We also reached out to Apple to see if a program like Facezam could ever make it through the App Store's approval process, but haven't yet received a response. 

Could this, like, actually work?

Alessandro Acquisti, professor of information technology and public policy at Carnegie Mellon University, has published extensively on the use and abuse of facial recognition systems—and he says something like Facezam is all too plausible.

"While Facezam is a hoax, developing similar tools is, in principle, already possible (we demonstrated the feasibility of mass scale face recognition via social media photos in our 2011 experiment)," he said in an email.  

He said these tools face three distinct obstacles on the road to becoming "practical."

The first is the matter of legality and who owns the image data. Facebook owns the data in question here, and it stepped up to the plate to defend it. 

The second issue has to do with technological limitations. 

"Truly mass-scale facial recognition is bound to be computationally demanding and afflicted by false positive errors," Acquisti said. 

The last concern centers on the ethics of the system. Acquisti posed a tough question to illustrate the point: "Will we accept... a world where anonymity in public is no longer possible?"

That question has, unfortunately, already been answered in the real world.  

A similar app, FindFace, ignited controversy in Russia last year after its facial recognition function was used to identify and harass sex workers and porn actresses through their personal profiles on Vkontakte (VK), a Russian social media network.

VK took steps to curb abuse, but FindFace still exists. The Facezam team even referenced the controversy when it unveiled the hoax. So it's not a stretch to think an app just like it could be twisted for the same use or even designed for it, giving trolls and stalkers yet another tool to abuse people online.  

Karissa Bell contributed to the reporting for this article.
