Why the Cops Hate the New Apple and Google Phones

(Illustration of a cellphone and key)

The police are upset about Apple and Google’s latest smartphone advances.

The problem is the encryption in both the iOS 8 and Android Lollipop operating systems. It is turned on by default, and there is no master key that Apple or Google can give to investigators, even if they have a warrant.

“Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data,” Apple brags on its site. “So it’s not technically feasible for us to respond to government warrants.”

Google is not as direct in its security sales pitch for Android 5.0 Lollipop, shipping on the Nexus 6 and (slowly) coming to other devices: “Full device encryption occurs at first boot, using a unique key that never leaves the device.”

What Apple and Google tout as a feature, police see as a bug, and not the kind they can use to listen in on the bad guys.

“Encryption threatens to lead all of us to a very dark place,” FBI Director James Comey warned in a speech last month. “Justice may be denied because of a locked phone or an encrypted hard drive.”

Crypto wars
To understand why Comey and others are upset, consider how cryptography works. Intensely complicated math yields a digital lock that cannot be picked in any reasonable period of time. If you want to read the protected data, you either get the password or you give up.

Digital encryption is at work right now, as you read this: Yahoo Tech (like all Yahoo sites) uses encryption that scrambles the data flowing between it and your browser. And your Web-mail service is increasingly likely to use encryption, too, so your messages can’t be read if they are intercepted.

Where iOS 8 and Android Lollipop differ from most encryption is in where the key goes. Instead of sitting on a company’s servers, it’s a snowflake of a secret, unique and isolated to one device. It’s generated and kept on the phone; even the manufacturer cannot retrieve the key to decrypt it.
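For the technically curious, here is a minimal sketch in Python of what device-local encryption looks like in principle. It uses the third-party `cryptography` library and is purely illustrative: it is not Apple’s or Google’s actual implementation, and the passcode-to-key derivation shown is a simplified stand-in for what a real phone does in dedicated hardware.

```python
# Illustrative only: a simplified stand-in for device-local encryption,
# not Apple's or Google's actual implementation.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passcode: str, salt: bytes) -> bytes:
    """Turn a passcode into an encryption key. The key exists only on the
    device, and only while the passcode is being used."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=1_000_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))


salt = os.urandom(16)              # stored on the device; not secret by itself
key = derive_key("1234-5678", salt)

ciphertext = Fernet(key).encrypt(b"contacts, photos, messages ...")

# Without the passcode there is no key, and without the key the ciphertext
# is just noise -- which is why there is nothing for the manufacturer to
# hand over in response to a warrant.
plaintext = Fernet(derive_key("1234-5678", salt)).decrypt(ciphertext)
```

The point of the sketch is the design choice, not the code: because the key is derived and held on the device alone, there is no copy anywhere else to subpoena.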

Comey and other law enforcement veterans want Apple and Google to relax the robustness. As former U.S. Attorney General Michael Mukasey put it at an event in Washington last month: “The toothpaste needs to get back in the tube.”

What if it isn’t put back, and we’re in a world where unbreakable crypto is widely available and used?

We don’t need to speculate, because that world has existed since the 1990s. Pretty Good Privacy — an open-source crypto program the government tried to quash by investigating developer Phil Zimmermann for violating weapons-export laws — also doesn’t offer any “golden key” for the police.

PGP is not the easiest app to set up, but Apple’s FileVault 2 disk encryption has shipped with OS X since 2011, while Microsoft’s BitLocker has been in some Windows releases for even longer. And there’s no evidence that either Apple or Microsoft has built in back doors that police or other agencies can use.

Resorting to other tricks
Yet police investigations still catch the bad guys. At one extreme, investigators have used court orders to plant malware on suspects’ machines to record passwords. At the other, there’s traditional police work: The Intercept’s Dan Froomkin and Natasha Vargas-Cooper noted that encryption had nothing to do with solving three of the four cases Comey cited in his speech.

For the law-abiding among us, encryption is a good thing — as the FBI notes in its advice to business travelers. But even if you never leave the United States and have no corporate foes, knowing that your data can’t be sucked out of your phone or computer should give you comfort.

And knowing that there’s some back door or master key that can be handed over to a governmental agency, or stolen by a criminal one, should make that comfort vanish. It is a rare vulnerability that stays locked up in one lab; just ask a former National Security Agency contractor who learned much about the agency’s surveillance during his time there.

“These things get found all the time, and they make a product completely untrustworthy and unreliable,” Edward Snowden said via a remote and encrypted video interview at a digital-security workshop for journalists on Nov. 7. “You’ll see a whole bunch of intelligence agencies exploiting the same capability.”

Adam Wandt, a professor and deputy chair at the John Jay College of Criminal Justice in New York, said the encryption in iOS 8 and Android 5.0 is harder to crack than past encryption systems — in part because of requests from police and courts for better security on their own devices. But could those systems be revised to hide a secret backup key that only police and courts could access? For now, he said, “it’s near impossible.”

Mukasey’s description of the Internet at last month’s event also applies to any crypto back door: “Like a gun, it doesn’t know whether to defend or attack.”

Attack the network instead
But while device encryption has grown much stronger, the networks that make these devices useful still look woefully weak. On Friday, The Wall Street Journal’s Devlin Barrett reported that the Justice Department has been flying planes equipped with “dirtboxes” — devices that impersonate cellphone transmitters to snoop on the data sent back and forth.

The idea here is to zero in on the phones of suspects and ignore those of others (we must take the DoJ at its word there). Police can already do that by using “IMSI catchers” to spoof cell towers, but a Cessna above a city will capture far more chatter. And we had no idea this aerial surveillance was going on until a few days ago.

Meanwhile, all the data that’s automatically synced from our phones to cloud storage can be vulnerable to low-grade hacking and compromise, to say nothing of NSA-level prying.

Companies like Apple and Google will make a big mistake if they pat themselves on the back for making encryption the default on a new device and don’t keep working just as hard to secure the network services that give those gadgets life.

And the law-enforcement advocates angry at Apple and Google for complicating their jobs should take a moment to consider how their requests look after all these revelations of widespread, industrialized surveillance of networks. Asking tech companies to break their products, even as strong crypto remains available elsewhere, is not an argument they can or should win.

Email Rob at rob@robpegoraro.com; follow him on Twitter at @robpegoraro.