Lessons from the Apple-FBI fight


Short version of the FBI’s latest statement on the Apple encryption case: Never mind!

Not even a month and a half after the Federal Bureau of Investigation convinced a judge that it could never unlock the encrypted iPhone 5c used by San Bernardino murderer Syed Rizwan Farook last year without Apple’s help, the agency announced it had gotten into the phone on its own.

Monday’s news — telegraphed a week earlier, when the FBI asked for a delay in a hearing on its attempt to compel Apple to load special software on Farook’s iPhone 5c that would allow unlimited guesses at his unlock passcode — ends that case. But the conversation about it is nowhere near finished.

How did the feds do this?

The government’s three-sentence filing states concisely but vaguely that it “has now successfully accessed the data stored on Farook’s iPhone.”

As a result, it no longer needs Apple to perform the vulnerability transplant required by Judge Sheri Pym’s Feb. 16 order to further its investigation of the Dec. 2 attack in which Farook and his wife, Tashfeen Malik, murdered 14 people before being killed by police.

Apple, for its part, responded by sharing a statement with the media that declared, “This case should never have been brought.” But the company pledged its continued help with law enforcement investigations (that don’t involve it weakening its own cryptography) and its continued participation in “a national conversation about our civil liberties, and our collective security and privacy.”

Security experts such as Robert Graham and Jonathan Zdziarski can only speculate about how the FBI — most likely with the help of third-party researchers — managed a feat the government had previously declared impossible without Apple’s “exclusive technical means.”

More important, if the FBI won’t reveal how it got in, Apple itself can only make educated guesses about which flaw to fix.

It’s unclear whether a recent White House policy on disclosing software vulnerabilities would require the government to reveal this one to Apple — something the Electronic Frontier Foundation has already demanded.

Expect to read more about this issue as law enforcement agencies respond to encrypted communications by trying to hack into the devices at either end of the line — a remedy civil rights advocates have told me they don’t love but prefer over court-ordered weakening of crypto for everyone.

Next moves for Apple, and you

The best-case scenario for iPhone security is that investigators got in by tinkering with the iPhone’s flash memory — the approach outlined by Zdziarski: physically removing that chip, copying its contents, trying passcodes on the device, then copying the original data back to reset the phone’s count of failed guesses.

That shouldn’t work on newer models with a “Secure Enclave” coprocessor set up to defeat such tampering, so its risk should fade as the iPhone 5c and older models go out of circulation.
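To make the mechanics of that memory-mirroring trick concrete, here’s a simplified Python simulation. The “hardware” steps — pulling the chip, copying it, rewriting it — are stand-in functions I invented for illustration, not any real forensic tool:

```python
from itertools import product

SECRET = "2580"       # the unknown 4-digit passcode (simulated)
WIPE_AFTER = 10       # iOS erases the phone after 10 wrong guesses

failed_attempts = 0   # this counter lives in the phone's flash memory

def dump_nand():
    """Simulate desoldering the flash chip and copying its contents."""
    return failed_attempts

def restore_nand(image):
    """Simulate rewriting the saved image, resetting the guess counter."""
    global failed_attempts
    failed_attempts = image

def try_passcode(guess):
    """Simulate one guess on the device; each failure ticks the counter."""
    global failed_attempts
    if guess == SECRET:
        return True
    failed_attempts += 1
    return False

def brute_force():
    pristine = dump_nand()                     # copy the chip before guessing
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if failed_attempts == WIPE_AFTER - 1:  # one guess away from a wipe...
            restore_nand(pristine)             # ...so copy the image back first
        if try_passcode(guess):
            return guess
    return None

print(brute_force())  # finds "2580" without ever triggering the auto-erase
```

The trick works because the count of failed guesses lives in the same memory the attacker copied, so restoring the copy resets the clock on the auto-erase feature.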

And even if you use a 5c or older model, you should have some confidence that if your phone is lost or stolen, its vast store of data about your everyday habits will probably remain encrypted unless a skilled adversary with a personal interest in you acquires it.

The worst-case situation: The FBI now has its hands on an unpatched vulnerability it can use to crack any iPhone it obtains. Apple and any iPhone user would have to assume that this “zero-day” exploit would be known to more than investigators of the San Bernardino case.

In that scenario — Graham’s best guess — expect to see more office lights on all night at Apple’s Cupertino campus. To borrow a phrase from Apple’s Feb. 16 “Message to Our Customers” from CEO Tim Cook, this exploit “would be the equivalent of a master key, capable of opening hundreds of millions of locks.”

In either case, individual iPhone users can mitigate their risk of exposure by switching their screen locks from numeric codes to combinations of letters and numbers, which are far harder to guess via “brute force.”

But if you forget the passcode, you risk having your iPhone erase itself after 10 incorrect guesses, which is the exact feature the FBI wanted Apple to disable. Write down the passcode, and an adversary might see it. Security is hard when you must use a key more than once.
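For a rough sense of why letters and numbers beat digits alone, here’s a back-of-the-envelope calculation in Python. It assumes each on-device guess takes about 80 milliseconds — a per-attempt cost Apple has cited for its passcode checks — so treat the results as orders of magnitude, not promises:

```python
# Worst-case time to try every possible passcode, assuming each
# on-device guess costs roughly 80 milliseconds.

SECONDS_PER_GUESS = 0.08

def exhaust_seconds(alphabet_size, length):
    """Seconds needed to try a passcode's full keyspace."""
    return (alphabet_size ** length) * SECONDS_PER_GUESS

print(f"4-digit PIN:         {exhaust_seconds(10, 4) / 60:,.0f} minutes")
print(f"6-digit PIN:         {exhaust_seconds(10, 6) / 3600:,.0f} hours")
print(f"6-char alphanumeric: {exhaust_seconds(62, 6) / 31_536_000:,.0f} years")
```

A four-digit PIN falls in about 13 minutes of uninterrupted guessing; six characters drawn from letters and digits hold out for more than a century.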

Things will be harder for the government the next time

When this first came to light, I thought the FBI might have a compelling case. The user of the phone was a monster and also unable to unlock the phone by virtue of being dead; its owner (the San Bernardino County Department of Public Health, Farook’s employer) authorized the search; the FBI was only asking Apple to give it more chances to guess the phone’s passcode; and the iPhone lacked a defense found in newer devices.

A situation like this will almost certainly happen again, but even before Monday the odds were against the government having so many elements in its favor again. At a minimum, Apple would likely have shipped patches to make it harder to thwart its auto-erase feature, and the suspect might have a newer and more secure iPhone.

Or the technology in question could involve another company’s encryption — Apple has no monopoly on using cryptography to keep your data safe, a point often lost when politicians demand a crackdown on crypto.

And now if law enforcement investigators come to a court and declare that they can’t execute a legally authorized search without a given company’s assistance, the first response is likely to be, “Are you sure about that?”

Email Rob at rob@robpegoraro.com; follow him on Twitter at @robpegoraro.