Apple opens up on how it approaches security following FBI battle


CUPERTINO — In a press briefing Friday, Apple discussed how security works on the iPhone and iOS. The meeting, which was often technical, shed insights into its broader approach to security.

Although the meeting wasn’t specifically about the battles the company has had with the FBI and parts of the U.S. government – including cases in San Bernardino and Brooklyn – that conflict was still the elephant in the room.

Still, Apple insists its goal with iOS and iPhone security is not to protect users from the government but to protect them from hackers.

SEE ALSO: A timeline of Apple's fight with the FBI

Senior Apple engineers said that although security has been a big part of how the company approaches its design for the last two decades, it has become even more important in the last decade because of the iPhone.

The iPhone, more than any other product, is a place where customers place their most important and private information. Everything from identity information to health information is stored on the device, and Apple engineers say the company feels it is imperative to protect that information from hackers trying to break in.

At the meeting, senior Apple engineers, who declined to offer individual comment, discussed the company's approach to security.

Building security from the ground up

The fact that hackers are the real threat vector, not the government, was a theme in the briefing.

Describing security as a process and not a destination, senior Apple engineers were quick to assert that there is no such thing as 100% security, but that the company is focused on building its system from the ground up to be as secure as possible.

The engineers also stressed that security is dynamic, not static. And because the situation is always changing, security can never be seen as complete.

Apple feels that one of its core advantages is that it controls the whole stack of hardware and software. Moreover, it has designed security into its products from the silicon up.

Calling Apple “the most effective security organization in the world,” senior Apple engineers repeatedly emphasized that the entire Apple ecosystem was designed with security in mind.

Because Apple designs its own chips and its own operating system, it is in a unique position in the industry when it comes to hardware/software integration.

A significant portion of the briefing was spent on how the secure boot process for the iPhone works. These details are also outlined in the iOS Security White Paper.

This is how that document describes the boot process for an iPhone:

Image: Apple

In other words, hardware embedded into the chip on the phone checks the software before it boots to make sure it is secure and actually signed by Apple. This prevents hackers from taking over the device and installing a rogue version of the operating system.

iOS devices with an A7 or later processor (the iPhone 5S and newer) also have a Secure Enclave processor (SEP), which has its own secure boot process.

Again, Apple stresses that bugs are always possible – there is no such thing as 100% security – but the senior engineers pointed out that separating the components of the boot process limits those bugs.

Consider that iOS has millions of lines of code, and bugs can exist in that software. But at the lowest levels – where the Boot ROM and Low-Level Bootloader live – it's more like a couple of thousand lines of code.

Apple's engineers – as well as outside experts and machines – evaluate that code. As a result, the chances of a bug in this part of the code are much, much lower.
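The chain-of-trust idea behind that boot process can be sketched in a few lines. This is a toy illustration, not Apple's implementation: the real Boot ROM verifies RSA signatures rooted in a key burned into the silicon, and here an HMAC with a hard-coded key stands in for that.

```python
import hmac
import hashlib

# Hypothetical stand-in for the signature key fused into the Boot ROM.
APPLE_ROOT_KEY = b"key-burned-into-boot-rom"

def sign(blob: bytes) -> bytes:
    return hmac.new(APPLE_ROOT_KEY, blob, hashlib.sha256).digest()

def verify(blob: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(blob), signature)

def boot(stages):
    """Each stage is (name, blob, signature). Boot halts at the first
    stage whose signature fails, mimicking the chain of trust from
    Boot ROM -> Low-Level Bootloader -> iBoot -> kernel."""
    booted = []
    for name, blob, signature in stages:
        if not verify(blob, signature):
            return booted, f"halt: {name} failed verification"
        booted.append(name)
    return booted, "boot complete"

llb, iboot, kernel = b"llb image", b"iboot image", b"kernel image"
chain = [("LLB", llb, sign(llb)),
         ("iBoot", iboot, sign(iboot)),
         ("kernel", kernel, sign(kernel))]
print(boot(chain))  # every stage verifies, boot completes

# A tampered kernel (a rogue OS image) stops the boot partway:
bad_chain = chain[:2] + [("kernel", b"rogue os", sign(kernel))]
print(boot(bad_chain))
```

The point of the structure is that each stage only runs code the previous stage has already verified, so a rogue image anywhere in the chain halts the device rather than booting it.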

Security only matters if everyone is using it

Of course, none of the security safeguards matter if users are using outdated versions of software.

A senior Apple engineer was blunt: “None of this stuff matters if users won’t install updates.”

And it's here that Apple has an inarguable advantage, especially compared with other phone makers and operating system makers.

Apple controls the whole stack, including updates, which means that it can push out updates and bug fixes very quickly.

This is not how the rest of the mobile industry works. Consider the situation with Stagefright, an Android vulnerability disclosed last year. Even though Google was very quick to patch the exploit, it took considerable time for those updates to get down the chain to non-Nexus devices. Millions of devices will never get an update against that vulnerability.

That example shows a sharp contrast between Apple's ecosystem and the Android ecosystem. With Android, Google can issue updates, but it's up to the manufacturers to patch the version of Android running on their devices. Because most Android makers customize the software in some way, there is usually a delay between an update or patch coming from Google and it being packaged by the OEM.

Even after the software is patched, it can take additional time for carriers to vet an update before it is issued to customers. And that’s assuming the device is still supported.

Google is making inroads to solve these problems, and in the latest versions of Android it has taken more control over updating core services related to the OS, including the web browser. But adoption of new Android versions is still low.

As of April 16, only 4.6% of Android devices are running Android 6.0 (Marshmallow). In contrast, 80% of iOS devices are running iOS 9.

A lot of Apple’s update success can be attributed to owning the whole stack (and being able to issue updates directly over the air without carrier intervention), but the company is working to make the process even better.

With iOS 9, Apple reduced the amount of space needed on a device to install the new version of iOS. With iOS 8, users needed at least 4.6GB of free space on their phone. Plenty of users didn’t have that space and as a result, adoption dipped.

With iOS 9, Apple was able to get the update size down to 1.3GB. Apple says that change immediately shrank the pool of users who hadn't upgraded.

With iOS 9 – and improved in iOS 9.3 – Apple launched a new feature that allows users to install an OS update later. Apple updates can already download in the background, but now users get a pop-up asking if they want to install now or "install later." If "install later" is selected, the update installs in the evening when the phone is plugged into the charger.

This is similar to a feature Apple released with OS X updates last year and senior engineers say they expect that feature to aid in update adoption too.

And after updates are issued to phones, Apple works to make sure phones can’t be retroactively loaded with an older version of an operating system. Apple stops signing older versions of the software within a few days.

Senior Apple engineers stressed that this sort of update process isn't easy to do, but is extremely important. Updates that are issued but never reach consumers are worthless.
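The downgrade protection mentioned above (Apple stops signing older versions within a few days) can be sketched as a signing-window check. This is a simplified model, not Apple's actual signing service; the version numbers, dates and three-day grace period are illustrative assumptions.

```python
from datetime import date, timedelta

class SigningServer:
    """Toy model of a release-signing service: the newest build is
    always signed, older builds only for a short grace window after
    a newer release ships."""

    GRACE = timedelta(days=3)  # assumed "few days" window

    def __init__(self):
        self.releases = {}  # version -> release date

    def release(self, version, on):
        self.releases[version] = on

    def will_sign(self, version, today):
        released = self.releases.get(version)
        if released is None:
            return False  # unknown build: refuse to sign
        latest = max(self.releases.values())
        # Current build is always signed; superseded builds only
        # briefly, which blocks restoring an old, vulnerable OS.
        return released == latest or today - latest <= self.GRACE

srv = SigningServer()
srv.release("9.3", date(2016, 3, 21))
srv.release("9.3.1", date(2016, 3, 31))

print(srv.will_sign("9.3.1", date(2016, 4, 16)))  # True: current build
print(srv.will_sign("9.3", date(2016, 4, 16)))    # False: window closed
```

Because a device checks for a valid signature during restore, refusing to sign an old build is enough to stop it from being loaded back onto phones after its vulnerabilities are known.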

Encryption

As with its other security measures, Apple’s encryption efforts are hardware and software based. Historically, a challenge with encryption – especially on mobile – can be its impact on battery life and performance.

So starting in 2009 with the iPhone 3GS, Apple has had hardware support for Advanced Encryption Standard (AES).

Encryption on the iPhone starts with the hardware itself. There is a chip on the iPhone that sits between the flash memory (NAND) and the RAM. This chip encrypts the data between those two points. In 2013, Apple introduced the Secure Enclave co-processor (SEP), which makes this process even more robust.

The Secure Enclave is interesting because like the secure boot process, it is separate from iOS. This is done because this separation makes it harder to attack.

The AES chip talks to the Secure Enclave and the two can exchange security keys with one another. When iOS wants to fetch something from NAND storage and move it into RAM, it asks the Secure Enclave to hand a key to the AES block, and that key (which iOS can never read) is what the AES chip uses to decrypt the data.

Image: Apple

On A7 devices and higher, the encryption runs through the AES block but key management is wholly governed by the Secure Enclave. It is worth noting that the iPhone 5C – which was at the heart of the San Bernardino case – did not have a Secure Enclave. Security experts Mashable has spoken with speculate that the vulnerability ultimately used to break into that phone may not have been possible on a device with a Secure Enclave. Apple engineers weren't ready to make any assertions like that (again, no system is 100% secure), but stressed that the Secure Enclave itself was something designed as the company's security and hardware technology naturally evolved.
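The key-handling arrangement described above can be sketched as follows. This is a toy model under stated assumptions: the class names are illustrative, and a SHA-256 keystream stands in for the dedicated AES engine, which this sketch does not implement. The point it shows is the separation of duties: the Secure Enclave holds keys, the engine moves and transforms data, and iOS never touches raw key material.

```python
import os
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher (SHA-256 in counter mode) standing in for the
    # hardware AES engine. XOR makes encryption its own inverse.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        chunk = data[i:i + 32]
        out += bytes(a ^ b for a, b in zip(chunk, block))
    return bytes(out)

class SecureEnclave:
    """Holds per-file keys; hands them only to the AES engine."""
    def __init__(self):
        self._keys = {}
    def new_file_key(self, file_id):
        self._keys[file_id] = os.urandom(32)
    def key_for_engine(self, file_id):
        return self._keys[file_id]  # called by the engine, never by iOS

class AESEngine:
    """Sits between NAND and RAM; iOS asks it to move data but
    never sees the keys it uses."""
    def __init__(self, sep):
        self.sep = sep
    def write_to_nand(self, file_id, plaintext):
        return keystream_xor(self.sep.key_for_engine(file_id), plaintext)
    def read_to_ram(self, file_id, ciphertext):
        return keystream_xor(self.sep.key_for_engine(file_id), ciphertext)

sep = SecureEnclave()
engine = AESEngine(sep)
sep.new_file_key("note.txt")
stored = engine.write_to_nand("note.txt", b"health data")
print(stored != b"health data")                 # ciphertext at rest
print(engine.read_to_ram("note.txt", stored))   # plaintext back in RAM
```

In this arrangement, compromising iOS alone is not enough to read the keys, which is exactly the isolation the Secure Enclave design is after.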

A senior Apple engineer says that Apple was thinking about silicon-based encryption even before the iPhone shipped. It took until the iPhone 3GS for the iPhone to get that hardware support – but it was a direction Apple wanted to go in even before 2007.

With the A7 and its Secure Enclave, that was another example of a direction Apple wanted to go in, but it took time for the hardware to be designed and for the software to work in parallel. The lead time for hardware on devices like phones is often years, so Apple thinks about its security philosophy and design long before the final results show up in its products.

Apple uses standard encryption algorithms to protect its files. It also publishes the source code (the math) behind its techniques on its website. Being open in this case is good: the keys, which are generated by hardware, are safe, and having the source code open means others can look at the code and verify its security.

In addition to internal security audits, Apple also does third-party code review with external experts. 

iMessage

There’s a lot of talk about encrypted messaging – especially with the news that WhatsApp now supports end-to-end encryption – but Apple wants to stress that iMessage (and FaceTime before it) has had end-to-end encryption from the start.

It’s not enough to just encrypt communications on the sender's phone and the receiver's phone, because if messages pass through a central server, that server becomes an attack vector.

So with iMessage, everything that is sent is encrypted and nothing is ever sitting in the clear. The device itself (iPhone, iPad or Mac) is what generates the secure key to decrypt each message.

Image: Apple

What becomes interesting is when other devices are added to the mix. iOS lets you use your Apple ID with multiple devices and you can then get access to all of your messages across those devices.

When that happens, multiple copies of the message are actually sent (each encrypted separately) and delivered across devices. That way, when I send a message from my iPhone, I can still see what I wrote on my iPad later.
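That per-device fan-out can be sketched like this. It is a simplified model, not the iMessage protocol: real iMessage uses per-device public-key encryption, and here a toy symmetric XOR cipher and made-up device names stand in for it. What it shows is that one logical message becomes one sealed copy per registered device, and the relay in the middle only ever sees ciphertext.

```python
import os
import hashlib

def toy_seal(key: bytes, msg: bytes) -> bytes:
    # Toy XOR cipher standing in for iMessage's per-device
    # public-key encryption; illustration only, not secure.
    stream = hashlib.sha256(key + b"stream").digest()
    return bytes(a ^ b for a, b in
                 zip(msg, stream * (len(msg) // 32 + 1)))

toy_open = toy_seal  # XOR with the same keystream is its own inverse

# Each device registered to an Apple ID holds its own key.
devices = {name: os.urandom(32) for name in
           ("Christina's iPhone", "Christina's iPad", "Christina's Mac")}

def send(msg: bytes):
    # One sealed copy per destination device; the server that relays
    # these copies never sees the plaintext.
    return {name: toy_seal(key, msg) for name, key in devices.items()}

sealed = send(b"lunch at noon?")
print(all(c != b"lunch at noon?" for c in sealed.values()))  # True
print(toy_open(devices["Christina's iPad"], sealed["Christina's iPad"]))
```

Only the device holding the matching key can open its copy, which is why adding a new device to an Apple ID is a security-relevant event worth alerting users about.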

Apple has designed pop-ups to alert users every time a new device is added to an Apple ID. You’ve probably seen this message if you have more than one device. It alerts you that another device named “Christina’s Apple Watch” or “Christina’s iPad Pro” is now able to send and receive iMessages, a way to keep users informed and be transparent.

That way, if you see a new, unexpected device on your account, you know to revoke access and change passwords immediately.

Touch ID

Along with automatic updates, perhaps the biggest success Apple has seen with bringing security to its masses is with Touch ID.

The average user unlocks her phone 80 times a day. Because unlocking is so frequent, before Touch ID only 49% of iPhone users had a passcode on their phone.

And it makes sense. If you have to unlock something so many times, a passcode can get annoying. Passcode usage after Touch ID’s introduction is now at 89%.

The Touch ID sensor works by first collecting an image of your fingerprint. That image goes to the Secure Enclave co-processor (SEP) and all fingerprint processing happens here, not in iOS.

iOS then mediates communication between the Touch ID sensor and the SEP. The images of the fingerprint are only held transiently to build the mathematical model behind the encryption. After the model is built, the image is not stored anywhere on the phone.

Whenever a device is powered off – or after two days of non-use for plugged-in devices – the keys generated for data protection by the SEP are thrown away. A passcode is then needed to get back into the device, after which a new set of data protection keys can be generated.
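The Touch ID lifecycle described above can be sketched as a small state machine. This is a toy model under assumptions: a plain hash stands in for the mathematical fingerprint model, the class and method names are made up, and a real device also entangles keys with a hardware UID, which is omitted here. The two properties it shows are real, though: the raw fingerprint image is never kept, and after power-off a passcode is required before a fingerprint works again.

```python
import os
import hashlib

class SecureEnclaveModel:
    def __init__(self, passcode: bytes):
        self._template = None
        self._passcode = hashlib.sha256(passcode).digest()
        self._dp_keys = None  # data-protection keys, None when locked

    def enroll(self, fingerprint_image: bytes):
        # Only a mathematical model (here: a hash) is retained; the
        # raw image is discarded and never stored on the phone.
        self._template = hashlib.sha256(fingerprint_image).digest()

    def unlock_with_passcode(self, passcode: bytes) -> bool:
        if hashlib.sha256(passcode).digest() == self._passcode:
            self._dp_keys = os.urandom(32)  # regenerate protection keys
            return True
        return False

    def unlock_with_finger(self, fingerprint_image: bytes) -> bool:
        # Touch ID only works while keys are live; after power-off
        # they are gone, so a passcode is required first.
        if self._dp_keys is None:
            return False
        return hashlib.sha256(fingerprint_image).digest() == self._template

    def power_off(self):
        self._dp_keys = None  # keys thrown away on shutdown

sep = SecureEnclaveModel(b"1234")
sep.enroll(b"ridge-pattern")
sep.unlock_with_passcode(b"1234")
print(sep.unlock_with_finger(b"ridge-pattern"))  # True while keys live
sep.power_off()
print(sep.unlock_with_finger(b"ridge-pattern"))  # False: passcode needed
```

Throwing the keys away on shutdown means a freshly rebooted phone holds nothing a fingerprint alone can unlock, which is why a restarted iPhone always asks for the passcode first.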

Data security is ‘fundamental to our society’

Although Apple stressed that its security methods aren’t designed to keep the government out – but to protect its customers from hackers – it still believes that data security is fundamental to our personal security and society.

In an ideal world, Apple would like the U.S. government to play a lead role on data privacy and user security. Apple thinks the U.S. government should set the tone, because so many other governments look to the United States and take cues from its approach.

Of course, many in the government think the opposite and some lawmakers are already moving to basically eradicate encryption.

But for its part, Apple is committed to security and privacy.