Apple versus the FBI

There's been a lot of interesting news regarding Apple and the FBI's request to gain access to the iPhone belonging to one of the San Bernardino terrorists. It all began with the FBI asking Apple to provide a way to bypass the brute-force protections on the passcode of the suspect's iPhone 5c.

While we've seen some devices on the market that allow for brute-forcing an iPhone's PIN, those techniques no longer work given the security built into iOS 8 and newer. After detecting a number of incorrect PIN entries, the iPhone itself forces a time delay between subsequent attempts. Additionally, the suspect's iPhone is set to wipe its contents after 10 failed attempts to enter the correct passcode. Thus, the FBI can't keep guessing without the device deleting the very data they are attempting to access.
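To make those protections concrete, here's a minimal sketch in Swift of escalating-delay and wipe-counter logic. The thresholds, delay values, and the `wipeDevice()` hook are illustrative assumptions, not Apple's actual implementation.

```swift
import Foundation

/// Illustrative model of passcode-attempt protections. The delay curve
/// and wipe threshold are assumptions for demonstration only.
struct PasscodeGuard {
    private let correctPasscode: String
    private(set) var failedAttempts = 0
    private let wipeThreshold = 10          // erase data after 10 failures

    init(passcode: String) {
        self.correctPasscode = passcode
    }

    /// Seconds the device waits before allowing the next attempt.
    func delayBeforeNextAttempt() -> TimeInterval {
        switch failedAttempts {
        case 0..<4: return 0                // first few attempts: no delay
        case 4:     return 60               // 1 minute
        case 5:     return 5 * 60           // 5 minutes
        case 6, 7:  return 15 * 60          // 15 minutes
        default:    return 60 * 60          // 1 hour
        }
    }

    mutating func attempt(_ passcode: String) -> Bool {
        guard passcode == correctPasscode else {
            failedAttempts += 1
            if failedAttempts >= wipeThreshold {
                wipeDevice()                // destroy the encryption keys
            }
            return false
        }
        failedAttempts = 0
        return true
    }

    private func wipeDevice() {
        print("Device wiped: data is now unrecoverable.")
    }
}
```

Even in this toy version, the combination is what matters: the delays make each guess expensive, and the wipe counter caps the total number of guesses at ten.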

Even if Apple wanted to simply hand over the data, they couldn't. From their Government Information Requests page:

On devices running iOS 8 and later versions, your personal data is placed under the protection of your passcode. For all devices running iOS 8 and later versions, Apple will not perform iOS data extractions in response to government search warrants because the files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess.

The only way to bypass the security imposed at the operating system level is thus to modify the operating system itself. And only an operating system update signed with Apple's private signing key can remove those roadblocks.
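That signing requirement is the crux of why the order targets Apple specifically. As a rough illustration (using CryptoKit and an invented `deviceAccepts` function; the real boot chain of trust is more involved), a device only runs firmware whose signature verifies against a public key it already trusts:

```swift
import CryptoKit
import Foundation

// Rough illustration of signed updates: the device trusts a public key
// and accepts firmware only if the signature checks out. Without the
// matching private key, nobody else can forge an acceptable update.
// Names and flow here are assumptions, not Apple's actual boot process.
func deviceAccepts(firmware: Data,
                   signature: Data,
                   trustedKey: Curve25519.Signing.PublicKey) -> Bool {
    trustedKey.isValidSignature(signature, for: firmware)
}

// Only the holder of the private key can produce a valid signature:
let appleKey = Curve25519.Signing.PrivateKey()
let firmware = Data("custom OS image".utf8)
let signature = try appleKey.signature(for: firmware)
print(deviceAccepts(firmware: firmware,
                    signature: signature,
                    trustedKey: appleKey.publicKey)) // true
```

In other words, the FBI can't build and install such a tool itself; only code blessed by Apple's key will boot.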

Ultimately, Apple is being asked to provide a workaround to the security built into that iPhone. The request can be summed up in the words Tim Cook provided in an open letter on Apple's site:

But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

The request asks for either a way to circumvent the passcode entirely, or a way to bypass the attempt delay so that they can directly brute-force the PIN in a much shorter amount of time. What they are requesting is a custom version of the operating system, installed via DFU mode, that would give the FBI access to just the suspect's iPhone.
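The delay bypass is the key part. Apple's iOS Security guide notes that each passcode guess requires a key derivation entangled with the device's hardware, taking roughly 80 milliseconds, so with the artificial delays and the wipe removed, a quick back-of-the-envelope calculation (the per-guess cost is an approximation) shows how fast a brute force becomes:

```swift
// Back-of-the-envelope brute-force timing, assuming each guess costs
// one hardware-bound key derivation at ~80 ms (an approximation drawn
// from Apple's iOS Security guide).
let secondsPerGuess = 0.08

let fourDigitPINs = 10_000.0     // 0000 through 9999
let sixDigitPINs = 1_000_000.0   // 000000 through 999999

print("4-digit PIN, worst case: \(fourDigitPINs * secondsPerGuess / 60) minutes")
// ≈ 13 minutes
print("6-digit PIN, worst case: \(sixDigitPINs * secondsPerGuess / 3600) hours")
// ≈ 22 hours
```

A four-digit PIN falls in minutes; even a six-digit passcode holds out for less than a day.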

Except it goes beyond just that one phone: if Apple complied, there would be nothing stopping the FBI from reverse engineering the 'backdoor' and using it on other iPhones. Tim Cook:

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

What's interesting is that this case centers on an iPhone 5c, a device technically identical to the iPhone 5. That generation was the last before Apple introduced Touch ID and the Secure Enclave. The iPhone 5s and newer, plus modern iPads, all have a level of security that goes beyond what is involved in this request.

With the Secure Enclave and Touch ID, Apple added new safeguards that make it even harder to access an iOS device. We can see a side effect of that in the recent complaints about 'Error 53', an error that appears when the Touch ID and Secure Enclave hardware has been tampered with, whether on purpose or by accident during an iPhone repair.

Essentially, Apple has a very secure device, and the FBI is asking for help to get around that security. Even Apple admits such a tool could be built; in Cook's words, it is simply 'too dangerous to create.'

But by standing firm on data security and user privacy, Apple is also making a public statement: they take the security of all of their customers seriously.

Dan Moren at Six Colors, responding to an article on the matter by Ben Thompson:

Thompson worries that Cook has picked the wrong battle—i.e., that it shouldn’t be about the circumvention on this particular phone. That’s in part because the phone in question—an iPhone 5c—lacks security protections present in the 5s and later; so the workaround the FBI is requesting isn’t even possible on newer models.

But I’m sure there are also millions of 5c and earlier iPhones out there that would potentially be subject to such procedures, which is likely one reason that Apple and Cook have taken a stand here: it’s aiming to protect all of its customers, not just some of them.

Apple has been vocal about user privacy for a while now. Security is also a main point in their "Why there's nothing quite like iPhone" campaign. And judging from Tim Cook's very public response, they take it seriously for everyone using an iPhone today.

Of course, it isn't just providing a backdoor to the government that's an issue. Rather, it's the idea that a backdoor the 'good guys' can use is also one the 'bad guys' can use. And as we store more and more sensitive data on our devices, it only makes sense that the security and privacy of those devices increase, too.

While the issue of Apple providing access to an iPhone isn't a new one, this request and the vocal response to it could mark a very important milestone. The FBI clearly wanted to set a precedent: if Apple complied for one device, it could be expected to comply for any device in the future.

Thankfully, Apple has already implemented stronger security with the Secure Enclave and Touch ID. The Secure Enclave is an independent coprocessor that handles security operations on its own. While the operating system could be modified to get around passcode attempt limitations, the Secure Enclave would still slow down any such request. To gain access to such a device (an iPhone 5s or newer), Apple would have to bake in a master key.
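To make that distinction concrete, here's a conceptual Swift sketch (not Apple's code; the stored passcode, escalation curve, and API are invented for illustration) in which the delay lives inside the enclave itself, so a modified operating system gains nothing by submitting guesses faster:

```swift
import Foundation

// Conceptual model: the Secure Enclave as a separate, isolated actor
// that enforces its own delay. The OS can only call verify(); it cannot
// reach inside to skip the wait or reset the failure count.
// All values and names here are illustrative assumptions.
actor SecureEnclaveModel {
    private let storedPasscode = "1234"   // hypothetical secret
    private var failures = 0

    func verify(_ guess: String) async -> Bool {
        // The delay escalates with failures and is enforced here,
        // inside the enclave, regardless of how fast the OS asks.
        let delaySeconds = min(Double(failures), 5.0)
        try? await Task.sleep(nanoseconds: UInt64(delaySeconds * 1_000_000_000))

        if guess == storedPasscode {
            failures = 0
            return true
        }
        failures += 1
        return false
    }
}
```

Since the check never leaves the enclave, patching the operating system only changes how quickly requests are submitted, not how quickly they are answered.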

Such a master key would not be a safe move. Ben Thompson explains it well:

There is a point to diving into these details: thanks to the secure enclave an iPhone 5S or later, running iOS 8 or later, is basically impossible to break into, for Apple or anyone else. The only possible solution from the government’s perspective comes back to the more narrow definition of “backdoor” that I articulated above: a unique key baked into the disk encryption algorithm itself.

This solution is, frankly, unacceptable, and it’s not simply an issue of privacy: it’s one of security. A master key, contrary to conventional wisdom, is not guessable, but it can be stolen; worse, if it is stolen, no one would ever know. It would be a silent failure allowing whoever captured it to break into any device secured by the algorithm in question without those relying on it knowing anything was amiss. I can’t stress enough what a problem this is: World War II, especially in the Pacific, turned on this sort of silent cryptographic failure. And, given the sheer number of law enforcement officials that would want their hands on this key, it landing in the wrong hands would be a matter of when, not if.

Hopefully, Apple will never find itself in a position to have to provide such a security hole for the government, and everyone else, to jump into.

Meanwhile, the debate remains a vocal one. We will have to see how the FBI, and the government in general, responds. But thus far, we've seen support come in from WhatsApp CEO Jan Koum, the American Civil Liberties Union, the Electronic Frontier Foundation, and even some insight from Google CEO Sundar Pichai.