Why can't Apple simply add the backdoor now and then remove it after the FBI is done?

The whole story is weird. Since the iPhone in question does not have a tamper-resistant device, the FBI should be able to open the case, read the whole Flash chip, and then run the exhaustive search themselves without even running the phone's firmware. Updates from Apple should have no effect at all.

(Edit: in fact it is a bit more complex; see below.)

Conversely, assuming that there was a technical impossibility in the description above (which would amount to claiming that the FBI's level of incompetence is at least as large as their budget), and that an Apple firmware would solve the case, then it would be easy for Apple to make a firmware version that does the exhaustive search as the FBI wants, but only after having checked that the hardware serial number exactly matches some expected value, i.e. the exact phone from the San Bernardino case. Such a firmware would comply with the exact demand from the FBI without compromising the privacy of anybody else.

That the FBI claims to have tried to decrypt the phone for a month and failed is weird. That Apple refuses to help with a firmware update limited to a single phone is equally weird. The whole thing looks like a political struggle about the right to privacy and the legality of non-judicial eavesdropping by law enforcement. The San Bernardino case is merely a pretext used to elicit reflex support from the non-technical electorate. Apple found it expedient to play the role of the White Knight, from which they cannot now back away without alienating their consumer base.


Edit: the security system in an iPhone is a tower of elements, described (succinctly) in this document. An iPhone 5C runs on an Apple A6 chip, while the 5S and later models use an A7. The A6 has an onboard tamper-resistant device called the "UID"; the A7 adds a second one called the "Secure Enclave". Since the iPhone in the San Bernardino case is a 5C, I won't talk any more of the Secure Enclave.

The UID contains an internal key that is unique to the device (let's call it Ku) and is unknown to anybody else, including Apple (whether the UID generates it itself, or it is generated externally and then injected into the UID during manufacturing, is unknown; I'll assume here that if the latter is the case, then Apple really did not keep the key). The UID never lets that key out, but it can do an AES-based computation that uses that key.

The iPhone data is encrypted with AES, using a 256-bit key (Kd) that is derived from the combination of the user PIN and the UID key. Though Apple does not exactly detail that combination, it says it involves key wrapping, which is another name for encrypting a key with another key. We also know that a user can change his PIN, and it would be impractical to change the actual data encryption key Kd in that case, because it would involve reading, decrypting, re-encrypting and rewriting the gigabytes of user data. Thus, a plausible mechanism is the following:

  • The key Kd has been generated once.
  • When the phone is off, what is stored (in Flash, outside the UID) is an encryption of Kd by another key Kz.
  • Kz is itself the encryption (wrapping) of the user PIN by Ku.

Thus, unlocking entails obtaining the user PIN and submitting it to the UID, which returns Kz by encrypting the PIN with Ku. With Kz, the phone's firmware then recovers and decrypts Kd, and configures the crypto engine to use that key for all accesses to the user data.
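To make the outline concrete, here is a minimal sketch of that plausible mechanism in Python. The function names, the use of raw AES-ECB, and the PIN padding are all my assumptions for illustration; Apple does not publish the real construction.

    # Minimal sketch of the plausible unlock flow; Ku, Kz and Kd follow
    # the terminology above. The exact constructions are guesses.
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def uid_derive_kz(ku: bytes, pin: str) -> bytes:
        """Stand-in for the UID: encrypt the (padded) PIN under the
        device-unique 256-bit key Ku. In the real device, Ku never
        leaves the hardware; only this computation is exposed."""
        padded_pin = pin.encode().ljust(16, b"\x00")      # one AES block
        enc = Cipher(algorithms.AES(ku), modes.ECB()).encryptor()
        return enc.update(padded_pin) + enc.finalize()    # 16-byte Kz

    def unwrap_kd(kz: bytes, wrapped_kd: bytes) -> bytes:
        """Firmware side: decrypt the wrapped data key Kd (stored in
        Flash) with Kz, yielding the 256-bit AES key for user data."""
        dec = Cipher(algorithms.AES(kz), modes.ECB()).decryptor()
        return dec.update(wrapped_kd) + dec.finalize()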

While the actual scheme may differ in its details, the general outline must match that description. The salient point is that, although the tamper-resistant device (the UID) must be involved with each PIN try, it does not actually verify the PIN. The UID has no idea whether the PIN was correct or not. The wrong-PIN counter, the delay on error, and the automatic deletion are handled externally, by the firmware. This must be so, because otherwise there would be no sense in Apple allowing the break to be performed with a firmware update.
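Here is what that externally-enforced policy might look like, reusing uid_derive_kz() and unwrap_kd() from the sketch above. The verifier hash, the retry limit and the delay schedule are all invented for illustration; the point is only that every protection lives in replaceable software.

    import time
    from hashlib import sha256

    MAX_TRIES = 10    # assumed policy value, for illustration only

    def firmware_unlock(pin: str, state: dict) -> bytes | None:
        """Counter, delay and wipe all live here, in updatable firmware.
        A firmware update can simply replace this function."""
        if state["wrapped_kd"] is None:                  # already wiped
            return None
        kz = uid_derive_kz(state["ku"], pin)             # the only UID call
        kd = unwrap_kd(kz, state["wrapped_kd"])
        if sha256(kd).digest() != state["kd_verifier"]:  # hypothetical check
            state["fails"] += 1
            if state["fails"] >= MAX_TRIES:
                state["wrapped_kd"] = None               # "auto-delete"
            else:
                time.sleep(2 ** state["fails"])          # escalating delay
            return None
        state["fails"] = 0
        return kd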

Of course, one can imagine a kind of extended UID that would enforce the PIN verification and lock-out strategy itself, and could do so by running its own firmware, updatable by Apple. Such a device would really make Apple's help crucial. However, such a device would then be called a "Secure Enclave", because that's exactly what it is; and if it was added to the A7 CPU, it is precisely because it was lacking in the A6, and that absence was a vulnerability.

So what does this mean for a brute-force attack? It implies that the UID must be invoked for each user PIN try. However, that's the UID -- not the phone's firmware. If you open the iPhone case, then the A6 chip's packaging, the UID can be accessed by connecting to it directly. It will involve some precision laser-based drilling and an electron microscope to see what you are doing, so it certainly is not easy -- let's say it will cost a few thousand dollars, because that's the same kind of thing that is done (routinely!) by people who clone and resell satellite-TV access smart cards. Once connected, an external system can submit all possible user PINs for the UID to encrypt, and collect the corresponding Kz keys (in my terminology above). The rest is then done offline with a PC and a copy of the Flash storage. At no point is the phone's firmware invoked.
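In code terms, the search is trivial once the UID can be queried directly. Here the UID query is simulated with uid_derive_kz() from the sketch above, and a verifier hash (again my assumption) tells us when we have hit the right PIN.

    from itertools import product
    from hashlib import sha256

    def offline_brute_force(ku, wrapped_kd, kd_verifier):
        """Try all 10,000 four-digit PINs. In the real attack, each
        uid_derive_kz() call is a query to the physically-exposed UID;
        everything else runs on a PC with a dump of the Flash."""
        for digits in product("0123456789", repeat=4):
            pin = "".join(digits)
            kz = uid_derive_kz(ku, pin)
            kd = unwrap_kd(kz, wrapped_kd)
            if sha256(kd).digest() == kd_verifier:
                return pin, kd            # found it; firmware never ran
        return None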

What the FBI currently asks for is an automation of the process. They don't want to do precision drilling with lasers. They want to be able to plug something into the iPhone's port without even having to open the case, so that the brute-forcing is done by the iPhone's CPU itself and the whole process runs smoothly.

Thus, it is really not about the San Bernardino case. The FBI does not want a one-shot intervention from Apple; what they ask for is a tool that will be usable repeatedly on many phones. Apple is right in claiming that what the FBI asks for exceeds the specific case that serves as an emotional pretext.

On the other hand, Apple could produce a firmware update that does what the FBI asks, but only on that specific iPhone (identified through, for instance, the CPU serial number). And that firmware update would be specific to the 5C, and would not work on later models. There is no inevitability in a new Apple firmware leading to a generic cracking tool for all phones of all models. But even if Apple complies with a firmware that is specific to a single iPhone, the legal precedent will have been established, and Apple will find it hard to refuse other requests, from the FBI or from other countries where Apple has business interests (i.e. all of them).
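Such a single-phone firmware could be as simple as a guard clause around the unlock path, running the same exhaustive loop as above, only on the phone's own CPU. The serial number below is a placeholder, not the real one.

    TARGET_SERIAL = "EXAMPLE-SERIAL-ONLY"   # placeholder, not the real device

    def gated_unlock_helper(device_serial: str, state: dict):
        """Hypothetical firmware behaviour: enable FBI-style brute-forcing
        on exactly one device; behave like stock firmware everywhere else."""
        if device_serial != TARGET_SERIAL:
            return None                     # stock behaviour on other phones
        return offline_brute_force(state["ku"], state["wrapped_kd"],
                                   state["kd_verifier"])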


A system that would ensure protection against user PIN cracking would need a tamper-resistant device that not only enforces the PIN failure counter and key erasure, but also runs a firmware that is not upgradable. The Secure Enclave has its own firmware, but it can be upgraded (firmware upgrades are signed by Apple, and the Secure Enclave hardware verifies the signature). So even on an iPhone 6, Apple retains the ability to unlock arbitrary phones.
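The upgrade path is the loophole. Schematically, the enclave accepts any image carrying a valid vendor signature, so whoever holds the signing key keeps control of the policy. The key type below (Ed25519) is my choice for the sketch, not Apple's actual scheme.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Placeholder keypair; in reality only Apple holds the private half.
    _vendor_private = Ed25519PrivateKey.generate()
    VENDOR_PUBKEY = _vendor_private.public_key()

    def enclave_accepts(image: bytes, signature: bytes) -> bool:
        """The enclave flashes and runs any image whose signature checks
        out -- including one that weakens its own PIN-retry policy."""
        try:
            VENDOR_PUBKEY.verify(signature, image)
            return True
        except InvalidSignature:
            return False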


Please refer to Thomas Pornin's answer. Apparently, they don't even need Apple's help for this. In my opinion, they're trying to create a legal precedent.

My question is: if the cracking takes seven minutes, why not just release the update, wait ten or so minutes (coordinating with the FBI on this), and then release another update rolling back the change?

Of course, Apple can do this if they want to. If we look at this from a different perspective, and assume that the FBI needs Apple's help in order to decrypt this phone, we might find a lot of problems with the request:


The FBI likely won't let Apple do the cracking themselves

The FBI will need to have full access to the device because of forensic guidelines and regulations, which must be defensible in court. Because of this, the FBI isn't likely to allow Apple to do the cracking themselves... and even if they did, the device would have to be in the hands of the FBI at some point.

Because of data integrity requirements under US law, they can't simply "roll back" the operating system changes. Everything has to be intact, or it could easily be disputed in court when they find anyone connected to those terrorists and attempt to charge them.

In fact, Apple seems to be terrified of creating such a thing:

"...something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone."

If Tim Cook doesn't even trust his own company to keep something like this secret, why on earth would he trust the government, which has a proven track record of failure in this area?


I feel that the White House's willingness to compromise is deceptive

Refer to this article:

The Obama administration told a magistrate judge Friday it would be willing to allow Apple to retain possession of and later destroy specialized software it was ordered to create to help federal authorities hack into the encrypted iPhone belonging to Syed Rizwan Farook.

"Apple may maintain custody of the software, destroy it after its purpose under the order has been served, refuse to disseminate it outside of Apple and make clear to the world that it does not apply to other devices or users without lawful court orders,"

This seems like a red herring. Who cares if an Apple engineer is there to provide assistance? Once it's in the hands of the feds, it will probably be copied quickly, reverse-engineered, and possibly provided to law enforcement for use in decrypting anyone's iPhone for any reason. The police are already able to copy phone data during traffic stops.

We don't know the details of how this is going to work, but once the device is unlocked, they may image the entire device, which would include the hacked operating system.

Although the judge instructed Apple to create the software for the FBI, she said it could be loaded onto the phone at an Apple facility.

Here's where it gets weirder: how are they going to use specialized brute-forcing hardware at an Apple facility? Are they going to bring in a massive, portable GPU farm? With all that equipment, if the FBI is copying the contents (including the operating system), it would be easy for Apple to overlook it.

Who cares if Apple destroys it in the end if it's already been copied? Ask yourself this question: if you were in the intelligence business, would you turn down the opportunity to make copies? I sure as heck wouldn't, and I doubt any federal agency is going to turn down such an incredibly valuable intelligence-gathering tool. It's very possible the changes will be copied and reverse-engineered.


This could eventually create a huge privacy risk for everyone

Here's why this is bad: if the past is any indication, the US government is one of the worst keepers of secrets in the world. There have been a plethora of leaks involving the OPM, FBI, DHS, NSA, the Pentagon, the Director of the CIA, contractors, etc. In some cases, sixteen-year-old basement dwellers have pwned their so-called defenses.

The US government often relies on dinosaur cyber-infrastructure and, of course, security through obscurity. If the government decides to create a copy, then security through obscurity will not protect this secret operating system for long. It's simply far too valuable.

If the government can't stop a bunch of teenage hackers, how can we trust them to keep something like this out of the hands of bad guys who are infinitely more capable and subtle? You might be asking yourself how this is relevant... Well, once it's out there, someone is going to steal it, and all iPhone users will be at risk. It'll probably end up on various trackers eventually.


Could this be part of the government's war against encryption?

At the end of the day, it seems to me like they're trying to create a legal precedent and get the uneducated masses to rally against encryption, so they can drum up support for anti-encryption laws. They seem to be using the guise of terrorism to scare people into being afraid of the Big Bad Encryption Boogeyman, and to legally allow all sorts of backdoors, most of which will be exploited and end up causing a far greater national security and privacy risk in the long run.


Apple's Side of the Story

It looks as though, as of February 25th, 2016, Apple used many of the points I've mentioned in this answer, and more, to defend themselves in court. You can read the full document here.