This week, a federal judge ordered Apple to assist the FBI in unlocking the iPhone of Syed Farook, one of the shooters who opened fire at a party in San Bernardino last December. The court order calls for a narrow tool that would only work on Farook’s iPhone, but Apple CEO Tim Cook declined on Tuesday in a strongly worded letter, arguing that such a narrow tool doesn’t exist and would be “too dangerous to create,” essentially opening a back door into every iPhone. Cook makes a very strong case that Apple shouldn’t be forced to hack the phone, but is the company bullshitting a little bit when it says opening one phone is equivalent to opening them all?
What Apple is specifically being asked to do is disable an iPhone security feature that erases the phone after ten incorrect password guesses. That would allow the FBI to “brute-force” its way into the phone by entering passwords at a rate of about 12 per second until it finds the right one.
According to Cook, Apple can’t simply turn off the feature. The company hasn’t written code that would bypass the password safeguard, because that code would be extremely dangerous in the wrong hands. Turned over to the FBI, it would allow the agency to unlock any iPhone it could physically seize.
“While the government may argue that its use would be limited to this case, there is no way to guarantee such control,” Cook wrote.
But the court order suggests a way to bypass the password limit on Farook’s phone only: by creating “a signed iPhone software file” that “will be coded by Apple with a unique identifier of the phone so that the [software] would only load and execute on the SUBJECT DEVICE.”
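The idea in the order can be illustrated with a toy sketch. The names here (`target_ecid`, `FirmwareImage`) are hypothetical and this is nothing like Apple’s actual boot code; it only shows the shape of an image locked to one device’s unique identifier:

```python
# Toy illustration of a device-locked software image: the image embeds a
# unique hardware identifier, and loading is refused on any device whose
# identifier doesn't match.

from dataclasses import dataclass

@dataclass
class FirmwareImage:
    payload: bytes
    target_ecid: str  # unique identifier of the one phone it may run on

def load_firmware(image: FirmwareImage, device_ecid: str) -> bytes:
    """Return the payload only if this image was built for this exact device."""
    if image.target_ecid != device_ecid:
        raise PermissionError("image not built for this device; refusing to load")
    return image.payload

image = FirmwareImage(payload=b"passcode-limit-bypass", target_ecid="SUBJECT-DEVICE")
load_firmware(image, "SUBJECT-DEVICE")    # loads
# load_firmware(image, "ANY-OTHER-PHONE") # raises PermissionError
```

The court’s theory is that a check like this, baked into the signed image, confines the tool to the one seized phone.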
A very limited hack, only workable on one phone, is certainly nice to talk about, but is it actually within Apple’s technical capabilities?
According to Dan Guido at the Trail of Bits Blog, it is. Guido writes that Farook’s iPhone 5C doesn’t have the modern security measures Apple introduced along with TouchID — the feature that lets you unlock your phone with a fingerprint. Newer phones have a “Secure Enclave,” a separate computer that handles password security; installing a custom version of the operating system wouldn’t get around that.
But in the 5C, the operating system handles all the password functions, and the operating system could theoretically be replaced with a firmware update, even on a locked phone. In fact, this is something Apple allegedly did for law enforcement around 2012, back when the iPhone’s operating system was less secure and Edward Snowden hadn’t yet turned government surveillance into a hot political issue.
So, to Guido, it’s technically possible to do what the FBI wants here:
“On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry,” he writes. “In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI.”
Forensic scientist Jonathan Zdziarski agrees it should be doable “on a technical level,” noting that Apple has firmware signing capabilities for all its devices, making Apple employees the only people who can get a locked phone to accept new software.
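The gatekeeping Zdziarski describes rests on signature checks: a locked phone accepts new firmware only if its signature verifies against a key that Apple alone controls. Here is a toy sketch of that check, using an HMAC with a shared secret as a stand-in for the real public-key signature scheme Apple uses:

```python
import hashlib
import hmac

# Stand-in for Apple's signing key. In reality Apple holds a private key
# and devices verify images against the corresponding public key; the
# HMAC here is just a simple way to show the accept/reject logic.
SIGNING_KEY = b"held-only-by-the-vendor"

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """What only the key holder can do: produce a valid signature."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes, key: bytes) -> bool:
    """What the locked phone does before installing any new software."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"custom-ios-build"
good_sig = sign_firmware(image, SIGNING_KEY)
print(device_accepts(image, good_sig, SIGNING_KEY))       # True
print(device_accepts(image, b"forged-signature", SIGNING_KEY))  # False
```

This is why the FBI cannot simply write the bypass itself: without Apple’s signing key, no image it produces would pass the phone’s check.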
It’s also possible that whatever method Apple uses won’t work on the latest phones — the 5S and up — with the Secure Enclave. Can Apple modify that Secure Enclave? Probably, according to John Kelley, an InfoSec expert at Square. It just doesn’t appear that the company would have any incentive to do so when it’s only being asked to help break into a 5C.
But no matter what measures Apple takes to make sure the custom software only works on Farook’s phone, the company will still have built a back door into iOS devices. Helping the FBI in this case will confirm that disabling the iPhone’s limit on password guesses is possible and create the potential that it could be repeated by malicious hackers (or, more likely, that the government could order Apple to repeat the process on other devices in evidence).
Those are just the consequences of installing new software on the phone. There’s also the question of how the FBI can access the device after that happens. The two options laid out in the court order both seem risky: Apple can hand the software to the FBI and let the bureau try to crack the password, or Apple can take the phone to one of its own facilities and let the FBI try to crack it remotely.
Even if the FBI never handles the firmware file itself, and lets Apple do everything except the brute-force password entry, the company would be forced into creating a system that could be used again.
“Part of the court order also instructed Apple to essentially design a system by which PINs could be sent electronically to the device, allowing for rapid brute forcing while still giving Apple plausible deniability that they hacked a customer device in a literal sense,” as Zdziarski wrote.
Cook seems to share all of these concerns, but without conceding that Apple could theoretically build a hack that works on this phone only. In his open letter, he wrote, “[M]ake no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
There may be a way to limit its use, but that doesn’t undermine Cook’s larger point: that if Apple doesn’t take a stand in this case, it could be forced to build “limited” back doors again and again, targeting any number of devices.
“No reasonable person,” Cook wrote, “would find that acceptable.”