Since news broke last week that the FBI had obtained a court order compelling Apple to help unlock an iPhone used by one of the San Bernardino shooters, many engineers, coders, futurists, cryptographers, pundits, and journalists have been asking whether the FBI’s request is even feasible.
If you are a technologist, or have any interest in computer science, the technical specifics of the case are fascinating. The FBI’s specific request — not an actual key, but a tool that would disable the passcode safeguards so agents can brute-force their way in themselves — is a very smart move. They’re asking for a hint, not the answer itself.
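The arithmetic explains why that hint is enough: a four-digit passcode has only 10,000 combinations, so once the retry delays and auto-erase protections are out of the way, guessing every code is trivial. A minimal sketch of the idea — the `try_passcode` callback here is a hypothetical stand-in for the device check, not an Apple API:

```python
from itertools import product

def brute_force_pin(try_passcode, digits=4):
    """Try every numeric passcode of the given length until one unlocks.

    try_passcode: a callable that returns True when the guess is correct
    (hypothetical stand-in for whatever the unlocking tool would call).
    """
    for attempt in product("0123456789", repeat=digits):
        pin = "".join(attempt)
        if try_passcode(pin):
            return pin
    return None

# Toy "device" whose secret passcode is 7294 — found in at most
# 10,000 attempts, which is why the retry limit is the real defense.
secret = "7294"
print(brute_force_pin(lambda pin: pin == secret))  # 7294
```

The point of the sketch is that the cryptography never has to be broken; removing the attempt limits is the whole attack.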
This has many implications for the tech industry, but those implications are, for the most part, inside baseball: the types of encryption used, the firewalls and sandboxes in place, the safeguards that prevent unauthorized access. It is a puzzle of both ethics and engineering.
But do the tech specs really matter here? Wall Street Journal writer Chris Mims asked a related question when he wondered whether legal experts were better qualified to comment on this case than technology experts.
He’s probably right! Apple’s opposition to the FBI’s case has never really been about technical feasibility. Tim Cook stated as much this week when he wrote in a company-wide email, “it is certainly possible to create an entirely new operating system to undermine our security features as the government wants.”
The most important question that will be answered at the end of this showdown isn’t about the way the phone works (or doesn’t work) at all. It’s about the legal precedent that could be set if a tech company is legally compelled to assist law enforcement (companies already do this to a certain extent, regulating themselves so that the government doesn’t have to). The FBI says the tool would only apply to this one phone. That’s doubtful; multiple law enforcement agencies have stated publicly that they hold devices in evidence that they’d like Apple to unlock. A win for the FBI would open the door for it and other law enforcement agencies to use similar legal strategies for any device that they want to investigate.
Maybe most important, consider that while tech iterates rapidly and specifically, the law does not — and therein lies the vulnerability. Part of the reason Apple added stronger encryption in iOS 8 was so that it would not be able to recover data for law enforcement. Apple is reportedly already working on hardening its next hardware and software releases so that the FBI cannot make the exact same legal request for other devices. But there will be other legal requests, written in language making demands about the particular software and hardware in whatever new phones Apple designs. The law is not selective. We can build a backdoor that only works for one kind of phone, but, as noted cryptologist Bruce Schneier wrote last week, “We cannot build a backdoor that only works for a particular type of government, or only in the presence of a particular court order.”
There is a very good chance that this fight will go to the Supreme Court and will dictate the level to which tech companies must cooperate with law enforcement. Understanding how an iPhone’s internal security works can help you understand this case, but it’s not essential — and will be irrelevant a few years from now, when new iPhone models have been designed and produced. Again, the legal specifics of the case are more important here than what type of chip the iPhone has inside.
The FBI is asking Apple to create a tool, to be used only on Apple’s campus, that bypasses the passcode protections so data can be extracted. The FBI is not asking Apple to hand over the source code; it is asking for the tool to be built and used. That doesn’t seem particularly invasive or arduous, but once that request is acceded to even once, it sets a precedent.
The big question that remains, then, is: Do technical safeguards matter if they can always be lawfully broken down? The answer is no.