David Opderbeck, law professor
This semester I’m teaching a course on “Cybersecurity and National Security,” so Apple’s dispute with the FBI over the iPhone used by San Bernardino shooter Syed Rizwan Farook is particularly interesting to me. The issue raises a number of important questions about privacy, security, the technology industry, and the separation of powers, none of which, in my opinion, yields easy answers.
The issue arises because of the security features built into the iPhone 5C used by Farook. The phone is actually owned by Farook’s employer, the San Bernardino County Department of Public Health, which has consented to the FBI’s request to search the phone. (Farook was killed in a shootout with police after the attack he carried out with his wife, Tashfeen Malik.)
The problem is that the iPhone, like all iPhone models running iOS 9, is protected by a passcode PIN (the four-digit number the user enters to unlock the home screen), and the data on the phone is encrypted in part by a key derived from the PIN. Ordinarily, a 4-digit PIN would not be hard to defeat using a “brute force” attack: the attacker need only try every combination from 0000 to 9999 in sequence until one works. That is at most 10,000 attempts, which a computer could easily generate. But the iPhone can be configured to wipe its data after 10 failed PIN attempts, which makes a brute-force attack useless: only 10 of the 10,000 possible guesses can ever be tried, roughly a 1-in-1,000 chance of success before the data is destroyed. In other words, if the FBI guessed the PIN and failed 10 times, it would never be able to determine whether there is information on the phone about Farook’s possible connections with ISIS, other terror plots, or other evidence relating to the shooting.
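The mechanics are easy to see in a toy sketch. The Python below uses a stand-in key-derivation function and made-up values (the names, salt, and PIN are all illustrative; real iOS key derivation is entangled with a hardware-bound key and cannot be reproduced off the device). It shows both why an unlimited brute-force attack trivially succeeds and why a 10-attempt cap defeats it:

```python
import hashlib

def derive_key(pin: str, salt: bytes) -> bytes:
    """Toy key derivation from a PIN. Illustrative only: iOS actually
    mixes the passcode with a device-unique hardware key, so guesses
    must be tried on the phone itself."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1000)

SALT = b"example-salt"                  # made-up value for illustration
TARGET_KEY = derive_key("0042", SALT)   # pretend "0042" is the unknown PIN

def brute_force(max_attempts=None):
    """Try every 4-digit PIN in sequence, honoring an optional attempt cap."""
    for attempt, n in enumerate(range(10_000), start=1):
        if max_attempts is not None and attempt > max_attempts:
            return None  # the device would have wiped itself by now
        pin = f"{n:04d}"
        if derive_key(pin, SALT) == TARGET_KEY:
            return pin
    return None

print(brute_force())                 # succeeds: at most 10,000 tries needed
print(brute_force(max_attempts=10))  # None: the 10-attempt wipe defeats it
```

The point of the sketch is the second call: with the wipe feature enabled, the attacker gets only 10 of the 10,000 possible guesses, which is why the FBI wants firmware that removes the cap rather than help guessing the PIN itself.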
It is possible that Apple could design new firmware that would allow the FBI to bypass the 10-attempt wipe feature on Farook’s phone. The U.S. Department of Justice has secured an Order from a Federal Magistrate Judge under the “All Writs Act,” a statute first passed in 1789, requiring Apple to provide “reasonable technical assistance” to enable the search of the phone. But Apple has built much of its brand reputation on security, and its leadership, in particular CEO Tim Cook, believes very much in the value of security and encryption both for commerce and free speech. Apple therefore is unwilling to cooperate with the FBI and plans to challenge the legality of the Order.
Some well-known law enforcement and political figures have criticized Apple, while many public interest groups have criticized the government. Most of my colleagues who teach Internet and Privacy law side with Apple. There is, of course, a larger political and legal backdrop to this case, which includes the issue of government Internet surveillance and whether device and software manufacturers who enable encryption should be required to build in a backdoor or key that the government could use to break the encryption in the interests of national security or law enforcement.
These are important concerns. Personally, I am also concerned (as are many of my colleagues) about the use of the All Writs Act in this case. While the All Writs Act has been used before in somewhat analogous cases, my sense is that a question with significant policy implications should be vetted through a more robust process, and ultimately should be something addressed by Congress and not by a single Judge.
At the same time, the scope of what Apple is actually being asked to do is not entirely clear to me. Apple claims that it is being asked to create code that, if it were released into the wild, could be used by criminal hackers or rogue nation-states to compromise the security of every iPhone. Some technologists dispute that claim. If it really is a general purpose backdoor or key, the broader policy concerns are far more acute. If it really is a precisely tailored pick for this one and only this one lock, Apple’s refusal to cooperate in a mass-murder / terrorism investigation seems to me harder to justify. In either event, this case is highlighting an issue that we will be wrestling with for some time: how do we balance the legitimate desire for encryption and other technologies that facilitate commerce and free speech with national security and public safety?
What do you think?