In the wake of last month’s shooting at a Pensacola, Florida, naval base, Attorney General William Barr is putting pressure on Apple to help FBI investigators unlock two of the shooter’s iPhones. Followers of these issues will recall a similar pressure campaign in 2016 to force Apple to decrypt the San Bernardino, California, shooter’s iPhone. In that case, the FBI ultimately hired an external company to break the encryption, at a cost of over $1 million.
One might think that the FBI’s current efforts mean that iPhone encryption has advanced such that only Apple has the capability to unlock the shooter’s iPhones. In fact, depending on the exact model of the Pensacola shooter’s phone, the FBI could pay as little as $15,000 to reach the data locked inside. And if commercially available solutions don’t work, it is likely that not even Apple can unlock the phone without its passcode.
So why this latest campaign to force Apple to undo its own encryption? One possibility is that the FBI is unaware of these new tools — unlikely, given that they’ve been developed by the company believed to have been hired to hack the San Bernardino shooter’s iPhone. If commercially available solutions are not working, another possibility is that the FBI simply doesn’t understand (or won’t accept) Apple’s inability to help. However, more likely is that this is part of Attorney General Barr’s larger anti-encryption push.
Barr has taken a hard line against strong encryption, noting last July that “while [the Department of Justice] remain[s] open to a cooperative approach, the time to achieve that may be limited.” In a press conference last week, he accused Apple of refusing to provide “any substantive assistance.” For its part, Apple says it has provided the FBI “many gigabytes” of “iCloud backups, account information, and transactional data.”
Nor is this cooperation new: Apple has complied with nearly 130,000 law enforcement requests for data stored on its own servers in the past seven years. Apple also refrained from instituting a policy of encrypting device backups at the FBI’s request. At the same time, Apple is defending its position that “there is no such thing as a backdoor just for the good guys.”
It is unclear where things will go from here. The FBI ultimately ended its earlier efforts to force Apple to create a backdoor once it was able to unlock the shooter’s iPhone through other means, but Barr’s more aggressive stance could lead to an extended legal battle. With President Trump now putting pressure on Apple to assist the FBI, this path could become more appealing, though pursuing it would likely require the FBI to forgo available decryption tools, thereby stalling the investigation into a terror attack on an American military base.
If the FBI succeeds in such an effort, it would set a dangerous precedent. Phones are not the only things law enforcement would want to decrypt: With forced cellphone decryption in its toolkit, decryption of personal messages and internet traffic could soon follow. Over time, the volume of law enforcement requests for decryption would almost certainly push tech companies to establish backdoors, with all of the fallout they entail.
Another possibility is that the FBI uses existing decryption technology to unlock the Pensacola shooter’s phone but an effort is made to pass legislation that would require tech companies to either cooperate in such cases or preemptively install backdoors in their encryption for law enforcement use. Barr alluded to such an effort in his anti-encryption speech in July 2019 and again in his remarks last week. While the ongoing techlash makes punitive legislation more likely, increased skepticism of the FBI could put a dent in these efforts. Heightened privacy worries could also make lawmakers reluctant to weaken consumers’ cybersecurity.
Hopefully, the desire to quickly resolve the investigation and inertia preventing the assembly of a strong anti-encryption coalition will prevail, and Apple will not be forced to weaken the encryption of its products. Weakened cybersecurity — either through the precedent of forcing a company to hack its own device or legislation mandating backdoors in encrypted products — may make some investigations easier for law enforcement. But in the 30 or so years encryption has been available to consumers, nothing has changed technologically that would prevent backdoors from posing a major security threat to any product that contained one. Mandatory backdoors would make every American more vulnerable to cybercrime. In this case, the cure is worse than the disease.
This article by Will Baird first appeared at the American Enterprise Institute.