Technology vs. Policy: How Legally Mandated Backdoors Compromise Security

Mandated access offers some benefits from the perspective of law enforcement and policy makers, but it also carries serious costs for the security of everyone else.

The increasing demand for surveillance-proof computing has led to more advanced forms of encryption. Most notably, in 2014 Apple released an operating system with device encryption that Apple itself cannot bypass: even with a lawful warrant, the company lacks the technical capacity to unlock a password-protected device. Google announced similar encryption plans for Android the next day. Pre-existing systems that use whole-disk or end-to-end encryption are also rising in popularity.

Law enforcement broadly, and the Justice Department specifically, were not pleased with this development in technology. In an effort spearheaded by FBI Director James Comey and Deputy Attorney General Sally Quillian Yates, the U.S. government is trying to expand its capacity to compel tech giants like Apple and Google to build a so-called “backdoor” into their encrypted devices.

A question of perspective: Technology versus policy

Naturally, the DoJ’s argument raises concerns among a wide array of policy makers and privacy advocates. The notion that we would compromise security, potentially across the entire population of internet users, in order to marginally increase the effectiveness of law enforcement does not sit well with some audiences. The broad consensus among technology experts is that any capacity for government access to otherwise fully encrypted systems would weaken those systems.

Most notably, a group of computer-security experts published a report through MIT, “Keys Under Doormats,” objecting to such mandatory government access. They argued that “the complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws.” From a technological standpoint, their case is hard to refute: the proposed backdoor capability does introduce a new vulnerability into modern encryption practice.
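To see why experts treat a backdoor as a structural weakness rather than a minor one, consider a key-escrow design, one common shape such proposals take. The sketch below is a deliberately toy illustration (a one-time pad stands in for a real cipher, and all names are hypothetical, not drawn from any actual proposal): each message gets a fresh session key, but a copy of that key is also wrapped under a single escrow key held for law enforcement. Whoever obtains the escrow key, lawfully or otherwise, can read every message ever wrapped under it.

```python
# Toy sketch, NOT real cryptography: a one-time pad stands in for a
# strong cipher to show how an escrowed "backdoor" key concentrates risk.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the matching key byte (toy cipher)."""
    return bytes(d ^ k for d, k in zip(data, key))

def encrypt_with_escrow(plaintext: bytes, escrow_key: bytes):
    """Encrypt under a fresh session key, but also wrap that session key
    under the law-enforcement escrow key (the 'backdoor')."""
    session_key = secrets.token_bytes(len(plaintext))
    ciphertext = xor_bytes(plaintext, session_key)
    # The wrapped key travels with the ciphertext, so any party holding
    # escrow_key can recover session_key without the user's cooperation.
    wrapped_key = xor_bytes(session_key, escrow_key[:len(session_key)])
    return ciphertext, wrapped_key

def backdoor_decrypt(ciphertext: bytes, wrapped_key: bytes,
                     escrow_key: bytes) -> bytes:
    """Anyone who obtains escrow_key -- lawfully or not -- can read
    EVERY message ever wrapped under it."""
    session_key = xor_bytes(wrapped_key, escrow_key[:len(wrapped_key)])
    return xor_bytes(ciphertext, session_key)
```

The design choice the experts object to is visible in the last function: the escrow key is a single point of failure whose compromise silently defeats the encryption for all users at once, which is exactly the kind of "unanticipated, hard to detect security flaw" the report warns about.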

However, from a policy standpoint, others have argued that the risk to security may be worth the gains for law enforcement. A common argument appeals to emotionally fraught stories of criminals who evade prosecution because the necessary evidence sits, unobtainable, behind a wall of technological protection. Pragmatic analysts counter that criminals and other nefarious actors will simply find other mechanisms to encrypt their data.

On a more nuanced level, analysts have pointed out that even encrypted systems are likely to transmit their data to an outside system. Think, for example, of an iCloud backup: an iPhone may be secure even from Apple itself, but if its files, photos, contacts, and messages are backed up to iCloud, Apple does retain the technical capacity to access that information (which, as a matter of policy, it does not share absent legal process). This theoretical vulnerability has yet to lead us to disaster, so why should we assume that similar access to the phone itself would prove a significantly greater threat?
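The distinction at work here is who holds the key. The sketch below is again a toy illustration, not Apple's actual design (the function names and parameters are invented for the example): on the device, the key is derived from the user's passcode, so the vendor never possesses it; in the cloud backup, the provider encrypts with a key it holds itself, and so can decrypt when legally compelled.

```python
# Toy sketch, NOT real cryptography and not Apple's design: contrasting
# who holds the key on-device versus in a cloud backup.
import hashlib
import secrets

def derive_device_key(passcode: str, salt: bytes) -> bytes:
    """Key derived from the user's passcode; the vendor never learns it."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Repeating-key XOR -- insecure, used only to illustrate key custody."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

# On-device: only someone who knows the passcode can derive the key.
salt = secrets.token_bytes(16)
device_key = derive_device_key("123456", salt)
on_device = xor_bytes(b"private note", device_key)

# Cloud backup: the provider encrypts with a key IT holds, so it can
# decrypt in response to a warrant -- no passcode required.
provider_key = secrets.token_bytes(32)
backup = xor_bytes(b"private note", provider_key)
restored = xor_bytes(backup, provider_key)  # provider-side decryption
```

In this framing, the analysts' question is why provider-held backup keys have been tolerable in practice while a provider-reachable device key is assumed to be catastrophic.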

An uncomfortable policy