iPhone vs. Trump: How Technology Companies Can Protect Both Customers and National Security

January 19, 2020

This very powerful encryption leaves the government in the dark. But perhaps there is a solution that satisfies cellphone customers with privacy concerns while ensuring that criminals cannot conceal their conspiracies.

Attorney General William Barr characterized the December attack by a Saudi aviation student, which killed three people at a Florida Navy base, as an act of terrorism. As part of the government’s investigation into the attack, Barr asked Apple Inc. to help unlock two iPhones belonging to the gunman, in the hope that data on the phones would shed light on his radicalization. This is but the latest case in a continuing tug-of-war between tech corporations that sell very high-powered encryption and a government that needs to read the messages of terrorists, drug lords, and human traffickers.

Historical Background

Until the Snowden revelations about the U.S. government’s greatly enhanced capabilities to surveil people, American tech corporations showed limited interest in developing and marketing high-powered encryption software. After these revelations, many customers became highly concerned about their privacy. Some nations, such as Germany, India, and Brazil, considered forging their own networks. American tech companies viewed these developments as highly threatening to their businesses. Apple led the response by marketing new, powerful encryption.

The FBI pointed out that this very powerful encryption leaves the government in the dark. The political atmosphere changed to some extent after the ISIS attacks in Paris that killed 130 people (after which “reading” a phone abandoned by a terrorist helped to catch him and prevent further attacks), and after the government found a phone used by Syed Rizwan Farook, a terrorist who, along with his wife, killed fourteen people in San Bernardino on December 2, 2015. The phone Farook used had the powerful encryption provided by Apple, and the FBI, unable at the time to “read” it, asked Apple to help decrypt it. When Apple demurred, the FBI turned to the courts, which ordered Apple to assist. When Apple still refused, the government moved to compel its compliance.

An intensive public debate followed between the supporters of Apple (major parts of the media, law professors, and public intellectuals) and a smaller number of supporters of the FBI. After initially staying on the sidelines, President Barack Obama stated on March 11, 2016, that never allowing government access to someone’s smartphone would be equivalent to “fetishizing our phones above every other value” and would not “strike the balance that we have lived with for two hundred, three hundred years.” Reliable sources report that ISIS switched to using highly encrypted phones, and one must assume that human traffickers, drug lords, and spies around the world are following suit. In the United States, the strength of Apple’s encryption is leading criminals to switch from burner phones to iPhones. On March 28, 2016, the FBI announced that it had been able to “read” the dead terrorist’s phone, and the case became moot. (The FBI used a key provided by an Israeli company for a million dollars, but the key could be used only in that particular case.) The issue is hot again now that the government faces both another terrorist phone it cannot read and the tech companies’ refusal to provide access.

Legal Justification

The Fourth Amendment does not state that the government may not “search” phones, homes, papers, or persons; it merely bans “unreasonable” searches. By banning only unreasonable searches and seizures, the Fourth Amendment recognizes, on the face of it, a category of reasonable searches, for instance, those that promote public safety.

Moreover, the Constitution provides a mechanism for determining which searches are reasonable: the courts. What the courts consider reasonable changes as conditions change. For instance, after a rash of skyjackings in 1972, the courts deemed legal the newly introduced screening gates in airports, which search millions of travelers each day. These gates ended skyjacking in 1973. The courts, as a rule, do not use the term “common good,” but instead refer to the “public interest.” Although they have given different rationales for authorizing a considerable variety of searches—many, like the airport screenings, conducted even without a warrant—they seem to follow an ethical concept: namely, if the privacy intrusion is small and the gain to the public interest is high, then the search should be allowed.

A review of Supreme Court rulings shows that the Court has a broad understanding of public safety, which allows diverse intrusions into the realm of individual rights to serve this common good. The most basic element of public safety is upholding law and order, and the deterrence and prevention of crime. The second element of public safety relates to preventing accidental death and injury. Thus, the Court has allowed random drug and alcohol testing—without reasonable suspicion—of train engineers in the wake of a series of train accidents, as well as random sobriety checkpoints on highways to prevent deadly car accidents resulting from drunk driving. A third element of public safety is the promotion of public health. Thus, the Court held that the public interest in eradicating smallpox justified compulsory vaccination programs, despite the resulting intrusion on privacy, and held that search warrants for Occupational Safety and Health Act inspections do not require “probable cause in the criminal law sense.” In short, there are ample precedents holding that individual rights can be curbed for the common good, in particular where public safety and national security are concerned, especially if the intrusion is small and the gain to the public interest is significant.

True, not all cases run this way. For example, in United States v. Jones, the Supreme Court overturned the conviction of a drug trafficker, ruling that police could not use a GPS device to surveil suspects without a warrant. In Kyllo v. United States, a case involving marijuana being grown under heat lamps in the petitioner’s garage, the Supreme Court ruled that thermal imaging of a suspect’s residence requires a warrant. Thus, information from a warrantless thermal scan of the residence could not be used to obtain a warrant to search the premises, which is what had occurred. In Arizona v. Hicks, while police were searching an apartment for a shooter, they collected serial numbers of expensive stereo equipment, which they suspected was stolen. Although they turned out to be correct, the Supreme Court ruled that the police violated the Fourth Amendment because they lacked probable cause. The fact remains that the Constitution, especially the Fourth Amendment, far from being a one-way ticket for the protection of individual rights, is also concerned with the common good. Hence, if the courts rule—as they did in the previous terrorist iPhone unlocking case—that security should take precedence, then the government will have a strong case.

In short, from a constitutional viewpoint, it seems Apple and other tech corporations should comply with the Department of Justice’s request to unlock the phones.

Security and Privacy Combined?

Apple argues that if it weakened the encryption software so that the government could surveil phones (i.e., put in a “backdoor”), then many millions of people all across the world would not just lose their privacy but also have their security endangered. This, Apple holds, is because other governments and criminals would come in through the same backdoor. Tim Cook, the CEO of Apple, argued that such a backdoor would enable “bad actors” to bring down power grids, cause people dependent on medical devices to suffer heart attacks, and find the locations of people’s children. Will Cathcart and Stan Chudnovsky, the Facebook executives responsible for WhatsApp and Messenger, recently wrote a letter to Barr in which they also claimed that adding a backdoor to their technology would put users at risk. The letter stated that “[t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

Apple (and other tech corporations) should leave the encryption software as it is—not introduce a vulnerability or a backdoor—and instead develop a key to unlock phones. The corporations would keep this key. Thus, once a court orders that a given phone must be unlocked, the FBI would bring it to Apple (or Google, or whatever tech corporation is involved). That company would then unlock the phones it produced and turn over to the FBI any information found—but not the key. Several artificial intelligence experts thought that although Apple has the technical capability to create such a key, the real issue would be keeping it secure. Steve Bellovin of Columbia University’s department of computer science responded that “a key can be readily available or it can be secure, it can’t be both.” According to Phillip Schrodt, a senior research scientist, “the problem is not the technology, it is people getting careless about how they use the technology.” In response to this claim—echoed by the tech corporations themselves—one notes that Coca-Cola has kept its formula secret for many decades, and that, during the last twenty-five years, leaks of secrets from the FBI have been very rare.
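To make this proposal concrete, here is a minimal sketch of one way such an unlock-key design could work, written in Python with the widely used cryptography library. It uses envelope encryption: the phone’s data is encrypted once under a random data key, and that data key is then wrapped both for the user and for an escrow key that stays with the manufacturer. This is purely illustrative; Apple’s actual key hierarchy is not public, and every name in the sketch (escrow_key, escrow_unlock, and so on) is hypothetical.

```python
# Illustrative sketch of an escrow "unlock key" design.
# NOT Apple's actual architecture. Requires: pip install cryptography
from cryptography.fernet import Fernet

# Manufacturer-held escrow key; in practice it would live under strict
# controls on the company's premises and never be shipped anywhere.
escrow_key = Fernet.generate_key()

def encrypt_device_data(plaintext: bytes, user_key: bytes) -> dict:
    """Encrypt data once, then wrap the data key for user and escrow."""
    data_key = Fernet.generate_key()  # per-device random key
    return {
        "ciphertext": Fernet(data_key).encrypt(plaintext),
        "wrapped_for_user": Fernet(user_key).encrypt(data_key),
        "wrapped_for_escrow": Fernet(escrow_key).encrypt(data_key),
    }

def escrow_unlock(record: dict) -> bytes:
    """Run by the manufacturer, on its premises, after a court order."""
    data_key = Fernet(escrow_key).decrypt(record["wrapped_for_escrow"])
    return Fernet(data_key).decrypt(record["ciphertext"])

# The device encrypts; the manufacturer can later unlock this one
# record without ever handing the escrow key to law enforcement.
user_key = Fernet.generate_key()
record = encrypt_device_data(b"messages and photos", user_key)
assert escrow_unlock(record) == b"messages and photos"
```

The point of the sketch is the division of labor the article proposes: the escrow key never leaves the manufacturer, and what law enforcement receives is the decrypted contents of one court-ordered phone, not a general capability.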

During a meeting on May 11, 2016, at the Council on Foreign Relations (a rare one, on the record), I was surprised to hear Manhattan District Attorney Cyrus Vance, Jr. inform the audience that, until September 2014, his office was able to routinely send phones to Apple; Apple would open them and send back the information within a day or two. (The reason Apple stopped, Vance implied, was that in September 2014 it started advertising that it was the only company that sold phones whose encryption could not be broken. It seems that concern for profits played a key role in Apple’s sudden refusal to cooperate with law enforcement and national security authorities.)