Australia’s Encryption Law Deals a Serious Blow to Privacy and Security

An Australian flag is flown at half mast during the dawn service to mark the ANZAC (Australian and New Zealand Army Corps) commemoration ceremony at the Australian National Memorial in Villers-Bretonneux, France, April 25, 2016. REUTERS/Pascal Rossignol
December 19, 2018 | Topic: Security | Region: Asia | Tags: Australia, Encryption, Backdoor, Cyber Security, Privacy

A backdoor wouldn't just be used by the government; it would be exploited by criminals and foreign actors.

The Australian government has compromised the digital privacy and security of countless Australians, ironically, in the name of protecting national security.

Home Affairs Minister Peter Dutton has repeatedly cast digital encryption, which many websites and apps employ to secure user data, as an ominous roadblock standing between intelligence officers and the transnational crime syndicates and pedophiles they pursue.

His efforts worked. After Dutton pilloried large tech companies for their supposed recalcitrance in working with governments to decrypt and hand over user data, Parliament recently turned that ire into law. The new law allows the government to request or coerce any communications service with an end-user in Australia to build tools that would weaken encryption protocols.

One author of this editorial (Rizer) is a former law enforcement officer and member of the intelligence community in the United States, and is well aware of the challenges of modern intelligence gathering. But with this law, the Australian government is working against its own aims: protecting national security and cracking down on criminal conduct.

Perhaps the most glaring flaw in the law is that, by encouraging or compelling companies or individuals to create workarounds to their security protocols, the state puts the privacy and digital security of innumerable Australians (and any nonresidents with whom they may be communicating) at risk. It’s certainly true that criminals may operate on apps and websites that use encryption tools, but so do journalists, hospitals and people shopping, managing their finances or otherwise communicating online. The proposed encryption “backdoor” the government would use to track criminals could be used by criminals—or worse, national-security threats—to exploit citizens and other national interests.

The law’s proponents respond that the notices it authorizes cannot force communications providers (which could include tech giants like Facebook and Twitter as well as any amateur blog or journalism website) to introduce “systemic vulnerabilities” into their systems. Instead, law enforcement can only compel firms to weaken encryption for “particular device(s).” This offers little assurance. If a code or key used to decrypt a particular device escapes the provider’s or the government’s control for any reason, that device is at substantial risk.

This risk multiplies if the provider decides to use a decryption key that can access multiple devices. Building decryption tools is costly and time-consuming, as Riana Pfefferkorn, a cryptography fellow at Stanford’s Center for Internet and Society, has pointed out. Pfefferkorn also noted that if a company believes the government will serve multiple notices, it’s more efficient to develop a single backdoor that can be stored and reused across devices. Should law enforcement or the company lose control of that key, a widespread vulnerability is all but guaranteed. All of these concerns reinforce the well-established point that there is no such thing as a “secure backdoor.”
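To see why that matters in practice, consider a minimal sketch, written in Python, of a hypothetical key-escrow arrangement of the sort described above. The escrow key, device names and messages are invented for illustration; nothing here reflects any provider’s actual design or anything the law specifically prescribes. The point is simply that one reusable key capable of unwrapping many per-device keys turns a supposedly targeted capability into a systemic exposure.

```python
# Illustrative sketch only (a hypothetical design, not any provider's real system):
# each device has its own data key, but every device key is wrapped under one
# escrowed "access" key so any device named in a notice can be unlocked later.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()   # the single reusable capability
escrow = Fernet(escrow_key)

devices = {}
for device_id in ("alice-phone", "bob-laptop", "clinic-tablet"):
    device_key = Fernet.generate_key()                     # per-device key
    ciphertext = Fernet(device_key).encrypt(
        b"private message stored on " + device_id.encode()
    )
    wrapped_key = escrow.encrypt(device_key)               # wrapped under escrow
    devices[device_id] = (wrapped_key, ciphertext)

# Whoever obtains escrow_key (via a breach, a leak or an insider) can now read
# every device, not only the one a notice targeted.
for device_id, (wrapped_key, ciphertext) in devices.items():
    recovered_key = escrow.decrypt(wrapped_key)
    print(device_id, Fernet(recovered_key).decrypt(ciphertext).decode())
```

Had each device key stayed only on the device, losing or stealing one key would expose one device. It is the stored, reusable key that turns a single breach into a breach of everyone.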

What’s more, the threshold for “systemic” is entirely unclear. Authorities could compel a range of activities under the law that introduce significant vulnerabilities into a system’s security without the government considering them “systemic.” Worst of all, if a communications provider believes that a government request would introduce a systemic vulnerability, there is no process for the provider to challenge the decision of the government-appointed assessor.

A final consideration regarding this new threat to digital security: the law of unintended consequences dictates that government interventions often create undesirable ripple effects that run contrary to the intervention’s original objective, and that risk grows with the scale of the intervention. One likely downstream consequence of this policy is that companies and services will decline to develop tools that improve overall digital security in order to avoid interference from Australian law enforcement. This “chilling effect” on Australian security research and development will likely do more harm than any good that may come from catching some criminals with greater ease.

Of course, it’s that last phrase that makes up the lion’s share of the government’s argument. Proponents cite the statistic that encryption impacts “at least nine out of every ten” of the Australian Security Intelligence Organisation’s (ASIO) priority cases. As scary as that figure sounds, they offer no evidence of how often encryption actually impedes investigations. The encrypted data in question could be immaterial to an investigation, or it could be information that an agency could acquire through other means. The Department of Home Affairs can already decrypt information with special techniques, and intelligence agencies can access data at the start or end points where it is not encrypted.

Experts worry that other countries may model their own legislation on this extreme approach. But policymakers abroad should remember that it isn’t popular: the anti-encryption law drew just one public comment in support and 342 against. Fortunately, alternatives exist. Access Now, a digital-rights group, has compiled a list of effective solutions that don’t involve massive government overreach or weakened encryption standards.

Law enforcement and intelligence agencies need to be brought up to speed on technological developments in order to keep pace with society and those who threaten its security. But one thing is certain: broad mandates that weaken digital security more than they protect it are a bad first step.

Jon Haggerty (@JHaggrid) is the justice and civil liberties policy manager at the R Street Institute.

Arthur Rizer (@ArthurRizer) is R Street’s justice and civil liberties policy director, a former federal prosecutor and police officer and a retired U.S. Army military police officer.

Image: Reuters