Opinion: Encryption Back Doors are Dumb.

CyAN recently opposed the now-dead EU “Chat Control” regulatory proposal, as well as the UK’s push for Apple to remove Advanced Data Protection (ADP) for UK iCloud users – the latter having unfortunately led to Apple withdrawing end-to-end iCloud encryption for its British customers.

Now, two more European countries are on the verge of adopting ill-considered mandates for encryption back doors: France, through an amendment to its “Narcotrafic” law that would enable access to messages between suspected drug traffickers, and Sweden, through a proposal to mandate law enforcement access to encrypted messaging tools – the Signal Foundation has already indicated that it would rather leave Sweden than comply. CyAN has just published a position statement opposing both.

Encryption is a tool. Like cash, cars, or firearms, it can serve good and nefarious purposes alike. Its benefits are well documented – trust, privacy, anonymity, safety, resilience, and security, for citizens and societies alike. These benefits include making voting and business safe, protecting dissidents and members of groups that are often threatened, such as people who identify as LGBTQ, and more.

At the same time, encryption is used by fraudsters to lock up victims’ computers with ransomware. Terrorists may use it to keep intelligence agencies and law enforcement from intercepting their messages. The same goes for child sexual abuse material, drug transactions, and more. And as with the three examples named above, it is up to society to democratically decide on an appropriate balance between “good” and “bad” uses of a tool, and if, when, and how to limit, or even ban, that tool.

Strong end-to-end encryption’s benefits vastly outweigh the downsides to society from possible abuses. Unfortunately, both law enforcement and intelligence agencies have tried many times, with varying degrees of success, to legislatively mandate “back doors” into encryption systems when they were unable to otherwise access protected data. The past decades have seen multiple misguided moves to mandate third-party access to encryption systems – usually presented under the guise of fighting abuses such as child sexual abuse material, drugs, terrorism, or financial crime – from 1993’s Clipper Chip in the US to the current slate of initiatives listed earlier. Methods of gaining such access include key escrow, mandatory additional decryption keys, or legal requirements for service providers to implement mechanisms that give authorized third parties access to their customers’ data, to name a few.
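To make concrete what “key escrow” or a mandated “additional decryption key” looks like in principle, here is a minimal sketch using the PyNaCl library; the names and structure are illustrative assumptions, not a description of any specific mandated or proposed system.

```python
# Toy sketch of an "additional decryption key" / key-escrow design (PyNaCl).
# Install with: pip install pynacl
import nacl.utils
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox

recipient_key = PrivateKey.generate()   # the intended reader's key pair
escrow_key = PrivateKey.generate()      # the legally mandated "back door" key

def send_message(plaintext: bytes) -> dict:
    """Encrypt a message, wrapping its symmetric key for BOTH key holders."""
    message_key = nacl.utils.random(SecretBox.KEY_SIZE)
    return {
        "ciphertext": SecretBox(message_key).encrypt(plaintext),
        # One copy of the message key for the intended recipient...
        "key_for_recipient": SealedBox(recipient_key.public_key).encrypt(message_key),
        # ...and one for the escrow authority. Whoever obtains the escrow
        # private key - lawfully or not - can read every message ever sent.
        "key_for_escrow": SealedBox(escrow_key.public_key).encrypt(message_key),
    }

def open_with(private_key: PrivateKey, wrapped_key: bytes, envelope: dict) -> bytes:
    message_key = SealedBox(private_key).decrypt(wrapped_key)
    return SecretBox(message_key).decrypt(envelope["ciphertext"])

envelope = send_message(b"meet at 18:00")
print(open_with(recipient_key, envelope["key_for_recipient"], envelope))  # intended reader
print(open_with(escrow_key, envelope["key_for_escrow"], envelope))        # the back door
```

The escrow private key in this sketch is exactly the kind of concentrated, high-value target the next paragraph is about.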

The problem with encryption back doors is that they introduce a technical weakness into a security system – one that nobody can guarantee will not itself be exploited by malicious attackers. This can expose private data to bad actors. Furthermore, the mere risk of such abuse undermines trust in the things encryption exists to secure – digital democracy and e-commerce, to name two. Think of it like adding a door to a submarine: sure, it can be done, and it might even make access more convenient, but modern submarine hulls are highly engineered “bubbles” designed to withstand insane pressures. Every hole you poke in one requires a ridiculous amount of additional engineering to keep water out, and adds more potential points of failure.

A good example of a back door that has caused real economic damage is the US National Security Agency’s involvement in the development of what would become the Data Encryption Standard (DES) in the 1970s. There was suspicion at the time that the NSA’s recommendations for the design of the cipher’s S-boxes were an attempt to compromise the new standard for its own benefit; those recommendations actually turned out to harden the algorithm. At the same time, however, the NSA insisted on a shortened 56-bit key as part of DES’ adoption by the National Bureau of Standards (NIST’s predecessor), precisely to make brute-forcing DES easier.
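To put that key-length decision in perspective, here is a quick back-of-the-envelope sketch; the keys-per-second rate is an assumed figure for illustration, not a benchmark of any real attacker.

```python
# Rough brute-force comparison of a 56-bit key vs. a modern 128-bit key.
KEYS_PER_SECOND = 1e12           # assumed search rate for a well-resourced attacker
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for name, bits in [("DES (56-bit key)", 56), ("AES-128 (128-bit key)", 128)]:
    keyspace = 2 ** bits
    avg_seconds = (keyspace / 2) / KEYS_PER_SECOND  # on average, the key is found halfway through
    print(f"{name:22s} ~{avg_seconds / SECONDS_PER_YEAR:.2e} years on average")
```

At that assumed rate the 56-bit keyspace falls in hours, while the 128-bit keyspace outlasts the age of the universe; in practice, dedicated hardware was brute-forcing real DES keys in a matter of days by 1998.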

This has come back to bite not only the NSA but the global economy in the ass in a big way: 56-bit DES keys became practical to brute-force well before quantum computers were even a consideration, leading to significant costs in replacing DES-based cryptosystems and to a loss of trust in a lot of e-commerce and similarly sensitive communications. Furthermore, while the NSA plays a key role in securing US government communications and has often been a constructive contributor to the security of global data communications, its past involvement in weakening encryption “for the rest of us” means that much of the security community now distrusts anything that comes out of Fort Meade. Trust takes a long time to build.

I understand that the absence of back doors will occasionally make the job of law enforcement and intelligence agencies more difficult as they work to protect us from criminals, abusers, and terrorists. However, I also believe that effective investigation of bad actors cannot depend primarily on a single technological capability. Furthermore, while I oppose legal mandates for specific technologies or technological limitations, there are applications of encryption that allow for investigation while respecting citizens’ privacy. Homomorphic encryption is one such model, allowing limited searching for known patterns, such as specific known child sexual abuse imagery, without decrypting secure data streams. Obviously such solutions are no panacea, but their very existence significantly weakens the case that back doors are needed for effective investigations.
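To make the idea of computing on data without decrypting it a bit more concrete, here is a toy sketch using the open-source python-paillier (phe) library, which implements an additively homomorphic scheme; it illustrates only the general principle and is not a model of any real scanning or investigative system.

```python
# Toy homomorphic-encryption demo: a third party computes on ciphertexts
# without ever seeing the plaintext values. Requires: pip install phe
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# The data owner encrypts some values and hands over only the ciphertexts.
a, b, c = public_key.encrypt(12), public_key.encrypt(30), public_key.encrypt(7)

# The third party can add ciphertexts (and scale them by plaintext constants)
# without holding the private key; it never learns the individual values.
encrypted_total = a + b + c

# Only the holder of the private key can decrypt the result.
print(private_key.decrypt(encrypted_total))  # 49
```

Real-world designs for privacy-preserving matching of known illegal imagery are far more elaborate, but the underlying point stands: investigative techniques exist that do not require handing over a master key to everyone’s communications.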

And let us not forget that criminals do not respect laws – that is what makes them criminal. How would a ban on end-to-end encryption even be enforced, when bad guys will simply turn to messaging tools run out of countries not covered by back door mandates? I’ve yet to see a good argument that addresses this.

Even worse, none of this touches on the potential for non-technical, administrative abuse by even well-meaning government agencies; there are many documented cases of law enforcement officers illicitly accessing license plate databases or CCTV footage, for example to identify women they wanted to meet. We all have things to hide, or that are simply nobody else’s business – do you completely trust your government agencies, which are after all made up of fallible human beings, to respect your privacy and dignity at all times if they are somehow able to access what you would rather keep private? I don’t.

Encryption is necessary. Demanding weakened encryption in pursuit of criminals a) doesn’t work, b) endangers citizens, c) undermines safe online business, and d) is lazy policing. Don’t do it.