
A key under the doormat isn’t safe. Neither is an encryption backdoor.

Anomalism

https://www.washingtonpost.com/news...sa-tried-to-build-safe-encryption-but-failed/

Cryptography, when used properly, is a critically important tool for securing data on the notoriously vulnerable networks that we rely on for almost every aspect of daily life. But law enforcement agents have expressed concern that cryptography might sometimes work too well, preventing investigators from extracting useful evidence from wiretaps, smartphones and computers. They call for encryption systems to be designed with special backdoor access features that would allow the government to decrypt data when it is needed for an investigation. The cryptography debate is often portrayed as a zero-sum game pitting law enforcement against privacy — our individual right to be free from unwarranted intrusion by the government. Put this way, reasonable people might disagree on where balances should be struck and lines should be drawn, and we rely on the political process to find compromises, however imperfect, that we can all live with. But lost in this framing is the reality that cryptography and security are not just political issues, but also deeply difficult technical ones.

We learned this the hard way. Just over two decades ago, in 1993, the government proposed a new standard encryption system called “key escrow,” in which a National Security Agency-designed encryption device, called the “Clipper chip,” could be used in computers and other devices that needed to encrypt data. Clipper incorporated an encryption algorithm that was said to be stronger than the then-current standard. But there was a catch: Clipper-encoded data would include a copy of the key used to encrypt it, itself encrypted with a key held “in escrow” by the government. If Clipper-encrypted data were encountered during an investigation, the key could be taken out of escrow and the data decrypted. Clipper was, to say the least, controversial. It ignited a firestorm of debate, framed, just as it often is now, on balancing national security against individual liberty. But it turns out that this was beside the point.
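
To make the escrow arrangement concrete, here is a minimal Python sketch of the general idea, not Clipper’s actual SKIPJACK cipher or LEAF format: each message is encrypted under a fresh session key, and a copy of that session key, wrapped under a government-held escrow key, travels alongside the ciphertext. The toy stream cipher and the field layout are illustrative assumptions only.

```python
import os
import hashlib

def toy_stream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (insecure, illustration only): XOR with a SHA-256 counter keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

ESCROW_KEY = os.urandom(32)  # the key held "in escrow" by the government

def encrypt_with_escrow(plaintext: bytes) -> tuple[bytes, bytes]:
    """Sender side: encrypt under a fresh session key and attach a wrapped copy of that key."""
    session_key = os.urandom(32)
    ciphertext = toy_stream_xor(session_key, plaintext)
    nonce = os.urandom(16)  # keeps each wrap of a session key distinct
    escrow_field = nonce + toy_stream_xor(ESCROW_KEY + nonce, session_key)
    return ciphertext, escrow_field

def investigator_decrypt(ciphertext: bytes, escrow_field: bytes) -> bytes:
    """Government side: unwrap the session key with the escrow key, then decrypt."""
    nonce, wrapped = escrow_field[:16], escrow_field[16:]
    session_key = toy_stream_xor(ESCROW_KEY + nonce, wrapped)
    return toy_stream_xor(session_key, ciphertext)

message = b"meet at the usual place"
ct, field = encrypt_with_escrow(message)
assert investigator_decrypt(ct, field) == message
```

Note who constructs the escrow field: the sender does, and the rest of the system simply trusts it to be well-formed. That trust is exactly what failed.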

Although Clipper was designed at the NSA by some of the best cryptographers in the world, it had a number of undetected technical flaws, one of which made it possible for a rogue user to bypass the government access feature while still making use of the encryption algorithm. I discovered (and published, to the agency’s credit, without objection from the NSA) a practical way to do this a year after the first products incorporating the chip hit the market. The whole scheme is today remembered as an expensive, embarrassing fiasco. Clipper’s failure starkly demonstrated that cryptographic backdoors must be understood first as a technical problem, not just the political one reflected in the debate. Clipper failed not because the NSA was incompetent, but because designing a system with a backdoor was — and still is — fundamentally in conflict with basic security principles.
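The bypass can be illustrated with a toy model. The real LEAF was an opaque 128-bit field that the receiving chip validated with only a 16-bit checksum; the checksum construction below is a hypothetical stand-in, and the rogue sender treats the validator as a black box. Because 16 bits admit only 65,536 possibilities, trying random forged fields until one passes is cheap, keeping the strong cipher while handing investigators a field that decrypts to garbage.

```python
import os
import hashlib

def chip_accepts(field: bytes) -> bool:
    """Black-box validator: a random field passes with probability 1 in 65,536."""
    checksum = int.from_bytes(hashlib.sha256(field).digest()[:2], "big")
    return checksum == 0

def forge_escrow_field(length: int = 16) -> tuple[bytes, int]:
    """Try random fields until the validator accepts one; return it and the trial count."""
    trials = 0
    while True:
        trials += 1
        candidate = os.urandom(length)
        if chip_accepts(candidate):
            return candidate, trials

bogus_field, trials = forge_escrow_field()
print(f"validator accepted a bogus escrow field after {trials:,} random trials")
# Expected cost: about 65,536 trials, trivial even on 1990s hardware.
```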

Despite many advances in computer science, building a secure access feature is actually harder now than it was when Clipper failed in the 1990s. This is partly because we now rely on encryption integrated deeply into systems that are more complex, and fragile, than ever. It’s perhaps tempting to hope that technologists will find a way to build law-enforcement backdoors that don’t unduly compromise security. We built the Internet and the smartphone, after all, and surely this can’t be much harder than that! Unfortunately, it really is. For all of computer science’s great advances, we simply don’t know how to build secure, reliable software of this complexity. The problem is as old as software itself, and is one of the reasons we need cryptography in the first place.

There is overwhelming consensus in the technical community that even ostensibly “secure” backdoors put the systems into which they are incorporated at increased risk of outside attack and compromise. At best, a backdoor greatly increases the “attack surface” of the system and creates rich new opportunities for unauthorized exploitation of hidden (and inevitable) software bugs, to say nothing of the human-scale processes that manage the access. The stakes are higher than ever as we rely on networked systems to support almost every aspect of the economy and our critical infrastructure. It is not an exaggeration to characterize the state of software security as a national crisis. The regularity of large-scale data breaches such as the Office of Personnel Management (OPM) attack, in which the security clearance records of millions of government employees were stolen, is ample evidence that our infrastructure is already dangerously fragile, even without the burden of complex requirements to accommodate future surveillance.
 