Cryptography is one of the most important subjects of the information age. Every time you log in somewhere, an algorithm verifies your password against a stored hash to decide whether you can authenticate to your account. It’s how we keep attackers at bay. So what happens when the algorithm that’s supposed to keep you safe has a backdoor that gives certain people unfettered access to your accounts and personal records?
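To make that login step concrete, here is a minimal sketch of salted password hashing and verification using only Python’s standard library. The function names and the iteration count are illustrative assumptions, not any particular site’s implementation; real systems typically use a dedicated library and a tuned work factor.

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor; production values are tuned per deployment

def hash_password(password, salt=None):
    """Derive a slow, salted hash of the password with PBKDF2-HMAC-SHA256."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt per account
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-derive the hash from the login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The site never stores your password itself, only the salt and digest, which is exactly why a backdoor in the underlying cryptography is so damaging: it undermines the one secret the scheme depends on.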
On May 19, 2015, Apple and Google urged U.S. President Barack Obama to reconsider forcing private sector technology firms to include backdoors in their cryptographic algorithms. I aim to explain how this affects us as consumers of technology and the bottom lines of the corporations that provide us with said technology.
A Little Bit Of History: Dual_EC_DRBG
You could be forgiven if the term “Dual_EC_DRBG” sounds like arcane gibberish to you, but it’s tied to one of the biggest scandals in the history of encryption technology. Our story begins in the early 2000s, when elliptic curve cryptography was beginning to take root in computer systems. Until then, generating a truly random number was a pain because computers are inherently predictable. A person can pick a number on a whim: can you tell what number between 1 and 100,000 I’m thinking of right now? If you guess blindly, you have a one-in-100,000 chance of getting it right. Computers don’t work that way. They’re deterministic machines that derive their “random” output from other fixed values, so we have to synthesize unpredictability for them. Elliptic-curve-based generators promised to make that process far less predictable than conventional methods.
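The difference between a predictable generator and a cryptographic one can be shown in a few lines of Python. This is a small sketch, not tied to any algorithm from the story: the seeded generator replays the same “random” stream whenever it starts from the same fixed value, while the `secrets` module draws from operating-system entropy with no seed to guess.

```python
import random
import secrets

# A deterministic PRNG: the same seed yields the same "random" stream every run.
a = random.Random(42)
b = random.Random(42)
print([a.randint(1, 100_000) for _ in range(3)])
print([b.randint(1, 100_000) for _ in range(3)])  # identical to the line above

# A cryptographically secure source: seeded from OS entropy, not a fixed value.
print(secrets.randbelow(100_000) + 1)  # a number between 1 and 100,000
```

Anyone who knows (or can influence) the seed of a deterministic generator can predict every number it will ever produce, which is precisely the kind of hidden advantage a backdoored generator hands its designer.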
Back to the story. The National Security Agency (NSA) pushed a generator called Dual_EC_DRBG for exactly this purpose. Cryptographers were skeptical of it from the start, yet NIST standardized it in 2006 anyway.
It doesn’t end there, though. In 2004, the NSA reportedly made a $10 million deal with RSA Security (the company that at the time held the largest market share in commercial cryptography) to make Dual_EC_DRBG the default random number generator in RSA’s products. We don’t know for certain that the NSA designed the backdoor in, but Dual_EC_DRBG certainly had one, and the NSA’s insistence on getting this particular module adopted doesn’t help the case against prior knowledge.
Fast-forward to 2015, and now you have the U.S. government, along with other governments around the world, asking private companies to include backdoors in their encryption algorithms.
Why Backdoors Are Bad for Everyone Else
You might already have an idea of why backdoors are bad. It’s a no-brainer, right? The thing is that introducing backdoors into encryption has other, less obvious consequences beyond the invasion of privacy by government entities.
First of all, if anyone outside the intended parties discovers the backdoor (outside researchers did find the potential one in Dual_EC_DRBG, which is how that fiasco came to light), you can just about guarantee that it will be exploited to peek at things that are very private to you.
The second reason why backdoors are horrible is best expressed as a question: knowing that not just the government, but any John Doe, can have a look at your private data, would you ever open an account anywhere again? People rely on technology because they trust it. Eliminate that trust, and you’ll see very few customers left in the enterprise market. Consumers may still use encrypted and connected technologies, but businesses will opt out in droves, and a lot of our favorite manufacturers rely heavily on their business-to-business customer bases.
So, not only is this idea bad for consumers, but it’s also bad for the bottom line of the businesses that provide us with the things we love. That’s why giants like Apple and Google are so concerned about these policies.
What do you think we should do? Is a possible law on this even enforceable? Tell us in a comment!