The big NSA revelation of last week was that the agency’s multifaceted strategy to read encrypted Internet traffic is generally successful. The story, from the New York Times and ProPublica, described NSA strategies ranging from the predictable—exploiting implementation flaws in some popular crypto products; to the widely suspected but disappointing—inducing companies to insert backdoors into products; to the really disturbing—taking active steps to weaken public encryption standards. Dan wrote yesterday about how the NSA is defeating encryption.
To understand fully why the NSA’s actions are harmful, consider this sentence from the article:
Many users assume — or have been assured by Internet companies — that their data is safe from prying eyes, including those of the government, and the N.S.A. wants to keep it that way.
In security, the worst case—the thing you most want to avoid—is thinking you are secure when you’re not. And that’s exactly what the NSA seems to be trying to perpetuate.
Suppose you’re driving a car that has no brakes. If you know you have no brakes, then you can drive very slowly, or just get out and walk. What is deadly is thinking you have the ability to stop, until you stomp on the brake pedal and nothing happens. It’s the same way with security: if you know your communications aren’t secure, you can be careful about what you say; but if you think mistakenly that you’re safe, you’re sure to get in trouble.
So the problem is not (only) that we’re unsafe. It’s that “the N.S.A. wants to keep it that way.” The NSA wants to make sure we remain vulnerable.
Of course, we “have been assured by Internet companies” that we are safe. It’s always wise to be wary of vendors’ security assurances—there’s a lot of snake oil out there—but this news calls for a different variety of skepticism that doubts the assurances of even the most earnest and competent companies. This is going to put U.S. companies at a competitive disadvantage, because people will believe that U.S. companies lack the ability to protect their customers—and people will suspect that U.S. companies may feel compelled to lie to their customers about security.
The worst news of all, in my view, is that the NSA has taken active steps to undermine public encryption standards.
When I teach the history of encryption standards, I talk about the Data Encryption Standard (DES), published by the U.S. government in 1977, which was one of the most commonly used encryption methods for decades. Some aspects of the DES design were mysterious, and there were rumors that the NSA had built in secret weaknesses. Years later, researchers discovered a powerful new codebreaking method called differential cryptanalysis—and found that DES was resistant to it. We now know that the NSA had whispered in the ears of the original DES design team to make sure the standard was secure against differential attacks, which the NSA had discovered earlier. In other words, the NSA intervened secretly to improve the security of DES.
The successor of DES is the Advanced Encryption Standard (AES), published by the National Institute of Standards and Technology (NIST) in 2001. NIST went to great lengths to make the AES process as open and transparent as possible, and the result was a standard with broad buy-in from cryptographers around the world. Once again, the US government seemed to be doing its best to choose a high-security, trustworthy standard.
At the same time, there have been persistent rumors, and some evidence, over the years that the NSA has been working to undermine certain security standards. Now it seems that these rumors are confirmed, and the NSA has been undermining standards, which makes everyone—including every American—less secure.
How has the NSA sought to undermine standards? I’ll discuss two likely examples in the next post.
Ed, I took your Information Security class about a decade ago. I recall you mentioning that there were longstanding suspicions that the NSA made DES easy for them to crack using their own specialized hardware, and that the creation of AES was done specifically in response to those concerns about DES.
Has that truth changed? Are we now sure that the NSA was only strengthening DES, not intentionally weakening it for its own gain?
I think it’s more clear now that the NSA’s contribution to DES improved security. The more transparent AES process was still a good idea, leading to a stronger result that had instant legitimacy with cryptographers.
And while the NSA (or IBM) never revealed differential cryptanalysis, the technique they used to strengthen DES, its independent rediscovery has most definitely strengthened newer symmetric ciphers.
The main reason DES needed to be replaced was that it was limited to a 56-bit key length, which was not future-proof. The NSA may have had a role in that. But aside from the short key length, the construction of DES appears to have been as strong as the NSA knew how to build.
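To see why a 56-bit key is not future-proof, here is some back-of-the-envelope arithmetic. The search rate below is an assumed figure for dedicated cracking hardware, not a measured one (for reference, EFF’s Deep Crack machine found a DES key in about 56 hours in 1998):

```python
# Why 56 bits is too short: the whole keyspace can be swept exhaustively.
keyspace = 2 ** 56            # total number of DES keys
keys_per_second = 10 ** 12    # assumed rate for well-funded cracking hardware

seconds = keyspace / keys_per_second
hours = seconds / 3600
print(f"{keyspace} keys; ~{hours:.0f} hours to sweep at 1e12 keys/sec")
```

At the assumed rate, the entire keyspace falls in under a day; no amount of careful cipher design compensates for a keyspace that small.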
I appreciate the comments by Nick P, including some of the weaknesses of the brakeless car analogy.
Dr. Felten: “So the problem is not (only) that we’re unsafe. It’s that “the N.S.A. wants to keep it that way.” The NSA wants to make sure we remain vulnerable.”
I think the NSA doesn’t care whether most of us remain vulnerable, but they are very concerned with making sure that terrorists remain vulnerable.
In fact, the brakeless car analogy works very appropriately for this point. If terrorists know that the brakes have failed, then they will switch to other communication tactics. The NSA is very concerned with ensuring that its adversaries do not understand NSA capabilities, so that the adversaries will continue to use flawed tools. This is precisely the rationale for keeping the capabilities classified. Unavoidably, the NSA capabilities must be kept secret from the public in order to maintain the advantage over terrorists who use compromised communications tools.
I’ll be interested in hearing the follow-on articles discuss specific examples, especially with respect to whether the NSA weakened encryption in general or primarily weakened encryption for the NSA’s own exploitation purposes.
AC, you write “Unavoidably, the NSA capabilities must be kept secret from the public in order to maintain the advantage over terrorists who use compromised communications tools.”
But there’s a step missing from your argument. For your conclusion to be “unavoidable” you also have to establish that the tradeoff you suggest—increasing the vulnerability of all U.S. Internet users against all attackers, in exchange for increasing the NSA’s access to terrorist communications—is good for national security on balance.
That would let you argue that what the NSA allegedly did was good public policy. If you want to argue further that the actions were legitimate in a democratic society, you have to establish as well that the people who made the secret decision to pursue this policy had the mandate to do so under our constitutional system.
Perhaps you can justify one or both of these propositions. But I don’t think you can just *assume* them.
I really don’t see a problem with the NSA being able to decrypt everything, nabbing technology helpful to the security of the country or having back doors into everything and everywhere. Kind of naive to think they couldn’t or wouldn’t read YOUR communications at will with ease.
is this comment for real?
The biggest problem I have is trying to calculate the value of my intellectual property, already “collected” by a govt agency, so that I may properly deduct it from my taxes.
Before I say anything, let it be clear I’ve been doing plenty of NSA bashing and counter strategies myself over at Schneier’s blog. I got here from there. That said, I’ve seen some bad analogies and inaccurate portrayals of the situation.
“Many users assume — or have been assured by Internet companies — that their data is safe from prying eyes, including those of the government, and the N.S.A. wants to keep it that way. ”
This was a bad assumption for users to make in the first place. Even many lay people assumed over the years that if the NSA wanted them, it was game over. Movies such as Enemy of the State kept that in their minds. Additionally, during consults, I reminded people that the security required to beat TLAs basically wasn’t going to happen, and that most security vendors push a false sense of security for their products, meaning one must evaluate them carefully. All in all, these NSA revelations just confirm what I have told people for over a decade, and they change little about how those people do security because they don’t change the nature of their tradeoffs.
“Suppose you’re driving a car that has no brakes. If you know you have no brakes, then you can drive very slowly, or just get out and walk. What is deadly is thinking you have the ability to stop, until you stomp on the brake pedal and nothing happens. It’s the same way with security: if you know your communications aren’t secure, you can be careful about what you say; but if you think mistakenly that you’re safe, you’re sure to get in trouble.”
That’s a bad analogy. A car without brakes has a high likelihood of killing or severely injuring you. The NSA tapping your SSL doesn’t. As a matter of fact, if you ran the numbers, you’d probably find a low rate of harm across the total number of Internet users. Also, the NSA isn’t completely disabling the protection: they design it with subtle, secret problems so *they* can break it. It works in most scenarios, giving us protective value, and can be weakened under certain circumstances by the one organization that legally can do that. (Schell warned us of this subversion threat, but people didn’t listen…) So the analogy is shattered beyond any comparison.
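That “subtle, secret problem” pattern can be illustrated with a toy sketch. Everything here is hypothetical (the “cipher” is just a stdlib hash keystream, and `weak_keygen` is an invented name, not any real technique): a key that looks like 256 random bits secretly derives from only 16 bits of entropy, so the scheme protects against everyone except the party that knows the flaw.

```python
import hashlib

ENTROPY_BITS = 16  # the hidden flaw: only 2**16 possible keys in practice

def weak_keygen(seed):
    # Output looks like a 256-bit key, but the real keyspace is tiny.
    assert 0 <= seed < 2 ** ENTROPY_BITS
    return hashlib.sha256(b"weak-rng" + seed.to_bytes(4, "big")).digest()

def keystream(key, n):
    # Hash-counter keystream; a stand-in for a real stream cipher.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# A user encrypts with what appears to be a strong random key.
plaintext = b"attack at dawn"
ciphertext = xor(plaintext, keystream(weak_keygen(seed=31337), len(plaintext)))

# An outsider faces a 2**256 search; the party that planted the flaw
# only has to try 2**16 seeds.
def backdoor_break(ct, crib):
    for seed in range(2 ** ENTROPY_BITS):
        pt = xor(ct, keystream(weak_keygen(seed), len(ct)))
        if pt.startswith(crib):
            return pt
    return None

print(backdoor_break(ciphertext, crib=b"attack"))
```

The point is the asymmetry: the weakened primitive still deters ordinary attackers, while whoever planted the flaw gets a cheap break, which is why such a weakness can hide in plain sight.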
Now, I certainly don’t want my crypto being weakened or implementation flaws left in. Guess what, though? Weak security, security-defeating complexity, and tons of residual vulnerabilities are what the software market has given us (and what consumers have implicitly demanded) to keep features up and price down. The situation already existed. There are proven methods for producing software with low defect counts in general and small amounts of trusted code. There are very secure ways of operating information systems. Businesses haven’t been interested in any of that. So serious problems stay in our software, awaiting the NSA’s (and others’) bug hunters. It’s hard for me to see how we’d be more secure without the NSA’s current shenanigans.
And, lastly, Congress and the US taxpayers kept voting to give them this power. Over and over. It’s a consequence of the will of the public. It’s really *their* mess. *They* will have to clean it up. Know why? If you’re a US citizen, no amount of technical security will save you from the courts or the SWAT teams when you become a threat. The public has to pressure Congress to change the laws or the NSA can just say “it’s legal” or “we’re just doing our authorized job.”
To say that this is ‘a consequence of the will of the public’ is to ignore that the USA does not have a functioning democracy, as Jimmy Carter recently said.
And the “funny” thing is that the NSA is paid by the taxpayers, i.e., by you yourselves.
I wonder if now that NSA’s disruption and penetration of commercial security standards is well known, an IT organization that continues to use commercial encryption products is in violation of SOX standards for security?
People, or at least the technical kind, believed in the math, not the companies.
They still do believe in the math, but they don’t know which of the math they should believe the most.
The real problem is that the NSA has a $253 million annual budget just to work on breaking codes and finding security vulnerabilities in software. That is a crazy amount of money.
To them, everything is in the open: security standards, the Windows source code (the US government has access to it), and obviously open source. But what they themselves know and don’t know is hidden.
It is really hard to fight an adversary like that.
What I would like to understand is this: since nothing is really encrypted securely, and traffic is intercepted, doesn’t that mean that all corporate communications are now breached, and that corporate espionage will be, if not already, at an all-time high? How would any corporation keep trade secrets, campaigns, and new product designs safe, say, from competitors that have NSA insiders?
Don’t use a computer connected to the Internet. Transfer data on USB sticks between computers that are non-networked. Use the mail, not the email.
If you really, absolutely, truly want to keep the stuff safe, old school is probably going to be the only way to do it for the time being.
Why has most of the post-Snowden commentary assumed that the NSA’s backdoors are themselves secure – i.e. that only the NSA knows how to use them? I think it’s a good bet that the Chinese et al. also have spies within the NSA, and that much of this is known to them as well. If Snowden had been less public-minded, he could simply have sold his documents to a foreign government and made a mint. Why would we assume that no one did this before Snowden’s leak?
So the Chinese may even now be spying on Obama’s BlackBerry, using backdoors the NSA installed.
At any rate, it is said that the main thing the Soviets learned from the US in the making of their atomic bomb is simply that bomb-making is possible. I think we can rest assured that now the backdoors that the NSA put in will become increasingly well known, initially among other national security agencies, and thence, ultimately, to common criminals.
Eventually this will require a complete rewrite of essentially all of the cryptographic standards. Hopefully the authors will now know to exclude the NSA’s involvement, if they can.
What a stupid, reckless thing the NSA did…
Please also focus on two other aspects of the NSA’s pool of dirty tricks that came to light in the same report last week. (1) They are hacking into corporations and stealing private keys, code, and perhaps other property to facilitate their SIGINT capabilities. (2) They are placing agents within corporations and open source projects to insert backdoors. It goes beyond above-board inducements. Not only are they undermining standards, they are undermining the rule of law. Gregory Perry nailed it in 2010, and he was dismissed as a non-credible source at the time.