April 18, 2014


A Court Order is an Insider Attack

Commentators on the Lavabit case, including the judge himself, have criticized Lavabit for designing its system in a way that resisted court-ordered access to user data. They ask: If court orders are legitimate, why should we allow engineers to design services that protect users against court-ordered access?

The answer is simple but subtle: There are good reasons to protect against insider attacks, and a court order is an insider attack.

To see why, consider two companies, which we’ll call Lavabit and Guavabit. At Lavabit, an employee, on receiving a court order, copies user data and gives it to an outside party—in this case, the government. Meanwhile, over at Guavabit, an employee, on receiving a bribe or extortion threat from a drug cartel, copies user data and gives it to an outside party—in this case, the drug cartel.

From a purely technological standpoint, these two scenarios are exactly the same: an employee copies user data and gives it to an outside party. Only two things are different: the employee’s motivation, and the destination of the data after it leaves the company. Neither of these differences is visible to the company’s technology—it can’t read the employee’s mind to learn the motivation, and it can’t tell where the data will go once it has been extracted from the company’s system. Technical measures that prevent one access scenario will unavoidably prevent the other one.

Insider attacks are a big problem. You might have read about a recent insider attack against the NSA by Edward Snowden. Similar but less spectacular attacks happen all the time, and Lavabit, or any well-run service that holds user data, has good reason to try to control them.

From a user’s standpoint, a service’s resistance to insider attacks does more than just protect against rogue employees. It also helps to ensure that a company will not be tempted to repurpose or sell user data for commercial gain without getting users’ permission.

In the end, what led to Lavabit’s shutdown was not that the company’s technology was too resistant to insider attacks, but that it wasn’t resistant. The government got an order that would have required Lavabit to execute the ultimate insider attack, essentially giving the government a master key to unlock the data of any Lavabit user at any time. Rather than do this, Lavabit chose to shut down.

Had Lavabit had in place measures to prevent disclosure of its master key, it would have been unable to comply with the ultimate court order—and it would have also been safe against a rogue employee turning over its master key to bad actors.

Users who want ultimate email security will now be looking for a provider that more strongly resists insider attacks. That level of security is very difficult to achieve—but law-abiding users have good reason to seek it.

Comments

  1. tz says:

    Employee, or owner?

The problem is that it would require a completely different infrastructure, something like LastPass, where the key isn’t with the company at all — but that is not how SSL works.

    But even then, the court could order an update that would have a backdoor.

Show me a system that isn’t peer-reviewed, open-source, and spread across international boundaries, that is still secure when someone puts a gun to your head (the court will eventually send the SWAT team).

    • Anonymous says:

      SSL can certainly work with perfect forward secrecy. All it requires is that each connection uses an ephemeral key which is not stored beyond the life of the connection. That way, intercepts cannot be decrypted post facto.
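The ephemeral-key idea can be illustrated with a minimal finite-field Diffie-Hellman sketch in Python. This is a toy (the prime is far too small to be secure; real TLS uses standardized 2048-bit+ groups or X25519): each side contributes a fresh per-session secret, both derive the same session key, and the secrets are discarded when the session ends.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters. NOT secure: real deployments use
# 2048-bit+ MODP groups (RFC 3526) or elliptic curves like X25519.
P = 2**61 - 1   # a Mersenne prime, used as the group modulus
G = 3           # generator

def ephemeral_keypair():
    """Generate a fresh per-session secret and its public share."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def session_key(priv, peer_pub):
    """Derive a symmetric session key from the shared DH secret."""
    shared = pow(peer_pub, priv, P)
    return hashlib.sha256(str(shared).encode()).digest()

# Client and server each create ephemeral keys for this session only.
c_priv, c_pub = ephemeral_keypair()
s_priv, s_pub = ephemeral_keypair()

# Both sides derive the same key from the exchanged public values.
client_key = session_key(c_priv, s_pub)
server_key = session_key(s_priv, c_pub)
assert client_key == server_key

# Forward secrecy: delete the ephemeral secrets once the session ends.
del c_priv, s_priv
```

Because the per-session secrets are discarded, a later compromise of the server’s long-term key cannot decrypt recorded traffic; the long-term key is used only to authenticate the handshake.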

      • Anonymous says:

You don’t need the key if you can retrieve the message — and since SSL isn’t provably unbreakable, post facto you can rely on SSL only up until it’s broken somewhere in the future.

      • ivanhoe says:

For privacy, you just can’t trust third-party services, no matter how sophisticated their security is. It’s always possible for someone to physically force access into the system by altering the application code. All confidential data should be encrypted before leaving your machine, with something like PGP. That way they will have to get a court order or otherwise force you to unlock it, which is much less likely to happen.

      • Gordon Mohr says:

Note that [“Perfect”] Forward Secrecy, if enabled, could in fact give new ways to comply with such an order. For the single user that is the target, do either of the following: (a) retain the ephemeral key material and leak it to the government (perhaps via a side-channel); (b) generate the server’s contribution to the session key in a manner that looks random to the client but is transparent/predictable to the government.

        PFS may help prevent bulk compromise at a later date, or via a single crypto win against the provider’s master key, but if the provider wants to comply, PFS may actually help them comply, in a very subtle and narrowly-targeted manner. (Maybe this intentionally-subverted variant should be called “Phony Perfect Forward Secrecy”, PPFS?)

  2. Michael Donnelly says:

    I think the real crux of the issue with Lavabit (and encryption in general) is that systems can be designed such that a government request for information is useless. The real strain and pain we’re seeing is that the government does not like that, so it moves further away from the well-accepted doctrine of “produce this info” to “do this task”.

    People accept that the government can request information. Even civilian parties can compel production of information in the discovery phase of a regular lawsuit. But the problem I see is that in order to get the information in these encryption cases, the government winds up asking a person to actually do something.

    And that’s a pretty big deal: the government has no right to show up at my house and make me wash my car or cut my hair a certain way. I’ll accept that they can search my stuff with a warrant. And I’ll accept that they can compel testimony from me that’s not self-incriminating. But they absolutely CANNOT order me to make a change to a secure system in order to acquire data that they want. If they want the source code, they can have it. If they want a key, they can have it (as Levison finally did, albeit under protest). If Lavabit was designed differently, that key would have been useless as well.

    It’s been slowly creeping in that direction, where the government can “legally” (quotes for law that many would say is unconstitutional) force a provider to install a physical device on the network via the Pen Register Act. That’s the toe over the line that goes from requesting data to requesting work. And maybe that’s an acceptable trade-off, being a minimal amount of work for information that the government has historically had access to.

    But as stronger systems are developed around solid crypto, that is not enough. The government wants access to the data and the only way to get it in some situations is to force the operator to perform work under threat of imprisonment. And that’s pretty much the definition of a totalitarian state.

    Designing a system to prevent any insider attacks is indeed a proper goal here, but I fear the idea itself may be deemed illegal. And, yes, I know that means I’m saying we’re living under a totalitarian regime. But, really, aren’t we?

    • Imaginar says:

How can one protect against disclosure of the master key? It seems like it is an essential part of their operations, unless there is a universally accepted P2P encryption scheme for email (kind of like PGP).

      • raylu says:

        I think he means design a system without a master key.

      • Woofer says:

        Then you’ll have to eschew passwords altogether and use biometrics. How, exactly, does a government force you to turn over a finger?

        • Ben Alabaster says:

Give the generation of the master key for each mail account over to the user, to be carried out locally. Biometrics shouldn’t be used as passwords; you can be legally compelled to provide even DNA. You cannot be compelled to provide information that would incriminate yourself. So never use biometrics as a password. Use biometrics as a username by all means – a verification of identity, but never as authorization to access a secure system.

        • FMJohnson says:

          I would gladly and willingly give the government the finger.

      • Ben Alabaster says:

        The only way to protect against disclosure of the master key is not to have one.

        • Anon says:

You can buy hardware dongles to secure your SSL key. The hardware dongle is basically a tiny computer, with CPU, RAM, and flash, and it talks to your server over either a PCI/PCIe bus or over the network. The hardware dongle is designed to be resistant to physical attack (chips covered in epoxy so you can’t attach wires to them easily, etc.) and it runs minimal software designed to be resistant to attacks. You can get the dongle to generate a new SSL key and tell it to mark the private key as “never export”. Then, whenever someone connects to your website, your webserver will ask the dongle to do the part of the crypto that needs the private key. The private key never leaves the dongle.

          That way, it is impossible for you to ever provide the private key, because it is impossible to extract it from the dongle.

(Note that backup isn’t that big an issue: if your dongle blows up, you can always create another SSL key pair and get a new SSL certificate issued. If you care about avoiding downtime, you prepare a spare dongle and certificate in advance, since getting an SSL certificate can take a while.)
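The non-exportable-key property described above can be sketched in Python. This is a toy simulation of the interface, not a real HSM driver (real devices speak standard interfaces like PKCS#11 and enforce non-exportability in hardware): the key lives only inside the object, which performs signing on demand but refuses to reveal the key.

```python
import hashlib
import hmac
import secrets

class HardwareToken:
    """Toy model of an HSM dongle: holds a key it will never reveal."""

    def __init__(self):
        # The key is generated inside the "device" and kept private.
        self.__key = secrets.token_bytes(32)

    def sign(self, message: bytes) -> bytes:
        # The device performs the private-key operation internally;
        # only the result ever leaves it.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def export_key(self):
        # Marked "never export": the private key cannot leave the device.
        raise PermissionError("private key is not exportable")

token = HardwareToken()
sig = token.sign(b"hello")
assert token.sign(b"hello") == sig  # signing works, key stays inside
```

In software, a determined insider could of course dig the key out of memory; the point of the real hardware is that no API call, court order, or debugger can make the device give it up.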

    • Anon says:

      Unfortunately, I believe the government *can* legally require you to “do work” under the guise of “regulation”. Two examples:

      - accounting is regulated… you are required to do it a certain way
      - meat processing is regulated… you are required to do it a certain way

      If you wilfully don’t follow the regulations, expect fines and/or jail time.

      Yeah, I know, it’s pretty hard waking up one day and realising that all our ideals have been a lie… we are living in a self-imposed dream.

    • anon says:

      4 stars there.

      The thing that is not even talked about in these discussions, is that the NSA, even when using the CIA as cover, are military institutions, and are sworn (literally) to uphold the constitution.

      These crimes then are treason, not just civil rights violations.

  3. Tony Lauck says:

    As I understand it, the attack in question amounts to a combination of wiretapping and “rubber hose” cryptography. This is possible when a web site is set up to use a single SSL key for two purposes: (1) authenticate the web site to the user to prevent a man in the middle attack, and (2) distribute an encryption key used for privacy on an SSL session. This is an unnecessary conflation of two separate security functions. There is a technical need for a long term key used to authenticate the server, but there is no technical need to use a long term key to establish session keys for SSL sessions, assuming the appropriate cryptographic setup on servers and web browsers. Any short term keys can be deleted immediately after a session has been closed.

The attack and its defense have been known publicly for 20 years. The defense is called “Forward Secrecy,” or sometimes “Perfect Forward Secrecy.” If this defense is used, then after an SSL session has been closed there will be no keys available on the server to facilitate decrypting any stored intercepts. Of course, if the user keeps data on the server, or if the server operator keeps backups, then there may be other ways to access data, but they won’t involve any cryptographic “master keys”.

    Unfortunately, most SSL based web servers have not been configured with Forward Secrecy, as can be seen by a test tool that is publicly available. In addition, most O/S distributions have been tardy in integrating the necessary capabilities. (One can speculate as to why this might be.)

    The following link provides more details:
    https://community.qualys.com/blogs/securitylabs/2013/06/25/ssl-labs-deploying-forward-secrecy

  4. Roland says:

The US Govt does not have any right to get the info it seeks. It is in fact required to get special permission from a judge to do so. The US Constitution and Bill of Rights do not mention any ‘rights of govt.’ because governments don’t have rights. All they have is power. That power was meant to be closely restricted by law, but the DOJ seems to be writing their own laws these days, and the courts are in bed with them. Fishing expedition.

  5. dbrower says:

I don’t see how forward secrecy helps, since the problem Lavabit faced was MITM: having the government intercept using the provided keys. That would enable capturing a current session in the clear and reaping data in flight.

Not that forward secrecy is wrong per se; just pointing out that handing over the master key is still bad.

    -dB

  6. Peter Bailey says:

I may be mistaken, but I had thought that this problem had been licked back around 2000 by a company called Zero-Knowledge, which had developed an architecture (http://osiris.978.org/~brianr/crypto-research/anon/www.freedom.net/products/whitepapers/) that, as the name implies, was designed to prevent the operator from viewing email or seeing where it came from, enabled pseudonymity (i.e., multiple email addresses and user IDs for different purposes), and enabled anonymous browsing (although at the time many of today’s JavaScript/Java browser attacks weren’t well comprehended). Does anybody have any opinions as to whether this architecture is still valid, and if so, what it would take to recreate it in the open-source world?

  7. Terry A Davis says:

In the Bible, a prophet told a story to a king. The king got outraged at a villain, then learned it was him.

What if I told you about a tin-foil-hat mental patient with crazier and crazier theories, instead of a tried-and-true, tested account that men for many centuries have believed. Don’t you see how pathetic your outlandish theories are and how the simple truthful answer — the God of the Bible and tongues — is actually what any reasonable person would believe. Come on! You think I corrupted public random notary sites? You think I have back doors when you run my software?

    For one rare thing to happen, you might think I am fooling you, but the weight of so many crazy things together is just silly. It’s the simple Bible — don’t be a nut.

    • Nathan T. says:

      Terry, I didn’t follow you at all with that post. I suspect you are painting an analogy between bible belief and crazy conspiracy theories. Was that analogy comparing or contrasting the two? I can’t understand your point. Please explain it a little better.

      I will give you a hint; you may need to frame it from the perspective that I am both a full believer in the Bible AND a “tin-foil hat mental patient” but that I can tell the difference between the two.

      You end in “don’t be a nut.” But I am still left to ponder what you consider a nut. To me it is all a matter of critical thinking, and my “paranoid” analysis has proved RIGHT far more often than ever shown to be wrong. Especially when it comes to security and technology such as corrupted/compromised systems and backdoors.

    • Gary says:

      @ Terry David – What?

    • Nathan T. says:

      Terry – I still don’t understand exactly your point.

      But, I realized maybe I should re-read the story you referred to (as I remembered there being such a story but did not remember any details to it), to see if I could make an inference. Alas, I spent yesterday and most of today trying to remember enough of the story that a web search would turn it up; no luck. You didn’t name the prophet or the king and I couldn’t remember the name of the parable; and any generic search for prophets and kings and anger didn’t turn up what I knew was the story you referred to.

Until it hit me: I vaguely remembered the Prophet Nathan (a namesake) going to the king and reprimanding him. A quick search turned up 2 Samuel 12 — The Parable of the Ewe Lamb. [I always got a kick out of verse 13 as a child because I had an elder brother named David--the irony is my sins are more similar to King David’s, and I am no prophet; perhaps my parents should have reversed our names]

      And with the story in mind; the only resemblance I can see is that King David sent Uriah out to be killed after he impregnated Uriah’s wife. Is that a sort of “insider attack?” Or, are you intending to indicate that courts were pilfering Lavabit the way David pilfered what was Uriah’s? Or, are you saying that the tin-foil hat wearer is a villain as was king David… You may be onto something there if it comes to chastity (as I am both unchaste and a tin-foil hat wearer)–but that is way off topic. Or, is your post just a troll? I still don’t understand; and glad I am not the only one without understanding.

  8. Nathan T. says:

    Ed, near the end you state that “In the end, what led to Lavabit’s shutdown was not that the company’s technology was too resistant to insider attacks, but that it wasn’t resistant.”

    and later

    “Had Lavabit had in place measures to prevent disclosure of its master key, it would have been unable to comply with the ultimate court order”

I am not quite sure I follow. As I understand it, the premise of the article is that there would be some way to protect anyone at the company from being able to divulge the master key when the court came asking. Technologically speaking, perhaps there is some way to keep that key out of the knowledge of the company’s employees/owners? Some way to be “resistant” to the insider attack? I won’t dispute or argue on that point at all; I have no idea.

However, the premise misses the most basic fact of this case. The government “required” the information (and, if I recall, the judge even deemed it a “right” for the government to have it); once they deemed it a “requirement,” it wouldn’t have mattered whether it was impossible to fulfill that requirement or not; the end game would have been the same.

The company would have been shut down: either voluntarily, or by force of government, including but not limited to the owner being imprisoned for failure to follow a court order. And that is the real problem here: the government can force any business to shut down simply by requiring it to comply with some “court order” or other.

In the end, it wouldn’t have mattered whether they were resistant or not; the government gets what it wants. And in this case it was ALL user information, or shut down the business.

    • Nathan T. says:

P.S. We see that most companies don’t shut down; rather, they just hand over the information, such as all the phone companies handing over their phone records for months on end. One has to wonder how many other companies simply comply so that they can remain in operation.

  9. Don Lindsay says:

    Actually, there is a fairly simple defence against many insider attacks. You require that N employees (N>1) must collaborate in order to extract the information in question. There are well-known algorithms whereby “any N of M” keys are sufficient to unlock some further key.

    It’s more costly, of course, because those actions now need more manpower. I don’t think I know all the consequences for the overall system design.

    But if that manpower is spread across political boundaries, the resulting system would have irritated the judge even more. I’m not sure what the judge might have done.
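The “any N of M” construction Don describes is classically implemented as Shamir secret sharing. A minimal sketch in Python (toy field size, illustrative only; production systems use vetted libraries): the secret is the constant term of a random polynomial, each employee holds one point on it, and any N points reconstruct the secret by Lagrange interpolation.

```python
import secrets

P = 2**127 - 1  # prime field modulus (a Mersenne prime)

def split(secret: int, n: int, m: int):
    """Split `secret` into m shares; any n of them reconstruct it."""
    # Random degree-(n-1) polynomial with the secret as constant term.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(n - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, m + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 123456789
shares = split(secret, n=3, m=5)   # 5 employees, any 3 can unlock
assert reconstruct(shares[:3]) == secret
assert reconstruct(shares[2:]) == secret
```

With fewer than N shares, every candidate secret remains equally likely, so no single rogue employee (or single court order served on one person) can extract the key.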

  10. Engineer says:

    Here’s a better hypothetical:

    At company A the employee’s wife is kidnapped and will not be released unless the info is given out.
    At company B the employee is threatened with being kidnapped, unless he gives the info out.

    The former is a crime, the latter is a court order.

Court orders are only legitimate to the extent that they are legal. What this judge, and many other people, seem to forget is that the orders demanding Lavabit hand over info are not compliant with the 4th Amendment, meaning that everyone involved in them is guilty of a federal crime under 18 U.S.C. § 242.

    A court order is nothing more than a threat of violence.

It’s time to stop thinking everything government does is ipso facto moral. The rules of morality do not change simply because the criminal works for a gang that calls itself “government”.

  11. awjt says:

    Set up a system where the provider never has the actual content. The provider has an encrypted version, and manages that data transactionally, but lacks the keys to decrypt it. Make encryption/decryption the client’s responsibility, so a court would have to go after each individual client to get the decrypted content. That way the provider can never be put over the barrel. The provider who receives a court order can just copy everything requested, hand it to the government in encrypted form, and say, “Here, these are the data as we have it and we cannot decrypt it. For that, you will have to visit the owners of the data or crack it yourself.” The provider never betrayed the client. The provider also never had to compromise their business or moral position, in this scenario.
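The provider-blind design described above can be sketched in Python. This is a toy illustration of the architecture, not real cryptography: the keystream cipher below is a stand-in for a proper AEAD scheme such as AES-GCM, and the class and method names are hypothetical. The point is structural: the provider object stores and returns only opaque bytes and holds no key material at all.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream.
    Stand-in for a real AEAD cipher such as AES-GCM."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

class Provider:
    """The service stores only opaque ciphertext; it holds no keys."""
    def __init__(self):
        self.mailbox = {}
    def store(self, msg_id, blob):
        self.mailbox[msg_id] = blob
    def comply_with_order(self, msg_id):
        # All the provider can ever hand over is the encrypted blob.
        return self.mailbox[msg_id]

# The client encrypts locally before anything reaches the provider.
client_key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
provider = Provider()
provider.store("msg1",
               nonce + keystream_xor(client_key, nonce, b"meet at noon"))

# A court order yields ciphertext only; the client can still decrypt.
blob = provider.comply_with_order("msg1")
assert keystream_xor(client_key, blob[:16], blob[16:]) == b"meet at noon"
```

Since decryption requires `client_key`, which never leaves the client, the provider can comply fully with an order and still betray nothing.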

  12. Bloke says:

    Everyone seems to be missing the obvious, talking about master keys etc.

The encryption and decryption should be done client-side; that way the company can never decrypt the client’s info without some form of hacking.

    • Morin says:

Client-side itself is not enough; it must also be done by a mechanism that cannot be affected by malicious code provided by the server. I’m specifically targeting client-side encryption implemented in JavaScript on a website here: in such a case, the company running the server could still be forced to break the encryption, or to capture the data before it is encrypted client-side, by sending malicious scripts together with the encryption code.

  13. David Collier-Brown says:

    A Classic problem in Library Software!

    Imagine that one wishes to prevent subversion by drug cartels but honour (or appeal) court orders. This is the problem that public libraries have dealt with since their creation. Someone always wants to know what person X has been reading, in hopes of using it against them….

    Library software is normally written to preserve privacy, and discard the record that “X has book Y” when the book is returned. It can be written this way because several of the countries where it is sold require privacy as part of their legal system. Purchasers in other countries get privacy as a side-effect.

    Countries prohibiting privacy would require a special version for a quite limited market, and the library software companies aren’t motivated to deal with them: just doing an internationalization/localization to get into a small market is hard enough!

    When an individual library is served with a court order, they can honour it by doing a lookup once a day and writing X’s new books down on a piece of paper. As this doesn’t scale, and is also a credible cost, the willingness of courts to order it is reduced, and the damage to privacy is limited.

    Applying this to email, one wishes to keep routing data only until a message is delivered to the next host and we get a “250 OK” from SMTP. If a court wishes to collect that metadata, they can station an officer with a laptop at the ISP and gobble up the packets routed to/from him. This is onerous, and in Canada at least requires a “wiretap warrant”, which the courts restrict more than ordinary search warrants.

    The person wishing to provide this kind of information to a drug cartel has the same hard task, and is also more likely to be detected by the ISP.

    To oversimplify, we’re keeping far too much information about email: an author or vendor should take notice of the privacy laws of their preferred markets and discard debugging/diagnostic information at the end of a successful delivery. If they wish to cover themselves against customer complaints, they might send delivery notices that the customer can read or filter out at their convenience.

    –dave

    • Jim Doria says:

      Dave, your point about keeping too much information is good but misses the mark. The government agency does NOT have to have an officer with a laptop at the ISP. They can simply plug in a device at the ISP, capture ALL the data for as long as they like, and have a small army of analysts and their algorithms sift through it at their leisure.

      They can also order the ISP not to divulge to its clients that the device is installed.

      Your assertion that piecemeal collection of metadata “does not scale” is out of date at this point. They have scaled it.

      • Richard Cant says:

They have only been able to scale it because of the way in which these systems have been designed. I think that is Dave’s point.

If the device can only be inserted at a point where the data is encrypted (and the ISP doesn’t have the key), then all the analysts and algorithms in the world won’t help them.

  14. Alec Cawley says:

Analogy: timer locks on bank vaults and security vans, which prevent even the owner from accessing them except in the right circumstances. Because while there might be legitimate reasons for accessing them at other times, there are many more illegitimate ones.

  15. Jim G. says:

The problem is that it is unwise to trust any secure email service that resides within the USA. The solution is to go elsewhere.

  16. Nick P says:

@ Ed Felten

    Lavabit screwed up by (a) playing hardball with very powerful people and (b) designing their infrastructure to make it hard to do info gathering they should have seen coming. (A few companies actually have this in their threat model…) At one point, the judge wanted a solution from them and they had nada. See my post here for an analysis and one solution:

    https://www.schneier.com/blog/archives/2013/10/friday_squid_bl_394.html#c1828038

Far as your idea goes, it’s a decent one, but here’s the problem: the LAW says that they can do WHATEVER IT TAKES if we put up obstacles. You use a key, they demand the key. You use a dedicated machine in Iceland, they order you to modify it to send them the data. You use end-to-end in a Java applet (e.g. Hush), they order you to send a subverted applet. You make a system good enough that they have no obvious order, and they seize all your systems as “evidence,” throw your ass in jail for obstruction, and then start targeting users to reduce demand. The power is on their side, you are seen by the courts as resisting such authority deliberately, and you pay the price.

    There might be a technical solution to all this along the lines you suggested. I have posted a few designs over the years that might end up in one. Thing is, though, any American citizen or company trying to resist our ultrapowerful State will likely be ruined by its strong legal power over them. Best thing I can think of is have the service or code hosted in a foreign country on servers run by foreigners, using a development model/process good at detecting subversions in lifecycle, and not made with US hardware. People from US and elsewhere can contribute to the efforts, yet ultimate control of it must be out of US government’s reach. Else, all the technical security in the world probably won’t save you.

  17. Sabatini Monatesti says:

    That is why we designed RAHN/PAHISP to be PKI/CERT enabled at the desktop. We support encryption at rest and in transit, authentication and authorization. We believe this is the best opportunity for patients to protect their PHI.

  18. diligencer says:

    This is not a question of encryption. It is a question about cryptographic key management.

If we want to have a messaging service that is resistant to internal misuse, governmental eavesdropping, &c., then we need to keep the encryption “envelope” closed all the way from the sender to the recipient. No master keys, no decryption, no clear text on email servers; nothing like that should be allowed. It MUST be that the sender and the recipient are the only ones holding the public and private keys that can open the messages. This is already implemented and supported in S/MIME and several PGP implementations.

If you want to get even more paranoid, store the private keys of the sender and the recipient only on a token like a smart card. If necessary, destruction of the physical token would result in permanent data loss; no one would be able to open those messages ever again.

  19. stoatwblr says:

If the data is held in plaintext anywhere along the chain other than on the client machines, then it’s readable (and even there it should be encrypted).

    Apparently there’s a resurgence in PGP keysigning parties. I can’t understand why they ever went out of fashion.

  21. Ruf guerreschi says:

We are launching a large R&D project for the kind of procedures and technologies that would protect against any insider threats, including unconstitutional court orders: access to the server room is conditional on the physical approval of a jury of randomly selected users, who are even able to launch a “scorched earth” procedure:
    http://www.openmediacluster.com/en/user-verifiable-social-telematics-project/
    Help and suggestions welcome.
    Rufo