The other day, I was re-reading the 2008 Liberty v. the United Kingdom ruling of the European Court of Human Rights (‘ECHR’). The case reads like any BREAKING / REVEALED news report on Edward Snowden’s disclosures, and it will play a crucial role in the court cases currently pending in Europe on the legality of the surveillance programs. Liberty is also great material for comparing surveillance jurisprudence across the Atlantic.
An article in The Register explains what happened in the August 1, 2012 Wall Street glitch that cost Knight Capital $440M, resulted in a $12M fine, and nearly bankrupted the firm (forcing it to merge with someone else). In short, there were 8 servers that handled trades; 7 of them were correctly upgraded with new software, but the 8th was not. A particular type of transaction triggered the updated code, which worked properly on the upgraded servers. On the non-upgraded server, the same transaction triggered an obsolete piece of software, which behaved altogether differently. The result was large numbers of incorrect “buy” transactions.
The bottom line is that the failure was caused by a lack of careful procedures for deploying the software, coupled with a poor design choice that allowed a new feature to reuse a previously used, now-obsolete option. As a result, the trigger (instead of being ignored or causing an error) produced an unanticipated result.
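The failure mode can be sketched in a few lines. This is an illustrative reconstruction, not Knight's actual code, and the names (the repurposed flag and the two code paths) are assumptions for the sake of the example; the point is simply that the same flag means different things to old and new code:

```python
# Hypothetical sketch: a new feature reuses an obsolete order flag, so a
# server still running the old code interprets that flag with its old,
# dangerous semantics. Flag and feature names are illustrative.

def handle_order(order: dict, server_upgraded: bool) -> str:
    """Route an order according to its flags."""
    if order.get("legacy_flag"):
        if server_upgraded:
            # New code: the repurposed flag routes to the new feature.
            return "route via new feature"
        # Old code: the obsolete logic behind this flag keeps issuing
        # child orders, producing the runaway "buy" transactions.
        return "obsolete logic: keep buying"
    return "normal routing"

# Seven upgraded servers behave correctly; the eighth does not.
print(handle_order({"legacy_flag": True}, server_upgraded=True))
print(handle_order({"legacy_flag": True}, server_upgraded=False))
```

Had the flag simply been retired (ignored, or rejected as an error), the un-upgraded server would have failed loudly instead of trading incorrectly.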
So what does this have to do with voting? [Read more…]
If you talk about ‘metadata’, ‘big data’ and ‘Big Brother’ just as easily as you order a pizza, ethnography and anthropology are probably not your first points of reference. But the outcome of a recent encounter between ethnographer Tom Boellstorff and Edward Snowden (not IRL but IRP) is that tech policy wonks and researchers should be careful with their day-to-day vocabulary, as concepts carry the politics of control and power.
Earlier this week, Felten made the observation that the government eavesdropping on Lavabit could be considered an insider attack against Lavabit users. This leads to the obvious question: how might we design an email system that’s resistant to such an attack? The sad answer is that we’ve had this technology for decades but it never took off. Phil Zimmermann put out PGP in 1991. S/MIME-based PKI email encryption was widely supported by the late 1990s. So why didn’t it become ubiquitous?
The main takeaway of two recent disclosures around N.S.A. surveillance practices is that Americans must re-think ‘U.S. citizenship’ as the guiding legal principle for protecting against untargeted surveillance of their communications. Currently, U.S. citizens may take some comfort from the usual political discourse that ‘ordinary Americans’ are protected and this is all about foreigners. In this post, I’ll argue that this is not the case, that the legal backdoor of U.S. citizenship is real, and that relying on U.S. citizenship for protection is not in America’s interests. As a new CITP Fellow and a first-time contributor to this amazing blog, I’ll introduce myself and my research interests along the way. [Read more…]
Commentators on the Lavabit case, including the judge himself, have criticized Lavabit for designing its system in a way that resisted court-ordered access to user data. They ask: If court orders are legitimate, why should we allow engineers to design services that protect users against court-ordered access?
The answer is simple but subtle: There are good reasons to protect against insider attacks, and a court order is an insider attack.
The saga of Lavabit, the now-closed “secure” mail provider, is an interesting object of study. They’re in the process of appealing a court order to produce their SSL private keys, with which a government eavesdropper would then have access to all traffic going in and out of Lavabit. You can read Lavabit’s appeal brief and a general summary of their legal situation. What jumps out is that Lavabit tried to propose an alternative: giving access exclusively to metadata from the target of the investigation. Lavabit’s proposal:
- The government would pay $3500 for Lavabit’s development costs and operations
- Lavabit would provide a variety of email headers for the target of the investigation, notably excluding the subject line
- This surveillance data would be sent in daily batches to the government
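The kind of extraction Lavabit proposed is straightforward to implement. The sketch below is purely hypothetical: the header list here is an assumption for illustration, not the list Lavabit actually offered. It only shows how routing metadata can be separated from content such as the subject line:

```python
# Hypothetical sketch of metadata-only extraction: collect routing
# headers from a message while deliberately omitting the Subject line.
from email import message_from_string

# Assumed header list, for illustration only.
HEADERS_OF_INTEREST = ["From", "To", "Cc", "Date", "Message-ID"]

def extract_metadata(raw_message: str) -> dict:
    msg = message_from_string(raw_message)
    # Keep only the agreed-upon headers; Subject is never included.
    return {h: msg[h] for h in HEADERS_OF_INTEREST if msg[h] is not None}

raw = (
    "From: alice@example.com\n"
    "To: bob@example.com\n"
    "Subject: do not disclose\n"
    "Date: Mon, 07 Oct 2013 12:00:00 +0000\n"
    "\n"
    "message body\n"
)
print(extract_metadata(raw))
```

Records like these, batched daily, would give investigators the communication graph of the target without exposing message content.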
It appears that the government wasn’t interested in negotiating this, instead going for the whole enchilada, which then led Lavabit to pull the plug on its service. The question I want to pursue is how this whole situation could have happened in a way that would have satisfied the government’s investigative needs without its flagrant violation of the Fourth Amendment prohibition against unreasonable search and seizure. Consider whether Lavabit might have adopted Google’s legal procedures. Google clearly spells out what they’re willing to divulge with a subpoena, court order, or warrant (and nicely defines each of those terms). In Google’s process, the government brings a written search warrant, Google’s legal team reviews it, and then they provide access to the targeted account, providing notice to the affected user when they’re allowed to. Seems reasonable, right?
If all the government needed was real-time traces of specific subjects, that would seem to be a reasonable point of negotiation between them and Lavabit. For the right price, Lavabit could certainly have engineered a solution to their needs. It appears that there wasn’t any serious attempt at negotiation. The government wanted much more than this, creating the dispute. (The Guardian claims that the government also wasn’t willing to pay $3500, calling it unreasonable. It’s hard to stomach that claim, given all the other expenses involved in a major criminal investigation.)
Lavabit used SSL to protect data in transit, and other crypto derived from the user’s password to protect data on their hard drives. But when the user logs in, the key material must be available in order to present the data to the user. While users might be able to use stronger cryptographic means to protect their data against legal warrants (e.g., using Thunderbird with the Enigmail OpenPGP plugin), ultimately the lesson of Lavabit is that technology alone cannot solve a legal problem. A future Lavabit needs to have its legal processes sorted out in advance, making reasonable promises to its users and making reasonable access available to the government. Likewise, it’s time for Congress to establish some clear limits on government surveillance to prevent unreasonable search and collection practices in the future.
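The password-derived design can be sketched with standard key derivation. This is a minimal illustration of the general technique, not Lavabit's actual scheme, and the parameters (salt size, iteration count) are assumed example values:

```python
# Sketch of password-derived key material: the encryption key exists only
# as a function of the user's password, but the server necessarily holds
# it whenever the user logs in. Parameters here are illustrative.
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256; the iteration count is an assumed example value.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)

# The same password and salt always reproduce the key. That is exactly
# why the server can decrypt mail at login -- and why an insider (or a
# court order compelling one) who observes a login session can too.
assert key == derive_key("correct horse battery staple", salt)
assert key != derive_key("wrong password", salt)
```

The design choice to illustrate: deriving keys from the password keeps data opaque while the user is logged out, but any party who controls the server during a login session sees everything, which is the insider-attack window discussed above.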
Josh wrote recently about a serious security bug that appeared in Debian Linux back in 2006, and whether it was really a backdoor inserted by the NSA. (He concluded that it probably was not.)
Today I want to write about another incident, in 2003, in which someone tried to backdoor the Linux kernel. This one was definitely an attempt to insert a backdoor. But we don’t know who it was that made the attempt—and we probably never will.