October 13, 2015


Classified material in the public domain: what’s a university to do?

Yesterday I posted some thoughts about Purdue University’s decision to destroy a video recording of my keynote address at its Dawn or Doom colloquium. The organizers had gone dark, and a promised public link was not forthcoming. After a couple of weeks of hoping to resolve the matter quietly, I did some digging and decided to write up what I learned. I posted on the web site of the Century Foundation, my main professional home:

It turns out that Purdue has wiped all copies of my video and slides from university servers, on grounds that I displayed classified documents briefly on screen. A breach report was filed with the university’s Research Information Assurance Officer, also known as the Site Security Officer, under the terms of Defense Department Operating Manual 5220.22-M. I am told that Purdue briefly considered, among other things, whether to destroy the projector I borrowed, lest contaminants remain.

I was, perhaps, naive, but pretty much all of that came as a real surprise.

Let’s rewind. Information Assurance? Site Security?

These are familiar terms elsewhere, but new to me in a university context. I learned that Purdue, like a number of its peers, has a “facility security clearance” to perform classified U.S. government research. The manual of regulations runs to 141 pages. (Its terms forbid uncleared trustees to ask about the work underway on their campus, but that’s a subject for another day.) The pertinent provision here, spelled out at length in a manual called “Classified Information Spillage,” requires “sanitization, physical removal, or destruction” of classified information discovered on unauthorized media.

Two things happened in rapid sequence around the time I told Purdue about my post.

First, the university broke a week-long silence and expressed a measure of regret:

UPDATE: Just after posting this item I received an email from Julie Rosa, who heads strategic communications for Purdue. She confirmed that Purdue wiped my video after consulting the Defense Security Service, but the university now believes it went too far.

“In an overreaction while attempting to comply with regulations, the video was ordered to be deleted instead of just blocking the piece of information in question. Just FYI: The conference organizers were not even aware that any of this had happened until well after the video was already gone.”

“I’m told we are attempting to recover the video, but I have not heard yet whether that is going to be possible. When I find out, I will let you know and we will, of course, provide a copy to you.”

Then Edward Snowden tweeted the link, and the Century Foundation’s web site melted down. It now redirects to Medium, where you can find the full story.

I have not heard back from Purdue today about recovery of the video. It is not clear to me how recovery is even possible if Purdue followed Pentagon guidelines for secure destruction. Moreover, although the university seems to suggest it could have posted most of the video, it does not promise to do so now. Most importantly, the best I can hope for here is that my remarks and slides will be made available in redacted form, with classified images removed and some of my central points therefore missing. There would be one version of the talk for the few hundred people who were in the room on Sept. 24, and for however many watched the live stream, and another, redacted version left as the only public record.

For our purposes here, the most notable questions have to do with academic freedom in the context of national security. How did a university come to “sanitize” a public lecture it had solicited, on the subject of NSA surveillance, from an author known to possess the Snowden documents? How could it profess to be shocked to find that spillage is going on at such a talk? The beginning of an answer came, I now see, in the question and answer period after my Purdue remarks. A post-doctoral research engineer stood up to ask whether the documents I had put on display were unclassified. “No,” I replied. “They’re classified still.” Eugene Spafford, a professor of computer science there, later attributed that concern to “junior security rangers” on the faculty and staff. But the display of Top Secret material, he said, “once noted, … is something that cannot be unnoted.”

Someone reported my answer to Purdue’s Research Information Assurance Officer, who reported in turn to Purdue’s representative at the Defense Security Service. By the terms of its Pentagon agreement, Purdue decided it was now obliged to wipe the video of my talk in its entirety. I regard this as a rather devout reading of the rules, which allowed Purdue to “realistically consider the potential harm that may result from compromise of spilled information.” The slides I showed had been viewed already by millions of people online. Even so, federal funding might be at stake for Purdue, and the notoriously vague terms of the Espionage Act hung over the decision. For most lawyers, “abundance of caution” would be the default choice. Certainly that kind of thinking is commonplace, and sometimes appropriate, in military and intelligence services.

But universities are not secret agencies. They cannot lightly wear the shackles of a National Industrial Security Program, as Purdue agreed to do. The values at their core, in principle and often in practice, are open inquiry and expression.

I do not claim I suffered any great harm when Purdue purged my remarks from its conference proceedings. I do not lack for publishers or public forums. But the next person whose talk is disappeared may have fewer resources.

More importantly, to my mind, Purdue has compromised its own independence and that of its students and faculty. It set an unhappy precedent, even if the people responsible thought they were merely following routine procedures.

One can criticize the university for its choices, and quite a few have since I published my post. What interests me is how nearly the results were foreordained once Purdue made itself eligible for Top Secret work.

Think of it as a classic case of mission creep. Purdue invited the secret-keepers of the Defense Security Service into one cloistered corner of campus (“a small but significant fraction” of research in certain fields, as the university counsel put it). The trustees accepted what may have seemed a limited burden, confined to the precincts of classified research.

Now the security apparatus claims jurisdiction over the campus (“facility”) at large. The university finds itself “sanitizing” a conference that has nothing to do with any government contract.

I am glad to see that Princeton takes the view that “[s]ecurity regulations and classification of information are at variance with the basic objectives of a University.” It does not permit faculty members to do classified work on campus, which avoids Purdue’s “facility” problem. And even so, at Princeton and elsewhere, there may be an undercurrent of self-censorship and informal restraint against the use of documents derived from unauthorized leaks.

Two of my best students nearly dropped a course I taught a few years back, called “Secrecy, Accountability and the National Security State,” when they learned the syllabus would include documents from Wikileaks. Both had security clearances, for summer jobs, and feared losing them. I told them I would put the documents on Blackboard, so they need not visit the Wikileaks site itself, but the readings were mandatory. Both, to their credit, stayed in the course. They did so against the advice of some of their mentors, including faculty members. The advice was purely practical. The U.S. government will not give a clear answer when asked whether this sort of exposure to published secrets will harm job prospects or future security clearances. Why take the risk?

Every student and scholar must decide for him- or herself, but I think universities should push back harder, and perhaps in concert. There is a treasure trove of primary documents in the archives made available by Snowden and Chelsea Manning. The government may wish otherwise, but that information is irretrievably in the public domain. Should a faculty member ignore the Snowden documents when designing a course on network security architecture? Should a student write a dissertation on modern U.S.-Saudi relations without consulting the numerous diplomatic cables on Wikileaks? To me, those would be abdications of the basic duty to seek out authoritative sources of knowledge, wherever they reside.

I would be interested to learn how others have grappled with these questions. I expect to write about them in my forthcoming book on surveillance, privacy and secrecy.


Threshold signatures for Bitcoin wallets are finally here

Today we are pleased to release our paper presenting a new ECDSA threshold signature scheme that is particularly well-suited for securing Bitcoin wallets. We teamed up with cryptographer Rosario Gennaro to build this scheme. Threshold signatures can be thought of as “stealth multi-signatures.” [Read more…]
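
The paper spells out the actual protocol; as a rough intuition for the “t-of-n” property that any threshold scheme provides, here is a minimal sketch using Shamir secret sharing over a prime field. This is an illustration of the concept only, not the ECDSA scheme from the paper, and the modulus is chosen for readability rather than security.

```python
# A minimal sketch of the t-of-n idea behind threshold signatures, using
# Shamir secret sharing over a prime field. NOT the paper's ECDSA protocol.
import random

P = 2**127 - 1  # a Mersenne prime, used here as an illustrative field

def split(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, t=2, n=3)
assert reconstruct(shares[:2]) == 123456789   # any 2 of 3 shares suffice
```

The point of the construction: fewer than t shares reveal nothing about the key, so no single compromised machine can spend from the wallet.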


Consensus in Bitcoin: One system, many models

At a technical level, the Bitcoin protocol is a clever solution to the consensus problem in computer science. The idea of consensus is very general — a number of participants together execute a computation to come to agreement about the state of the world, or a subset of it that they’re interested in.

Because of this generality, there are different methods for analyzing and proving things about such consensus protocols, coming from different areas of applied math and computer science. These methods use different languages and terminology and embody different assumptions and views. As a result, they’re not always consistent with each other. This is a recipe for confusion; often people disagree because they’ve implicitly assumed one world-view or another. In this post I’ll explain the two main sets of models that are used to analyze the security of consensus in Bitcoin.
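
As a concrete anchor before getting into the models, here is a minimal sketch of one mechanism every analysis has to capture: the fork-choice rule, under which each node independently adopts the chain tip with the most accumulated work. The names are illustrative, not Bitcoin Core’s data structures.

```python
# A minimal sketch of Bitcoin's fork-choice rule: each node independently
# adopts the chain tip with the most accumulated work, so honest nodes
# converge on a shared history. Names are illustrative, not Bitcoin Core's.

class Block:
    def __init__(self, parent, work):
        self.parent = parent        # previous block, or None for genesis
        self.work = work            # proof-of-work this block contributes
        self.total_work = work + (parent.total_work if parent else 0)

def best_tip(tips):
    """Among competing chain tips, pick the one with the most total work."""
    return max(tips, key=lambda b: b.total_work)

genesis = Block(None, 1)
a = Block(genesis, 1)               # one side of a fork
b = Block(Block(genesis, 1), 1)     # a heavier competing branch
assert best_tip([a, b]) is b        # every honest node picks the same tip
```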

[Read more…]


Expert Panel Report: A New Governance Model for Communications Security?

Today, the vulnerable state of electronic communications security dominates headlines across the globe, while surveillance, money and power increasingly permeate the ‘cybersecurity’ policy arena. With the stakes so high, how should communications security be regulated? Deirdre Mulligan (UC Berkeley), Ashkan Soltani (independent, Washington Post), Ian Brown (Oxford) and Michel van Eeten (TU Delft) weighed in on this proposition at an expert panel on my doctoral project at the Amsterdam Information Influx conference. [Read more…]


“Loopholes for Circumventing the Constitution”, the NSA Statement, and Our Response

CBS News and a host of other outlets have covered my new paper with Sharon Goldberg, Loopholes for Circumventing the Constitution: Warrantless Bulk Surveillance on Americans by Collecting Network Traffic Abroad. We’ll present the paper on July 18 at HotPETS [slides, pdf], right after a keynote by Bill Binney (the NSA whistleblower), and at TPRC in September. Meanwhile, the NSA has responded to our paper in a clever way that avoids addressing what our paper is actually about. [Read more…]


Why King George III Can Encrypt

[This is a guest post by Wenley Tong, Sebastian Gold, Samuel Gichohi, Mihai Roman, and Jonathan Frankle, undergraduates in the Privacy Technologies seminar that I offered for the second time in Spring 2014. They did an excellent class project on the usability of email encryption.]

PGP and similar email encryption standards have existed since the early 1990s, yet even in the age of NSA surveillance and ubiquitous data-privacy concerns, we continue to send email in plain text. Researchers have attributed this apparent gaping hole in our security infrastructure to a deceptively simple source: usability. Email encryption, although cryptographically straightforward, appears too complicated for laypeople to understand. In our project, we aimed to understand why this problem has eluded researchers for well over a decade and to expand the design space of possible solutions to this and similar challenges at the intersection of security and usability.
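
Their project explored new designs; as a baseline, it helps to see how much ceremony the status quo demands even when scripted. Here is a hedged sketch with the python-gnupg library (the names, paths, and addresses are made up) of the minimum steps PGP requires before one message is protected. Each step is a place where lay users are known to stumble.

```python
# A hedged sketch (python-gnupg; names, paths, and addresses are made up)
# of the minimum ceremony PGP demands before one message is protected.
import gnupg

gpg = gnupg.GPG(gnupghome="/home/alice/.gnupg")

# 1. Generate a keypair, protected by a passphrase you must never lose.
key = gpg.gen_key(gpg.gen_key_input(
    name_email="alice@example.com", passphrase="correct horse"))

# 2. Obtain and import your correspondent's public key out of band...
gpg.import_keys(open("bob_pubkey.asc").read())
# 3. ...and verify its fingerprint, or the encryption is only as strong
#    as your trust in whoever handed you the key.

# 4. Only now can you encrypt, and only to people whose keys you hold.
#    (always_trust skips the web-of-trust check, itself another hurdle.)
ciphertext = gpg.encrypt("meet at noon", "bob@example.com", always_trust=True)
print(str(ciphertext))
```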

[Read more…]


Will Greenwald’s New Book Reveal How to Conduct Warrantless Bulk Surveillance on Americans from Abroad?

Tomorrow, Glenn Greenwald’s highly anticipated book ‘No Place to Hide’ goes on sale. Apart from personal accounts of working with whistleblower Edward Snowden in Hong Kong and elsewhere, Mr. Greenwald has announced that he will reveal new surveillance operations by Western intelligence agencies. In recent weeks, Sharon Goldberg and I have been finishing a paper on Executive Order 12333 (“EO 12333”). We argue that EO 12333 creates legal loopholes for U.S. authorities to circumvent the U.S. Constitution and conduct largely unchecked and unrestrained bulk surveillance of American communications from abroad. In addition, we present several known and new technical means of exploiting those legal loopholes. Today, we publish a summary of our new paper in this post.

We stress that we’re not in a position to suggest that U.S. authorities are actually structurally circumventing the Constitution using the international loophole we discuss in the paper.  But, we’re wondering: will the gist of our analysis be part of Greenwald’s new revelations tomorrow? A first snippet of Greenwald’s new book in The Guardian, about hacking American routers destined for use overseas, seems to point in that direction. Here’s our summary. [Read more…]


Bitcoin hacks and thefts: The underlying reason

Emin Gün Sirer has a fascinating post about how the use of NoSQL caused the technical failures behind the recent thefts at the Bitcoin exchanges Flexcoin and Poloniex. But these are only the latest in a long line of hacks of exchanges, other services, and individuals; a wide variety of bugs have been implicated. This suggests that there’s some underlying reason why Bitcoiners keep building systems that get exploited. In this post I’ll examine why.
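
To make the bug class concrete: the failures Sirer describes come down to a missing atomicity guarantee around the balance check. Here is a hedged sketch (illustrative Python, not the exchanges’ actual code) of the check-then-debit race, and the shape of the fix.

```python
# A hedged sketch of the bug class (illustrative, not the exchanges' code):
# a non-atomic check-then-debit lets two concurrent withdrawals both pass
# the balance check before either debit is recorded.
import threading

balance = 100
lock = threading.Lock()

def withdraw_buggy(amount):
    global balance
    if balance >= amount:      # check...
        # a context switch here lets a second request pass the same check
        balance -= amount      # ...then debit: two separate steps

def withdraw_safe(amount):
    global balance
    with lock:                 # check and debit as one atomic step
        if balance >= amount:
            balance -= amount

# In a production system the "lock" is a database transaction or a
# conditional update (e.g. UPDATE ... SET balance = balance - x
# WHERE balance >= x), the kind of guarantee the NoSQL stores in
# question did not provide out of the box.
```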

[Read more…]


New Research: Cheating on Exams with Smartwatches

A Belgian university recently banned all watches from exams due to the possibility of smartwatches being used to cheat. Similarly, some standardized tests in the U.S., like the GRE, have banned all digital watches. These policies seem prudent, since today’s smartwatches could be used to smuggle in notes or even access websites during the test. However, their potential use for cheating goes much further than that.

As part of my undergrad research at the University of Michigan, I’ve recently been focusing on the security and privacy implications of wearable devices, including how smartwatches might be used to cheat on an exam. Surprisingly, while there’s been interest in the security implications of wearables, the research community has focused on how these devices might be attacked rather than on how they challenge existing social assumptions.

[Read more…]


Your TV is spying on you, and what you can do about it

A UK observer with a packet sniffer recently noticed that his LG “smart” TV was sending all his viewing habits back to an LG server, including filenames from an external USB disk. Add this atop earlier observations that Samsung’s 2012-era “smart” TVs were riddled with security holes. (No word yet on the 2013 edition.)

What’s going on here? Mostly it’s just incompetence. Somebody thought it was a good idea to build these TVs with all these features, and nobody ever said “maybe we need some security people on the design team to make sure we don’t have a problem”, much less “maybe all this data flowing from the TV to us constitutes a massive violation of our customers’ privacy that will land us in legal hot water.” The deeper issue is that it’s relatively easy to build something that works, but significantly harder to build something that’s secure and respects privacy.
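
If you want to run the same kind of check the UK observer did, a packet sniffer on your own network will show you exactly what a device phones home. Here is a minimal sketch with the scapy library; the interface name and the TV’s address are assumptions you would substitute with your own.

```python
# A minimal sketch (scapy library; run as root) of the kind of check the
# UK observer ran: watch what your TV sends in the clear. The interface
# name and the TV's LAN address below are assumptions; use your own.
from scapy.all import TCP, Raw, sniff

TV_IP = "192.168.1.50"   # your TV's address on the LAN (assumption)

def show_payload(pkt):
    # Print the first bytes of any unencrypted TCP payload the TV emits.
    if pkt.haslayer(TCP) and pkt.haslayer(Raw):
        print(pkt[Raw].load[:200])

# BPF filter: capture only traffic originating from the TV.
sniff(filter="src host %s and tcp" % TV_IP, prn=show_payload,
      store=0, iface="eth0")
```
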
[Read more…]