April 18, 2014


British Court Blocks Publication of Car Security Paper

Recently a British court ordered researchers to withdraw a paper, “Dismantling Megamos Security: Wirelessly Lockpicking a Vehicle Immobiliser,” from next week’s USENIX Security Symposium. This is a blow not only to academic freedom but also to progress in vehicle security. And for those of us who have worked in security for a long time, it brings back bad memories of past attempts to silence researchers, attempts that have touched many of us over the years.

The paper, by Flavio Garcia of the University of Birmingham and Roel Verdult and Baris Ege of Radboud University Nijmegen, would have discussed the operation and security of Megamos, a cryptography-based system used in most or all recent Volkswagen-made vehicles. Megamos wirelessly authenticates a key to the car, and vice versa, so that the car can be started only by an authorized key. Unfortunately, as the paper would have explained, Megamos has vulnerabilities that would allow an attacker to start the car without a legitimate key in some circumstances.
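For readers unfamiliar with the general pattern, the kind of mutual authentication an immobilizer performs can be sketched as a challenge-response exchange in which each side proves knowledge of a shared secret. The sketch below is purely illustrative: it uses HMAC-SHA256 and invented function names, not the proprietary Megamos cipher the paper analyzes, whose details are exactly what the court order concerns.

```python
import hmac
import hashlib
import secrets

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    """Compute a response proving knowledge of the shared secret.

    Illustrative only: a real immobilizer transponder uses a proprietary
    lightweight cipher, not HMAC-SHA256.
    """
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def mutual_authenticate(car_key: bytes, fob_key: bytes) -> bool:
    """Run a two-way challenge-response between car and key fob.

    Returns True only if both sides hold the same secret key.
    """
    # Step 1: the car challenges the fob with a fresh random nonce.
    car_challenge = secrets.token_bytes(16)
    fob_response = respond(fob_key, car_challenge)
    if not hmac.compare_digest(fob_response, respond(car_key, car_challenge)):
        return False

    # Step 2: the fob challenges the car, so a fake car cannot
    # simply harvest responses from a genuine key.
    fob_challenge = secrets.token_bytes(16)
    car_response = respond(car_key, fob_challenge)
    return hmac.compare_digest(car_response, respond(fob_key, fob_challenge))
```

The security of such a scheme rests entirely on the strength of the underlying cipher and the size of the key space; weaknesses there are the sort of flaw a case-study paper like this one would diagnose.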

There is a fallacy, typically more common among non-experts, that only “constructive” security research—that is, research that claims to describe a secure system—has value. In fact, case studies of vulnerabilities can be very valuable. Given that most security systems turn out to be vulnerable, it pays to understand in detail how and why sophisticated designers end up shipping vulnerable technologies—which is exactly what the Megamos paper was apparently trying to do.

This case has strong echoes of an incident in 2001, when the Recording Industry Association of America and some other entities threatened to sue my colleagues and me if we went ahead with publishing our case study of several copy protection technologies for compact discs. Under those threats, we withdrew the paper from its original venue and went to court to secure the right to publish. With help from the EFF, USENIX, and others, we were eventually able to publish our work at the 2001 USENIX Security Symposium.

The two cases are similar in many ways. Both involved a case study paper that described how a technology worked and why it was vulnerable. Both papers were fully peer reviewed and accepted for publication, and in both cases affected companies knew about the papers well in advance but acted only late in the game to try to block publication. We faced threats of a lawsuit, whereas the Megamos researchers were actually ordered by a court not to publish (pending further court proceedings). And in both cases the threatening companies seemed to be motivated mostly by a fear of embarrassment due to their poor engineering choices becoming public.

As usual, the attempt to avoid embarrassment will fail. By trying to block publication, the company is effectively admitting that it has something to hide and that the paper is correct in saying that Megamos is vulnerable. Of course, trying to block the paper will only draw more attention to the flawed technologies. But what the company might succeed in doing is to withhold from researchers and practitioners the main value of the paper, which is its diagnosis of exactly what went wrong and why, that is, the lessons it teaches for the future.

This is yet another example of the legal system’s apparent ambivalence about security research. We hear that digital insecurity is a major challenge facing society. But at the same time the law seems too eager to block or deter the very research and scholarly communication that can help us learn how to do better.

Comments

  1. Tobias D. Robison says:

    Well Said!

    The bad guys probably know about this vulnerability already, right? And Volkswagen can profit from security research inspired by this paper, if only it can be published.

  2. Matt says:

    You’re so right Ed.

In the UK we can report criminal offences anonymously by telephone; this allows indirect communication between those “in-the-know” and those working tirelessly to bring the guilty to justice.

    A year or two ago, I asked a senior House of Lords politician with responsibility for Information Security (in various roles), to consider a similar mechanism to facilitate anonymous reporting of insecurities and vulnerabilities.

    In both cases, information can flow without fear of reprisal, however close the reporter is to the fact.

    I was told that this was simply unnecessary. The reason given was that “market forces” would achieve the same aims… “those less-secure companies will be out-competed by the more secure”

Security researchers are definitely afraid of reporting their concerns, against a backdrop of rising online crime, and politicians and legislators seem only to heighten these fears with their confused thinking and mix of under- and over-reaction.

  3. John says:

    Interesting comparison!

Noticed a small typo: other stories refer to “Radboud University Nijmegen” instead of “Radboud University Niemegen”.

  4. Mike says:

    Although you link to the judgement, your analysis suggests that you have not read it fully. The researchers refused to remove a small part of the paper pending a court case on whether the disputed information was obtained lawfully (by reverse engineering of the system) or unlawfully (by a route they knew, or ought to have known, was theft of confidential information – similar to trade secret law in various US states).

    Indeed the judge’s summary starts:

    “If I thought the purpose of this injunction was to save Volkswagen’s embarrassment I would not hesitate to refuse it. The paper in its redacted form will not prevent the defendants from saying that they have, in fact, derived the Megamos Crypto algorithm and that there is, in fact, an attack based on its weakness. Moreover, relevant people, Thales, EM, Delphi and Volkswagen, now know what the problem is. They have a chance to do something about it.”

    And later:

    “I also note that the defendants refuse, in fact, to even redact Definition 3.8, as asked for by EM and Delphi at a meeting in June. I think the defendants’ mantra of “responsible disclosure” is no such thing. It is a self-justification by defendants for the conduct they have already decided to undertake and it is not the action of responsible academics.”

    The whole judgement is worth reading, and shows that this decision is based on the very specific facts of the case and behaviours of the actors involved, not a general attempt to prevent security research.

    • Matt says:

Mike, can you provide the link? :) I admit I hadn’t read the judgment, but the flavour of Ed’s article is surely still intact – despite the judge’s common-sense approach in this case, there is precedent here: Acts of law in the UK, and their subsequent application, are biased towards supporting the organisations and powers that stand to lose.
An irrelevant example, but nonetheless a comparison worth noting: the UK police were escorting and protecting a large oil company’s trucks as they made their way to begin disruptive operations in a sleepy local community.

      At what point do the Legislative, Law-enforcement and Political powers engage back with us, the people, the communities they serve and allow us to take part in the future? We are all a little scared of doing the right thing these days, and this is why some cross legal boundaries (Anonymous etc…) to achieve (what they consider to be) ethical outcomes.

    • Ed Felten says:

      Mike,

      I did read all of the judge’s decision document. I just interpret it differently.

      First: I have no basis for evaluating your assertion that the requested removal was a “small part” of the paper’s content. To the contrary, the claim that the harm could have been prevented by removing that part seems to indicate that it carries important meaning for readers.

      Second: I agree that the *judge* is not motivated by a desire to protect Volkswagen from embarrassment. But based on the evidence I see—and my own experience in similar cases—it appears to me that Volkswagen and others are trying to use the legal process to try to prevent embarrassing facts from coming to light.

Of course the legal action was not meant to stop security research in general. But it was unquestionably designed to prevent the dissemination of this specific research, which has value to the research community, as evidenced by its acceptance for publication in one of the most prestigious and competitive venues in the security field.

  5. Matt says:

    I see the link!