The EFF has posted a very nice piece (apparently written by Seth Schoen) on “trusted computing” systems. The piece makes two important contributions to the debate. First, it gives the best simple introduction to trusted computing technologies that I have seen. Second, it suggests “owner override,” a technological tweak that would largely eliminate the downside of trusted computing (i.e., our loss of control over our own computers), while preserving most of trusted computing’s security benefits.
Halderman Dissects New CD Copy Protection
Alex Halderman has published an interesting technical report analyzing the newest CD “copy protection” technology. Alex, who is a graduate student here in Princeton’s computer science department, also wrote the definitive paper on the previous generation of CD copy protection.
Alex’s paper explains how the SunnComm technology works and why it won’t help the record labels fight copyright infringement. Despite the usual claims by the vendor (SunnComm) that the technology provides “an incredible level of security for the music”, Alex found that it is quite weak.
This technology is going to end up in the hall of fame beside the previous Sony technology that was famously defeated by drawing on the CD with a felt-tipped pen. This time, the technology can be defeated completely by holding down the computer’s Shift key while inserting the CD.
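Why does something as low-tech as the Shift key work? Because the “protection” is just software shipped on the disc: Windows’ autorun feature launches it when the CD is inserted, and it installs a driver that interferes with ripping. Holding Shift suppresses autorun, so the driver never gets installed and the disc behaves like an ordinary audio CD. Here is a conceptual sketch of that sequence, in Python purely for illustration (the file contents and program name below are hypothetical, not copied from the disc):

```python
# Conceptual model (not Windows code) of why holding Shift defeats the scheme.
import configparser

def on_disc_inserted(disc_files: dict[str, str], shift_held: bool) -> str:
    """Mimic the autorun behavior the copy-protection scheme depends on."""
    if shift_held or "autorun.inf" not in disc_files:
        return "disc mounts as an ordinary audio CD; tracks rip normally"
    ini = configparser.ConfigParser()
    ini.read_string(disc_files["autorun.inf"])
    installer = ini["autorun"]["open"]  # the program autorun is told to launch
    return f"Windows launches {installer}, which installs the ripping-blocker driver"

# Hypothetical disc contents, for illustration only.
disc = {"autorun.inf": "[autorun]\nopen = ProtectionInstaller.exe\n"}
print(on_disc_inserted(disc, shift_held=False))
print(on_disc_inserted(disc, shift_held=True))
```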
Is this the end of the road for CD copy protection? It ought to be. At the very least, I hope people in the industry will learn to ask for proof before they believe the next DRM vendor peddling “an incredible level of security”.
"Hacktivism" by Artists
A debate has started over the suggestion by Harvard Law prof Charles Nesson that artists respond to file-sharing of their work with “hacktivism,” by launching targeted denial-of-service attacks on people who redistribute their work. The reaction in blogworld has been negative.
This is probably illegal, but Derek Slater writes that Prof. Nesson is looking for ways to “support its legality.” Perhaps he would resurrect the Berman-Coble bill, which died in Congress last year. That bill would have legalized such attacks, if carried out on behalf of copyright owners.
Discussion has focused on the short-term effects of allowing targeted DoS attacks, for example on the possibility of mistaken attacks on innocent people.
If we look instead at the long term, the picture becomes even clearer. I wrote about this in the written testimony I submitted last year to a House hearing on the Berman-Coble bill:
The designers of peer-to-peer software will not simply accept this situation, but will respond by modifying their software to thwart such targeted denial of service attacks. They might do this, for example, by eliminating the self-imposed limit on the number of connections the peer-to-peer program will accept. These countermeasures will start an “arms race” between copyright owners [or artists, in Nesson’s version] and peer-to-peer system designers, with copyright owners [or artists] devising new types of targeted denial of service attacks, and peer-to-peer designers revising their software to dodge these targeted attacks.
Computer security analysis can often predict the result of such technical arms races. For example, analysis of the arms race between virus writers and antivirus companies leads to the prediction that antivirus products will be able to cope almost perfectly with known virus strains but will be largely helpless against novel viruses. This is indeed what we observe.
A similar analysis can be applied to the arms race, under the Berman Bill’s rules [which presumably are similar to the rules Nesson would choose], between peer-to-peer authors and copyright owners. In my view, the peer-to-peer authors have a natural advantage in this arms race, and they will be able to stay a step ahead of the copyright owners. Copyright owners will be forced either to give up on the strategy of narrowly targeted denial of service attacks, or to escalate to a more severe form of denial of service, such as one that crashes the target computer or jams completely its Internet connection. I understand that these more severe attacks are currently illegal, and would not be legalized by the Berman Bill, so such an escalation would not be possible within the law even if the Berman Bill is enacted. I conclude that the Berman Bill as written is unlikely to do copyright holders much good in the end.
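To make this concrete, consider the countermeasure mentioned in the excerpt: dropping the self-imposed connection limit. If the targeted attack works by tying up a peer’s limited supply of connection slots, the defender’s fix is nearly free. The sketch below is a hypothetical toy node, not any real file-sharing client:

```python
import socket
import threading

MAX_PEERS = 50  # the self-imposed limit a connection-flooding attack exploits

def handle_peer(conn: socket.socket) -> None:
    """Placeholder: a real node would service search/upload requests here."""
    conn.close()

def serve(port: int, enforce_limit: bool = True) -> None:
    """Toy peer-to-peer node: accept incoming connections, one worker thread each."""
    workers: list[threading.Thread] = []
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("", port))
    listener.listen()
    while True:
        conn, _addr = listener.accept()
        workers = [t for t in workers if t.is_alive()]
        if enforce_limit and len(workers) >= MAX_PEERS:
            # An attacker holding MAX_PEERS idle connections locks honest peers out.
            conn.close()
            continue
        t = threading.Thread(target=handle_peer, args=(conn,), daemon=True)
        t.start()
        workers.append(t)

# The countermeasure is trivial: call serve(port, enforce_limit=False), or keep
# the cap but evict idle connections so the attacker must spend real bandwidth
# to hold a slot. Each such move invites a new attack, and the race is on.
```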
Derek Slater put it much more succinctly when he wrote that “A technological arms race can only have one result: going nuclear.”
Story Time (Cont.)
Several readers took issue with my previous post relating anti-infringement technology to anti-cancer technology. So let me clarify what I was and wasn’t trying to say.
First, I wasn’t saying that infringement is okay. It’s not. And I wasn’t trying to draw a moral equivalence between infringers and copyright owners. Remember: I analogized infringement to cancer.
Second, I wasn’t saying that we shouldn’t do anything about infringement. Certainly, some anti-infringement measures are worth trying.
Third, I wasn’t saying that it would be wrong to deploy an effective, side-effect-free anti-infringement technology, if such a thing actually existed.
What I was trying to do was to draw an analogy between anti-infringement technologies and anti-cancer technologies, and to point out that people think about these two technology problems very differently, and without good reason. Here are four examples of the difference:
(1) Many people in the policy debate just assume that there must be a technology available that can prevent infringement. Nobody makes such an assumption about cancer.
(2) Doctors who say “I don’t know how to cure cancer” are not accused of being pro-cancer. But software companies that say “I don’t know how to stop infringement” are accused of being pro-infringement.
(3) When a company claims to have a foolproof anti-infringement technology, its claim is often taken seriously, even if no evidence is presented to support it. But nobody would believe a claim that a drug can cure cancer, based only on unsupported assertions by a drug company vice president. Actual scientific evidence is required.
(4) Congress or the FDA wouldn’t dream of mandating the use of a particular cancer treatment (thereby banning other treatments), without independent testing of the proposed treatment and a lengthy and open discussion of how and whether it worked. Yet when it comes to infringement, mandating secret or poorly tested technologies is taken seriously as a policy option.
For some reason, the development of anti-infringement technology is treated as a political problem that can be solved by dealmaking or by decree.
Diebold Voting Machines "At High Risk of Compromise"
As expected, an independent study of the Diebold electronic voting machines purchased by the state of Maryland has found that “The system, as implemented in policy, procedure, and technology, is at high risk of compromise.” The study was commissioned by the state and performed by SAIC. A Washington Post story by Brigid Schulte reports that SAIC “found 328 security weaknesses, 26 of them critical”.
The report is available to the public only in heavily redacted form, which in itself does not inspire confidence. What is in the redacted version is bad enough; for example, it reports that the Diebold machines didn’t even bother to encrypt the vote totals before sending them to the Board of Elections.
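For perspective, encrypting and integrity-protecting a small payload like a vote tally is a routine exercise with off-the-shelf tools. Here is a minimal sketch of what “encrypt the totals before sending them” means, using Python’s cryptography package purely for illustration (nothing here reflects Diebold’s actual code, key management, or message formats):

```python
from cryptography.fernet import Fernet, InvalidToken

# In a real system the key would be provisioned per election and kept off the network.
key = Fernet.generate_key()
cipher = Fernet(key)

totals = b"precinct=0412;candidateA=1523;candidateB=1498"   # hypothetical tally
wire_message = cipher.encrypt(totals)   # what would travel to the Board of Elections

# The receiver both decrypts and verifies integrity: a tampered or forged
# message raises InvalidToken instead of silently yielding bogus totals.
try:
    assert cipher.decrypt(wire_message) == totals
except InvalidToken:
    print("message rejected: not produced with the expected key")
```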
Diebold, which had previously said we should trust their unspecified security mechanisms, now says that we should trust them to implement unspecified fixes for these problems.
In case you have any remaining confidence in unaudited electronic voting systems, consider this: a Diebold executive told the Washington Post that the fixes will be made to the Maryland machines, but not to the 33,000 Diebold electronic voting machines already in use outside of Maryland.