November 26, 2024

Revisionism

Seth Finkelstein points to a rather sloppy analysis by Peter Davies of the Felten v. Recording Industry lawsuit. There is enough of this sort of thing going around that I feel compelled to rebut it.

[Background on the lawsuit: In 2001, recording industry organizations threatened to sue me and seven of my colleagues if we published a paper we had written that discussed certain technology. They argued that publishing the paper would violate the Digital Millennium Copyright Act. We filed a lawsuit, asking the court to rule on the question of whether our publication of the paper would be legal.]

For starters, Davies gets basic facts wrong. He says that the International Information Hiding Workshop, at which we wanted to publish our paper, was organized by the recording industry. In fact, it was an independent, refereed scientific conference.

Amazingly, Davies also misstates the final resolution of our case, saying that “[t]he case was settled in the end without a result.” In fact, no settlement was agreed to by the parties. After we filed our lawsuit, the recording industry parties conceded our right to publish our paper, which was the main result we sought. Once we had the right to publish the paper, our constitutional challenge to the DMCA was dismissed as moot.

Davies appears to think that we should just have gone ahead with publishing our paper, daring the recording industry to sue us. Seth Finkelstein rightly criticizes him for this.

To people like Davies, the Felten case is just an abstract topic for speculation. Let me assure you that cases like this look very different if you are Felten (or any of the other would-be defendants: Bede Liu, Scott Craver, Min Wu, Dan Wallach, Ben Swartzlander, Adam Stubblefield, and Drew Dean).

I am happy to admit that if we had gone ahead and published the paper without filing our lawsuit, the odds were only 50/50 that we would have been sued, and that, if sued, we probably would have won.

Probably, I would have kept my house.

Probably, I would have kept my job.

When it’s not your house on the line, when it’s not your job, then probably may be enough. To people like Davies, who had nothing personally at risk, a lawsuit would have been no more than a scholarly conversation piece.

For me and my colleagues, probably wasn’t enough. Even a 99% chance of getting to keep our houses and savings wasn’t enough. Nor should it be. I am still outraged when people like Davies suggest that it’s not a problem if researchers have to put so much at risk just to write or speak on certain topics of public interest.

Bring on the Subpoena-Bots!

A few years ago I was summoned for jury duty. The summons was an old-fashioned computer-printed document spit out by an IBM mainframe computer down at the county courthouse. Procedural rules required that prospective jurors be chosen by an officer of the court, so a judge had apparently deputized the mainframe as an officer of the court. For some reason I found this concept, of a computer as deputized legal officer, endlessly amusing.

Now the same concept is being applied at the Federal level. But in this case the computer isn’t even owned and run by the court. It’s run by the recording industry.

The recording industry, you see, is barraging the Federal courts with requests for subpoenas to compel Internet Service Providers to identify their customers who are alleged to be offering copyrighted music for download. Seth Schoen has read many of these subpoenas and he reports that “they’re obviously generated by a script”, that is, by a computer program.
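Schoen's observation is plausible: filling a form document from a list of alleged infringers takes only a few lines of code. Here is a minimal sketch of such a "subpoena-bot" (the field names and wording are my own illustration, not the language of any actual subpoena):

```python
from string import Template

# Hypothetical form text -- illustrative only, not real subpoena language.
SUBPOENA_FORM = Template(
    "TO: $isp\n"
    "You are commanded to identify the subscriber who was assigned\n"
    "IP address $ip on $date and is alleged to have offered the\n"
    "copyrighted work '$work' for download."
)

def generate_subpoenas(targets):
    """Fill in the form once per alleged infringer -- the whole 'bot'."""
    return [SUBPOENA_FORM.substitute(t) for t in targets]

docs = generate_subpoenas([
    {"isp": "Example ISP", "ip": "192.0.2.17",
     "date": "2003-07-01", "work": "Some Song"},
])
print(docs[0])
```

The point of the sketch is how little effort each additional subpoena request costs once the template exists, which is exactly why a low standard of review invites automation.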

Congress created the special subpoena provision that the RIAA is using here, a provision that requires the court to rubber-stamp any subpoena request made by a copyright holder who claims to have a good-faith belief that its copyrights are being infringed. Given this relatively low standard for issuance of a subpoena, the advent of subpoena-bots should come as no surprise.

Of course, big copyright owners aren’t the only people allowed to use subpoena-bots. Virtually everything that anybody writes is copyrighted, so this subpoena power is available to every writer or artist, even down to the humblest newbie blogger. Want to know who that anonymous critic is? No problem; send your subpoena-bots after them.

Voting Machine Insecurity

Recently, researchers at Johns Hopkins and Rice Universities reported serious security flaws in electronic voting technology sold by Diebold. I haven’t yet had a chance to read the paper carefully, but I know all of the authors and I would be very surprised if they are wrong. Eric Rescorla discusses the paper and Diebold’s response.

This story follows a common pattern, in which a company claims that its secret technology is secure, only to have the security claim collapse when the system’s design finally does become known. This happens so often that security experts now routinely discount security claims that have not been subject to public scrutiny.

The researchers’ results should not be taken as evidence that Diebold machines are less secure than other secret systems. Most likely, all of the secret systems suffer from a similar level of problems. If Diebold fixes the reported problems, then Diebold’s systems will probably be more secure than their competitors’.

This effect is what makes legislation like H.R. 2239 so important. Secrecy makes it difficult for vendors to differentiate their products based on security, since a buyer cannot tell a secure product from an insecure one. Opening the systems up for inspection allows vendors to compete based on security, and that competition helps everybody.

Conflict of Interest

Several readers have asked about the big project that has kept me from blogging much this summer. The “project” involved expert witness testimony in a lawsuit, Eolas Technologies and University of California v. Microsoft. I testified as an expert witness, called by the plaintiffs. (The case is ongoing.)

In some alternative universe, this lawsuit and my work on it would have provided fodder for many interesting blog posts. But, as so often happens here in this universe, I can’t really talk or write about most of it.

It’s depressing how often this kind of thing happens, with direct knowledge of a topic serving to disqualify somebody from talking about it. Many conflict of interest rules seem to have this effect, locking out of a discussion precisely those people who know the topic best.

The same thing often happens in discussions with the press, where people who are connected to an issue have to speak especially carefully, because their words might be attributed indirectly to one of the participants. The result can be that those unconnected to the events get most of the ink.

Now I understand why these rules and practices exist; and in most cases I agree that they are good policy. I understand why I cannot talk about what I have learned on various topics. Still, it’s frustrating to imagine how much richer our public discourse could be if everybody were free to bring their full knowledge and understanding to the table.

[I remember an interesting old blog post on a related topic from Lyn Millett over at uncorked.org; but I couldn’t find her post when I was writing this one.]

Here We Go Again

Rep. John Conyers has introduced the Author, Consumer, and Computer Owner Protection and Security (ACCOPS) Act of 2003 in the House of Representatives.

The oddest provision of the bill is this one:

(a) Whoever knowingly offers enabling software for download over the Internet and does not–

(1) clearly and conspicuously warn any person downloading that software, before it is downloaded, that it is enabling software and could create a security and privacy risk for the user’s computer; and

(2) obtain that person’s prior consent to the download after that warning;

shall be fined under this title or imprisoned not more than 6 months, or both.

(b) As used in this section, the term ‘enabling software’ means software that, when installed on the user’s computer, enables 3rd parties to store data on that computer, or use that computer to search other computers’ contents over the Internet.

As so often happens in these sorts of bills, the definition has unexpected consequences. For example, it would apparently categorize Microsoft Windows as “enabling software,” since Windows offers both file server facilities and network search facilities. But the original Napster client, lacking upload and search facilities, would not be “enabling software.”
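The oddity comes from the statutory test itself, which is a simple disjunction: either capability alone triggers the definition. A sketch of the test as a predicate (my own encoding of the bill's language, not anything that appears in the bill):

```python
def is_enabling_software(lets_third_parties_store_data: bool,
                         searches_other_computers: bool) -> bool:
    """Subsection (b): software qualifies if it enables EITHER capability."""
    return lets_third_parties_store_data or searches_other_computers

# Windows ships file-server and network-search facilities: covered.
assert is_enabling_software(True, True)

# A client with no upload and no search facility: not covered,
# even if it exists only to download infringing files.
assert not is_enabling_software(False, False)
```

Encoding the definition this way makes the over- and under-inclusiveness easy to see: general-purpose operating systems fall inside the line, while some single-purpose downloading tools fall outside it.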

Note also that the mandated security and privacy warnings would be misleading. After all, there is no reason why file storage or search services are inherently riskier than other network software. Misleading warnings impose a real cost, since they dilute users’ trust in any legitimate warnings they see.

The general approach of this bill, which we also saw in the Hollings CBDTPA, is to impose regulation on Bad Technologies. This approach will be a big success, once we work out the right definition for Bad Technologies.

Imagine the simplification we could achieve by applying this same principle to other areas of the law. For example, the entire criminal law can be reduced to a ban on Bad Acts, once we work out the appropriate definition for that term. Campaign finance law would be reduced to a ban on Corrupting Financial Transactions (with an appropriate exception for Constructive Debate).