Archives for July 2003

Conflict of Interest

Several readers have asked about the big project that has kept me from blogging much this summer. The “project” was a lawsuit, Eolas Technologies and University of California v. Microsoft, in which I testified as an expert witness called by the plaintiffs. (The case is ongoing.)

In some alternative universe, this lawsuit and my work on it would have provided fodder for many interesting blog posts. But, as so often happens here in this universe, I can’t really talk or write about most of it.

It’s depressing how often this kind of thing happens, with direct knowledge of a topic serving to disqualify somebody from talking about it. Many conflict of interest rules seem to have this effect, locking out of a discussion precisely those people who know the topic best.

The same thing often happens in discussions with the press, where people who are connected to an issue have to speak especially carefully, because their words might be attributed indirectly to one of the participants. The result can be that those unconnected to the events get most of the ink.

Now I understand why these rules and practices exist; and in most cases I agree that they are good policy. I understand why I cannot talk about what I have learned on various topics. Still, it’s frustrating to imagine how much richer our public discourse could be if everybody were free to bring their full knowledge and understanding to the table.

[I remember an interesting old blog post on a related topic from Lyn Millett over at uncorked.org; but I couldn’t find her post when I was writing this one.]

Here We Go Again

Rep. John Conyers has introduced the Author, Consumer, and Computer Owner Protection and Security (ACCOPS) Act of 2003 in the House of Representatives.

The oddest provision of the bill is this one:

(a) Whoever knowingly offers enabling software for download over the Internet and does not–

(1) clearly and conspicuously warn any person downloading that software, before it is downloaded, that it is enabling software and could create a security and privacy risk for the user’s computer; and

(2) obtain that person’s prior consent to the download after that warning;

shall be fined under this title or imprisoned not more than 6 months, or both.

(b) As used in this section, the term ‘enabling software’ means software that, when installed on the user’s computer, enables 3rd parties to store data on that computer, or use that computer to search other computers’ contents over the Internet.

As so often happens in these sorts of bills, the definition has unexpected consequences. For example, it would apparently categorize Microsoft Windows as “enabling software,” since Windows offers both file server facilities and network search facilities. But the original Napster client, which offered neither third-party storage nor network search facilities, would not be “enabling software.”
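
To make the definitional point concrete, here is a toy Python sketch (my own illustration, not anything from the bill or from this post; the function name and boolean inputs are invented) encoding the two-pronged definition and applying it to the two examples above:

    def is_enabling_software(lets_third_parties_store_data, searches_other_computers):
        # ACCOPS section (b): either capability alone is enough to qualify.
        return lets_third_parties_store_data or searches_other_computers

    # Windows ships file-server and network-search facilities, so it qualifies.
    print(is_enabling_software(True, True))    # True

    # The original Napster client, as described above, offered neither.
    print(is_enabling_software(False, False))  # False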

Note also that the mandated security and privacy warnings would be misleading. After all, there is no reason why file storage or search services are inherently riskier than other network software. Misleading warnings impose a real cost, since they dilute users’ trust in any legitimate warnings they see.

The general approach of this bill, which we also saw in the Hollings CBDTPA, is to impose regulation on Bad Technologies. This approach will be a big success, once we work out the right definition for Bad Technologies.

Imagine the simplification we could achieve by applying this same principle to other areas of the law. For example, the entire criminal law can be reduced to a ban on Bad Acts, once we work out the appropriate definition for that term. Campaign finance law would be reduced to a ban on Corrupting Financial Transactions (with an appropriate exception for Constructive Debate).

Back in the Saddle

I haven’t been posting much lately, due to a high-intensity project that has sucked up all of my time. But now that’s over, so I should return to normal posting pace soon.

Why Aren’t Virus Attacks Worse?

Dan Simon notes a scary NYT op-ed, “Terrorism and the Biology Lab,” by Henry C. Kelly. Kelly argues convincingly that ordinary molecular biology students will soon be able to make evil bio-weapons. Simon points out the analogy to computer viruses, which are easily made and easily released. If serious bio-weapons become as common as computer viruses, we are indeed in deep trouble.

Eric Rescorla responds by noting that the computer viruses we actually see do relatively little damage, at least compared to what they might have done. Really malicious viruses, that is, ones engineered to do maximum damage, are rare. What we see instead are viruses designed to get attention and to show that the author could have done damage. The most likely explanation is that the authors of well-known viruses have written them as a sort of (twisted) intellectual exercise rather than out of spite. [By the way, don’t miss the comments on Eric’s post.]

This reminds me of a series of conversations I had a few years ago with a hotshot mo-bio professor, about the national-security implications of bio-attacks versus cyber-attacks. I started out convinced that the cyber-attack threat, while real, was overstated; but bio-attacks terrified me. He had the converse view, that bio-attacks were possible but overhyped, while cyber-attacks were the real nightmare scenario. Each of us tried to reassure the other that really large-scale malicious attacks of the type we knew best (cyber- for me, bio- for him) were harder to carry out, and less likely, than commonly believed.

It seems to me that both of us, having spent many days in the lab, understood how hard it really is to make a novel, sophisticated technology work as planned. Since nightmare attacks are, by definition, novel and sophisticated and thus not fully testable in advance, the odds are pretty high that something would go “wrong” for the attacker. With a better understanding of how software can go wrong, I fully appreciated the cyber-attacker’s problem; and with a better understanding of how bio-experiments can go wrong, my colleague fully appreciated the bio-attacker’s problem. If there is any reassurance here, it is in the likelihood that any would-be attacker will miss some detail and his attack will fizzle.

Aimster Loses

As expected, the Seventh Circuit Court of Appeals has upheld a lower court’s temporary injunction against the Aimster file-sharing service. The Court’s opinion was written by Judge Richard Posner.

I noted three interesting things in the opinion. First, the court seemed unimpressed with Aimster’s legal representation. At several points the opinion notes arguments that Aimster could have made but didn’t, or important factual questions on which Aimster failed to present any evidence. For example, Aimster apparently never presented evidence that its system is ever used for noninfringing purposes.

Second, the opinion states, in a surprisingly offhand manner, that it’s illegal to fast-forward through the commercials when you’re replaying a taped TV show. “Commercial-skipping … amounted to creating an unauthorized derivative work … namely a commercial-free copy that would reduce the copyright owner’s income from his original program…”

Finally, the opinion makes much of the fact that Aimster traffic uses end-to-end encryption so that the traffic cannot be observed by anybody, including Aimster itself. Why did Aimster choose a design that prevented Aimster itself from seeing the traffic? The opinion assumes that Aimster did this because it wanted to remain ignorant of the infringing nature of the traffic. That may well be the real reason for Aimster’s use of encryption.

But there is another good reason to use end-to-end encryption in such a service. Users might want to transfer sensitive but noninfringing materials. If so, they would want those transfers to be protected by encryption. The transfer could in principle be decrypted and then re-encrypted at an intermediate point such as an Aimster server. That extra decryption would indeed allow Aimster to detect infringement, but it would carry several engineering disadvantages, including the extra processing time required for the added decryption and re-encryption, and the risk of the data being compromised in a break-in at the server. The opinion hints at all of this; but apparently Aimster did not offer arguments on this point.
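
To illustrate the tradeoff, here is a minimal Python sketch (my own, and in no way a description of Aimster’s actual design) contrasting a relay that merely forwards end-to-end ciphertext with one that decrypts and re-encrypts in the middle. It assumes the symmetric Fernet scheme from the widely used cryptography package is installed; the relay function names are invented for illustration.

    from cryptography.fernet import Fernet

    def relay_end_to_end(ciphertext):
        # End-to-end design: the relay holds no key, sees only opaque
        # bytes, and simply forwards them unchanged.
        return ciphertext

    def relay_decrypt_reencrypt(ciphertext, sender_key, receiver_key):
        # Intermediary design: the relay decrypts each transfer (and could
        # scan the plaintext for infringement), then re-encrypts it for
        # the next hop. This costs two extra cryptographic operations per
        # transfer, and a break-in at the relay exposes plaintext and keys.
        plaintext = Fernet(sender_key).decrypt(ciphertext)
        return Fernet(receiver_key).encrypt(plaintext)

    # End-to-end: one key shared by sender and receiver only.
    key = Fernet.generate_key()
    msg = Fernet(key).encrypt(b"sensitive but noninfringing file")
    assert Fernet(key).decrypt(relay_end_to_end(msg)) == b"sensitive but noninfringing file"

    # Hop-by-hop: the relay must hold a key for each hop.
    k1, k2 = Fernet.generate_key(), Fernet.generate_key()
    msg2 = Fernet(k1).encrypt(b"same file, hop-by-hop")
    assert Fernet(k2).decrypt(relay_decrypt_reencrypt(msg2, k1, k2)) == b"same file, hop-by-hop"

The point of the sketch is simply that the second design buys inspectability at the cost of extra processing on every transfer and a single point where plaintext can leak, which is the engineering argument Aimster apparently never made.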

The opinion says, instead, that a service provider has a limited duty to redesign a service to prevent infringement, where the cost of adopting a different design must be weighed against the amount of infringement that it would prevent.

Even when there are noninfringing uses of an Internet file-sharing service, moreover, if the infringing uses are substantial then to avoid liability as a contributory infringer the provider of the service must show that it would have been disproportionately costly for him to eliminate or at least reduce substantially the infringing uses.

And so one more reading of the Supreme Court’s Sony Betamax decision is now on the table.