
Aimster Loses

As expected, the Seventh Circuit Court of Appeals has upheld a lower court’s preliminary injunction against the Aimster file-sharing service. The Court’s opinion was written by Judge Richard Posner.

I noted three interesting things in the opinion. First, the court seemed unimpressed with Aimster’s legal representation. At several points the opinion notes arguments that Aimster could have made but didn’t, or important factual questions on which Aimster failed to present any evidence. For example, Aimster apparently never presented evidence that its system is ever used for noninfringing purposes.

Second, the opinion states, in a surprisingly offhand manner, that it’s illegal to fast-forward through the commercials when you’re replaying a taped TV show. “Commercial-skipping … amounted to creating an unauthorized derivative work … namely a commercial-free copy that would reduce the copyright owner’s income from his original program…”

Finally, the opinion makes much of the fact that Aimster traffic uses end-to-end encryption so that the traffic cannot be observed by anybody, including Aimster itself. Why did Aimster choose a design that prevented Aimster itself from seeing the traffic? The opinion assumes that Aimster did this because it wanted to remain ignorant of the infringing nature of the traffic. That may well be the real reason for Aimster’s use of encryption.

But there is another good reason to use end-to-end encryption in such a service. Users might want to transfer sensitive but noninfringing materials. If so, they would want those transfers to be protected by encryption. The transfer could in principle be decrypted and then reencrypted at an intermediate point such as an Aimster server. This extra decryption would indeed allow Aimster to detect infringement, but it would have several engineering disadvantages, including the extra processing time required for the added decryption and reencryption, and the risk of the data being compromised in case of a break-in at the server. The opinion hints at all of this; but apparently Aimster did not offer arguments on this point.
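To make the tradeoff concrete, here is a minimal Python sketch of an end-to-end encrypted transfer through an untrusted relay. It uses the Fernet symmetric cipher from the widely available cryptography package; the peer and relay roles are my own illustrative stand-ins, not a description of Aimster’s actual protocol.

```python
# Toy illustration of end-to-end encryption through an untrusted relay.
# Assumes: pip install cryptography. Names and roles are hypothetical.
from cryptography.fernet import Fernet

# The two peers share a key out of band; the relay never sees it.
peer_key = Fernet.generate_key()
sender = Fernet(peer_key)
receiver = Fernet(peer_key)

def relay(ciphertext: bytes) -> bytes:
    """The intermediate server just forwards bytes.

    Without the peers' key it cannot inspect the payload, so it cannot
    tell an infringing file from a sensitive-but-lawful one.
    """
    return ciphertext

message = b"confidential but noninfringing document"
in_transit = sender.encrypt(message)           # only the sender can produce this
delivered = relay(in_transit)                  # relay sees only ciphertext
assert receiver.decrypt(delivered) == message  # only the receiver can read it
```

A decrypt-and-reencrypt design would instead require the relay to hold a key for every transfer, which is exactly where the extra processing cost and the break-in risk come from.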

The opinion says, instead, that a service provider has a limited duty to redesign a service to prevent infringement, where the cost of adopting a different design must be weighed against the amount of infringement that it would prevent.

“Even when there are noninfringing uses of an Internet file-sharing service, moreover, if the infringing uses are substantial then to avoid liability as a contributory infringer the provider of the service must show that it would have been disproportionately costly for him to eliminate or at least reduce substantially the infringing uses.”

And so one more reading of the Supreme Court’s Sony Betamax decision is now on the table.

P2P Evolution to Accelerate

The Washington Post online has a nice summary/directory of articles on the RIAA’s upcoming crackdown on peer-to-peer file sharers. The crackdown seems like a risky move, but it seems the industry can’t think of anything else to do about their P2P problem.

When the industry sued Napster into oblivion, Napster was replaced, hydra-like, by a newer generation of P2P systems that are apparently resistant to the tactics that took down Napster.

The RIAA’s new crackdown, if it works, will most likely cause yet another step in the evolution of P2P systems. P2P systems that provide only weak anonymity protection for their users will fade away, replaced by a new generation of P2P technology that resists the RIAA’s new tactics.

The RIAA’s new tactic is to join a P2P network semi-anonymously, and then to pierce the anonymity of people who are offering files. There are two countermeasures that can frustrate this tactic, and the use of these countermeasures is already starting to grow slowly.

The first countermeasure is to provide stronger anonymity protection for users, to prevent investigators from so easily unmasking users who are sharing files.

The second countermeasure is to share files only among small friends-and-family groups, making it difficult for investigators to join the group. If every P2P user is a member of a few of these overlapping small groups, then files can still diffuse from place to place fairly quickly.
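To see why overlapping small groups are enough, here is a toy Python simulation. The population size, group size, and groups-per-user are arbitrary assumptions of mine, not measurements of any real network.

```python
# Toy simulation: file diffusion through small, overlapping sharing groups.
# Purely illustrative; all parameters are arbitrary assumptions.
import random

random.seed(1)

NUM_USERS = 1000
GROUP_SIZE = 8        # "friends and family" sized groups
GROUPS_PER_USER = 3   # each user belongs to a few such groups

users = list(range(NUM_USERS))
membership = {u: [] for u in users}

# Build overlapping groups: shuffle everyone and chunk them, a few times over,
# so each user ends up in GROUPS_PER_USER small groups.
for _ in range(GROUPS_PER_USER):
    random.shuffle(users)
    for i in range(0, NUM_USERS, GROUP_SIZE):
        group = users[i:i + GROUP_SIZE]
        for u in group:
            membership[u].append(group)

has_file = {users[0]}  # one user starts with the file
rounds = 0
while len(has_file) < NUM_USERS:
    reached = set()
    for u in has_file:
        for group in membership[u]:
            reached.update(group)  # a user shares only within its own groups
    if reached <= has_file:
        break                      # no further spread possible
    has_file |= reached
    rounds += 1

print(f"file reached {len(has_file)} of {NUM_USERS} users in {rounds} rounds")
```

With these (arbitrary) parameters the file reaches essentially the whole population within a few rounds, even though no user ever shares outside a small group that an investigator would find hard to join.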

All of this must look pretty unfair from the RIAA’s point of view. No matter how strong the RIAA’s legal and ethical arguments against file sharing are, people will continue to share files as long as they view it as a basically benign activity. It seems to me that only a change in public attitudes, or a change in the basic legal structure of copyright, can solve the file sharing problem.

A Modest Proposal

Now that the Supreme Court has ruled that Congress can condition Federal funding for libraries on the libraries’ use of censorware (i.e., that a law called CIPA is consistent with the constitution), it’s time to take a serious look at the deficiencies of censorware, and what can be done about them.

Suppose you’re a librarian who wants to comply with CIPA, but otherwise you want your patrons to have access to as much material on the Net as possible. From your standpoint, the popular censorware products have four problems. (1) They block some unobjectionable material. (2) They fail to block some material that is obscene or harmful to minors. (3) They try to block material that Congress does not require to be blocked, such as certain political speech. (4) They don’t let you find out what they block.

(1) and (2) are just facts of life – no technology can eliminate these problems. But (3) and (4) are solvable – it’s possible to build a censorware program that doesn’t try to block anything except as required by the law, and it’s possible for a program’s vendor to reveal what their product blocks. But of course it’s unlikely that the main censorware vendors will give you (3) or (4).

So why doesn’t somebody create an open-source censorware program that is minimally compliant with CIPA? This would give librarians a better option, and it would put pressure on the existing vendors to narrow their blocking lists and to say what they block.
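For concreteness, the core of such a program could be very small. The sketch below is only a rough illustration of the two properties that matter, a narrow explicit blocklist and full transparency about what is on it; the file name and format are hypothetical.

```python
# Sketch of a minimally blocking, fully transparent filter.
# The blocklist file name and format are hypothetical; the point is that the
# list is explicit, narrow, and published for anyone to inspect.
from urllib.parse import urlparse

def load_blocklist(path: str = "published_blocklist.txt") -> set[str]:
    """Load the publicly posted list of blocked hostnames, one per line."""
    with open(path) as f:
        return {line.strip().lower()
                for line in f
                if line.strip() and not line.startswith("#")}

def is_blocked(url: str, blocklist: set[str]) -> bool:
    """Block a URL only if its host is explicitly on the published list.

    Anything not on the list passes through: no keyword guessing,
    no category heuristics, no secret additions.
    """
    host = (urlparse(url).hostname or "").lower()
    return host in blocklist
```

A library could post the very file the filter reads, which addresses (4) directly; keeping that list limited to what the law actually requires addresses (3).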

I can understand why people would have been hesitant to create such a program in the past. Most people who want to minimize the intrusiveness of censorware have thus far done so by not using censorware; so there hasn’t been much of a market for a narrowly tailored product. But that may change as librarians are forced to use censorware.

Also, censorware opponents have found the lameness and overbreadth of existing censorware useful, especially in court. But now, in libraries at least, that usefulness is mostly past, and it’s time to figure out how to cope with CIPA in the least harmful way. More librarian-friendly censorware seems like a good start.

[Note: I must admit that I’m not entirely convinced by my own argument here. But I do think it has some merit and deserves discussion, and nobody else seemed to be saying it. So let the flaming begin!]

Hatch "Clarifies" His Position

Senator Orrin Hatch issued a short press release yesterday, backtracking from his previous (mis-)statement about remedies for copyright infringement. There are some interesting tidbits in the release, which I quote here in full, with the surprising bits italicized:

HATCH COMMENTS ON COPYRIGHT ENFORCEMENT

Washington – Sen. Orrin G. Hatch (R-Utah), Chairman of the Senate Judiciary Committee, today issued the following statement:

“I am very concerned about Internet piracy of personal and copyrighted materials, and I want to find effective solutions to these problems.

“I made my comments at yesterday’s hearing because I think that industry is not doing enough to help us find effective ways to stop people from using computers to steal copyrighted, personal or sensitive materials. I do not favor extreme remedies – unless no moderate remedies can be found. I asked the interested industries to help us find those moderate remedies.”

We can assume that every word of the release was chosen carefully, since it was issued in writing by Hatch’s office to clarify his position after a previous misstatement.

It’s significant, then, that he wants technology to prevent not only copyright infringement but also “piracy” of “personal or sensitive” information.

Note also that he does not entirely disavow his previous statement that appeared to advocate vigilante destruction of the computers of suspected violators – he still favors “extreme remedies” if “moderate remedies” prove infeasible, an eventuality that seems likely given his apparent belief that we have no moderate remedies today.

If the mainstream press is paying attention, they ought to find this alarming, since much of what they do involves collecting and publishing information that some people would prefer to call “personal or sensitive”. If “extreme remedies” for copyright infringement are a bad idea, “extreme remedies” for making truthful statements about other people are even worse.

Layers

Lawrence Solum and Minn Chung have a new paper, “The Layers Principle: Internet Architecture and the Law,” in which they argue that layering is an essential part of the Internet’s architecture and that Internet regulation should therefore respect the Internet’s layered nature. It’s a long paper, so no short commentary can do it justice, but here are a few reactions.

First, there is no doubt that layering is a central design principle of the Internet, or of any well-designed multipurpose network. When we teach computer science students about networks, layering is one of the most important concepts we try to convey. Solum and Chung are right on target about the importance of layering.
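For readers outside networking, here is a deliberately oversimplified sketch of what layering means: each layer wraps whatever it receives from the layer above in its own header and otherwise treats it as opaque data. This is a teaching toy, not real protocol code.

```python
# Deliberately simplified picture of protocol layering: each layer adds its
# own header and treats everything handed down from above as an opaque payload.
def app_layer(message: str) -> str:
    return f"HTTP|{message}"    # application layer (e.g., HTTP)

def transport_layer(segment: str) -> str:
    return f"TCP|{segment}"     # transport layer adds ports, sequencing

def network_layer(packet: str) -> str:
    return f"IP|{packet}"       # network layer adds addressing

def link_layer(frame: str) -> str:
    return f"ETH|{frame}"       # link layer moves it one hop

wire = link_layer(network_layer(transport_layer(app_layer("GET /index.html"))))
print(wire)  # ETH|IP|TCP|HTTP|GET /index.html

# Routers in the core look only at the IP header; the TCP and HTTP layers
# are interpreted only at the endpoints.
```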

They’re on shakier ground, though, when they relate their layering principle to the end-to-end principle that Lessig has popularized in the legal/policy world. (The end-to-end principle says that most of the “brains” in the Internet should be at the endpoints, e.g. in end users’ computers, rather than in the core of the network itself.) Solum and Chung say that end-to-end is a simple consequence of their layering principle. That’s true, but only because the end-to-end principle is built into their assumptions, in a subtle way, from the beginning. In their account, layering occurs only at the endpoints, and not in the network itself. While this is not entirely accurate, it’s not far wrong, since the layering is much deeper at the endpoints than in the core of the Net. But the reason this is true is that the Net is designed on the end-to-end principle. There are alternative designs that use deep layering everywhere, but those were not chosen because they would have violated the end-to-end principle. End-to-end is not necessarily a consequence of layering; but end-to-end is, tautologically, a consequence of the kind of end-to-end-style layering that Solum and Chung assume.

Layering and end-to-end, then, are both useful rules of thumb for understanding how the Internet works. It follows, naturally, that regulation of the Net should be consistent with both principles. Any regulatory proposal, in any sphere of human activity, is backed by a story about how the proposed regulation will lead to a desirable result. And unless that story makes sense – unless it is rooted in an understanding of how the sphere being regulated actually works – then the proposal is almost certainly a bad one. So regulatory plans that are inconsistent with end-to-end or layering are usually unwise.

Of course, these two rules of thumb don’t give us the complete picture. The Net is more complicated, and sometimes a deeper understanding is needed to evaluate a policy proposal. For example, a few widespread and helpful practices such as Network Address Translation violate both the end-to-end principle and layering; and so a ban on address translation would be consistent with end-to-end and layering, but inconsistent with the actual Internet. Rules of thumb are at best a lesser substitute for detailed knowledge about how the Net really works. Thus far, we have done a poor job of incorporating that knowledge into the regulatory process. Solum and Chung’s paper has its flaws, but it is a step in the right direction.
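To illustrate why NAT offends both principles, here is a schematic sketch with made-up addresses: a box in the middle of the network rewrites the source IP address and even the transport-layer port, fields that strict layering and end-to-end would leave to the endpoints.

```python
# Schematic sketch of Network Address Translation (addresses are made up).
# A NAT box in the middle of the network rewrites the source IP address and
# the transport-layer source port -- reaching into layers that, under strict
# layering and end-to-end, only the endpoints would touch.
nat_table: dict[int, tuple[str, int]] = {}  # public port -> (private IP, private port)
next_public_port = 40000

def translate_outbound(src_ip: str, src_port: int) -> tuple[str, int]:
    """Map a private (IP, port) pair to the NAT's public address."""
    global next_public_port
    public_port = next_public_port
    next_public_port += 1
    nat_table[public_port] = (src_ip, src_port)
    return ("203.0.113.7", public_port)     # the NAT's own public IP

def translate_inbound(dst_port: int) -> tuple[str, int]:
    """Send a reply back to whichever private host owns this mapping."""
    return nat_table[dst_port]

# A private host's packet leaves the NAT looking like it came from the NAT itself.
print(translate_outbound("192.168.1.23", 51515))  # ('203.0.113.7', 40000)
print(translate_inbound(40000))                   # ('192.168.1.23', 51515)
```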

[UPDATE (Sept. 11, 2003): Looking back at this entry, I realize that by devoting most of my “ink” to my area of disagreement with Solum and Chung, I might have given the impression that I didn’t like their paper. Quite the contrary. It’s a very good paper overall, and anyone serious about Internet policy should read it.]