November 21, 2002

Too Stupid to Look the Other Way

David Weinberger explains the value of “leeway,” or small decisions not to enforce the rules in cases where enforcement wouldn’t be reasonable.

Imagine that your mother were visiting your apartment, and she got sick, so you let her stay overnight because she wasn’t well enough to travel home. If this happened, no reasonable landlord would enforce a no-overnight-guests rule against you. Weinberger says:

Leeway is the only way we manage to live together: We ignore what isn’t our business. We cut one another some slack. We forgive one another when we transgress.

By bending the rules we’re not violating fairness. The equal and blind application of rules is a bureaucracy’s idea of fairness. Judiciously granting leeway is what fairness is all about. Fairness comes in dealing with the exceptions.

And there will always be exceptions because rules are imposed on an unruly reality. The analog world is continuous. It has no edges and barely has corners. Rules at best work pretty well. That’s why in the analog world we have a variety of judges, arbiters, and referees to settle issues fairly when smudgy reality outstrips clear rules.

The problem, Weinberger says, is that computers don’t give leeway. Would the computer toss your sick mother out on the street, or cancel your lease because you let her stay?

Of course, you can always change the rules to add exceptions, such as a sick-mother allowance. Doing this would cover some cases, but you would be left with a more complex set of rules that was still enforced inflexibly. You can change the rules, but you can’t teach a computer to give leeway.
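
A small sketch makes the point concrete. Everything below is invented for illustration; the details of any real rule-checker would differ, but its yes-or-no character would not:

    # Hypothetical lease-rule checker, invented for illustration.
    # Even with a "sick parent" exception coded in, the decision is
    # still a hard yes/no drawn at whatever line the rule specifies.

    def overnight_guest_allowed(guest_is_parent: bool,
                                guest_is_sick: bool,
                                nights: int) -> bool:
        # The original rule: no overnight guests.
        # The patched rule: a sick parent may stay one night.
        if guest_is_parent and guest_is_sick and nights <= 1:
            return True
        return False

    # The computer enforces the letter of the patched rule, not its spirit:
    print(overnight_guest_allowed(True, True, 1))   # True  -- the case we thought of
    print(overnight_guest_allowed(True, True, 2))   # False -- still too sick to travel? Tough.
    print(overnight_guest_allowed(False, True, 1))  # False -- sick sibling? No allowance.

The patched rule covers the case we thought of, and nothing else. A second night, a sick sibling, a blizzard outside: each would require another patch, and between patches the enforcement stays absolute.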

Weinberger goes on:

Which brings us to “digital rights management” which implements in code a digital view of rights. Yes, vendors and users should have a wide variety of agreements possible, but the nature of those agreements is necessarily digital….

If we build software that enables us to “negotiate” usage rules with content providers, the rules can be as favorable as we’d like but their enforcement will necessarily be strict, literal and unforgiving. Binary, not human.

DRM raises very difficult leeway issues. Fair use is an officially sanctioned leeway mechanism, designed to prevent enforcement of certain rules when the particular circumstances would make enforcement unwise. Fair use is just the kind of subtle and context-dependent leeway mechanism that computers can’t handle.
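
To see why, consider what a fair-use check would have to look like as code. The four factors below are the statutory ones (17 U.S.C. § 107); the function itself is invented, and the point is that nobody can write its body:

    # A fair-use "check" sketched as code. The four factors are the
    # statutory ones (17 U.S.C. § 107); everything else is invented
    # for illustration. No mechanical body can be written for this
    # function: each factor calls for human judgment, weighed against
    # the others, case by case, after the fact.

    def is_fair_use(purpose_and_character: str,   # commentary? parody? commercial?
                    nature_of_work: str,          # factual, or highly creative?
                    amount_used: str,             # how much, and was it the "heart" of the work?
                    market_effect: str) -> bool:  # does the use substitute for the original?
        raise NotImplementedError(
            "Fair use is decided by weighing these factors in context, "
            "not by evaluating a predicate. A DRM system must answer "
            "before the copy happens, so it can only say no."
        )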

Weinberger’s message can be summed up in a quote attributed to him by Jon Udell:

That’s the problem with DRM. Computers are too stupid to look the other way.

Schoen vs. Stallman on “Trusted Computing”

Seth Schoen raises two interesting issues in his response to Richard Stallman’s essay on “trusted computing.” (To see Seth’s posting, click here and scroll down to the “Trusted computing” heading.)

Stallman says

[Trusted computing] is designed to stop your computer from functioning as a general-purpose computer.

Schoen responds:

Neither of these concerns is applicable at all to Palladium (as Microsoft has described it to us) or to TCPA (as the TCPA has specified it and as it has been implemented). While Microsoft could be misleading us about Palladium, the TCPA specification is public and implementations of it have already been made.

It’s possible that some other trusted computing system could have such a misfeature, but the design of TCPA and Palladium doesn’t require these properties at all, as far as I can tell, and they seem to be more or less independent.

Schoen is right here: Palladium and TCPA do not do what Stallman says they do. Stallman seems too eager to blame Microsoft for the sins of others.

The conversation then moves on to the connection between Palladium and the Hollings CBDTPA. The Hollings bill would make some kind of “trusted computing” restrictions mandatory in essentially all digital devices. But what kind of restrictions would be mandated?

Stallman implies strongly that the CBDTPA would mandate the use of Palladium. Schoen disagrees, saying that he is “not convinced that something like Palladium is the infrastructure contemplated by the CBDTPA.”

Here I don’t know who is right. The CBDTPA is cleverly constructed so that it doesn’t say what it is mandating: it leaves that to be decided later, either by the FCC or by a vaguely specified industry consortium. This gives CBDTPA advocates a way to dodge hard questions about the bill’s effects, by invoking a hoped-for perfect technical solution that is just around the corner. Given the track record of copy restriction and its advocates, I think we should insist on taking a test drive before we buy this used car.

Paper on Copy-Protected CDs

Alex Halderman, a senior here at Princeton, has written a very interesting paper entitled “Evaluating New Copy-Prevention Techniques for Audio CDs.” Here is the paper’s abstract:

Several major record labels are adopting a new family of copy-prevention techniques intended to limit “casual” copying by compact disc owners using their personal computers. These employ deliberate data errors introduced into discs during manufacturing to cause incompatibility with PCs without affecting ordinary CD players. We examine three such recordings: A Tribute to Jim Reeves by Charley Pride, A New Day Has Come by Celine Dion, and More Music from The Fast and the Furious by various artists. In tests with different CD-ROM drives, operating systems, and playback software, we find these discs are unreadable in most widely-used applications today. We analyze the specific technical differences between the modified recordings and standard audio CDs, and we consider repairs to hardware and software that would restore compatibility. We conclude that these schemes are harmful to legitimate CD owners and will not reduce illegal copying in the long term, so the music industry should reconsider their deployment.

The paper will appear in the proceedings of the ACM’s DRM Workshop. It’s currently available, but only in PostScript format, on the Workshop’s site. (It’s available in PDF format here.)
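
The kind of software repair the paper contemplates can be sketched in rough, generic terms. The code below is my own illustration, not taken from the paper; read_audio_sector stands in for whatever low-level interface a real ripper would use:

    # Rough sketch of error-tolerant ripping, invented for illustration;
    # this is not code from Halderman's paper. The idea: if a disc's
    # errors were introduced deliberately to confuse PC drives, a ripper
    # that retries and then skips unreadable sectors (roughly what an
    # ordinary CD player does) can often recover the audio anyway.

    def rip_track(drive, first_sector: int, last_sector: int,
                  max_retries: int = 3) -> bytes:
        audio = bytearray()
        for sector in range(first_sector, last_sector + 1):
            data = None
            for _ in range(max_retries):
                data = drive.read_audio_sector(sector)  # hypothetical interface
                if data is not None:
                    break
            if data is None:
                # Give up on this sector and substitute silence,
                # rather than aborting the whole rip.
                data = bytes(2352)  # one raw CD audio sector
            audio.extend(data)
        return bytes(audio)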

It’s rare to see a workshop like this accept a single-author paper written by an undergraduate, but this paper is really good. (Grad schools: you want this guy!)

More on the Almost-General-Purpose Language

Seth Finkelstein and Eric Albert criticize my claim that the fallacy of the almost-general-purpose computer can best be illustrated by analogy to an almost-general-purpose spoken language, like the Newspeak of Orwell’s 1984. They make some good points, but I think my original conclusion is still sound.

Seth argues that speech (or a program) can be regulated by making it extremely difficult, even if not strictly impossible, to express. I’m skeptical of this claim for human languages, since it seems to me that no usable language can hope to prevent people from creating new words and then teaching others what they mean. I think my skepticism is even more valid for computer languages. If a computer language makes something difficult but not impossible, then some programmer will create a library that provides the difficult functionality in more convenient form. This is the computer equivalent of coining a new word and defining it for others.
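
A trivial invented example shows how cheap the workaround is. Suppose a restricted language offered no built-in way to reverse a string, only single-character indexing; the “difficult” operation gets wrapped once and becomes a new word:

    # Invented illustration: a restricted language with no built-in
    # string reversal, only single-character indexing. The "difficult"
    # operation is wrapped once; from then on it is a new word that
    # everyone can use.

    def reverse(s: str) -> str:
        out = ""
        for i in range(len(s) - 1, -1, -1):  # walk backwards, char by char
            out = out + s[i]
        return out

    # Once published as a library, the awkwardness is gone:
    print(reverse("leeway"))  # "yaweel"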

Eric argues that advancing technology might make it possible to restrict what people can say online. I’m skeptical, but he may be right that restrictions on, say, porn may become more accurately enforceable over time. Still, my point was not that mega-censorship is impossible, but that mega-censorship necessarily causes huge collateral damage.

There’s another obvious reason to like the 1984 analogy: using it puts the anti-computer forces into the shoes of the 1984 government. (I don’t think they’ll spend a lot of time comparing and contrasting themselves with the 1984 government.)

You may say that this is a cheap rhetorical trick, but I disagree. I believe that code is speech, and I believe that its status as speech is not just a legal technicality but a deep truth about the social value of code. What the code-regulators want is not so different from what the speech-regulators of 1984 wanted.

Schoen: Palladium Can Have an “Owner Override”

Seth Schoen argues that “trusted systems” like Palladium can have a sort of manual override that allows the owner to get all of the data on a machine, even if it is protected by DRM.

As Seth points out, the main implication of this is that it is possible to build a system like Palladium in a way that provides benefits to the user but doesn’t give outsiders the ability to lock the user out of some sections of his own machine. If Seth is right about this, then you don’t have to give up control over your machine in order to have Palladium protect your own interests as a user.
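
Here is a rough sketch of how such an override might work, based on my reading of Seth’s argument; all of the names are invented, and nothing here comes from the actual Palladium or TCPA designs:

    from dataclasses import dataclass

    # Rough sketch of an "owner override"; all names are invented and
    # nothing here comes from the Palladium or TCPA specifications.
    # Sealed storage normally releases data only to the program that
    # sealed it; the override lets the machine's owner, by proving
    # physical presence, demand any sealed data.

    @dataclass
    class SealedBlob:
        plaintext: bytes           # stand-in for the encrypted payload
        sealing_program_hash: str  # identity of the program that sealed it

    def unseal(blob: SealedBlob, requesting_program_hash: str,
               owner_button_pressed: bool) -> bytes:
        if requesting_program_hash == blob.sealing_program_hash:
            return blob.plaintext            # the normal sealed-storage path
        if owner_button_pressed:             # physical-presence override
            # The owner can always get the data out, so no outside party
            # can use the hardware to lock the owner out of parts of his
            # own machine.
            return blob.plaintext
        raise PermissionError("wrong program and no owner override")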

This is a pretty interesting argument, but I suspect that there’s more discussion to be had here. For one thing, Seth suggests requiring physical presence (i.e., pushing a button or the like) to use the give-me-all-the-data feature; but physical presence is not enough, as people other than the machine’s owner often get physical access to it. Nonetheless, this is great stuff, and worth reading for those interested in the implications of DRM and “trusted systems”.