December 21, 2002

Don't Blame "The Government"

Some people have interpreted my previous posting, “The Fallacy of the Almost-General-Purpose Computer” as saying that the U.S. government views general-purpose computers as a threat. That’s not quite what I meant to say. What I meant to say was that in Washington law/policy/lobbyist circles, the proposition that general-purpose computers might be too dangerous is now (apparently) being taken seriously.

It’s almost always a mistake to talk about “the U.S. government” as having any particular view, as though the government were a monolithic entity that had a unitary viewpoint or a single coherent plan. Any organization so large necessarily will have multiple viewpoints jostling for influence, and its left hand will not know what its right hand is doing. This is especially true for the U.S. government, which was designed to include multiple competing voices, and not to centralize power in any one place.

The Fallacy of the Almost-General-Purpose Computer

I was at a conference in Washington, DC on Friday and Saturday. Participants included some people who are reasonably plugged in to the Washington political process. I was stunned to hear one of these folks sum up the Washington conventional wisdom like this:

“The political dialog today is that the general purpose computer is a threat, not only to copyright but to our entire future.”

(It’s worth noting that he was repeating the views of others rather than offering his own opinion – and that he had a general-purpose computer open on the table in front of him as he said this!)

If I could take just one concept from computer science and magically implant it into the heads of everybody in Washington – I mean really implant it, so that they understood the idea and its importance in the same way that computer scientists do – it would be the role of the general-purpose computer. I would want them to understand, most of all, why there is no such thing as an almost-general-purpose computer.

If you’re designing a computer, you have two choices. Either you make a general-purpose computer that can do everything that every other computer can do; or you make a special-purpose device that can do only an infinitesimally small fraction of all the interesting computations one might want to do. There’s no in-between.

I can tell you that this is true. And I can assure you that every well-educated computer scientist knows why it is true. But what I don’t know how to do – at least not yet – is to give a simple, non-technical explanation for it. If anybody has a hint about how to do this, please, please let me know.
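One way to make the point concrete is to observe that a general-purpose computer can simulate any special-purpose device in software: write a program that mimics the device, and the general-purpose machine becomes that device. Here is a minimal sketch of the idea; the instruction set and the "adding machine" program are invented for illustration, not taken from any real hardware:

```python
def run(program, inputs):
    """Interpret a program for a tiny, hypothetical register machine.

    Instructions: ("load", reg, value), ("add", dst, a, b),
    ("read", reg), ("write", reg), ("halt",).
    """
    regs = {}
    inputs = iter(inputs)
    outputs = []
    pc = 0  # program counter
    while True:
        op = program[pc]
        if op[0] == "load":          # put a constant in a register
            regs[op[1]] = op[2]
        elif op[0] == "add":         # dst = a + b
            regs[op[1]] = regs[op[2]] + regs[op[3]]
        elif op[0] == "read":        # consume the next input value
            regs[op[1]] = next(inputs)
        elif op[0] == "write":       # emit a register's value as output
            outputs.append(regs[op[1]])
        elif op[0] == "halt":
            return outputs
        pc += 1

# A "special-purpose adding machine" is just one program among infinitely many
# that the same general-purpose interpreter can run.
adder = [("read", "a"), ("read", "b"),
         ("add", "c", "a", "b"),
         ("write", "c"), ("halt",)]
```

Running `run(adder, [2, 3])` yields `[5]`. The machine that runs this interpreter is not an adding machine; it is a general-purpose computer that happens to be behaving like one, and it could just as easily behave like anything else someone writes a program for. That is why there is no stable middle ground: once a machine can run arbitrary programs, it can do what every other computer can do.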

Bricklin: Copy Protection Robs the Future

Dan Bricklin explains how copy restriction technology frustrates archiving of historically interesting works. Archivists normally preserve works by copying them; so works that can’t be copied may never be archived.

Bricklin tells a sobering story about his attempts to recover an original copy of VisiCalc (the first spreadsheet program, of which Bricklin himself was the primary author). Due to copy restriction technology, he was unable to recover a version himself. Ultimately he found that an ex-employee had kept a (probably unauthorized) unprotected copy, so he was able to recover the program and archive it. VisiCalc is only about twenty years old, and was one of the most popular computer programs of its time. An older or less popular work might well have been lost forever.

Misleading Term of the Week: "Rights"

A “right” is a legal entitlement – something that the law says you are allowed to do. But the term is often misused to refer to something else.

Consider, for example, the use of “digital rights management” (often abbreviated as DRM) to describe technologies that restrict the use of creative works. In practice, the “rights” being managed are really just rules that the copyright owner wants to impose; and those rules may bear little relation to the parties’ legal rights. Cloaking these restrictions in the language of “rights” makes them sound more neutral and unchangeable than they really are.

DRM advocates often put forth arguments that go roughly like this:

(1) we have built technology that doesn’t let you do X;

(2) therefore you cannot do X;

(3) therefore you do not have the right to do X;

(4) therefore you should be required to use technology that doesn’t let you do X.

The trickiest part of this argument is getting from (2) to (3). Using the term “digital rights management” in (1) and (2) makes the leap from (2) to (3) seem smaller than it really is.

There is at least one more common misuse of “rights” in the copyright/technology debate. This is in the use of the term “rights holder” to refer to copyright owners (but not to users). When someone says, “Content is shipped from the rights holder to the consumer,” the implication is that the rights of the copyright owner are more important than those of the user. There is no need for this term “rights holder.” “Copyright owner” will do just fine, and it will help us remember that both parties in the transaction have rights that need to be protected.

Misleading Term of the Week: "Standard"

A “standard” is a technical specification that allows systems to work together, making them more useful. Most people say, for good reasons, that they are in favor of technical standards. But increasingly, we are seeing the term “standard” misapplied to things that are really regulations in disguise.

True standards strive to make systems more useful by providing a voluntary set of rules that allow systems to understand each other. For example, a standard called RFC822 describes a standardized way to format email messages. If my email-sending software creates RFC822-compliant messages, and your email-receiving software understands RFC822-compliant messages, then you can read the email messages that I send you. Compliance with such a standard makes our software more functional.

Crucially, standards like RFC822 are voluntary and nonexclusive. Nobody forces any email-software vendor to comply with RFC822, and there is nothing to stop a vendor’s product from complying simultaneously with both RFC822 and other standards.
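The interoperability point can be sketched in a few lines. Python's standard `email` package implements the modern successors of RFC822 (RFC 2822 and RFC 5322), but the principle is unchanged: one program formats a message according to the shared specification, and any other compliant program can parse it. The addresses below are placeholders:

```python
from email.message import EmailMessage
from email import message_from_string

# The sender's software formats a message per the standard:
# header lines, a blank line, then the body.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Standards"
msg.set_content("Compliance makes our software more useful.")
wire_format = msg.as_string()

# Any other compliant software, written by anyone, can parse it back.
parsed = message_from_string(wire_format)
```

Nothing in this exchange requires the two programs to share any code, vendor, or license; agreeing on the message format is enough. That is what a real standard buys you, and it is exactly the property that a mandatory, restrictive "standard" does not provide.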

Lately we have seen the word “standard” misapplied. For example, the Broadcast Protection Discussion Group (BPDG) calls its proposal a “standard,” though it is anything but. Unlike a real standard, BPDG is not voluntary. Unlike a real standard, it contains prohibitions rather than opportunities. Put the BPDG “standard” in front of experienced engineers, and they’ll tell you that it looks like a regulation, not like a standards document. BPDG is trying to make its restrictive regulations more palatable by wrapping them in the mantle of “standards.”

A more subtle misuse of “standard” arises in claims that we need to standardize on DRM technology. As I wrote previously:

In an attempt to sweep [the technical infeasibility of DRM] under the rug, the content industry has framed the issue cleverly as one of standardization. This presupposes that there is a menu of workable technologies, and the only issue is which of them to choose. They want us to ask which technology is best. But we should ask another question: Are any of these technologies workable in the first place? If not, then a standard for copy protection is as premature as a standard for teleportation.