
Software Customer Bill of Rights

Cem Kaner has written a Software Customer Bill of Rights. His general approach is to require that customers have roughly the same rights when they buy software as when they buy other products.

Much of what Kaner says makes sense. But at least one of his principles seems awfully hard to implement in practice:

2. Disclose known defects. The software company or service provider must disclose the defects that it knows about to potential customers, in a way that is likely to be understood by a typical member of the market for that product or service.

This is hard to implement because software products have so many defects – big mass-market software products typically have thousands of known defects. And this is not just the practice of one or two companies; it’s standard in the industry. If a vendor waited until all the defects were removed from a product, that product would never be finished and would never ship.

Some of the defects in software products are serious, but most are relatively minor. There is simply no way to explain them all to consumers. And sometimes it can be hard to tell in advance which defects will prove to be critical to customers.

Still, Kaner seems to be on the right track. It would be helpful if vendors disclosed the most serious known defects to their customers, so that customers could weigh their impact in deciding which product to buy.

[Link credit: Dan Gillmor.]

Trade Secrets and Free Speech

Yesterday the California Supreme Court issued its ruling in DVDCCA v. Bunner, a case pitting trade secrets against freedom of speech. The court ruled that an injunction against disclosure of a trade secret can be valid, even though it restricts some speech.

The case relates to CSS, the encryption scheme used to scramble the data on DVDs. CSS was developed in secret, and an outfit called the DVD Copy Control Association (DVDCCA) claims that the details of CSS are its trade secret. Andrew Bunner posted DeCSS, a program that unscrambles CSS-encrypted content, on his web site. DVDCCA sued Bunner for misappropriating its trade secret. A lower court issued an injunction, ordering Bunner not to publish DeCSS. Bunner appealed, arguing that the injunction violated his free speech right.

The lower court ruled that Bunner knew (or should have known) that CSS was a trade secret, and that Bunner knew (or should have known) that the original source of DeCSS had gotten the trade secret improperly. I think these factual findings were highly questionable, but the Court accepted them for the purposes of its decision. So the issue before the state Supreme Court was merely whether an injunction against publishing a trade secret violates freedom of speech. The Court ruled that it does not, at least not when the speech is software code.

Why does it matter that the speech is software code? As Seth Finkelstein points out, the Court seemed to say that software code cannot be of public concern, because only experts can read it:

DVD CCA’s trade secrets in the CSS technology are not publicly available and convey only technical information about the method used by specific private entities to protect their intellectual property. Bunner posted these secrets in the form of DeCSS on the Internet so Linux users could enjoy and use DVD’s and so others could improve the functional capabilities of DeCSS. He did not post them to comment on any public issue or to participate in any public debate. Indeed, only computer encryption enthusiasts are likely to have an interest in the expressive content–rather than the uses–of DVD CCA’s trade secrets. (See Tien, Publishing Software as a Speech Act, supra, 15 Berkeley Tech. L.J. at pp. 662-663 [“Programming languages provide the best means for communicating highly technical ideas–such as mathematical concepts–within the community of computer scientists and programmers”].) Thus, these trade secrets, as disclosed by Bunner, address matters of purely private concern and not matters of public importance. …

This seems like a pretty odd position to take. Information about Enron’s finances is of public concern, even though only accountants can interpret it in its raw form. Information about the Space Shuttle wing structure is of public concern, even though only a few engineers understand it fully. CSS is a controversial technology, and information about how it works is directly relevant to the debate about it. True, many people who are interested in the debate will have to rely on experts to explain the relevant parts of DeCSS to them; but the same is true of Enron’s accounting or the Shuttle’s engineering.

Odder still, in my view, is the notion that because DeCSS is directly useful to members of the public, it is somehow of less public concern than a purely theoretical discussion would be. It seems to me that the First Amendment protects speech precisely because the speech may have an effect on what people think and how they act. To suppress speech because of its impact seems to defeat the very purpose of the free speech guarantee.

It's Ten O'Clock. Do You Know What Your Computer is Doing?

Last week saw a scary story about a British man who was acquitted of the charge of possessing child pr0n. [Deliberate misspelling to keep dumb censorware tools from blocking this site. But some censorware programs will block this anyway. Heavy Sigh.] The illegal material was on the man’s computer, but he argued that an intruder had put it there, and he presented evidence to support that defense.

Although I have no special knowledge of his particular case, I know that the kind of scenario he described really does happen. At least two innocent people I know have had intruders turn their computers into pr0n distributors.

The lesson of these incidents is that we have less control over our computers than we have over our physical territory. Nobody could quietly turn a file drawer in your office into a distribution center for contraband; but an intruder might well do that to your computer. Inevitably, innocent people will be accused of crimes, and they will suffer, even if they are eventually acquitted. And of course, some real bad guys will get away with crimes by blaming them on nonexistent intruders.

The best way to address this kind of problem is to make sure that people retain control – in practice as well as in theory – over their own computers. When we erode that control, whether we do so by technical or legal means, we are making the bad guys’ jobs easier.

Here We Go Again

Rep. John Conyers has introduced the Author, Consumer, and Computer Owner Protection and Security (ACCOPS) Act of 2003 in the House of Representatives.

The oddest provision of the bill is this one:

(a) Whoever knowingly offers enabling software for download over the Internet and does not–

(1) clearly and conspicuously warn any person downloading that software, before it is downloaded, that it is enabling software and could create a security and privacy risk for the user’s computer; and

(2) obtain that person’s prior consent to the download after that warning;

shall be fined under this title or imprisoned not more than 6 months, or both.

(b) As used in this section, the term ‘enabling software’ means software that, when installed on the user’s computer, enables 3rd parties to store data on that computer, or use that computer to search other computers’ contents over the Internet.

As so often happens in these sorts of bills, the definition has unexpected consequences. For example, it would apparently categorize Microsoft Windows as “enabling software,” since Windows offers both file server facilities and network search facilities. But the original Napster client would apparently not qualify: it let other users copy files from your computer, but it did not let them store data there, and its searches ran on Napster’s central server rather than by using your computer to search other computers.

Note also that the mandated security and privacy warnings would be misleading: file storage and search services are not inherently riskier than other network software. Misleading warnings impose a real cost, since they dilute users’ trust in any legitimate warnings they see.

The general approach of this bill, which we also saw in the Hollings CBDTPA, is to impose regulation on Bad Technologies. This approach will be a big success, once we work out the right definition for Bad Technologies.

Imagine the simplification we could achieve by applying this same principle to other areas of the law. For example, the entire criminal law can be reduced to a ban on Bad Acts, once we work out the appropriate definition for that term. Campaign finance law would be reduced to a ban on Corrupting Financial Transactions (with an appropriate exception for Constructive Debate).

A Modest Proposal

Now that the Supreme Court has ruled that Congress can condition Federal funding for libraries on the libraries’ use of censorware (i.e., that a law called CIPA is consistent with the Constitution), it’s time to take a serious look at the deficiencies of censorware, and what can be done about them.

Suppose you’re a librarian who wants to comply with CIPA, but otherwise you want your patrons to have access to as much material on the Net as possible. From your standpoint, the popular censorware products have four problems. (1) They block some unobjectionable material. (2) They fail to block some material that is obscene or harmful to minors. (3) They try to block material that Congress does not require to be blocked, such as certain political speech. (4) They don’t let you find out what they block.

(1) and (2) are just facts of life – no technology can eliminate these problems. But (3) and (4) are solvable – it’s possible to build a censorware program that doesn’t try to block anything except as required by the law, and it’s possible for a program’s vendor to reveal what its product blocks. But of course it’s unlikely that the main censorware vendors will solve (3) or (4) for you.

So why doesn’t somebody create an open-source censorware program that is minimally compliant with CIPA? This would give librarians a better option, and it would put pressure on the existing vendors to narrow their blocking lists and to say what they block.
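
To make the idea concrete, here is a minimal sketch, in Python, of the core such a program would need. Everything in it is hypothetical (the blocklist.txt file name, the one-URL-prefix-per-line format, and the function names are illustrative assumptions, not drawn from any real product), but it shows the two properties that matter: the entire blocklist is an ordinary published text file that anyone can inspect, and anything not on the list is allowed by default.

    # Sketch of a transparent censorware core. Hypothetical: the file
    # name, file format, and function names are illustrative, not taken
    # from any real product. The whole blocklist lives in one published
    # plain-text file, one URL prefix per line, so librarians and patrons
    # can see exactly what is blocked and verify that nothing else is.

    def load_blocklist(path="blocklist.txt"):
        """Read the published blocklist; skip blank lines and '#' comments."""
        prefixes = []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#"):
                    prefixes.append(line)
        return prefixes

    def is_blocked(url, prefixes):
        """Block a URL only if it starts with a listed prefix;
        everything not on the list is allowed by default."""
        return any(url.startswith(p) for p in prefixes)

    if __name__ == "__main__":
        # Demo with an in-memory list standing in for the published file.
        sample = ["http://blocked.example/"]
        for url in ("http://allowed.example/page", "http://blocked.example/page"):
            verdict = "BLOCKED" if is_blocked(url, sample) else "allowed"
            print(url, "->", verdict)

The string matching is deliberately trivial; what matters is the policy the design embodies. Because the list ships as a plain file, problem (4) disappears, and because the program blocks nothing except what the list names, a narrowly drawn list addresses problem (3).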

I can understand why people would have been hesitant to create such a program in the past. Most people who want to minimize the intrusiveness of censorware have thus far done so by not using censorware; so there hasn’t been much of a market for a narrowly tailored product. But that may change as librarians are forced to use censorware.

Also, censorware opponents have found the lameness and overbreadth of existing censorware useful, especially in court. But now, in libraries at least, that usefulness is mostly past, and it’s time to figure out how to cope with CIPA in the least harmful way. More librarian-friendly censorware seems like a good start.

[Note: I must admit that I’m not entirely convinced by my own argument here. But I do think it has some merit and deserves discussion, and nobody else seemed to be saying it. So let the flaming begin!]