Archives for November 2003

Diebold to Stop Suppressing Memos

Diebold has filed a court document promising not to sue people for posting the now-famous memos, and withdrawing the DMCA takedown notices it had sent previously. It’s a standard-issue lawyer’s non-surrender surrender (“Mr. Bonaparte, having demonstrated his mastery of the Waterloo battlefield, chooses to withdraw at this time”), asserting that “[u]nder well-established copyright law” Diebold could win an infringement suit, but that Diebold has decided anyway not to sue, given that it no longer has any realistic hope of suppressing distribution of the memos.

Diebold’s filing also contains this interesting sentence:

Diebold has informally encouraged the students to refrain from publishing passwords, source codes, information protected by employees’ privacy interests, and trade secret-type information, none of which is essential for purposes of criticism.

Some of these things certainly are essential for criticism. Diebold’s source code, for instance, is the most precise description of how their technology works, so it has obvious relevance to criticism of the technology’s security or reliability. Trade secret information includes facts about the failure history of the product, which are also highly relevant.

I’m not saying that publishing a company’s source code or trade secrets is always legal or ethical. But in this case, some code and some trade secrets are essential for criticism, and Diebold’s assertion to the contrary doesn’t pass the laugh test.

[Link via Larry Lessig.]

Taming EULAs

Most software programs, and some websites, are subject to End User License Agreements (EULAs). EULAs are long and detailed and apparently written by lawyer-bots, and almost everybody agrees to them without even reading them. A EULA is a contract, but it’s not the result of a negotiation between the vendor and the user: the vendor writes the EULA, and the user can take it or leave it.

This has led to any number of problems. For example, some EULAs give the software vendor permission to install spyware – and most users never realize they have granted that permission.

Why don’t users pay more attention to EULAs? Rational ignorance is one possibility – it may be that the cost of accepting a bad EULA every now and then is lower than the cost of actually reading EULAs and making careful decisions. If so, then a rational cost-minimizing user won’t read EULAs.
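To make that tradeoff concrete, here’s a back-of-the-envelope sketch in Python. Every number in it – agreements per year, reading time, the odds of a harmful clause – is an invented assumption, not data:

```python
# Back-of-the-envelope rational-ignorance arithmetic.
# All numbers below are invented assumptions for illustration.

eulas_per_year = 40        # agreements a typical user clicks through
minutes_to_read = 30       # time to actually read one EULA
value_of_hour = 25.0       # dollars the user's time is worth

cost_of_reading = eulas_per_year * (minutes_to_read / 60) * value_of_hour

p_bad = 0.05               # chance an unread EULA contains a costly term
harm_if_bad = 100.0        # expected dollar harm when it does

cost_of_not_reading = eulas_per_year * p_bad * harm_if_bad

print(cost_of_reading, cost_of_not_reading)   # 500.0 200.0 -> skipping wins
```

Under these made-up numbers, not reading is the rational choice; different assumptions could flip the inequality, but for most users it plausibly points the same way.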

And there are a few oddballs who read EULAs. When these people find a particularly egregious provision, they spread the word. Occasionally the press will report on an extreme EULA. So rationally ignorant consumers get a little information about popular EULAs, and there is some pressure on vendors to keep their EULAs reasonable.

In domains where rational ignorance is common, tools often spring up to help people make decisions that are more rational and less ignorant. If it’s not worth your time to research your senator’s voting record, you can look at how he is rated by the Environmental Defense Fund or the NRA, or you can see who has endorsed him for reelection. None of these sources captures the nuances of an individual voting record. But if you’re not going to spend the time to examine that record, these crude tools can be valuable.

When it comes to EULAs, we don’t have these tools. So let’s create them. Let me suggest two useful tools.

The first tool is a service, provided via a website, that rates EULAs in the same way that political advocacy groups rate legislators. I’m not talking about a detailed explanation – which rationally ignorant users wouldn’t bother to read – but a simple one-dimensional rating, such as a grade on an A-to-F scale. Products whose EULAs get good scores might be allowed to display a trademarked “Our EULA got an A-” logo.
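As a concrete illustration, here’s how such a rating site might reduce a EULA to a grade. The provision names and penalty weights are hypothetical, not a real rubric:

```python
# A minimal sketch of an A-to-F EULA grader.
# Provision names and penalty weights are hypothetical illustrations.

PENALTIES = {
    "installs_spyware": 40,
    "terms_changeable_without_notice": 15,
    "waives_class_action": 15,
    "forbids_uninstall": 10,
}

def grade_eula(flagged_provisions):
    """Reduce a set of flagged provisions to a single letter grade."""
    score = 100 - sum(PENALTIES.get(p, 0) for p in flagged_provisions)
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(grade_eula(set()))                   # "A"
print(grade_eula({"installs_spyware"}))    # "D"
```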

Naturally, reducing a complex EULA to a single rating is an oversimplification. But that’s exactly the point. Rationally ignorant users demand simplification, and if they don’t get it they’ll decide based on no information at all. The site could offer more details for users who want them. But let’s face it: most users don’t.

The second tool is a standardized template for writing EULAs, akin to the structure of Creative Commons licenses. You’d have some core EULA language, along with a set of modules that could be added at the vendor’s discretion. A standardized EULA could be displayed concisely to the user, by listing the modules it includes, and it could be expressed easily in machine-readable form, so that automated tools could be built around it.
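To show what “machine-readable” might look like in practice, here’s a minimal Python sketch. The module codes and their meanings are invented for illustration:

```python
# A sketch of a machine-readable standardized EULA: core terms plus
# optional modules, summarized by listing module codes (the way a
# Creative Commons license is summarized as, say, "BY-NC-SA").
# All module codes and descriptions here are invented.

from dataclasses import dataclass, field

MODULES = {
    "AU": "Vendor may install automatic updates",
    "TC": "Anonymous usage telemetry is collected",
    "NR": "No redistribution of the software",
    "AR": "Disputes go to binding arbitration",
}

@dataclass
class StandardEULA:
    product: str
    modules: list = field(default_factory=list)

    def summary(self):
        """Concise, user-facing display: just the module codes."""
        codes = "-".join(self.modules) if self.modules else "CORE"
        return f"{self.product}: STD-EULA {codes}"

eula = StandardEULA("ExampleWriter 2.0", ["AU", "AR"])
print(eula.summary())                 # ExampleWriter 2.0: STD-EULA AU-AR
for code in eula.modules:
    print(f"  {code}: {MODULES[code]}")
```

Because the core language and each module’s text would be fixed, a tool could compare two products’ EULAs simply by comparing their module lists – which is exactly the kind of re-use discussed next.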

The main benefit of standardization is that users could re-use what they had learned about past licenses, so that the cost of learning about a license could be amortized over more decisions. Standardization would also seem to benefit companies that have more likable EULAs, since it would help users notice the substantive differences between the EULAs they see.

Will either of these things happen? I don’t know. But I would like to see somebody try them.

California to Require Open-Source in Voting Software?

Donna Wentworth at Copyfight points to the fine print in the recent e-voting edict from California Secretary of State Kevin Shelley, which says this:

Any electronic verification method must have open source code in order to be certified for use in a voting system in California.

Many computer scientists have argued that e-voting systems should be required to have open source code, because of the special circumstances surrounding voting. Is that what Mr. Shelley is requiring?

I’m not sure. His requirement applies to “electronic verification method[s]” and not explicitly to all e-voting systems. What exactly is an “electronic verification method”? Mr. Shelley’s directive uses this term in reference to the report of a previous task force on e-voting.

So what does the task force’s report say? Surprisingly, the report refers to “electronic verification” methods at several points, but I couldn’t find any specific mention of what those methods might be. This is particularly odd considering that the task force members included computer scientists (including David Dill and David Jefferson) who are more than qualified to understand and write about any “electronic verification” methods, even if only to summarize them or give examples.

It looks as if there might be a hidden layer to this story, but I can’t figure out what it could be. Can anybody help out?

[Correction (1:50 PM): corrected the spelling of Kevin Shelley’s last name.]

California to Require E-Voting Paper Trail

California Secretary of State Kevin Shelley will announce today that as of 2006, all e-voting machines in the state must provide a voter-verifiable paper trail, according to an L.A. Times story by Allison Hoffman and Tim Reiterman.

This is yet another sign that the push for sensible e-voting safeguards is gaining momentum.

[Link credit: Siva Vaidhyanathan at Sivacracy.net.]

Princeton Ignores Strauss, Makes Sensible Decisions

The Office of Information Technology (OIT) here at Princeton has taken the unusual step of issuing a statement distancing itself from the views expressed by one of its employees, Howard Strauss, in a column in Syllabus magazine.

(OIT operates the campus network and other shared computing facilities. It is not to be confused with the Computer Science Department, which is the main site of information technology teaching and research at Princeton.)

Mr. Strauss’s column, which really has to be read to be believed, likens open source products to fraudulent Nigerian spam emails.

Fortunately the grownups in charge at OIT responded by reiterating the university’s non-Straussian procurement policy, which is based not on a rigid pro- or anti-open source rule but instead involves – listen carefully, ’cause this might be hard to follow – looking at all of the available products and choosing the best one for the job.