November 27, 2024

Taming EULAs

Most software programs, and some websites, are subject to End User License Agreements (EULAs). EULAs are long, detailed, and apparently written by lawyer-bots. A EULA is a contract, but it’s not the result of a negotiation between the vendor and the user: the vendor writes the EULA, and the user can take it or leave it. Almost everybody takes it, without reading it and without thinking.

This has led to any number of problems. For example, some EULAs give the software vendors permission to install spyware – and most users never realize they have granted that permission.

Why don’t users pay more attention to EULAs? Rational ignorance is one possibility – it may be that the cost of accepting a bad EULA every now and then is lower than the cost of actually reading EULAs and making careful decisions. If so, then a rational cost-minimizing user won’t read EULAs.

Still, a few oddballs do read EULAs. When these people find a particularly egregious provision, they spread the word. Occasionally the press will report on an extreme EULA. So rationally ignorant consumers get a little information about popular EULAs, and there is some pressure on vendors to keep their EULAs reasonable.

In domains where rational ignorance is common, tools often spring up to help people make decisions that are more rational and less ignorant. If it’s not worth your time to research your senator’s voting record, you can look at how he is rated by the Environmental Defense Fund or the NRA, or you can see who has endorsed him for reelection. None of these sources captures the nuances of an individual voting record. But if you’re not going to spend the time to examine that record, these crude tools can be valuable.

When it comes to EULAs, we don’t have these tools. So let’s create them. Let me suggest two useful tools.

The first tool is a service, provided via a website, that rates EULAs in the same way that political advocacy groups rate legislators. I’m not talking about a detailed explanation – which rationally ignorant users wouldn’t bother to read – but a simple one-dimensional rating, such as a grade on an A-to-F scale. Products whose EULAs get good scores might be allowed to display a trademarked “Our EULA got an A-” logo.

Naturally, reducing a complex EULA to a single rating is an oversimplification. But that’s exactly the point. Rationally ignorant users demand simplification, and if they don’t get it they’ll decide based on no information at all. The site could offer more details for users who want them. But let’s face it: most users don’t.
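To make the rating idea concrete, here is a minimal sketch of how such a service might compute a one-dimensional grade. The provision names and penalty weights are invented for illustration; any real rating service would choose its own.

```python
# Hypothetical penalty points for objectionable EULA provisions.
# Start from a perfect score of 100 and subtract a penalty for
# each provision the EULA contains.
PENALTIES = {
    "installs_spyware": 60,
    "waives_class_action": 25,
    "allows_remote_disabling": 20,
    "broad_data_collection": 15,
}

def letter_grade(provisions):
    """Map a set of provision flags to a simple A-to-F grade."""
    score = 100 - sum(PENALTIES.get(p, 0) for p in provisions)
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(letter_grade(set()))                        # prints "A" (clean EULA)
print(letter_grade({"broad_data_collection"}))    # prints "B"
print(letter_grade({"installs_spyware"}))         # prints "F"
```

The point of the sketch is the shape of the tool, not the weights: a rationally ignorant user sees only the single letter, while the site can keep the per-provision breakdown available for the few who want details.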

The second tool is a standardized template for writing EULAs, akin to the structure of Creative Commons licenses. You’d have some core EULA language, along with a set of modules that could be added at the vendor’s discretion. Standardized EULAs can be displayed concisely to the user, by listing the modules that are included. They could be expressed easily in machine-readable form, so various automated tools could be created.
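A machine-readable standardized EULA might look something like the following sketch. The core-license name and the module vocabulary here are hypothetical; the point is that a vendor's license reduces to standard core terms plus a short list of opt-in modules, which is easy both to display concisely and to feed to automated tools.

```python
import json

# Hypothetical standard modules a vendor may opt into.
STANDARD_MODULES = {
    "AUTO-UPDATE": "vendor may install updates automatically",
    "TELEMETRY": "vendor may collect anonymous usage statistics",
    "ARBITRATION": "disputes are resolved by binding arbitration",
}

def make_eula(vendor, modules):
    """Build a standardized EULA: fixed core terms plus opt-in modules."""
    unknown = set(modules) - set(STANDARD_MODULES)
    if unknown:
        raise ValueError(f"non-standard modules: {sorted(unknown)}")
    return {
        "core": "STANDARD-EULA-CORE-v1",  # invented core-license identifier
        "vendor": vendor,
        "modules": sorted(modules),
    }

def summarize(eula):
    """Concise display for users: just list the included modules."""
    mods = ", ".join(eula["modules"]) or "(no modules)"
    return f"{eula['vendor']}: {eula['core']} + {mods}"

e = make_eula("ExampleSoft", ["TELEMETRY", "AUTO-UPDATE"])
print(summarize(e))   # prints "ExampleSoft: STANDARD-EULA-CORE-v1 + AUTO-UPDATE, TELEMETRY"
print(json.dumps(e))  # the machine-readable form automated tools would consume
```

Because every license is core-plus-modules, a comparison tool is trivial: two EULAs differ exactly in the set difference of their module lists.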

The main benefit of standardization is that users could reuse what they had learned about past licenses, so the cost of learning about a license would be amortized over more decisions. Standardization would also tend to benefit companies that have more likable EULAs, since it would help users notice the substantive differences between the EULAs they see.

Will either of these things happen? I don’t know. But I would like to see somebody try them.

California to Require Open-Source in Voting Software?

Donna Wentworth at Copyfight points to the fine print in the recent e-voting edict from California Secretary of State Kevin Shelley, which says this:

Any electronic verification method must have open source code in order to be certified for use in a voting system in California.

Many computer scientists have argued that e-voting systems should be required to have open source code, because of the special circumstances surrounding voting. Is that what Mr. Shelley is requiring?

I’m not sure. His requirement applies to “electronic verification method[s]” and not explicitly to all e-voting systems. What exactly is an “electronic verification method”? Mr. Shelley’s directive uses this term in reference to the report of a previous task force on e-voting.

So what does the task force’s report say? Surprisingly, the report refers to “electronic verification” methods at several points, but I couldn’t find any specific mention of what those methods might be. This is particularly odd considering that the task force members included computer scientists (including David Dill and David Jefferson) who are more than qualified to understand and write about any “electronic verification” methods, even if only to summarize them or give examples.

It looks as if there might be a hidden layer to this story, but I can’t figure out what it could be. Can anybody help out?

[Correction (1:50 PM): corrected the spelling of Kevin Shelley’s last name.]

California to Require E-Voting Paper Trail

California Secretary of State Kevin Shelley will announce today that as of 2006, all e-voting machines in the state must provide a voter-verifiable paper trail, according to an L.A. Times story by Allison Hoffman and Tim Reiterman.

This is yet another sign that the push for sensible e-voting safeguards is gaining momentum.

[Link credit: Siva Vaidhyanathan at Sivacracy.net.]

Princeton Ignores Strauss, Makes Sensible Decisions

The Office of Information Technology (OIT) here at Princeton has taken the unusual step of issuing a statement distancing itself from the views expressed by one of its employees, Howard Strauss, in a column in Syllabus magazine.

(OIT operates the campus network and other shared computing facilities. It is not to be confused with the Computer Science Department, which is the main site of information technology teaching and research at Princeton.)

Mr. Strauss’s column, which really has to be read to be believed, likens open source products to fraudulent Nigerian spam emails.

Fortunately the grownups in charge at OIT responded by reiterating the university’s non-Straussian procurement policy, which is based not on a rigid pro- or anti-open source rule but instead involves – listen carefully, ’cause this might be hard to follow – looking at all of the available products and choosing the best one for the job.

CDT Report on Spyware

The Center for Democracy and Technology has issued a sensible and accessible paper about the spyware problem and associated policy issues.

Spyware is software, installed on your computer without your consent, that gathers information about what you do on your computer. It’s shockingly common – if you are a typical active web surfer using Internet Explorer in its default configuration, and you haven’t been taking specific steps to protect yourself against spyware, then you probably have several spyware programs on your computer right now.

CDT recommends that end users protect themselves by using anti-spyware tools such as AdAware, Spybot Search and Destroy, Spyware Eliminator, or BPS Spyware/Adware Remover. (I have had good luck with Spybot Search and Destroy.)

At the policy level, CDT is lukewarm about attempts to ban spyware specifically, because of the difficult line-drawing exercise involved in distinguishing spyware from certain types of legitimate programs. They argue instead for policies that address the underlying problems: installation without consent, and surreptitious monitoring of user behavior.

Kudos to CDT for advancing the policy discussion on this often overlooked issue.