Archives for October 2003

Swarthmore Students Re-Publish Diebold Memos

A group of Swarthmore students has published a damning series of internal memos from electronic-voting vendor Diebold. The memos appear to document cavalier treatment of security issues by Diebold, and the use of non-certified software in real elections. Diebold, claiming that the students are infringing copyright, has sent a series of DMCA takedown letters to Swarthmore. Swarthmore is apparently shutting off the Internet access of students who publish the memos. The students are responding by finding new places to post the memos, setting up what Ernest Miller calls a “whack-a-mole game”. (See, for example, posts from Ernest Miller and Aaron Swartz.)

Here is my question for the lawyers: Is this really copyright infringement? I know that copyright attaches even to pedestrian writings like business memos. But don’t the students have some kind of fair use argument? It seems to me that their purpose is noncommercial; and it can hardly be said that they are depriving Diebold of the opportunity to sell the memos to the public. So the students would seem to have a decent argument on at least two of the four fair-use factors, which suggests their use might well be fair.

Even if the students are breaking the law, what Diebold is doing in trying to suppress the memos certainly doesn’t further the goals underlying copyright law. A trade secret argument from Diebold would seem to make more sense here, although the students would seem to have a free-speech counterargument, bolstered by the strong public interest in knowing how our votes are counted.

Can any of my lawyer readers (or fellow bloggers) help clear up these issues?

Rescorla on Airport ID Checks

Eric Rescorla, at Educated Guesswork, notes a flaw in the security process at U.S. airports – the information used to verify a passenger’s ID is not the same information used to look them up in a suspicious-persons database.

Let’s say that you’re a dangerous Canadian terrorist, bearing the clearly suspicious name “Guy Lafleur”. Now, the American government is aware of your activities and puts you on the CAPPS blacklist to stop you from boarding the plane. Further, let’s assume that you’re too incompetent to get a fake ID….

You have someone who’s not on the blacklist buy you a ticket under an innocuous assumed name, say “Babe Ruth”. This is perfectly legitimate and quite easy to do…. Then, the day before the flight you go onto the web and get your boarding pass. You print out two copies, one with your real name and one with the innocuous fake name. Remember, it’s just a web page, so it’s easy to modify. When you go to the airport, you show the security agent your “Guy Lafleur” boarding pass and your real ID. He verifies that they match but doesn’t check the watchlist, because his only job is to verify that you have a valid-looking boarding pass and that it matches your ID. Then, when you go to board the plane, you give the gate agent your real boarding pass. Since they don’t check ID, you can just walk onboard.

What’s happened is that whoever designed this system violated a basic security principle that’s one of the first things protocol designers learn: information you’re using to make a decision has to be the information you verify. Unfortunately, that’s not the case here. The identity that’s being verified is what’s written on a piece of paper and the identity that’s being used to check the watchlist is in some computer database which isn’t tied to the paper in any way other than your computer and printer, which are easy to subvert.
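The mismatch Rescorla describes can be sketched in a few lines of Python. This is purely illustrative — the function names and checks are my assumptions, not a description of any real airport or CAPPS system; the point is only that the watchlist lookup keys off the reservation name, while the ID check keys off a passenger-printed piece of paper:

```python
# Illustrative sketch (assumed names, not a real system): the watchlist
# is consulted against the reservation name, but the ID check at the
# checkpoint is done against an attacker-printed boarding pass.

WATCHLIST = {"Guy Lafleur"}

def capps_check(reservation_name):
    # Run against the name on the airline's reservation record.
    return reservation_name in WATCHLIST

def checkpoint_screening(boarding_pass_name, id_name):
    # Checkpoint: only verifies that the boarding pass (printed at
    # home, hence attacker-controlled) matches the photo ID.
    return boarding_pass_name == id_name

def gate_check(boarding_pass_name, manifest):
    # Gate: checks the pass against the manifest, but never asks for ID.
    return boarding_pass_name in manifest

manifest = {"Babe Ruth"}                  # ticket bought under a clean name

flagged = capps_check("Babe Ruth")        # clean name: no watchlist hit
cleared = checkpoint_screening("Guy Lafleur", "Guy Lafleur")  # modified printout + real ID
boarded = gate_check("Babe Ruth", manifest)                   # genuine printout, no ID asked

print(flagged, cleared, boarded)  # False True True: a watchlisted passenger flies
```

No single check ties the verified identity to the identity that was screened — each check passes in isolation, which is exactly the violated principle Rescorla names.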

In a later post, he discusses some ways to fix the problem.

Warning Fatigue

One of the many problems facing security engineers is warning fatigue – the tendency of users who have seen too many security warnings to start ignoring the warnings altogether. Good designers think carefully about every warning they display, knowing that each added warning will dilute the warnings that were already there.

Warning fatigue is a significant security problem today. Users are so conditioned to warning boxes that they click them away, unread, as if instinctively swatting a fly.

Which brings us to H.R. 2752, the “Author, Consumer, and Computer Owner Protection and Security (ACCOPS) Act of 2003”, introduced in the House of Representatives in July, and discussed by Declan McCullagh in his latest column. The bill would require a security warning, and user consent, before allowing the download of any “software that, when installed on the user’s computer, enables 3rd parties to store data on that computer, or use that computer to search other computers’ contents over the Internet.”

Most users already know that downloading software is potentially risky. Most users are already accustomed to swatting away warning boxes telling them so. One more warning is unlikely to deter the would-be KaZaa downloader.

This is especially true given that the same warning would have to be placed on many other types of programs that meet the bill’s criteria, including operating systems and web browsers. The ACCOPS warning will be just another of those dialog boxes that nobody reads.