
Refuting Diebold's Response

Diebold issued a response to our e-voting report. While we feel our paper already addresses all the issues they raise, here is a point-by-point rebuttal. Diebold’s statement is in italics, our response in normal type.

Three people from the Center for Information Technology Policy and Department of Computer Science at Princeton University today released a study of a Diebold Election Systems AccuVote-TS unit they received from an undisclosed source. The unit has security software that was two generations old, and to our knowledge is not used anywhere in the country.

We studied the most recent software version available to us. The version we studied has been used in national elections, and Diebold claimed at the time that it was perfectly secure and could not possibly be subject to the kinds of malicious code injection attacks that our paper and video demonstrate. In short, Diebold made the same kinds of claims about this version – claims that turned out to be wrong – that they are now making about their more recent versions.

Normal security procedures were ignored. Numbered security tape, 18 enclosure screws and numbered security tags were destroyed or missing so that the researchers could get inside the unit.

This is incorrect. Far from ignoring Diebold’s “normal security procedures”, we made them a main focus of our study.

The tape and seals are discussed in our paper (e.g., in Section 5.2), where we explain why they are not impediments to the attacks we describe. The main attack does not require removal of any screws. Indeed, Diebold does not claim that these measures would prevent any of our attacks.

A virus was introduced to a machine that is never attached to a network.

This is irrelevant. Our paper describes how the virus propagates (see Sections 2.2.2 and 4.3) via memory cards, without requiring any network.

By any standard – academic or common sense – the study is unrealistic and inaccurate.

This is little more than name-calling.

For an academic evaluation, ask our academic colleagues. We’d be happy to provide a long list of names.

We demonstrated these problems on our video, and again in live demos on Fox News and CNN. Common sense says to believe your eyes, not unsubstantiated claims that a technology is secure.

The current generation of AccuVote-TS software – software that is used today on AccuVote-TS units in the United States – features the most advanced security features, including Advanced Encryption Standard 128 bit data encryption, Digitally Signed memory card data, Secure Socket Layer (SSL) data encryption for transmitted results, dynamic passwords, and more.

As above, Diebold does not assert that any of these measures would prevent the attacks described in our paper. Nor do we see any reason why they would.

These touch screen voting stations are stand-alone units that are never networked together and contain their own individual digitally signed memory cards.

As discussed above, the lack of networking is irrelevant. We never claim the machines are networked, and we explain in our paper (e.g. Sections 2.2.2 and 4.3) how the virus propagates using memory cards, without requiring a network.

Again, Diebold does not claim that these measures would prevent the attacks described in our paper.
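
To make the no-network point concrete, here is a toy sketch in Python. It is not the code from our paper, and every name in it is invented; it simply illustrates how malicious code can hop between standalone machines using nothing but the memory cards that election workers already carry from one machine to the next.

```python
# Toy illustration only: removable-media propagation in the abstract.
# None of this is the attack code described in the paper.

class MemoryCard:
    def __init__(self):
        self.carries_malware = False

class VotingMachine:
    def __init__(self, machine_id):
        self.machine_id = machine_id
        self.compromised = False

    def insert_card(self, card):
        # A machine that blindly trusts whatever is on an inserted card
        # lets infection flow in both directions: card to machine and
        # machine to card.
        if card.carries_malware:
            self.compromised = True
        if self.compromised:
            card.carries_malware = True

# One attacker-prepared card, handled during routine pre-election setup,
# reaches every machine it touches; no network connection anywhere.
machines = [VotingMachine(i) for i in range(5)]
card = MemoryCard()
card.carries_malware = True  # prepared with a brief period of physical access

for machine in machines:
    machine.insert_card(card)

print([machine.compromised for machine in machines])  # [True, True, True, True, True]
```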

In addition to this extensive security, the report all but ignores physical security and election procedures. Every local jurisdiction secures its voting machines – every voting machine, not just electronic machines. Electronic machines are secured with security tape and numbered security seals that would reveal any sign of tampering.

Our paper discusses physical security, election procedures, security tape, and numbered security seals. See, for example, Sections 3.3 and 5.2 of our paper. These sections and others explain why these measures do not prevent the attacks we describe. And once again, Diebold does not assert that they would.

Diebold strongly disagrees with the conclusion of the Princeton report. Secure voting equipment, proper procedures and adequate testing assure an accurate voting process that has been confirmed through numerous, stringent accuracy tests and third party security analysis.

Every voter in every local jurisdiction that uses the AccuVote-TS should feel secure knowing that their vote will count on Election Day.

Secure voting equipment and adequate testing would assure accurate voting – if we had them. To our knowledge, every independent third party analysis of the AccuVote-TS has found serious problems, including the Hopkins/Rice report, the SAIC report, the RABA report, the Compuware report, and now our report. Diebold ignores all of these results, and still tries to prevent third-party studies of its system.

If Diebold really believes its latest systems are secure, it should allow third parties like us to evaluate them.

"Hotel Minibar" Keys Open Diebold Voting Machines

Like other computer scientists who have studied Diebold voting machines, we were surprised at the apparent carelessness of Diebold’s security design. It can be hard to convey this to nonexperts, because the examples are technical. To security practitioners, the use of a fixed, unchangeable encryption key and the blind acceptance of every software update offered on removable storage are rookie mistakes; but nonexperts have trouble appreciating this. Here is an example that anybody, expert or not, can appreciate:

The access panel door on a Diebold AccuVote-TS voting machine – the door that protects the memory card that stores the votes, and is the main barrier to the injection of a virus – can be opened with a standard key that is widely available on the Internet.

On Wednesday we gave our Princeton Computer Science colleagues a live demo of the vote-stealing software described in our paper and video. Afterward, Chris Tengi, a technical staff member, asked to look at the key that came with the voting machine. He noticed an alphanumeric code printed on the key and remarked that he had a key at home with the same code on it. The next day he brought in his key, and sure enough it opened the voting machine.

This seemed like a freakish coincidence – until we learned how common these keys are.

Chris’s key was left over from a previous job, maybe fifteen years ago. He said the key had opened either a file cabinet or the access panel on an old VAX computer. A little research revealed that the exact same key is used widely in office furniture, electronic equipment, jukeboxes, and hotel minibars. It’s a standard part, and like most standard parts it’s easily purchased on the Internet. We bought several keys from an office furniture key shop – they open the voting machine too. We ordered another key on eBay from a jukebox supply shop. The keys can be purchased from many online merchants.

Using such a standard key doesn’t provide much security, but it does allow Diebold to assert that their design uses a lock and key. Experts will recognize the same problem in Diebold’s use of encryption – they can say they use encryption, but they use it in a way that neutralizes its security benefits.
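
To show in code what "checkbox" cryptography looks like, here is a minimal, hypothetical Python sketch. The key and record format are made up and have nothing to do with Diebold's actual software; the point is only that a fixed key baked into every unit lets anyone who has studied one machine forge records that every machine will accept.

```python
import hashlib
import hmac

# Hypothetical illustration of checkbox cryptography: records really are
# authenticated, but with a constant key shipped in every unit, so the
# "protection" stops no one who has examined a single machine.
HARDCODED_KEY = b"same-key-in-every-machine"  # invented for this example

def sign_record(record: bytes) -> bytes:
    return hmac.new(HARDCODED_KEY, record, hashlib.sha256).digest()

def verify_record(record: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign_record(record), tag)

# The machine happily verifies its own records...
legit = b"candidate_a=500;candidate_b=500"
assert verify_record(legit, sign_record(legit))

# ...but an attacker who knows the constant key can produce an equally
# "valid" tag for tampered data.
forged = b"candidate_a=900;candidate_b=100"
forged_tag = hmac.new(HARDCODED_KEY, forged, hashlib.sha256).digest()
assert verify_record(forged, forged_tag)  # passes: the checkbox is ticked, the data is not protected
```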

The bad guys don’t care whether you use encryption; they care whether they can read and modify your data. They don’t care whether your door has a lock on it; they care whether they can get it open. The checkbox approach to security works in press releases, but it doesn’t work in the field.

Update (Oct. 28): Several people have asked whether this entry is a joke. Unfortunately, it is not a joke.

Security Analysis of the Diebold AccuVote-TS Voting Machine

Today, Ari Feldman, Alex Halderman, and I released a paper on the security of e-voting technology. The paper is accompanied by a ten-minute video that demonstrates some of the vulnerabilities and attacks we discuss. Here is the paper’s abstract:

Security Analysis of the Diebold AccuVote-TS Voting Machine

Ariel J. Feldman, J. Alex Halderman, and Edward W. Felten
Princeton University

This paper presents a fully independent security study of a Diebold AccuVote-TS voting machine, including its hardware and software. We obtained the machine from a private party. Analysis of the machine, in light of real election procedures, shows that it is vulnerable to extremely serious attacks. For example, an attacker who gets physical access to a machine or its removable memory card for as little as one minute could install malicious code; malicious code on a machine could steal votes undetectably, modifying all records, logs, and counters to be consistent with the fraudulent vote count it creates. An attacker could also create malicious code that spreads automatically and silently from machine to machine during normal election activities – a voting-machine virus. We have constructed working demonstrations of these attacks in our lab. Mitigating these threats will require changes to the voting machine’s hardware and software and the adoption of more rigorous election procedures.

9/11

Five years ago this morning I was in a hotel room in Minneapolis, getting dressed. I flipped on the TV and saw smoke streaming from a skyscraper. Nobody knew yet what it meant.

My plan had been to meet a colleague in the lobby and walk over to our meeting. Everybody in the lobby area was watching the big-screen TV in the bar. It’s there that I saw the second plane hit.

There was nothing to do but go to the meeting. Not much got accomplished and we all spent much of the day in my hosts’ conference room watching a projected image of CNN. Much later I visited the same room and found a big painting of a firefighter hanging near where I had stood that day.

My wife and I had just moved to Palo Alto, California for a sabbatical year. The attacks affected folks in Palo Alto and Princeton quite differently. In Palo Alto, it happened during breakfast. Families were together; many learned of the attacks by phone from East Coast friends and relatives, and spent the morning watching together. In Princeton, adults were at work and kids at school; most kids learned of the attacks from parents who had had a few hours to think about what to say. In Princeton, the horrible question was: Who do we know who works There? Many people commute from Princeton to New York. The social network buzzed. Exactly where does M work? Exactly which train does he ride?

We didn’t lose any close friends, but at least two people I knew died. Later, reading the 9/11 report, I learned that one of them had been killed horribly by the hijackers to intimidate the other passengers. Several people we know were scarred. One man, who had been staying in a hotel across the street from the Trade Center, was haunted by images of falling bodies. A new doctor who had emergency duty at a Lower Manhattan hospital sent an email that I wish you could read.

As for myself, I was stuck in Minneapolis. As the week went on with no definite date of departure, we extended our meetings, trying to put our time to use. The hotel quickly emptied, as cancellations flooded in and those who could get home bolted. The few remaining guests bonded with the staff. One morning in the coffee shop, I was the only customer. The waitress sat down at my table and we had a long talk about what it all meant. I visit that hotel occasionally, and it still feels different to me than every other hotel in the world.

Eventually the airports reopened and I was on one of the first flights out of Minneapolis. The security screeners were jittery and ultra-vigilant, but also polite. I was disconcerted to note that nobody ever checked my ID that morning. When I mentioned this to the flight attendant, she quietly told me not to bring it up again.

I was happy to be at home and looked forward to some quiet time. Little did I know that I was about to be called to Washington for the final settlement talks in the Microsoft antitrust case. A month working in a DOJ building, in immediate post-9/11, post-anthrax Washington, is an experience not soon forgotten. Perhaps I’ll write about that next year.

E-Voting, Up Close

Recently the Election Science Institute released a fascinating report on real experience with e-voting technologies in a May 2006 primary election in Cuyahoga County, Ohio (which includes Cleveland). The report digs beneath the too-frequent platitudes of the e-voting debates to see how voters, poll workers, and officials actually use the technology, what really goes wrong in practice, and how well records are kept. The results are sobering.

Cuyahoga County deserves huge credit for allowing this study. Too often, voting officials try to avoid finding problems, rather than avoiding having problems. It takes courage to open one’s own processes to this kind of scrutiny, but it is the best way to improve. Cuyahoga County has done us all a service.

The election used Diebold electronic voting systems with Diebold’s add-on voter verified paper trail (VVPT) facility. One of the most widely discussed parts of the report describes ESI’s attempt to reconcile the VVPT with the electronic records kept by the voting machines. In about 10% of the machines, the paper record was spoiled: the paper roll was totally blank, or scrunched and smeared beyond reconstruction, or broken and taped back together, or otherwise obviously wrong. Had the election required a recount, this could have been a disaster – roughly 10% of the votes would not have been backed by a useful paper record, and Ohio election law says the paper record is the official ballot.

What does this teach us? First, the design of this particular VVPT mechanism needs work. It’s not that hard to make a printer that works more than 90% of the time. Printer malfunctions can never be eliminated completely, but they must be made very rare.

Second, we need to remember why we wanted to augment electronic records with a VVPT in the first place. It’s not that paper records are always more reliable than electronic records. The real reason we want to use them together is that paper and electronic recordkeeping systems have different failure modes, so that the two used together can be more secure than either used alone. In a well-designed system, an adversary who wants to create fraudulent ballots must launch two very different attacks, against the paper and electronic systems, and must synchronize them so that the fraudulent records end up consistent.
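
A back-of-the-envelope calculation makes the point, with the caveat that the numbers below are invented and the independence assumption is a deliberate simplification:

```python
# Invented numbers; independence between the two systems is assumed
# purely for illustration.
p_paper_defeated = 0.05       # chance an attacker defeats the paper record alone
p_electronic_defeated = 0.05  # chance an attacker defeats the electronic record alone

p_single_system = p_paper_defeated                            # paper-only (or electronic-only) design
p_both_consistent = p_paper_defeated * p_electronic_defeated  # must defeat both and keep them in sync

print(f"one system alone:    {p_single_system:.4f}")    # 0.0500
print(f"both, cross-checked: {p_both_consistent:.4f}")  # 0.0025
```

The multiplication only helps if the two systems actually fail independently and both keep usable records, which is exactly why the spoiled paper rolls matter: a VVPT that routinely fails stops contributing to the product.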

Third, this result illustrates why it’s important to audit some random subset of precincts or voting machines as a routine post-election procedure. Regular integrity-checking will help us detect problems, whether they’re caused by glitches or malicious attacks.
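
For readers who want the arithmetic, here is a sketch of the standard calculation (in Python, with made-up machine counts) behind why even a modest random audit is likely to catch tampering that touches many machines:

```python
from math import comb

def detection_probability(total: int, tampered: int, audited: int) -> float:
    """Chance that a uniformly random audit of `audited` machines includes
    at least one of the `tampered` machines (hypergeometric model)."""
    clean = total - tampered
    if audited > clean:
        return 1.0  # every possible sample must include a tampered machine
    p_miss_all = comb(clean, audited) / comb(total, audited)
    return 1 - p_miss_all

# Made-up example: 1,000 machines, 50 of them tampered with, 5% audited.
print(round(detection_probability(1000, 50, 50), 3))  # about 0.93
```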

There’s much more in the ESI report, including a summary of voting machine problems (power failures, inability to boot, broken security seals, etc.) reported from polling places, and some pretty pointed criticism of the county’s procedural laxity. The best system is one that can tolerate these kinds of problems, learn from them, and do a better job next time.