
Latest voting system analysis from California

This summer, the California Secretary of State commissioned a first-ever “Top to Bottom Review” of all the electronic voting systems used in the state. In August, the results of the first round of review were published, finding significant security vulnerabilities and a variety of other problems with the three vendors reviewed at the time. (See the Freedom to Tinker coverage for additional details.) The ES&S InkaVote Plus system, used in Los Angeles County, wasn’t included in this particular review. (The InkaVote is apparently unrelated to the ES&S iVotronic systems used elsewhere in the U.S.) The reports on InkaVote are now public.

(Disclosure: I was a co-author of the Hart InterCivic source code report, released by the California Secretary of State in August. I was uninvolved in the current round of investigation and have no inside information about this work.)

First, it’s worth a moment to describe what InkaVote is actually all about.  It’s essentially a precinct-based optical-scan paper ballot system, with a template-like device, comparable to the Votomatic punch-card systems.  As such, even if the tabulation computers are completely compromised, the paper ballots remain behind with the potential for being retabulated, whether mechanically or by hand.

The InkaVote reports represent work done by a commercial firm, atsec, whose primary business is performing security evaluations against a variety of standards, such as FIPS-140 or the ISO Common Criteria. The InkaVote reports are quite short (or, at least, the public reports are short); in effect, we only get to see high-level bullet points rather than detailed explanations of what the reviewers found. Furthermore, their analysis was apparently compressed into an impossibly short two-week period, which means there are likely additional issues that exist but simply went undiscovered for lack of time. Despite this, we still get a strong sense of how vulnerable these systems are.

From the source code report:

The documentation provided by the vendor does not contain any test procedure description; rather, it provides only a very abstract description of areas to be tested. The document mentions test cases and test tools, but these have not been submitted as part of the TDP and could not be considered for this review. The provided documentation does not show evidence of “conducting of tests at every level of the software structure”. The TDP and source code did not contain unit tests, or any evidence that the modules were developed in such a way that program components were tested in isolation. The vendor documentation contains a description of cryptographic algorithms that is inconsistent with standard practices and represented a serious vulnerability. No vulnerability assessment was made as part of the documentation review because the attack approach could not be identified based on the documentation alone. (The source review identified additional specific vulnerabilities related to encryption).

This is consistent, for better or for worse, with what we’ve seen from the other vendors. Under those circumstances, security vulnerabilities are practically inevitable. So, what kinds of vulnerabilities were found?

In the area of cryptography and key management, multiple potential and actual vulnerabilities were identified, including inappropriate use of symmetric cryptography for authenticity checking (A.8), use of a very weak homebrewed cipher for the master key algorithm (A.7), and key generation with artificially low entropy which facilitates brute force attacks (A.6). In addition, the code and comments indicated that a hash (checksum) method that is suitable only for detecting accidental corruption is used inappropriately with the claimed intent of detecting malicious tampering. The Red Team has demonstrated that due to the flawed encryption mechanisms a fake election definition CD can be produced that appears genuine, see Red Team report, section A.15.

106 instances were identified of SQL statements embedded in the code with no evidence of sanitation of the data before it is added to the SQL statement. It is considered a bad practice to build the SQL statements at runtime; the preferred method is to use predefined SQL statements using bound variables. A specific potential vulnerability was found and documented in A.10, SQL Injection.

Ahh, lovely (or, I should say, oy gevaldik). Curiously, the InkaVote tabulation application appears to have been written in Java – a good thing, because it eliminates the possibility of buffer overflows. Nonetheless, writing this software in a “safe” language is insufficient to yield a secure system.
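
To make the report’s SQL point concrete, here is a minimal Java sketch of the difference between building a query at runtime by string concatenation and using a bound-variable PreparedStatement. This is purely illustrative; the table and column names are invented and have nothing to do with the actual InkaVote code, which we haven’t seen.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class BallotLookup {
    // Risky pattern: the precinct name is pasted directly into the SQL text,
    // so a malicious value can change the meaning of the query.
    static ResultSet lookupUnsafe(Connection conn, String precinct) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery(
            "SELECT ballot_id, contest, choice FROM ballots WHERE precinct = '" + precinct + "'");
    }

    // Preferred pattern: the SQL text is fixed in advance and the precinct name
    // is passed as a bound variable, so it can never be interpreted as SQL.
    static ResultSet lookupSafe(Connection conn, String precinct) throws SQLException {
        PreparedStatement stmt = conn.prepareStatement(
            "SELECT ballot_id, contest, choice FROM ballots WHERE precinct = ?");
        stmt.setString(1, precinct);
        return stmt.executeQuery();
    }
}
```

A value like X' OR '1'='1 passed through the first method rewrites the query; the second method can’t be subverted that way, because the SQL statement is fixed before any data is bound.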

The reviewer noted the following items as impediments to an effective security analysis of the system:

  • Lack of design documentation at appropriate levels of detail.
  • Design does not use privilege separation, so all code in the entire application is potentially security critical.
  • Unhelpful or misleading comments in the code.
  • Potentially complex data flow due to exception handling.
  • Subjectively, large amount of source code compared to the functionality implemented.

The code constructs used were generally straightforward and easy to follow on a local level. However, the lack of design documentation made it difficult to globally analyze the system.

It’s clear that none of the voting system vendors that have been reviewed so far have had the engineering mandate (or the engineering talent) to build secure software systems that are suitably designed to resist threats that are reasonable to expect in an election setting. Instead, these vendors have produced systems that are “good enough” to sell, relying on external tamper-resistance mechanisms and human procedures. The Red Team report offers some insight into the value of these kinds of mitigations:

In the physical security testing, the wire and tamper proof paper seals were easily removed without damage to the seals using simple household chemicals and tools and could be replaced without detection (Ref item A.1 in the Summary Table). The tamper proof paper seals were designed to show evidence of removal and did so if simply peeled off but simple household solvents could be used to remove the seal unharmed to be replaced later with no evidence that it had been removed. Once the seals are bypassed, simple tools or easy modifications to simple tools could be used to access the computer and its components (Ref A.2 in summary). The key lock for the Transfer Device was unlocked using a common office item without the special ‘key’ and the seal removed. The USB port may then be used to attach a USB memory device which can be used as part of other attacks to gain control of the system. The keyboard connector for the Audio Ballot unit was used to attach a standard keyboard which was then used to get access to the operating system (Ref A.10 in Summary) without reopening the computer.

The seal used to secure the PBC head to the ballot box provided some protection but the InkaVote Plus Manual (UDEL) provides instructions for installing the seal that, if followed, will allow the seal to be opened without breaking it (Ref A.3 in the Summary Table). However, even if the seals are attached correctly, there was enough play and movement in the housing that it was possible to lift the PBC head unit out of the way and insert or remove ballots (removal was more difficult but possible). [Note that best practices in the polling place which were not considered in the security test include steps that significantly reduce the risk of this attack succeeding but this weakness still needs to be rectified.]

I’ll leave it as an exercise to the reader to determine what the “household solvents” or “common office item” must be.

AT&T Explains Guilt by Association

According to government documents studied by The New York Times, the FBI asked several phone companies to analyze phone-call patterns of Americans using a technology called “communities of interest”. Verizon refused, saying that it didn’t have any such technology. AT&T, famously, did not refuse.

What is the “communities of interest” technology? It’s spelled out very clearly in a 2001 research paper from AT&T itself, entitled “Communities of Interest” (by C. Cortes, D. Pregibon, and C. Volinsky). The authors use data-mining algorithms to scan the huge daily logs of every call made on the AT&T network, then analyze the connections between phone numbers: who is talking to whom? The paper literally uses the term “Guilt by Association” to describe what they’re looking for: which phone numbers are in contact with other numbers that are in contact with the bad guys?

When this research was done, back in the last century, the bad guys were people who wanted to rip off AT&T by making fraudulent credit-card calls. (Remember, back in the last century, intercontinental long-distance voice communication actually cost money!) But it’s easy to see how the FBI could use this to chase down anyone who talked to anyone who talked to a terrorist. Or even to a “terrorist.”

Here are a couple of representative diagrams from the paper:

Fig. 4. Guilt by association – what is the shortest path to a fraudulent node?

Fig. 5. A guilt by association plot. Circular nodes correspond to wireless service accounts while rectangular nodes are conventional land line accounts. Shaded nodes have been previously labeled as fraudulent by network security associates.
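
Stripped of the weighting and scoring machinery in the actual paper, the “shortest path to a fraudulent node” idea in Fig. 4 is ordinary graph search: start from numbers already flagged as fraudulent and walk the call graph outward. Here’s a minimal sketch using invented data structures; the paper’s real algorithms weight edges by calling patterns and are far more elaborate.

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.Set;

public class GuiltByAssociation {
    /**
     * Breadth-first search outward from known-bad phone numbers.
     * Returns, for every reachable number, the fewest hops to any flagged number.
     */
    static Map<String, Integer> hopsToFraud(Map<String, List<String>> callGraph,
                                            Set<String> flagged) {
        Map<String, Integer> distance = new HashMap<>();
        Queue<String> queue = new ArrayDeque<>();
        for (String bad : flagged) {       // flagged numbers start at distance 0
            distance.put(bad, 0);
            queue.add(bad);
        }
        while (!queue.isEmpty()) {
            String current = queue.remove();
            int d = distance.get(current);
            for (String neighbor : callGraph.getOrDefault(current, List.of())) {
                if (!distance.containsKey(neighbor)) {
                    distance.put(neighbor, d + 1);   // first visit is the shortest path
                    queue.add(neighbor);
                }
            }
        }
        return distance;
    }
}
```

Anyone a hop or two away from a flagged number lands near the top of the resulting suspect list — which is exactly the guilt-by-association inference the paper describes.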

Response to ITIF Voting Report

[This post was written by David Robinson and me, based on our discussions with Alex Halderman, Joe Calandrino, and Ari Feldman.]

On Tuesday, the Information Technology and Innovation Foundation released a report on the possible role of paper trails in auditing elections conducted using DRE machines. The report contained a blend of reasonable and unreasonable claims, and careful and uncareful argumentation. A lay reader might come away from the report – entitled Stop the Presses: How Paper Trails Fail to Secure e-Voting – with the belief that the addition of paper trails to DRE voting machines makes them less secure than they are on their own. Such a belief would be incorrect.

As the report puts it at one point, “The addition of paper audit trails to DRE voting machines would simply convert our elections back to a paper ballot system.” The report dwells at remarkable length on the convenient appearance of extra ballots during Lyndon Johnson’s political career. But we know about that cheating today precisely because paper ballots, unlike many DRE vote tallies, can be independently recounted.

One could spend months arguing about what exact position emerges from the 19 pages of delicately drafted hedging that make up the body of this report. But the bottom line – contrary to the impression most readers will gather from the report – is that paper and electronic voting together are, if done right, better than either the best paper system or the best computerized system would be alone.

The ITIF report suggests that a situation in which the paper and electronic records don’t match would be a disaster, since authorities wouldn’t know which record to trust. But that’s a shortsighted view. Divergent paper and electronic records are a sure sign that something has gone awry during voting. In some cases, that sign lets officials make a reasonable judgment about which record is, under the specific circumstances of a given race, more likely to be trustworthy.

The real worst-case scenario isn’t divergent paper and electronic records – with their attendant litigation and political discord. The real worst case is an attack or error that never even comes to the attention of election officials or the public, because there isn’t an independent way of catching problems.

E-Voting Ballots Not Secret; Vendors Don't See Problem

Two Ohio researchers have discovered that some of the state’s e-voting machines put a timestamp on each ballot, which severely erodes the secrecy of ballots. The researchers, James Moyer and Jim Cropcho, used the state’s open records law to get access to ballot records, according to Declan McCullagh’s story at news.com. The pair say they have reconstructed the individual ballots for a county tax referendum in Delaware County, Ohio.

Timestamped ballots are a problem because polling-place procedures often record the time or sequence of voters’ arrivals. For example, at my polling place in New Jersey, each voter is given a sequence number, which is recorded next to the voter’s name in the poll book and noted in notebooks kept by Republican and Democratic poll watchers. If I’m the 74th voter using the machine today, and the recorded ballots on that machine are timestamped or kept in order, then anyone with access to the records can figure out how I voted. That, of course, violates the secret ballot and opens the door to coercion and vote-buying.

Most e-voting systems that have been examined get this wrong. In the recent California top-to-bottom review, researchers found that the Diebold system stores ballots in the order they were cast, with timestamps (report pp. 49-50), and that the Hart (report p. 59) and Sequoia (report p. 64) systems “randomize” stored ballots in an easily reversible fashion. Add in this newly discovered ES&S problem, and the vendors are 0-for-4 in protecting ballot secrecy.

You’d expect the vendors to hurry up and fix these problems, but instead they’re just shrugging them off.

An ES&S spokeswoman at the Fleishman-Hillard public relations firm downplayed concerns about vote linking. “It’s very difficult to make a direct correlation between the order of the sign-in and the timestamp in the unit,” said Jill Friedman-Wilson.

This is baloney. If you know the order of sign-ins, and you can put the ballots in order by timestamp, you’ll be able to connect them most of the time. You might make occasional mistakes, but that won’t reassure voters who want secrecy.
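
How easy is “most of the time”? Here’s a minimal sketch of the linking step, using invented data structures rather than any vendor’s actual record format: sort the stored ballots by timestamp and pair them off against the recorded sign-in order.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class BallotLinker {
    record Ballot(long timestamp, String choices) {}

    /**
     * Pair the i-th voter to sign in with the i-th ballot by timestamp.
     * With per-ballot timestamps and a recorded sign-in order, this is all
     * an attacker needs to undo ballot secrecy on a single machine.
     */
    static Map<String, String> linkVotersToBallots(List<String> signInOrder,
                                                   List<Ballot> ballots) {
        List<Ballot> sorted = new ArrayList<>(ballots);
        sorted.sort(Comparator.comparingLong(Ballot::timestamp));
        Map<String, String> linked = new LinkedHashMap<>();
        int n = Math.min(signInOrder.size(), sorted.size());
        for (int i = 0; i < n; i++) {
            linked.put(signInOrder.get(i), sorted.get(i).choices());
        }
        return linked;
    }
}
```

The pairing only breaks down when two voters’ records get reordered — occasional noise, not meaningful protection for the secret ballot.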

You know things are bad when questions about a technical matter like security are answered by a public-relations firm. Companies that respond constructively to security problems are those that see them not merely as a PR (public relations) problem but as a technology problem with PR implications. The constructive response in these situations is to say, “We take all security issues seriously and we’re investigating this report.”

Diebold, amazingly, claims that they don’t timestamp ballots – even though they do:

Other suppliers of electronic voting machines say they do not include time stamps in their products that provide voter-verified paper audit trails…. A spokesman for Diebold Election Systems (now Premier Election Solutions), said they don’t for security and privacy reasons: “We’re very sensitive to the integrity of the process.”

You have to wonder why e-voting vendors are so much worse at responding to security flaw reports than makers of other products. Most software vendors will admit problems when they’re real, will work constructively with the problems’ discoverers, and will issue patches promptly. Companies might try PR bluster once or twice, but they learn that bluster doesn’t work and they’re just driving away customers. The e-voting companies seem to make the same mistakes over and over.

More California E-Voting Reports Released; More Bad News

Yesterday the California Secretary of State released the reports of three source code study teams that analyzed the source code of e-voting systems from Diebold, Hart InterCivic, and Sequoia.

All three reports found many serious vulnerabilities. It seems likely that computer viruses could be constructed that could infect any of the three systems, spread between voting machines, and steal votes on the infected machines. All three systems use central tabulators (machines at election headquarters that accumulate ballots and report election results) that can be penetrated without great effort.

It’s hard to convey the magnitude of the problems in a short blog post. You really have to read through the reports – the shortest one is 78 pages – to appreciate the sheer volume and diversity of severe vulnerabilities.

It is interesting (at least to me as a computer security guy) to see how often the three companies made similar mistakes. They misuse cryptography in the same ways: using fixed unchangeable keys, using ciphers in ECB mode, using a cyclic redundancy code for data integrity, and so on. Their central tabulators use poorly protected database software. Their code suffers from buffer overflows, integer overflow errors, and format string vulnerabilities. They store votes in a way that compromises the secret ballot.
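
One of those shared mistakes is worth spelling out, since it also shows up in the InkaVote findings above: a cyclic redundancy code only detects accidental corruption, because anyone who can alter the data can simply recompute the CRC to match. A keyed MAC, by contrast, can’t be recomputed without the secret key. Here’s a minimal Java sketch of the distinction; it is illustrative only, not any vendor’s actual code.

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class IntegrityCheckDemo {
    // A CRC32 checksum: anyone who alters the record can recompute this value,
    // so a matching CRC says nothing about deliberate tampering.
    static long crcOf(byte[] record) {
        CRC32 crc = new CRC32();
        crc.update(record);
        return crc.getValue();
    }

    // An HMAC: producing a matching tag requires the secret key, so an attacker
    // who changes the record (but lacks the key) cannot forge it.
    static byte[] hmacOf(byte[] record, byte[] key) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(record);
    }

    public static void main(String[] args) throws Exception {
        byte[] tampered = "candidate=A;votes=900".getBytes(StandardCharsets.UTF_8);
        // The attacker simply recomputes the CRC for the tampered record -- it "verifies".
        System.out.println("CRC of tampered record: " + crcOf(tampered));
        // The HMAC cannot be forged without the key.
        byte[] key = "example-secret-key".getBytes(StandardCharsets.UTF_8);
        System.out.println("HMAC tag length: " + hmacOf(tampered, key).length + " bytes");
    }
}
```

Of course, an HMAC only helps if the key is managed sensibly — fixed, unchangeable keys of the kind the reports describe undermine it just as thoroughly.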

Some of these are problems that the vendors claimed to have fixed years ago. For example, Diebold claimed (p. 11) in 2003 that its use of hard-coded passwords was “resolved in subsequent versions of the software”. Yet the current version still uses at least two hard-coded passwords – one is “diebold” (report, p. 46) and another is the eight-byte sequence 1,2,3,4,5,6,7,8 (report, p. 45).

Similarly, Diebold in 2003 ridiculed (p. 6) the idea that their software could suffer from buffer overflows: “Unlike a Web server or other Internet enabled applications, the code is not vulnerable to most ‘buffer overflow attacks’ to which the authors [Kohno et al.] refer. This form of attack is almost entirely inapplicable to our application. In the limited number of cases in which it would apply, we have taken the steps necessary to ensure correctness.” Yet the California source code study found several buffer overflow vulnerabilities in Diebold’s systems (e.g., issues 5.1.6, 5.2.3 (“multiple buffer overflows”), and 5.2.18 in the report).

As far as I can tell, major news outlets haven’t taken much notice of these reports. That in itself may be the most eloquent commentary on the state of e-voting: reports of huge security holes in e-voting systems are barely even newsworthy any more.