April 23, 2021

Voting Machine Hashcode Testing: Unsurprisingly insecure, and surprisingly insecure

By Andrew Appel and Susan Greenhalgh

The accuracy of a voting machine is dependent on the software that runs it. If that software is corrupted or hacked, it can misreport the votes.  There is a common assumption that we can check the legitimacy of the software that is installed by checking a “hash code” and comparing it to the hash code of the authorized software.  In principle, the scheme works like this:  Software provided by the voting-machine vendor examines all the installed software in the voting machine, to make sure it’s the right stuff.

There are some flaws in this concept:  it’s hard to find “all the installed software in the voting machine,” because modern computers have many layers underneath what you examine.  But mainly, if a hacker can corrupt the vote-tallying software, perhaps they can corrupt the hash-generating function as well, so that whenever you ask the checker “does the voting machine have the right software installed,” it will say, “Yes, boss.”  Or, if the hasher is designed not to say “yes” or “no,” but to report the hash of what’s installed, it can simply report the hash of what’s supposed to be there, not what’s actually there. For that reason, election security experts never placed much reliance on this hash-code idea; instead they insist that you can’t fully trust what software is installed, so you must achieve election integrity by doing recounts or risk-limiting audits of the paper ballots.

But you might have thought that the hash-code could at least help protect against accidental, nonmalicious errors in configuration.  You would be wrong.  It turns out that ES&S has bugs in their hash-code checker:  if the “reference hashcode” is completely missing, then it’ll say “yes, boss, everything is fine” instead of reporting an error.  It’s simultaneously shocking and unsurprising that ES&S’s hashcode checker could contain such a blunder and that it would go unnoticed by the U.S. Election Assistance Commission’s federal certification process. It’s unsurprising because testing naturally tends to focus on “does the system work right when used as intended?”  Using the system in unintended ways (which is what hackers would do) is not something anyone will notice.

Until somebody does notice.  In this case, it was the State of Texas’s voting-machine examiner, Brian Mechler.  In his report dated September 2020 he found this bug in the hash-checking script supplied with the ES&S EVS election system (for the ExpressVote touch-screen BMD, the DS200 in-precinct optical scanner, the DS450 and DS850 high-speed optical scanners, and other related voting machines).  (Read Section 7.2 of Mr. Mechler’s report for details).
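To make the failure mode concrete, here is a minimal sketch in Python of how such a bug can arise. This is a hypothetical reconstruction, not ES&S’s actual code: a checker that loops over a reference hash list will report success vacuously when that list is missing or empty, because the loop body never finds a mismatch.

```python
import hashlib
from pathlib import Path

def verify(install_dir: str, reference_file: str) -> bool:
    """Compare installed files against a reference hash list.

    BUG: if the reference file is missing or empty, the loop body never
    runs, no mismatch is ever recorded, and the check "passes".
    """
    mismatches = 0
    ref_path = Path(reference_file)
    lines = ref_path.read_text().splitlines() if ref_path.exists() else []
    for line in lines:
        expected, name = line.split(maxsplit=1)
        actual = hashlib.sha256((Path(install_dir) / name).read_bytes()).hexdigest()
        if actual != expected:
            mismatches += 1
    return mismatches == 0   # vacuously True when there was nothing to check

# A robust checker would instead fail closed, e.g.:
#   if not lines:
#       raise RuntimeError("reference hash list missing or empty")
```

The fix is the classic fail-closed principle: an absent reference is an error, never a pass.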

We can’t know whether that bug was intentional or not.  Either way, it’s certainly convenient for ES&S, because it’s one less hassle when installing firmware upgrades.  (Of course, it’s one less hassle for potential hackers, too.)

Another gem in Mr. Mechler’s report is in Section 7.1, in which he reveals that acceptance testing of voting systems is done by the vendor, not by the customer.  Acceptance testing is the process by which a customer checks a delivered product to make sure it satisfies requirements.  To have the vendor do acceptance testing pretty much defeats the purpose.  

When the Texas Secretary of State learned that their vendor was doing the acceptance testing themselves, the SoS’s Election Division took an action “to work with ES&S and their Texas customers to better define their roles and responsibilities with respect to acceptance testing,” according to the report. They may encounter a problem, though: the ES&S sales contract specifies that ES&S must perform the acceptance testing, or they will void your warranty (see clause 7b).

There’s another item in Mr. Mechler’s report, Section 7.3.  The U.S. Election Assistance Commission requires that “The vendor shall have a process to verify that the correct software is loaded, that there is no unauthorized software, and that voting system software on voting equipment has not been modified, using the reference information from the [National Software Reference Library] or from a State designated repository. The process used to verify software should be possible to perform without using software installed on the voting system.”  This requirement is usually interpreted to mean, “check the hash code of the installed software against the reference hash code held by the EAC or the State.”

But ES&S’s hash-checker doesn’t do that at all.  Instead, ES&S instructs its techs to create some “golden” hashes from the first installation, then subsequently check the hash code against these.  So whatever software was first installed gets to be “golden”, regardless of whether it’s been approved by the EAC or by the State of Texas. This design decision was probably a convenient shortcut by engineers at ES&S, but it directly violates the EAC’s rules for how hash-checking is supposed to work.
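The difference between the two designs can be made concrete in a few lines. This is an illustrative sketch, not vendor code: the EAC requirement calls for comparing against a reference hash published by an external authority, whereas the “golden hash” approach snapshots whatever happened to be installed first, so an unauthorized first install validates itself forever after.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# EAC-style check: the expected hash comes from an external authority
# (e.g. the National Software Reference Library), not from the machine.
def eac_style_check(firmware: Path, authorized_hash: str) -> bool:
    return sha256_of(firmware) == authorized_hash

# "Golden hash" approach: the baseline is whatever was first installed.
# If the first install was already unauthorized, every later check passes.
def make_golden(firmware: Path) -> str:
    return sha256_of(firmware)

def golden_check(firmware: Path, golden: str) -> bool:
    return sha256_of(firmware) == golden
```

The golden-hash scheme still detects later tampering, but it verifies consistency, not authorization, and that is exactly the gap the EAC rule is meant to close.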

So, what have we learned?

We already knew that hash codes can’t protect against hackers who install vote-stealing software, because the hackers can also install software that lies about the hash code.  But now we’ve learned that hash codes are even more useless than we might have thought.  This voting-machine manufacturer

  • has a hash-code checker that erroneously reports a match even when you forget to tell it what to match against;
  • checks the hash against whatever was first installed, not against the authorized reference it’s supposed to use;
  • and insists on running this check itself, not letting the customer do it, on pain of voiding the warranty.

As a bonus we learned that the EAC certifies voting systems without checking if the validation software functions properly. 

Are we surprised?  You know: fool me once, shame on you; fool me twice, shame on me.  Every time that we imagine that a voting-machine manufacturer might have sound cybersecurity practices, it turns out that they’ve taken shortcuts and they’ve made mistakes.  In this, voting-machine manufacturers are no different from any other makers of software.  There’s lots of insecure software out there made by software engineers who cut corners and don’t pay attention to security, and why should we think that voting machines are any different?

So if we want to trust our elections, we should vote on hand-marked paper ballots, counted by optical scanners, and recountable by hand.  Those optical scanners are pretty accurate when they haven’t been hacked — even the ES&S DS200 — and it’s impractical to count all the ballots without them.  But we should always check up on the machines by doing random audits of the paper ballots.  And those audits should be “strong” enough — that is, use good statistical methods and check enough of the ballots — to catch the mistakes that the machines might make, if the machines make mistakes (or are hacked).  The technical term for those “strong enough” audits is Risk-Limiting Audit.
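The statistical intuition behind such audits can be sketched briefly. This is a simplified illustration, not the method itself (real risk-limiting audits use more refined procedures such as ballot-polling or ballot-comparison audits): if a fraction f of ballots is miscounted, a uniform random sample of n ballots misses all of them with probability (1-f)^n.

```python
# Why random ballot audits work: if a fraction f of ballots are miscounted,
# the chance that a uniform random sample of n ballots contains at least one
# of them is 1 - (1 - f)**n.  (A real risk-limiting audit uses more refined
# statistics, but the intuition is the same.)

def detection_probability(f: float, n: int) -> float:
    return 1 - (1 - f) ** n

# A 0.25% miscount rate needs a sample of ~1,200 ballots for ~95% odds
# of including at least one affected ballot:
print(round(detection_probability(0.0025, 1200), 2))   # ≈ 0.95
```

The point of the “risk-limiting” framing is to turn this intuition into a guarantee: the audit keeps sampling until the chance of certifying a wrong outcome falls below a chosen risk limit.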

Andrew W. Appel is Professor of Computer Science at Princeton University.

Susan Greenhalgh is Senior Advisor on Election Security at Free Speech For People.

Georgia’s election certification avoided an even worse nightmare that’s just waiting to happen next time

Voters in Georgia polling places, 2020, used Ballot-Marking Devices (BMDs), touchscreen computers that print out paper ballots; then voters fed those ballots into Precinct-Count Optical Scan (PCOS) voting machines for tabulation. There were many allegations about hacking of Georgia’s Presidential election. Based on the statewide audit, we can know that the PCOS machines were not cheating (in any way that changed the outcome). But can we know that the touchscreen BMDs were not cheating? And what about next time? There’s a nightmare scenario waiting to happen if Georgia (or other states) continue to use touchscreen BMDs on a large scale.

Dominion ICX ballot-marking device used in Georgia polling places 2020. Voters use the touchscreen to select candidates, then a paper ballot is printed out, which the voter then feeds into the scanner for tabulation and for retention in a ballot box.
Dominion ICP optical-scanner used in Georgia polling places 2020.
25% of Georgia voters in 2020 voted by mail; they marked their optical-scan ballot by hand, so they didn’t need to worry about whether the computer that marked their ballot was hacked–no computer marked their ballot! This is a high-speed central-count scanner that counts mail-in ballots; the screen on the right is not a touch-screen for the voter, it’s a control computer for the election administrators. It’s legitimate to worry about whether the optical scanners are hacked—but the hand audits of the paper ballots (by people, not computers) resolved that question in Georgia 2020.

Part 1: What happened in November 2020

There were many allegations about hacking of Georgia’s voting-machine computers in the November 2020 election—accusations about who owned the company that made the voting machines, accusations about who might have hacked into the computers. An important principle of election integrity is “software independence,” which I’ll paraphrase as saying that we should be able to verify the outcome of the election without having to know who wrote the software in the voting machines.

Indeed, the State of Georgia did a manual audit of all the paper ballots in the November 2020 Presidential election. The audit agreed with the outcome claimed by the optical-scan voting machines. This means:

  • The software in Georgia’s PCOS scanners is now irrelevant to the outcome of the 2020 Presidential election in Georgia, which has been confirmed by the audit.
  • Georgia’s PCOS scanners were not cheating in the 2020 Presidential election (certainly not by enough to change the outcome), which we know because the hand-count audits closely agreed with the PCOS counts.
  • The audit gave election officials the opportunity to notice that several batches of ballots hadn’t even been counted the first time; properly counting those ballots changed the vote totals but not the outcome. I’ll discuss that in a future post.

Suppose the polling-place optical scanners had been hacked (enough to change the outcome). Then this would have been detected in the audit, and (in principle) Georgia would have been able to recover by doing a full recount. That’s what we mean when we say optical-scan voting machines have “strong software independence”—you can obtain a trustworthy result even if you’re not sure about the software in the machine on election day.

If Georgia had still been using the paperless touchscreen DRE voting machines that they used from 2003 to 2019, then there would have been no paper ballots to recount, and no way to disprove the allegations that the election was hacked. That would have been a nightmare scenario. I’ll bet that Secretary of State Raffensperger now appreciates why the Federal Court forced him to stop using those DRE machines (Curling v. Raffensperger, Case 1:17-cv-02989-AT Document 579).

But optical scanners are not the only voting machines in Georgia’s polling places. Every in-person Georgia voter uses two machines: first, voters select candidates on a touch-screen ballot-marking device (BMD) that prints out a ballot paper; then, they feed that ballot paper into a precinct-count optical scanner (PCOS). The software independence of BMDs is much more problematic.

The audit confirmed that the PCOS was not cheating. How do we know that the BMD was not cheating, printing different votes onto the ballot paper than what the voter selected on the touch screen? This is a much more difficult question, and it can’t be answered by any audit or recount of the ballot papers.

You might think, “the voter would notice if the ballot paper differs from what they indicated on the touch screen.” But two different scientific studies have shown that most voters don’t notice. Only about 7% of voters speak up if a touchscreen BMD fraudulently prints a wrong vote. And that’s just one estimate from one study—it might actually be overoptimistic.***

Biden got about 50.125% of the votes in Georgia, and Trump got 49.875%. Suppose, hypothetically, that 50.125% of the voters chose Trump, but (hypothetically) hacked BMDs were changing votes on 0.25% of the ballots, in favor of Biden. Then the result we’d see would be Biden 50.125%, and the recount would confirm that—because that’s what’s printed on the paper.

In this scenario, if 7% (1 out of 15) of voters carefully review their paper ballot, and 0.25% (1 out of 400) of paper ballots had votes for Biden when the voter had really chosen Trump, then we might expect 1 out of 6000 (15×400) voters to complain to the pollworkers. And the pollworkers would supposedly tell those voters, “no problem, don’t put that ballot into the PCOS, we’ll void that for you and you can mark a fresh ballot.” But all those other voters who didn’t carefully check the printout would still be voting for a candidate they didn’t intend to, and the hack would be successful.

You might think (in this hypothetical scenario), “at least some voters caught the BMDs cheating”. But even if a voter catches the machine cheating, so what? Election officials can’t void an entire election, or “correct” the vote totals, based on the say-so of 0.017% (that is, 1/6000) of the voters.

Did the touchscreen BMDs cheat in the Georgia 2020 Presidential Election? We can guess that they did not cheat this time, and here’s a weak basis for that guess: If the BMDs had been shifting enough votes from Trump to Biden to make a difference, then at least 0.017% of voters would have noticed. There were 5 million votes cast, so that’s about 833 voters statewide**. If those voters complained, then presumably the local news media would have carried contemporaneous reports of such “BMD vote flipping.” But we didn’t hear any such reports.**** So probably the BMDs weren’t flipping any votes.
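The arithmetic behind the one-in-6000 and 833-voter figures is just the back-of-envelope calculation from the scenario above, and it is easy to check:

```python
# Back-of-envelope arithmetic for the hypothetical BMD-hacking scenario.
review_rate = 1 / 15      # ~7% of voters carefully check the printout
flip_rate = 1 / 400       # 0.25% of ballots altered by the hypothetical hack
total_votes = 5_000_000   # approximate Georgia 2020 turnout

# A voter complains only if their ballot was flipped AND they checked it.
complaint_rate = review_rate * flip_rate

print(round(1 / complaint_rate))             # 6000: one complaint per 6000 voters
print(round(total_votes * complaint_rate))   # 833 complaining voters statewide
```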

That’s a pretty weak basis to assert that the BMDs weren’t cheating. But it could be a lot worse . . .

Part 2: The nightmare scenario just waiting to happen next time.

But what about the next election? Suppose in Georgia’s 2022 Senate election between Raphael Warnock and his Republican challenger (whoever that will be), one of those candidates wins with 50.125% of the vote. And suppose 100 voters statewide claim that the BMDs flipped their vote. What should Secretary of State Raffensperger do? He cannot change the election results based on the say-so of 100 voters—those voters might be mistaken (or lying) about what they indicated on the touch screen. He cannot fix it by a recount, because (if the BMDs were really cheating) the paper ballots are fraudulent. He will be in a bind, and there will be no way out. And no way out for the people of Georgia, either.

You might argue, “More than 7% of voters would notice that their paper ballot was incorrectly marked.” Even if that were true (there’s no evidence for it), it just means 2000 or 3000 voters statewide (10 or 20 per county) would have noticed, instead of just 833. The problem is the same: even if they notice, there’s no way to correct the election.

The solution is simple.  Voters should mark their optical-scan bubble ballots with a pen.  That way, you know the recount is counting the ballots that the voter actually marked. Touchscreen BMDs (which also have audio interfaces for blind voters) should be reserved for those voters with disabilities who cannot mark a paper ballot by hand.

Georgia should continue using their PCOS (optical scan) voting machines, which will readily count hand-marked optical-scan “bubble” ballots. No major investment in new equipment is needed. This change can easily be implemented before the next election.

And other states and counties that are considering BMDs-for-all-voters—some counties in Pennsylvania and New Jersey have bought those, New York is considering them—should consider the nightmare scenario, and stick with hand-marked paper ballots.

Everything I’ve described here is consistent with the peer-reviewed scientific paper,  Ballot-Marking Devices Cannot Assure the Will of the Voters, by Andrew W. Appel, Richard A. DeMillo, and Philip B. Stark, in Election Law Journal, vol. 19 no. 3, pp. 432-450, September 2020. [non-paywall version here]

Georgia’s law doesn’t actually say what’s required if the audit detects a problem. The law doesn’t specify that audit results are binding on official results. This year that didn’t matter, because the audit agreed with the official outcome.

*Georgia’s audit was done by examining the ballots with human eyes. Later, at the request of the Trump campaign, Georgia also did a recount using their central-count optical scanners. If those optical scanners had been hacked to cheat consistently with (hypothetically) cheating precinct-count optical scanners, then the machine recount wouldn’t catch the fraud. For that reason, a hand-count is more effective protection than a machine recount. In any case, all three counts (the polling-place count using PCOS, the audit, and the machine recount) showed a Biden victory, although their actual numbers of votes differed.

**Actually, this year a large proportion of Georgians voted by mail, on hand-marked paper ballots, so they didn’t use BMDs at all. Those votes are safe from BMD hacks. But it doesn’t change the “833 voters statewide” result of my analysis.

***That statistic (“7% of voters will notice if the BMD prints the wrong candidate on their ballot”) comes from a single study in Michigan. Here’s why it might be overoptimistic, as applied to this voting machine and these voters. First, look at the BMD ballot and how hard it is to read.***** In November, one observer watched a constant stream of voters during about 20 minutes in Cobb County: they voted without a glance at their paper ballots, but then they told the poll workers that they had checked them. It is just too much trouble to try to read and check them.  In the January 2021 Senate runoffs, another observer saw that only 6 of 46 voters even glanced at the paper—which is not the same as checking it carefully.

****We would like to think “there was no local news reporting of BMD-flipped votes” means that “BMDs didn’t flip votes”. But so much of Georgia is quite rural with very little local reporting, and certainly without the experience to know how to even report something like that. And (in other elections) it often happens that there are verified stories of discrepancies months after the election that never made it to any newspaper.

*****I mean, really! It’s not easy to decode the paper printout. In the Senate race, this is what the ballot says:

For United States Senate (Loeffler) -
Special (Vote for One) (NP)
   Vote for Annette Davis Jackson

Is that a vote for Kelly Loeffler, whose name appears on the first line? Apparently not, I’d guess it’s a vote for Annette Davis Jackson. And what does (NP) mean? And what does (I) mean attached to votes for many other candidates? Certainly (I) does not mean Independent. This ballot is a masterpiece of bad design, and it’s no wonder that real-life voters are discouraged from looking at it very carefully.

Edited 8 February 2021 to correct 83 to 833.

ES&S voting machine company sends threats

For over 15 years, election security experts and election integrity advocates have been communicating to their state and local election officials the dangers of touch-screen voting machines. The danger is simple: if fraudulent software is installed in the voting machine, it can steal votes in a way that a recount wouldn’t be able to detect or correct. That was true of the paperless touchscreens of the 2000s, and it’s still true of the ballot-marking devices (BMDs) and “all-in-one” machines such as the ES&S ExpressVote XL voting machine (see section 8 of this paper*). This analysis is based on the characteristics of the technology itself, and doesn’t require any conspiracy theories about who owns the voting-machine company.

In contrast, if an optical-scan voting machine was suspected to be hacked, the recount can assure an election outcome reflects the will of the voters, because the recount examines the very sheets of paper that the voters marked with a pen. In late 2020, many states were glad they used optical-scan voting machines with paper ballots: the recounts could demonstrate conclusively that the election results were legitimate, regardless of what software might have been installed in the voting machines or who owned the voting-machine companies. In fact, the vast majority of the states use optical-scan voting machines with hand-marked paper ballots, and in 2020 we saw clearly why that’s a good thing.

In November and December 2020, certain conspiracy theorists made unsupportable claims about the ownership of Dominion Voting Systems, which manufactured the voting machines used in Georgia. Dominion has sued for defamation.

Dominion is the manufacturer of voting machines used in many states. Its rival, Election Systems and Software (ES&S), has an even bigger share of the market.

Apparently, ES&S must think that amongst all that confusion, the time is right to send threatening Cease & Desist letters to the legitimate critics of their ExpressVote XL voting machine. Their lawyers sent this letter to the leaders of SMART Elections, a journalism+advocacy organization in New York State who have been communicating to the New York State Board of Elections, explaining to the Board why it’s a bad idea to use the ExpressVote XL in New York (or in any state).

ES&S’s lawyers claim that certain facts (which they call “accusations”) are “false, defamatory, and disparaging”, namely: that the “ExpressVote XL can add, delete, or change the votes on individual ballots”, that the ExpressVote XL will “deteriorate our security and our ability to have confidence in our elections,” and that it is a “bad voting machine.”

Well, let me explain it for you. The ExpressVote XL, if hacked, can add, delete, or change votes on individual ballots — and no voting machine is immune from hacking. That’s why optical-scan voting machines are the way to go, because they can’t change what’s printed on the ballot. And let me explain some more: The ExpressVote XL, if adopted, will deteriorate our security and our ability to have confidence in our elections, and indeed it is a bad voting machine. And expensive, too!

It’s been clearly explained in the peer-reviewed literature how touch-screen voting machines–even the ones like the XL that print out paper ballots–can (if hacked) alter votes; and how most voters won’t notice; and how even if some voters do notice, there’s no way to correct the election result. And it’s been explained why machines like the ExpressVote XL are particularly insecure–as I said, see section 8 of this paper*.

And it’s pretty clear that the folks at SMART Elections are aware of these scientific studies, and are basing their journalism and advocacy on good science.

I’ll summarize here what’s explained in the paper: how the ExpressVote XL, if hacked, can change votes. If the machine is hacked, the software can do whatever the hacker has programmed, but the hacker can’t change the hardware. The hardware includes a thermal printer that can make black marks (i.e., print text or barcodes or whatever) on the paper, but the hardware can’t erase marks. Therefore you might think the ExpressVote XL, even if hacked, couldn’t alter votes. But consider this: suppose there are 15 contests on the ballot; suppose the voter makes choices in 14 of them and chooses not to vote for State Senator. Then what the legitimate software does is, in the line for State Senator, print NO SELECTION MADE. But the hacked software could simply leave that line blank–then, when the voter has reviewed the ballot (or not bothered to), the ballot card is pulled past the printhead into the ballot box, and the printhead (under control of hacked software) can print in a vote for Candidate Smith. Few voters will be worried that the line is blank rather than filled in with NO SELECTION MADE.

You might think, “OK, the ExpressVote XL can fill in undervotes, that’s bad, but it can’t change votes.” But it can! Here is the mechanism: Suppose the voter makes choices in all 15 contests, and chooses Jones for State Senator. The hacked software can print a ballot card with only 14 contests, and leave blank spaces for State Senator. Then, after the voter reviews the ballot card behind glass, the card moves past the printhead into the ballot box. At this time the hacked software can print the hacker’s choice (Smith) for State Senator. If most humans were really good at checking their printout line-by-line with what they marked on the touchscreen, this wouldn’t succeed because the voter would notice the missing line, but voters are only human.
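The two-phase attack described above can be modeled in a few lines. This is a conceptual sketch only; the class and function names are invented for illustration and are not taken from any real voting-machine codebase. The key constraint the model enforces is the one in the paper: a thermal printhead can add marks to the paper but can never erase them.

```python
# Conceptual model of the deferred-printing attack.  The ballot card is
# append-only, because a thermal printhead can add marks but never erase.

class BallotCard:
    def __init__(self):
        self.lines = []            # marks already on the paper

    def print_line(self, text):
        self.lines.append(text)    # the hardware can add but never erase

def honest_bmd(card, selections):
    for contest, choice in selections.items():
        card.print_line(f"{contest}: {choice or 'NO SELECTION MADE'}")

def hacked_bmd(card, selections, target_contest, hackers_choice):
    # Phase 1: print every contest EXCEPT the targeted one, leaving its
    # line blank while the voter reviews the card behind glass.
    for contest, choice in selections.items():
        if contest != target_contest:
            card.print_line(f"{contest}: {choice or 'NO SELECTION MADE'}")
    voter_reviews(card)
    # Phase 2: after review, as the card moves past the printhead into the
    # ballot box, print the hacker's choice into the blank space.
    card.print_line(f"{target_contest}: {hackers_choice}")

def voter_reviews(card):
    pass  # per the studies cited above, most voters don't check line-by-line

selections = {"President": "Jones", "State Senator": "Jones"}
card = BallotCard()
hacked_bmd(card, selections, "State Senator", "Smith")
# The paper in the ballot box now records a vote for Smith, not Jones,
# and no recount of that paper can reveal the switch.
```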

More details and explanation are in the paper*.

* Ballot-Marking Devices Cannot Assure the Will of the Voters, by Andrew W. Appel, Richard A. DeMillo, and Philip B. Stark. Election Law Journal, vol. 19 no. 3, pp. 432-450, September 2020. Non-paywall version, differs in formatting and pagination.