May 6, 2008

Shamos on paper trails

In an interview today with CNet, Michael Shamos talks about paper trails.  Shamos is a professor at CMU who has served as a voting system analyst for the Pennsylvania Secretary of State. In this article, a transcript of an interview conducted by Declan McCullagh, he spends a fair bit of time trashing paper trails, and by that, he’s referring to the “toilet paper roll” thermal printer attachments that are sold by the major U.S. voting system vendors.

He’s correct, to a limited extent.  He discusses a “20%” failure rate, which he probably gets from some problems in Ohio.  It’s certainly the case that these things are poorly engineered.  The ostensible reason for the continuous paper roll, as opposed to cutting the sheets individually, is improved reliability.  However, having the votes recorded in the order they were cast is a clear violation of voter privacy.  A more serious concern with paper trails is that it’s unclear whether voters will bother to double-check them at all.  I’ve pointed Freedom to Tinker readers at Sarah Everett’s PhD thesis before and it’s worth doing again.  The punchline is that roughly two-thirds of the test subjects didn’t notice when our homebrew DRE system was lying on its summary screen.  In fact, they gave our machine exceptionally high marks.  They loved it.

Shamos criticizes the EFF, VerifiedVoting, the League of Women Voters, and anybody else he can think of because they advocate for paper trails.  The solution they generally prefer is hand-marked optical-scan ballots.  These appear to have better accuracy, and paper ballots are, inherently, paper trails that give us an unfiltered window into the voters’ original intent.  Don’t interpret Shamos’s criticism of toilet-paper rolls as a criticism of hand-marked paper ballots.

Shamos goes on to make a flip comparison between “ATM technology” and voting systems, saying we could have reliable paper trails if we only spent 10x the cost.  This is a very strange argument.  ATMs are expensive because they have a safe full of cash inside.  It’s important that you can’t steal the cash, even if you’ve got time and tools at your disposal.  Voting systems (at least anywhere I’ll ever be likely to vote) don’t dispense money.  Building a reliable printer doesn’t need to be expensive.

Then Shamos gets into the meat of the argument for paper trails.

I’m not advocating that we blindly trust machines. We have to have a way to make sure the (record is correct). If anything happens to that piece of paper, if it gets substituted or lost, there’s absolutely no way to reconstruct the election. That’s unlike an electronic system, which is if one memory fails you have the other.

The security on ballot boxes is much lower than the security on voting machines themselves. In order to do anything with those pieces of paper, they have to be handled by people. What do you think happens?

If I want to screw up an election, all I have to do is modify five votes. Then we have to do a manual recount (which is vulnerable to tampering and ballot-stuffing).

This is completely false.  Paper records are redundant with the electronic records, and that’s a huge feature.  That means that you can compare them, either statistically in aggregate, or even one-to-one (assuming there are serial numbers, which could cause some privacy concerns, but maybe you can obscure those in barcodes).  It’s certainly the case that missing paper votes can be reconstructed from electronic records.  When you have both, you reconcile.  If there’s ambiguity, then you need to resolve that ambiguity.  You then have a forensic problem.  If all the tamper-evident stickers and locks on the paper ballot box were disturbed, maybe you’re more likely to trust the electronic parts.  If the totals are radically divergent, you can’t tell which set is more authentic, and the election is tight, then maybe the proper answer (from a scientific perspective) is to throw your hands up and say that, because of the fraud, you cannot legitimately determine who won the election.  This is defensible, scientifically, but it could lead to a political crisis.  Nobody ever said election administration was easy.
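
To make the one-to-one idea concrete, here is a minimal sketch (in Python) of what reconciling electronic records against paper records might look like, assuming each record carries a serial number that can be matched across the two sets.  The record format and names are invented for illustration; a real system would also have to deal with the privacy issues mentioned above.

    # Minimal sketch of one-to-one reconciliation between electronic and paper
    # records. Assumes each record carries a (hypothetical) serial number;
    # record formats and candidate names are invented for illustration.
    def reconcile(electronic, paper):
        """Compare two {serial: candidate} mappings and report matches,
        mismatches, and records that appear on only one side."""
        shared = electronic.keys() & paper.keys()
        return {
            "matched": sum(1 for s in shared if electronic[s] == paper[s]),
            "mismatched": sorted(s for s in shared if electronic[s] != paper[s]),
            "electronic_only": sorted(electronic.keys() - paper.keys()),  # paper missing
            "paper_only": sorted(paper.keys() - electronic.keys()),       # electronic missing
        }

    # Example: one substituted paper record (A002) and one missing one (A003).
    electronic = {"A001": "Smith", "A002": "Jones", "A003": "Smith"}
    paper      = {"A001": "Smith", "A002": "Smith"}
    print(reconcile(electronic, paper))

Anything that lands in the mismatched or missing buckets is exactly the forensic problem described above.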

Doing away with the paper only does away with evidence that might help you discover fraud.  Even if you cannot come up with the proper answer, it’s better to at least know you were under attack.

The fundamental difficulty with paper trails is that they’re ridiculously kludgey. The problem is that once you mandate paper trails, it cuts off research. There would be no reason to use anything else because it would be illegal.

Speaking as somebody who does research in electronic voting, I don’t feel that laws mandating paper trails would stop me from studying alternatives.  The 2007 VVSG standards process includes an “innovation class” for how vendors can get funky fresh technologies certified for use.  The trick is to make sure that the innovation class isn’t a loophole that vendors can use for the current crop of insecure equipment.

Does that mean you’re suggesting that we should be voting from insecure home computers even if they’re running Windows 98?
Shamos: I can point you to a mechanism (in a paper by Avi Rubin and Dan Wallach) that would allow secure voting on insecure terminals. The notion that the Internet is just not secure enough to do anything important is just wrong. It’s not insurmountable. The right people aren’t thinking about it because you gotta have a paper trail.

Really?  A recent paper that I just submitted to a workshop talked about how Internet voting might work, by virtue of having remote precincts set up in places like embassies and consulates, and using dedicated voting machines.  You could send the results home over the Internet.  Voting on dedicated voting machines with an Internet connection might be workable.  Voting on Windows 98 PCs would be an unmitigated disaster.  Botnets control literally millions of computers out there.  What if you’re voting from a botnet-infested computer?  Could the botnet modify your vote?  Why not?  For these sorts of reasons, the authors of the SERVE Report, including Avi Rubin, recommended strongly against voting on generic PCs.  Shamos says that Avi and I would support secure voting on insecure terminals?  Sure.  We’ll probably be beaten by the bioengineers working on flying pigs.

Update: In private email, Shamos states that he was citing our 2003 workshop paper, “Authentication for Remote Voting”.  That paper discusses how to do bidirectional remote authentication, which would certainly be applicable to an Internet-based remote voting system.  That paper, however, offers no technique that could allow for secure voting on insecure home computers.

I say, and the advocates are forced to admit it, that there’s never been any evidence that a DRE machine has been tampered with in an election. They say that doesn’t mean it never happened. I agree with that. But I believe deeply that if people were out there trying to hack elections we would see evidence of failed attempts.

Indeed, there’s no evidence to support a lack of tampering, but that’s meaningless.  A better way to look at this is that the incredibly poor security of modern paperless electronic voting systems makes it cheaper than it has ever been to manipulate votes.  The cost per vote for electronic manipulation is almost nil, particularly if you allow for viral attacks, where one corrupt DRE can take out the entire tabulation system (a vulnerability shown to apply to Hart InterCivic and Diebold as part of the California Top-to-Bottom Review reports from last summer).  Regardless of whether somebody has attempted an attack like this, it’s dirt cheap, cheaper than with paper, because manipulating paper takes more time and more labor.  The economic incentives are clearly in play for electronic election fraud.  The big question is whether it’s more cost-effective to manipulate voters through other means (e.g., dubious television advertising, robotic phone calls, etc.).

When a bridge collapses, do we outlaw bridges or do we inspect bridges of similar design? If the design itself is fundamentally flawed, then those bridges are going to have to be taken out of service and rebuilt. If there’s a fix, however, you can add a bracing member.

Excellent point.  DRE systems from all the major vendors have been conclusively shown to be fundamentally flawed in their design.  Even if and when the vendors patch their software, the time delay to push those patches through the certification process guarantees they won’t be ready for November.  Optically scanned paper ballots are available today and they work quite well (despite known security vulnerabilities in the tabulators).  Likewise, junky toilet-paper roll printers are available today, despite known problems with their ability to print and with voters’ ability to catch mistakes.

One last point:

Please don’t use the term “paperless.” It’s a construction of the advocates and it’s false and misleading. They’re not paperless. They just don’t produce a contemporaneous paper that the voter can view.

The word “paperless” is really insidious. The word “less” is meant to imply that they’re thereby missing something. Whoever decided to come up with the term “paperless” deserves a left-handed prize for their imagination. It’s wonderful for them. Paperless.

Yes, “paperless.”  It’s a fine word.  I’ve been using it for years.  It concisely captures the lack of redundancy, the reliance on poorly engineered software, and the risky nature of using paperless DRE voting systems for something as important as a national election.

Paperless electronic voting systems can be made better, using tricks like Benaloh’s challenge mechanism, which can catch a machine in the act if it tries to corrupt a vote.  We used a variant of his mechanism in our research prototype (paper to appear this summer at Usenix Security).  Nonetheless, I really like the term “paperless” when hooked to “electronic voting machine” because it creates a burden of proof for the system designer.  You want to go paperless?  Fine.  Prove to us that your system is secure.  Without paper, we’ll assume it’s insecure until proven otherwise.
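
For readers unfamiliar with the challenge mechanism, here is a minimal sketch of the cast-or-audit idea, with a hash commitment standing in for the encrypted ballot that a real system would use.  The point is the protocol flow: the machine must commit to the ballot before it knows whether the voter will cast or audit it, so a machine that lies risks being caught.  This is an illustration, not a secure implementation.

    # Minimal sketch of the cast-or-audit ("challenge") flow. A hash commitment
    # stands in for the encrypted ballot used in real systems; this illustrates
    # the protocol, it is not a secure implementation.
    import hashlib, os

    def commit(vote: str):
        """The machine commits to the ballot before knowing whether the voter
        will cast it or audit it."""
        nonce = os.urandom(16)
        digest = hashlib.sha256(nonce + vote.encode()).hexdigest()
        return digest, nonce   # digest is shown/printed; nonce stays internal

    def audit(digest: str, nonce: bytes, claimed_vote: str) -> bool:
        """If the voter audits, the machine must open the commitment. A machine
        that committed to something other than the claimed vote is caught."""
        return hashlib.sha256(nonce + claimed_vote.encode()).hexdigest() == digest

    digest, nonce = commit("Smith")       # machine internally recorded "Smith"
    print(audit(digest, nonce, "Smith"))  # True: the commitment opens correctly
    print(audit(digest, nonce, "Jones"))  # False: a lying machine is exposed

    # An audited ballot is discarded and the voter votes again, so the machine
    # never knows in advance which ballots will be checked.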

How can we require ID for voters?

Recently, HR 5036 was shot down in Congress.  That bill was to provide “emergency” money to help election administrators who wished to replace paperless voting systems with optically scanned paper ballots (or to add paper-printing attachments to existing electronic voting systems).  While the bill initially received strong bipartisan support, it was opposed at the last minute by the White House.  To the extent that I understand the political subtext of this, the Republicans wanted to attach a Voter ID requirement to the bill, and that gummed up the works.  (HR 5036 isn’t strictly dead, since it still has strong support, but it was originally fast tracked as a “non-controversial” bill, and it is now unlikely to gain the necessary 2/3 majority.)

I’ve been thinking for a while about this whole voter ID problem, and I have to say that I don’t really see a big problem with requiring that voters present ID so they can vote.  This kind of requirement is used in other countries like Mexico and it seems to work just fine.  The real issue is making sure that all people who might want to vote actually have IDs, which is a real problem for the apparently non-trivial number of current voters who lack normal ID cards (and who, we are led to believe, tend to vote in favor of Democrats).

The question then becomes how to get IDs for everybody.  One answer is to put election authorities in charge of issuing special voting ID cards.  This works in other countries, but nobody would ever support such a thing in the U.S. because it would be fantastically expensive and the last thing we need is yet another ID card.  The “obvious” solution is to use driver’s licenses or official state IDs (for non-drivers).  But, what if you’ve never had a driver’s license?

As an example, here is Texas’s list of requirements to get a driver’s license.  Notice how they also require you to have proof of a Social Security number?  If you’ve somehow managed to make it through life without getting one, and I imagine many poor people could live without one, then that becomes a significant prerequisite for getting a driver’s license.  And it’s pretty difficult to get an SSN if you’re unemployed and don’t have a driver’s license (see the Social Security Administration’s rules).

One way or another, you’re going to need your birth certificate.  Here’s how you get a copy of one in Texas.  If you don’t have any other form of ID, it’s pretty difficult to get your birth certificate as well.  You’ll either need an immediate relative with an ID to request your birth certificate on your behalf, or you’ll need utility bills in your name.  And if you’re older than 75, the state agency may not be able to help you, and who knows if the county where you were born has kept its older records properly.

It’s easy to see that somebody in this situation is going to find it difficult to navigate the bureaucratic maze.  If the only benefit they get, at the end of the day, is being allowed to vote, it’s pretty hard to justify the time and expense (the birth certificate costs $25, the Social Security card is free, and the state ID card costs $15 plus hours waiting in line).  For potential voters who don’t have a permanent home address, this process seems even less reasonable.

The only way I could imagine a voter ID requirement being workable (i.e., having a neutral effect on partisan elections) is if there were a serious amount of money budgeted to help people without IDs get them.  That boils down to an army of social workers digging around for historical birth records and whatever else, and that’s not going to be cheap.  However, I’m perfectly willing to accept a mandatory voter ID, as long as enough money is there to get one, for free, for anybody who wants one.  The government is willing to give you a $40 coupon toward a converter box so your analog TV can receive digital signals, as part of next year’s phase-out of analog broadcasts.  Why not help out with getting identification papers as part of phasing in an ID requirement?

[Sidebar: if you’re really concerned about people voting multiple times, the most effective solution has nothing to do with voter ID.  The simple, low-tech answer is to mark voters’ fingers with indelible ink.  It wears off after a while, it’s widely used throughout the world, and there’s no mistaking it for anything else.  I can’t wait for the day when I tune into my nightly newscast and see the anchor giving grief to the sportscaster because his thumb isn’t painted purple.]

California review of the ES&S AutoMARK and M100

California’s Secretary of State has been busy. It appears that ES&S (manufacturers of the Ink-a-Vote voting system, used in Los Angeles, as well as the iVotronic systems that made news in Sarasota, Florida in 2006) submitted its latest and greatest “Unity 3.0.1.1” system for California certification. ES&S systems were also considered by Ohio’s study last year, which found a variety of security problems.

California already analyzed the Ink-a-Vote. This time, ES&S submitted their AutoMARK ballot marking device, which has generated some prior fame for being more accessible than other electronic voting solutions, as well as having generated some prior infamy for having gone through various hardware changes without being resubmitted for certification. ES&S also submitted its M100 precinct-based tabulation systems, which would work in conjunction with the AutoMARK devices. (Most voters would vote with pen on a bubble sheet. The AutoMARK presents a fancy computer interface but really does nothing more than mark the bubble sheet on behalf of the voter.) ES&S apparently did not submit its iVotronic systems.

The results? Certification denied.

Let’s start with the letter from the Secretary to the vendor and work our way down.

ES&S failed to submit “California Use Procedures” to address issues that they were notified about back in December as part of their conditional certification of an earlier version of the system. This can only be interpreted as vendor incompetence. Here’s a choice quote:

ES&S submitted what it stated were its revised, completed California Use Procedures on March 4th. Staff spent several days reviewing the document, which is several hundred pages in length. Staff found revisions expressly called for in the testing reports, but found that none of the changes promised two months earlier in Mr. Groh’s letter of January 11, 2008, were included.

The accessibility report is very well done and should be required reading for anybody wanting to understand accessibility issues from a broad perspective. They found:

  • Physical access has some limitations.
  • There are some personal safety hazards.
  • Voters with severe manual dexterity impairments may not be able to independently remove the ballot from the AutoMARK and cast it.
  • The keypad controls present challenges for some voters.
  • It takes more time to vote with the audio interface.
  • The audio ballot navigation can be confusing.
  • Write-in difficulties frustrated some voters.
  • The voting accuracy was limited by write-in failures.
  • Many of the spoken instructions and prompts are inadequate.
  • The system lacks support for good public hygiene.
  • There were some reliability concerns.
  • The vendor’s pollworker training and materials need improvement.

Yet still, they note that “We are not aware of any public device that has more flexibility in accommodating the wide range of physical and dexterity abilities that voters may have. The key, as always, is whether pollworkers and voters will be able to identify and implement the optimal input system without better guidance or expert support. In fact, it may be that the more flexible a system is, the more difficult it is for novices to navigate through the necessary choices for configuring the access options in order to arrive at the best solution.” One of their most striking findings was how long it took test subjects to use the system. Audio-only voters needed an average of almost 18 minutes to use the machine on a simplified ballot (minimum 10 minutes; maximum 35 minutes). Write-in votes were exceptionally difficult. And, again, this is arguably one of the best voting systems available, at least from an accessibility perspective.

Okay, you were all waiting to learn more about the security problems. Let’s go. The “red team” exercise was performed by the Freeman Craft McGregor Group. It’s a bit skimpy and superficial. Nonetheless, they say:

  • You can swap out the PCMCIA memory cards in the precinct-based ballot tabulator (model M100), while in the precinct. This attack would be unlikely to be detected.
  • There’s no cryptography of any kind protecting the data written to the PCMCIA cards. If an attacker can learn the file format (which isn’t very hard to do), and can get physical access to the card while in transit or storage, then the attacker can trivially substitute alternative vote records.
  • The back-end “Election Reporting Manager” has a feature to add or remove votes from the vote totals. This would be visible in the audit logs, if anybody bothered to look at them, but these sorts of logs aren’t typically produced to the public. (Hart InterCivic has a very similar “Adjust Vote Totals” feature with similar vulnerabilities.)
  • The high speed central ballot tabulator (the M650) writes its results to a Zip disk, again with no cryptography or anything else to protect the data in transit.
  • The database in which audit records are kept has a password that can be “cracked” (we’re not told how). Once you’re into the database, you can create new accounts, delete old audit records, and otherwise cause arbitrary mayhem.
  • Generally speaking, a few minutes of physical access is all you need to compromise any of the back-end tools.
  • All of the physical key locks could be picked in “five seconds to one minute.” The wire and paper-sticker tamper-evidence seals could also be easily bypassed.

And then there’s the source code analysis, prepared by atsec (who normally make a living doing Common Criteria analyses). Again, the public report is less detailed than it could and should be (and we have no idea how much more is in the private report). Where should we begin?

The developer did not provide detailed build instructions that would explain how the system is constructed from the source code. Among the missing aspects were details about versions of compilers, build environment and preconditions, and ordering requirements.

This was one of our big annoyances when working on California’s original top-to-bottom review last summer. It’s fantastically helpful to be able to compile the program. You need to do that if you want to run various automated tools that might check for bugs. Likewise, there’s no substitute for being able to add debugging print statements and use other debugging techniques when you want to really understand how something works. Vendors should be required to provide not just source code but everything necessary to produce a working build of the software.

The M100 ballot counter is designed to load and dynamically execute binary files that are stored on the PCMCIA card containing the election definition (A.12) in cleartext without effective integrity protection (A.1).

Or, in other words, election officials must never, ever believe the results they get from electronic vote tabulation without doing a suitable random sample of the paper ballots, by hand, to make sure that the paper ballots are consistent with the electronic tallies. (Similarly fundamental vulnerabilities exist with other vendors’ precinct-based optical scanners.)
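
Here is a minimal sketch of the kind of random-sample check described above: pick precincts at random, hand-count their paper ballots, and compare against the electronic tallies, escalating if anything disagrees.  The data and sample size are invented for illustration; a real audit would need a statistically justified design (along the lines of a risk-limiting audit), not this toy version.

    # Minimal sketch of a random-sample audit of paper ballots against
    # electronic tallies. Data and sample size are invented for illustration.
    import random

    def sample_audit(electronic_tallies, hand_count, sample_size, rng=random):
        """electronic_tallies: {precinct: {candidate: votes}} from the tabulator.
        hand_count(precinct) returns the hand-counted totals for that precinct."""
        chosen = rng.sample(sorted(electronic_tallies), sample_size)
        discrepancies = {p: (electronic_tallies[p], hand_count(p))
                         for p in chosen if hand_count(p) != electronic_tallies[p]}
        return chosen, discrepancies   # any discrepancy should trigger escalation

    # Toy example: the tabulator's totals for precinct P2 were altered.
    reported = {"P1": {"Smith": 100, "Jones": 90},
                "P2": {"Smith": 120, "Jones": 80},
                "P3": {"Smith": 95,  "Jones": 105}}
    truth    = {"P1": {"Smith": 100, "Jones": 90},
                "P2": {"Smith": 80,  "Jones": 120},
                "P3": {"Smith": 95,  "Jones": 105}}
    print(sample_audit(reported, lambda p: truth[p], sample_size=2))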

The M100 design documentation contains a specification of the data structure layout for information stored on the PCMCIA card. The reviewer compared the actual structures as defined in the source code to the documentation, and none of the actual structures matched the specification. Each one showed significant differences to or omissions from the specification.

I require the students in my sophomore-level software engineering class to keep their specs in synchrony with their code as their code evolves. If college sophomores can do it, you’d think professional programmers could do it as well.

The user’s guide for the Election Reporting Manager describes how a password is constructed from publicly-available data. This password cannot be changed, and anyone reading the documentation can use this information to deduce the password. This is not an effective authentication mechanism.

While this report doesn’t get into the ES&S iVotronic, the iVotronic version 8 systems had three-character passwords, fixed from the factory. (They apparently fixed this in the version 9 software, which is now already a few years old.) You’d think they would have gone around and fixed this issue elsewhere in their software, since it’s so fundamental.

A.4 “EDM iVotronic Password Scramble Key and Algorithm”: A hardcoded key is used to obfuscate passwords before storing them in a database. The scrambling algorithm is very weak and reversible, allowing an attacker with access to the scrambled password to retrieve the actual password. The iVotronic is supported by the Unity software but is not being used for California elections.

Well, okay, maybe they didn’t fix the iVotronic passwords, then, either. Other passwords throughout the system are similarly hard-coded and/or poorly stored. And, given that, you can trivially tamper with any and all of the audit logs in the system that might otherwise contain records of what damage you might have done.

In the area of cryptography and key management, multiple potential and actual vulnerabilities were identified, including inappropriate use of symmetric cryptography for authenticity checking (A.9) and several different very weak homebrewed ciphers (A.4, A.7, A.8, A.11). In addition, the code and comments indicated that a checksum algorithm that is suitable only for detecting accidental corruption is used inappropriately with the claimed intent of detecting malicious tampering (A.1).

We’ve seen similarly ill-conceived mechanisms used by other vendors, so it’s unsurprising to see them here. The number one lesson these vendors should take home is thou shalt not implement thine own cryptography, particularly when the stuff they’re doing is all pretty standard and could be pulled from places like the OpenSSL library support code. And even then, you have to know what you’re doing. As Aggelos Kiayias once quipped, don’t use cryptography; use a cryptographer.
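
For a sense of how little code it takes to do this with a vetted library instead of a homebrewed cipher or checksum, here is a minimal sketch using Python’s standard hmac and hashlib modules to protect a results file against tampering in transit.  The file format and key handling are placeholders; distributing and protecting keys is the genuinely hard part, and if the verifier shouldn’t be able to forge results you’d want digital signatures rather than a shared-key MAC (that’s the “symmetric cryptography for authenticity” complaint in the report).

    # Minimal sketch of off-the-shelf integrity protection for a results file,
    # using the standard library instead of a homebrewed checksum or cipher.
    # Key management is waved away here and is the genuinely hard part.
    import hmac, hashlib

    def seal(results: bytes, key: bytes) -> bytes:
        """Append an HMAC-SHA256 tag so tampering in transit is detectable."""
        return results + hmac.new(key, results, hashlib.sha256).digest()

    def verify(sealed: bytes, key: bytes) -> bytes:
        """Return the payload if the tag checks out; raise otherwise."""
        payload, tag = sealed[:-32], sealed[-32:]
        expected = hmac.new(key, payload, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("results file failed integrity check")
        return payload

    key = b"per-election key provisioned out of band (placeholder)"
    sealed = seal(b"P1,Smith,100,Jones,90\n", key)
    print(verify(sealed, key))                 # payload comes back intact
    # verify(b"tampered" + sealed[8:], key)    # would raise ValueError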

The developers generally assume that input data will be supplied in the correct expected format. Many modules that process input data do not perform data validation such as range checks for input numbers or checking validity of internal cross references in interlinked data, leading to potentially exploitable vulnerabilities when those assumptions turn out to be incorrect.

They’re talking about buffer overflow vulnerabilities. This is one of the core techniques that an attacker might use to gain leverage. If an attacker compromises one solitary memory card on its way back to Election Central, then corrupt data on that card might be able to attack the tabulation system, and thus affect the outcome of the entire election. This report doesn’t contain enough information for us to conclude whether ES&S’s Unity systems are vulnerable in this fashion, but these are exactly the kinds of poor development practices that enable viral attacks.
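
Here is a minimal sketch of the kind of defensive parsing the reviewers say is missing: check lengths and ranges before trusting anything read from a memory card.  The record layout (contest id, candidate id, vote count) is invented for illustration; the point is that malformed or hostile data gets rejected instead of silently corrupting the tabulator’s state.

    # Minimal sketch of defensive parsing of untrusted memory-card data.
    # The record layout is invented for illustration; the point is the explicit
    # length and range checks before any field is trusted.
    import struct

    RECORD = struct.Struct(">HHI")   # contest id, candidate id, vote count
    MAX_CONTESTS, MAX_CANDIDATES, MAX_VOTES = 200, 50, 100_000

    def parse_records(blob: bytes):
        if len(blob) % RECORD.size != 0:
            raise ValueError("truncated or padded record block")
        records = []
        for offset in range(0, len(blob), RECORD.size):
            contest, candidate, count = RECORD.unpack_from(blob, offset)
            if contest >= MAX_CONTESTS or candidate >= MAX_CANDIDATES:
                raise ValueError(f"out-of-range id at offset {offset}")
            if count > MAX_VOTES:
                raise ValueError(f"implausible vote count at offset {offset}")
            records.append((contest, candidate, count))
        return records

    print(parse_records(RECORD.pack(3, 7, 1234)))   # [(3, 7, 1234)]
    # parse_records(b"\xff" * 8) raises instead of being trusted blindly.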

Finally, a few summary bullets jumped out at me:

  • The system design does not consistently use privilege separation, leading to large amounts of code being potentially security-critical due to having privileges to modify data.
  • Unhelpful or misleading comments in the code.
  • Subjectively, large amount of source code compared to the functionality implemented.

Okay, let’s get this straight. The code is bloated, the comments are garbage, and the system is broadly not engineered to restrict privileges. Put that all together, and you’re guaranteed a buggy, error-prone, vulnerability-riddled program that must be incredibly painful to maintain and extend. This is the kind of issue that leads smart companies to start over from scratch (while simultaneously supporting the old version until the new version gets up to speed). Is ES&S or any other voting system vendor doing a from-scratch implementation in order to get it right? They’ll never get there any other way.

[Sidebar: I live in Texas. Texas’s Secretary of State, like California’s, is responsible for certifying voting equipment for use in the state. If you visit their web page and scroll to the bottom, you’ll see links for each of the vendors. There are three vendors who are presently certified to sell election equipment here: Hart InterCivic, ES&S and Premier (née Diebold). Nothing yet published on the Texas site post-dates the California or Ohio studies, but Texas’s examiners recently considered a new submission from Hart InterCivic. It will be very interesting to see whether they take any of the staggering security flaws in Hart’s system into consideration. If they do, it would be a big chance for Texas to catch up to the rest of the country. Incidentally, I have offered my services to be on Texas’s board of election examiners on several occasions. Thus far, they haven’t responded. The offer remains open.]

Pesky details of getting a voting system correct

Today was the last day of early voting in Texas’s primary election. Historically, I have never voted in a primary election. I’ve never felt I identified enough with a particular political party to want to have a say in selecting its candidates. Once I started working on voting security, I discovered that this also allowed me to make a legitimate claim to being “non-partisan.” (While some election officials, political scientists, and others whom you might prefer to be non-partisan do have explicit partisan views, many more make a point of similarly obscuring their partisan preferences, as I do.)

In Texas, you are not required to register with a party in order to vote in its primary. Instead, you just show up and ask for that party’s primary ballot. In the big city of Houston, any registered voter can go to any of 35 early voting locations over the two weeks of early voting. Alternatively, they may vote in their home precinct (there are almost a thousand of these) on the day of the election. There have been stories of long lines over the past two weeks. My wife wanted to vote, but we procrastinated and went on the final night to a gigantic supermarket near campus. Arriving at 5:50pm or so, she didn’t reach the head of the queue until 8pm. Meanwhile, I took care of our daughter and tried to figure out the causes of the queue.

There were maybe twenty electronic voting machines, consistently operating at 50-70% utilization (i.e., as many as half of the voting machines were unused at any given time). Yet the queue was huge. How could this be? It turns out there were four people at the desk in front dealing with the sign-in procedure. In a traditional, local precinct, this is nothing fancier than flipping open a paper printout to the page with your name. You sign next to it, and then you go vote. Simple as can be. Early voting is a different can of worms. They can’t feasibly keep a printout with over a million names in it at each of 35 early voting centers. That means they need computers. Our county’s computers had some kind of web interface that poll workers could use to verify the voter’s registration. They then print a sticker with your name on it, you sign it, and it goes into a book. If a voter happens to present their voter registration card (my wife happened to have hers with her), the process is over in a hurry. Otherwise, things slow down, particularly if, say, your driver’s license doesn’t match up with the computer. “What was your previous address?” Unsurprisingly, the voter registration / sign-in table was the bottleneck. I’ve seen similar effects before when voting early.

How could you solve this problem? You could have an explicit “fast path” for voters who match quickly versus a “slow path” with a secondary queue for more complicated cases. You could have more registration terminals. You could have roving helpers with PDAs and battery-powered printers who work their way back into the queue and help voters reconcile themselves with their “true” identity. There’s no lack of creativity that’s been applied to solving this class of problems outside the domain of election management.
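
The arithmetic behind the bottleneck is simple enough to sketch. The numbers below are invented for illustration (they are not measurements from that supermarket), but they show why the stage with the lowest capacity sets the pace no matter how many voting machines sit idle.

    # Toy capacity arithmetic for the two stages of early voting. All numbers
    # are invented for illustration.
    def capacity(servers, minutes_per_voter):
        return servers / minutes_per_voter          # voters served per minute

    arrival_rate = 2.0                              # voters arriving per minute
    checkin_cap  = capacity(servers=4,  minutes_per_voter=3)   # ~1.33 voters/min
    machine_cap  = capacity(servers=20, minutes_per_voter=5)   # 4.0 voters/min

    print(f"check-in utilization: {arrival_rate / checkin_cap:.0%}")  # >100%: queue grows
    print(f"machine utilization:  {min(arrival_rate, checkin_cap) / machine_cap:.0%}")
    # With these numbers the sign-in table is saturated (the queue grows without
    # bound) while the voting machines run well below capacity. Adding two more
    # sign-in clerks raises check-in capacity to 2.0 voters/min, which keeps pace
    # with arrivals; adding more voting machines changes nothing.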

Now, these voter registration systems are not subject to any of the verification and testing procedures that apply to the electronic voting machines themselves. Any vendor can sell pretty much anything and the state government doesn’t have much to say about it. That’s both good and bad. It’s clearly bad because any vetting process might have tried to consider these queueing issues and would have issued requirements on how to address the problem. On the flip side, one of the benefits of the lack of regulation is that the vendor(s) could ostensibly fix their software. Quickly.

To the extent there’s a moral to this story, it’s that the whole system matters. For the most part, we computer security folks have largely ignored voter registration as being somebody else’s problem. Maybe there’s a market for some crack programmer to crank out a superior solution in the time it took to read this blog post and get us out of the queue and into the voting booth.

(Sidebar: Turns out, the Texas Democratic Party has both a primary election and a caucus. Any voter who casts a vote in the primary is eligible to caucus with the party. The caucus locations are the same as the local polling places, with caucusing starting 15 minutes after the close of the polls. Expect stories about crowding, confusion, and chaos, particularly given the crowded, small precinct rooms and relatively few people with experience in the caucusing process. Wikipedia has some details about the complex process by which the state’s delegates are ultimately selected. There may or may not be lawsuits over the process as well.)

The continuing saga of Sarasota's lost votes

At a hearing today before a subcommittee of Congress’s Committee on House Administration, the U.S. Government Accountability Office (GAO) reported on the results of their technical investigation into the exceptional undervote rate in the November 2006 election for Florida’s 13th Congressional District.

David Dill and I wrote a long paper about shortcomings in previous investigations, so I’m not going to present a detailed review of the history of this case. [Disclosure: Dill and I were both expert witnesses on behalf of Jennings and the other plaintiffs in the Jennings v. Buchanan case. Writing this blog post, I’m only speaking on my own. I do not speak on behalf of Christine Jennings or anybody else involved with the campaign.]

Heavily abridged history: Roughly one in seven ballots cast on Sarasota’s ES&S iVotronic systems recorded no vote in the Congressional race. The margin of victory was radically smaller than this. If you attempt a statistical projection from the votes that were cast onto the blank ballots, you inevitably end up with a different candidate seated in Congress.
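
A back-of-the-envelope version of that projection, with rounded numbers: Sarasota recorded roughly 18,000 blanks in this race, while the certified district-wide margin was a few hundred votes (369). The 53/47 split below is an illustrative assumption, roughly in line with the split among Sarasota ballots that did record a choice; any split even slightly above 51% for the trailing candidate is enough to overcome the certified margin.

    # Back-of-the-envelope projection with rounded, partly assumed numbers.
    undervotes       = 18_000    # approximate blank ballots in the CD-13 race
    certified_margin = 369       # certified district-wide margin of victory
    split            = 0.53      # assumed share of the blanks for the trailing candidate

    projected_net = undervotes * split - undervotes * (1 - split)
    print(f"projected net gain from the blanks: {projected_net:+.0f} votes")  # about +1,080
    print("outcome flips" if projected_net > certified_margin else "outcome holds")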

While I’m not a lawyer, my understanding of Florida election law is that the summary screen, displayed before the voter casts a vote, is what really matters. If the summary screen showed no vote in the race and the voter missed it before casting the ballot, then that’s tough luck for them. If, however, the proper thing was displayed on the summary screen and things went wrong afterward, then there would be a legal basis under Florida law to reverse the election.

Florida’s court system never got far enough to make this call. The judge refused to even allow the plaintiffs access to the machines in order to conduct their own investigation. Consequently, Jennings took her case directly to Congress, which has the power to seat its own members. The last time this particular mechanism was used to overturn an election was in 1985. It’s unclear exactly what standard Congress must use when making a decision like this. Should they use Florida’s standard? Should they impose their own standard? Good question.

Okay, then. On to the GAO’s report. GAO did three tests:

  1. They sampled the machines to make sure the firmware that was inside the machines was the firmware that was supposed to be there. They also “witnessed” the source code being compiled and yielding the same thing as the firmware being used. Nothing surprising was found.
  2. They cast a number of test ballots. Everything worked.
  3. They deliberately miscalibrated some iVotronic systems in a variety of different ways and cast some more test votes. They found the machines were “difficult to use”, but that the summary screens were accurate with respect to the voter’s selections.

What they didn’t do:

  • They didn’t conduct any controlled human subject tests to cast simulated votes. Such a test, while difficult and expensive to perform, would allow us to quantify the extent to which voters are confused by different aspects of the voting system’s user interface.
  • They didn’t examine any of the warehoused machines for evidence of miscalibration. They speculate that grossly miscalibrated machines would have been detected in the field and would have been either recalibrated or taken out of service. They suggest that two such machines were, in fact, taken out of service.
  • They didn’t go through any of ES&S’s internal change logs or trouble tickets. If ES&S knows more, internally, about what may have caused this problem, they’re not saying and GAO was unable to learn more.
  • For the tests that they did conduct, GAO didn’t describe enough about the test setup and execution for us to make a reasonable critique of whether their test setup was done properly.

GAO’s conclusions are actually rather mild. All they’re saying is that they have some confidence that the machines in the field were running the correct software, and that the software doesn’t seem to induce failures. GAO has no opinion on whether poor human factors played a role, nor do they offer any opinion on what the legal implications of poor human factors would be in terms of who should have won the race. Absent any sort of “smoking gun” (and, yes, 18,000 undervotes apparently didn’t make quite enough smoke on their own), it would seem unlikely that the Committee on House Administration would vote to overturn the election.

Meanwhile, you can expect ES&S and others to use the GAO report as some sort of vindication of the iVotronic, in specific, or of paperless DRE voting systems, in general. Don’t buy it. Even if Sarasota’s extreme undervote rate wasn’t itself sufficient to throw out this specific election result, it still represents compelling evidence that the voting system, as a whole, substantially failed to capture the intent of Sarasota’s voters. Finally, the extreme effort invested by Sarasota County, the State of Florida, and the GAO demonstrates the fundamental problem with the current generation of paperless DRE voting systems: when problems occur, it’s exceptionally difficult to diagnose them. There simply isn’t enough information left behind to determine what really happened during the election.

Other articles on today’s news: CNet News, Bradenton Herald, Sarasota Herald-Tribune, NetworkWorld, Miami Herald (AP wire story), VoteTrustUSA

UPDATE (2/12): Ted Selker (MIT Media Lab) has a press release online that describes human factors experiments with a Flash-based mock-up of the Sarasota CD-13 ballot. They appear to have found undervote rates of comparable magnitude to those observed in Sarasota. A press release is very different from a proper technical report, much less a conference or journal publication, so it’s inappropriate to look to this press release as “proof” of any sort of “ballot blindness” effect.