April 19, 2024

The continuing saga of Sarasota's lost votes

At a hearing today before a subcommittee of Congress’s Committee on House Administration, the U.S. Government Accountability Office (GAO) reported on the results of their technical investigation into the exceptional undervote rate in the November 2006 election for Florida’s 13th Congressional District.

David Dill and I wrote a long paper about shortcomings in previous investigations, so I'm not going to present a detailed review of the history of this case. [Disclosure: Dill and I were both expert witnesses on behalf of Jennings and the other plaintiffs in the Jennings v. Buchanan case. In writing this blog post, I'm speaking only for myself. I do not speak on behalf of Christine Jennings or anybody else involved with the campaign.]

Heavily abridged history: Roughly one in seven votes recorded on Sarasota's ES&S iVotronic systems in the Congressional race was blank. The margin of victory was radically smaller than that. If you do a statistical projection from the votes that were cast onto the blank votes, you inevitably end up with a different candidate seated in Congress.
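To see the scale of that, here's a minimal back-of-the-envelope sketch in Python. The roughly 18,000 blank ballots and the 369-vote certified margin are the widely reported figures; the 53% share assumed for Jennings among the blank ballots is purely illustrative and is not the statistical model the experts actually used.

```python
# Back-of-the-envelope projection, not the experts' statistical model.
undervotes = 18_000        # approximate blank CD-13 ballots in Sarasota
certified_margin = 369     # Buchanan's certified district-wide margin
jennings_share = 0.53      # hypothetical split of the blank ballots (assumption)

# Net swing if the blank ballots had split 53/47 toward Jennings.
net_gain = undervotes * (2 * jennings_share - 1)
print(f"Projected net gain for Jennings: {net_gain:,.0f} votes")
print(f"Enough to overcome the certified margin: {net_gain > certified_margin}")
```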

While I’m not a lawyer, my understanding of Florida election law is that the summary screen, displayed before the voter casts a vote, is what really matters. If the summary screen showed no vote in the race and the voter missed it before casting the ballot, then that’s tough luck for them. If, however, the proper thing was displayed on the summary screen and things went wrong afterward, then there would be a legal basis under Florida law to reverse the election.

Florida’s court system never got far enough to make this call. The judge refused to even allow the plaintiffs access to the machines in order to conduct their own investigation. Consequently, Jennings took her case directly to Congress, which has the power to seat its own members. The last time this particular mechanism was used to overturn an election was in 1985. It’s unclear exactly what standard Congress must use when making a decision like this. Should they use Florida’s standard? Should they impose their own standard? Good question.

Okay, then. On to the GAO’s report. GAO did three tests:

  1. They sampled the machines to make sure the firmware that was inside the machines was the firmware that was supposed to be there. They also “witnessed” the source code being compiled and yielding the same thing as the firmware being used. Nothing surprising was found.
  2. They cast a number of test ballots. Everything worked.
  3. They deliberately miscalibrated some iVotronic systems in a variety of different ways and cast some more test votes. They found the machines were “difficult to use”, but that the summary screens were accurate with respect to the voter’s selections.

What they didn’t do:

  • They didn’t conduct any controlled human-subject tests in which participants cast simulated votes. Such a test, while difficult and expensive to perform, would allow us to quantify the extent to which voters are confused by different aspects of the voting system’s user interface.
  • They didn’t examine any of the warehoused machines for evidence of miscalibration. They speculate that grossly miscalibrated machines would have been detected in the field and would have been either recalibrated or taken out of service. They suggest that two such machines were, in fact, taken out of service.
  • They didn’t go through any of ES&S’s internal change logs or trouble tickets. If ES&S knows more, internally, about what may have caused this problem, they’re not saying and GAO was unable to learn more.
  • For the tests that they did conduct, GAO didn’t describe enough about the test setup and execution for us to make a reasonable critique of whether their test setup was done properly.

GAO’s conclusions are actually rather mild. All they’re saying is that they have some confidence that the machines in the field were running the correct software, and that the software doesn’t seem to induce failures. GAO has no opinion on whether poor human factors played a role, nor do they offer any opinion on what the legal implications of poor human factors would be in terms of who should have won the race. Absent any sort of “smoking gun” (and, yes, 18,000 undervotes apparently didn’t make quite enough smoke on their own), it would seem unlikely that the Committee on House Administration would vote to overturn the election.

Meanwhile, you can expect ES&S and others to use the GAO report as some sort of vindication of the iVotronic, in specific, or of paperless DRE voting systems, in general. Don’t buy it. Even if Sarasota’s extreme undervote rate wasn’t itself sufficient to throw out this specific election result, it still represents compelling evidence that the voting system, as a whole, substantially failed to capture the intent of Sarasota’s voters. Finally, the extreme effort invested by Sarasota County, the State of Florida, and the GAO demonstrates the fundamental problem with the current generation of paperless DRE voting systems: when problems occur, it’s exceptionally difficult to diagnose them. There simply isn’t enough information left behind to determine what really happened during the election.

Other articles on today’s news: CNet News, Bradenton Herald, Sarasota Herald-Tribune, NetworkWorld, Miami Herald (AP wire story), VoteTrustUSA

UPDATE (2/12): Ted Selker (MIT Media Lab) has a press release online that describes human factors experiments with a Flash-based mock-up of the Sarasota CD-13 ballot. They appear to have found undervote rates of comparable magnitude to those observed in Sarasota. A press release is very different from a proper technical report, much less a conference or journal publication, so it’s inappropriate to look to this press release as “proof” of any sort of “ballot blindness” effect.

Comments

  1. OpEdNews may not be the best site to cite. I just saw them cited earlier today by a UFO nut: http://www.unknowncountry.com/journal/?id=312

    Calls the site’s credibility into question, I’d say.

  2. I agree with EVERYTHING James Strait says, particularly his “Tic Toc, Tic Toc,” article. 🙂

    Circling back to GAO and the continuing saga of Sarasota’s lost votes: GAO’s inconclusive findings by no means exonerate ES&S iVotronics. Sarasota is only one instance of iVotronic failure. Looking at the bigger picture, Florida lost roughly 89,000 votes statewide in the 2006 Attorney General race. That number comes from comparing the statewide undervote rate on the iVotronic (8.65%) with the rates of other voting systems used across Florida, including ES&S’s own optical scanner (3.04%). (A rough back-calculation of this figure is sketched just after this comment.)

    The details and other findings are included in the OpEdNews article “Sarasota 13: If the tests can’t find it, never mind it?” at
    http://www.opednews.com/articles/opedne_lani_mas_080229_sarasota_13_3a___if_th.htm
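    A rough reconstruction of that arithmetic in Python. The two undervote rates and the 89K figure are quoted from the comment above; the implied number of iVotronic ballots cast statewide is a back-calculation, not a figure from the GAO report or the OpEdNews article:

    ```python
    # Sanity check on the "89K lost votes" claim using only the quoted rates.
    ivotronic_rate = 0.0865    # statewide iVotronic undervote rate (quoted above)
    opscan_rate = 0.0304       # ES&S optical-scan undervote rate (quoted above)
    lost_votes = 89_000        # figure claimed in the comment

    excess_rate = ivotronic_rate - opscan_rate    # 5.61 percentage points
    implied_ballots = lost_votes / excess_rate    # ballots needed for the claim to hold
    print(f"Excess undervote rate: {excess_rate:.2%}")
    print(f"Implied iVotronic ballots statewide: {implied_ballots:,.0f}")  # ~1.59 million
    ```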

  3. What the investigation also failed to mention was the general level of disgust with both candidates. In the weeks leading up to the election, heavy use of automated phone calls – seven or more a night, at times – alienated a lot of Sarasota voters. I’ve talked to over a dozen people who decided not to cast a vote for either candidate. I wish there were a box labeled “None of the above – give us more clowns to choose from”.

  4. So much for secrecy of the ballot then, eh Bob? 😛

  5. Bob Schmidt says

    I’m with James Strait on this one. In fact, I would go as far as to say that what is needed is a real-time, verifiable audit trail, signed and approved by each voter, and I agree with James that the voter should receive a receipt.

    You get a receipt for a “nickel” candy bar at the 7-11; you should get one for your vote. You sign the register when you check in at the precinct; you need to sign off on your vote. A tracking number on your receipt should tie in to your vote, and all votes should then be displayed by tracking number on the board of elections’ web site, so anyone can see both the individual votes cast and the totals. Each voter can verify his or her vote, and anyone can see how the election results were arrived at. If the vote displayed differs from the vote cast, the receipt will provide proof of the discrepancy.

    It’s not about the developer or debugging or the UI or any of that. As to calibration, if the machine is going to be calibrated, it needs to be calibrated before each and every vote is cast, to the satisfaction of the voter. Not just once before every election — once before every vote.

  6. Sorry, but some of us, despite having worthwhile things to say, still do like privacy and don’t like spam. 😛

  7. The arguments against voting into a computer-style machine have reached the point of absurdity, but they also reveal that most have lost sight of the guiding star. If you believe in separation of powers, if you believe in checks and balances, the issue of voting into a computer becomes moot.

    Voting transparency must be the guiding star for any voting mechanism. Transparency is defined by the author as that process which permits the Average Literate English Speaking American (ALISA) voter to interact with the system from beginning to end. Thus, any ALISA can serve as a fair witness poll worker or auditor.

    Any system that does not meet the ALISA standard cannot serve the purpose of checks-and-balances-based American democracy. Computers will never pass the ALISA standard; thus, computers can never be a viable mechanism by which to vote.

    Computers are esoteric. Very, very few ALISAs have any idea about the logic contained in a computer’s hardware brain, and even fewer have any appreciation for the software that controls the brain. Thus, only the rarest of ALISAs could even begin to serve as a fair witness auditor, and thus computers can never serve as a viable mechanism by which to vote.

    Understanding that transparency is the guiding star, and understanding that ALISA is the transparency standard, makes the idea of voting via computer unthinkable.

    Computer voting is fatally flawed? Well, no kidding! The concept of voting via computer is flawed! Computers and ALISA transparency are forever contradictory terms.

    Don’t fear voting via computer because computers are flawed, or may be; fear voting via computer because at some point in the distant future, if left to continue their evolution, computers will work with a degree of reliability that produces no glaring abnormalities and thus raises no suspicions. It is at that point that American checks-and-balances democracy will have come to an end.

    Fear computer-based voting because at some time in the future it WILL WORK, and an ALISA will have zero capability to backtrack the system.

    Note: If you have something to say, it should be worth attaching your real name to.

  8. Well, I’m very glad to hear that Ted Selker was able to reproduce the problems reported by voters on election day in Sarasota.

    From a Sarasota Herald-Tribune report, “Dozens of voters complain about glitch” (8 Nov 2006):

    Throughout the day, dozens of people complained that their votes in the 13th Congressional District were not recorded properly. One volunteer election watcher said he heard dozens of such complaints.

    It’s awfully good to hear that a “human factors” experiment was able to reproduce those problems.

  9. I wrote about today’s experience voting on a paperless Accuvote machine in Maryland here: http://tinyurl.com/yws4l5

  10. In general, if you have an n-way vote you have n+2 voting outcomes – blank and invalid votes are two different things. The distinction between blank and invalid is subtle, but at least in proportional voting it makes a difference. (I know that you don’t vote proportionally in the US.)

    If the sum of blank and invalid votes is larger than the difference between the winner and the runner-up, that is strong circumstantial evidence that that particular election may have been tampered with, and the result of that voting district should be rejected. (A minimal version of this check is sketched after this comment.)

    The only fair thing to do is to re-run the election in that district, possibly with a French-style round two between the two candidates at the top of the list.

    eskild
    Denmark
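    A minimal Python sketch of the check proposed in the comment above. The function name and the illustrative numbers are made up for the example; the rejection rule itself is the commenter’s suggestion, not established practice:

    ```python
    def result_in_doubt(winner_votes: int, runner_up_votes: int,
                        blank: int, invalid: int) -> bool:
        """True when blank + invalid ballots exceed the winning margin."""
        return (blank + invalid) > (winner_votes - runner_up_votes)

    # Illustrative Sarasota-style numbers: an 18,000-ballot undervote
    # dwarfs a margin of a few hundred votes.
    print(result_in_doubt(winner_votes=100_369, runner_up_votes=100_000,
                          blank=18_000, invalid=0))   # True
    ```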

  11. john sarasota says

    GAO Report issues:

    1. They did not seek out or review the PEBs used in the election, which among other things contained the ballot images and other soft switches for every voter. These were used at random in precincts, which could account for a machine working for one voter and the same machine not working for another. 72 PEBs were sent to ES&S in Omaha for service and were sealed for the election afterward, outside the chain of custody. No audit or GAO staff ever did a real forensic evaluation of them.
    2. The stated sample of tested machines included “non-sequestered” voting machines. Most may not know what that means, but in Sarasota’s case it means the machines were cleared, canceled, and re-loaded with new software for city elections after the November election, and only then tested by the GAO. Almost half of the machines in the GAO’s sample were in that state; they get an “F” in statistical analysis process and procedures.
    3. GAO never asked where the firmware used for comparison came from. Yes, it came from the State, but it was “extracted from machines” kept in a hall in the Division of Elections right next to the supplier’s (ES&S’s) room, not in some vault.
    4. GAO never looked at the controller chip and firmware for the touch screen; its compensating formulas for “load” during elections are key to keeping the screen in calibration and evenly registering human instructions.
    5. GAO represented that it ran an “independent study,” but in Sarasota it was assisted by the State, with SOE staff in the same room during the tests, and it went on to use the ES&S lab in Rockford instead of taking the code to somewhere like the NIST labs in Germantown, MD. So at every step of the way the interested parties, including the SOE, the State, and ES&S, were part of the test. GAO was notified of this conflict and poor procedure ahead of the report’s leaked publication date.

    It is not the results that are the problem so much as the GAO’s failure to follow best-practice testing procedures; and judging from their answers to questions about PEBs in the hearings, they were in over their heads.

    The GAO has suffered a real setback of “integrity” in investigation and reporting.

  12. Anonymizing the timestamp is probably the most difficult. Even when the time is erased, you will still want to keep some kind of sequence information, otherwise the log becomes useless.

    More generally, anonymity/privacy is not (necessarily) a yes/no thing. You can have varying degrees of anonymity by aggregating information. For example, you can log system information aggregated over every N voting events, which may be good enough to pinpoint discrepancies, but not attribute votes to individuals beyond the trivial (e.g. all N votes were the same, so you know how everybody voted, which is of course still a concern).
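    A minimal sketch of that aggregation idea, in Python. It assumes the machine emits one event record per ballot; the field names and bucket size are made up for illustration:

    ```python
    from collections import Counter
    from typing import Iterable, List

    def aggregate_events(events: Iterable[dict], bucket_size: int = 50) -> List[Counter]:
        """Collapse per-ballot event records into per-bucket counters, so only
        totals over each group of N ballots are retained, never individual ballots."""
        buckets, current = [], Counter()
        for i, event in enumerate(events, start=1):
            current[event["outcome"]] += 1      # e.g. "vote_recorded", "undervote"
            if i % bucket_size == 0:
                buckets.append(current)
                current = Counter()
        if current:
            buckets.append(current)
        return buckets

    # 120 simulated ballots logged as three aggregate buckets (50 / 50 / 20).
    sample = [{"outcome": "undervote" if i % 7 == 0 else "vote_recorded"}
              for i in range(120)]
    print(aggregate_events(sample))
    ```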

  13. This story highlights a fundamental problem of voting system design: unlike most cases of security engineering, in this case keeping detailed logs runs counter to the specifications. Having an execution trace of a voting machine after election day would help immensely in figuring out what works and what doesn’t — and also eliminate the privacy of the individual voters.

    I am therefore putting forth a challenge: come up with a way to generate useful logs from a voting machine which can be used after election day for debugging and improvements in the design without compromising the privacy of the voters.

    Example: say we record an execution trace for each voter, but record them at random disk locations and without timestamps. Can this information be used to reconstruct votes?
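    For concreteness, a toy Python version of that scheme: buffer each voter’s trace in memory with no timestamps or sequence numbers, then write the whole set out in shuffled order at close of polls. This is only a sketch of the thought experiment, not a vetted design; the contents of a trace could still be correlated with sign-in order or other records:

    ```python
    import random

    class ShuffledTraceLog:
        """Keeps per-voter execution traces with no ordering information and
        flushes them in random order, so sequence no longer links trace to voter."""

        def __init__(self):
            self._traces = []

        def record(self, trace):
            # Store only the event names; no timestamps, no sequence numbers.
            self._traces.append(list(trace))

        def flush(self, path):
            random.shuffle(self._traces)   # destroy the order in which ballots were cast
            with open(path, "w") as f:
                for trace in self._traces:
                    f.write(" -> ".join(trace) + "\n")

    log = ShuffledTraceLog()
    log.record(["ballot_loaded", "page_1", "summary_shown", "vote_cast"])
    log.record(["ballot_loaded", "page_1", "page_1_redisplayed", "summary_shown", "vote_cast"])
    log.flush("traces.txt")
    ```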