Archives for January 2009

CA SoS Bowen sends proposals to EAC

California Secretary of State Debra Bowen has sent a letter to Chair Gineen Beach of the US Election Assistance Commission (EAC) outlining three proposals that she thinks will markedly improve the integrity of voting systems in the country.

I’ve put a copy of Bowen’s letter here (87kB PDF).

Bowen’s three proposals are:

  • Vulnerability Reporting — The EAC should require that vendors disclose vulnerabilities, flaws, problems, etc. to the EAC as the system certification authority and to all the state election directors that use the affected equipment.
  • Uniform Incident Reporting — The EAC should create and adopt procedures that jurisdictions can follow to collect and report data about incidents they experience with their voting systems.
  • Voting System Performance Measurement — As part of the Election Day Survey, the EAC should systematically collect data from election officials about how voting systems perform during general elections.

In my opinion, each of these would be a welcome move for the EAC.

These proposals would put into place a number of essential missing elements in the administration of computerized election equipment. First, election officials, the users of these systems, can find it extremely frustrating and debilitating to suspect that a voting system flaw is responsible for problems they’re experiencing. Often, when errors arise, contingency planning requires detailed knowledge of the specific flaw. Without knowing as much as possible about the problem they’re facing, election officials can make it worse. At best, not knowing about a potential flaw can do what Bowen describes: doom the election official, and others with the same equipment, to encounter the flaw repeatedly in subsequent elections. Of course, vendors are the most likely to have useful information on a given flaw, and they should be required to report that information to both the EAC and election officials.

Often, the most information we have about voting system incidents comes from reports by local journalists. These reporters rarely cover high technology, and their reports are often incomplete and in many cases simply and obviously incorrect. Having a standardized set of elements that an election official can collect and report about voting system incidents will help ensure that the data comes directly from those experiencing a given problem. The EAC should design such procedures, and then a system for collecting and reporting these issues to other election officials and the public.
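To make the idea of “uniform incident reporting” a bit more concrete, here is a minimal sketch, in Python, of the kind of structured record a jurisdiction might file. The field names are my own assumptions for illustration; neither the EAC nor Bowen’s letter specifies a schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Illustrative only: these fields are assumptions about what a uniform
# incident report might capture, not an EAC-defined schema.
@dataclass
class VotingSystemIncident:
    jurisdiction: str                # e.g., county and state
    reported_at: datetime            # when the incident was observed
    equipment: str                   # voting system make, model, firmware version
    election_phase: str              # "early voting", "election day", "recount", ...
    description: str                 # what the election official observed
    ballots_affected: int = 0        # best estimate; 0 if unknown
    vendor_notified: bool = False    # whether the vendor has been alerted
    follow_up: List[str] = field(default_factory=list)  # remediation steps taken
```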

Finally, many of us were disappointed to learn that the 2008 Election Day Survey would not include questions about voting system performance. Election Day is a unique and hard-to-replicate event during which very little systematic data is collected about voting machine performance. The OurVoteLive and MyVote1 efforts go a long way toward providing actionable, qualitative data that can help increase enfranchisement. However, self-reported data from the operators of the machinery of our democracy would be a gold mine for identifying and examining trends in how that machinery performs, both good and bad.

I know a number of people, including Susannah Goodman at Common Cause as well as John Gideon and Ellen Theisen of VotersUnite!, who have been championing one or another of these proposals in their advocacy. The fact that Debra Bowen has penned this letter is a testament to the soundness of their efforts.

DRM In Retreat

Last week’s agreement between Apple and the major record companies to eliminate DRM (copy protection) in iTunes songs marks the effective end of DRM for recorded music. The major online music stores are now all DRM-free, and CDs still lack DRM, so consumers who acquire music will now expect it without DRM. That’s a sensible result, given the incompatibility and other problems caused by DRM, and it’s a good sign that the record companies are ready to retreat from DRM and get on with the job of reinventing themselves for the digital world.

In the movie world, DRM for stored content may also be in trouble. On DVDs, the CSS DRM scheme has long been a dead letter, technologically speaking. The Blu-ray scheme is better, but if Blu-ray doesn’t catch on, this doesn’t matter.

Interestingly, DRM is not retreating as quickly in systems that stream content on demand. This makes sense because the drawbacks of DRM are less salient in a streaming context: there is no need to maintain compatibility with old content; users can be assumed to be online so software can be updated whenever necessary; and users worry less about preserving access when they know they can stream the content again later. I’m not saying that DRM causes no problems with streaming, but I do think the problems are less serious than in a stored-content setting.

In some cases, streaming uses good old-fashioned incompatibility in place of DRM. For example, a stream might use a proprietary format, and the most convenient software for watching streams might lack a “save this video” button.

It remains to be seen how far DRM will retreat. Will it wither away entirely, or will it hang on in some applications?

Meanwhile, it’s interesting to see traditional DRM supporters back away from it. RIAA chief Mitch Bainwol now says that the RIAA is agnostic on DRM. And DRM cheerleader Bill Rosenblatt has relaunched his “DRM Watch” blog under the new title “Copyright and Technology”. The new blog’s first entry: iTunes going DRM-free.

Optical-scan voting extremely accurate in Minnesota

The recount of the 2008 Minnesota Senate race gives us an opportunity to evaluate the accuracy of precinct-count optical-scan voting. Though there have been contentious disputes over which absentee ballot envelopes to open, the core technology for scanning ballots has proved to be extremely accurate.

The votes were counted by machine (except for part of one county that counts votes by hand), then every single ballot was examined by hand in the recount.

  • The “net” accuracy of optical-scan voting was 99.99% (see below).
  • The “gross” accuracy was 99.91% (see below).
  • The rate of ambiguous ballots was low: 99.99% of ballots were unambiguous (see below).

My analysis is based on the official spreadsheet from the Minnesota Secretary of State. I commend the Secretary of State for his commitment to transparency in making the data available in such an easy-to-analyze format. The vast majority of counties use the ES&S M100 precinct-count optical scanner; a few use other in-precinct scanners.

I exclude from this analysis all disputes over which absentee ballots to open. Approximately 10% of the ballots included in this analysis are optically scanned absentee ballots that were not subject to dispute over eligibility.

There were 2,423,851 votes counted for Coleman and Franken. The “net” error rate is the net change in the vote margin from the machine scan to the hand recount (not including changes related to qualification of absentee ballot envelopes). This was 264 votes, for an accuracy of 99.99% (an error of about one part in ten thousand).
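As a quick sanity check (not part of the official analysis), the arithmetic behind that figure can be reproduced from the two numbers quoted above:

```python
# Back-of-the-envelope check of the "net" accuracy figure, using only the
# totals quoted above (not the official spreadsheet itself).
total_votes = 2_423_851   # votes counted for Coleman and Franken
net_change = 264          # net change in the margin, machine scan vs. hand recount

net_error_rate = net_change / total_votes
print(f"net error rate: {net_error_rate:.6f}")      # ~0.000109, about 1 in 10,000
print(f"net accuracy:   {1 - net_error_rate:.4%}")  # ~99.99%
```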

The “gross” error rate is the total number of individual ballots either added to one candidate or subtracted from one candidate by the recount. A ballot that was changed from one candidate to the other counts twice, but such ballots are rare. In the precinct-by-precinct data, the vast majority of precincts have no change; many precincts have exactly one vote added to one candidate; a few precincts have votes subtracted, or more than one vote added, or both.
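For readers who want to reproduce this kind of precinct-level tally themselves, here is a rough sketch of one way to do it from a CSV export of the spreadsheet. The column names are hypothetical; the official file uses its own layout, so they would need to be adjusted to the real headers.

```python
import csv
from collections import Counter

# Hypothetical column names; the Secretary of State's spreadsheet uses its
# own layout, so adjust these to the real headers before running.
MACHINE = {"Coleman": "machine_coleman", "Franken": "machine_franken"}
RECOUNT = {"Coleman": "recount_coleman", "Franken": "recount_franken"}

def tally_changes(path):
    """Per precinct, count votes the hand recount added to or subtracted from
    each candidate relative to the machine scan, and record how common each
    (added, subtracted) pattern is across precincts."""
    added = subtracted = 0
    patterns = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            p_add = p_sub = 0
            for cand in ("Coleman", "Franken"):
                diff = int(row[RECOUNT[cand]]) - int(row[MACHINE[cand]])
                if diff >= 0:
                    p_add += diff
                else:
                    p_sub -= diff
            added += p_add
            subtracted += p_sub
            patterns[(p_add, p_sub)] += 1
    return added, subtracted, patterns
```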

The recount added a total of 1,528 votes to the candidates and subtracted a total of 642 votes, for a gross change of 2,170 (again, not including absentee ballot qualification). Thus, the “gross” error rate is about 1 in 1,000, or a gross accuracy of 99.91%.
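The same sanity-check arithmetic applies to the gross figures quoted above:

```python
# Rough check of the "gross" accuracy figure from the totals quoted above.
total_votes = 2_423_851
votes_added = 1_528
votes_subtracted = 642

gross_change = votes_added + votes_subtracted           # 2,170 ballot changes
gross_error_rate = gross_change / total_votes
print(f"gross error rate: {gross_error_rate:.6f}")      # ~0.0009, roughly 1 in 1,000
print(f"gross accuracy:   {1 - gross_error_rate:.4%}")  # ~99.91%
```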

Ambiguous ballots: During the recount, the Coleman and Franken campaigns initially challenged a total of 6,655 ballot-interpretation decisions made by the human recounters. The State Canvassing Board asked the campaigns to voluntarily withdraw all but their most serious challenges, and in the end approximately 1,325 challenges remained. That is, approximately 5 ballots in 10,000 were ambiguous enough that one side or the other felt like arguing about it. The State Canvassing Board, in the end, classified all but 248 of these ballots as votes for one candidate or another. That is, approximately 1 ballot in 10,000 was ambiguous enough that the bipartisan recount board could not determine an intent to vote. (This analysis is based on the assumption that if the voter made an ambiguous mark, then this ballot was likely to be challenged either by one campaign or the other.)
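Again, the per-ten-thousand rates above follow directly from the challenge counts; this is just a quick check, not new data:

```python
# Quick check of the ambiguous-ballot rates, using the counts quoted above.
total_votes = 2_423_851
challenges_remaining = 1_325   # challenges left after voluntary withdrawals
unresolved = 248               # ballots the Canvassing Board could not assign

print(f"challenged:   ~{challenges_remaining / total_votes * 10_000:.1f} per 10,000")  # ~5.5
print(f"unresolvable: ~{unresolved / total_votes * 10_000:.1f} per 10,000")            # ~1.0
```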

Caveat: As with all voting systems, including optical-scan, DREs, and plain old paper ballots, there is also a source of error from voters incorrectly translating their intent into the marked ballot. Such error is likely to be greater than 0.1%, but the analysis I have done here does not measure this error.

Hand counting: Saint Louis County, which uses a mix of optical-scan and hand-counting, had a higher error rate: net accuracy 99.95%, gross accuracy 99.81%.