Archives for October 2018

The Third Workshop on Technology and Consumer Protection

Arvind Narayanan and I are pleased to announce that the Workshop on Technology and Consumer Protection (ConPro ’19) will return for a third year! The workshop will once again be co-located with the IEEE Symposium on Security and Privacy, occurring in May 2019.

ConPro is a forum for a diverse range of computer science research with consumer protection implications. Last year, papers covered topics ranging from online dating fraud to the readability of security guidance. Panelists and invited speakers explored topics from preventing caller-ID spoofing to protecting unique communities.

We see ConPro as a workshop in the classic sense, providing substantive feedback and new ideas. Presentations have sparked suggestions for follow-up work and collaboration opportunities. Attendees represent a wide range of research areas, spurring creative ideas and interesting conversation. For example, comments about crowdworker concerns this year led to discussion of best practices for research making use of those workers.

Although our community has grown, we aim to keep discussion and feedback a central part of the workshop. Our friends in the legal community have had some success with larger events focused on feedback and discussion, such as PLSC. We plan to take lessons from those cases.

The success of ConPro in past years—amazing research, attendees, discussion, and PCs—makes us excited for next year. The call for papers lists some relevant topics, but if you do computer science research with consumer protection implications, it’s relevant (but be sure those implications are clear). The submission deadline is January 23, 2019. We hope you’ll submit a paper and join us in San Francisco!

Ten ways to make voting machines cheat with plausible deniability

Summary:  Voting machines can be hacked; risk-limiting audits of paper ballots can detect incorrect outcomes, whether from hacked voting machines or programming inaccuracies; recounts of paper ballots can correct those outcomes; but some methods for producing paper ballots are more auditable and recountable than others.

A now-standard principle of computer-counted public elections is to use a voter-verified paper ballot, so that in case the voting machine cheats in counting the votes, the human doing an audit or recount can see the paper that the voter marked.  Why would the voting machine cheat?  Well, voting machines are computers, and any computer may have security vulnerabilities that permit an attacker to modify or replace its software.  We must presume that any voting machine might, at any time, be under the complete control of an attacker, an election thief.

There are several ways that voter-verified paper ballots can be implemented:

  1. Voter marks an optical-scan ballot with a pen, deposits into optical-scan voting machine for counting (and for saving in sealed ballot box).
  2. Voter uses a ballot-marking device (BMD), a computer with touchscreen/audio/sip-and-puff interfaces, which prints an optical-scan ballot, deposits into optical-scan voting machine for counting (and saving).
  3. Voter uses a DRE+VVPAT voting machine, that is, a Direct-Recording Electronic  “touchscreen” machine with a Voter-Verified Paper Audit Trail, which saves the VVPAT printouts in a ballot box.
  4. Voter uses an “all-in-one” voting machine: inserts blank paper into slot, voter uses touchscreen interface to mark ballot, machine ejects ballot from slot, voter inspects printed ballot, voter reinserts printed ballot into same slot, where it is scanned (or is it?) and deposited into ballot box.

There’s also 1a (hand-marked optical-scan ballots, dropped into a precinct ballot box to be centrally counted instead of counted immediately by a precinct-located scanner), 1b (hand-marked optical-scan ballots, sent by mail) and 2a (BMD-marked optical-scan ballots, centrally counted).

In this article I will put on my “adversarial thinking” hat, and try to design ways that the attacker might try to cheat (and get away with it).  You might think that the voter-verified paper ballot will detect cheating, and therefore deter cheating or correct the result, but maybe that depends on which kind of technology is used!

User Perceptions of Smart Home Internet of Things (IoT) Privacy

by Noah Apthorpe

This post summarizes a research paper, authored by Serena Zheng, Noah Apthorpe, Marshini Chetty, and Nick Feamster from Princeton University, which is available here. The paper will be presented at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) on November 6, 2018.

Smart home Internet of Things (IoT) devices have a growing presence in consumer households. Learning thermostats, energy tracking switches, video doorbells, smart baby monitors, and app- and voice-controlled lights, speakers, and other devices are all increasingly available and affordable. Many of these smart home devices continuously monitor user activity, raising privacy concerns that may pose a barrier to adoption.

In this study, we conducted 11 interviews of early adopters of smart home technology in the United States, investigating their reasons for purchasing smart-home IoT devices, perceptions of smart home privacy risks, and actions taken to protect their privacy from entities external to the home who create, manage, track, or regulate IoT devices and their data.

We recruited participants by posting flyers in the local area, emailing listservs, and asking through word of mouth. Our recruiting resulted in six female and five male interviewees, ranging in age from 23 to 45. Most participants were from the Seattle metropolitan area, but others were from New Jersey, Colorado, and Texas. The participants came from a variety of living arrangements, including families, couples, and roommates. All participants were fairly affluent, technically skilled, and highly interested in new technology, fitting the profile of “early adopters.” Each interview began with a tour of the participant’s smart home, followed by a semi-structured conversation with specific questions from an interview guide and open-ended follow-up discussions on topics of interest to each participant.

The participants owned a wide variety of smart home devices and shared a broad range of experiences about how these devices have impacted their lives. They also expressed a range of privacy concerns, and described purchasing and device-interaction decisions made deliberately on the basis of those concerns. We performed open coding on transcripts of the interviews and identified four common themes:

  1. Convenience and connectedness are priorities for smart home device users. These values dictate privacy opinions and behaviors. Most participants cited the ability to stay connected to their homes, families, or pets as primary reasons for purchasing and using smart home devices. Values of convenience and connectedness outweighed other concerns, including obsolescence, security, and privacy. For example, one participant commented, “I would be willing to give up a bit of privacy to create a seamless experience, because it makes life easier.”
  2. User opinions about who should have access to their smart home data depend on perceived benefit from entities external to the home, such as device manufacturers, advertisers, Internet service providers, and the government. For example, participants felt more comfortable sharing their smart home data with advertisers if they believed that they would receive improved targeted advertising experiences.
  3. User assumptions about privacy protections are contingent on their trust of IoT device manufacturers. Participants tended to trust large technology companies, such as Google and Amazon, to have the technical means to protect their data, although they could not confirm if these companies actually performed encryption or anonymization. Participants also trusted home appliance and electronics brands, such as Philips and Belkin, although these companies have limited experience making Internet-connected appliances. Participants generally rationalized their reluctance to take extra steps to protect their privacy by referring to their trust in IoT device manufacturers to not do anything malicious with their data.
  4. Users are less concerned about privacy risks from devices that do not record audio or video. However, researchers have demonstrated that metadata from non-A/V smart home devices, such as lightbulbs and thermostats, can provide enough information to infer user activities, such as home occupancy, work routines, and sleeping patterns. Additional outreach is needed to inform consumers about non-A/V privacy risks.
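The fourth theme is worth making concrete. A minimal sketch (with entirely hypothetical event data) of how metadata from non-A/V devices can reveal activity patterns: binning timestamped lightbulb and thermostat events by hour is already enough to suggest when a home is occupied and when its residents are at work.

```python
# Illustrative sketch with made-up data: no camera or microphone involved,
# yet timestamped events from a smart lightbulb and thermostat, binned by
# hour, hint at occupancy and work routines.
from collections import Counter
from datetime import datetime

# Hypothetical event log: (ISO timestamp, device, event)
events = [
    ("2018-10-01T07:05", "lightbulb", "on"),
    ("2018-10-01T07:45", "thermostat", "setpoint_up"),
    ("2018-10-01T18:30", "lightbulb", "on"),
    ("2018-10-01T23:10", "lightbulb", "off"),
    ("2018-10-02T07:10", "lightbulb", "on"),
    ("2018-10-02T18:40", "thermostat", "setpoint_up"),
]

activity_by_hour = Counter(
    datetime.fromisoformat(ts).hour for ts, _dev, _ev in events
)

# Hours with any device activity suggest someone is home; the quiet stretch
# between morning and evening suggests a work routine.
likely_home = sorted(h for h in range(24) if activity_by_hour[h] > 0)
print(likely_home)  # -> [7, 18, 23]
```

Real inference attacks are more sophisticated, but even this toy version shows why “it has no camera” is not the same as “it has no privacy risk.”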

Recommendations. These themes motivate recommendations for smart home device designers, researchers, regulators, and industry standards bodies. Participants’ desires for convenience and trust in IoT device manufacturers limit their willingness to take action to verify or enforce smart home data privacy. This means that privacy notifications and settings must be exceptionally clear and convenient, especially for smart home devices without screens. Improved cybersecurity and privacy regulation, combined with industry standards outlining best privacy practices, would also reduce the burden on users to manage their own privacy. We encourage follow-up studies examining the effects of smart home devices on privacy between individuals within a household and comparing perceptions of smart home privacy in different countries.

For more details about our interview findings and corresponding recommendations, please read our paper or see our presentation at CSCW 2018.

Full citation: Serena Zheng, Noah Apthorpe, Marshini Chetty, and Nick Feamster. 2018. User Perceptions of Smart Home IoT Privacy. In Proceedings of the ACM on Human-Computer Interaction, Vol. 2, CSCW, Article 200 (November 2018), 20 pages. https://doi.org/10.1145/3274469

An unverifiability principle for voting machines

In my last three articles I described the ES&S ExpressVote, the Dominion ImageCast Evolution, and the Dominion ImageCast X (in its DRE+VVPAT configuration).  There’s something they all have in common: they all violate a certain principle of voter verifiability.

  • Any voting machine whose physical hardware can print votes onto the ballot after the last time the voter sees the paper is not a voter verified paper ballot system, and is not acceptable.
  • The best way to implement this principle is to physically separate the ballot-marking device from the scanning-and-tabulating device.  The voter marks a paper ballot with a pen or BMD, then after inspecting the paper ballot, the voter inserts the ballot into an optical-scan vote counter that is not physically capable of printing votes onto the ballot.

The ExpressVote, IC-Evolution, and ICX all violate the principle in slightly different ways: the IC-Evolution accepts hand-marked paper ballots into a machine that can then make more marks on them; the ExpressVote in one configuration is a ballot-marking device, but after you verify that it marked your ballot, you insert it back into the same slot, which can print more votes onto the ballot; and the IC-X configured as DRE+VVPAT can also print onto the ballot after the voter inspects it.  In fact, almost all DRE+VVPATs can do this: after the voter inspects the ballot, print VOID on that ballot (and hope the voter doesn’t notice), then print a new one after the voter leaves the booth.

It is to obey this principle that we should separate ballot marking devices from ballot scanning/tabulation devices (better known as “optical scanners”).  Here’s my favorite ballot-marking device:

But here are some other acceptable BMDs (from ClearBallot, ES&S, Hart, Dominion, and Unisyn):

Any of these can mark a paper ballot to be inserted in a separate optical scanner.  You might notice that the second picture is an ExpressVote, which, if used as an all-in-one unit that both marks and scans the ballot, violates the principle.  But if used as a nonscanning, nontabulating ballot-marking device, and if the tabulating optical scanner cannot mark votes onto the ballot, then the ExpressVote (and similar machines) can safely be used as a BMD.

“… whose physical hardware …”

I stated the principle as, “Any voting machine whose physical hardware can print votes onto the ballot after the last time…”  That’s quite different from “Any voting machine that can print votes onto the ballot after the last time…”

What’s the difference?  Those two statements might seem equivalent, but they’re not.

All-in-one voting machines such as the Dominion ImageCast Evolution and the ES&S ExpressVote have software that, to the best of our knowledge, doesn’t cheat.  Their software passes inspection by an EAC-certified laboratory, and we hope that such labs would notice if there were a part of the program that printed votes on an already-marked ballot.  So it’s fair to say that, as shipped from the manufacturer, neither of these machines can print votes onto an already-marked ballot.

But the problem is, the software can be replaced by unauthorized software that behaves differently.  That unauthorized replacement, we call “hacking.”  The unauthorized software can send instructions to the physical hardware of the machine: motors, scanners, printers, indicator lights, and so on.  Anything that the voting machine’s physical hardware can do, the fraudulent software can tell it to do.

Optical scanners that mark serial numbers on the ballot

I stated the principle as, “Any machine whose physical hardware can print votes onto the ballot after the last time…”  That’s quite different from, “Any machine whose physical hardware can print onto the ballot after the last time…”

What’s the difference?  Those two statements might seem equivalent, but they’re not.

Ballot-comparison audits are one form of risk-limiting audit (RLA) that can be particularly efficient.  The idea is: the optical-scan voting machine produces a file of Cast-Vote Records (CVRs) that contains a commitment to the contents and interpretation of each individual paper ballot.  It must be possible to link each CVR to one particular piece of paper, otherwise a ballot-comparison audit is not possible.  One cannot link CVRs to paper ballots unless the paper ballot has some sort of serial number, either preprinted (before it goes through the optical scanner) or printed afterward (perhaps as it goes through the optical scanner).   Because most voting equipment in use today does not have this capability, ballot-comparison audits cannot be used with that equipment, and other RLA methods are used, such as ballot-polling audits or batch-comparison audits.
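A minimal sketch of the linkage the paragraph above describes, using made-up serial numbers and candidate names (and a bare discrepancy count, not a full risk calculation): each cast-vote record carries the imprinted ID of one physical ballot, so auditors can retrieve exactly that piece of paper and check the machine’s interpretation against it.

```python
# Hypothetical cast-vote records, keyed by the imprinted serial number that
# links each record to one particular physical ballot.
cvrs = {
    "scan1-batch2-0001": "Alice",
    "scan1-batch2-0002": "Bob",
    "scan1-batch2-0003": "Alice",
    "scan1-batch2-0004": "Alice",
}

# In a real audit, humans retrieve each sampled paper ballot and read it.
# We simulate that with a lookup table of the "true" paper contents.
paper = {
    "scan1-batch2-0001": "Alice",
    "scan1-batch2-0002": "Bob",
    "scan1-batch2-0003": "Bob",    # the scanner mis-tabulated this ballot
    "scan1-batch2-0004": "Alice",
}

def comparison_audit(sample_ids):
    """Count discrepancies between each CVR and the paper ballot it claims to describe."""
    return sum(1 for sid in sample_ids if cvrs[sid] != paper[sid])

# Audit a sample of ballots; any nonzero count triggers escalation or a recount.
print(comparison_audit(["scan1-batch2-0001", "scan1-batch2-0003"]))  # -> 1
```

Without the imprinted ID there is no way to know which piece of paper a given CVR claims to describe, which is exactly why equipment lacking this capability cannot support ballot-comparison audits.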

There’s a problem with putting serial numbers on the ballot that the voter can see: it weakens the secret ballot, because now the voter can remember the serial number, and prove how she voted; thus she can be bribed or coerced to vote a certain way.  Therefore, some jurisdictions may be reluctant to use preprinted serial numbers.

So there are reasons that we might wish to allow optical scanners to print serial numbers onto the ballot, but the optical scanner must not be physically able to print votes onto the ballot — that would violate the verifiability principle I stated at the beginning.

One solution to this problem is to equip the optical scanner with a printer that is physically able to print only within 1 centimeter of the edge of the paper.  As long as no vote-marks are expected at the edge of the paper, then the scanner can print onto the ballot but cannot print votes onto the ballot.
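The safety argument here is a simple geometric one, and it can be checked mechanically. A sketch with a hypothetical ballot layout (letter-size paper, made-up vote-target positions, all dimensions in millimeters): verify that every vote target lies entirely outside the edge strip the printer can reach.

```python
# Illustrative check (hypothetical layout, millimeters): an edge-restricted
# printer can imprint a serial number but can never reach a vote target.
PAGE_W, PAGE_H = 216, 279   # letter-size paper
EDGE = 10                   # printer can reach only within 10 mm of an edge

# Hypothetical vote-target bounding boxes: (x, y, width, height)
targets = [
    (30, 50, 6, 4),
    (30, 70, 6, 4),
    (120, 200, 6, 4),
]

def in_interior(x, y, w, h):
    """True if the box lies entirely outside the edge-printable strip."""
    return (x >= EDGE and y >= EDGE and
            x + w <= PAGE_W - EDGE and y + h <= PAGE_H - EDGE)

# If every vote target is in the interior, the scanner can print onto the
# ballot but cannot print votes onto the ballot.
print(all(in_interior(*t) for t in targets))  # -> True
```

Of course, the guarantee must come from the printer’s physical construction, not from software; the point of the sketch is only that the layout constraint is easy to state and verify.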

Two widely used central-count optical scanners from major voting-machine manufacturers both have this capability: the Dominion ImageCast Central and the ES&S DS850.  Jennifer Morrell informs me, “So far, Dominion’s CVR is the only one I’ve seen where the imprinted ID can be formatted to indicate a specific scanner, batch, and sequence number within the batch.”  That is, the cast-vote record of Dominion’s central-count op-scanner has not just a serial number, but an identifier whose design is particularly helpful in ballot-comparison audits.

“… the voter inserts the ballot …”

Some voters have motor disabilities that make it difficult or impossible for them to physically handle a paper ballot.  Some voters have visual impairments and cannot see a paper ballot.  For those voters, polling places that use optical-scan voting can (and do) provide ballot-marking devices (such as the ones shown in the pictures above) that have audio interfaces (for blind voters) or sip-and-puff interfaces (for quadriplegic voters).

But after they use the BMD to mark their ballot, some of these disabled voters are physically unable to take the ballot from the BMD and insert it into the optical scanner.  For those voters, an advantage of DRE+VVPAT or all-in-one voting machines is that they don’t have to handle a paper ballot.

When the ballot-marking device is separate from the optical scanner, those voters will need the assistance of a pollworker to insert their ballot into the optical scanner (or, when central-count optical scanning is used, into the ballot box).  This tradeoff seems necessary: the security hazards of all-in-one voting machines, and the unverifiability of scanners that can print more votes onto the ballot, outweigh the convenience of an all-in-one machine.

Continuous-roll VVPAT under glass: an idea whose time has passed

States and counties should not adopt DRE+VVPAT voting machines such as the Dominion ImageCast X and the ES&S ExpressVote.  Here’s why.

Touchscreen voting machines (direct-recording electronic, DRE) cannot be trusted to count votes, because (like any voting computer) a hacker may have installed fraudulent software that steals votes from one candidate and gives them to another.  The best solution is to vote on hand-marked paper ballots, counted by optical scanners.  Those opscan computers can be hacked too, of course, but we can recount or random-sample (“risk-limiting audit”) the paper ballots, by human inspection of the paper that the voter marked, to make sure.

Fifteen years ago in the early 2000s, we computer scientists proposed another solution: equip the touchscreen DREs with a “voter verified paper audit trail” (VVPAT).  The voter would select candidates on a touchscreen, the DRE would print those choices on a cash-register tape under glass, the voter would inspect the paper to make sure the machine wasn’t cheating, the printed ballot would drop into a sealed ballot box, and the DRE would count the vote electronically.  If the DRE had been hacked to cheat, it could report fraudulent vote totals for the candidates, but a recount of the paper VVPAT ballots in the ballot box would detect (and correct) the fraud.
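The detect-and-correct logic of that proposal can be shown in miniature. A toy model with made-up numbers: the hacked DRE reports shifted electronic totals, but a hand recount of the printed VVPAT ballots exposes the mismatch.

```python
# Toy model (hypothetical data): a hacked DRE shifts votes in its electronic
# totals, but a hand recount of the voter-verified paper record catches it.
from collections import Counter

paper_ballots = ["Alice"] * 60 + ["Bob"] * 40   # what voters actually verified
electronic = Counter({"Alice": 52, "Bob": 48})  # fraudulent totals the DRE reports

recount = Counter(paper_ballots)
print(recount != electronic)  # -> True: the recount exposes the fraud
```

The catch, as the rest of this section explains, is that this correction only works if voters actually verified the paper in the first place.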

By the year 2009, this idea was already considered obsolete.  The problem is, no one has any confidence that the VVPAT is actually “voter verified,” for many reasons:

  1. The VVPAT is printed in small type on a narrow cash-register tape under glass, difficult for the voter to read.
  2. The voter is not well informed about the purpose of the VVPAT.  (For example, in 2016 an instructional video from Buncombe County, NC showed how to use the machine; the VVPAT-under-glass was clearly visible at times, but the narrator didn’t even mention that it was there, let alone explain what it’s for and why it’s important for the voter to look at it.)
  3. It’s not clear to the voter, or to the pollworker, what to do if the VVPAT shows the wrong selections.  Yes, the voter can alert the pollworker, the ballot will be voided, and the voter can start afresh.  But think about the “threat model.”  Suppose the hacked/cheating DRE changes a vote, and prints the changed vote in the VVPAT.  If the voter doesn’t notice, then the DRE has successfully stolen a vote, and this theft will survive the recount.  If the voter does notice, then the DRE is caught red-handed, except that nothing happens other than the voter tries again (and the DRE doesn’t cheat this time).  You might think that if the wrong candidate is printed on the VVPAT, this is strong evidence that the machine is hacked and alarm bells should ring.  But what if the voter misremembers what he entered on the touchscreen?  There’s no way to know whose fault it is.
  4. Voters are not very good at correlating their VVPAT-in-tiny-type-under-glass to the selections they made on the touch screen.  They can remember who they selected for president, but do they really remember the name of their selection for county commissioner?  And yet, historically in American elections, it’s as often the local and legislative offices where ballot-box-counting (insider) fraud has occurred.
  5. “Continuous-roll” VVPATs, which don’t cut the tape into individual ballots, compromise the secrecy of the ballot.  Since any of the political-party-designated pollwatchers can see (and write down) what order people vote on the machine, and know the names of all the voters who announce themselves when signing in, they can (during a recount) correlate voters to ballots.  (During a 2006 trial in the Superior Court of New Jersey, I was testifying about this issue; Judge Linda Feinberg saw this point immediately, she said it was obvious that continuous-roll VVPATs compromise the secret ballot and should not be acceptable under New Jersey law. )

For all these reasons, many states that adopted DRE+VVPAT in the period 2003–2008 have abandoned them, switching over to optical-scan voting with hand-marked (“fill in the opscan bubbles”) paper ballots, with Ballot-Marking Devices (BMDs) available for voters who can’t easily read or handle the paper.  Buncombe County switched to optical scan between 2016 and 2018, because the state of North Carolina outlawed continuous-roll VVPATs.

In the 2018 election, approximately* 42 states will use optical-scan, 3 states will use DRE+VVPAT, and 5 states will use paperless DREs (touchscreens).  Between 2002 and 2018, many states switched from DRE to opscan, from mechanical lever machines to opscan, from punchcard to opscan, from DRE+VVPAT to opscan; but not one state that I know of switched to DRE+VVPAT.  It’s not a good technology; it’s too easy for the computer (if hacked) to manipulate what appears on the paper record.

New Jersey is one of those 5 states that use paperless DREs.  There’s no excuse for that; if the DREs are hacked, elections can be stolen with no detection and no recourse.  (Or if the DREs “make a mistake“, no recount is possible.)  New Jersey should switch to voter-marked optical-scan ballots, like the rest of the country.

But I am informed** that three New Jersey counties (Gloucester, Essex, and Union) are considering the purchase of new voting machines, and they’re considering only the ES&S ExpressVote and the Dominion ImageCast X.  I’ve already explained why the ExpressVote is a bad idea.

New Jersey (or any state) should not adopt the Dominion ImageCast X DRE+VVPAT voting machine.  The ImageCast X comes in several configurations, and one of them is basically a DRE+VVPAT, with a continuous-roll cash-register tape under glass.  Kevin Skoglund, a software engineer in Pennsylvania, had an opportunity to examine one at a demonstration in Harrisburg, PA.  He reports that it’s quite difficult to read the VVPAT-under-glass: the printing was gray (not black) on the thermal paper, the font was small, and the glass window in the machine was small.  Even though he has 20/20 vision, he had difficulty reading it.

The ImageCast X is advertised as an optical scanner, not a DRE, because, technically, this configuration prints a QR barcode onto the VVPAT tape, then an integrated scanner immediately reads this QR code before counting the vote.  This is a distinction without a difference.  All five disadvantages listed above apply to this configuration.  Sure, a DRE+VVPAT is marginally better than a paperless DRE; but that’s not the technology to adopt in 2018.

New Jersey should buy optical-scan voting machines for hand-marked optical-scan ballots.  Dominion makes reasonable optical-scan voting machines: the ImageCast Precinct and the ImageCast Central.  ES&S makes reasonable optical-scan voting machines: the DS200, the DS450, and the DS850.  Three other companies make EAC-certified optical-scan voting machines: ClearBallot, Hart, and Unisyn.  New Jersey (and the few other states still using paperless DREs) should buy optical-scan voting machines from any of these five companies.

*I say “approximately” because some states use different machines in different counties.

**e-mail from Robert Giles, Director of the NJ Division of Elections, to Stephanie Harris, October 11, 2018.

Photo of ImageCast X VVPAT window:  Kevin Skoglund, June 2018.