November 21, 2024

My Testimony on Behavioral Advertising

I’m testifying this morning at 10:00 AM (Eastern) at a Congressional hearing on “Behavioral Advertising: Industry Practices and Consumers’ Expectations”. It’s a joint hearing of two subcommittees of the House Committee on Energy and Commerce: the Subcommittee on Commerce, Trade, and Consumer Protection; and the Subcommittee on Communications, Technology, and the Internet.

Witnesses at the hearing are:

  • Jeffrey Chester, Executive Director, Center for Digital Democracy
  • Scott Cleland, President, Precursor LLC
  • Charles D. Curran, Executive Director, Network Advertising Initiative
  • Edward W. Felten, Professor of Computer Science and Public Affairs, Princeton University
  • Christopher M. Kelly, Chief Privacy Officer, Facebook
  • Anne Toth, Vice President of Policy, Head of Privacy, Yahoo! Inc.
  • Nicole Wong, Deputy General Counsel, Google Inc.

I submitted written testimony to the committee.

Look for a live webcast on the committee’s site.

CITP Seeking New Associate Director

In the next few days, I’ll be writing a post to announce CITP’s visiting fellows for the upcoming 2009-2010 academic year. But first, today, I want to let you know about a change in the Center’s leadership structure. After serving for two years as CITP’s first-ever Associate Director, David Robinson will be leaving us in August to begin law school at Yale. As a result, we are now launching a search for a new Associate Director.

As Associate Director, he helped oversee CITP’s growth into a larger, more mature organization, our move into a great new space in Sherrerd Hall, and two years of our busy activities calendar. He has been an integral part of the Center’s management and its research activities. David has done a fantastic job, and we’ll miss him, but we understand and support his decision to go on to law school as the next stage of his sure-to-be-stellar career. David will remain engaged with the Center’s research, and we expect to cross paths with him often in the future.

The new Associate Director will pick up where David leaves off, taking our Center to the next level in its development. The job is a fabulous opportunity to exercise leadership, vision and dedication: As a startup, we are improvising and learning while we grow, constantly looking for new and better ways to advance the policy debate and public understanding of digital technologies through both technical and policy research. Our first challenge was to get things started—now that we are established, a key priority for the new Associate Director will be building richer and deeper links and collaborations with other faculty members, policymakers, and the tech policy community generally. Here’s the official job description, soon to appear on the University’s “Jobs at Princeton” web site:

The Associate Director serves as a core organizer and evangelist for the Center, both on campus and beyond. Working with the existing Center staff, the Associate Director will develop, plan and execute the Center’s public activities, including lecture series, workshops and policy briefings; recruit visiting researchers and policy experts and coordinate the selection and appointment process; cultivate research collaborations, joint public events and other activities to build faculty engagement in the Center; coordinate interdisciplinary grant writing as appropriate; and develop and maintain the Center’s website and other published materials.

One of David’s last projects at the Center will be to coordinate the search process for his replacement. The search will continue until the position is filled: We hope to have the new Associate Director in place by the start of the school year. Applicants should provide a cover letter, CV, and contact information for three references. These materials can be sent to David (or equivalently, once the University’s jobs site has the listing, they can also be submitted through that route). David will also be happy to answer any questions about the position.

China's New Mandatory Censorware Creates Big Security Flaws

Today Scott Wolchok, Randy Yao, and Alex Halderman at the University of Michigan released a report analyzing Green Dam, the censorware program that the Chinese government just ordered installed on all new computers in China. The researchers found that Green Dam creates very serious security vulnerabilities on users’ computers.

The report starts with a summary of its findings:

The Chinese government has mandated that all PCs sold in the country must soon include a censorship program called Green Dam. This software monitors web sites visited and other activity on the computer and blocks adult content as well as politically sensitive material. We examined the Green Dam software and found that it contains serious security vulnerabilities due to programming errors. Once Green Dam is installed, any web site the user visits can exploit these problems to take control of the computer. This could allow malicious sites to steal private data, send spam, or enlist the computer in a botnet. In addition, we found vulnerabilities in the way Green Dam processes blacklist updates that could allow the software makers or others to install malicious code during the update process. We found these problems with less than 12 hours of testing, and we believe they may be only the tip of the iceberg. Green Dam makes frequent use of unsafe and outdated programming practices that likely introduce numerous other vulnerabilities. Correcting these problems will require extensive changes to the software and careful retesting. In the meantime, we recommend that users protect themselves by uninstalling Green Dam immediately.

The researchers have released a demonstration attack which will crash the browser of any Green Dam user. Another attack, for which they have not released a demonstration, allows any web page to seize control of any Green Dam user’s computer.
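The blacklist-update flaw the report describes reflects a general pattern: an updater that fetches and applies code or data without authenticating it will install whatever the network (or the update server) delivers. As a rough illustration of the missing safeguard — this is not Green Dam’s actual code, and the digest-check approach here is a simplified stand-in for proper signature verification — a sketch of applying an update only after an integrity check:

```python
import hashlib

def apply_blacklist_update(update_bytes, expected_sha256):
    """Apply a downloaded blacklist update only if its digest matches.

    A production updater would verify a digital signature from the
    vendor; checking a known SHA-256 digest is the simplest version
    of the same idea. An updater that skips this step will accept
    tampered updates, which is the class of problem the report found.
    """
    actual = hashlib.sha256(update_bytes).hexdigest()
    if actual != expected_sha256:
        # Tampered or corrupted download: refuse to apply it.
        raise ValueError("update failed integrity check")
    return update_bytes.decode("utf-8").splitlines()
```

Without a check like this, anyone positioned to modify the download — the software makers, or an attacker on the network path — can push arbitrary content to every machine running the software, which is exactly the risk the researchers flagged.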

This is a serious blow to the Chinese government’s mandatory censorware plan. Green Dam’s insecurity is a show-stopper — no responsible PC maker will want to preinstall such dangerous software. The software can be fixed, but it will take a while to test the fix, and there is no guarantee that the next version won’t have other flaws, especially in light of the blatant errors in the current version.

Internet Voting: How Far Can We Go Safely?

Yesterday I chaired an interesting panel on Internet Voting at CFP. Participants included Amy Bjelland and Craig Stender (State of Arizona), Susan Dzieduszycka-Suinat (Overseas Vote Foundation), Avi Rubin (Johns Hopkins), and Alec Yasinsac (Univ. of South Alabama). Thanks to David Bruggeman and Cameron Wilson at USACM for setting up the panel.

Nobody advocated a full-on web voting system that would allow voting from any web browser. Instead, the emphasis was on more modest steps, aimed specifically at overseas voters. Overseas voters are a good target population, because there aren’t too many of them — making experimentation less risky — and because vote-by-mail serves them poorly.

Discussion focused on two types of systems: voting kiosks, and Internet transmission of absentee ballots.

A voting kiosk is a computer-based system, running carefully configured software, that is set up in a securable location overseas. Voters come to this location, authenticate themselves, and vote just as they would in a polling place back home. A good kiosk system keeps an electronic record, which is transmitted securely across the Internet to voting officials in the voter’s home jurisdiction. It also keeps a paper record, verifiable by the voter, which is sent back to voting officials after the elections, enabling a post-election audit. A kiosk can use optical-scan technology or it can be a touch-screen machine with a paper trail — essentially it’s a standard voting system with a paper trail, connected to home across the Internet. If the engineering is done right, if the home system that receives the electronic ballots is walled off from the central vote-tabulating system, and if appropriate post-election auditing is done, this system can be secure enough to use. All of the panelists agreed that this type of system is worth trying, at least as a pilot test.
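The paper trail is what makes the post-election audit possible: officials can tally the electronic records and the voter-verified paper records independently and compare them. A minimal sketch of that comparison step — the data layout is an assumption for illustration, and real audits typically use statistical sampling rather than a full hand count:

```python
def audit_discrepancies(electronic_tally, paper_tally):
    """Compare per-candidate totals from the electronic records
    against totals from the voter-verified paper records.

    Returns a dict mapping each disagreeing candidate to the pair
    (electronic count, paper count); an empty dict means the two
    sets of records are consistent.
    """
    candidates = set(electronic_tally) | set(paper_tally)
    return {
        c: (electronic_tally.get(c, 0), paper_tally.get(c, 0))
        for c in candidates
        if electronic_tally.get(c, 0) != paper_tally.get(c, 0)
    }
```

The point of the design is that neither record can silently drift from the other: a compromised kiosk that alters electronic ballots is caught by the paper, and lost or altered paper is caught by the electronic record.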

The other approach is to use ordinary absentee ballots, but to distribute them and allow voters to return them online. A voter goes to a web site and downloads a file containing an absentee ballot and a cover sheet. After printing out the file, the voter fills out the cover sheet (giving his name and other information) and the ballot. He scans the cover sheet and ballot, and uploads the scan to a web site. Election officials collect and print the resulting file, and treat the printout like an ordinary absentee ballot.

Kevin Poulsen and Eric Rescorla criticize the security of this system, and for good reason. Internet distribution of blank ballots can be secure enough, if done very carefully, but returning filled-out ballots from an ordinary computer and browser is risky. Eric summarizes the risks:

We have integrity issues here as well: as Poulsen suggests (and quotes Rubin as suggesting), there are a number of ways for things to go wrong here: an attacker could subvert your computer and have it modify the ballots before sending them; you could get phished and the phisher could modify your ballot appropriately before passing it on to the central site. Finally, the attacker could subvert the central server and modify the ballots before they are printed out.

Despite the risks, systems of this sort are moving forward in various places. Arizona has one, which Amy and Craig demonstrated for the panel’s audience, and the Overseas Vote Foundation has one as well.

Why is this less-secure alternative getting more traction than kiosk-based systems? Partly it’s due to the convenience of being able to vote from anywhere (with a Net connection) instead of having to visit a kiosk location. That’s understandable. But another part of the reason seems to be that people don’t realize what can go wrong, and how often things actually do go wrong, in online interactions.

In the end, there was a lot of agreement among the panelists — a rare occurrence in public e-voting discussions — but disagreement remained about how far we can go safely. For overseas voters at least, the gap between what is convenient and what can be made safe is smaller than it is elsewhere, but that gap does still exist.

iPhone Apps: Apple Picks a Little, Talks a Little

Last week Apple, in an incident destined for the textbooks, rejected an iPhone app called Eucalyptus, which lets you download and read classic public-domain books from Project Gutenberg. The rejection meant that nobody could download or use the app (without jailbreaking their phone). Apple’s rationale? Some of the books, in Apple’s view, were inappropriate.

Apple’s behavior put me in mind of the Pick-a-Little Ladies from the classic musical The Music Man. These women, named for their signature song “Pick a Little, Talk a Little,” condemn Marian the Librarian for having inappropriate books in her library:

Maud: Professor, her kind of woman doesn’t belong on any committee. Of course, I shouldn’t tell you this but she advocates dirty books.

Harold: Dirty books?!

Alma: Chaucer!

Ethel: Rabelais!

Eulalie: Balzac!

This is pretty much the scene we saw last week, with the Eucalyptus app in the role of Marian — providing works by Chaucer, Rabelais, and Balzac — and Apple in the role of the Pick-a-Little Ladies. Visualize Steve Jobs, in his black turtleneck and jeans, transported back to 1912 Iowa and singing along with these frumpy busybodies.

Later in The Music Man, the Pick-a-Little Ladies decide that Marian is all right after all, and they praise her for offering great literature. (“The Professor told us to read those books, and we simply adored them all!”) In the same way, Apple, after the outcry over its muzzling of Eucalyptus, reversed course and un-rejected Eucalyptus. Now we can all get Chaucer! Rabelais! Balzac! on our iPhones.

But there is one important difference between Apple and the Pick-a-Little Ladies. Apple had the power to veto Eucalyptus, but the Ladies couldn’t stop Marian from offering dirty books. The Ladies were powerless because old miser Madison had cleverly bequeathed the library building to the town but the books to Marian. In today’s terms, Madison had jailbroken the library.

All of this highlights the downside of Apple’s controlling strategy. It’s one thing to block apps that are fraudulent or malicious, but Apple has gone beyond this to set itself up as the arbiter of good taste in iPhone apps. If you were Apple, would you rather be the Pick-a-Little Ladies, pretending to sit in judgment over the town, or old miser Madison, letting people make their own choices?