February 23, 2020

CITP Tech Policy Boot Camp 2019

[This post was written by Liza Paudel, MPA’21 and Allison Huang, History’20.]

Over Fall Break, the Center for Information Technology Policy (CITP) hosted 17 current students on a two-day tech policy boot camp in Washington, D.C. The group was a mix of undergraduate and graduate students from various disciplines, including Computer Science, Public Policy, Economics, and History. The students were accompanied by CITP professors and staff: Tithi Chattopadhyay, Ed Felten, Mihir Kshirsagar, and Matt Salganik. Over the course of the two days, students met with technologists, researchers, public policy professionals, and government officials, and learned about the tech policy landscape across the tech industry, regulatory agencies, and research institutions.

On the first day, students met with Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips and FTC staff, and discussed the FTC’s role in antitrust and consumer protection. This was followed by a reception where students mingled with alumni who work in tech policy-related fields on and off the Hill. On the second day, students met with Pablo Chavez ’93, the head of Public Policy and Government Affairs at Google Cloud, and with researchers at the Brookings Institution. At Brookings, the students and researchers discussed Brookings’ cross-cutting tech policy research initiatives; Artificial Intelligence (AI) governance and its implications for social and foreign policy; and the promise of large-scale data analysis for more effective policymaking. Finally, students met with Dr. Lynne Parker, Deputy Chief Technology Officer of the United States and Assistant Director for Artificial Intelligence, and staff at the White House Office of Science and Technology Policy. At the White House, students learned how the Executive Branch approaches agenda-setting, stakeholder engagement, and inter-agency collaboration on tech policy.

Some of the broad themes that emerged are highlighted below:

  • Public pressure to do something is mounting. As public awareness of issues like AI systems, protection of personal data, algorithmic bias, and antitrust grows, regulatory agencies, industry, and research institutions are feeling the need to prioritize tech policy issues on their agendas. Interest in regulating ‘big tech’ more heavily has also gained momentum, and there seems to be a tacit understanding that more regulation is coming. Regulatory agencies and research institutions are thus looking for effective ways to bring stakeholders together and think through the balance between enabling innovation and imposing the necessary regulatory burden. Tech companies, for their part, have adopted their own ethics principles and created codes of conduct in anticipation. With increased public interest and news coverage, there has also been a rise in misinformation and public confusion. For example, one researcher noted how he has often had to dispel media narratives of ‘Ex-Machina-style AI taking over the world’ that largely shape public perception of the dangers of AI.
  • Everyone’s eyes are on one another. The tech industry is looking to the government, the government to the industry, and research institutions to both, as disparate attempts to understand emerging technologies move forward on all three fronts. Each is carving out its own space in a still-nascent landscape. The relationship between technology companies and policy institutions is also complicated, which hinders real collaboration. While the ‘revolving door’ between the two was a recurring theme in discussions, the old schism between the public and the private persists as well.
  • The problems are interdisciplinary, so the solutions should be too. There is both a tacit understanding and explicit acknowledgment that the government lacks the information and tools to understand and regulate emerging technologies. There is a dearth of technical experts who are also well-versed in policy and legislation, and vice versa. Multiple speakers noted that lawyers do some of this work, but only to a degree, given the technical limits of their training. Thus, there is a growing need for computer scientists and public policy students to pursue interdisciplinary academic training.

Overall, the tech policy boot camp illustrated the need for Princeton students to nurture both technical and non-technical skills in order to have impactful and rewarding careers in tech policy.

Improving Protections for Children’s Privacy Online

CITP’s Tech Policy Clinic submitted a Comment to the Federal Trade Commission in connection with its review of the COPPA Rule to protect children’s privacy online. Our Comment explains why it is important to update the COPPA Rule to keep it current with new privacy risks, especially as children spend increasing amounts of time online on a variety of connected devices.

What is the Children’s Online Privacy Protection Act (COPPA)?

As background, Congress in 1998 gave the FTC authority to issue rules governing how online commercial service providers may collect, use, or disclose information about children under the age of 13. The FTC issued the first version of the Rule in 2000; it requires providers to place parents in control of what information is collected from their young children online. The Rule applies both to providers of services directed to children under 13 and to those serving a general audience who have actual knowledge that they are collecting, using, or disclosing personal information from children under 13. The Rule was subsequently revised in 2013, after a period of public comment, to account for technological developments, including the pervasive use of mobile apps. In 2019, the FTC announced it was revisiting the Rule in light of ongoing questions about its efficacy in a data-fueled online marketplace and solicited public comment on potential improvements.

Core Recommendations to Update the COPPA Rule

Our Comment makes three main points:

  • We encourage the FTC to develop rules that promote external scrutiny of provider practices by requiring providers to make their choices about how they comply with the Rule available in a transparent, machine-readable format.
  • We recommend that the FTC allow providers to rely on the exemption for collecting or tracking information related to “internal operations” only under extremely limited circumstances; otherwise, the exception risks swallowing the rule.
  • We offer suggestions on how educational technology providers should be responsive to parents and recommend that the FTC conduct further studies of how such technology is used in practice.

We elaborate on each point below.

Enabling Effective External Compliance Checks Through Transparency

One of the central challenges with the COPPA Rule today is that it is very difficult for external observers (parents, researchers, journalists, or advocacy groups) to understand how an online provider has decided to comply with the Rule. For example, it is not clear whether a site believes it is in compliance because it argues that none of its content is directed at children, or because it has implemented mechanisms that seek appropriate consent before gathering information about users. Making a provider’s compliance choices transparent will enable meaningful external scrutiny of its practices and hold providers to account.

Under the COPPA Rule, providers are responsible for determining whether a service is child directed by looking to a variety of factors. If the service is directed at children, the provider must obtain verified parental consent before collecting information about users. If the audience is of mixed age, the provider must ensure that it does not collect information about users under the age of 13 without parental consent.

The determination about whether a service is child directed, as the FTC explains, includes factors such as “its subject matter, visual content, use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, language or other characteristics of the Web site or online service, as well as whether advertising promoting or appearing on the Web site or online service is directed to children . . . [and] competent and reliable empirical evidence regarding audience composition, and evidence regarding the intended audience.” If the service is child directed and children under the age of 13 are the primary audience, then it is “primarily child directed.” Services that are child directed but do not target children as the primary audience are “child directed, but mixed audience” services under the COPPA Rule.

If a mixed audience service seeks to collect information about users, it can choose to implement an age gate to ensure it does not collect data about underage users. An age gate is a mechanism that asks users to provide their age or date of birth in an age-neutral way.
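
To make the mechanics concrete, here is a minimal sketch in Python of the server-side logic behind an age gate. This is our illustration, not code from the Comment or from any provider; the function names and routing decisions are assumptions for exposition. The crucial design constraint lives in the prompt rather than the code: the question must be age-neutral, revealing nothing about the cutoff, or children will simply enter an older age.

```python
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13


def age_in_years(dob: date, today: Optional[date] = None) -> int:
    """Compute age in whole years from a date of birth."""
    today = today or date.today()
    age = today.year - dob.year
    # Subtract one if this year's birthday has not yet occurred.
    if (today.month, today.day) < (dob.month, dob.day):
        age -= 1
    return age


def gate_decision(dob: date) -> str:
    """Decide how the service should treat a user, given a date of
    birth collected through an age-neutral prompt."""
    if age_in_years(dob) < COPPA_AGE_THRESHOLD:
        # Collect no personal information until verifiable parental
        # consent is obtained.
        return "require_parental_consent"
    return "proceed_as_general_audience"


# Example: a user born in 2010 is routed to the parental-consent flow.
print(gate_decision(date(2010, 6, 1)))
```

A real deployment would also need to resist trivial circumvention, for example by not letting a user who has just been gated immediately resubmit an older birth date; that is exactly the kind of validation step the recommendations below would require providers to document.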

Our principal recommendation is that the COPPA Rule should be revised to explicitly facilitate external scrutiny by requiring providers to make their design choices more open to external review. Specifically, we suggest that the FTC require sites or services to disclose, in a machine-readable format, whether they consider themselves, in whole or in part, “directed to children” under COPPA. This would allow academic researchers (or parents) to examine what a provider is actually doing to protect children’s privacy.
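
As one illustration of what such a machine-readable disclosure could look like, the sketch below emits a JSON declaration. The schema, field names, and well-known path are hypothetical assumptions of ours; the Comment does not prescribe a particular format.

```python
import json

# Hypothetical disclosure document; every field name here is our own
# illustration, not an FTC-specified schema.
coppa_disclosure = {
    "service": "https://example.com",
    # Mirrors the Rule's categories: "primarily_child_directed",
    # "mixed_audience", or "not_child_directed".
    "directed_to_children": "mixed_audience",
    "uses_age_gate": True,
    "age_gate_description_url": "https://example.com/coppa/age-gate",
    "persistent_identifiers": [
        {"name": "session_id", "purpose": "internal_operations"},
    ],
}

# Serving this at a predictable location (say, /.well-known/coppa.json)
# would let researchers crawl and compare compliance claims at scale.
print(json.dumps(coppa_disclosure, indent=2))
```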

We also recommend that the FTC establish a requirement that, if a website or online service is using an age gate as part of its determination that it is not child directed, it must publicly post a description of the operation of the age gate and what steps it took to validate that children under 13 cannot circumvent the age gate. 

In addition, drawing on our work on online dark patterns, we suggest that the FTC examine the verifiable parental consent mechanisms used by providers to ensure that parents are being given the opportunity to make fully informed and free choices about their child’s privacy. 

Finally, we suggest some ways that platforms such as iOS or Android can be enlisted by the FTC to play a more effective role in screening users and verifying ages.

Restrict Providers from Relying on the “Internal Operations” Exception

Another significant issue with current practices is that providers rely on an exception that excuses them from providing parental notice and obtaining consent when they use persistent identifiers for “internal operations.” The 2013 revisions to the Rule introduced this exception, but limited it to the set of circumstances necessary to deliver the service. It appears many providers now use the exception for a wide variety of purposes that go well beyond what is strictly necessary to deliver the service. In particular, users have no external way to verify whether certain persistent identifiers, such as cookies, are being used for impermissible purposes. Therefore, our Comment urges the FTC to require providers to be transparent about how they rely on the “internal operations” exception when using persistent identifiers, and to limit the circumstances in which providers may invoke it.
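
A transparency requirement of this kind would also make automated auditing possible. The sketch below is our illustration rather than anything proposed in the Comment: it checks a provider’s declared purposes for each persistent identifier against a paraphrased allowlist of “internal operations” purposes, where the purpose labels are our assumptions rather than the Rule’s exact terms.

```python
# Paraphrased allowlist of "support for internal operations" purposes;
# a real audit would use the Rule's exact definition.
PERMITTED_INTERNAL_OPERATIONS = {
    "authentication", "security", "fraud_prevention",
    "frequency_capping", "contextual_advertising",
    "site_analysis", "network_communication", "legal_compliance",
}


def audit_identifier_uses(declared: dict[str, set[str]]) -> dict[str, set[str]]:
    """Map each persistent identifier (e.g. a cookie name) to any
    declared purposes that fall outside the exception."""
    return {
        identifier: excess
        for identifier, purposes in declared.items()
        if (excess := purposes - PERMITTED_INTERNAL_OPERATIONS)
    }


# Example: a cookie declared for both security and behavioral
# advertising is flagged, because behavioral advertising is not an
# internal operation under the Rule.
print(audit_identifier_uses({
    "tracker_id": {"security", "behavioral_advertising"},
}))  # -> {'tracker_id': {'behavioral_advertising'}}
```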

Give Parents Control Over Information Collected by Educational Technology Service Providers

Finally, our Comment addresses the FTC’s query about whether a specific exception to parental consent is warranted for the growing market of providers of educational technology services to children (and their parents) in the classroom and at home. We recommend that the FTC study the use of educational technology in the field before considering a specific exception to parental consent. In particular, we explain that any rule should cover the following issues: First, parents should be told, in an accessible manner, what data educational technology providers collect about their children, how that data is used, who has access to the data, and how long it is retained. Parents should also have the right to request that data about their children be deleted. Second, school administrators should be given guidance on how to make informed decisions about selecting educational technology providers, develop policies that preserve student privacy, and train educators to implement those policies. Third, the rule should clarify how school administrators and educational technology providers are accountable to parents for how data about children are collected, used, and maintained. Fourth, the FTC needs to clearly define what is meant by “educational purposes” in the classroom when considering any exceptions to parental consent.

* The Comment was principally drafted by Jonathan Mayer and Mihir Kshirsagar, along with Marshini Chetty, Edward W. Felten, Arunesh Mathur, Arvind Narayanan, Victor Ongkowijaya, Matthew J. Salganik, Madelyn Sanfilippo, and Ari Ezra Waldman.

The Unknown History of Digital Cash

How could we create “a digital equivalent to cash, something that could be created but not forged, exchanged but not copied, and which reveals nothing about its users”?

Why would we need this digital currency?

[Image: book cover of Finn Brunton’s “Digital Cash: The Unknown History of the Anarchists, Utopians, and Technologists Who Created Cryptocurrency.”]

Dr. Finn Brunton, Associate Professor in the Department of Media, Culture, and Communication at NYU, discussed his new book Digital Cash: The Unknown History of the Anarchists, Utopians, and Technologists Who Created Cryptocurrency on November 19th, 2019 with CITP’s Technology and Society Reading Group. Footage aired on C-SPAN’s Book TV.

Through a series of “how” and “why” questions, Finn constructed a fascinating and critical narrative around the history of digital currencies and the emergence of modern cryptocurrency. How much currency should be produced? How do we know whether currency is real? Why gold, in the case of digital gold currencies (DGCs)?

Beginning with the $20 bill, an analog “beautiful object of government technology” made possible in a digital era by the rose engine lathe, and ending with the first-ever tweet about Bitcoin (“Running bitcoin”), posted by Hal Finney (@halfin), Finn described the unexpected sociotechnical origins of Bitcoin and blockchain. His talk, and the book on which it is based, identify seminal articles (e.g., “The Computers of Tomorrow” by Martin Greenberger) and discussion communities (e.g., Extropy), key figures from David Chaum and Paul Armer to Tim May and Phil Salin, and digital currencies, from EFTs to hashcash, that served as stepping stones toward contemporary cryptocurrencies. Yet Finn also acknowledged that while names and dates are memorable and compelling for constructing a timeline and pulling continuous threads through this history, there are “n+1” ideas about and versions of digital currency.

In this sense, rather than attempting a comprehensive chronology, Finn provides a sense of the recurring objectives that motivated the evolution of cryptocurrency: trust in value, exchangeability, multiplicity, reproducibility, decentralization, abundance, scalability, sovereignty, verification, authenticity, fungibility, and transparency. In addition to these many, often “fundamentally conflicting,” values and objectives, very real concerns about privacy, surveillance, coercion, and power asymmetries, along with libertarian fears of crises and “the coming emergencies,” led individuals and communities to develop their own digital currencies. Finn also identified some of the problematic narratives around digital currencies, such as the assertion that cryptocurrency is “as real as math,” as well as real challenges that have stymied and limited various experimental currencies.

Many of these challenges were readily apparent as Finn described the rise and fall of DGCs. The strange union between futuristic digital currency and precious metals, particularly gold in its “magnificent, stupid honesty,” emerged in many parallel libertarian communities in the US and around the world, as digital and analog receipts of ownership in precious metals were distributed to document remotely stored value in a decentralized system. Finn explained how these DGCs (e.g., “eLiberty Dollars” or “The Second Amendment Dollar”) challenged the power and authority of state currencies and modern banking, and how the abrupt seizure of precious metal stockpiles as evidence by federal marshals foreshadowed both the inaccessibility problems of cryptocurrency and the entanglement of digital currencies with illicit activities later exemplified by the Silk Road.

Finn ended the discussion by answering audience questions, including questions about power dynamics and the libertarian origins of cryptocurrency. His assertion that money and crisis are linked, not only in the “economy of emergency preparedness” but also at key points of progress toward “the future of money,” compellingly situates digital currencies within this recurring pattern in a larger monetary history.