October 20, 2018

How the Contextual Integrity Framework Helps Explain Children’s Understanding of Privacy and Security Online

This post discusses a new paper that will be presented at the 2018 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW). I wrote this paper with co-authors Shalmali Naik, Utkarsha Devkar, Marshini Chetty, Tammy Clegg, and Jessica Vitak.

Watching YouTube during breakfast. Playing Animal Jam after school. Asking Google about snakes. Checking points on Class Dojo. Posting a lip-synching video on Musical.ly. These online activities are interspersed in the daily lives of today’s children. They also involve logging into an account, disclosing information, or exchanging messages with others—actions that can raise privacy and security concerns.

How do elementary school-age children conceptualize privacy and security online? What strategies do they and their parents use to help address such concerns? In interviews with 18 families, we found that children ages 5-11 understand some aspects of how privacy and security apply to online activities. And while children look to their parents for support, parents feel that privacy and security are largely a concern for the future, when their children are older, have their own smartphones, and spend more time on activities like social media. (For a summary of the paper, see this Princeton HCI post.)

Privacy scholar Helen Nissenbaum’s contextual integrity framework was developed to help identify what privacy concerns emerge through the use of new technology and what types of solutions can address those concerns. We found that the framework is also useful to explain what children know (and don’t know) about privacy online and what types of educational materials can enhance that knowledge.

What is contextual integrity? The contextual integrity framework considers privacy from the perspective of how information flows. People expect information to flow in a certain way in a given situation. When it does not, privacy concerns may arise. For example, the norms of a parent-teacher conference dictate that a teacher can reveal information about the parent’s child to the parent, but not about other children. Four parameters influence these norms (see the sketch after the list below):

  • Context: This relates to the backdrop against which a given situation occurs. A parent-teacher conference occurs within an educational context.
  • Attributes: This refers to the types of information involved in a particular context. The parent-teacher conference involves information about a child’s academic performance and behavioral patterns, but not necessarily the child’s medical history.
  • Actors: This concerns the parties involved in a given situation. In a parent-teacher conference, the teacher (sender) discloses information about the student (subject) to the parent (recipient).
  • Transmission Principles: This involves constraints that affect the flow of information. For example, information shared during a parent-teacher conference is unidirectional (i.e. teachers don’t share information about their own children with parents) and confidential (i.e. social norms and legal restrictions prevent teachers from sharing such information with the entire school).
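
To make these parameters concrete, it can help to think of an information flow as a structured record and a norm as a check over such records. The rough Python sketch below is purely illustrative and not from the paper; the InformationFlow record and the respects_conference_norm check are hypothetical names, modeled on the parent-teacher conference example above.

    # Illustrative sketch only: one possible way to represent the four
    # contextual-integrity parameters. All names here are hypothetical.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class InformationFlow:
        context: str                 # e.g. "education"
        attribute: str               # type of information, e.g. "academic performance"
        sender: str                  # actor who discloses the information
        subject: str                 # actor the information is about
        recipient: str               # actor who receives the information
        transmission_principle: str  # constraint on the flow, e.g. "confidential"

    # A norm for the parent-teacher conference: a teacher may share a student's
    # academic or behavioral information with a parent, in confidence.
    def respects_conference_norm(flow: InformationFlow) -> bool:
        return (
            flow.context == "education"
            and flow.sender == "teacher"
            and flow.recipient == "parent"
            and flow.attribute in {"academic performance", "behavior"}
            and flow.transmission_principle == "confidential"
        )

    ok = InformationFlow("education", "academic performance",
                         "teacher", "student", "parent", "confidential")
    not_ok = InformationFlow("education", "medical history",
                             "teacher", "student", "parent", "confidential")

    print(respects_conference_norm(ok))      # True: matches the conference norm
    print(respects_conference_norm(not_ok))  # False: medical history is out of scope

A flow that reaches an unintended recipient or carries an out-of-scope attribute fails the check; those failures are the kinds of norm violations the framework is designed to surface.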

How does the contextual integrity framework help us understand what children know about privacy and security online? In our interviews, we found that children largely understood how attributes and actors could affect privacy and security online. They knew that certain types of information, such as a password, deserved more protection than others. They also recognized that it was more appropriate to share information with known parties, such as parents and teachers, rather than strangers or unknown people online.

But children under age 10 struggled to grasp how interacting online could violate transmission principles by, for example, enabling unintended actors to see information. Only one child recognized that someone could take information shared in a chat message and repost it elsewhere, potentially spreading it far beyond its intended audience. Children also struggled to understand how the context of a situation could inform decisions about how to appropriately share information. They largely used the heuristic of “Could I get in trouble for this?” to guide behavior.

How do children and parents navigate privacy and security online? While a few children understood that restricting access to information or providing false information online could help them protect their privacy, most relied on their parents for support in navigating potentially concerning situations. Parents primarily used passive strategies to manage their children’s technology use. They maintained a general awareness of what their children were doing, mainly by telling children to use devices only when parents were around. They minimized the chances that their children would download additional apps or spend money by withholding the passwords for app stores.

Most parents felt their children were too young to face privacy or security risks online. But elementary school-age children already engage in a variety of activities online, and our results show they can absorb lessons related to privacy and security. Children’s willingness to rely on parents suggests that parents have an opportunity to usher their children’s knowledge to the next level. And parents may have an easier time doing so before their children reach adolescence and lose interest in listening to parents.

How can the contextual integrity framework inform children’s learning about privacy and security online? The contextual integrity framework can guide the development of relevant materials that parents and others can use to scaffold their children’s learning. For example, the development of a child-friendly ad blocker could help show children that other actors, such as companies and trackers, can “see” what people do online. Videos or games that explain, in an age-appropriate manner, how the Internet works can help children understand how the Internet can challenge transmission principles such as confidentiality. Integrating privacy- and security-related lessons into apps and websites that children already use can help refine their understanding of how contexts and norms shape decisions to disclose information. For example, the website for the public broadcasting channel PBS Kids instructs children to avoid using personal information, such as their last name or address, in a username.

As the boundaries between offline and online life continue to fade, privacy and security knowledge remains critical for people of all ages. Theoretical frameworks like contextual integrity help us understand how to evaluate and enhance that knowledge.

For more information, read the full paper.

CITP Call for Visitors and Affiliates 2018-19

The Center for Information Technology Policy is an interdisciplinary research center at Princeton that sits at the crossroads of engineering, the social sciences, law, and policy.

We are seeking applicants for various residential visiting positions and for non-residential affiliates. For more information about these positions, please see our general information page and yearly call for applications and our lists of current and past visitors.

For all visiting positions, we are happy to hear from anyone working at the intersection of digital technology and public life, including experts in computer science, sociology, economics, law, political science, public policy, information studies, communication, and other related disciplines.

We have a particular interest this year in candidates working on issues related to Artificial Intelligence (AI) and the Internet of Things (IoT).

Visitors

All visitors must apply online through the links below. There are three job postings for CITP visitors: 1) the Microsoft Visiting Research Scholar/Professor of Information Technology Policy, 2) Visiting IT Policy Fellow, and 3) IT Policy Researcher.

Microsoft Visiting Research Scholar/Professor of Information Technology Policy

The successful applicant must possess a Ph.D. and will be appointed to a ten-month term, beginning September 1st. The visiting professor must teach one course in technology policy per academic year. Preference will be given to current or past professors in related fields and to nationally or internationally recognized experts in technology policy.

The application process for the Microsoft Visiting Research Scholar/Professor of Information Technology Policy position is generally open from November through the end of January for the upcoming year.

Apply here to become the Microsoft Visiting Research Scholar/Visiting Professor of Information Technology Policy

Visiting IT Policy Fellow

A Visiting IT Policy Fellow is on leave from a full-time position (for example, a professor on sabbatical). The successful applicant must possess an advanced degree and typically will be appointed to a nine-month term, beginning September 1st.

Full consideration for the Visiting IT Policy Fellow is given to those who apply from November through the end of January for the upcoming year.

Apply here to become a Visiting IT Policy Fellow

IT Policy Researcher

An IT Policy Researcher will have Princeton University as the primary affiliation during the visit to CITP (for example, a postdoctoral researcher or a professional visiting for a year between jobs). The successful applicant must possess a Ph.D. or equivalent and typically will be appointed to a 12-month term, beginning September 1st.

Full consideration for IT Policy Researcher positions is given to those who apply from November through the end of January for the upcoming year.

Apply here to become an IT Policy Researcher

Applicants should apply to either the Visiting IT Policy Fellow position or the IT Policy Researcher position, but not both; applicants to either position may also apply to be the Microsoft Visiting Research Scholar/Professor if they hold a Ph.D.

All applicants should submit a current curriculum vitae, a research plan (including a description of potential courses to be taught if applying for the visiting professor position), and a cover letter describing background, interest in the program, and any funding support for the visit. References are not required until finalists are notified. CITP has secured limited resources from a range of sources to support visitors. However, many of our visitors are on paid sabbatical from their own institutions or otherwise provide some or all of their own outside funding.

Princeton University is an Equal Opportunity/Affirmative Action Employer and all qualified applicants will receive consideration for employment without regard to age, race, color, religion, sex, sexual orientation, gender identity or expression, national origin, disability status, protected veteran status, or any other characteristic protected by law.

All offers and appointments are subject to review and approval by the Dean of the Faculty.

Affiliates

Technology policy researchers and experts who wish to have a formal affiliation with CITP, but will not be in residence in Princeton, may apply to become a CITP affiliate. The affiliation typically is a one-year appointment with the possibility of renewal. Applicants should have a strong interest in collaborating with a researcher or faculty member associated with CITP. Please send a current curriculum vitae and a cover letter describing your relevant background, research interests, and potential collaborative projects. Affiliates will be given access to post to our blog (Freedom to Tinker), will be invited to participate in center events, and may advise CITP-affiliated students.

Applications will be accepted between November and the end of January for affiliations beginning the following academic year (July 1st). Decisions on accepted candidates will be made by May 1st.

Affiliates do not have any formal appointment at Princeton University. There will be no Princeton email or permanent workspace provided.

If you have any questions about any of these positions or the application process, please feel free to contact us.

AI and Policy Event in DC, December 8

Princeton’s Center for Information Technology Policy (CITP) recently launched an initiative on Artificial Intelligence, Machine Learning, and Public Policy. On Friday, December 8, 2017, we’ll be in Washington, DC talking about AI and policy.

The event will be held at the National Press Club from 12:15 to 2:15pm on Friday, December 8. Lunch will be provided for those who register in advance.

The agenda includes:

  • Ed Felten, with a background briefing on AI and the AI policy landscape,
  • Arvind Narayanan on AI and fairness,
  • Olga Russakovsky on diversifying the AI workforce,
  • Chloe Bakalar on AI and ethics, and
  • Nick Feamster on AI and freedom of expression.

For those who can stay longer, we’ll have a roundtable discussion with the speakers, starting at 2:30.