October 5, 2024

PrivaCI Challenge: Context Matters

by Yan Shvartzshnaider and Marshini Chetty

In this post, we describe the Privacy through Contextual Integrity (PrivaCI) challenge that took place as part of the symposium on applications of contextual integrity, sponsored by the Center for Information Technology Policy and the Digital Life Initiative at Princeton University. We summarize the key takeaways from the discussion that unfolded.

We welcome your feedback on any aspect of the challenge, as we seek to improve it as a pedagogical and methodological tool for eliciting discussion around privacy in a systematic and structured way.

See the Additional Material and Resources section below for links to learn more about the theory of Contextual Integrity and for the challenge instructions web page.

What Is the PrivaCI Challenge?

The PrivaCI challenge is designed to evaluate information technologies and to discuss legitimate responses. It puts into practice the approach formulated by the theory of Contextual Integrity for providing “a rigorous, substantive account of factors determining when people will perceive new information technologies and systems as threats to privacy” (Nissenbaum, H., 2009).

At the symposium, we used the challenge to discuss and evaluate recent privacy-relevant events. The challenge included 8 teams and 4 contextual scenarios. Each team was presented with a use case/context scenario, which it then discussed using the theory of CI. This way, each contextual scenario was discussed by two teams.


PrivaCI challenge at the symposium on applications of Contextual Integrity


To facilitate a structured discussion, we asked each group to fill in the following template:

Context Scenario: The template included a brief summary of a context scenario, which in our case was based on one of four privacy-related news stories, with a link to the original story.

Contextual Informational Norms and privacy expectations: During the discussion, the teams had to identify the relevant contextual informational norms and privacy expectations and provide examples of information flows violating these norms.

Example of flows violating the norms: We asked that each flow be broken down into the relevant CI parameters, i.e., identifying the actors involved (sender, recipient, subject), the attribute, and the transmission principle.

Possible solutions: Finally, the teams were asked to think of possible solutions to the problem that incorporate previous or ongoing research projects of their teammates.
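The template above can be sketched as a simple data structure. The following is an illustrative sketch only; the class and field names are our own and are not part of the official challenge materials or the CI formalism.

```python
from dataclasses import dataclass, field

# A CI information flow, described by five parameters of the theory.
@dataclass(frozen=True)
class InformationFlow:
    sender: str                  # who transmits the information
    recipient: str               # who receives it
    subject: str                 # whom the information is about
    attribute: str               # the type of information transmitted
    transmission_principle: str  # the condition governing the flow

# One filled-in challenge template entry (hypothetical field names).
@dataclass
class ChallengeEntry:
    scenario: str                                        # brief summary of the context scenario
    norms: list = field(default_factory=list)            # contextual informational norms
    violating_flows: list = field(default_factory=list)  # flows breaching those norms
    solutions: list = field(default_factory=list)        # proposed mitigations

# Example entry, loosely based on the Uber driver scenario discussed below.
entry = ChallengeEntry(scenario="Uber driver puts video of passengers online")
entry.norms.append("We do not expect to be filmed in private spaces like Uber/Lyft vehicles")
entry.violating_flows.append(InformationFlow(
    sender="Uber driver", recipient="general public", subject="passengers",
    attribute="in-car video footage",
    transmission_principle="without knowledge or consent"))
```

Structuring each flow this way forces the discussion to name all five parameters explicitly, which is the core of the exercise.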

What Were The Privacy-Related Scenarios Discussed?

We briefly summarize the four case studies/privacy-related scenarios and discuss some of the takeaways from the group discussions.

  1. St. Louis Uber driver has put a video of hundreds of his passengers online without letting them know.
    https://www.stltoday.com/news/local/metro/st-louis-uber-driver-has-put-video-of-hundreds-of/article_9060fd2f-f683-5321-8c67-ebba5559c753.html
  2. “Saint Louis University will put 2,300 Echo Dots in student residences. The school has unveiled plans to provide Echo Dots to all 2,300 student residences on campus (both dorms and apartments).”
    https://www.engadget.com/2018/08/16/saint-louis-university-to-install-2300-echo-dots/
  3. Google tracks users’ movements even if they set their settings to prevent it. https://apnews.com/828aefab64d4411bac257a07c1af0ecb
  4. Facebook asked large U.S. banks to share financial information on their customers.
    https://www.wsj.com/articles/facebook-to-banks-give-us-your-data-well-give-you-our-users-1533564049


Identifying Governing Norms

Much of the discussion focused on the relevant governing norms. For some groups, identifying norms was a relatively straightforward task. For example, in the Uber driver scenario, a group listed: “We do not expect to be filmed in private (?) spaces like Uber/Lyft vehicles.” In the Facebook case, one of the groups articulated a norm as “Financial information should only be shared between financial institutions and individuals, by default, AND Facebook is a social space where personal financial information is not shared.”

Other groups could not always identify the norms that were violated. For example, in the “Google tracks your movements, like it or not” scenario, one of the teams could not formulate which norms were breached. Nevertheless, they felt uncomfortable with the overall notion of being tracked. Similarly, a group analyzing the scenario where “Facebook has asked large U.S. banks to share detailed financial information about their customers” found the notion of an information flow traversing the social and financial spheres unacceptable. Nevertheless, they were not sure about the governing norms.

The ensuing discussion considered whether norms usually correspond to “best” practice or due diligence. It might even be possible for Facebook to claim that everything was legal and that no laws were breached in the process, but this by itself does not mean that no norm was violated.

We emphasized the fact that norms are not always grounded in law. An information flow can still violate a norm even if it is specified in a privacy policy, considered legal, or regarded as a “best” practice. Norms are influenced by many other factors. If we feel uneasy about an information flow, it probably violates some deeper norm that we might not be consciously aware of; this requires deeper analysis.

Norms and privacy expectations vary among members of groups and across groups

The challenge showcased that norms and privacy expectations may vary: members within a group, and across groups, had different privacy expectations for the same context scenario. For example, in the Uber scenario, some members of the group expected drivers to film their passengers for security purposes, while others did not expect to be filmed at all. In this case, we followed the CI decision heuristic, which “recommends assessing [alternative flows’] respective merits as a function of their meaning and significance in relation to the aims, purposes, and values of the context.” It was interesting to see how, by explaining the values served by a “violating” information flow, it was possible to get the members of the team to consider its validity in a certain context under very specific conditions. For example, it might be acceptable for a taxi driver to record their passengers to a secure server (without Internet access) for safety reasons.

Contextual Integrity offers a framework to capture contextual information norms

The challenge revealed additional aspects of the way groups approach the norm identification task. Two separate teams listed the following statements as norms: “Consistency between presentation of service and actual functioning” and “Privacy controls actually do something.” These outline general expectations and fall under the deceptive practices prohibited by the Federal Trade Commission (FTC) Act; nevertheless, these expectations are difficult to capture and assess using the CI framework because they are not articulated in terms of appropriate information flows. This might also be a limitation of the task itself: due to time constraints, the groups were asked to articulate the norms as general sentences rather than specify them using the five CI parameters.

Norm violating information flows

Once norms were identified, the groups were asked to specify possible information flows that violate them. It was encouraging to see that most teams were able to articulate the violating information flows correctly, i.e., by specifying the parameters that correspond to the flow. A team working on the Google location-tracking scenario could pinpoint the violating information flow: Google should not generate the flow without users’ awareness or consent; that is, the flow may only happen under specific conditions. Similar violations were identified in other scenarios, for example, in the case where an Uber driver was streaming live videos of his passengers to an internet site. Here, too, the change in transmission principle and recipient prompted a feeling of privacy violation among the group.
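The kind of check the teams performed by hand can be sketched in code: a flow breaches a norm that governs the same sender, recipient, subject, and attribute unless it uses a permitted transmission principle. This is a hypothetical illustration; the matching rule, dictionary keys, and scenario strings are our own simplifications, not part of the CI theory.

```python
def violates(flow: dict, norm: dict) -> bool:
    """Return True if the flow is governed by the norm but uses a
    transmission principle the norm does not allow."""
    for param in ("sender", "recipient", "subject", "attribute"):
        # "*" in a norm acts as a wildcard matching any value.
        if norm[param] != "*" and norm[param] != flow[param]:
            return False  # norm does not govern this flow at all
    return flow["transmission_principle"] not in norm["allowed_principles"]

# The Google location-tracking scenario: the flow may only occur with consent.
norm = {"sender": "user's device", "recipient": "Google", "subject": "user",
        "attribute": "location history",
        "allowed_principles": {"with user's awareness and consent"}}

tracked_flow = {"sender": "user's device", "recipient": "Google",
                "subject": "user", "attribute": "location history",
                "transmission_principle": "silently, despite opt-out setting"}

print(violates(tracked_flow, norm))  # True: consent was required but absent
```

The same flow with the transmission principle changed to “with user's awareness and consent” would not violate the norm, which mirrors the groups' observation that a flow can become acceptable under specific conditions.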

Finally, we asked the groups to propose possible solutions to mitigate the problem. Most of the solutions involved asking users for permission, notifying them, or designing an opt-in-only system. The most critical takeaway from the discussion was that as norms and users’ privacy expectations evolve when new information flows are introduced, the merits of those flows need to be discussed in terms of the functions they serve.

Summary

The PrivaCI Challenge was a success! It served as an icebreaker for the participants to get to know each other a little better and offered a structured way to brainstorm and discuss specific cases. The goal of the challenge exercise was to introduce a systematic way of using the CI framework to evaluate a system in a given scenario. We believe similar challenges can be used as a methodology for introducing and discussing Contextual Integrity in an educational setting, or even during the design stage of a product to reveal possible privacy violations.

Additional material and resources

You can access the challenge description and the template here: http://privaci.info/ci_symposium/challenge

The symposium program is available here.

To learn more about the theory of Contextual Integrity and how it differs from other existing privacy frameworks we recommend reading “Privacy in Context: Technology, Policy, and the Integrity of Social Life” by Helen Nissenbaum.

To participate in the discussion on CI, follow @privaci_way on Twitter.
Visit the website: http://privaci.info
Join the privaci_research mailing list.

References

Nissenbaum, H., 2009. Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.