April 27, 2024

Building Respectful Products using Crypto: Lea Kissner at CITP

How can we build respect into products and systems? What role does cryptography play in respectful design?

Speaking today at CITP is Lea Kissner (@LeaKissner), global lead of Privacy Technology at Google. Lea has spent the last 11 years designing and building security and privacy for Google projects from the grittiest layers of infrastructure to the shiniest user features — and cleaning up when something goes awry. She earned a Ph.D. in cryptography at Carnegie Mellon and a B.S. in CS from UC Berkeley.

As head of privacy at Google, Lea crafts privacy reviews, defines what privacy means for the company, and leads a team that supports privacy across Google. Her team also creates tools and infrastructure that manage privacy across the company. If you’ve reviewed your privacy settings on Google, deleted your data, or shared any information with Google, Lea and her team have shaped your experience.

How does Lea think about privacy? When working to build products that respect users, Lea reminds us that it’s important for people to feel safe. This is a full-stack problem, all the way from humans and societies down to the level of hardware. Since societies vary widely, people have very different expectations around privacy and security, and not always in the ways you would anticipate. Lea lists several assumptions that don’t hold globally: not all languages have a word for privacy, people don’t always have control over their physical devices, and they often operate in settings of conflict.

Lea next talks about the case of online harassment. She describes hate speech as a kind of distributed denial of service attack: a way for attackers to suppress speech they don’t like. Many platforms make this kind of harassment easy by allowing anyone to send messages to anyone, which opens the door to mass harassment. Sometimes it’s possible for platforms to develop policies to manage these problems, but platforms are often unable to intervene in cases of conflicting values.

Lea tells us about one project she worked on during the Arab uprisings. When people’s faces appeared in videos of protests, those people sometimes faced substantial risks when videos became widely viewed. Lea’s team worked with YouTube to implement software that allowed content creators to blur the faces of people appearing in videos.

Next, Lea describes the ways that her team links research with practical benefits to people. Her team’s ethnographers study differences in situations and norms. These observations shape how her team designs systems. As they create more systems, they then create design patterns, then do user testing on those patterns. Research with humans is important at both ends of the work: when understanding the meaning and nature of the challenges, and when testing systems.

Finally, Lea argues that we need to make privacy and security easy for people to do. Right now, cryptography processes are hard for people to use, and hard for people to implement. Her team focuses on creating systems to minimize the number of things that humans need to do in order to stay secure.

How Cryptography Projects can Fail

Lea next tells us about common failures in privacy and security.

The first way to fail is to create your own cryptography system. That’s a dangerous thing to do, says Lea. Why do people do this? Some think they’re smart and know just enough to be dangerous. Some think it’s cool to roll their own. Some don’t understand how cryptography works. Sometimes it seems too expensive (in terms of computation and network) for them to use a third-party system. To make good crypto easier, Lea’s team has created Tink, a multi-language, cross-platform library that provides cryptographic APIs that are secure, easy to use correctly, and hard(er) to misuse.

Lea urges us, “Do me a solid. Don’t give people excuses to roll their own crypto.”
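To give a sense of what an API that is "hard to misuse" looks like, here is a minimal sketch of authenticated encryption with Tink in Python. The module and function names follow my reading of the Tink documentation and may differ between versions; the point is that nonce handling, key sizes, and algorithm choices are made by the library rather than the caller.

```python
# Minimal sketch of Tink's AEAD API in Python (names per the Tink docs;
# they may vary by library version).
import tink
from tink import aead

aead.register()  # register AEAD key managers before use

# Generate a fresh AES256-GCM keyset and get an AEAD primitive from it.
keyset_handle = tink.new_keyset_handle(aead.aead_key_templates.AES256_GCM)
primitive = keyset_handle.primitive(aead.Aead)

# Encrypt and decrypt; the associated data is authenticated but not encrypted.
ciphertext = primitive.encrypt(b'secret message', b'associated data')
plaintext = primitive.decrypt(ciphertext, b'associated data')
assert plaintext == b'secret message'
```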

Another area where people fail is privacy-preserving computation. Lea tells us the story of a proposed feature within Google that would let people send messages to someone whose phone number they have. Simple, right? Lea unpacks how complex such a feature can be, how easy it is to enable privacy breaches, and how expensive it can be to offer privacy. She describes a system that stores a large number of phone numbers associated with user IDs; by storing the information with encrypted user IDs, it’s possible to let people manage their privacy. But when Lea’s team estimated the computational cost of this privacy feature, they realized that it would require more than all of Google’s total computing power. They’re still working on that one.
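Lea doesn’t describe the actual design, but one ingredient in systems like this is replacing raw user IDs with keyed pseudonyms, so the stored lookup table reveals little on its own. The sketch below is a hypothetical toy illustration of that idea, not Google’s implementation; the HMAC key, table layout, and function names are invented for the example, and a real deployment would have to solve much harder matching and scale problems.

```python
# Toy illustration (not Google's design): a phone-number lookup table keyed
# by pseudonymous user IDs instead of raw ones.
import hmac
import hashlib

SECRET_KEY = b'server-side secret'   # hypothetical: held only by the service

def pseudonym(user_id: str) -> str:
    """Stable keyed pseudonym for a user ID; unlinkable without SECRET_KEY."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# The stored table maps phone numbers to pseudonyms rather than raw user IDs.
directory = {'+1-555-0100': pseudonym('user-42')}

def reachable_by(phone_number: str, user_id: str) -> bool:
    """Does this phone number map to this user? Checked via pseudonyms only."""
    return directory.get(phone_number) == pseudonym(user_id)

assert reachable_by('+1-555-0100', 'user-42')
```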

Privacy is easier to implement in structured analysis of databases such as advertising metrics, says Lea. Google has had more success adopting privacy practices in areas like advertising dashboards that don’t involve real-time user experiences.

Hardware failures are a major source of privacy and security failures. Lea tells us about the squirrels and sharks that have contributed to Amazon and Yahoo data failures by nibbling on cables. She then talks about failures that come from software errors, as well as errors in key management. Lea tells us about Google’s Key Management Server, which knows about data objects and the keys that pertain to those objects. Keys in this service need to be accessed quickly and globally.
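The talk stays at the level of architecture, but a common pattern behind a service like this is envelope encryption: each object is encrypted under its own data key, and the key service stores that data key wrapped under a master key. The sketch below is a minimal illustration of that pattern with hypothetical names, using the third-party cryptography package; it makes no claim to match Google’s internals.

```python
# Minimal envelope-encryption sketch (hypothetical names, not Google's KMS API).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

master_key = AESGCM.generate_key(bit_length=256)   # lives inside the key service

def encrypt_object(object_id: bytes, data: bytes) -> dict:
    """Encrypt one object under its own data key; store only the wrapped key."""
    data_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ct = AESGCM(data_key).encrypt(nonce, data, object_id)
    # Wrap (encrypt) the data key under the master key; the plaintext data key
    # is never stored alongside the object.
    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(master_key).encrypt(wrap_nonce, data_key, object_id)
    return {'id': object_id, 'nonce': nonce, 'ct': ct,
            'wrap_nonce': wrap_nonce, 'wrapped_key': wrapped_key}

def decrypt_object(record: dict) -> bytes:
    """Unwrap the data key via the key service, then decrypt the object."""
    data_key = AESGCM(master_key).decrypt(
        record['wrap_nonce'], record['wrapped_key'], record['id'])
    return AESGCM(data_key).decrypt(record['nonce'], record['ct'], record['id'])

assert decrypt_object(encrypt_object(b'object-1', b'hello')) == b'hello'
```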

Where do systems built on generalized key management run into trouble? First, encrypted data compresses poorly. If a million people send each other the same image, a typical storage system can deduplicate it, storing it only once; an encrypted storage system has to encrypt and store each copy individually. Second, people who store information often want to index and search it. Exact matches are easy, but if you need to retrieve a range of things from a period of time, you need an index, and to build an index, the software needs to know what’s inside the encrypted data. Sharding, backing up, and caching data are also very difficult when information is encrypted.
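The first of those problems is easy to see concretely. In the sketch below (assuming the third-party cryptography package for AES-GCM), a highly repetitive "image" compresses dramatically in the clear, while its ciphertext barely compresses at all, and two encryptions of the same bytes differ, which also defeats deduplication.

```python
# Encryption defeats both compression and deduplication.
import os
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

image = b'the same cat picture, byte for byte ' * 10_000   # repetitive plaintext
key = AESGCM.generate_key(bit_length=256)

ct1 = AESGCM(key).encrypt(os.urandom(12), image, None)
ct2 = AESGCM(key).encrypt(os.urandom(12), image, None)

print(len(image), len(zlib.compress(image)))   # plaintext compresses dramatically
print(len(ct1), len(zlib.compress(ct1)))       # ciphertext barely shrinks at all
print(ct1 == ct2)                              # False: identical files can't be deduplicated
```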

Next, Lea tells us about the problem of key rotation. People need to be able to change their keys in any usable encryption system. When rotating keys, for every single object, you need to decrypt it using the old key and then re-encrypt it using the new key, and you can’t shut down an entire service while you re-do the encryption. Within a large organization like Google, key rotation should be regular, but it needs to be coordinated across a large number of people and teams. Lea’s team tried coordinating rotation this way, but it ended up being too complex for the company’s needs, so they moved key management to the storage level, where keys can be managed and rotated independently of individual software teams.
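One common way to rotate keys without downtime is to tag every record with the version of the key that encrypted it, so old and new keys coexist while a background job re-encrypts records one at a time. The sketch below illustrates that approach with a hypothetical record layout and key table; it is an illustration of the general technique, not Google’s system.

```python
# Incremental key rotation with key versioning (hypothetical record layout).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

keys = {1: AESGCM.generate_key(bit_length=256),    # old key
        2: AESGCM.generate_key(bit_length=256)}    # new key
CURRENT_VERSION = 2

def encrypt(plaintext: bytes, version: int = CURRENT_VERSION) -> dict:
    nonce = os.urandom(12)
    return {'key_version': version, 'nonce': nonce,
            'ct': AESGCM(keys[version]).encrypt(nonce, plaintext, None)}

def decrypt(record: dict) -> bytes:
    # Reads work for any key version, so rotation needs no downtime.
    return AESGCM(keys[record['key_version']]).decrypt(
        record['nonce'], record['ct'], None)

def rotate(record: dict) -> dict:
    """Re-encrypt one record under the current key; run as a background sweep."""
    if record['key_version'] == CURRENT_VERSION:
        return record
    return encrypt(decrypt(record), CURRENT_VERSION)

old_record = encrypt(b'user data', version=1)
assert decrypt(rotate(old_record)) == b'user data'
```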

What do we learn from this? Lea tells us that cryptography is a tool for turning things into key management problems. She encourages us to avoid rolling our own cryptography, design privacy-preserving systems that scale, plan for key management up front, and evaluate the success of a design across the full stack, from humans all the way down to the hardware.