It looks like the idea of requiring law enforcement access to encrypted data is back in the news, with the UK government apparently pushing for access in the wake of the recent London attack. With that in mind, let’s talk about how one can go about analyzing a proposed access mandate.
The first thing to recognize is that although law enforcement is often clear about what result they want–getting access to encrypted data–they are often far from clear about how they propose to get that result. There is no magic wand that can give encrypted data to law enforcement and nobody else, while leaving everything else about the world unchanged. If a mandate were to be imposed, this would happen via regulation of companies’ products or behavior.
The operation of a mandate would necessarily be a three-stage process: the government imposes specific mandate language, which induces changes in product design and behavior by companies and users, thereby leading to consequences that affect the public good.
Expanding this a bit, we can lay out some questions that a mandate proposal should be prepared to answer:
- mandate language: What requirements are imposed, and on whom? Which types of devices and products are covered and which are not? What specifically is required of a device maker? Of an operating system developer? Of a network provider? Of a retailer selling devices? Of an importer of devices? Of a user?
- changes in product design and behavior: How will companies and users react to the mandate? For example, how will companies change the design of their products to comply with the mandate while maintaining their competitive position and serving their customers? How will criminals and terrorists change their behavior? How will law-abiding users adapt? What might foreign governments do to take advantage of these changes?
- consequences: What consequences will result from the design and behavioral changes that are predicted? How will the changes affect public safety? Cybersecurity? Personal privacy? The competitiveness of domestic companies? Human rights and free expression?
These questions are important because they expose the kinds of tradeoffs that would have to be made in imposing a mandate. As an example, covering a broad range of devices might allow recovery of more encrypted data (with a warrant), but it might be difficult to write requirements that make sense across a broad spectrum of different device types. As another example, all of the company types that you might regulate come with challenges: some are mostly located outside your national borders, others lack technical sophistication, others touch only a subset of the devices of interest, and so on. Difficult choices abound–and if you haven’t thought about how you would make those choices, then you aren’t in a position to assert that the benefits of a mandate are worth the downsides.
To date, the FBI has not put forward any specific approach. Nor has the UK government, to my knowledge. All they have offered in their public statements are vague assertions that a good approach must exist.
If our law enforcement agencies want to have a grown-up conversation about encryption mandates, they can start by offering a specific proposal, at least for purposes of discussion. Then the serious policy discussion can begin.
There’s also the whole issue of open source. If you’re giving software away for free, can the government mandate what kinds of encryption you use? Can they make it illegal to write certain kinds of code? If so, how will we ever get progress in the field of security? And wouldn’t the bad guys just use the ‘outlawed’ techniques? They don’t have to follow the rules!
If your product is built on top of an open source OS, can they mandate what’s in the OS? How can you prevent users from changing it out, if they have access to the OS (whether you intended them to have access or not)? Could you just offer a weakly encrypted product, and provide instructions on a web site (outside that particular government’s jurisdiction) on how to replace the encryption with a stronger solution?
This is an impossible problem, and trying to legislate a solution will only cause unintended bad consequences. No doubt it will be used as further justification to block people from fixing or repurposing the devices they own, giving companies another way to lock users down (just like the DMCA!).
This is a good example of the kind of tradeoff that the designer of a mandate will have to make. Do you require hardware/OS designers to prevent users from modifying the OS on a system or installing their own OS? If so, you’re making open source operating systems impossible. And you have to think about how system designers could try to enforce that kind of requirement.
On the other hand, if you don’t prohibit systems with modifiable/replaceable operating systems, then you have to think about whether you are creating ways for users to easily circumvent your mandate.
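To make the circumvention point concrete, here is a hypothetical sketch (all names invented for illustration) of why a mandate is hard to enforce when the encryption layer is user-replaceable: a vendor ships a deliberately weak "compliant" cipher behind a pluggable interface, and a user with access to the code swaps in a stronger one with a one-line change. The `StreamCipher` here is a toy built from a hashed keystream, not production cryptography; it only illustrates that the two classes are interchangeable.

```python
import hashlib
import os

class XorCipher:
    """Deliberately weak 'mandate-compliant' cipher: repeating-key XOR."""
    def __init__(self, key: bytes):
        self.key = key
    def encrypt(self, plaintext: bytes) -> bytes:
        return bytes(b ^ self.key[i % len(self.key)] for i, b in enumerate(plaintext))
    decrypt = encrypt  # XOR with the same keystream is its own inverse

class StreamCipher:
    """Stronger drop-in replacement: SHA-256 counter-mode keystream.
    Illustrative only -- not production crypto."""
    def __init__(self, key: bytes):
        self.key = key
    def _keystream(self, n: int, nonce: bytes) -> bytes:
        out = b""
        counter = 0
        while len(out) < n:
            out += hashlib.sha256(self.key + nonce + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:n]
    def encrypt(self, plaintext: bytes) -> bytes:
        nonce = os.urandom(16)  # fresh random nonce per message
        ks = self._keystream(len(plaintext), nonce)
        return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))
    def decrypt(self, ciphertext: bytes) -> bytes:
        nonce, body = ciphertext[:16], ciphertext[16:]
        ks = self._keystream(len(body), nonce)
        return bytes(a ^ b for a, b in zip(body, ks))

# The "mandated" default the vendor ships...
cipher = XorCipher(b"weakkey")

# ...which a user following web instructions replaces in one line:
cipher = StreamCipher(os.urandom(32))

msg = b"meet at noon"
assert cipher.decrypt(cipher.encrypt(msg)) == msg
```

Because both classes expose the same `encrypt`/`decrypt` interface, nothing else in the system has to change, which is exactly the enforcement problem the comment above describes.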