Archives for September 2012

Accountable Algorithms: An Example

I wrote yesterday about accountable algorithms. When I say that a public algorithm is “accountable” I mean that the output produced by a particular execution of the algorithm can be verified as correct after the fact by a skeptical member of the public. Today I want to work through an example.
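One way to make an algorithm accountable in this sense is a commit-then-reveal design: the authority publishes a cryptographic commitment to a secret before running the algorithm, makes its decisions deterministically from that secret, and later reveals the secret so anyone can re-derive and check each decision. Here is a minimal sketch of that idea; the function names, the use of SHA-256, and the 10% selection rate are all illustrative assumptions on my part, not details from the post:

```python
import hashlib

def commit(secret: bytes) -> str:
    """Hash the authority publishes BEFORE any selections are made."""
    return hashlib.sha256(secret).hexdigest()

def is_selected(secret: bytes, passenger_id: str, rate: int = 10) -> bool:
    """Deterministically select about rate% of passengers from the secret."""
    digest = hashlib.sha256(secret + passenger_id.encode()).digest()
    return int.from_bytes(digest[:8], "big") % 100 < rate

def verify(commitment: str, revealed_secret: bytes,
           passenger_id: str, claimed: bool, rate: int = 10) -> bool:
    """After the secret is revealed, any skeptic can re-check a decision."""
    return (commit(revealed_secret) == commitment
            and is_selected(revealed_secret, passenger_id, rate) == claimed)

# The authority commits in advance...
secret = b"example-secret-chosen-in-advance"
public_commitment = commit(secret)
# ...makes its selections...
decision = is_selected(secret, "passenger-42")
# ...and later reveals the secret so the public can audit the outcome.
assert verify(public_commitment, secret, "passenger-42", decision)
```

Because the commitment is fixed before any passenger is screened, the authority cannot quietly change the secret afterward to justify a different outcome; any mismatch makes `verify` return `False`.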

Accountable Algorithms

Ethan Zuckerman had an interesting reaction to his first experience with the TSA Pre-Check program, which lets frequent flyers go through a much shorter and less elaborate procedure at airport security checkpoints. Ethan’s concerns about unfairness are worth pondering, but I want to focus here on his call for more openness about the algorithm that selects people for enhanced search.

Public processes often involve algorithms, and the public has an interest in the openness of these processes. Today I want to expand on what we mean when we talk about this kind of openness. In my next post, I’ll work through a specific example, taken from airport security, and show how we can improve the public accountability of that algorithm.

Privacy Threat Model for Mobile

Evaluating privacy vulnerabilities in the mobile space can be a difficult and ad hoc process for developers, publishers, regulators, and researchers. This is due, in significant part, to the absence of a well-developed and widely accepted privacy threat model. With 1 million UDIDs posted on the Internet this past week, there is an urgent need for such a model to identify privacy vulnerabilities, assess compliance, scope potential solutions, and drive disclosure. To be sure, a number of excellent resources already list normative best practices for mobile app development. Several come readily to mind: the EFF’s Mobile Bill of Rights, the Future of Privacy Forum’s Best Practices for Mobile App Developers, and Via Forensics’ 42 Best Practices.