April 1, 2015

Facebook’s Emotional Manipulation Study: When Ethical Worlds Collide

The research community is buzzing about the ethics of Facebook’s now-famous experiment in which it manipulated the emotional content of users’ news feeds to see how that would affect users’ activity on the site. (The paper, by Adam Kramer of Facebook, Jamie Guillory of UCSF, and Jeffrey Hancock of Cornell, appeared in Proceedings of the National Academy of Sciences.)

The main dispute seems to be between people such as James Grimmelmann and Zeynep Tufekci, who see this as a clear violation of research ethics, and people such as Tal Yarkoni, who see it as consistent with ordinary practice for a big online company like Facebook.

One explanation for the controversy is the large gap between the ethical standards of industry practice and the research community's ethical standards for human-subjects studies.
[Read more...]

On Decentralizing Prediction Markets and Order Books

In a new paper, to be presented next week at WEIS by Jeremy Clark, we discuss the challenges of designing truly decentralized prediction markets and order books. Prediction markets allow participants to trade shares in future events (such as "Will the USA advance to the knockout stage of the 2014 World Cup?") and turn a profit from accurate predictions. Prediction markets have been studied extensively by economists and have significant social value, providing accurate forecasts of future events.

Prediction markets have traditionally been run by centralized entities that hold all of their users' funds and shares in escrow, generally don't allow trades to be routed through different exchange services, and make many important decisions: which events to open a market for, what the correct outcome is, and how to match buyers with sellers. Our work examines the extent to which these tasks can be decentralized to reduce trust in any single entity and to increase transparency, fault tolerance, and flexibility. Bitcoin's success as a decentralized ledger of financial transactions suggests that a decentralized prediction market may be within reach. [Read more...]

Cognitive disconnect: Understanding Facebook Connect login permissions

[Nicky Robinson is an undergraduate whose Junior Independent Work project, advised by Joseph Bonneau, turned into a neat research paper. — Arvind Narayanan]

When you use the Facebook Connect [1] login system, another website may ask for permission to “post to Facebook for you.” But what does this message mean? If you click “Okay”, what can the site do to your profile?

Motivated by this confusion, we explored Facebook Connect login permissions with the twin goals of understanding what permissions websites are given when a user logs in with Facebook and whether users understand that they are authorizing those permissions. Here is a working draft of our research report.
[Read more...]

Bitcoin Mining Now Dominated by One Pool

The big news in the Bitcoin world is that one entity, called GHash, appears to control more than half of all mining power. Part of Bitcoin's appeal has been its distributed nature: the idea that no one party is in control and the system operates through the cooperative action of a large community. The worry now is that GHash has too much power and that this could destabilize the Bitcoin system. Today I want to explain what has happened, why it provokes worry, and how I see the situation.
[Read more...]

Encryption as protest

As a computer scientist who studies Privacy-Enhancing Technologies, I remember my surprise when I first learned that some groups of people view and use them very differently than I’m used to. In computer science, PETs are used for protecting anonymity or confidentiality, often via application of cryptography, and are intended to be bullet-proof against an adversary who is trying to breach privacy.

By contrast, Helen Nissenbaum and others have developed a political and ethical theory of obfuscation [1], “a strategy for individuals, groups or communities to hide; to protect themselves; to protest or enact civil disobedience, especially in the context of monitoring, aggregated analysis, and profiling.” CV Dazzle and Ad Nauseam are good examples.

[Read more...]

Why King George III Can Encrypt

[This is a guest post by Wenley Tong, Sebastian Gold, Samuel Gichohi, Mihai Roman, and Jonathan Frankle, undergraduates in the Privacy Technologies seminar that I offered for the second time in Spring 2014. They did an excellent class project on the usability of email encryption.]

PGP and similar email encryption standards have existed since the early 1990s, yet even in the age of NSA surveillance and ubiquitous data-privacy concerns, we continue to send email in plain text. Researchers have attributed this apparent gaping hole in our security infrastructure to a deceptively simple source: usability. Email encryption, although cryptographically straightforward, appears too complicated for laypeople to understand. In our project, we aimed to understand why this problem has eluded researchers for well over a decade and to expand the design space of possible solutions to this and similar challenges at the intersection of security and usability.

[Read more...]

If Robots Replace Lawyers, Will Politics Calm Down?

[TL;DR: Probably not.]

A recent essay from law professor John McGinnis, titled “Machines v. Lawyers,” explores how machine learning and other digital technologies may soon reshape the legal profession, and by extension, how they may change the broader national policy debate in which lawyers play such key roles.

His topic and my life seem closely related: After law school, instead of taking the bar, I became a consultant to public interest organizations and governments on the intersection of computing, law and public policy.

McGinnis sees computing as an increasingly compelling substitute for many of the most routine tasks currently done by human lawyers, and on that he must be right: “[T]he large number of journeyman lawyers—such as those who do routine wills, vet house closings, write standard contracts, or review documents on a contractual basis—face a bleak future” as automation increasingly supplants their daily work.

But what about the more difficult cognitive work of the law — how much difference will technology make there? [Read more...]

Wickr: Putting the “non” in anonymity

[Let's welcome new CITP blogger Pete Zimmerman, a first-year graduate student in the computer security group at Princeton. — Arvind Narayanan]

Following the revelations of wide-scale surveillance by US intelligence agencies and their allies, a myriad of services offering end-to-end encrypted communications have cropped up to take advantage of the increasing demand for privacy from surveillance. When coupled with anonymity, end-to-end encryption can prevent a central service provider from obtaining any information about its users or their communications.  However, maintaining anonymity is difficult while simultaneously offering a straightforward way for users to find each other.

Enter Wickr. This startup offers a simple app featuring “military grade encryption” of text, photo, video, and voice messages as well as anonymous registration for its users. Wickr claims that it cannot identify who has registered with the service or which of its users are communicating with each other. During registration, users enter their email address and/or phone number (non-Wickr IDs). The app applies a cryptographic hash function (SHA-256 in this case) to the non-Wickr IDs to derive “anonymous” Wickr IDs. Wickr IDs are then stored server-side and used for discovery. When your friends want to find you, they enter your phone number or email address, which is put through the same hash function, yielding the same output (your Wickr ID). Wickr looks this up in its database to determine whether you've registered with the service, and if so, facilitates message exchange. This process simplifies the discovery of other users, supposedly without giving Wickr the ability to identify the users of its anonymous service.

The problem here is that while it's not always possible to determine the input to a hash function given the output, we can leverage the fact that the same input always yields the same output. If the number of possible inputs is small, we can simply try all of them. Unfortunately, this is a recurring theme in a variety of applications, the result of a common misunderstanding of cryptography — specifically, that hash functions do not effectively hide their inputs when the input space is small enough to enumerate. A great explanation of the use of cryptographic hash functions in attempts to anonymize data can be found here.
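The brute-force attack described above is easy to sketch. Below is a minimal Python illustration; the exact preprocessing Wickr applies before hashing (number formatting, any salting, etc.) is not public, so the `wickr_id` function here is an assumption for demonstration purposes only.

```python
import hashlib

def wickr_id(identifier: str) -> str:
    """Hash a phone number or email into a Wickr-style 'anonymous' ID.
    (Assumed scheme: plain SHA-256 of the raw identifier.)"""
    return hashlib.sha256(identifier.encode()).hexdigest()

# Server-side, only the hash is stored -- not the phone number itself.
stored = wickr_id("+15551234567")

# But anyone holding the stored hashes can enumerate the small input
# space (e.g. all phone numbers in a region) and compare outputs.
def recover(target_hash, candidates):
    for identifier in candidates:
        if wickr_id(identifier) == target_hash:
            return identifier
    return None

# Trying every number in a single exchange (10,000 candidates) is
# effectively instant on commodity hardware.
guess = recover(stored, (f"+1555123{n:04d}" for n in range(10_000)))
```

Scaling this up, all ten billion possible ten-digit phone numbers can be hashed in hours on a single machine, which is why hashing alone does not anonymize identifiers drawn from a small, structured space.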
[Read more...]