Archives for March 2007

Introducing All-Request Friday

Adapting an idea from Tyler Cowen, I’m going to try a new feature: on Fridays I’ll post about topics suggested by readers. Please post your suggested topics in the comments.

Manipulating Reputation Systems

BoingBoing points to a nice pair of articles by Annalee Newitz on how people manipulate online reputation systems like eBay’s user ratings, Digg, and so on.

There’s a myth floating around that such systems distill an uncannily accurate folk judgment from the votes submitted by millions of ordinary citizens. The wisdom of crowds, and all that. In fact, reputation systems are fraught with problems, and the most important systems survive because companies expend great effort to supplement the algorithms by investigating abuse and trying to compensate for it. eBay, for example, reportedly works very hard to fight abuse of its reputation system.
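To make the failure mode concrete, here is a deliberately naive sketch (not any real site’s algorithm, just a toy average-of-ratings score) showing how cheaply a handful of sock-puppet accounts can inflate a seller’s reputation:

```python
# Toy example only: a reputation score computed as a plain average of 1-5 star
# ratings, with no weighting, rate limiting, or abuse detection.

def naive_reputation(ratings):
    """Return the average of a list of 1-5 star ratings."""
    return sum(ratings) / len(ratings)

# A seller with a mediocre real track record...
real_ratings = [3, 2, 4, 3, 2, 3]
print(round(naive_reputation(real_ratings), 2))   # 2.83

# ...creates ten sock-puppet accounts that each leave a 5-star rating.
fake_ratings = [5] * 10
print(round(naive_reputation(real_ratings + fake_ratings), 2))  # 4.19
```

Real services layer defenses on top of the raw tally, such as weighting by transaction history, discounting brand-new accounts, and manual review, precisely because the bare computation is this easy to distort.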

Why do people put more faith in reputation systems than the systems really deserve? One reason is the compelling but not entirely accurate analogy to the power of personal reputations in small town gossip networks. If a small-town merchant is accused of cheating a customer, everyone in town will find out quickly and – here’s where the analogy goes off the rails – individual townspeople will make nuanced judgments based on the details of the story, the character of the participants, and their own personal experiences. The reason this works is that the merchant, the customer, and the person evaluating the story are embedded in a complex, densely interconnected network.

When the network of participants gets much bigger and the interconnections much sparser, there is no guarantee that the same system will still work. Even if it does work, a large-scale system might succeed for different reasons than the small-town system. What we need is some kind of theory: some kind of explanation for why a reputation system can succeed. Our theory, whatever it is, will have to account for the desires and incentives of participants, the effect of relevant social norms, and so on.

The incentive problem is especially challenging for recommendation services like Digg. Digg assumes that users will cast votes for the sites they like. If I vote for sites that I really do like, this will mostly benefit strangers (by helping them find something cool to read). But if I sell my votes or cast them for sites run by my friends and me, I will benefit more directly. In short, my incentive is to cheat. These sorts of problems seem likely to get worse as a service grows, because the stakes will grow and the sense of community may weaken.
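As a rough illustration of why colluding pays, here is a toy simulation (purely hypothetical numbers, not Digg’s actual ranking logic) of the early-voting window in which front-page decisions tend to be made. A genuinely better story collects honest votes one reader at a time, while a ring of fifteen colluders casts its votes unconditionally:

```python
import random

# Toy model: in the first hour after submission, only a small slice of honest
# users sees a new story, and each upvotes it with probability equal to the
# story's "quality". A ring of 15 colluders always upvotes its own story,
# whatever its quality.
random.seed(1)

def early_organic_votes(quality, early_audience=200):
    """Honest early votes: each viewer independently upvotes with probability `quality`."""
    return sum(random.random() < quality for _ in range(early_audience))

good_story = early_organic_votes(quality=0.05)       # well liked: ~10 votes on average
ring_story = early_organic_votes(quality=0.01) + 15  # mediocre, plus 15 guaranteed ring votes

print("good story, honest votes only:", good_story)
print("mediocre story, with voting ring:", ring_story)
```

The numbers are invented, but the structure of the problem is real: if ranking rewards early vote counts, a few guaranteed votes from a ring can outweigh diffuse, genuine approval.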

It seems to me that reputation systems are a fruitful area for technical, economic, and social research. I know there is research going on already – and readers will probably chastise me in the comments for not citing it all – but we’re still far from understanding online reputation.