Facebook is once again clashing with its users over privacy. As a user myself, I was pretty unhappy about the recently changed privacy controls. I felt that Facebook was trying to trick me into loosening controls on my information. Though the initial letter from Facebook founder Mark Zuckerberg painted the changes as pro-privacy — which led more than 48,000 users to click the “I like this” button — the actual effect of the company’s suggested new policy was to allow more public access to information. Though the company has backtracked on some of the changes, problems remain.
Some of you may be wondering why Facebook users are complaining about privacy, given that the site’s main use is to publish private information about yourself. But Facebook is not really about making your life an open book. It’s about telling the story of your life. And like any autobiography, your Facebook story will include a certain amount of spin. It will leave out some facts and will likely offer more and different levels of detail depending on the audience. Some people might not get to hear your story at all. For Facebook users, privacy means not the prevention of all information flow, but control over the content of their story and over who gets to read it.
So when Facebook tries to monetize users’ information by passing that information along to third parties, such as advertisers, users get angry. That’s what happened two years ago with Facebook’s ill-considered Beacon initiative: Facebook started telling advertisers what you had done — telling your story to strangers. But perhaps even worse, Facebook sometimes added items to your wall about what you had purchased — editing your story, without your permission. Users revolted, and Facebook shuttered Beacon.
Viewed through this lens, Facebook’s business dilemma is clear. The company is sitting on an ever-growing treasure trove of information about users. Methods for monetizing this information are many and obvious, but virtually all of them require either telling users’ stories to third parties or modifying users’ stories — steps that would break users’ mental model of Facebook, triggering more outrage.
What Facebook has, in other words, is a governance problem. Users see Facebook as a community in which they are members. Though Facebook (presumably) has no legal obligation to get users’ permission before instituting changes, it makes business sense to consult the user community before making significant changes in the privacy model. Announcing a new initiative, only to backpedal in the face of user outrage, can’t be the best way to maximize long-term profits.
The challenge is finding a structure that allows the company to explore new business opportunities while at the same time securing truly informed consent from the user community. Some kind of customer advisory board seems like an obvious approach. But how would its members be chosen? And how much information and power would they get? This isn’t easy to do. But the current approach isn’t working either. If your business is based on user buy-in to an online community, then you have to give that community some kind of voice — you have to make it a community that users want to inhabit.