Archives for December 2006

Holiday Stories

It’s time for our holiday hiatus. See you back here in the new year.

As a small holiday gift, we’re pleased to offer updated versions of some classic Christmas stories.

How the Grinch Pwned Christmas: The Grinch, determined to stop Christmas, hacks into Amazon’s servers and cancels all deliveries to Who-ville. The Whos celebrate anyway, gathering in a virtual circle and exchanging user-generated content. When the Grinch sees this, his heart grows two sizes and he priority-ships replacement gifts to Who-ville.

Rudolph the Net-Nosed Reindeer: Rudolph is shunned by his reindeer peers for having a goofy WiFi-enabled nose. But he becomes a hero one foggy Christmas Eve by using the nose to access Google Maps, helping Santa navigate to the homes of good children.

Gift of the eMagi: A poor husband and wife each find the perfect gift for the other and bid aggressively for it on eBay. Unbeknownst to them, they’re bidding against each other for the same item. Determined to express their love by paying whatever it takes, they bid themselves into bankruptcy.

NSA Claus is Coming to Town: He sees you when you’re sleeping. He knows when you’re awake. He knows if you’ve been bad or good, so be good or go to Gitmo.

The Little DRM-er Boy: A boy wants to share his recorded drum solo with Baby Jesus, but the file is tethered to a faraway computer. With the aid of three downloads from the East, he rips an MP3 and emails it to Mary and Joseph just in time for Christmas Night.

It’s a Wonderful Second Life: George Bailey believes that Second Life would have been better if he had never signed on at all. He jumps off a bridge … and floats slowly to the ground. Clarence Linden, George’s guardian avatar, restores the server backup from before George signed on, and watches with George while griefers run wild. George sees the error of his ways, and Clarence restores his account.

A Vista Carol: Ebenezer “Steve” Ballmer runs a coding shop in Merry Old Redmond. He forces programmer Bob Cratchit to work overtime on Christmas to meet the Vista ship date. At night, Ballmer is visited by three Ghost images: Windows Past, Windows Present, and Windows Future. [Fill in your own jokes here.] The next morning, Ballmer sends Bob home for Christmas, in exchange for a promise to keep his BlackBerry on during dinner.

[Thanks to Alex Halderman and my family for help writing the stories.]

Sharecropping 2.0? Not Likely

Nick Carr has an interesting post arguing that sites like MySpace and Facebook are essentially high-tech sharecropping, exploiting the labor of the many to enrich the few. He’s wrong, I think, but in an instructive way.

Here’s the core of his argument:

What’s being concentrated, in other words, is not content but the economic value of content. MySpace, Facebook, and many other businesses have realized that they can give away the tools of production but maintain ownership over the resulting products. One of the fundamental economic characteristics of Web 2.0 is the distribution of production into the hands of the many and the concentration of the economic rewards into the hands of the few. It’s a sharecropping system, but the sharecroppers are generally happy because their interest lies in self-expression or socializing, not in making money, and, besides, the economic value of each of their individual contributions is trivial. It’s only by aggregating those contributions on a massive scale – on a web scale – that the business becomes lucrative. To put it a different way, the sharecroppers operate happily in an attention economy while their overseers operate happily in a cash economy. In this view, the attention economy does not operate separately from the cash economy; it’s simply a means of creating cheap inputs for the cash economy.

As Mike at Techdirt observes, it’s a mistake to think of the attention economy and the cash economy as separate. Attention can be converted into cash – that’s what advertising does – and vice versa. Often it’s hard to distinguish attention-seekers from cash-seekers: is that guy eating bugs on Survivor doing it for attention or money?

It’s a mistake, too, to think that MySpace provides nothing of real value to its users. I think of MySpace as a low-end Web hosting service. Most sites, including this blog, pay a hosting company to manage servers, store content, serve out pages, and so on. If all you want is to put up a few pages, full-on hosting service is overkill. What you want instead is a simple system optimized for ease of use, and that’s basically what MySpace provides. Because it provides less than a real hosting service, MySpace can offer a more attractive price point – zero – which has the additional advantage of lowering transaction costs.

The most interesting assumption Carr makes is that MySpace is capturing most of the value created by its users’ contributions. Isn’t it possible that MySpace’s profit is small, compared to the value that its users get from using the site?

Underlying all of this, perhaps, is a common but irrational discomfort with transactions where no cash changes hands. It’s the same discomfort we see in some weak critiques of open-source, which look at a free-market transaction involving copyright licenses and somehow see a telltale tinge of socialism, just because no cash changes hands in the transaction. MySpace makes a deal with its users. Based on the users’ behavior, they seem to like the deal.

Soft Coercion and the Secret Ballot

Today I want to continue our discussion of the secret ballot. (Previous posts: 1, 2.) One purpose of the secret ballot is to prevent coercion: if ballots are strongly secret, then the voter cannot produce evidence of how he voted, and so can safely lie to a would-be coercer.

Talk about coercion usually centers on lead-pipe scenarios, where somebody issues a direct threat to a voter: “Nice kneecaps you have there … be a shame if something unfortunate should happen to them.”

But coercion needn’t be so direct. Consider this scenario: Big Johnny is a powerful man in town. Disturbing rumors swirl around him, but nothing has ever been proven. Big Johnny is pals with the mayor, and it’s no secret that Big Johnny wants the mayor reelected. The word goes around town that Big Johnny can tell how you vote, though nobody is quite sure how he does it. When you get to the polling place, Big Johnny’s cousin is one of the poll workers. You’re no fan of the mayor, but you don’t know much about his opponent. How do you vote?

What’s interesting about this scenario is that it doesn’t require Big Johnny to do anything. No lawbreaking is necessary, and the scheme works even if Big Johnny can’t actually tell how you vote, as long as the rumor that he can is at all plausible. You’re free to vote for the other guy, but Big Johnny’s influence will tend to push your vote toward the mayor. It’s soft coercion.

This sort of scheme would work today. E-voting systems are far from transparent. Do you know what is recorded in the machine’s memory cartridge? Big Johnny’s pals can get the cartridge. Is your vote time-stamped? Big Johnny’s cousin knows when you voted. Are the votes recorded in the order they were cast? Big Johnny’s cousin knows that you were the 37th voter today.
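
To see how little Big Johnny actually needs, here’s a small sketch in Python. Everything in it is hypothetical: the names, the check-in times, and the layout of the memory cartridge are invented, and it doesn’t describe any real voting machine. The point is only that a timestamped vote record, combined with a poll worker’s note of when each voter signed in, is enough to reconstruct individual votes.

```python
# Hypothetical sketch: linking a timestamped vote record to voters.
# The names, times, and record layout are invented for illustration
# and do not describe any real voting machine.

from datetime import datetime, timedelta

# What Big Johnny's cousin can note at the sign-in table.
check_ins = [
    ("Alice", datetime(2006, 11, 7, 9, 2)),
    ("Bob",   datetime(2006, 11, 7, 9, 14)),
    ("Carol", datetime(2006, 11, 7, 9, 41)),
]

# What a timestamped memory cartridge might contain.
cartridge = [
    (datetime(2006, 11, 7, 9, 5),  "Mayor"),
    (datetime(2006, 11, 7, 9, 16), "Challenger"),
    (datetime(2006, 11, 7, 9, 43), "Mayor"),
]

# Match each voter to the vote cast shortly after his or her check-in.
WINDOW = timedelta(minutes=5)
for name, checked_in in check_ins:
    for cast_at, choice in cartridge:
        if timedelta(0) <= cast_at - checked_in <= WINDOW:
            print(f"{name} most likely voted for {choice}")
            break
```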

Paper ballots aren’t immune to such problems, either. Are you sure the blank paper ballot they gave you wasn’t marked? Remember: scanners can see things you can’t. And high-res scanners might be able to recognize tiny imperfections in that sheet of paper, or distinctive ink-splatters in its printing. Sure, the ballots are counted by hand, right there in the precinct, but what happens to them afterward?

There’s no perfect defense against this problem, but a good start is to insist on transparency in the election technology, and to research useful technologies and procedures. It’s a hard problem, and we have a long way to go.

Voting, Secrecy, and Phonecams

Yesterday I wrote about the recent erosion of the secret ballot. One cause is the change in voting technology, especially voting by mail. But even if we don’t change our voting technology at all, changes in other technologies are still eroding the secret ballot.

Phonecams are a good example. You probably carry into the voting booth a silent camera, built into a mobile phone, that can transmit photos around the world within seconds. Many phones can shoot movies, making it even easier to document your vote. Here is an example shot in 2004.

Could such a video be faked? Probably. But if your employer or union boss threatens your job unless you deliver a video of yourself voting “correctly”, will you bet your job that your fake video won’t be detected? I doubt it.

This kind of video recording subverts the purpose of the voting booth. The booth is designed to ensure the secret ballot by protecting voters from being observed while voting. Now a voter can exploit the privacy of the voting booth to create evidence of his vote. It’s not an exact reversal – at least the phonecam attack requires the voter’s participation – but it’s close.

One oft-suggested approach to fighting this problem is to have a way to revise your vote later, or to vote more than once with only one of the votes being real. This approach sounds promising at first, but it seems to cause other problems.

For example, imagine that you can get as many absentee ballots as you want, but only one of them counts and the others will be ignored. Now if somebody sees you complete and mail in a ballot, they can’t tell whether they saw your real vote. But if this is going to work, there must be no way to tell, just by looking at a ballot, whether it is real. The Board of Elections can’t send you an official letter saying which ballot is the real one – if they did, you could show that letter to a third party. (They could send you multiple letters, but that wouldn’t help – how could you tell which letter was the real one?) They can notify you orally, in person, but that makes it harder to get a ballot and lets the clerk at the Board of Elections quietly disenfranchise you by lying about which ballot is real.

(I’m not saying this problem is impossible to solve, only that (a) it’s harder than you might expect, and (b) I don’t know a solution.)
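
To make the difficulty concrete, here’s a toy model in Python. The ballot IDs, the registrar’s private record, and the confirmation letter are all hypothetical; the sketch just restates the reasoning above: the ballots themselves must be indistinguishable, yet any written confirmation of which one counts is exactly the kind of proof a coercer can demand.

```python
# Toy model of the "request many ballots, only one counts" idea.
# Everything here (ballot IDs, the registrar's private record, the
# letter) is hypothetical; it models the reasoning in the post,
# not any real election system.

import secrets

def issue_ballots(n):
    """Registrar issues n indistinguishable ballot IDs and privately
    records which one will actually be counted."""
    ballot_ids = [secrets.token_hex(4) for _ in range(n)]
    real_id = secrets.choice(ballot_ids)  # kept only in the registrar's records
    return ballot_ids, real_id

ballots, real = issue_ballots(3)

# From the voter's (or a coercer's) point of view, nothing on the
# ballots distinguishes the real one -- which is exactly the point.
print("ballots in hand:", ballots)

# But any confirmation sent to the voter is itself transferable proof:
letter = f"Official notice: ballot {real} counts; the others will be ignored."
print(letter)  # a coercer can simply demand to see this letter
```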

Approaches where you can cancel or revise your vote later have similar problems. There can’t be a “this is my final answer” button, because you could record yourself pushing it. But if there is no way to rule out later revisions to your vote, then you have to worry about somebody else coming along later and changing your vote.

Perhaps the hardest problem in voting system design is how to reconcile the secret ballot with accuracy. Methods that protect secrecy tend to undermine accuracy, and vice versa. Clever design is needed to get enough secrecy and enough accuracy at the same time. Technology seems to be making this tradeoff even nastier.

Erosion of the Secret Ballot

Voting technology has changed greatly in recent years, leading to problems with accuracy and auditability. These are important, but another trend has gotten less attention: the gradual erosion of the secret ballot.

It’s useful to distinguish two separate conceptions of the secret ballot. Let’s define weak secrecy to mean that the voter has the option of keeping his ballot secret, and strong secrecy to mean that the voter is forced to keep his ballot secret. To put it another way, weak secrecy means the ballot is secret if the voter cooperates in maintaining its secrecy; strong secrecy means the ballot is secret even if the voter wants to reveal it.

The difference is important. No system can stop a voter from telling somebody how he voted. But strong secrecy prevents the voter from proving how he voted, whereas weak secrecy does not rule out such a proof. Strong secrecy therefore deters vote buying and coercion, by stopping a vote buyer from confirming that he is getting what he wants – a voter can take the payment, or pretend to knuckle under to the coercion, while still voting however he likes. With weak secrecy, the buyer or coercer can demand proof.

In theory, our electoral system is supposed to provide strong secrecy, as a corrective to an unfortunate history of vote buying and coercion. But in practice, our system provides only weak secrecy.

The main culprit is voting by mail. A mail-in absentee ballot is only weakly secret: the voter can mark and mail the ballot in front of a third party, or simply hand the blank ballot to the third party to fill out. Any voter who wants to reveal his vote can request an absentee ballot. (Some states allow absentee voting only for specific reasons, but in practice people who are willing to sell their votes will also be willing to lie about their justification for absentee voting.)

Strong secrecy seems to require the voter to cast his ballot in a private booth, which can only be guaranteed at an officially run polling place.

The trend toward voting by mail is just one of the forces eroding the secret ballot. Some e-voting technologies fail to provide even weak secrecy, for example by recording ballots in the order they were cast, thereby allowing officials or pollwatchers who record the order of voters’ appearance (as happens in many places) to connect each recorded vote to a voter.
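
To make that linkage concrete, here’s a tiny hypothetical example in Python (names and votes invented): all an attacker needs is the pollwatcher’s list of voters in order of appearance and the machine’s ballots in cast order, lined up side by side.

```python
# Hypothetical illustration (names and votes invented): ballots stored
# in cast order, matched against a pollwatcher's list of voters in the
# order they appeared.

voters_in_order = ["Alice", "Bob", "Carol", "Dave"]        # pollwatcher's notes
ballots_in_order = ["Smith", "Jones", "Smith", "Smith"]    # machine's stored record

for voter, ballot in zip(voters_in_order, ballots_in_order):
    print(f"{voter}'s recorded vote: {ballot}")
```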

Worse yet, even if a complex voting technology does protect secrecy, this may do little good if voters aren’t confident that the system really protects them. If everybody “knows” that the party boss can tell who votes the wrong way, the value of secrecy will be lost no matter what the technology does. For this reason, the trend toward complex black-box technologies may neutralize the benefits of secrecy.

If secrecy is being eroded, we can respond by trying to restore it, or we can decide instead to give up on secrecy or fall back to weak secrecy. Merely pretending to enforce strong secrecy looks like a recipe for bad policy.

(Thanks to Alex Halderman and Harlan Yu for helpful conversations on this topic.)