Archives for October 2004

LAMP and Regulatory Arbitrage

Today, MIT’s LAMP system goes back on line, with a new design. LAMP (“Library Access to Music Project”) streams music to the MIT campus via the campus cable TV system. Any student can connect to LAMP’s website and choose a sequence of songs. The chosen songs are then scheduled for playing on one of sixteen campus TV channels.

According to MIT, transmission of music via LAMP is legal because it is covered by music licenses that MIT has purchased in connection with the campus radio station. In other words, LAMP is just like another set of sixteen campus radio stations that happen to be controllable by MIT students across the Web. I don’t know whether this legal argument is correct, but it sounds plausible and MIT appears to stand behind it.

You may recall that LAMP launched last year but was shut down a few days later when copyright owners argued that LoudEye, which had sold MIT digital files to use in that incarnation of LAMP, did not have the legal right to sell those files for such uses.

Now LAMP is back, with the original design’s efficient digital back end replaced by a new setup in which an array of low-end CD jukeboxes is controlled by special computers. This allows LAMP to get its music from ordinary CDs, as many radio stations do.

From an engineering standpoint, the new design of LAMP is overly complex, fragile, and inefficient. That’s not surprising, because lawyers must have had a big impact on the design.

LAMP is a great example of regulatory arbitrage – the adoption of otherwise-inefficient behavior in order to shift from one legal or regulatory regime to another. There’s one set of copyright rules for radio stations and another set for webcasters. LAMP transmits music over the cable-TV system, rather than the more efficient Internet system, in order to stay on the radio-station side of the line. There’s one set of rules for direct access to digital music on CDs and another set of rules for copies stored on hard disks. LAMP uses CDs in jukeboxes, rather than more efficient hard-disk storage, in order to stay on the CD side of that legal line.

We’re going to see more and more of this kind of regulatory arbitrage by engineers. Copyright law is getting more complicated and is accumulating more technology-specific rules, so there are more and more legal lines across which designers will want to step. At the same time, technology is becoming more powerful and more flexible, giving designers an ever wider menu of design options. The logical outcome is a twisting of technology design to satisfy predetermined legal categories rather than engineering efficiency.

Tit for Tat

Recent news stories, picked up all over blogland, reported that Tit-for-Tat has been dethroned as the best strategy in iterated prisoners’ dilemma games. In a computer tournament, a team from Southampton University won with a new strategy, beating the Tit-for-Tat strategy for the first time.

Here’s the background. Prisoners’ Dilemma is a game with two players. Each player chooses a move, which is either Cooperate or Defect. Then the players reveal their moves to each other. If both sides Cooperate, they each get three points. If both Defect, they each get one point. If one player Cooperates and the other Defects, then the defector gets five points and the cooperator gets none. The game is interesting because no matter what one’s opponent does, one is better off choosing to Defect; but the most mutually beneficial result occurs when both players Cooperate.
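The payoff rules above are small enough to write down directly. Here is a minimal sketch of a single round, using the point values from the description (moves are "C" for Cooperate and "D" for Defect):

```python
# Payoff table for one round of Prisoners' Dilemma, using the
# values from the text: mutual cooperation pays 3 each, mutual
# defection 1 each, and a lone defector gets 5 to the cooperator's 0.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("D", "D"): (1, 1),
    ("D", "C"): (5, 0),
    ("C", "D"): (0, 5),
}

def play_round(move_a, move_b):
    """Return (points for player A, points for player B)."""
    return PAYOFFS[(move_a, move_b)]
```

Note that for either player, switching to "D" never lowers that player’s own payoff in a single round, which is exactly the dilemma described above.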

Things get more interesting when you iterate the game, so that the same pair of players plays many times in a row. A player can then base its strategy on what the opponent has done recently, which changes the opponent’s incentives in subtle ways. This game is an interesting abstract model of adversarial social relationships, so people are interested in understanding its strategy tradeoffs.

For at least twenty years, the best-looking strategy has been Tit-for-Tat, in which one starts out by Cooperating and then copies whatever action the opponent used last. This strategy offers an appealing combination of initial friendliness with measured retaliation for an opponent’s Defections. In tournaments among computer players, Tit-for-Tat won consistently.

But this year, the Southampton team unveiled a new strategy that won the latest tournament. Many commentators responded by declaring that Tit-for-Tat had been dethroned. But I think that conclusion is wrong, for reasons I’ll explain.

But first, let me explain the new Southampton strategy. (This is based on press accounts, but I’m confident that it’s at least pretty close to correct.) They entered many players in the tournament. Their players divide into two groups, which I’ll call Stars and Stooges. The Stars try to win the tournament, and the Stooges sacrifice themselves so the Stars can win. When facing a new opponent, one of these players starts out by making a distinctive sequence of moves. Southampton’s players watch for this distinctive sequence, which allows them to tell whether their opponents are other Southampton players. When two Southampton players are playing each other, they collude to maximize their scores (or at least the score of the Star(s), if any, among them). When a Star plays an outsider, it tries to score points normally; but when a Stooge plays an outsider, it always Defects, to minimize the opponent’s score. Thus the Stooges sacrifice themselves so that the Stars can win. And indeed, the final results show a few Stars at the top of the standings (above Tit-for-Tat players) and a group of Stooges near the bottom.
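Based on the description above, the scheme can be sketched as a strategy factory. Everything specific here is assumed for illustration: the five-move signature, the role behaviors, and the fallback to Tit-for-Tat for Stars were not published, and real Star-vs-Star play was presumably handled more carefully than this:

```python
# Hypothetical recognition signature; the real sequence wasn't published.
HANDSHAKE = ["C", "D", "C", "C", "D"]

def make_southampton(role):
    """Build a Star or Stooge strategy (role names from the post;
    all behavioral details here are assumptions)."""
    def strategy(my_history, opp_history):
        n = len(my_history)
        if n < len(HANDSHAKE):
            return HANDSHAKE[n]  # announce ourselves with the signature
        if opp_history[:len(HANDSHAKE)] == HANDSHAKE:
            # Teammate detected: the Stooge feeds points to the Star.
            # (Star-vs-Star collusion is oversimplified here.)
            return "D" if role == "star" else "C"
        # Opponent is an outsider.
        if role == "stooge":
            return "D"  # minimize the outsider's score at any cost
        # A Star plays normally against outsiders, e.g. Tit-for-Tat.
        return "C" if not opp_history else opp_history[-1]
    return strategy
```

In a teammate pairing, every post-handshake round pays 5 points to the Star and 0 to the Stooge, which is how the Stars climb above the Tit-for-Tat players while the Stooges sink to the bottom.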

If we look more closely, the Southampton strategy doesn’t look so good. Apparently, Tit-for-Tat still scores higher than the average Southampton player – the sacrifice (in points) made by the Stooges is not fully recouped by the Stars. So Tit-for-Tat will still be the best strategy, both for a lone player and for a team of players, assuming the goal is to maximize the sum of the team members’ scores. (Note that a team of Tit-for-Tat players doesn’t need to use the Southampton trick for recognizing fellow team members, since Tit-for-Tat players who play each other will always cooperate, which is the team-optimal thing to do.)

So it seems that all the Southampton folks discovered is a clever way to exploit the rules of this particular tournament, with its winner-take-all structure. That’s clever, but I don’t think it has much theoretical significance.

UPDATE (Friday 22 October): The comments on this post are particularly good.

Preemptive Blame-Shifting by the E-Voting Industry

The November 2nd election hasn’t even happened yet, and already the e-voting industry is making excuses for the election-day failures of their technology. That’s right – they’re rebutting future reports of future failures. Here’s a sample:

Problem

Voting machines will not turn on or operate.

Explanation

Voting machines are not connected to an active power source. Machines may have been connected to a power strip that has been turned off or plugged into an outlet controlled by a wall switch. Power surges or outages caused by electrical storms or other natural occurrences are not unheard of. If the power source to the machine has been lost, voting machines will generally operate on battery power for brief periods. Once battery power is lost, however, the machines will cease to function (although votes cast on such machines will not be lost). Electronic voting machines may require the election official or precinct worker to enter a password in order to operate. Lost or forgotten passwords may produce lengthy delays as this information is retrieved from other sources.

In the past, of course, voting machines have failed to operate for other reasons, as in the 2003 California gubernatorial recall election, when Diebold machines, which turned out to be uncertified, failed to boot properly at many polling places in San Diego and Alameda counties. (Verified-voting.org offers a litany of these and other observed e-voting failures.)

The quote above comes from a document released by the Election Technology Council, a trade group of e-voting vendors. (The original, tellingly released only in the not-entirely-secure Word format, is here.)

The tone of the ETC document is clear – our technology is great, but voters and poll workers aren’t smart enough to use it correctly. Never mind that the technology is deeply flawed (see, e.g., my discussion of Diebold’s insecure protocols, not to mention all of the independent studies of the technology). Never mind that the vendors are the ones who design the training regimes whose inadequacy they blame. Never mind that it is their responsibility to make their products usable.

[Link credit: Slashdot]

Privacy, Recording, and Deliberately Bad Crypto

One reason for the growing concern about privacy these days is the ever-decreasing cost of storing information. The cost of storing a fixed amount of data seems to be dropping at the Moore’s Law rate, that is, by a factor of two every 18 months, or equivalently a factor of about 100 every decade. When storage costs less, people will store more information. Indeed, if storage gets cheap enough, people will store even information that has no evident use, as long as there is even a tiny probability that it will turn out to be valuable later. In other words, they’ll store everything they can get their hands on. The result is that more information about our lives will be accessible to strangers.
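The factor-of-100-per-decade figure follows directly from the 18-month halving time:

```python
# A decade is 120 months; halving every 18 months compounds to
# 2**(120/18) ~= 101.6, i.e. roughly a factor of 100 per decade.
months_per_decade = 120
halving_months = 18
decade_factor = 2 ** (months_per_decade / halving_months)
```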

(Some people argue that the growth in available information is on balance a good thing. I want to put that argument aside here, and ask you to accept only that technology is making more information about us available to strangers, and that an erosion of our legitimate privacy interests is among the consequences of that trend.)

By default, information that is stored can be accessed cheaply. But it turns out that there are technologies we can use to make stored information (artificially) expensive to access. For example, we can encrypt the information using a weak encryption method that can be broken by expending some predetermined amount of computation. To access the information, one would then have to buy or rent sufficient computer time to break the encryption method. The cost of access could be set to whatever value we like.

(For techies, here’s how it works. There are fancier methods, but this one is the simplest to explain. You encrypt the data, using a strong cipher, under a randomly chosen key K. You provide a hint about the value of K (e.g., upper and lower bounds on its value), and then you discard K. Reconstructing the data now requires an exhaustive search to find K. The size of the search required depends on how precise the hint is.)
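A minimal sketch of the scheme, using only the standard library: the hash-based stream cipher standing in for the "strong cipher," the range-style hint, and the short checksum that lets the searcher recognize success are all my illustrative choices, not details from the method itself.

```python
import hashlib
import secrets

def _keystream_xor(data, key_int):
    # Toy stream cipher: XOR the data with keystream blocks of
    # SHA-256(key || block counter). Illustrative only.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(
            key_int.to_bytes(16, "big") + i.to_bytes(8, "big")
        ).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

def seal(plaintext, work_factor_bits=20):
    """Encrypt under a random key K, publish a hint (a range of width
    2**work_factor_bits containing K), then discard K."""
    key = secrets.randbits(64)
    low = max(0, key - secrets.randbelow(2 ** work_factor_bits))
    hint = (low, low + 2 ** work_factor_bits)
    # Short checksum so the searcher can tell when it has found K.
    checksum = hashlib.sha256(plaintext).digest()[:8]
    return _keystream_xor(plaintext, key), hint, checksum

def open_sealed(ciphertext, hint, checksum):
    """Exhaustive search over the hinted range -- this is where the
    predetermined amount of computation gets spent."""
    low, high = hint
    for candidate in range(low, high):
        plaintext = _keystream_xor(ciphertext, candidate)
        if hashlib.sha256(plaintext).digest()[:8] == checksum:
            return plaintext
    raise ValueError("key not found in hinted range")
```

The width of the hinted range sets the access cost: each extra bit of `work_factor_bits` doubles the expected search, so the cost can be tuned to whatever value we like.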

This method has many applications. For example, suppose the police want to take snapshots of public places at fixed intervals, and we want them to be able to see any serious crimes that happen in front of their cameras, but we don’t want them to be able to browse the pictures arbitrarily. (Again, I’m putting aside the question of whether it’s wise for us to impose this requirement.) We could require them to store the pictures in such a way that retrieving any one picture carried some moderate cost. Then they would be able to access photos of a few crimes being committed, but they couldn’t afford to look at everything.

One drawback of this approach is that it is subject to Moore’s Law. The price of accessing a data item is paid not in dollars but in computing cycles, a resource whose dollar cost is cut in half every 18 months. So what is expensive to access now will be relatively cheap in, say, ten years. For some applications, that’s just fine, but for others it may be a problem.

Sometimes this drop in access cost may be just what you want. If you want to make a digital time capsule that cannot be opened now but will be easy to open 100 years from now, this method is perfect.

DoJ To Divert Resources to P2P Enforcement

Last week the Department of Justice issued a report on intellectual property enforcement. Public discussion has been slow to develop, since the report seems to be encoded in some variant of the PDF format that stops many people from reading it. (I could read it fine on one of my computers, but ran into an error message saying the file was encrypted on the rest of my machines. Does anybody have a non-crippled version?)

The report makes a strong case for the harmfulness of intellectual property crimes, and then proceeds to suggest some steps to strengthen enforcement. I couldn’t help noticing, though, that the enforcement effort is not aimed at the most harmful crimes cited in the report.

The report leads with the story of a criminal who sold counterfeit medicines, which caused a patient to die because he was not taking the medicines he (and his doctors) thought he was. This is a serious crime. But what makes it serious is the criminal’s lying about the chemical composition of the medicines, not his lying about their brand name. This kind of counterfeiting is best treated as an attack on public safety rather than a violation of trademark law.

(This is not to say that counterfeiting of non-safety-critical products should be ignored, only that counterfeiting of safety-critical products can be much more serious.)

Similarly, the report argues that for-profit piracy, mostly of physical media, should be treated seriously. It claims that such piracy funds organized crime, and it hints (without citing evidence) that physical piracy might fund terrorism too. All of which argues for a crackdown on for-profit distribution of copied media.

But when it comes to action items, the report’s target seems to shift away from counterfeiting and for-profit piracy, and toward P2P file sharing. Why else, for example, would the report bother to endorse the Induce Act, which does not apply to counterfeiters or for-profit infringers but only to the makers of products, such as P2P software, that merely allow not-for-profit infringement?

It’s hard to believe, in today’s world, that putting P2P users in jail is the best use of our scarce national law-enforcement resources. Copyright owners can already bring down terrifying monetary judgments on P2P infringers. If we’re going to spend DoJ resources on attacking IP crime, let’s go after counterfeiters (especially of safety-critical products) and large-scale for-profit infringers. As Adam Shostack notes, to shift resources to enforcing less critical IP crimes, at a time when possible-terrorist wiretaps go unheard and violent fugitive cases go uninvestigated, is to lose track of our priorities.