Archives for July 2006

Banner Ads Launch Security Attacks

An online banner advertisement that ran on MySpace.com and other sites over the past week used a Windows security flaw to infect more than a million users with spyware when people merely browsed the sites with unpatched versions of Windows …

So says Brian Krebs at the Washington Post’s Security Fix blog. The ads, he says, contained a booby-trapped image that exploited a Windows security flaw to install malicious software. (Microsoft released a patch for the flaw back in January.)

Is this MySpace’s fault? I’m not asking whether MySpace is legally liable for the attack, though I’m curious what lawyers have to say about that question. I’m asking from an ethical and practical standpoint. Recognizing that the attacker himself bears primary responsibility, does MySpace bear some responsibility too?

A naive user who saw the ad displayed on a MySpace page would assume the ad was coming from MySpace. On a technical level, MySpace would not have served out the ad image, but would instead have put into the MySpace page some code directing the user’s browser to go to somebody else’s server and get an ad image; this other server would have actually provided the ad. MySpace’s business model relies on getting paid by ad agencies to embed ads in this way.
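
To make the mechanics concrete, here’s a rough sketch of the kind of code a page uses to hand ad display off to a third party. The function name and the ad-server hostname are invented for illustration; MySpace’s actual ad code surely looks different, but the structure is the same: the page tells your browser where to fetch the ad, and whatever comes back is what you see.

    // Illustrative sketch only: the hostname and slot ID are made up.
    function embedAd(slotId: string): void {
      const slot = document.getElementById(slotId);
      if (!slot) return;

      // The page doesn't contain the ad itself. It tells the browser to
      // fetch an image from the ad network's server; whatever that server
      // returns is what the user sees (and, here, what attacked them).
      const ad = document.createElement("img");
      ad.src = "https://ads.example-network.com/serve?slot=" + encodeURIComponent(slotId);
      ad.width = 728;
      ad.height = 90;
      slot.appendChild(ad);
    }

    embedAd("banner-top");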

Of course, MySpace is in the business of displaying content submitted by other people. Any MySpace user could have put a similarly booby-trapped image on his own MySpace page; this has almost certainly happened. But it’s one thing to go to Johnny’s MySpace page and be attacked by Johnny. It’s another thing to go to your friend’s MySpace page and get attacked because of something that MySpace told you to display. If we’re willing to absolve MySpace of responsibility for Johnny’s attack – and I think we should be – it doesn’t follow that we have to hold MySpace blameless for the ad attack.

Nor does the fact that MySpace (presumably) does not vet the individual ads resolve the question. Failure to take a precaution does not in itself imply that the precaution is unnecessary. MySpace could have decided to vet every ad, at some cost, but instead they presumably decided to vet the ad agencies they are working with, and rely on those agencies to vet the ads.

The online ad business is a complicated web of relationships and deals. Some agencies don’t sell ads directly but make deals to display ads sold by others; and those others may in turn make the same kinds of deals, so that ads are placed on sites not directly but through a chain of intermediaries. The more the sale and placement of ads is automated, the fewer people there are in the loop to spot harmful or inappropriate ads. And the more complex and indirect the mechanisms of ad placement become, the harder it is for anyone to tell where an ad came from or how it ended up being displayed on a particular site. Ben Edelman has documented how these factors can cause ads for reputable companies to be displayed by spyware. Presumably the same kinds of factors enabled the display of these attack ads on MySpace and elsewhere.
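
To illustrate why provenance gets lost, here’s a toy model of an ad passing through a chain of resellers. The interface and the names are invented; real ad-serving systems are far messier, but the basic point holds: each party knows only who it dealt with directly.

    // Toy model with invented names; not any real ad network's API.
    interface AdSource {
      name: string;
      getAd(): { creativeUrl: string; soldBy: string };
    }

    // A reseller simply delegates to whoever it buys inventory from,
    // relabeling the ad as its own along the way.
    function reseller(name: string, upstream: AdSource): AdSource {
      return {
        name,
        getAd() {
          const ad = upstream.getAd();
          return { ...ad, soldBy: name }; // the buyer sees only this name
        },
      };
    }

    const originalSeller: AdSource = {
      name: "UnknownAdShop",
      getAd: () => ({
        creativeUrl: "https://cdn.unknown-adshop.example/banner.gif",
        soldBy: "UnknownAdShop",
      }),
    };

    // Two hops later, the publisher deals only with "BigAgency" and has
    // no automated way to see where the creative actually came from.
    const chain = reseller("BigAgency", reseller("MidTierNetwork", originalSeller));
    console.log(chain.getAd().soldBy); // "BigAgency"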

If this is true, then these sorts of ad-based attacks will be a systemic problem unless the structure of the online ad business changes.

Taking Stevens Seriously

From the lowliest blogger to Jon Stewart, everybody is laughing at Sen. Ted Stevens and his remarks (1.2MB mp3) on net neutrality. The sound bite about the Internet being “a series of tubes” has come in for the most ridicule.

I’ll grant that Stevens sounds pretty confused on the recording. But let’s give the guy a break. He was speaking off the cuff in a meeting, and he sounds a bit agitated. Have you ever listened to a recording of yourself speaking in an unscripted setting? For most people, it’s pretty depressing. We misspeak, drop words, repeat phrases, and mangle sentences all the time. Normally, listeners’ brains edit out the errors.

In this light, some of the ridicule of Stevens seems a bit unfair. He said the Internet is made up of “tubes”. Taken literally, that’s crazy. But experts talk about “pipes” all the time. Is the gap between “tubes” and “pipes” really so large? And when Stevens says that his staff sent him “an Internet” and it took several days to arrive, it sounds to me like he meant to say “an email” and just misspoke.

So let’s take Stevens seriously, and consider the possibility that somewhere in his head, or in the head of a staffer telling him what to say, there was a coherent argument that was supposed to come out of Stevens’ mouth but was garbled into what we heard. Let’s try to reconstruct that argument and see if it makes any sense.

In particular, let’s look at the much-quoted core of Stevens’ argument, as transcribed by Ryan Singel. Here is my cleaned-up restatement of that part of Stevens’ remarks:

NetFlix delivers movies by mail. What happens when they start delivering them by download? The Internet will get congested.

Last Friday morning, my staff sent me an email and it didn’t arrive until Tuesday. Why? Because the Internet was congested.

You want to help consumers? Consumers don’t benefit when the Net is congested. A few companies want to flood the Internet with traffic. Why shouldn’t ISPs be able to manage that traffic, so other traffic can get through? Your regulatory approach would make that impossible.

The Internet doesn’t have infinite capacity. It’s like a series of pipes. If you try to push too much traffic through the pipes, they’ll fill up and other traffic will be delayed.

The Department of Defense had to build their own network so their time-critical traffic wouldn’t get blocked by Internet congestion.

Maybe the companies that want to dump so much traffic on the Net should pay for the extra capacity. They shouldn’t just dump their traffic onto the same network links that all of us are paying for.

We don’t have regulation now, and the Net seems to be working reasonably well. Let’s leave it unregulated. Let’s wait to see if a problem really develops.

This is a rehash of two of the standard arguments of neutrality regulation opponents: let ISPs charge sites that send lots of traffic through their networks; and it’s not broke so don’t fix it. Nothing new here, but nothing scandalous either.

His examples, on the other hand, seem pretty weak. First, it’s hard to imagine that NetFlix would really use up much bandwidth that they or their customers weren’t already paying for. If I buy an expensive broadband connection, and I want to use it to download a few gigabytes a month of movies, that seems fine. The traffic I slow down will mostly be my own.
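
To put rough numbers behind that intuition (the figures here are my own illustrative guesses, not data from NetFlix or any ISP), a few movie downloads a month is a tiny fraction of what even a modest broadband link could carry:

    // Back-of-the-envelope arithmetic with illustrative, assumed figures.
    const movieSizeGB = 2;        // rough size of one standard-definition movie
    const moviesPerMonth = 4;
    const downloadGB = movieSizeGB * moviesPerMonth;              // 8 GB/month

    const linkMbps = 8;           // a decent 2006-era broadband connection
    const secondsPerMonth = 30 * 24 * 3600;
    const capacityGB = (linkMbps / 8) * secondsPerMonth / 1024;   // about 2,500 GB/month

    console.log(`${downloadGB} GB of movies is ` +
      `${(100 * downloadGB / capacityGB).toFixed(1)}% of what the link could carry`);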

Second, the slow email wouldn’t have been caused by general congestion on the Net. The cause must have been either an inattentive person or downtime of a Senate server. My guess is that Stevens was searching his memory for examples of network delays, and this one popped up.

Third, the DoD has plenty of reasons other than congestion to have its own network. Secrecy, for example. And a need for redundancy in case of a denial-of-service attack on the Internet’s infrastructure. Congestion probably ranks pretty far down the list.

The bottom line? Stevens may have been trying to make a coherent argument. It’s not a great argument, and his examples were poorly chosen, but it’s far from the worst argument ever heard in the Senate.

Why then the shock and ridicule from the Internet public? Partly because the recording was a perfect seed for a Net ridicule meme. But partly, too, because people unfamiliar with everyday Washington expect a high level of debate in the Senate, and Stevens’ remarks, even if cleaned up, don’t nearly qualify. As Art Brodsky of Public Knowledge put it, “We didn’t [post the recording] to embarrass Sen. Stevens, but to give the public an inside view of what can go on at a markup. Just so you know.” Millions of netizens now know, and they’re alarmed.

Net Neutrality: Strike While the Iron Is Hot?

Bill Herman at the Public Knowledge blog has an interesting response to my net neutrality paper. As he notes, my paper was mostly about the technical details surrounding neutrality, with a short policy recommendation at the end. Here’s the last paragraph of my paper:

There is a good policy argument in favor of doing nothing and letting the situation develop further. The present situation, with the network neutrality issue on the table in Washington but no rules yet adopted, is in many ways ideal. ISPs, knowing that discriminating now would make regulation seem more necessary, are on their best behavior; and with no rules yet adopted we don’t have to face the difficult issues of line-drawing and enforcement. Enacting strong regulation now would risk side-effects, and passing toothless regulation now would remove the threat of regulation. If it is possible to maintain the threat of regulation while leaving the issue unresolved, time will teach us more about what regulation, if any, is needed.

Herman argues that waiting is a mistake, because the neutrality issue is in play now and that can’t continue for long. Normally, issues like these are controlled by a small group of legislative committee members, staffers, interest groups, and lobbyists, but every once in a while an issue will open up for wider debate, giving broader constituencies influence over what happens. That’s when most of the important policy changes happen. Herman argues that the net neutrality issue is open now, and if we don’t act it will close again and we (the public) will lose our influence on the issue.

He makes a good point: the issue won’t stay in the public eye forever, and when it leaves the public eye change will be more difficult. But I don’t think it follows that we should enact strong neutrality regulation now. There are several reasons for this.

Tim Lee offers one reason in his response to Herman. Here’s Tim:

So let’s say Herman is right and the good guys have limited resources with which to wage this fight. What happens once network neutrality is the law of the land, Public Knowledge has moved on to its next legislative issue, and the only guys in the room at FCC hearings on network neutrality implementation are telco lawyers and lobbyists? The FCC will interpret the statute in a way that’s friendly to the telecom industry, for precisely the reasons Herman identifies. Over time, “network neutrality” will be redefined and reinterpreted to mean something the telcos can live with.

But it’s worse than that, because the telcos aren’t likely to stop at rendering the law toothless. They’re likely to continue lobbying for additional changes to the rules—by the FCC or Congress—that help them exclude new competitors and cement their monopoly power. Don’t believe me? Look at the history of cable franchising. Look at the way the CAB helped cartelize the airline industry, and the ICC cartelized surface transportation. Look at FCC regulation of telephone service and the broadcast spectrum. All of those regulatory regimes were initially designed to control oligopolistic industries too, and each of them ended up becoming part of the problem.

I’m wary of Herman’s argument for other reasons too. Most of all, I’m not sure we know how to write neutrality regulations that will have the effects we want. I’m all in favor of neutrality as a principle, but it’s one thing to have a goal and another thing entirely to know how to write rules that will achieve that goal in practice. I worry that we’ll adopt well-intentioned neutrality regulations that we’ll regret later – and if the issue is frozen later it will be even harder to undo our mistakes. Waiting will help us learn more about the problem and how to fix it.

Finally, I worry that Congress will enact toothless rules or vague statements of principle, and then declare that the issue has been taken care of. That’s not what I’m advocating; but I’m afraid it’s what we’ll get if we insist that Congress pass a net neutrality bill this year.

In any case, odds are good that the issue will be stalemated, and we’ll have to wait for the new Congress, next year, before anything happens.

New Net Neutrality Paper

I just released a new paper on net neutrality, called Nuts and Bolts of Network Neutrality. It’s based on several of my earlier blog posts, with some new material.

CleanFlicks Ruled an Infringer

Joe Gratz writes,

Judge Richard P. Matsch of the United States District Court for the District of Colorado [on] Wednesday filed this opinion granting partial summary judgment in favor of the movie studios, finding that CleanFlicks infringes copyright. This is not a terribly surprising result; CleanFlicks’ business involves selling edited DVD-Rs of Hollywood movies, buying and warehousing one authorized DVD of the movie for each edited copy it sells.

CleanFlicks edited the movies by bleeping out strong language, and removing or obscuring depictions of explicit sex and violence. (Tim Lee also has interesting commentary: 1 2 3.)

The opinion is relatively short, and worth reading if you’re interested in copyright. The judge ruled that CleanFlicks violated the studios’ exclusive rights to make copies of the movies, and to distribute copies to the public. He said that what CleanFlicks did was not fair use.

There are at least four interesting aspects to the opinion.

First, the judge utterly rejected CleanFlicks’s public policy argument. CleanFlicks had argued that public policy should favor allowing its business, because it enables people with different moral standards to watch movies, and it lets people compare the redacted and unredacted versions to decide whether the language, sex, and violence are really necessary to the films. The judge noted that Congress, in debating and passing the Family Movie Act, during the pendency of this lawsuit, had chosen to legalize redaction technologies that didn’t make a new DVD copy, but had not legalized those like CleanFlicks that did make a copy. He said, reasonably, that he did not want to overrule Congress on this policy issue. But he went farther, saying that this public policy argument is “inconsequential to copyright law” (page 7).

Second, the judge ruled that the redacted copies of the movies are not derivative works. His reasoning here strikes me as odd. He says first that the redaction is not a transformative use, because it removes material but doesn’t add anything. He then says that because the redacted version is not transformative, it is not a derivative work (page 11). If it is true in general that redaction does not create a derivative work, this has interesting consequences for commercial-skipping technologies – my understanding is that the main copyright-law objection to commercial-skipping is that it creates an unauthorized derivative work by redacting the commercials.

Third, the judge was unimpressed with CleanFlicks’s argument that it wasn’t reducing the studios’ profits, and was possibly even increasing them by bringing the movie to people who wouldn’t have bought it otherwise. (Recall that for every redacted copy it sold, CleanFlicks bought and warehoused one ordinary studio-issued DVD; so every CleanFlicks sale generated a sale for the studio.) The judge didn’t much engage this economic argument but instead stuck to a moral-rights view that CleanFlicks was injuring the artistic integrity of the films:

The argument [that CleanFlicks has no impact or a positive impact on studio revenues] has superficial appeal but it ignores the intrinsic value of the right to control the content of the copyrighted work which is the essence of the law of copyright.

(page 11)

Finally, the judge notes that the studios did not make a DMCA claim, even though CleanFlicks was circumventing the encryption on DVDs in order to enable its editing. (The studios say they could have brought such a claim but chose not to.) Why they chose not to is an interesting question. I think Tim Lee is probably right here: the studios were feeling defensive about the overbreadth of the DMCA, so they didn’t want to generate more conservative opponents of the DMCA by winning this case on DMCA grounds.

There also seems to have been no claim that CleanFlicks fostered infringement by releasing its copies as unencrypted DVDs, when the original studio DVDs had been encrypted with CSS (the standard, laughably weak DVD encryption scheme). The judge takes care to note that CleanFlicks and its co-parties all release their edited DVDs in unencrypted form, but his ruling doesn’t seem to rely on this fact. Presumably the studios chose not to make this argument either, perhaps for reasons similar to their DMCA non-claim.

In theory CleanFlicks can appeal this decision, but my guess is that they’ll run out of money and fold before any appeal can happen.