Archives for 2006

Return to Monkey High

Newsweek has released its annual list of America’s top high schools, using the same flawed formula as last year. Here’s what I wrote then:

Here is Newsweek’s formula:
“Public schools are ranked according to a ratio devised by Jay Mathews: the number of Advanced Placement and/or International Baccalaureate tests taken by all students at a school in 2004 divided by the number of graduating seniors.”

Both parts of this ratio are suspect. In the numerator, they count the number of students who show up for AP/IB tests, not the number who get an acceptable score. Schools that require their students to take AP/IB tests will do well on this factor, regardless of how poorly they educate their students. In the denominator is the number of students who graduate. That’s right — every student who graduates lowers the school’s rating.

To see the problems with Newsweek’s formula, let’s consider a hypothetical school, Monkey High, where all of the students are monkeys. As principal of Monkey High, I require my students to take at least one AP test. (Attendance is enforced by zookeepers.) The monkeys do terribly on the test, but Newsweek gives them credit for showing up anyway. My monkey students don’t learn enough to earn a high school diploma — not to mention their behavioral problems — so I flunk them all out. Monkey High gets an infinite score on the Newsweek formula: many AP tests taken, divided by zero graduates. It’s the best high school in the universe!

[Note to math geeks annoyed by the division-by-zero: I can let one monkey graduate if that would make you happier.]
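For concreteness, the formula is simple enough to sketch in a few lines of Python (the school numbers here are invented for illustration):

```python
def newsweek_score(ap_ib_tests_taken: int, graduating_seniors: int) -> float:
    """Newsweek's ratio: AP/IB tests taken, divided by graduating seniors.
    Note it counts tests *taken*, not tests passed."""
    if graduating_seniors == 0:
        return float("inf")  # Monkey High: no graduates, unbeatable score
    return ap_ib_tests_taken / graduating_seniors

# A school that forces 500 students to sit (and fail) one AP test each,
# then flunks everyone out, beats every real school in the country.
print(newsweek_score(ap_ib_tests_taken=500, graduating_seniors=0))    # inf
print(newsweek_score(ap_ib_tests_taken=500, graduating_seniors=400))  # 1.25
```

Note that nothing in the formula rewards passing scores or penalizes flunking students out, which is the whole problem.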

Though it didn’t change the formula this year, Newsweek did change which schools are eligible to appear on the list. In the past, schools with selective admission policies were not included, on the theory that they could boost their ratings by cherry-picking the best students. This year, selective schools are eligible, provided that their average SAT score is below 1300 (or their average ACT score is below 27).

This allows me to correct an error in last year’s post. Monkey High, with its selective monkeys-only admission policy, would have been barred from Newsweek’s list last year. But this year it qualifies, thanks to the monkeys’ low SAT scores.
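The new eligibility rule is just as mechanical. Here is a sketch (the SAT/ACT thresholds are from Newsweek's rule; the school scores are invented for illustration):

```python
def eligible_for_list(selective_admissions: bool,
                      avg_sat: float, avg_act: float) -> bool:
    """This year's rule: selective schools qualify only if their average
    SAT is below 1300 or their average ACT is below 27."""
    if not selective_admissions:
        return True
    return avg_sat < 1300 or avg_act < 27

# Monkey High: selective (monkeys only), but the monkeys score low.
print(eligible_for_list(True, avg_sat=400, avg_act=1))    # True
# A selective school whose students score well (scores invented): excluded.
print(eligible_for_list(True, avg_sat=1450, avg_act=32))  # False
```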

Newsweek helpfully includes a list of selective schools that would have made the list but were barred due to SAT scores. This excluded-schools list is topped by a mind-bending caption:

Newsweek excluded these high performers from the list of Best High Schools because so many of their students score well above average on the SAT and ACT.

(If that doesn’t sound wrong to you, go back and read it again.) The excluded schools include, among others, the famous Thomas Jefferson H.S. for Science and Technology, in northern Virginia. Don’t lose heart, Jefferson teachers – with enough effort you can lower your students’ SAT scores and become one of America’s best high schools.

Happy Endings

Cameron Wilson at the USACM Policy Blog writes about a Cato Institute event on copyright policy, held Wednesday. The panel on the DMCA was especially interesting. (audio download; audio stream; video stream)

Tim Lee, author of the recent Cato paper on the ill effects of the DMCA, spoke first.

The second speaker was Solveig Singleton of PFF, who offered some amazing arguments. Here is her response to the well-documented list of DMCA misuses:

Even if you set aside some of the errors in the Cato paper, you’re left with a set of examples, many of which have happy endings, without any change to the law. Ed Felten’s case, for example. There are other cases. There were lawsuits that were threatened but not brought. Lawsuits that were brought but ultimately failed. Lawsuits that succeeded but on grounds other than the DMCA.

(This is my transcription from the audio stream.)

To call the case of my colleagues and me a “happy ending” takes some real chutzpah. Let’s catalog the happy consequences of our case. One person lost his job, and another nearly did. Countless hours of pro bono lawyer time were consumed. Anonymous donors gave up large amounts of money to support our defense. I lost at least months of my professional life, and other colleagues did too. And after all this, the ending was that we were able to publish our work – something which, before the DMCA, we would have been able to do with no trouble at all.

In the end, yes, we were happy – in the same way one is happy to recover from food poisoning. Which is not really an argument in favor of food poisoning.

She goes on to argue for the efficacy of the DMCA, using the example of Apple’s FairPlay technology (which is used by the iTunes music store):

But … are they [Apple] going to be able to get music developers to the table to negotiate with them to help create this library [of music] if they can’t make some reasonable assurances that that content isn’t going to show up free everywhere else?

Never mind that all of the songs Apple sells are available for free on P2P networks, despite FairPlay and the DMCA. Never mind that FairPlay has a huge and widely known hole – the ability to burn songs to an unprotected CD – which Apple created deliberately.

It’s understandable that DMCA advocates don’t want to give a realistic, straightforward explanation of exactly why the DMCA is needed. If they tried to do so, it would become clear that the DMCA, as written, is poorly suited for their purpose. Instead, we get strawmen and arguments from counterfactual assumptions.

I’ll close with a quote from Emery Simon of the Business Software Alliance, another speaker on the same panel, making a claim so far off-base that I won’t even bother to rebut it:

[If not] for copy protection technologies, whether it’s Macrovision or CSS or Fairplay, my VCR and my television set would be devices no more useful to me than my car without gasoline.

U.S. Copyright May Get Harsher and Broader

Rep. Lamar Smith is preparing to introduce a bill in Congress that would increase penalties for copyright infringement and broaden the scope of the DMCA and other copyright laws, according to a news.com story. (The story seems to get some details of the bill wrong, so be sure to look at the bill itself before drawing conclusions.)

The bill would increase penalties for small-scale, noncommercial copyright infringement beyond even their current excessive levels. For example, noncommercial distribution of copyrighted material worth $2500 or more would carry a maximum sentence of ten years in Federal prison. Even attempting to commit that level of infringement would potentially carry a ten-year sentence. That’s the same maximum sentence faced by bribe-taking Congressman Duke Cunningham, whose corruption probably cost taxpayers millions of dollars. It’s also more than the average Federal sentence for manslaughter (33 months), sexual abuse (73 months), arson (87 months), fraud (14 months), embezzlement (7 months), bribery (10 months), or racketeering/extortion (72 months).

The bill would also expand the scope of copyright in several respects. Most interesting to readers here is an expansion of the DMCA’s anticircumvention rules.

Recall that Section 1201 of the DMCA bans circumvention of technical protection mechanisms (TPMs), and also bans trafficking in circumvention devices. The Smith bill would expand the trafficking ban, by redefining “trafficking” as follows:

[T]he term ‘traffic in’ means to transport, transfer, or otherwise dispose of, to another, or to make, import, export, obtain control of, or possess, with intent to so transport, transfer, or dispose of.

In short, where the law now bans distribution of a circumvention device, the bill would also ban possession of a circumvention device with intent to distribute it.

This bill, if passed, would probably increase the DMCA’s chilling effect on research. Currently, a researcher can steer clear of the trafficking provision by keeping any circumvention devices to himself, using those devices himself (lawfully) in the lab. If the Smith bill passes, the researcher would have to worry that a plaintiff or prosecutor will misjudge his intent and bring a case, and that a judge or jury might be convinced that the researcher was eventually planning to distribute the device. Even if the claim of bad intent is baseless, refuting it will be slow, painful, and expensive.

I’m eager to hear the rationale for these expansions. But I wouldn’t be surprised if no rationale is offered, beyond the standard “piracy is bad” mantra or vague claims to be “rationalizing” the statute.

Serialized Posts

Lately I’ve found myself writing short series of posts on a single topic, as with the recent sequence of four posts on HDCP security. This is a departure from the traditional style of this blog, where posts were self-contained and the topic would typically change from day to day.

I typically plan out these post series in advance. For example, on HDCP I was pretty sure that there would be four posts, and what their topics would be. Sometimes the plan changes along the way, because writing the early posts, or reading the comments on them, advances my thinking on the topic. But generally I’ll stick pretty close to the plan.

You may wonder why I deliver this content as a series of short quasi-daily posts rather than delivering the whole discussion of (say) HDCP all at once. Part of the answer is that my schedule allows only a certain amount of blog-writing each day, and I would rather publish each piece as it is finished than wait until the whole series is done. Another part of the answer is that I suspect there are advantages to letting you see the pieces one at a time, and to seeing your comments before writing the next piece.

HDCP: Why So Weak?

Today I want to wrap up (I think) the discussion on security weaknesses in HDCP, the encryption scheme used for sending very high-def video from a device like a next-gen DVD player to a TV monitor. I wrote previously (1, 2, 3) about how HDCP will inevitably fail – catastrophically – when somebody manages to recover the master secrets that are the source of all power in the system, and publishes those secrets on the Internet. I wrote, too, about how this problem could have been avoided by using standard cryptographic primitives rather than custom-designed ones.

It seems very likely that the people in charge of HDCP knew what they were doing, and made a deliberate choice to use the less secure scheme rather than the more secure, standard one. (I don’t have definite proof that they knew about the security problems, but it’s pretty hard to believe that their engineers failed to notice them.) Why did they choose the weak system?

The academic paper on HDCP, by Crosby et al., says that HDCP’s designers were given a “budget” of 10,000 gates. (Gates are one of the basic building blocks from which digital chips are designed.) Crosby estimates that a more secure design would have required about 30,000 gates, to fix the vulnerability I discussed earlier and some smaller vulnerabilities. How much does it cost to add gates to a design? That depends – the high end of the cost range is around $100 per 10,000 gates, but the low end might be much lower.
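At the high end of that cost range, the dollar stakes are easy to work out. A rough back-of-the-envelope, using the figures above (real per-chip cost depends on volume and process, and the low end could be far less, so treat these as upper bounds):

```python
# High-end cost estimate from the discussion above: ~$100 per 10,000 gates.
COST_PER_GATE_HIGH = 100 / 10_000  # dollars per gate

weak_design_gates = 10_000    # the budget HDCP's designers were given
secure_design_gates = 30_000  # Crosby's estimate for a properly fixed design

weak_cost = weak_design_gates * COST_PER_GATE_HIGH
extra_cost = (secure_design_gates - weak_design_gates) * COST_PER_GATE_HIGH

print(f"Weak scheme: up to ${weak_cost:.2f} per device")         # up to $100.00
print(f"Upgrade to strong scheme: up to ${extra_cost:.2f} more")  # up to $200.00
```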

There are really two questions here. (1) Why did they think it was worth paying for 10,000 extra gates to have the weak system, rather than no encryption at all? (2) Why did they think it wasn’t worth 20,000 gates to have a stronger system, rather than the weak system? Let’s consider these questions in order.

First: Why is the weak system worth spending 10,000 gates for? The answer doesn’t lie in platitudes about speedbumps or raising the bar – any technical bumps or bars will be obliterated when the master secrets are published. It’s worth noting, too, that the data stream they are protecting – uncompressed super high-def (1080i) video – blasts so much data so fast that there’s no affordable way for a would-be pirate to capture it, at least today. About all that can be done with such data streams today, at reasonable cost, is to display them, or to run them through simple format converter boxes. In future years, capturing the video stream will become a viable piracy strategy, but by then the master secrets will almost certainly have been published. So temporary piracy prevention doesn’t seem like a good explanation.
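To put a number on "so much data so fast," here is a rough back-of-the-envelope calculation (assuming 24-bit color and ignoring blanking intervals; exact figures vary with the signal format):

```python
# Rough throughput of uncompressed 1080i video.
# 1080i delivers 60 interlaced fields/sec, i.e. the pixel throughput
# of 30 full 1920x1080 frames per second.
width, height = 1920, 1080
frames_per_sec = 30
bits_per_pixel = 24  # assumed 24-bit color

bits_per_sec = width * height * frames_per_sec * bits_per_pixel
print(f"{bits_per_sec / 1e9:.2f} Gbit/s")  # 1.49 Gbit/s -- roughly 11 GB
# per minute, far beyond what consumer hardware of the time could
# capture and store in real time.
```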

A much more plausible answer is that HDCP encryption exists only as a hook on which to hang lawsuits. For example, if somebody makes unlicensed displays or format converters, copyright owners could try to sue them under the DMCA for circumventing the encryption. (Also, converter box vendors who accepted HDCP’s license terms might sue vendors who didn’t accept those terms.) The price of enabling these lawsuits is to add the cost of 10,000 gates to every high-def TV or video source, and to add another way in which high-def video devices can be incompatible.

The second question is why they weren’t willing to spend an extra 20,000 gates to use a more secure crypto scheme. Doing so would have reduced, in the long run, some types of P2P infringement. They apparently felt this would not be a good investment, presumably because other infringement scenarios were more troublesome. Why spend money strengthening one link in a chain, when other links are already weaker?

The bottom line is clear. In HDCP, “security” technologies serve not to disable pirates but to enable lawsuits. When you buy an HDCP-enabled TV or player, you are paying for this – your device will cost more and do less.