Archives for January 2005

Why Hasn't TiVo Improved?

The name TiVo was once synonymous with an entire product category, Digital Video Recorders. Now the vultures are starting to circle above TiVo, according to a New York Times story by Saul Hansell. What went wrong?

The answer is obvious: TiVo chose to cozy up to the TV networks rather than to its customers.

When my family bought a TiVo, it was a cutting-edge product (the TiVo, not the family; but come to think of it, the family was pretty cool too), delivering a customer experience that was hard to find elsewhere. Since then, eight years have passed – an eternity in the electronics business – and TiVo is still selling essentially the same product. Sure, they have added a few bells and whistles, but nothing that made us want to run out and buy a new box.

TiVo made a decision, early on, to cozy up to the TV networks, to stay within their comfort zone. But the networks’ comfort zone is awfully confining. ReplayTV took a different path, seizing the technological lead with new features that angered the networks; and the networks brought a lawsuit that ReplayTV couldn’t afford to defend. At the time, TiVo execs probably chuckled and congratulated themselves for their caution.

Now the time has come for TiVo to pay for its timidity. Its technology is no longer distinctive, and the rising tide of DRM threatens to cut TiVo’s products out of the TV delivery pipeline. (Remember, DRM is just another name for deliberate incompatibility.) It’s not clear what the company will have to offer future customers.

Which brings us to the key paragraph in the New York Times story:

Last week, TiVo announced that Mr. Ramsay was stepping down as chief executive but would remain as chairman. He said the change was his idea and had been under discussion for months. Several board members and others close to the board confirm that. But they also said that the board hoped to hire someone with less of Mr. Ramsay’s fierce belief in the power of TiVo’s technology. They said they preferred someone with an ability to repair TiVo’s relations with the big cable companies.

[italics added] As in so many organizations, TiVo’s response to crisis is to do more of what got them in trouble, rather than returning to the strategy that made them successful in the first place.

This is bad news for TiVo, which desperately needs new, distinctive technology if it wants to survive. It’s bad news for customers too.

UPDATE (2:00 PM): Matt Haughey has a nice response over at PVRblog.

Network Monitoring: Harder Than It Looks

Proposals like the Cal-INDUCE bill often assume that it’s reasonably easy to monitor network traffic to block certain kinds of data from being transmitted. In fact, there are many simple countermeasures that users can (and do, if pressed) use to avoid monitoring.

As a simple example, here’s an interesting (and well known) technical trick. Suppose Alice has a message M that she wants to send to Bob. We’ll treat M as a number (bearing in mind that any digital message can be thought of as a number). Alice chooses a random number R which has the same number of digits as M. She sends the message R to Bob; then she computes X = M-R, and sends the message X to Bob. Obviously, Bob can add the two messages, R + (M-R), and the sum will be M – the message Alice originally wanted to send him.

[Details, for mathematical purists: all arithmetic is done modulo a large prime P; R is chosen randomly in [0, P-1]. When I say a value “looks random” I mean that it is indistinguishable (in the information-theoretic sense) from a random value.]

Now here’s the cool part: both of the messages that Alice sends look completely random. Obviously R looks random, because Alice generated it randomly. But it turns out that X looks random too. To be more precise: either message by itself looks completely random; only by combining the two messages can any information be extracted.

By this expedient, Alice can foil any network monitor that looks at messages one at a time. Each individual message looks innocuous; only by storing messages and combining them can a monitor learn what Alice is really telling Bob. And if Alice sends the two messages by different paths, the monitor has to gather and combine traffic from multiple paths to do even that.

It’s easy for Alice to extend this trick, to split her message M into any number of pieces. For example, Alice could split M into five pieces, by generating four random numbers, R1, R2, R3, and R4, and then computing X = M-(R1+R2+R3+R4). Given any four of these five pieces, nothing can be deduced. Only somebody who has all five pieces, and knows to combine them by addition, can extract information. So a monitor has to gather and compare many messages to see what Alice is up to, even though Alice isn’t using encryption.
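The splitting trick described above can be sketched in a few lines of code. This is an illustrative implementation, not anything Alice or Bob would need to standardize on: the message is split into n shares by generating n-1 random values modulo a large prime P and computing one final share so that the shares sum to M. Any n-1 shares together reveal nothing; all n shares sum back to M.

```python
import secrets

# All arithmetic is done modulo a large prime P, as in the text.
# 2**127 - 1 is a Mersenne prime, chosen here just for convenience.
P = 2**127 - 1

def split(message: int, n: int) -> list[int]:
    """Split an integer message (0 <= message < P) into n shares.
    Any n-1 of the shares are statistically independent of the message."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    # The last share is chosen so that all shares sum to the message mod P.
    shares.append((message - sum(shares)) % P)
    return shares

def combine(shares: list[int]) -> int:
    """Recover the message by summing all the shares mod P."""
    return sum(shares) % P
```

For the five-piece example in the text, `split(M, 5)` returns R1 through R4 plus the final value X = M-(R1+R2+R3+R4) mod P, and `combine` on all five pieces recovers M.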

There are many more technical tricks like this that are easy for Alice and Bob to adopt, but hard for network monitors to cope with. If the monitors want to engage in an arms race, they’ll lose.

My Morning Pick-Me-Up

First thing this morning, I’m sitting in my bathrobe, scanning my inbox, when I’m jolted awake by the headline on a TechDirt story:

California Senator Wants to Throw Ed Felten in Jail

I guess I’ll take the time to read that story!

Kevin Murray, a California legislator, has introduced a bill that would fine, or imprison for up to one year, any person who “sells, offers for sale, advertises, distributes, disseminates, provides, or otherwise makes available” software that allows users to connect to networks that can share files, unless that person takes “reasonable care” to ensure that the software is not used illegally. TechDirt argues that my TinyP2P program would violate the proposed law.

Actually, the bill would appear to apply to a wide range of general-purpose software:

“[P]eer-to-peer file sharing software” means software that once installed and launched, enables the user to connect his or her computer to a network of other computers on which the users of these computers have made available recording or audiovisual works for electronic dissemination to other users who are connected to the network. When a transaction is complete, the user has an identical copy of the file on his or her computer and may also then disseminate the file to other users connected to the network.

That definition clearly includes the web, and the Internet itself, so that any software that enabled a user to connect to the Internet would be covered. And note that it’s not just the author or seller of the software who is at risk, but also any advertiser or distributor. Would TechDirt be committing a crime by linking to my TinyP2P page? Would my ISP be committing a crime by hosting my site?

The bill provides a safe harbor if the person takes “reasonable care” to ensure that the software isn’t used illegally. What does this mean? Standard law dictionaries define “reasonable care” as the level of care that a “reasonable person” would take under the circumstances, which isn’t very helpful. (Larry Solum has a longer discussion, which is interesting but doesn’t help much in this case.) I would argue that trying to build content blocking software into a general-purpose network app is a fruitless exercise which a reasonable person would not attempt. Presumably Mr. Murray’s backers would argue otherwise. This kind of uncertain situation is ripe for intimidation and selective prosecution.

This bill is terrible public policy, especially for the state that leads the world in the creation of innovative network software.

Enforceability and Steroids

Regular readers know that I am often skeptical about whether technology regulations can really be enforced. Often, a regulation that would make sense if it were (magically) enforceable turns out to be a bad idea when coupled with a realistic enforcement strategy. A good illustrative example of this issue arises in Major League Baseball’s new anti-steroids program, as pointed out by David Pinto.

The program bars players from taking anabolic steroids, and imposes mandatory random testing, with serious public sanctions for players who test positive. A program like this helps the players, by eliminating the competitive pressure to take drugs that boost on-the-field performance but damage users’ health. Players are better off in a world where nobody takes steroids than in one where everybody does. But this is only true if drug tests can accurately tell who is taking steroids.

A common test for steroids measures T/E, the ratio of testosterone (T) to epitestosterone (E). T promotes the growth and regeneration of muscle, which is why steroids provide a competitive advantage. The body naturally makes E, and later converts it into T. Steroids are converted directly into T. So, all else being equal, a steroid user will have a higher T/E ratio than a non-user. But of course all else isn’t equal. Some people naturally have higher T/E ratios than others.

The testing protocol will set some threshold level of T/E, above which the player will be said to have tested positive for steroids. What should the threshold be? An average value of T/E is about 1.0. About 1% of men naturally have T/E of 6.0 or above, so setting the threshold at that level would falsely accuse about 1% of major leaguers. (Or maybe more – if T makes you a better baseball player, then top players are likely to have unusually high natural levels of T.) That’s a pretty large number of false accusations, when you consider that these players will be punished, and publicly branded as steroid users. Even worse, nearly half of steroid users have T/E of less than 6.0, so setting the threshold there will give a violator a significant chance of evading detection. That may be enough incentive for a marginal player to risk taking steroids.

(Of course it’s possible to redo the test before accusing a player. But retesting only helps if the first test mismeasured the player’s true T/E level. If an innocent player’s T/E is naturally higher than 6.0, retesting will only seem to confirm the accusation.)

We can raise or lower the threshold for accusation, thereby trading off false positives (non-users punished) against false negatives (steroid users unpunished). But it may not be possible to have an acceptable false positive rate and an acceptable false negative rate at the same time. Worse yet, “strength consultants” may help players test themselves and develop their own customized drug regimens, to gain the advantages of steroids while evading detection by the official tests.
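The false positive/false negative tradeoff can be sketched numerically. The simulation below assumes, purely for illustration, that T/E ratios are lognormally distributed; the parameters are made up, chosen only to roughly match the figures quoted above (non-users: median about 1.0 with about 1% above 6.0; users: roughly half below 6.0). They are not real clinical data.

```python
import math
import random

random.seed(0)

# Illustrative modeling assumptions, not real data: T/E ratios are taken
# to be lognormal. Non-user parameters give median ~1.0 and ~1% above 6.0;
# user parameters put roughly half of users below 6.0.
NON_USER_MU, USER_MU, SIGMA = 0.0, math.log(6.0), 0.77

def sample_te(mu, sigma, n=100_000):
    """Draw n simulated T/E ratios from a lognormal distribution."""
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

non_users = sample_te(NON_USER_MU, SIGMA)
users = sample_te(USER_MU, SIGMA)

def fp_fn(threshold, non_users, users):
    """False-positive rate (non-users above threshold) and
    false-negative rate (users at or below threshold)."""
    fp = sum(x > threshold for x in non_users) / len(non_users)
    fn = sum(x <= threshold for x in users) / len(users)
    return fp, fn

for threshold in (4.0, 6.0, 10.0):
    fp, fn = fp_fn(threshold, non_users, users)
    print(f"threshold {threshold:4.1f}: "
          f"{fp:5.1%} non-users falsely accused, {fn:5.1%} users missed")
```

Lowering the threshold catches more users but accuses more innocent players; raising it does the opposite. Under these assumed distributions, no threshold makes both error rates small at once, which is exactly the bind described above.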

Taking these issues into account, it’s not at all clear that a steroid program helps the players. If many players can get away with using steroids, and some who don’t use are punished anyway, the program may actually be a lose-lose proposition for the players.

Are there better tests? Will a combination of multiple tests be more accurate? What tests will Baseball use? I don’t know. But I do know that these are the key questions to answer in evaluating Baseball’s steroids program. It’s not just a question of whether you oppose steroid use.

CBS Tries DRM to Block Criticism of Rathergate Report

Last week the panel investigating CBS’s botched reporting about President Bush’s military service released its report. The report was offered on the net in PDF format by CBS and its law firm. CBS was rightly commended for its openness in facing up to its past misbehavior and publicizing the report. Many bloggers, in commenting on the report and events that led to it, included quotes from the report.

Yesterday, Ernest Miller noticed that he could no longer copy and paste material from the report PDF into other documents. Seth Finkelstein confirmed that the version of the report on the CBS and law firm websites had been modified. The contents were the same but an Adobe DRM (Digital Restrictions Management) technology had been enabled, to prevent copying and pasting from the report. Apparently CBS (or its lawyers) wanted to make it harder for people to quote from the report.

This is yet another use of DRM that has nothing to do with copyright infringement. Nobody who wanted to copy the report as a whole would do so by copying and pasting – the report is enormous and the whole thing is available for free online anyway. The only plausible use of copy-and-paste is to quote from the report in order to comment, which is almost certainly fair use.

(CBS might reasonably have wanted to prevent modifications to the report file itself. They could have done this, within Adobe’s DRM system, without taking away the ability to copy-and-paste material from the file. But they chose instead to ban both modification and copy-and-paste.)

This sort of thing should not be a public policy problem; but the DMCA makes it one. If the law were neutral about DRM, we could just let the technology take its course. Unfortunately, U.S. law favors the publishers of DRMed material over would-be users of that material. For example, circumventing the DRM on the CBS report, in order to engage in fair-use commentary, may well violate the DMCA. (The DMCA has no fair-use exception, and courts have ruled that a DMCA violation can occur even if there is no copyright infringement.)

Worse yet, the DMCA may ban the tools needed to defeat this DRM technology. Dmitry Sklyarov was famously jailed by the FBI for writing a software tool that defeated this very same DRM technology; and his employer, Elcomsoft, was tried on criminal charges for selling fewer than ten copies of that tool.

As it turns out, the DRM can apparently be defeated easily by using Adobe’s own products. A commenter on Seth’s site (David L.) notes that he was able to turn off the restrictions using Adobe Acrobat: “The properties showed it set to password security. I was goofin around and changed it to No Security adn it turned off the security settings. I then saved the pdf and reopened it and the security was gone…. Apparently forging documents is not all that CBS sucks at.”

UPDATED (12:35 PM) to clarify: changed “cut-and-paste” to “copy-and-paste”, and added the parenthesized paragraph.