

SearchKing Suit Dismissed

Stefanie Olsen at CNet News.com reports that SearchKing’s lawsuit against Google has been dismissed. The judge ruled, as expected, that Google’s page rankings are opinions, and that Google has a First Amendment right to state its opinions.

Here’s the background: SearchKing sells a service that claims to raise people’s page rankings on the Google search engine. Google adjusted their page ranking algorithm to demote SearchKing’s pages. SearchKing sued Google, asking the court to grant a preliminary injunction requiring Google to restore the page rankings of SearchKing’s pages. The court has now dismissed SearchKing’s suit. For a longer analysis of the case, see James Grimmelmann’s old LawMeme posting.


Aimster, Napster, and the Betamax

An interesting amicus brief has been filed in the Aimster case, on behalf of public-interest groups including EFF, PublicKnowledge, and the Home Recording Rights Coalition; library groups including the American Library Association; and industry groups including the Computer and Communications Industry Association and the Consumer Electronics Association. A trial court found Aimster liable for indirect copyright infringement for selling a sort of file-sharing product. The amicus brief was filed with the Court of Appeals that is considering Aimster’s appeal.

The brief does not take a position on whether Aimster should be found liable, but it does argue forcefully that the trial court misinterpreted the Supreme Court’s ruling in the 1984 Sony Betamax case. In Betamax, the Supreme Court found that Sony was not liable for selling VCRs, even though VCRs were often used to infringe copyrights. The Court found, essentially, that if a product has both infringing and noninfringing uses, then the product’s maker cannot be held liable simply for selling that product. The Betamax rule has been rehashed in recent cases, including Napster (which was found liable for participating in the infringing activity) and Grokster (which was found not liable under the Betamax rule). How the Betamax rule will be interpreted is one of the key legal issues for would-be designers of innovative media products. Courts have not been entirely consistent in their reading of Betamax.

The new brief urges the Court of Appeals to narrow the lower court’s reading of the Betamax rule. According to the brief, the lower court’s reading of Betamax would impose liability on the makers of common devices such as photocopiers and digital cameras, and the Court of Appeals, regardless of its ultimate decision about Aimster’s liability, should make clear that the lower court misread Betamax.

I won’t write any more on this, since the brief is relatively short and well-written – if I’m not careful, my summary of the brief will be longer than the brief itself!

Thanks for bringing the brief to my attention go to Aimee Deep, who, despite Frank Field’s occasional doubts, appears to really exist.


DVDCCA v. Bunner in California Supreme Court

DVDCCA v. Bunner – the “California DVD case” – was argued yesterday in the California Supreme Court. DVDCCA, which is basically the movie industry, sued Andrew Bunner for re-publishing the DeCSS program on his web site. DeCSS, you may recall, is a program for decrypting DVDs.

A previous case in Federal court, Universal v. Reimerdes (also known as “Universal v. Corley”, the “2600 case”, or the “New York DVD case”), led to a ruling that posting DeCSS violated the Digital Millennium Copyright Act (DMCA). There was no DMCA claim in Bunner; the movie industry argued instead that DeCSS contained their trade secrets, and so was illegal for Bunner to publish.

Bunner lost in the trial court but he won a reversal in the appeals court, with the appeals court ruling that DeCSS was speech and that an injunction against its publication would therefore be an unconstitutional prior restraint on speech.

Wired has a pretty poor story about this (bylined “Reuters”). Better is Lisa Bowman’s story at CNet News.com. Alex McGillivray was there and offers a deeper account of the legal arguments.

As usual in these cases, the plaintiffs’ lawyers offered strained analogies. California Attorney General Bill Lockyer called DeCSS a tool for “breaking, entering, and stealing”, ignoring that DeCSS only allows one to “break into” one’s own property. (The theory that using DeCSS amounts to a break-in was already rejected by a Norwegian court in the Johansen case.)

DVDCCA lawyer Robert Sugarman said something even odder. Bowman’s story quotes Sugarman as telling the court that DeCSS is designed “to allow individuals to steal a trade secret and, by virtue of that, hack into a system that protects the trade secrets of motion picture makers.” This description is wrong on at least two counts. First, it is at odds with the DVDCCA’s own position, which is not that DeCSS protects their trade secrets, but that it contains them. Second, the only things “protected” by CSS (the encryption scheme that DeCSS defeats) are the digital versions of the movies, and movies in broad distribution can’t be trade secrets.

In any case, I have never understood why the industry’s basic trade secret argument wasn’t laughed out of court. By the time Bunner got hold of DeCSS and re-published it, it was available at hundreds of places on the Net, and had been available for months. Anybody who cared to know this “trade secret” already knew it, through no fault of Bunner’s. (I filed a declaration to that effect with the original trial court.) The industry never claimed that Bunner did anything illegal to get the “trade secret”; nor did they even prove that anybody else had done anything illegal to get it.


Texas Super-DMCA Apparently Dead

Louis Trager at the Washington Internet Daily reports that the Texas Super-DMCA bill appears to be dead, as this year’s legislative session ended without any action on the bill. There is still a small risk that it will be considered in special session, but the governor’s office says he does not intend to call such a special session. The Texas legislature is not scheduled to meet at all in 2004, so the bill appears to be dead there until at least 2005.

This is another significant victory for Super-DMCA opponents, along with the veto of the Colorado bill and the withdrawal of the Tennessee and Oregon bills by their sponsors.

Trager quotes MPAA Vice President Vans Stevenson as saying that “Time is on our side. We have all the time in the world.”

Apparently MPAA will be patient, in the hope that opponents will tire of the struggle, or maybe in the hope of finding new opportunities to introduce stealth bills. That may be MPAA’s best hope, since the bills have fared poorly wherever open debate on their merits has been allowed.


Waldo on Standards

Jim Waldo (a Distinguished Engineer at Sun) has written two provocative blog entries about standardization. He argues that technical standards are a good idea when their purpose is to codify existing practice in the industry, but that it’s counterproductive for a standards group to try to invent new technology. I think he’s right.

When standards groups try to invent technology, they tend to do poorly, for two reasons. First, committees generally do a lousy job of designing anything; the best designs spring from the mind of a single person, or from a small group of like-minded people with a clear common goal. Second, standards groups can easily degenerate into political wrangling that, regardless of its pretextual substance, really amounts to a battle over which company’s product plans will be anointed as the standard – a failure mode that is much less likely when the only goal is to codify existing, widespread practices.

The worst case of all, of course, is when lawyers try to invent technology, by codifying their regulatory schemes as “standards.”


E-Voting Bill Introduced

My Congressman, Rep. Rush Holt, has introduced an important e-voting bill, H.R. 2239. The bill would address the serious concerns raised by a broad coalition of computer scientists (including me) about the security and trustworthiness of electronic voting systems.

The bill would do three main things. First, it would require that voting systems generate a paper trail that the voter can verify at the time he/she votes. Second, it would require the software used in voting machines to be open for public inspection. Third, it would institute random, surprise recounts in 0.5% of jurisdictions, as a quality control measure. The bill also contains safeguards to ensure that disabled voters can cast their votes.

The text of the bill is not yet on the House’s web site; I’ll post a link here when it becomes available. I have seen a preview copy of the bill, and I think it does an excellent job of ensuring that our transition to e-voting maintains the trustworthiness of our elections. I support it strongly, and I hope you will do so too.

UPDATE (10:55 AM, May 27): The bill’s text is now available.


Colorado Governor Vetoes Super-DMCA

Colorado governor Bill Owens has taken the Rocky Mountain News’ advice and vetoed his state’s Super-DMCA bill. Linda Seebach writes:

In his veto message [Owens] said the bill “could also stifle legal activity by entities all along the high tech spectrum, from manufacturers of communication parts to sellers of communication services.”

He urges the legislature, if it returns to this topic in the next session, “to be more careful in drafting a bill that adds protections that are rightfully needed, but does not paint a broad brush stroke where only a tight line is needed.”


Self-Destructing DVDs

Last week a company called FlexPlay announced Self-Destructing DVDs (SD-DVDs), which oxidize themselves – and so become unplayable – 48 hours after removal from their package. (The official name is, amusingly, “EZ-D”.) The idea is to provide the equivalent of a rental, while saving the consumer the trouble of returning the disk to the rental store afterwards.

This is an interesting kind of Digital Restrictions Management (DRM). Unlike most uses of DRM, this one does nothing to prevent copying or access to the disk. Consumers will be able to copy these DVDs as easily as any other DVDs. (Copying DVDs is often illegal, but many consumers are apparently willing to do it anyway.) SD-DVDs don’t do anything to make copying harder, and in fact their limited lifetime may create a new incentive to copy. While the use of DRM to (try to) control copying and access has gotten lots of attention, SD-DVDs are a nice illustration of the use of DRM to enable business models.

SD-DVDs may be a convenience for DVD-rental customers, but I doubt they will catch on, because consumers will find them offensive. Consumers hate planned obsolescence. The idea that a company would deliberately make a product worse, or make it wear out sooner than necessary, offends their sense of fairness. If Universal can press a regular DVD for one dollar, then why, ordinary consumers will ask, would they spend the same dollar to make a product that breaks? Fancy-pants economic arguments about efficiency and market segmentation won’t overcome this basic sense of unfairness.

Worse yet (and despite a claim to the contrary in FlexPlay’s press release), the nature of a chemical process like oxidation seems to imply that the disk’s decay will be gradual. Since DVDs use error correction, FlexPlay’s engineers can make the disk reliable for any desired period; but after that there will be an inevitable period of intermittent glitches as the disk gets worse and worse, until it becomes unusable. Seeing the decay, even if it lasts only for a short time, will only make consumers angrier.

The underlying problem is that because SD-DVDs will be sold for less than ordinary DVDs, they will draw consumers’ attention to the fact that ordinary DVDs are priced well above the marginal cost of producing them. That seems unfair to many consumers.

At this point, readers who are armchair economists (or real ones, for that matter) are raising their hands and bouncing in their seats, eager to point out that marginal-cost pricing isn’t sustainable in the movie business, given the high fixed cost of making a movie and the very low marginal cost of distributing a copy of it. That’s true, but I think consumers’ sense of fairness is based on a different kind of market in which variable costs of production dominate fixed costs.

As long as it seemed inherently expensive to manufacture and distribute a copy of a recorded movie, consumers tended not to notice that the copy was priced above marginal cost. As marginal cost approaches zero, the gap between marginal cost and price becomes much more apparent, and consumers increasingly conclude that the studios are ripping them off.

I see this as a big problem for the studios. The last thing they should want, at this point, is to introduce a product like the Self-Destructing DVD that heightens consumers’ sensitivity to “unfair” pricing.

UPDATE (12:25 PM): Eric Rescorla has an interesting follow-up about consumer psychology. He also points out, in a separate post, that it is possible, at least in theory, to make an SD-DVD that fails cleanly and suddenly, rather than gradually.


NYT and Google

Sunday’s New York Times ran a piece by Geoffrey Nunberg complaining about (among other things) the relative absence of major-press articles from the top ranks of Google search results. This has triggered online discussion of why the Times itself doesn’t get much Googlejuice. Speculation has centered on the fact that Times articles get moved to a pay-for-access archive.

The real explanation is simpler: the Times forbids Google from indexing its site.

There’s a web standard that allows sites to declare a web-crawler program persona non grata. A file called “robots.txt” gives a set of rules, written in a standardized language, saying which automated programs have permission to access which parts of the site. The Times’ robots.txt file forbids all web-crawler programs to visit the parts of the Times site where the articles are. Google’s policy is to honor the requests in robots.txt files; that’s why Times stories don’t show up on Google.
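For the curious, here is a minimal sketch of how that works, using Python’s standard robotparser module. The Disallow rules below are purely illustrative – they are not copied from the Times’ actual robots.txt file.

    from urllib.robotparser import RobotFileParser

    # Illustrative rules only -- not the Times' real robots.txt.
    # "User-agent: *" addresses every crawler; "Disallow" lists the
    # directories those crawlers are asked to stay out of.
    rules = [
        "User-agent: *",
        "Disallow: /archives/",
        "Disallow: /pages/",
    ]

    robots = RobotFileParser()
    robots.parse(rules)

    # A well-behaved crawler asks before fetching each page.
    article = "http://www.example.com/archives/some-story.html"
    front_page = "http://www.example.com/"
    print(robots.can_fetch("Googlebot", article))     # False: stay out
    print(robots.can_fetch("Googlebot", front_page))  # True: fine to index

A crawler that honors the answer, as Google’s does, simply never sees the disallowed pages, so they never enter the index.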


A Challenging Response to Challenge-Response

One of the trendy ideas these days is challenge-response (CR) anti-spam technologies. The idea is simple: incoming email is intercepted before you see it, and a “challenge” email is returned to the sender. If the sender replies to the challenge message, then the original message is forwarded on to you; otherwise it is discarded. The idea is to require some kind of human involvement in the sending of each message. Sometimes the sender has to answer some kind of puzzle that is supposed to be easy for people but hard for computers.
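Here is a minimal sketch of that flow, just to make the moving parts concrete. The message fields and the token scheme are hypothetical; real CR products differ in the details.

    # Hypothetical sketch of challenge-response filtering.
    pending = {}       # challenge token -> original message, held until confirmed
    confirmed = set()  # senders who have already answered a challenge

    def handle_incoming(msg, inbox, send_challenge):
        """Decide what happens to one incoming message."""
        sender = msg["from"]
        if sender in confirmed:
            inbox.append(msg)                  # known human: deliver right away
        elif msg.get("in_reply_to") in pending:
            confirmed.add(sender)              # the sender answered a challenge
            inbox.append(pending.pop(msg["in_reply_to"]))
        else:
            token = "challenge-%d" % len(pending)  # stand-in for a unique token
            pending[token] = msg                   # hold the original message
            send_challenge(sender, token)          # ask the sender to prove they're human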

Whenever we analyze a security technology – and that is what CR is – we need to look not only at the immediate effect of the technology, but also at how people will adapt to it. We need to look especially at how the bad guys will adapt. Will they adjust their attack strategy to defeat the new defense? Will the new defense create new opportunities for malicious attacks? Will the technology lead to an arms race between defenders and attackers? If so, can we predict the outcome of the arms race?

CR stands up poorly to this kind of analysis. To see why, suppose that Alice sends an email to Bob, and Bob is using CR. Bob’s computer sends a challenge message back to Alice and awaits her response. This challenge message had better get through to Alice; if it doesn’t, the whole scheme breaks down. If Alice is using anti-spam technology that blocks the challenge message, then she’ll never see the challenge – her original message won’t get through to Bob, and she won’t know what went wrong.

We can fix this problem by making sure that Alice’s anti-spam technology has a loophole for challenge messages, to make sure they are never blocked. (Note that although Bob is the one using CR, it is Alice who has to create the loophole.) If CR is going to succeed, most of the Alices out there will have to open the loophole. Messages with certain “challenge-ish” attributes will be mostly immune from spam controls.
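To see what the loophole looks like, here is a hypothetical sketch of Alice’s filter, assuming (purely for illustration) that it recognizes challenges by their subject line:

    import re

    # Hypothetical rule: anything that "looks like" a challenge skips the spam check.
    CHALLENGE_PATTERN = re.compile(r"please confirm your message", re.IGNORECASE)

    def alice_filter(msg, spam_score):
        """Return True if the message should reach Alice's inbox."""
        if CHALLENGE_PATTERN.search(msg["subject"]):
            return True                   # the loophole: challenges are never blocked
        return spam_score < 0.5           # everything else faces the usual spam test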

At this point, the bad guys’ response is obvious: create spam that can exploit the loophole, spam that looks like a challenge message. If they can do this, then CR will have made things worse – spam will pour in through the loophole.

We might try to solve this problem by narrowing the loophole, requiring the challenge messages to be so narrowly stylized that they cannot carry a spam. This too creates an opportunity for the spammers. If the challenges are so predictable, then the spammers will be able to develop computer programs that spot the challenges and auto-send the required responses. If they can do this, then the spammers can just add automated CR responses to their automated email-sending software, and continue to pollute our inboxes.
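A sketch of that adaptation, again assuming a hypothetical, rigidly formatted challenge:

    # Hypothetical: the challenge always asks for the same, predictable reply.
    def spammer_autoresponder(challenge):
        """Answer a stylized challenge with no human involvement at all."""
        if "reply to this message to confirm" in challenge["body"].lower():
            return {"to": challenge["from"], "body": "Confirmed."}
        return None    # unrecognized challenge: give up on this recipient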

Given all of this, I’m skeptical of CR as a response to spam. If you’re the first on your block to adopt CR, and if nobody else uses anti-spam technology, then CR might provide you some modest benefit. But it’s hard to see how CR can be widely successful in a world where most people use some kind of spam defense.