Will they ever learn? Hollywood still pursuing DRM

In today’s New York Times, we read that Hollywood is working on a grand unified video DRM scheme intended to allow for video portability: when you’re staying in a hotel room, for example, you’d like to have your videos with you.

What’s sad, of course, is that you can have all of this today with very little fuss. I use iTiVo to extract videos from my TiVo, transcoding them to an iPhone-compatible format. I similarly use Fairmount to rip DVDs to my hard drive, making them easy to play later without worrying about the physical media getting damaged or lost. But when it comes to downloading video, there is no easy way to get non-DRM content. BitTorrent gives access to many things, including my favorite Top Gear, which I cannot get through any other channel, but many things I’d like aren’t available, and of course, there’s the whole legality issue.

I recently bought a copy of Disney/Pixar’s Up on Blu-ray, which includes a “Digital Copy” of some sort. The discs themselves are rippable as well (even the Blu-ray), so I haven’t bothered to sort out how the “Digital Copy” works.

(UPDATE: the disc contains Windows and Mac executables which will ask the user for an “activation code” which is then sent to a Disney server which responds with some sort of decryption key. The resulting file is then installed in iTunes or Windows Media Player with their native DRM restrictions. The Disney server, of course, wants you to set up an account, and they’re working up some sort of YouTube-ish streaming experiences for movies where you’ve entered an activation code.)
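Out of curiosity about the mechanics, here is a rough sketch of what that activation exchange presumably amounts to. To be clear, the endpoint URL, field names, and response format below are all invented for illustration; only the general shape (activation code in, decryption key out) comes from the observed behavior described above.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical sketch of the "Digital Copy" activation flow described
# above. The URL and field names are invented, not Disney's actual API.
def redeem_activation_code(code: str) -> bytes:
    body = urllib.parse.urlencode({"activation_code": code}).encode()
    with urllib.request.urlopen(
            "https://activation.example.com/redeem", data=body) as resp:
        reply = json.load(resp)
    # The installer would hand this key material to iTunes or Windows
    # Media Player, which wrap the file in their native DRM.
    return bytes.fromhex(reply["decryption_key"])
```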

So what exactly are the Hollywood types cooking up? There are no technical details in the article, but the broad idea seems to be that you authenticate as yourself from any device, anywhere, and then the central server will let you at “your” content. It’s unclear to what extent they have an offline viewing story, such as watching on your laptop on an airplane. One would imagine they would download an encrypted file, perhaps customized for you, along with a dedicated video player that keeps the key material hidden away through easily broken, poorly conceived mechanisms.
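To see why “hide the key inside the player” schemes tend to be easily broken, consider a deliberately naive sketch. This is entirely hypothetical, not any vendor’s actual scheme: the player “protects” the content key by XOR-ing it with a mask, but both constants ship inside the binary, so anyone with a disassembler can recompute the key exactly as the player does.

```python
# Hypothetical key-hiding scheme of the easily broken sort described
# above. Both constants live in the player binary; an attacker who
# extracts them recovers the content key the same way the player does.
OBFUSCATION_MASK = bytes.fromhex("deadbeefdeadbeefdeadbeefdeadbeef")
STORED_BLOB      = bytes.fromhex("0f1e2d3c4b5a69788796a5b4c3d2e1f0")

def recover_content_key() -> bytes:
    # The "secret" key is just STORED_BLOB XOR OBFUSCATION_MASK.
    return bytes(a ^ b for a, b in zip(STORED_BLOB, OBFUSCATION_MASK))

print(recover_content_key().hex())
```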

It’s not like we haven’t been here before. I just wonder if we’ll have a repeat of the ill-fated SDMI challenge.

2009 Predictions Scorecard

As usual, we’ll kick off the new year by reviewing the predictions we made for the previous year. Here now, our 2009 predictions, in italics, with hindsight in ordinary type.

(1) DRM technology will still fail to prevent widespread infringement. In a related development, pigs will still fail to fly.

By tradition this is our first prediction, and it has always been accurate. Guess what our first 2010 prediction will be? Verdict: right.

(2) Patent reform legislation will come closer to passage in this Congress, but will ultimately fail as policymakers wait to determine the impact of the Bilski case’s apparent narrowing of business model patentability.

Everyone agrees that patent reform is needed, but no specific bill is close to passage, and everyone is waiting for the Supreme Court’s Bilski decision. Verdict: right.

(3) As lawful downloading of music and movies continues to grow, consumer satisfaction with lossy formats will decline, and higher-priced options that offer higher fidelity will begin to predominate. At least one major online music service will begin to offer music in a lossless format.

People seem to accept lossy formats. Verdict: wrong.

(4) The RIAA’s “graduated response” initiative will sputter and die because ISPs are unwilling to cut off users based on unrebutted accusations. Lawsuits against individual end-user infringers will quietly continue.

“Graduated response” has gotten lots of talk but hasn’t had much of a practical impact yet. Verdict: mostly right.

(5) The DOJ will bring criminal actions against big-time individual copyright infringers based on data culled from the server logs of a large “private” BitTorrent community.

I don’t think this happened. Verdict: wrong.

(6) Questions over the enforceability of free / open source software licenses will move closer to resolution.

Debate continued, but I don’t recall any major legal rulings on this issue. Verdict: mostly wrong.

(7) NebuAd and the regional ISPs recently sued for deploying NebuAd’s advertising system will settle with the class action plaintiffs for an undisclosed sum. At least in part because of the lawsuit and settlement, no U.S. ISP will deploy a new NebuAd/Phorm-like system in 2009. Meanwhile, Phorm will continue to be successful with privacy regulators in the UK and will sign up reluctant ISPs there who are facing competitive pressure. Activists will raise strong objections to no avail.

NebuAd is now dead and Phorm appears to be in trouble. US ISPs steered clear of them after strong pushback from consumers and legislators. Phorm seemed to have some preliminary deals with ISPs in the UK, but it appears that they have not yet had a wide deployment there (since an early pilot program in 2007). Verdict: mostly right.

(8) The federal Court of Appeals for the Ninth Circuit will hear oral argument in the case of U.S. v. Lori Drew, the Megan Meier/MySpace prosecution. By year’s end, the Ninth Circuit panel still will not have issued a decision, although after oral argument, the pundits will predict a 3-0 or 2-1 reversal of the conviction.

The Drew case did not reach the Ninth Circuit, because the original trial judge set aside the jury’s guilty verdict. Verdict: wrong.

(9) As a result of the jury’s guilty verdict in U.S. v. Lori Drew, dozens of plaintiffs will file civil lawsuits in 2009 alleging violations of the federal Computer Fraud and Abuse Act premised on the theory that one can “exceed authorized access” or act “in excess of authorization” by violating Terms of Service. Thankfully, the Department of Justice won’t bring any other criminal cases premised on this theory, at least not until it sees how the Ninth Circuit rules.

Despite worries, we didn’t see many such lawsuits. The DoJ did not bring criminal cases. Verdict: mostly wrong.

(10) The Computer Fraud and Abuse Act (CFAA) will be the new DMCA. Many will argue that the law needs to be reformed, but this argument will struggle to gain traction with the lay public, notwithstanding the fact that lay users face potential liability for routine behaviors due to CFAA overbreadth.

This hasn’t happened, at least not yet. There are concerns about the CFAA, but the issue hasn’t gotten traction. Verdict: mostly wrong.

(11) An academic security researcher will face prosecution under the CFAA, anti-wiretapping laws, or other computer intrusion statutes for violations that occurred in the process of research.

Thankfully, this didn’t happen. Verdict: wrong.

(12) An affirmative action lawsuit will be filed against a university, challenging the use of a software algorithm used in evaluating applicants.

Verdict: wrong.

(13) There will be lots of talk about net neutrality but no new legislation, as everyone waits to see how the Comcast/BitTorrent issue plays out in the courts.

There has been lots of talk but no legislation passed. The main action on net neutrality seems to be in the FCC. Verdict: right.

(14) The Obama administration will bring an atmosphere of antitrust enforcement to the IT industry, but no major cases will be brought in 2009.

The atmosphere is indeed more pro-enforcement. We almost made it to the end of the year without a major case being filed, but then the FTC brought a case against Intel in mid-December. Verdict: mostly right.

(15) The new administration will be seen as trying to “reboot” the FCC.

There are certainly changes at the FCC, but not a full-on reboot. Verdict: mostly wrong.

(16) One of the major American voting system manufacturers (Diebold/Premier, Sequoia, ES&S, or Hart InterCivic) will go out of business or be absorbed into one of its rivals.

ES&S bought Premier. This was one of our best calls. Verdict: right.

(17) The federal voting machine certification regime will increasingly be seen as a failure. States will strengthen their own certification processes, and at least one major state will stop requiring federal certification. The failure of the federal process to certify systems or software patches in a timely fashion will be cited as a reason for this move.

Consensus is growing that the certification regime is expensive and ineffective. But not much has changed on the state level. Verdict: mostly wrong.

(18) Estonia and other countries will continue experimenting in real elections with online or mobile phone voting. They will claim that these trials are successful because “nothing went wrong.” Security analysts will continue to claim that these systems are fundamentally flawed and will continue to be ignored. Exactly the same thing will continue to happen with U.S. overseas and military voters.

Verdict: right.

(19) We’ll see the first clear-cut evidence of a malicious attack on a voting system fielded in a state or local election. This attack will exploit known flaws in a “toe in the water” test and vendors will say they fixed the flaw years ago and the new version is in the certification pipeline.

Thankfully, this didn’t happen. Verdict: wrong.

(20) U.S. federal government computers will suffer from at least one high-profile compromise by a foreign entity, leaking a substantial amount of classified or highly sensitive information abroad.

Such a breach probably happened (what are the odds that such a large number of computers could be secured continuously for a year?), but I don’t recall a “high-profile” compromise. Verdict: mostly wrong.

(21) There will be one or more major Internet outages attributed to attacks on DNS, BGP, or other Internet plumbing that is immediately labeled an act of “cyber-warfare” or “cyber-terrorism.” The actual cause will be found to be the action of spammers or other professional Internet miscreants.

Thankfully, such attacks did not happen. Verdict: wrong.

(22) Present flaws in the web’s Certification Authority process, such as the MD5 issue or the leniency of some CAs in issuing certificates, will lead to regulation of the CA process. Among other things, there will be calls for restrictions on which CAs can issue certs for which Top Level Domains.

The CA process does have serious problems, but regulators have not stepped in. Verdict: mostly wrong.

(23) One or more major Internet services or top-tier network providers will experience prolonged failures and/or unrecoverable data loss severe enough that the company’s president ends up testifying before Congress about it.

The closest thing to this kind of failure was Danger’s loss of customer data. In the end, most of the data was recoverable, and no Congressional testimony occurred. Verdict: mostly wrong.

(24) Shortly after the start of the new administration, the TSA will quietly phase out the ban on flying with liquids or stop enforcing it in practice. The color-coded national caution levels (which have remained at “orange” forever) will be phased out.

Practical enforcement of the liquid ban became spotty. We still had to separate our liquid baggie at the checkpoint, but in practice the TSA almost never complained about larger containers left in carry-ons. Of course, all this may have changed due to the attempted attack last week. The color-coded caution levels remained in place. Verdict: mostly wrong.

(25) All 20 of the top 20 U.S. newspapers by circulation will experience net reductions in their newsroom headcounts in 2009. At least 15 of the 20 will see weekday circulation decline by 15% or more over the course of the year. By the end of the year, at least one major U.S. city will lack a daily newspaper.

This one is tough to check exhaustively. Preliminary research shows headcount reductions at all major papers. Circulation fell at all of the top-20 papers that reported figures, but not by as much as we predicted. About half saw a drop of 10% or more, but only about a quarter saw a drop of 15% or more. We’re not sure if there’s a major U.S. city without a daily — it probably depends on what counts as “major”. On the whole, things were bad but not quite as bad as we predicted. Verdict: mostly right.

(26) Advertising spending in older media will plummet, but online ad spending will be roughly level, as advertisers warm to online ads whose performance is more easily measured. Traditional media will be forced to offer advertisers fire sale prices, and the ratio of content to advertising in many traditional media outlets will increase.

This one is hard to evaluate but is consistent with anecdotal reports. Verdict: mostly right.

(27) An embarrassing leak of personal data will emerge from one or more of the social networking firms (e.g., Facebook), leading Congress to consider legislation that probably won’t solve the problem and will never actually reach the floor for a vote.

We didn’t see an accidental leak, but Facebook’s privacy changes late in the year are a lesser version of what we predicted. It’s not clear yet what if anything Congress will do. Verdict: mostly wrong.

(28) Facebook will be sold for $4 billion and Mark Zuckerberg will step down as CEO.

Verdict: wrong.

(29) Web 2.0 startups will not be hammered by the economic downturn. In fact, web 2.0 innovation may prove to be countercyclical. Costs are controllable: today’s workstyles don’t require lavish office space, marketing can be viral, and pay-as-you-go computing services eliminate the need for big upfront investments in infrastructure. Laid-off big-company workers and refugees from the financial world will keep skilled wages low. The surge in innovation will be real, but its effects will mostly be felt in future years.

It’s hard to judge this, but I think it was right. Innovation continued to be robust, even though investment capital was (relatively) scarce. Verdict: right.

(30) The Blu-ray format will increasingly be seen as a failure as customers rely more on online streaming.

Blu-ray growth has been disappointing, but streaming of movies has not shown as much growth as we predicted. Verdict: mostly right.

(31) Emboldened by Viacom’s example against Time Warner, TV network owners will increasingly demand higher payments from cable companies with the threat of moving content online instead. Cable companies will attempt to more heavily limit the content that network owners can host on Hulu and other sites.

Verdict: right.

(32) The present proliferation of incompatible set-top boxes that aim to connect your TV to the Internet will lead to the establishment of a huge industry consortium with players from three major interest groups (box builders, content providers, software providers), reminiscent of the now-defunct SDMI consortium, and with many of the same members. In 2009, they will generate a variety of press releases but will accomplish nothing.

An initiative called DECE tried to do exactly this, with the predicted results. Verdict: right.

(33) A hot Christmas item will be a cheap set-top box that allows normal people to download, organize, and view video and audio podcasts in their own living rooms. This product will work with all of the major free online sources of audio and video, and a few of the paid sources.

This prediction made a certain amount of market sense but, realistically, there would have been no way to negotiate the necessary permissions. Verdict: wrong.

(34) Internet Explorer’s usage share will fall below 50 percent for the first time in a decade, spurred by continued growth of Firefox and Safari and deals with OEMs to pre-load Google Chrome.

IE’s share is falling but is still above 60%. Chrome didn’t get many (any?) OEM deals. Verdict: mostly wrong.

(35) Somebody besides Apple will sell an iPod clone that’s a drop-in replacement for a real iPod, complete with support for iTunes DRM, video playback, and so forth. Apple will sue (or threaten to sue), but won’t be able to stop distribution of this product.

Verdict: wrong.

(36) Apple will release a netbook, which will be a souped-up iPhone with an 8″ screen and folding keyboard. It will sell for $899.

This didn’t happen. Instead, we will apparently get the long-rumored Apple tablet computer, a souped-up iPhone with an 8–10″ screen, selling for $800 or so. Verdict: wrong.

(37) No white space devices will be approved for use by the FCC. Submitted spectrum sensing devices will fare well in both laboratory and field tests, but approval will be delayed politically by the anti-white space lobby.

Verdict: right.

(38) More and more Internet traffic will be encrypted, as concern grows about eavesdropping, content modification, filtering, and security attacks.

Concern about traffic modification is growing, but we haven’t seen much growth in encryption. Verdict: mostly wrong.

The bottom line: 9 right, 6 mostly right, 12 mostly wrong, 11 wrong. Our goal was to make bolder predictions than in previous years, and we did succeed in that respect. Our accuracy suffered accordingly. Interesting, provocative predictions are rarely safe, but still I would have preferred a higher success rate.

Stay tuned for our 2010 predictions.

Search Neutrality ≠ Net Neutrality

Sunday’s New York Times featured a provocative op-ed arguing that, in addition to regulating “net neutrality,” the FCC should also effectuate “search neutrality”: requiring that search providers rank results without regard to business relationships. The author heaps particular scorn upon Google for promoting its own context-relevant services (e.g., maps and weather) at the fore of search results. Others have already reviewed the proposal, leveled implementation critiques, and criticized the author’s gripes with his own site. My aim here is to rebut the piece’s core argument: the analogy of search neutrality to net neutrality. Clearly both are debates about the promotion of innovation and competition through a level playing field. But beyond this commonality the parallel breaks down.

Net neutrality advocates call for regulation because ISP discrimination could render innovative services either impossible to implement owing to traffic restrictions or too expensive to deploy owing to traffic pricing. Consumers cannot “vote with their dollars” for a nondiscriminatory ISP since most locales have few providers and the market is hard to break into. Violations of net neutrality, the argument goes, threaten to nip entire industries in the bud and rob the economy of growth.

Violations of search neutrality, on the other hand, at most increase marketing costs for an innovative or competitive offering. Consumers are more than clever enough to seek and use an alternative to a weaker Google offering (Yelp vs. Google restaurant reviews, anyone?). The author of the op-ed cites Google Maps’ dethroning of MapQuest as evidence of the power of search non-neutrality; on the contrary, I would contend users flocked to Google’s service because it was, well, better. If Google Maps featured MapQuest’s clunky interface and vice versa, would you use it? A glance at historical map site statistics empirically rebuts the author’s claim. The mid-May 2007 introduction of Google’s context-relevant (“universal”) search does not appear correlated with any irregular shift in map site traffic.

Moreover, unlike with net neutrality, search consumers stand ready to “vote with their [ad] dollars.” Should Google consistently favor its own services to the detriment of search result quality, consumers can effortlessly shift to any of its numerous competitors. It is no coincidence that Google sinks enormous manpower into improving result quality.

There may also be a benefit to the increase in marketing costs from existing violations of search neutrality, like Google’s map and weather offerings. If a service would have to be extensively marketed to compete with Google’s promoted offering – say, a current weather site vs. searching for “Stanford weather” – the market is sending a signal that consumers don’t care about the marginal quality of the product, and the non-Google provider should quit the market.

There is merit to the observation that violations of search neutrality are, on the margin, slightly anti-competitive. But this issue is dwarfed by the potential economy-scale implications of net neutrality. The FCC should keep its rulemaking focused on net neutrality and leave search rankings alone.

Open Government Workshop at CITP

Here at Princeton’s CITP, we have a healthy interest in issues of open government and government transparency. With the release last week of the Open Government Directive by the Obama Administration, our normally gloomy winter may prove to be considerably brighter.

In addition to creating tools like RECAP and FedThread, we’ve also been thinking deeply about the nature of open and transparent government: how system designers and architects can better create transparent systems, and how to achieve sustainability in open government. Related to these questions are those of the law.gov effort—providing open access to primary legal materials—and how best to facilitate the tinkerers who work on open government projects.

These are deep issues, so we thought it best to organize a workshop and gather people from a variety of perspectives to dig in.

If you’re interested, come to our workshop next month! While we didn’t consciously plan it this way, the last day of this workshop corresponds to the first 45-day deadline under the OGD.

Open Government: Defining, Designing, and Sustaining Transparency
January 21–22, 2010
http://citp.princeton.edu/open-government-workshop/

Despite increasing interest in issues of open government and governmental transparency, the values of “openness” and “transparency” have been under-theorized. This workshop will bring together academics, government, advocates and tinkerers to examine a few critical issues in open and transparent government. How can we better conceptualize openness and transparency for government? Are there specific design and architectural needs and requirements placed upon systems by openness and transparency? How can openness and transparency best be sustained? How should we change the provision and access of primary legal materials? Finally, how do we best coordinate the supply of open government projects with the demand from tinkerers?

Anil Dash, Director of the AAAS’ new Expert Labs, will deliver the keynote. We are thrilled with the diverse list of speakers, and are looking forward to a robust conversation.

The workshop is free and open to the public, although we ask that you RSVP so that we can be sure to have a name tag and lunch for you.

Erroneous DMCA notices and copyright enforcement, part deux

A few weeks ago, I wrote about a deluge of DMCA notices and pre-settlement letters that CoralCDN experienced in late August. This article actually received a bit of press, including MediaPost, ArsTechnica, TechDirt, and, very recently, Slashdot. I’m glad that my own experience was able to shed some light on the more insidious practices that are still going on under the umbrella of copyright enforcement. More transparency is especially important at this time, given the current debate over the Anti-Counterfeiting Trade Agreement.

Given this discussion, I wanted to write a short follow-on to my previous post.

The VPA drops Nexicon

First and foremost, I was contacted by the founder of the Video Protection Alliance not long after this story broke. I was informed that the VPA has not actually developed its own technology to discover users who are actively uploading or downloading copyrighted material, but rather contracts this role out to Nexicon. (You can find a comment from Nexicon’s CTO on my previous article here.) As I was told, the VPA was contracted by certain content publishers to help reduce copyright infringement of (largely adult) content. The VPA in turn contracted Nexicon to find IP addresses participating in BitTorrent swarms of the specified movies. Using the IP addresses given to it by Nexicon, the VPA would then send pre-settlement letters to the network providers of those addresses.

The VPA’s founder also assured me that their main goal was to reduce infringement, as opposed to collecting pre-settlement money. (And that users had been let off with only a warning, or, in the cases where infringement might have been due to an open wireless network, informed how to secure their wireless network.) He also expressed surprise that there were false positives in the addresses given to them (beyond said open wireless), especially to the extent that appropriate verification was lacking. Given this new knowledge, he stated that the VPA dropped their use of Nexicon’s technology.

BitTorrent and Proxies

Second, I should clarify my claims about BitTorrent’s usefulness with an on-path proxy. While it is true that the address registered with the BitTorrent tracker is not usable, peers connecting from behind a proxy can still download content from other addresses learned from the tracker. If their requests to those addresses are optimistically unchoked, they have the opportunity to even engage in incentivized bilateral exchange. Furthermore, the use of DHT- and gossip-based discovery with other peers—the latter is termed PEX, for Peer EXchange, in BitTorrent—allows their real address to be learned by others. Thus, through these more modern discovery means, other peers may initiate connections to them, further increasing the opportunity for tit-for-tat exchanges.
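For concreteness, both tracker responses and PEX messages carry peers in BitTorrent’s “compact” encoding: 6 bytes per IPv4 peer, a 4-byte address followed by a 2-byte big-endian port. A short decoding sketch:

```python
import socket
import struct

def decode_compact_peers(blob: bytes) -> list[tuple[str, int]]:
    """Decode BitTorrent's compact peer list: 6 bytes per peer,
    a 4-byte IPv4 address followed by a 2-byte big-endian port.
    This format appears in tracker responses and PEX messages."""
    return [
        (socket.inet_ntoa(blob[i:i + 4]),
         struct.unpack(">H", blob[i + 4:i + 6])[0])
        for i in range(0, len(blob), 6)
    ]

# Example: one peer at 192.0.2.5, port 6881.
print(decode_compact_peers(bytes([192, 0, 2, 5, 0x1A, 0xE1])))
```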

Some readers also pointed out that there is good reason why BitTorrent trackers do not just accept any IP address communicated to them via an HTTP query string, but rather use the endpoint IP address of the TCP connection. Namely, any HTTP query parameter can be spoofed, so anybody could add someone else’s IP address to the tracker list. That would make the spoofed party susceptible to receiving DMCA complaints, just as we experienced with CoralCDN. From a more technical perspective, the victim’s machine would also start receiving unsolicited TCP connection requests from other BitTorrent peers, an easy DoS amplification attack.
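To make the spoofing risk concrete, here is what a forged announce could look like if trackers honored the optional ip parameter. The tracker hostname, peer_id, and infohash below are made up; the point is that the ip field is entirely attacker-chosen.

```python
import urllib.parse

# A forged tracker announce. The optional "ip" parameter is entirely
# attacker-chosen, which is why trackers ignore it and record the TCP
# endpoint address instead. Hostname, peer_id, and infohash are dummies.
info_hash = urllib.parse.quote_from_bytes(b"\x12" * 20)
victim_ip = "203.0.113.7"   # any address the spoofer wants listed

announce_url = (
    "http://tracker.example/announce"
    f"?info_hash={info_hash}"
    "&peer_id=-XX0001-000000000001"
    "&port=6881"
    f"&ip={victim_ip}"
)
print(announce_url)
```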

That said, there are some additional checks that BitTorrent trackers could perform. For example, if an ip= query string or X-Forwarded-For HTTP header is present, the tracker could record the network-level address only when it matches the address claimed in those fields. Additionally, some BitTorrent tracker operators have mentioned that they whitelist certain IP addresses as trusted proxies; in those cases, the X-Forwarded-For address is already used. Otherwise, I don’t see a good reason (plausible deniability aside) for recording an IP address that is known to be likely incorrect.
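A minimal sketch of that tracker-side decision, assuming hypothetical names (this is illustrative logic, not quoted from any real tracker codebase):

```python
def address_to_record(conn_ip: str, query_ip: str | None,
                      xff: str | None,
                      trusted_proxies: set[str]) -> str | None:
    """Decide which peer address a tracker should record, following
    the checks sketched above. Names and structure are illustrative."""
    if conn_ip in trusted_proxies and xff:
        # Connection arrived via a proxy we trust: the first
        # X-Forwarded-For entry is the real client address.
        return xff.split(",")[0].strip()
    if (query_ip and query_ip != conn_ip) or (xff and xff != conn_ip):
        # A claimed address that disagrees with the TCP endpoint is
        # unverifiable; recording either one risks false accusations.
        return None
    return conn_ip  # No conflicting claims: record the endpoint.
```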

Best Practices for Online Technical Copyright Enforcement

Finally, my article pointed out a strategy that I clearly thought was insufficient for copyright enforcement: simply crawling a BitTorrent tracker for a list of registered IP addresses, and issuing an infringement notice to each IP address. I’ll add to that two other approaches that I think are either insufficient, unethical, or illegal—or all three—yet have been bandied about as possible solutions.

  • Wiretapping: It has been suggested that network providers could perform deep-packet inspection (DPI) on their customers’ traffic in order to detect copyrighted content. This approach probably breaks a number of laws (either in the U.S. or elsewhere), creates both a dangerous precedent and ready-made infrastructure for far-flung Internet surveillance, and is of dubious benefit anyway given the move toward encrypted communication by file-sharing software.
  • Spyware: By surreptitiously installing spyware/malware on end-hosts, one could scan a user’s local disk in order to detect the existence of potentially copyrighted material. This practice has even worse legal and ethical implications than network-level wiretapping, and yet politicians such as Senator Orrin Hatch (Utah) have gone as far as declaring that infringers’ computers should be destroyed. And it opens users up to the real danger that their computers or information could be misused by others; witness, for example, the security weaknesses of China’s Green Dam software.

So, if one starts from the position that copyrights are valid and should be enforceable—some dispute this—what would you like to see as best practices for copyright enforcement?

The approach taken by DRM is to build a technical framework that restricts users’ ability to share content or to consume it in a proscribed manner. But DRM has been largely disliked by end-users, mostly because it creates a poor user experience and interferes with expected rights (under the fair-use doctrine). DRM is also somewhat beside the point here: copyright infringement notices are needed precisely after “unprotected” content has already flown the coop.

So I’ll start with two properties that I would want all enforcement agencies to adopt when issuing DMCA take-down notices. Let’s restrict this consideration to complaints about “whole” content (e.g., entire movies), as opposed to DMCA challenges over sampled or remixed content, which raise a separate legal debate.

  • For any end client suspected of file-sharing, one MUST verify that the client was actually uploading or downloading content, AND that the content corresponded to a valid portion of a copyrighted file. In BitTorrent, this might mean that the client sends or receives a complete file block, and that the block belongs to a piece that hashes to the correct value specified in the .torrent file (see the sketch after this list).
  • When issuing a DMCA take-down notice, the request MUST be accompanied by logged information that shows (a) the client’s IP:port network address engaged in content transfer (e.g., a record of a TCP flow); (b) the actual application request/response that was acted upon (e.g., BitTorrent-level logs); and (c) that the transferred content corresponds to a valid file block (e.g., a BitTorrent hash).
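For the first requirement, the hash check itself is straightforward. In BitTorrent, content is verified at the granularity of pieces, whose 20-byte SHA-1 digests are concatenated in the .torrent file’s pieces field. A minimal sketch of that verification step:

```python
import hashlib

def piece_is_valid(pieces_field: bytes, piece_index: int,
                   piece_data: bytes) -> bool:
    """Verify received content against the .torrent metadata.

    `pieces_field` is the 'pieces' value from the torrent's info
    dictionary: a concatenation of 20-byte SHA-1 digests, one per
    piece. A minimal sketch of the check in the first requirement
    above.
    """
    expected = pieces_field[20 * piece_index : 20 * (piece_index + 1)]
    return hashlib.sha1(piece_data).digest() == expected
```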

So my question to the readers: What would you add to or remove from this list? With what other approaches do you think copyright enforcement should be performed or incentivized?