
Archives for February 2005

Forecast for Infotech Policy in the New Congress

Cameron Wilson, Director of the ACM Public Policy Office in Washington, looks at changes (made already or widely reported) in the new Congress and what they tell us about likely legislative action. (He co-writes the ACM U.S. Public Policy Blog, which is quite good.)

He mentions four hot areas. The first is regulation of peer-to-peer technologies. Once the Supreme Court’s decision in Grokster comes down, expect Congress to spring into action, to protect whichever side claims to be endangered by the decision. A likely focal point for this is the new Intellectual Property subcommittee of the Senate Judiciary Committee. (The subcommittee will be chaired by Sen. Orrin Hatch, who has not been shy about regulating infotech in the name of copyright. He championed the Induce Act.) This issue will start out being about P2P but could easily expand to regulate a wider class of technologies.

The second area is telecom. Sen. Ted Stevens is the new chair of the Senate Commerce Committee, and he seems eager to work on a big revision of the Telecom Act of 1996. This will be a battle royal involving many interest groups, and telecom policy wonks will be fully absorbed. Regulation of non-telecom infotech products seems likely to creep into the bill, given the technological convergence of telecom with the Internet.

The third area is privacy. The Real ID bill, which standardizes state driver’s licenses to create what is nearly a de facto national ID card, is controversial but seems likely to become law. The recent ChoicePoint privacy scandal may drive further privacy legislation. Congress is likely to do something about spyware as well.

The fourth area is security and reliability of systems. Many people on the Hill will want to weigh in on this issue, but it’s not clear what action will be taken. There are also questions over which committees have jurisdiction. Many of us hope that PITAC’s report on the sad state of cybersecurity research funding will trigger some action.

As someone famous said, it’s hard to make predictions, especially about the future. There will surely be surprises. About the only thing we can be sure of is that infotech policy will get even more attention in this Congress than in the last one.

More on Ad-Blocking

I’m on the road today, so I don’t have a long post for you. (Good news: I’m in Rome. Bad news: It’s Rome, New York.)

Instead, let me point you to an interesting exchange about copyright and ad-blocking software on my course blog, in which “Archer” opens with a discussion of copyright and advertising revenue, and Harlan Yu responds by asking whether distributing the Firefox AdBlock extension constitutes contributory infringement.

There’s plenty of interesting writing on the course blog. Check it out!

UPDATE (Feb. 28): Another student, “Unsuspecting Innocent,” has more on this topic.

Can P2P Nets Be Poisoned?

Christin, Weigend, and Chuang have an interesting new paper on corruption of files in P2P networks. Some files are corrupted accidentally (they call this “pollution”), and some might be corrupted deliberately (“poisoning”) by copyright owners or their agents. The paper measures the availability of popular, infringing files on the eDonkey, Overnet, Gnutella, and FastTrack networks, and simulates the effect of different pollution strategies that might be used.

The paper studied a few popular files for which corruption efforts were not occurring (or at least not succeeding). Polluted versions of these files do turn up, especially on FastTrack, but they aren’t much of a barrier to user access: non-corrupted files tend to have more replicas available than polluted ones do, and the networks return the most-replicated versions first.
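To make that ranking effect concrete, here is a minimal sketch in Python. This is my own illustration rather than anything from the paper, and the hashes and replica counts are made up.

```python
from collections import Counter

# Hypothetical search results: each peer sharing a copy reports the
# content hash of its version; clean and decoy versions of the "same"
# file have different hashes. All names and counts are invented.
reported_copies = (
    ["clean_hash"] * 120      # 120 peers share the uncorrupted version
    + ["decoy_hash_a"] * 15   # polluted versions with far fewer replicas
    + ["decoy_hash_b"] * 5
)

# Rank versions by replica count, most-replicated first, which is the
# ordering behavior described above.
for version, count in Counter(reported_copies).most_common():
    print(f"{version}: {count} replicas")
# clean_hash: 120 replicas
# decoy_hash_a: 15 replicas
# decoy_hash_b: 5 replicas
```

Because the clean version has many more replicas, it tops the list, and a typical user never gets down to the sparsely replicated decoys.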

They move on to simulate the effect of various pollution strategies. They conclude that a sufficiently sophisticated strategy, one that injects different decoy versions of a file at different times and many replicas of the same decoy at once, would significantly reduce user access to targeted files.
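As a rough illustration of why that strategy bites, here is a toy simulation, again my own sketch with invented numbers rather than the paper’s model: once a poisoner floods in decoy versions whose replica counts rival or exceed the clean file’s, the decoys crowd into the top of a replica-count ranking.

```python
import random
from collections import Counter

random.seed(0)

def simulate_downloads(clean_replicas, decoy_waves, replicas_per_decoy, trials=10_000):
    """Fraction of downloads that hit a decoy when users pick among the
    three most-replicated versions. A toy model, not the paper's."""
    counts = Counter({"clean": clean_replicas})
    # Each "wave" injects a fresh decoy version with many replicas at once.
    for wave in range(decoy_waves):
        counts[f"decoy_{wave}"] = replicas_per_decoy

    top3 = [version for version, _ in counts.most_common(3)]
    hits = sum(random.choice(top3).startswith("decoy") for _ in range(trials))
    return hits / trials

# No poisoning: every download gets the clean file.
print(simulate_downloads(clean_replicas=120, decoy_waves=0, replicas_per_decoy=0))    # 0.0
# Two decoy versions, each better replicated than the clean file:
# roughly two of every three downloads now hit a decoy.
print(simulate_downloads(clean_replicas=120, decoy_waves=2, replicas_per_decoy=150))
# Flooding harder pushes the clean file out of the top results entirely.
print(simulate_downloads(clean_replicas=120, decoy_waves=5, replicas_per_decoy=200))  # 1.0
```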

Some P2P programs use simple reputation systems to try to distinguish corrupted files from non-corrupted ones; the paper argues that these will be ineffective against the best pollution strategy it describes. But the authors also note that better reputation systems could detect even this sophisticated poisoning.
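To show the kind of simple, hash-level reputation scheme at issue, here is a hypothetical sketch, not drawn from any particular P2P client: downloaders vote on whether a version was intact, and the ranking discounts versions with bad votes. The weakness the paper points to shows up directly in the code: a freshly injected decoy has a new hash with no vote history, so it escapes the penalty.

```python
from collections import Counter, defaultdict

# Hypothetical hash-level reputation: +1 if a downloader found the file
# intact, -1 if it turned out to be corrupt. Not any real client's scheme.
votes = defaultdict(list)
votes["clean_hash"] += [1] * 40
votes["old_decoy_hash"] += [-1] * 30      # known-bad decoy, heavily downvoted

replica_counts = Counter({
    "clean_hash": 120,
    "old_decoy_hash": 200,
    "fresh_decoy_hash": 200,              # just injected: no votes yet
})

def score(version):
    """Replica count scaled by average vote; unknown versions get the
    benefit of the doubt, which is exactly what a poisoner exploits."""
    v = votes.get(version)
    reputation = sum(v) / len(v) if v else 1.0
    return replica_counts[version] * max(reputation, 0.0)

print(sorted(replica_counts, key=score, reverse=True))
# ['fresh_decoy_hash', 'clean_hash', 'old_decoy_hash']
# The old decoy is filtered out, but the fresh decoy still outranks the
# clean file, because the replica flood outruns the vote history.
```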

They don’t say anything more about the arms race between reputation technologies and pollution technologies. My guess is that in the long run reputation systems will win, and poisoning strategies will lose their viability. In the meantime, though, it looks like copyright owners have much to gain from poisoning.

[UPDATE (6:45 PM): I changed the second paragraph to eliminate an error that was caused by my misreading of the paper. Originally I said, incorrectly, that the study found little if any evidence of pollution for the files they studied. In fact, they chose those files because they were not subject to pollution. Thanks to Cypherpunk, Joe Hall, and Nicolas Christin for pointing out my error.]