
Archives for July 2005

What is Spyware?

Recently the Anti-Spyware Coalition released a document defining spyware and related terms. This is an impressive-sounding group, convened by the Center for Democracy and Technology (CDT) and including companies like HP, Microsoft, and Yahoo.

Here is their central definition:

Spyware and Other Potentially Unwanted Technologies

Technologies implemented in ways that impair users’ control over:

  • Material changes that affect their user experience, privacy, or system security
  • Use of their system resources, including what programs are installed on their computers
  • Collection, use and distribution of their personal or otherwise sensitive information

These are items that users will want to be informed about, and which the user, with appropriate authority from the owner of the system, should be able to easily remove or disable.

What’s interesting about this definition is that it’s not exactly a definition – it’s a description of things that users won’t like, along with assertions about what users will want, and what users should be able to do. How is it that this impressive group could only manage an indirect, somewhat vague definition for spyware?

The answer is that spyware is a surprisingly slippery concept.

Consider a program that lurks on your computer, watching which websites you browse and showing you ads based on your browsing history. Such a program might be spyware. But if you gave your informed consent to the program’s installation and operation, then public policy shouldn’t interfere. (Note: informed consent means that the consequences of accepting the program are conveyed to you fully and accurately.) So behaviors like monitoring and ad targeting aren’t enough, by themselves, to make a program spyware.

Now consider the same program, which comes bundled with a useful program that you want for some other purpose. The two programs are offered only together, you have to agree to take them both in order to get either one, and there is no way to uninstall one without uninstalling the other too. You give your informed consent to the bundle. (Bundling can raise antitrust problems under certain conditions, but I’ll ignore that issue here.) The company offering you the useful program is selling it for a price that is paid not in dollars but in allowing the adware to run. That in itself is no reason for public policy to object.

What makes spyware objectionable is not the technology, but the fact that it is installed without informed consent. Spyware is not a particular technology. Instead, it is any technology that is delivered via particular business practices. Understanding this is the key to regulating spyware.

Sometimes the software is installed with no consent at all. Installing and running software on a user’s computer, without seeking consent or even telling the user, must be illegal under existing laws such as the Computer Fraud and Abuse Act. There is no need to change the law to deal with this kind of spyware.

Sometimes “consent” is obtained, but only by deceiving the user. What the user gets is not what he thinks he agreed to. For example, the user might be shown a false or strongly misleading description of what the software will do; or important facts, such as the impossibility of uninstalling a program, might be withheld from the user. Here the issue is deception. As I understand it, deceptive business practices are generally illegal. (If spyware practices are not illegal, we may need to expand the legal rules against business deception.) What we need from government is vigilant enforcement against companies that use deceptive business practices in the installation of their software.

That, I think, is about as far as the law should go in fighting spyware. We may get more anti-spyware laws anyway, as Congress tries to show that it is doing something about the problem. But when it comes to laws, more is not always better.

The good news is that we probably don’t need complicated new laws to fight spyware. The laws we have can do enough – or at least they can do as much as the law can hope to do.

(If you’re not running an antispyware tool on your computer, you should be. There are several good options. Spybot Search & Destroy is a good free spyware remover for Windows.)

HD-DVD Requires Digital Imprimatur

Last week I wrote about the antitrust issues raised by the use of encryption to “protect” content. Here’s a concrete example.

HD-DVD, one of the two candidates for the next-generation DVD format, uses a “content protection” technology called AACS (Advanced Access Content System). And AACS, it turns out, requires a digital imprimatur on any content before it can be published.

(The imprimatur – the term is Latin for “let it be printed” – was an early technology of censorship. The original imprimatur was a stamp of approval granted by a Catholic bishop to certify that a work was free from doctrinal or moral error. In some times and places, it was illegal to print a work that didn’t have an imprimatur. Today, the term refers to any system in which a central entity must approve works before they can be published.)

The technical details are in the AACS Pre-recorded Video Book Specification. The digital imprimatur is called a “content certificate” (see p. 5 for an overview), and is created “at a secure facility operated by [the AACS organization]” (p. 8). It is forbidden to publish any work without an imprimatur, and player devices are forbidden to play any work that lacks one.

Like the original imprimatur, the AACS one can be revoked retroactively. AACS calls this “content revocation”. Every disc that is manufactured is required to carry an up-to-date list of revoked works. Player devices are required to keep track of which works have been revoked, and to refuse to play revoked works.
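
To make the mechanics concrete, here is a minimal sketch, in Python, of the check a compliant player would have to perform before playback. The data layout and function names are invented for illustration; the real formats, and the actual signature scheme, are defined in the AACS specification.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ContentCertificate:
    content_id: str       # hypothetical identifier for the work
    signed_data: bytes
    signature: bytes

@dataclass
class Disc:
    content_certificate: Optional[ContentCertificate]
    content_revocation_list: set = field(default_factory=set)

def verify_signature(public_key, signed_data: bytes, signature: bytes) -> bool:
    """Stand-in for the real cryptographic check; details omitted here."""
    raise NotImplementedError

def can_play(disc: Disc, aacs_public_key, revoked: set) -> bool:
    cert = disc.content_certificate
    if cert is None:
        return False  # no imprimatur: a compliant player refuses to play
    # The certificate must bear a valid signature from the central AACS
    # organization, not from the replicator that pressed the disc.
    if not verify_signature(aacs_public_key, cert.signed_data, cert.signature):
        return False
    # Every disc carries an up-to-date revocation list; the player folds it
    # into its stored list and refuses to play revoked works, retroactively.
    revoked.update(disc.content_revocation_list)
    if cert.content_id in revoked:
        return False
    return True
```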

The AACS documents avoid giving a rationale for this feature. The closest they come to a rationale is a statement that the system was designed so that “[c]ompliant players can authenticate that content came from an authorized, licensed replicator” (p. 1). But the system as described does not seem designed for that goal – if it were, the disc would be signed (and the signature possibly revoked) by the replicator, not by the central AACS organization. Also, the actual design replaces “can authenticate” by “must authenticate, and must refuse to play if authentication fails”.

The goal of HD-DVD is to become the dominant format for release of movies. If this happens, the HD-DVD/AACS imprimatur will be ripe for anticompetitive abuses. Who will decide when the imprimatur will be used, and how? Apparently it will be the AACS organization. We don’t know how that organization is run, but we know that its founding members are Disney, IBM, Intel, Microsoft, Panasonic, Sony, Toshiba, and Warner Brothers. A briefing on the AACS site explains the “AACS Structure” by listing the founders.

I hope the antitrust authorities are watching this very closely. I hope, too, that consumers are watching and will vote with their dollars against this kind of system.

Controlling Software Updates

Randy Picker questions part of the computer science professors’ Grokster brief (of which I was a co-signer), in which we wrote:

Even assuming that Respondents have the right and ability to deliver such software to end users, there can be no way to ensure that software updates are installed, and stay installed. End users ultimately have control over which software is on their computers. If an end user does not want a software update, there is no way to make her take it.

This point mattered because Hollywood had suggested that Grokster should have used its software-update facility to deploy filtering software. (Apparently there is some dispute over whether Grokster had such a facility. I don’t know who is right on that factual question.)

Picker wonders whether ordinary users can really exercise this control in practice. As he notes, the user can disconnect from the net, but that’s too high a price for most people to pay. So how can users prevent updates?

The easiest method is simply to write-protect the program’s files or directories, so that they can’t be changed. Alternatively, the user can make a backup copy of the software (perhaps by copying it to another directory) and restore the backup when an update is installed.
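
For the curious, here is a rough sketch of both methods in Python, assuming a Unix-like system. The install and backup paths are invented for illustration.

```python
import os
import shutil
import stat

PROGRAM_DIR = "/opt/exampleapp"          # hypothetical install location
BACKUP_DIR = "/home/user/exampleapp.bak"

def write_protect(path: str) -> None:
    """Clear the write bits on every file under path, so the program
    (including its own updater) cannot modify its installation."""
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            full = os.path.join(dirpath, name)
            mode = os.stat(full).st_mode
            os.chmod(full, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

def back_up(path: str = PROGRAM_DIR, backup: str = BACKUP_DIR) -> None:
    """Save a known-good copy of the installed program."""
    shutil.copytree(path, backup, dirs_exist_ok=True)

def restore(backup: str = BACKUP_DIR, path: str = PROGRAM_DIR) -> None:
    """Undo an unwanted update by restoring the saved copy."""
    shutil.rmtree(path)
    shutil.copytree(backup, path)
```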

Standard system security tools are also useful for controlling automatic updates. Autonomously self-updating programs look a lot like malicious code – the program code changes on its own (like a virus infection); the program makes network connections to odd places at odd times (like spyware); the program downloads and installs code without asking the user (like a malicious bot). Security tools specialize in identifying and blocking such behaviors, and the tools are reasonably configurable. Personal firewalls, for example, can block a program from making unapproved network connections. Some firewalls even do this by default.
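
Here, for illustration, is roughly the decision logic such a firewall reduces to: a toy per-program allowlist in Python. The program names and hosts are made up.

```python
# Toy model of a personal firewall's per-program allowlist. A real
# firewall hooks the network stack; the decision rule is about this.

ALLOWED = {
    "browser.exe": {"*"},                    # user approved all connections
    "musicplayer.exe": {"cdn.example.com"},  # approved one media host only
}

def permit_connection(program: str, host: str) -> bool:
    approved = ALLOWED.get(program, set())
    if "*" in approved or host in approved:
        return True
    # Anything else is blocked (or triggers a prompt to the user), which
    # is what stops a program's unapproved update check from phoning home.
    return False

assert permit_connection("browser.exe", "update.vendor.example")
assert not permit_connection("musicplayer.exe", "update.vendor.example")
```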

Finally, a skilled person can figure out how to patch the program to disable the auto-update feature. He can then encapsulate this knowledge in a simple tool, so that other users can disable their auto-update by downloading the tool and double-clicking it. (This tool may violate copyright by modifying the program; but if we trusted users to obey copyright law we wouldn’t be having this conversation.)
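
As a purely hypothetical illustration, such a tool might be as simple as a script that overwrites the bytes of the program’s update check with no-ops. Every detail below (the file name, the byte patterns) is invented; a real tool would use offsets and patterns found by reverse-engineering the specific program.

```python
# Hypothetical one-click patcher: find the known byte sequence at the
# program's update check and replace it so the check never runs.

PROGRAM = "p2papp.exe"                          # invented file name
UPDATE_CHECK = bytes.fromhex("e8 12 34 56 78")  # x86 call to the updater
NO_OP_PATCH  = bytes.fromhex("90 90 90 90 90")  # five NOP instructions
assert len(UPDATE_CHECK) == len(NO_OP_PATCH)

def disable_auto_update(path: str = PROGRAM) -> bool:
    with open(path, "rb") as f:
        data = f.read()
    offset = data.find(UPDATE_CHECK)
    if offset < 0:
        return False  # unknown program version; refuse to guess
    patched = data[:offset] + NO_OP_PATCH + data[offset + len(UPDATE_CHECK):]
    with open(path, "wb") as f:
        f.write(patched)
    return True
```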

The bottom line is that in computer security, possession is nine-tenths of control. Whoever has physical control of a computer ultimately controls what software is installed and runs on it. And users have physical control of their PCs.

A followup question is whether you can program the software to shut itself off if the user blocks updates for too long. As far as I know, nobody is claiming that Grokster had such a capability, but in principle a P2P system could be designed to (try to) work that way. This raises interesting issues too, but I’m approaching my word count limit so I’ll have to address them another day.