Archives for March 2004

An Inexhaustible Supply of Bugs

Eric Rescorla recently released an interesting paper analyzing data on the discovery of security bugs in popular products. I have some minor quibbles with the paper’s main argument (and I may write more about that later) but the data analysis alone makes the paper worth reading. Briefly, what Eric did is to take data about reported security vulnerabilities, and fit it to a standard model of software reliability. This allowed him to estimate the number of security bugs in popular software products and the rate at which those bugs will be found in the future.

When a product version is shipped, it contains a certain number of security bugs. Over time, some of these bugs are found and fixed. One hopes that the supply of bugs is depleted over time, so that it gets harder (for both the good guys and the bad guys) to find new bugs.
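To make the fitting idea concrete, here is a minimal sketch (not Eric’s exact methodology or data) that fits a Goel-Okumoto exponential reliability model to made-up cumulative vulnerability counts; the numbers and starting parameter values are invented for illustration.

```python
# A minimal sketch (not Rescorla's methodology or data) of fitting a
# Goel-Okumoto reliability-growth model to cumulative vulnerability counts.
# The model assumes a finite pool of N bugs, each found independently at
# rate lam, so expected cumulative discoveries by time t are
#   mu(t) = N * (1 - exp(-lam * t)).

import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, N, lam):
    """Expected cumulative bug discoveries by time t."""
    return N * (1.0 - np.exp(-lam * t))

# Hypothetical data: months since release vs. cumulative vulnerabilities found.
months = np.array([1, 3, 6, 9, 12, 18, 24, 30, 36], dtype=float)
found = np.array([2, 7, 13, 18, 23, 31, 37, 42, 46], dtype=float)

(N_est, lam_est), _ = curve_fit(goel_okumoto, months, found, p0=[50.0, 0.05])
print(f"estimated bug pool N = {N_est:.0f}, discovery rate = {lam_est:.3f}/month")
```

With these made-up numbers the cumulative curve flattens, so the fitted pool size comes out only modestly larger than the number of bugs already found; that is what depletion would look like in the data.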

The first conclusion from Eric’s analysis is that there are many, many security bugs. This confirms the expectations of many security experts. My own rule of thumb is that typical release-quality industrial code has about one serious security bug per 3,000 lines of code. A product with tens of millions of lines of code will naturally have thousands of security bugs; at that rate, 30 million lines of code works out to roughly 10,000 serious bugs.

The second conclusion is a bit more surprising: there is little if any depletion of the bug supply. Finding and fixing bugs seems to have a small effect, or no effect at all, on the rate at which new bugs are discovered. It seems that the supply of security bugs is practically inexhaustible.

If true, this conclusion has profound implications for how we think about software security. It implies that once a version of a software product is shipped, there is nothing anybody can do to improve its security. Sure, we can (and should) apply software patches, but patching is just a treadmill and not a road to better security. No matter how many bugs we fix, the bad guys will find it just as easy to uncover new ones.

Suit Challenges Broadcast Flag

A lawsuit was filed last week, challenging the FCC’s Broadcast Flag decree. Petitioners include the American Library Association, several other library associations, the Consumer Federation of America, Consumers Union, the EFF, and Public Knowledge. Here is a court filing outlining the petitioners’ arguments.

A Spoonful of Sugar

Here’s a brilliant idea. A group at Carnegie Mellon University has created The ESP Game, in which a pair of strangers, shown a photographic image, are each asked to guess the single word that the other will use to characterize the image. Get it right and you score valuable points. For an extra challenge, sometimes there are “taboo words” that you aren’t allowed to use. Players report that the game is semi-addictive.

The brilliant part is that the game “tricks” its players into doing an important and incredibly time-consuming job. By playing the game, you’re helping to build a giant index that associates each image on the internet with a set of words that describe it. It’s well known that indexing and searching a set of images requires the time-consuming manual step of assigning descriptive words to each image. Labeling all of the images on the internet is an enormous amount of work. When you play the ESP Game, you’re shown images randomly chosen from the internet. You’re doing the time-consuming manual work to index the whole internet’s images – and enjoying it! So far the group has collected over two million labels.
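The mechanism is easy to state: a word counts as a label for an image only when both players independently propose it, and taboo words are disallowed. Here is a minimal sketch of that agreement rule; the function name and the sample round are my own illustration, not the CMU group’s code.

```python
# A minimal sketch of the ESP Game's agreement rule (my own illustration,
# not the CMU implementation): a word becomes a label for an image only
# when both players independently guess it and it is not a taboo word.

def agreed_label(guesses_a, guesses_b, taboo_words):
    """Return the first word both players guessed that is not taboo, else None."""
    taboo = {w.lower() for w in taboo_words}
    seen_b = {w.lower() for w in guesses_b}
    for word in guesses_a:
        w = word.lower()
        if w in seen_b and w not in taboo:
            return w
    return None

# Hypothetical round: both players eventually type "dog", so "dog" gets
# attached to the image as a human-verified label.
print(agreed_label(["animal", "dog"], ["puppy", "dog"], ["pet"]))  # dog
```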

Utah Anti-Spyware Bill

The Utah state legislature has passed an anti-spyware bill, which now awaits the governor’s signature or veto. The bill is opposed by a large coalition of infotech companies, including Amazon, AOL, AT&T, eBay, Microsoft, Verizon, and Yahoo.

The bill bans the installation of spyware on a user’s computer. The core of the bill is its definition of “spyware”, which includes both ordinary spyware (which captures information about the user and/or his browsing habits, and sends that information back to the spyware distributor) and adware (which displays uninvited popup ads on a user’s computer, based on what the user is doing). Leaving aside the adware parts of the definition, we’re left with this:

(4) Except as provided in Subsection (5), “spyware” means software residing on a computer that:

(a) monitors the computer’s usage;
(b) (i) sends information about the computer’s usage to a remote computer or server; or [adware stuff omitted]; and
(c) does not:

(i) obtain the consent of the user, at the time of, or after installation of the software but before the software does any of the actions described in Subsection (4)(b):

(A) to a license agreement:

(I) presented in full; and
(II) written in plain language;

(B) to a notice of the collection of each specific type of information to be transmitted as a result of the software installation; [adware stuff omitted]
and

(ii) provide a method:

(A) by which a user may quickly and easily disable and remove the software from the user’s computer;
(B) that does not have other effects on the non-affiliated parts of the user’s computer; and
(C) that uses obvious, standard, usual, and ordinary methods for removal of computer software.

(5) Notwithstanding Subsection (4), “spyware” does not include:

(a) software designed and installed solely to diagnose or resolve technical difficulties;
(b) software or data that solely report to an Internet website information previously stored by the Internet website on the user’s computer, including:

(i) cookies;
(ii) HTML code; or
(iii) Java Scripts; or

(c) an operating system.

Since all spyware is banned, this amounts to a requirement that programs meeting the criteria of 4(a) and 4(b) (except those exempted by 5) must avoid 4(c) by obtaining user consent and providing a suitable removal facility.
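To make the logical structure concrete, here is a small sketch of the definition as I read it; this is my own paraphrase of the statutory language, not legal advice, and the parameter names are invented for illustration.

```python
# A rough sketch (my own reading of the bill, not legal advice) of the
# definition's logic: software is "spyware" if it monitors usage and sends
# usage information to a remote computer, is not exempt under Subsection (5),
# and fails to both obtain consent and provide a suitable removal method.

def is_spyware(monitors_usage, sends_usage_info, exempt,
               obtained_consent, provides_removal):
    if exempt:                                    # Subsection (5) carve-outs
        return False
    meets_4a_4b = monitors_usage and sends_usage_info
    fails_4c = not (obtained_consent and provides_removal)
    return meets_4a_4b and fails_4c

# A program that monitors and reports usage stays outside the definition
# only by doing both: getting consent and offering easy removal.
print(is_spyware(True, True, False, True, True))   # False: compliant
print(is_spyware(True, True, False, True, False))  # True: no removal method
```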

The bill’s opponents claim the definition is overbroad and would cover many legitimate software services. If they’re right, it seems to me that the notice-and-consent requirement would be more of a burden than the removability requirement, since nearly all legitimate software is removable, either by itself or as part of a larger package in which it is embedded.

I have not seen specific examples of legitimate software that would be affected. A letter being circulated by opponents refers generically to “a host of important and beneficial Internet communication software” that gathers and communicates data that “may include information necessary to provide upgrade computer security, to protect against hacker attacks, to provide interactivity on web sites, to provide software patches, to improve Internet browser performance, or enhance search capabilities”. Can anybody think of a specific example in which it would be burdensome to obtain the required consent or to provide the required removal facility?

[Opponents also argue that the bill’s adware language is overbroad. That in itself may be enough to oppose the bill; but I won’t discuss that aspect of the bill here.]

Senate File Pilfering Report Released

The report of a preliminary investigation into the Senate file pilfering has been released (in two parts) by Senate Sergeant-at-Arms Bill Pickle.

The report mostly confirms what was reported previously: many files on the shared server were unprotected, so that anybody who knew how could get them; a clerk working for the Republican staff, under the direction of a senior Republican staffer, accessed more than 4,000 of the Democrats’ files; and some of the juiciest files were leaked to the press, probably by the aforementioned Republican staffer.

The report also contradicts some claims made previously. It is clear from the report that the availability of the files was not widely known. The report also shows that the people making the accesses worked to cover their tracks, both during and after the time when the accesses occurred. It also appears that the Republican staff member who oversaw the accesses made false statements to the investigators.

I wrote before that it wasn’t clear whether the accesses violated the Computer Fraud and Abuse Act (CFAA). The key question in applying the CFAA to these facts was whether the staffers were “entitled to” access the particular files they downloaded; and the answer to that question depends on the rules and practices of the Senate.

The issue still isn’t clear-cut, but the facts recounted in the report tend to tip the balance toward violation of the CFAA. The accessors’ efforts to cover their tracks, both during and after the accesses, are revealing. And the report tells how the clerk, on initially discovering the files were accessible, took a pile of printed-out opposition files to one of his supervisors, who shredded the files and “admonished [the clerk] not to use the … documents”. These facts, plus the apparent false statements made to the investigators, tend to support the argument that the clerk and the staffer knew that the accesses were improper.

The report makes no recommendation for or against a referral of the CFAA matter to the Justice Department. That decision is in the hands of the Senators.