November 25, 2024

Use a Firewall, Go to Jail

The states of Massachusetts and Texas are preparing to consider bills that apparently are intended to extend the national Digital Millennium Copyright Act. (TX bill; MA bill) The bills are obviously related to each other somehow, since they are textually similar.

Here is one example of the far-reaching harmful effects of these bills. Both bills would flatly ban the possession, sale, or use of technologies that “conceal from a communication service provider … the existence or place of origin or destination of any communication”. Your ISP is a communication service provider, so anything that concealed the origin or destination of any communication from your ISP would be illegal – with no exceptions.

If you send or receive your email via an encrypted connection, you’re in violation, because the “To” and “From” lines of the emails are concealed from your ISP by encryption. (The encryption conceals the destinations of outgoing messages, and the sources of incoming messages.)
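To see why encryption conceals those lines, consider a toy sketch (the key, message, and cipher here are illustrative assumptions; a real mail client would use TLS, e.g. smtplib's starttls()). Once the session is encrypted, the ISP carrying the bytes sees only ciphertext, and the "To" and "From" headers travel inside it:

```python
# Toy stream cipher (SHA-256 counter keystream) used purely to illustrate
# that encrypted email headers are invisible to the carrier. Not real TLS.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

message = b"From: alice@example.com\r\nTo: bob@example.org\r\n\r\nHi Bob"
key = b"session key negotiated out of band"  # hypothetical shared secret

wire_bytes = xor_encrypt(key, message)          # what the ISP actually sees
assert wire_bytes != message                    # headers not in the clear
assert xor_encrypt(key, wire_bytes) == message  # recipient can still decrypt
```

The carrier forwards `wire_bytes` without ever seeing the header lines, which is exactly the "concealment" the bills would prohibit.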

Worse yet, Network Address Translation (NAT), a technology widely used for enterprise security, operates by translating the “from” and “to” fields of Internet packets, thereby concealing the source or destination of each packet, and hence violating these bills. Most security “firewalls” use NAT, so if you use a firewall, you’re in violation.

If you have a home DSL router, or if you use the “Internet Connection Sharing” feature of your favorite operating system product, you’re in violation because these connection sharing technologies use NAT. Most operating system products (including every version of Windows introduced in the last five years, and virtually all versions of Linux) would also apparently be banned, because they support connection sharing via NAT.
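The mechanics of the violation are easy to see in a minimal sketch of NAT (addresses and the single-flow table here are illustrative assumptions; real gateways reuse mappings and handle timeouts):

```python
# Toy model of a NAT gateway rewriting packet headers, so the outside
# network never sees the internal machine's address.

PUBLIC_IP = "203.0.113.5"  # the router's one public address (example value)

class NatGateway:
    def __init__(self):
        self.table = {}         # public port -> (private ip, private port)
        self.next_port = 40000  # next public port to hand out

    def outbound(self, packet):
        """Rewrite a packet leaving the private network."""
        public_port = self.next_port
        self.next_port += 1
        self.table[public_port] = (packet["src_ip"], packet["src_port"])
        # The ISP sees only the router's address -- the true origin is concealed.
        return {**packet, "src_ip": PUBLIC_IP, "src_port": public_port}

    def inbound(self, packet):
        """Rewrite a reply arriving from the Internet."""
        private_ip, private_port = self.table[packet["dst_port"]]
        return {**packet, "dst_ip": private_ip, "dst_port": private_port}

nat = NatGateway()
out = nat.outbound({"src_ip": "192.168.1.10", "src_port": 5000,
                    "dst_ip": "198.51.100.7", "dst_port": 80})
assert out["src_ip"] == PUBLIC_IP             # origin concealed from the ISP
reply = nat.inbound({"src_ip": "198.51.100.7", "src_port": 80,
                     "dst_ip": PUBLIC_IP, "dst_port": out["src_port"]})
assert reply["dst_ip"] == "192.168.1.10"      # delivered to the real host
```

Every packet that crosses such a gateway has its true source or destination concealed from the provider, which is precisely what the bills' language forbids.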

And this is just one example of the problems with these bills. Yikes.

UPDATE (6:35 PM): It’s worse than I thought. Similar bills are on the table in South Carolina, Florida, Georgia, Alaska, Tennessee, and Colorado.

UPDATE (March 28, 9:00 AM): Clarified the paragraph above about encrypted email, to eliminate an ambiguity.

UPDATE: I now have a page with information about all of these bills, including the current status in each state.

Finkelstein Replies on ARDG and the Press

Seth Finkelstein replies to my previous posting on companies’ press policies by suggesting that companies are rational to keep their engineers away from the press, because of concerns about being unfairly misquoted.

I can see his point, but I think hatchet-job stories are pretty rare in the respectable media, and I also think that most readers recognize such stories and discount them. Reporters resent being manipulated and are more likely to seize on a misstatement if it is the only interesting thing you say. If you want them to write about substance, you have to talk to them about substance.

Seth’s example, the “Al Gore invented the Internet” story, is a good illustration. Gore’s organization was trying to manipulate the press, as all political campaigns do. Gore was available to the press mainly in highly scripted situations, so when he went off script and said something he shouldn’t have said, it was newsworthy.

(And though too much was made of Gore’s statement, he did say, “I took the initiative in creating the Internet”, which just isn’t true. Yes, Gore deserves credit for promoting the Internet before almost anyone else on Capitol Hill had even heard of it; and yes, he did take the initiative in funding the Internet at a crucial stage of its build-out. But there is a big difference between creating something and merely paying for a stage of its construction.)

NRC Report on Authentication Technology and Privacy

The authoritative National Research Council has issued an important new report entitled “Who Goes There?: Authentication Through the Lens of Privacy.” Like all NRC reports, this is an in-depth document reflecting the consensus of an impressive panel of experts.

Often people think of authorization (that is, ensuring that only authorized people get access to a resource) as antithetical to privacy, but this need not be true. One of the report’s findings is this:

Authorization does not always require individual authentication or identification, but most existing authorization systems perform one of these functions anyway. Similarly, a requirement for authentication does not always imply that accountability is needed, but many authentication systems generate and store information as though it were.

There are many ways to use authentication in designing systems, and a careful design can reduce the privacy cost that must be paid to achieve a given level of security. There is not a single “knob” that we can turn to trade off security against privacy, but a complex landscape in which we can hope to get more of both, if we choose wisely.
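One concrete design along these lines is a bearer capability: the server grants access to anyone presenting a valid token, without ever learning who the bearer is. This sketch is a toy illustration (the key and resource names are assumptions, not anything from the NRC report):

```python
# Authorization without identification: an HMAC-signed capability token
# names a resource, not a user, so access control requires no identity.
import hmac
import hashlib

SERVER_KEY = b"server-side secret"  # hypothetical signing key

def issue_token(resource: str) -> bytes:
    """Mint a capability for one resource; the token identifies no user."""
    return hmac.new(SERVER_KEY, resource.encode(), hashlib.sha256).digest()

def authorize(resource: str, token: bytes) -> bool:
    """Verify the capability; no identity is established or logged."""
    expected = hmac.new(SERVER_KEY, resource.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, token)

token = issue_token("printer")
assert authorize("printer", token)       # access granted, bearer anonymous
assert not authorize("scanner", token)   # token is scoped to one resource
```

The security/privacy trade-off here is a design choice, not a law of nature: the same gatekeeping is achieved with no identification at all, at the cost of features (like revoking one user's access) that would require more information.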

More on ARDG and the Press

I wrote yesterday about the ARDG’s policy of banning the press from the otherwise open ARDG meetings. Apparently the official rationale for this is that some companies refuse to allow the people who represent them at ARDG meetings to speak to the press.

I have to admit that I find these companies’ policies hard to understand. A company trusts somebody to speak on its behalf in a public forum, where many of the company’s competitors and customers are present, and where everybody is welcome to take notes. And yet somehow it is too dangerous to let that employee say the same things if a reporter is also present.

In my experience, companies that allow their best engineers to speak in public get more respect than ones that don’t. I can understand the desire to manage a company’s image, but reporters and the public have gotten pretty good at separating vacuous marketing-speak from substantive discussion, and at ignoring the former. You’re not doing yourself any favors by blocking access to the people who can best articulate your technical vision.

Microsoft’s approach to the Berkeley DRM conference is a great example of the benefits of letting your engineers speak. This was a large conference with many reporters present. Microsoft sent several senior engineers, who gave substantive presentations and engaged in real debate. What they said was not spin-free, of course, but whether you agreed or disagreed with their arguments, you had to respect them for participating in the debate.

James Grimmelmann’s definitive account of the Berkeley DRM conference has this to say:

… the [Microsoft] people at the conference are among the straightest shooters …. Compared with the other industry flacks

Leaks From CERT’s “Good Guys” List

Brian McWilliams at Wired News reports on the leakage of unreleased security alerts from the government-funded CERT coordination center. Three secret alerts sent to members of CERT’s “good guys” club (known as the Information Security Alliance, or ISA) were reposted onto the open “Full Disclosure” mailing list.

The person who did this may have violated a contractual agreement to keep the information secret. If so, the release can be condemned on that basis.

In any case, this incident teaches us some valuable lessons. First, the idea of releasing vulnerability information only to a large set of “good guys” doesn’t work in practice. What’s to stop a malicious person from joining the club? And remember, a serious bad guy wouldn’t release the information to the public but would exploit it himself, or release it only to his malicious friends.

Ironically, one of the secret alerts that was leaked was little more than an abstract of a paper published recently by Stanford University researchers. Given CERT’s non-profit, public-good mission, it’s hard to see why CERT withheld this report from the public, when the information on which it was based had already been released (and even discussed on Slashdot).

It’s worth noting that, having set up a system where it is paid to deliver security secrets to the ISA membership, CERT has an economic incentive to manufacture secrets or to increase their perceived value to ISA members by withholding the secrets from the public for longer than necessary. I have no reason to accuse CERT of doing this systematically, but its handling of the Stanford paper does raise questions.