April 26, 2024

What Color Is My Hat?

An article by Rob Lemos at news.com discusses the differences between “white hat,” “gray hat,” and “black hat” hackers. The article lists me as a gray hat.

In my book, there is no such thing as a gray hat. If you break into a computer system without the owner’s permission, or if you infringe a copyright, then your hat is black. Otherwise your hat is white.

This article, like so many others, tries to pin the “gray hat” image on anyone whose actions make a technology vendor unhappy. That’s why the article classifies me as a gray hat – because my research made the RIAA unhappy.

As a researcher, my job is not to make vendors happy. My job is to discover the truth and report it. If the truth makes a vendor look good, that’s great. If the truth makes a vendor look bad, so be it.

Comments on White House Cybersecurity Plan

As a computer security researcher and teacher, I was interested to see the White House’s draft cybersecurity plan. It looks to be mostly harmless, but there are a few things in it that surprised me.

First, I was surprised at the strong focus on issues late in the product lifecycle. Security is an issue throughout the life of a product, from the initial conception of the product through its design, implementation, revision, use, and maintenance. The usual rule of thumb is that an ounce of prevention is worth a pound of cure – that attention to security early in the lifecycle makes a big difference later.

Despite this, the White House plan emphasizes remediation late in the lifecycle, over prevention earlier in the lifecycle. There is much discussion of intrusion detection, management, application of patches, and training of users; and not so much discussion of how products could be made more secure “out of the box.”

In the short run, these late-lifecycle methods are necessary, because it is too late to redo the early lifecycles of the products we are using today. But in the long run a big part of the answer has to lie in better product design, a goal to which the plan gives some lip service but not much concrete support.

The second surprise was the section on higher education (pp. 33-34 if you’re reading along at home).

Cybersecurity is a big mess, and there is plenty of blame to go around. You would expect the plan, as a political document, to avoid direct criticism of anyone, but instead to accentuate the positive by pointing to opportunities for improvement rather than inadequate performance. Indeed, that is the tone of most of the plan.

Universities alone seem to come in for direct criticism, having “many insecure systems” that “have been … exploited by hackers” thereby “[placing] other sectors at risk.” Contrast this with the section on “large enterprises” (pp. 19-22). Universities “have been” exploited; large enterprises “can be” exploited. Universities “place other sectors at risk”; large enterprises “can play a unique role in developing resiliency.”

But the biggest surprise in the higher education section is that there is no mention of the fact that computer security education and research are taking place at universities. The discussions of other stakeholders are careful to genuflect to those sectors’ worthy training and research efforts, but the higher education section is strangely silent. This despite the fact that many of the basic technologies whose adoption the report urges were invented at universities. (Think, for instance, of public key crypto.)

This general lack of attention to the educational system is evident elsewhere in the report too. Consider discussion point D4-12 (emphasis added):

How can government and private industry establish programs to identify early students with a demonstrated interest in and/or talent for IT security work, encourage and develop their interest and skills, and direct them into the workforce?

That’s what we do at America’s schools and universities: we help students identify their interests and talents, we encourage and develop those interests and skills, and ultimately we help students direct themselves into the workforce. On the whole I think we do a pretty good job of it. We’re happy to have the help of government and industry, but it’s a bit dismaying to see this identified as somebody else’s job.

White House Cybersecurity Plan: On Life Support?

The White House’s “National Strategy to Secure Cyberspace,” initially slated for release on Wednesday, has been delayed, the Washington Post reports. This comes on the heels of the removal of some of the report’s proposals, and a leak of the draft proposal.

It looks like the report will end up as an eloquent expression of good intentions, coupled with few if any effective action items. Once the decision was made that the report would be changed to make all of the stakeholders happy, this result became inevitable. There are just too many agendas in play to reach any kind of consensus on this issue.

This is not necessarily a bad thing. The government can improve the security of its own systems, but there is little it can do to make ordinary non-government computing more secure. Our main problem is that the market doesn’t reward vendors for investing the large amounts of time and money necessary to build highly secure systems. There isn’t much the government can do to change that.

ABC News Hires "Hackers" to Disrupt Police

ABC News reports on their own hiring of “hackers” to disrupt the Huntington Beach, CA police department. (Start reading at the “Testing the system” heading.)

They tried to trick an officer into leaving his post to investigate a false “emergency.” They tried to infect the Chief’s computer with a virus. (Fortunately, neither of these attacks ended up working; but it wasn’t for lack of trying.)

What was ABC News thinking? Trying to disrupt a working police department, which the citizens were relying upon to cope with any real emergencies that developed, was an amazingly irresponsible thing to do.

The article implies, but does not directly say, that the police department consented to this test, but was kept in the dark about which day it would occur. If so, then the people at the police department need their heads examined just as badly as the people at ABC News do.

I’m all in favor of testing critical systems, but not by mounting surprise attacks on the systems that ordinary citizens’ lives depend upon.

[Link credit: disLEXia]

Serious Linux Worm

News.com reports on a new worm infecting Linux/Apache servers. (A “worm” is a malicious standalone program that propagates on its own, without requiring any human action.)

A new worm that attacks Linux Web servers has compromised more than 3,500 machines, creating a rogue peer-to-peer network that has been used to attack other computers with a flood of data, security experts said Saturday.

It was only a matter of time before this happened. Linux in particular, and open-source software in general, are not immune to malware such as worms and viruses. Linux has gotten a free pass for a while, because malware developers, like all software developers, tend to target their code for the most popular platforms. Now that Linux is so popular on servers, it becomes a more natural target for malware.

Of course, whoever did this is a criminal and deserves to be punished.

If there is a silver lining here, it is that this serves as a wake-up call for those who view the poor state of computer security as a “Microsoft problem” or a “closed-source problem.” All software is riddled with bugs, and all security-critical software is riddled with security-critical bugs. We just don’t know how to build large, complex programs without them. Rather than pointing the finger at others, who might or might not have a few more bugs than we do, we all need to figure out how to do radically better than any of us are doing today.