December 25, 2024

OLPC Review Followup

Last week’s review of the One Laptop Per Child (OLPC) machine by twelve-year-old “SG” was one of our most-commented-upon posts ever. Today I want to follow up on a few items.

First, the machine I got for SG was the B2 (Beta 2) version of the OLPC system, which is not the latest. Folks from the OLPC project suggest that some of the problems SG found are fixed in the latest version. They have graciously offered to send an up-to-date OLPC machine for SG to review. SG has agreed to try out the new machine and review it here on Freedom to Tinker.

Second, I was intrigued by the back-and-forth in the comments over SG’s gender. I had originally planned to give SG a pseudonym that revealed SG’s gender, but a colleague suggested that I switch to a gender-neutral pseudonym. Most commenters didn’t seem to assume one gender or the other. A few assumed that SG is a boy, which generated some pushback from others who found that assumption sexist. My favorite comment in this series was from “Chris,” who wrote:

Why are you assuming the review was written by a boy?
At 12 we’re only two years from 8th grade level, the rumored grail (or natural default) of our national publications. SG, you’re clearly capable of writing for most any publication in this country, you go girl! (even if you are a boy)

Third, readers seem to be as impressed as I was by the quality of SG’s writing. Some found it hard to believe that a twelve-year-old could have written the post. But it was indeed SG’s work. I am assured that SG’s parents did not edit the post but only suggested in general terms the addition of a paragraph about what SG did with the machine. I suggested only one minor edit to preserve SG’s anonymity. Otherwise what you read is what SG wrote.

Though sentences like “My expectations for this computer were, I must admit, not very high” seem unusual for a twelve-year-old, others show a kid’s point of view. One example: “Every time you hit a key, it provides a certain amount of satisfaction of how squishy and effortless it is. I just can’t get over that keyboard.”

SG is welcome to guest blog here in the future. Kids can do a lot, if we let them.

Sony-BMG Sues Maker of Bad DRM

Major record company Sony-BMG has sued the company that made some of the dangerous DRM (anti-copying) software that shipped on Sony-BMG compact discs back in 2005, according to an Antony Bruno story in Billboard.

Longtime Freedom to Tinker readers will remember that back in 2005 Sony-BMG shipped CDs that opened security holes and invaded privacy when inserted into Windows PCs. The CDs contained anti-copying software from two companies, SunnComm and First4Internet. The companies’ attempts to fix the problems only made things worse. Sony-BMG ultimately had to recall some of the discs, and faced civil suits and government investigations that were ultimately settled. The whole episode must have cost Sony-BMG many millions of dollars. (Alex Halderman and I wrote an academic paper about it.)

One of the most interesting questions about this debacle is who deserved the blame. SunnComm and First4Internet made the dangerous products, but Sony-BMG licensed them and distributed them to the public. It’s tempting to blame the vendors, but the fact that Sony-BMG shipped two separate dangerous products has to be part of the calculus too. There’s plenty of blame to go around.

As it turned out, Sony-BMG took most of the public heat and shouldered most of the financial responsibility. That was pretty much inevitable considering that Sony-BMG had the deepest pockets, was the entity that consumers knew, and had by far the most valuable brand name. The lawsuit looks like an attempt by Sony-BMG to recoup some of its losses.

The suit will frustrate SunnComm’s latest attempt to run from its past. SunnComm had renamed itself as Amergence Group and was trying to build a new corporate image as some kind of venture capitalist or start-up incubator. (This isn’t SunnComm’s first change of direction – the company started out as a booking agency for Elvis impersonators. No, I’m not making that up.) The suit and subsequent publicity won’t help the company’s image any.

The suit itself will be interesting, if it goes ahead. We have long wondered exactly what Sony knew and when, as well as how the decision to deploy the dangerous technology was made. Discovery in the lawsuit will drag all of that out, though it will probably stay behind closed doors unless the case makes it to court. Sadly for the curious public, a settlement seems likely. SunnComm/Amergence almost certainly lacks the funds to fight this suit, or to pay the $12 million Sony-BMG is asking for.

More California E-Voting Reports Released; More Bad News

Yesterday the California Secretary of State released the reports of three source code study teams that analyzed the source code of e-voting systems from Diebold, Hart InterCivic, and Sequoia.

All three reports found many serious vulnerabilities. It seems likely that computer viruses could be constructed to infect any of the three systems, spread between voting machines, and steal votes on the infected machines. All three systems use central tabulators (machines at election headquarters that accumulate ballots and report election results) that can be penetrated without great effort.

It’s hard to convey the magnitude of the problems in a short blog post. You really have to read through the reports – the shortest one is 78 pages – to appreciate the sheer volume and diversity of severe vulnerabilities.

It is interesting (at least to me as a computer security guy) to see how often the three companies made similar mistakes. They misuse cryptography in the same ways: using fixed unchangeable keys, using ciphers in ECB mode, using a cyclic redundancy code for data integrity, and so on. Their central tabulators use poorly protected database software. Their code suffers from buffer overflows, integer overflow errors, and format string vulnerabilities. They store votes in a way that compromises the secret ballot.
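To make the cyclic-redundancy-code point concrete, here is a minimal sketch in C. The vote-record format and variable names are hypothetical inventions of mine, not any vendor’s actual code. The underlying problem is that a CRC is keyless: anyone who can rewrite the data can also recompute a matching checksum, so it detects accidents but not tampering. Genuine integrity protection requires a keyed MAC (such as HMAC) whose key the attacker does not hold.

```c
/* A minimal sketch (not vendor code) of why a CRC is the wrong tool
 * for data integrity: the checksum requires no secret, so an attacker
 * who can alter the data can also recompute a matching CRC. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Standard bitwise CRC-32 (reflected form, polynomial 0xEDB88320). */
static uint32_t crc32(const uint8_t *buf, size_t len) {
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= buf[i];
        for (int k = 0; k < 8; k++)
            crc = (crc >> 1) ^ (0xEDB88320u & (uint32_t)-(int32_t)(crc & 1));
    }
    return ~crc;
}

int main(void) {
    /* Hypothetical stored vote record "protected" only by a CRC. */
    char record[64] = "candidate=A;votes=100";
    uint32_t checksum = crc32((const uint8_t *)record, strlen(record));

    /* An attacker rewrites the record and, needing no secret at all,
     * simply recomputes the checksum to match. */
    strcpy(record, "candidate=B;votes=100");
    checksum = crc32((const uint8_t *)record, strlen(record));

    /* The verifier cannot distinguish the forgery from the original. */
    int ok = crc32((const uint8_t *)record, strlen(record)) == checksum;
    printf("integrity check %s on forged record\n", ok ? "PASSES" : "fails");
    return 0;
}
```

Compiled with any C compiler, this prints that the integrity check PASSES on the forged record, which is exactly the failure mode: the check catches transmission glitches, not adversaries.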

Some of these are problems that the vendors claimed to have fixed years ago. For example, Diebold claimed (p. 11) in 2003 that its use of hard-coded passwords was “resolved in subsequent versions of the software”. Yet the current version still uses at least two hard-coded passwords – one is “diebold” (report, p. 46) and another is the eight-byte sequence 1,2,3,4,5,6,7,8 (report, p. 45).
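For readers who haven’t seen the anti-pattern, here is a minimal sketch of what a hard-coded password looks like in C; the names and the check logic are hypothetical, not Diebold’s actual code. The key problems: the same secret ships on every machine, it cannot be changed without a software update, and it can often be recovered from the binary with a tool as simple as strings(1).

```c
/* A minimal sketch of the hard-coded-password anti-pattern, not
 * Diebold's actual code; SUPERVISOR_PASSWORD and supervisor_login
 * are hypothetical names. */
#include <string.h>

/* Baked in at build time: identical on every machine in the fleet,
 * and visible to anyone who can read the executable. */
static const char SUPERVISOR_PASSWORD[] = "12345678";

int supervisor_login(const char *entered) {
    /* One shared, unrotatable secret guards every unit. */
    return strcmp(entered, SUPERVISOR_PASSWORD) == 0;
}
```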

Similarly, Diebold in 2003 ridiculed (p. 6) the idea that their software could suffer from buffer overflows: “Unlike a Web server or other Internet enabled applications, the code is not vulnerable to most ‘buffer overflow attacks’ to which the authors [Kohno et al.] refer. This form of attack is almost entirely inapplicable to our application. In the limited number of cases in which it would apply, we have taken the steps necessary to ensure correctness.” Yet the California source code study found several buffer overflow vulnerabilities in Diebold’s systems (e.g., issues 5.1.6, 5.2.3 (“multiple buffer overflows”), and 5.2.18 in the report).
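The buffer overflows at issue follow a classic C pattern: attacker-controlled input (say, a field read from a memory card or ballot file) copied into a fixed-size buffer with no bounds check. Here is a minimal sketch with hypothetical function and field names, alongside the standard fix.

```c
/* A minimal sketch of the vulnerability class, not Diebold's actual
 * code; process_candidate and its input are hypothetical. */
#include <stdio.h>
#include <string.h>

void process_candidate(const char *field) {
    char name[32];
    strcpy(name, field);      /* VULNERABLE: a field longer than 31 bytes
                                 overruns the buffer and can overwrite
                                 adjacent stack memory, including the
                                 return address */
    printf("candidate: %s\n", name);
}

void process_candidate_fixed(const char *field) {
    char name[32];
    /* Bounded copy with guaranteed NUL termination. */
    snprintf(name, sizeof name, "%s", field);
    printf("candidate: %s\n", name);
}
```

The related format string vulnerabilities arise when code passes such a field directly as the format argument – printf(field) instead of printf("%s", field) – letting crafted input read or write memory.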

As far as I can tell, major news outlets haven’t taken much notice of these reports. That in itself may be the most eloquent commentary on the state of e-voting: reports of huge security holes in e-voting systems are barely even newsworthy any more.

Where are the California E-Voting Reports?

I wrote Monday about the California Secretary of State’s partial release of reports from the state’s e-voting study. Four subteams submitted reports to the Secretary, but as yet only the “red team” and accessibility teams’ reports have been released. The other two sets of reports, from the source code review and documentation review teams, are still being withheld.

The Secretary even held a public hearing on Monday about the study, without having released all of the reports. This has led to a certain amount of confusion, as many press reports and editorials (e.g. the Mercury News editorial) about the study seem to assume that the full evaluation results have been reported. The vendors and some county election officials have encouraged this misimpression – some have even criticized the study for failing to consider issues that are almost certainly addressed in the missing reports.

With the Secretary having until Friday to decide whether to decertify any e-voting systems for the February 2008 primary election, the obvious question arises: Why is the Secretary withholding the other reports?

Here’s the official explanation, from the Secretary’s site:

The document review teams and source code review teams submitted their reports on schedule. Their reports will be posted as soon as the Secretary of State ensures the reports do not inadvertently disclose security-sensitive information.

This explanation is hard to credit. The study teams were already tasked to separate their reports into a public body and a private appendix, with sensitive exploit-oriented details put in the private appendix that would go only to the Secretary and the affected vendor. Surely the study teams are much better qualified to determine the security implications of releasing a particular detail than the lawyers in the Secretary’s office are.

More likely, the Secretary is worried about the political implications of releasing the reports. Given this, it seems likely that the withheld reports are even more damning than the ones released so far.

If the red team reports, which found multiple vulnerabilities of the most serious kind, are the good news, how bad must the bad news be?

UPDATE (2:45 PM EDT, August 2): The source code review reports are now up on the Secretary of State’s site. They’re voluminous so I won’t be commenting on them immediately. I’ll post my reactions tomorrow.

California Study: Voting Machines Vulnerable; Worse to Come?

A major study of three e-voting systems, commissioned by the California Secretary of State’s office, reported Friday that all three had multiple serious vulnerabilities.

The study examined systems from Diebold, Hart InterCivic, and Sequoia; each system included a touch-screen machine, an optical-scan machine, and the associated backend control and tabulation machine. Each system was studied by three teams: a “red team” did a hands-on study of the machines, a “source code team” examined the software source code for the system, and a “documentation team” examined documents associated with the system and its certification. (An additional team studied the accessibility of the three systems – an important topic but beyond the scope of this post.)

(I did not participate in the study. An early press release from the state listed me as a participant but that was premature. I ultimately had to withdraw before the study began, due to a scheduling issue.)

So far only the red team (and accessibility) reports have been released, which makes one wonder what is in the remaining reports.

The bottom-line paragraph from the red team overview says this (section 6.4):

The red teams demonstrated that the security mechanisms provided for all systems analyzed were inadequate to ensure accuracy and integrity of the election results and of the systems that provide those results.

The red teams all reported having inadequate time to fully plumb the systems’ vulnerabilities (section 4.0):

The short time allocated to this study has several implications. The key one is that the results presented in this study should be seen as a “lower bound”; all team members felt that they lacked sufficient time to conduct a thorough examination, and consequently may have missed other serious vulnerabilities. In particular, Abbott’s team [which studied the Diebold and Hart systems] reported that it believed it was close to finding several other problems, but stopped in order to prepare and deliver the required reports on time. These unexplored avenues are presented in the reports, so that others may pursue them. Vigna’s and Kemmerer’s team [which studied the Sequoia system] also reported that they were confident further testing would reveal additional security issues.

Despite the limited time, the teams found ways to breach the physical security of all three systems using only “ordinary objects” (presumably paper clips, coins, pencil erasers, and the like); they found ways to modify or overwrite the basic control software in all three voting machines; and they were able to penetrate the backend tabulator system and manipulate election records.

The source code and documentation studies have not yet been released. To my knowledge, the state has not given a reason for the delay in releasing these reports.

The California Secretary of State reportedly has until Friday to decide whether to allow these systems to be used in the state’s February 2008 primary election.

[UPDATE: A public hearing on the study is being webcast live at 10:00 AM Pacific today.]