Archives for 2009

RIP Rocky Mountain News

The Rocky Mountain News, Colorado’s oldest newspaper, closed its doors Friday. On their front page they have this incredibly touching video:

Final Edition from Matthew Roberts on Vimeo.

The closing of a large institution like a daily newspaper is an incredibly sad event, and my heart goes out to all the people who find their lives suddenly upended by unemployment. Many talented and dedicated employees lost their jobs today, and some of them will have to scramble to salvage their careers and support their families. The video does a great job of capturing the shock and sadness that the employees of the paper feel—not just because they lost their jobs, but also because in some sense they’re losing their life’s work.

With that said, I do think it’s unfortunate that part of the video was spent badmouthing people, like me, who don’t subscribe to newspapers. One gets the impression that newspapers are failing because kids these days are so obsessed with swapping gossip on MySpace that they’ve stopped reading “real” news. No doubt, some people fit that description, but I think the more common case is something like the opposite: those of us with the most voracious appetite for news have learned that newsprint simply can’t compete with the web for breadth, depth, or timeliness. When I pick up a newspaper, I’m struck by how limited it is: the stories are 12 to 36 hours old, the range of topics covered is fairly narrow, and there’s no way to dig deeper on the stories that interest me most. That’s not the fault of the newspaper’s editors and reporters; newsprint is just an inherently limited medium.

As more newspapers go out of business in the coming years, I think it’s important that our sympathy for individual employees not translate into the fetishization of newsprint as a medium. And it’s especially important that we not confuse newsprint as a medium with journalism as a profession. Newsprint and journalism have been strongly associated in the past, but this is an accident of technology, not something inherent to journalism. Journalism—the process of gathering, summarizing, and disseminating information about current events—has been greatly enriched by the Internet. Journalists have vastly more tools available for gathering the news, and much more flexible tools for disseminating it. The replacement of static newspapers with dynamic web pages is progress.

But that doesn’t mean it’s not a painful process. The web’s advantages are no consolation for Rocky employees who have spent their careers building skills connected to a declining technology. And the technical superiority of the web will be of little consolation to Denver-area readers who will, in the short run, have less news and information available about their local communities. So my thoughts and sympathy today are with the employees of the Rocky Mountain News.

NJ Voting-machine trial update

Earlier this month I testified in Gusciora v. Corzine, the trial in which the plaintiffs argue that New Jersey’s voting machines (Sequoia AVC Advantage) can’t be trusted to count the votes, because they’re so easily hacked to make them cheat.

I’ve previously written about the conclusions of my expert report: in 7 minutes you can replace the ROM and make the machine cheat in every future election, and there’s no practical way for the State to detect cheating machines (in part because there’s no voter-verified paper ballot).

The trial started on January 27, 2009 and I testified for four and a half days. I testified that the AVC Advantage can be hacked by replacing its ROM, or by replacing its Z80 processor chip, so that it steals votes undetectably. I testified that fraudulent firmware can also be installed into the audio-voting daughterboard by a virus carried through audio-ballot cartridges. I testified about many other things as well.

Finally, I testified about the accuracy of the Sequoia AVC Advantage. I believe that the most significant source of inaccuracy is its vulnerability to hacking. There’s no practical means of testing whether the machine has been hacked, and certainly the State of New Jersey does not even attempt to test. If we could somehow know that the machine has not been hacked, then (as I testified) I believe the most significant _other_ inaccuracy of the AVC Advantage is that it does not give adequate feedback to voters and pollworkers about whether a vote has been recorded. This can lead to a voter’s ballot not being counted at all, or to a ballot being counted two or three times (without fraudulent intent). I believe that this error may be on the order of 1% or more, but I was not able to measure it in my study because it involves user-interface interaction with real people.

In the hypothetical case that the AVC Advantage has not been hacked, I believe this user-interface source of perhaps 1% inaccuracy would be very troubling, but (in my opinion) is not the main reason to disqualify it from use in elections. The AVC Advantage should be disqualified for the simple reason that it can be easily hacked to cheat, and there’s no practical method that will be sure of catching this hack.

Security seals. When I examined the State’s Sequoia AVC Advantage voting machines in July 2008, they had no security seals preventing ROM replacement. I demonstrated on video (which we played in Court in Jan/Feb 2009) that in 7 minutes I could pick the lock, unscrew some screws, replace the ROM with one that cheats, replace the screws, and lock the door.

In September 2008, after the State read my expert report, they installed four kinds of physical security seals on the AVC Advantage. These seals were present during the November 2008 election. On December 1, I sent to the Court (and to the State) a supplemental expert report (with video) showing how I could defeat all of these seals.

In November/December the State informed the Court that they were changing to four new seals. On December 30, 2008 the State Director of Elections, Mr. Robert Giles, demonstrated to me the installation of these seals onto the AVC Advantage voting machine and gave me samples. He installed quite a few seals on the machine (of those four kinds, with some kinds applied in multiple places).

On January 27, 2009 I sent to the Court (and to the State) a supplemental expert report showing how I could defeat all those new seals. On February 5th, as part of my trial testimony I demonstrated for the Court the principles and methods by which each of those seals could be defeated.

On cross-examination, the State defendants invited me to demonstrate, on an actual Sequoia AVC Advantage voting machine in the courtroom, the removal of all the seals, replacement of the ROM, and replacement of all the seals leaving no evidence of tampering. I then did so, carefully and slowly; it took 47 minutes. As I testified, someone with more practice (and without a judge and 7 lawyers watching) would do it much faster.

The Future of Smartphone Platforms

In 1985, I got my very first home computer: a Commodore Amiga 1000. At the time, it was awesome: great graphics, great sound, “real” multitasking, and so forth. Never mind that I spent half my life shuffling floppy disks around. Never mind that I kept my head full of Epson escape codes so my word processor could get what I wanted out of my printer. No, no, the Amiga was wonderful stuff.

Let’s look at the Amiga’s generation. Starting with the IBM PC in 1981, the industry was in the midst of the transition from 8-bit micros (Commodore 64, Apple II, Atari 800, BBC Micro, TI-99/4A, etc.) to 16/32-bit micros (IBM PC, Apple Macintosh, Commodore Amiga, Atari ST, Acorn Archimedes, etc.). These new machines each ran completely unrelated operating systems, and there was no consensus as to which would be the ultimate winner. In 1985, nobody would have declared the PC’s victory to have been inevitable. Regardless, we all know how it worked out: Apple developed a small but steady market share, PCs took over the world (sans IBM), and the other computers faded away. Why?

The standard argument is “network effects.” PCs (and to a lesser extent Macs) developed sufficient followings to make them attractive platforms for developers, which in turn made them attractive to new users, which created market share, which created resources for future hardware developments, and on it went. The Amiga, on the other hand, became popular only in specific market niches, such as video processing and editing. Another benefit on the PC side was that Microsoft enabled clone shops, from Compaq to Dell and onward, to battle each other with low prices on commodity hardware. Despite the superior usability of a Mac or the superior graphics and sound of an Amiga, the PC came away the winner.

What about cellular smartphones then? I’ve got an iPhone. I have friends with Windows Mobile, Android, and Blackberry devices. When the Palm Pre comes out, it should gain significant market share as well. I’m sure there are people out there who love their Symbian or OpenMoko phones. The level of competition, today, in the smartphone world bears more than a passing resemblance to the competition in the mid-’80s PC market. So who’s going to win?

If you believe that the PC’s early lead and widespread adoption by business was essential to its rise, then you could expect the Blackberry to win out. If you believe that having software and hardware come from separate vendors was essential, then you’d favor Windows Mobile or Android. If you’re looking for network effects, look no farther than the iPhone. If you’re looking for the latest, coolest thing, then the Palm Pre sure does look attractive.

I’ll argue that this time will be different, and it’s the cloud that’s going to win. Right now, what matters to me, with my iPhone, is that I can get my email anywhere, I can make phone calls, and I can do basic web surfing. I occasionally use the GPS maps, or even watch a show purchased from the iTunes Store, but if you took those away, it wouldn’t change my life much. I’ve got pages of obscure apps, but none of them really lock me into the platform. (Example: Shazam is remarkably good at recognizing songs that it hears, but the client side of it is a very simple app that they could trivially port to any other smartphone.) On the flip side, I’m an avid consumer of Google’s resources (Gmail, Reader, Calendar, etc.). I would never buy a phone that I couldn’t connect to Google. Others will insist on being able to connect to their Exchange Server.

At the end of the day, the question isn’t whether a given smartphone interoperates with your friends’ phones, but whether it interoperates with your cloud services. You don’t need an Android to get a good mobile experience with Google, and you don’t need a Windows Mobile phone to get a good mobile experience with Exchange. Leaving one smartphone and adopting another is, if anything, easier than switching between traditional non-smartphones, since you don’t have to monkey as much with moving your address book around. As such, I think it’s reasonable to predict that, in ten years, we’ll still have at least one smartphone vendor per major cellular carrier, and perhaps more.

If we have further consolidation in the carrier market, that would put pressure on the smartphone vendors to cut costs, which could well lead to consolidation among the smartphone vendors. We could certainly also imagine carriers pushing the smartphone vendors to include or omit particular features. We see plenty of that already. (Example: can you tether your laptop to a Palm Pre via Bluetooth? The answer seems to be a moving target.) Historically, the U.S. carriers have been somewhat infamous for going out of their way to restrict what phones can do. Now, that seems to be mostly fixed, and for that, at least, we can thank Apple.

Let a thousand smartphones bloom? I sure hope so.

Federal Health IT Effort Is Making Progress, Could Benefit from More Transparency

President Obama has indicated that health information technology (HIT) is an important component of his administration’s health care goals. Politicians on both sides of the aisle have lauded the potential for HIT to reduce costs and improve care. In this post, I’ll give some basics about what HIT is, what work is underway, and how the government can get more security experts involved.

We can coarsely break HIT into three technical areas. The first is the transition from paper to electronic records, which involves a surprising number of subtle technical issues, such as interoperability. Second, development of health information networks will allow sharing of patient data between medical facilities and with other appropriate parties. Third, as a recent National Research Council report discusses, digital records can enable research in new areas, such as cognitive support for physicians.

HIT was not created on the 2008 campaign trail. The Department of Veterans Affairs (VA) has done work in this area for decades, including its widely praised VistA system, which provides electronic patient records and more. Notably, VistA source code and documentation can be freely downloaded. Many other large medical centers also already use electronic patient records.

In 2004, then-President Bush pushed for deployment of a Nationwide Health Information Network (NHIN) and universal adoption of electronic patient records by 2014. The NHIN is essentially a nationwide network for sharing relevant patient data (e.g., if you arrive at an emergency room in Oregon, the doctor can obtain needed records from your regular doctor in Kansas). The Department of Health and Human Services (HHS) funded four consortia to develop smaller, localized networks, partially as a learning exercise to prepare for the NHIN. HHS has held a number of forums where members of these consortia, the government, and the public can meet and discuss timely issues.

The agendas for these forums show some positive signs. Sessions cover a number of tricky issues. For example, participants in one session considered the risk that searches for a patient’s records in the NHIN could yield records for patients with similar attributes, posing privacy concerns. Provided that meaningful conversations occurred, HHS appears to be making a concerted effort to ensure that issues are identified and discussed before settling on solutions.
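
To see why a records search can sweep in the wrong patients, here is a toy sketch in Python. The records and the matching rule are entirely hypothetical; real master-patient-index systems use far more sophisticated probabilistic matching, but the failure mode is the same: loose criteria return other people’s records, and with them their private data.

```python
# Toy illustration of the record-matching privacy risk described above.
# The records and the matching rule are hypothetical, not from any real system.
records = [
    {"name": "Maria Garcia", "dob": "1970-03-12", "city": "Topeka"},
    {"name": "Maria Garcia", "dob": "1970-03-21", "city": "Wichita"},
    {"name": "M. Garcia",    "dob": "1970-03-12", "city": "Topeka"},
]

def loose_match(query, record):
    """Match on last name and birth year only -- deliberately too loose."""
    return (query["name"].split()[-1] == record["name"].split()[-1]
            and query["dob"][:4] == record["dob"][:4])

query = {"name": "Maria Garcia", "dob": "1970-03-12"}
hits = [r for r in records if loose_match(query, r)]
print(f"{len(hits)} records returned for one patient query")
# All three records come back, including at least one belonging to a
# different person -- exactly the privacy concern raised in the session.
```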

Unfortunately, the academic information security community seems divorced from these discussions. Members of the community will eventually analyze these systems, whether before or after they are widely deployed, and that analysis would be far more useful earlier. In spite of the positive signs mentioned, past experience shows that even skilled developers can produce insecure systems. Any major flaws uncovered may be embarrassing, but weaknesses found now would be cheaper and easier to fix than ones found in 2014.

A great way to draw constructive scrutiny is to ensure transparency in federally funded HIT work. Limited project details are often available online, but both high- and low-level details can be hard to find. Presumably, members of the NHIN consortia (for example) developed detailed internal documents containing use cases, perceived risks/threats, specifications, and architectural illustrations.

To the extent legally feasible, the government should make documents like these available online. Access to them would make the projects easier to analyze, particularly for those of us less familiar with HIT. In addition, a typical vendor response to reported vulnerabilities is that the attack scenario is unrealistic (this is a standard response of e-voting vendors). Researchers can use these documents to ensure that they consider only realistic attacks.

The federal agenda for HIT is ambitious and will likely prove challenging and expensive. To avoid massive, costly mistakes, the government should seek to get as many eyes as possible on the work that it funds.

Hulu abandons Boxee—now what?

In our last installment, I detailed the trials and tribulations of my attempt to integrate legal, Internet-sourced video into my home theater via a hacked AppleTV, running Boxee, getting its feed from Hulu.

One day later (!), Hulu announced it was all over.

Later this week, Hulu’s content will no longer be available through Boxee. While we never had a formal relationship with Boxee, we are under no illusions about the likely Boxee user response from this move. This has weighed heavily on the Hulu team, and we know it will weigh even more so on Boxee users.

Our content providers requested that we turn off access to our content via the Boxee product, and we are respecting their wishes. While we stubbornly believe in this brave new world of media convergence — bumps and all — we are also steadfast in our belief that the best way to achieve our ambitious, never-ending mission of making media easier for users is to work hand in hand with content owners. Without their content, none of what Hulu does would be possible, including providing you content via Hulu.com and our many distribution partner websites.

(emphasis mine)

On Boxee’s blog, they wrote:

two weeks ago Hulu called and told us their content partners were asking them to remove Hulu from boxee. we tried (many times) to plead the case for keeping Hulu on boxee, but on Friday of this week, in good faith, we will be removing it. you can see their blog post about the issues they are facing.

At least I’m not to blame. Clearly, those who own content are threatened by the ideas we discussed before. Why overpay for cable when you can get the three shows you care about from Hulu for free?

Also interesting to note is the acknowledgment that there was no formal relationship between Hulu and Boxee. That’s the power of open standards. Hulu was publishing bits. Boxee was consuming those bits. The result? An integrated system, good enough to seriously consider dropping your cable TV subscription. Huzzah.
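
To make “publishing bits” concrete, here’s a minimal sketch (Python, standard library only) of the kind of integration an open feed makes possible: fetch an RSS feed and list what’s in it. The feed URL is hypothetical, and Hulu’s actual feeds and Boxee’s actual client were more elaborate than this, but the point stands: nothing here requires a formal business relationship.

```python
# Minimal "consume the published bits" sketch: download an RSS 2.0 feed
# and print each item's title and link. The URL below is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/feeds/recently-added.rss"  # hypothetical

def list_feed_items(url):
    """Fetch an RSS 2.0 feed and print each item's title and link."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    for item in tree.iterfind("./channel/item"):
        title = item.findtext("title", default="(no title)")
        link = item.findtext("link", default="")
        print(f"{title}\n  {link}")

if __name__ == "__main__":
    list_feed_items(FEED_URL)
```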

Notably absent from either announcement: Hulu content is also available on the Xbox 360 and PlayStation 3 via PlayOn, which serves pretty much the same niche as Boxee. Similarly, there’s an XBMC Hulu plugin (recall that Boxee is based on the open-source XBMC project). We don’t know whether Hulu will continue to work with these other platforms or not. Hulu seems to be taking the approach of asking Boxee nicely to walk away. Will they ask the other projects to pull their Hulu support as well? Will all of those projects actually agree to pull the plug, or will Hulu be forced to go down the failed DRM road?

It’s safe to predict that it won’t be pretty. My AppleTV can run XBMC just as well as it can run Boxee, which naturally returns us to the question of the obsolescence of cable TV.

There’s a truism that, if your product is going to become obsolete, you should be the one who makes it obsolete. Example: hardwired home telephones are going away. In rich countries, people use their cell phone so much that they eventually notice that they don’t need the landline any more. In poor countries, the cost of running wires is too high, so it’s cheaper to deploy cellular phones. Still, guess who runs the cell phone networks? It’s pretty much the same companies who run the wired phone networks. They make out just fine (except, perhaps, with international calling, where Skype and friends provide great quality for effectively nil cost).

Based on what I’ve observed, it’s safe to predict that cable TV, satellite TV, and maybe even over-the-air TV, are absolutely, inevitably, going to be rendered obsolete by Internet TV. Perhaps they can stave off the inevitable by instituting a la carte pricing plans, so I could get the two cable channels I actually care about and ignore the rest. But if they did that, their whole business model would be smashed to bits.

For my prediction to pan out, we have to ask whether the Internet can handle all that bandwidth. As an existence proof, it’s worth pointing out that I can also get AT&T U-verse for a price competitive with my present Comcast service. AT&T bumps up your DSL to around 30 Mb/s, and you get an HD DVR that sucks your shows down over your DSL line. They’re presumably using some sort of content distribution network to keep their bandwidth load reasonable, and the emphasis is on real-time TV channel watching, which lowers their need to store bits in the CDN fabric. Still, it’s not hard to see how U-verse could scale to support video on demand with Hulu or Netflix’s full library of titles.
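
For a rough sense of why a single DSL line is enough, here’s a back-of-envelope sketch. The bitrates are my assumptions (roughly typical of compressed HD and SD streams), not AT&T’s actual numbers.

```python
# Back-of-envelope check (all bitrates are assumptions, not measurements):
# how many simultaneous TV streams fit on a ~30 Mb/s U-verse-class line?
LINE_CAPACITY_MBPS = 30.0   # assumed downstream capacity
HD_STREAM_MBPS = 6.0        # assumed bitrate for a compressed HD channel
SD_STREAM_MBPS = 2.0        # assumed bitrate for an SD / web-quality stream
HEADROOM_MBPS = 5.0         # leave room for ordinary Internet use

available = LINE_CAPACITY_MBPS - HEADROOM_MBPS
print(f"HD streams that fit: {int(available // HD_STREAM_MBPS)}")
print(f"SD streams that fit: {int(available // SD_STREAM_MBPS)}")
# -> roughly 4 HD or 12 SD streams, which is why one DSL line can
#    plausibly carry a household's TV watching plus its web traffic.
```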

U-verse does a good enough job of pretending to be just like cable that it’s completely uninteresting to me. But if their standards were open and free of DRM, then third parties, like TiVo or Boxee, could build compatible boxes and we’d really have something interesting. I’d drop my cable for that.

(One of my colleagues has U-verse, and he complains that, when his kids watch TV, he can feel the Internet running slower. “Hey you kids, can you turn off the TV? I’m trying to do a big download.” It’s the future.)