Federal Health IT Effort Is Making Progress, Could Benefit from More Transparency

President Obama has indicated that health information technology (HIT) is an important component of his administration’s health care goals. Politicians on both sides of the aisle have lauded the potential for HIT to reduce costs and improve care. In this post, I’ll give some basics about what HIT is, what work is underway, and how the government can get more security experts involved.

We can coarsely break HIT into three technical areas. The first is the transition from paper to electronic records, which involves a surprising number of subtle technical issues, such as interoperability. The second is the development of health information networks, which will allow patient data to be shared among medical facilities and with other appropriate parties. Third, as a recent National Research Council report discusses, digital records can enable research in new areas, such as cognitive support for physicians.

HIT was not created on the 2008 campaign trail. The Department of Veterans Affairs (VA) has done work in this area for decades, including its widely praised VistA system, which provides electronic patient records and more. Notably, VistA source code and documentation can be freely downloaded. Many other large medical centers also already use electronic patient records.

In 2004, then-President Bush pushed for deployment of a Nationwide Health Information Network (NHIN) and universal adoption of electronic patient records by 2014. The NHIN is essentially a nationwide network for sharing relevant patient data (e.g., if you arrive at an emergency room in Oregon, the doctor can obtain needed records from your regular doctor in Kansas). The Department of Health and Human Services (HHS) funded four consortia to develop smaller, localized networks, partially as a learning exercise to prepare for the NHIN. HHS has held a number of forums where members of these consortia, the government, and the public can meet and discuss timely issues.

The agendas for these forums show some positive signs. Sessions cover a number of tricky issues. For example, participants in one session considered the risk that searches for a patient’s records in the NHIN could yield records for patients with similar attributes, posing privacy concerns. Provided that meaningful conversations occurred, HHS appears to be making a concerted effort to ensure that issues are identified and discussed before settling on solutions.
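To make that risk concrete, here is a minimal sketch in Python of attribute-based record matching. The records, names, and threshold are invented for illustration and reflect nothing about the actual NHIN design; the point is only that fuzzy matching can clear its own bar for more than one patient.

    from difflib import SequenceMatcher

    # Toy patient index; a real system would hold millions of records.
    records = [
        {"name": "John A. Smith", "dob": "1961-03-14", "zip": "66044"},
        {"name": "Jon Smith",     "dob": "1961-03-14", "zip": "66049"},
    ]

    def similarity(a, b):
        """Crude string similarity in [0, 1]."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def lookup(name, dob, threshold=0.8):
        """Return every record whose DOB matches and whose name is
        'close enough'. Two distinct patients can both qualify."""
        return [r for r in records
                if r["dob"] == dob and similarity(r["name"], name) >= threshold]

    # A query for one patient also pulls in his near-namesake,
    # disclosing a stranger's records to the requester.
    print(lookup("John Smith", "1961-03-14"))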

Unfortunately, the academic information security community seems divorced from these discussions. Members of that community are likely to analyze the proposed systems eventually, whether before or after they are widely deployed, and the analysis would be far more useful before. In spite of the positive signs mentioned above, past experience shows that even skilled developers can produce insecure systems. Any major flaws uncovered may be embarrassing, but weaknesses found now would be cheaper and easier to fix than ones found in 2014.

A great way to draw constructive scrutiny is to ensure transparency in federally funded HIT work. Limited project details are often available online, but both high- and low-level details can be hard to find. Presumably, members of the NHIN consortia (for example) developed detailed internal documents containing use cases, perceived risks/threats, specifications, and architectural illustrations.

To the extent legally feasible, the government should make documents like these available online. Access to them would make the projects easier to analyze, particularly for those of us less familiar with HIT. In addition, a typical vendor response to reported vulnerabilities is that the attack scenario is unrealistic (this is a standard response of e-voting vendors). Researchers can use these documents to ensure that they consider only realistic attacks.

The federal agenda for HIT is ambitious and will likely prove challenging and expensive. To avoid massive, costly mistakes, the government should seek to get as many eyes as possible on the work that it funds.

Hulu abandons Boxee—now what?

In our last installment, I detailed the trials and tribulations of my attempt to integrate legal, Internet-sourced video into my home theater via a hacked AppleTV, running Boxee, getting its feed from Hulu.

One day later (!), Hulu announced it was all over.

Later this week, Hulu’s content will no longer be available through Boxee. While we never had a formal relationship with Boxee, we are under no illusions about the likely Boxee user response from this move. This has weighed heavily on the Hulu team, and we know it will weigh even more so on Boxee users.

Our content providers requested that we turn off access to our content via the Boxee product, and we are respecting their wishes. While we stubbornly believe in this brave new world of media convergence — bumps and all — we are also steadfast in our belief that the best way to achieve our ambitious, never-ending mission of making media easier for users is to work hand in hand with content owners. Without their content, none of what Hulu does would be possible, including providing you content via Hulu.com and our many distribution partner websites.

(emphasis mine)

On Boxee’s blog, they wrote:

two weeks ago Hulu called and told us their content partners were asking them to remove Hulu from boxee. we tried (many times) to plead the case for keeping Hulu on boxee, but on Friday of this week, in good faith, we will be removing it. you can see their blog post about the issues they are facing.

At least I’m not to blame. Clearly, those who own content are threatened by the ideas we discussed before. Why overpay for cable when you can get the three shows you care about from Hulu for free?

Also interesting to note is the acknowledgment that there was no formal relationship between Hulu and Boxee. That’s the power of open standards. Hulu was publishing bits. Boxee was consuming those bits. The result? An integrated system, good enough to seriously consider dropping your cable TV subscription. Huzzah.
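As a toy illustration of that publish/consume pattern (hypothetical; this is not how Boxee actually interfaced with Hulu), here is how little Python it takes to consume a publicly published RSS feed of video links, using nothing but the standard library:

    import urllib.request
    import xml.etree.ElementTree as ET

    # Hypothetical feed URL; any publicly published RSS feed works the same way.
    FEED_URL = "http://example.com/shows.rss"

    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    # Standard RSS structure: channel -> item -> title/enclosure.
    for item in tree.findall("./channel/item"):
        title = item.findtext("title")
        enclosure = item.find("enclosure")  # points at the actual media file
        if enclosure is not None:
            print(title, enclosure.get("url"))

When the publishing side uses an open format, anyone can build the consuming side; no contract required.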

Notably absent from the announcement: Hulu content is also available on the Xbox 360 and PlayStation 3 via PlayOn, which serves pretty much the same niche as Boxee. Similarly, there’s an XBMC Hulu plugin (recall that Boxee is based on the open-source XBMC project). We don’t know whether Hulu will continue to work on these other platforms. Hulu seems to be taking the approach of asking Boxee nicely to walk away. Will it ask the other projects to pull their Hulu support as well? Will all of those projects actually agree to pull the plug, or will Hulu be forced to go down the failed road of DRM?

It’s safe to predict that it won’t be pretty. My AppleTV can run XBMC just as well as it can run Boxee, which naturally returns us to the question of the obsolescence of cable TV.

There’s a truism that, if your product is going to become obsolete, you should be the one who makes it obsolete. Example: hardwired home telephones are going away. In rich countries, people use their cell phone so much that they eventually notice that they don’t need the landline any more. In poor countries, the cost of running wires is too high, so it’s cheaper to deploy cellular phones. Still, guess who runs the cell phone networks? It’s pretty much the same companies who run the wired phone networks. They make out just fine (except, perhaps, with international calling, where Skype and friends provide great quality for effectively nil cost).

Based on what I’ve observed, it’s safe to predict that cable TV, satellite TV, and maybe even over-the-air TV, are absolutely, inevitably, going to be rendered obsolete by Internet TV. Perhaps they can stave off the inevitable by instituting a la carte pricing plans, so I could get the two cable channels I actually care about and ignore the rest. But if they did that, their whole business model would be smashed to bits.

For my prediction to pan out, we have to ask whether the Internet can handle all that bandwidth. As an existence proof, it’s worth pointing out that I can also get AT&T U-verse for a price competitive with my present Comcast service. AT&T bumps up your DSL to around 30Mb/sec, and you get an HD DVR that sucks your shows down over your DSL line. They’re presumably using some sort of content distribution network to keep their bandwidth load reasonable, and the emphasis is on real-time TV channel watching, which lowers their need to store bits in the CDN fabric. Still, it’s easy to see how U-verse could scale to support video on demand with Hulu’s or Netflix’s full library of titles.
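Some back-of-the-envelope arithmetic shows why 30Mb/sec is enough to be interesting. The stream bitrates below are my rough assumptions, not AT&T’s published numbers:

    # Capacity of an assumed 30Mb/sec line, with assumed stream bitrates.
    line_mbps = 30.0
    hd_stream_mbps = 8.0   # a reasonable MPEG-4 HD stream
    internet_mbps = 4.0    # headroom reserved for ordinary Internet use

    concurrent_hd = int((line_mbps - internet_mbps) // hd_stream_mbps)
    print(f"{concurrent_hd} simultaneous HD streams")  # -> 3

Three HD streams plus ordinary Internet traffic is a plausible household, which is why the approach can work at all.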

U-verse does a good enough job of pretending to be just like cable that it’s completely uninteresting to me. But if their standards were open and free of DRM, then third parties, like TiVo or Boxee, could build compatible boxes and we’d really have something interesting. I’d drop my cable for that.

(One of my colleagues has U-verse, and he complains that, when his kids watch TV, he can feel the Internet running slower. “Hey you kids, can you turn off the TV? I’m trying to do a big download.” It’s the future.)

TiVo, AppleTV, Boxee, and the future of HD television delivery

I don’t watch as much TV as I once did. Yet I’m still paying Comcast every month, as they’re the only provider who will sell me HD service compatible with my TiVo-HD. Sadly, Comcast is far from ideal. I’m regularly frustrated by their inability to debug their signal quality problems. (My ABC-HD and PBS-HD signals are right on the edge in terms of signal quality, so any slight degradation, which happens on an irregular basis, renders those channels unwatchable through MPEG block errors.) Comcast customer service wants me to sit around all day waiting for a tech to come out, even though the problem has nothing whatsoever to do with my house. And when I’ve tried to report the signal strength measurements I’ve taken and how they vary from channel to channel, I’ve found I might as well be speaking to a brick wall.

Yes, I know I could put an old-school antenna on the roof and feed it into my TiVo. That would do pretty well for the local channels, but then why would I be paying Comcast at all? Answer: for the handful of shows that we watch from cable channels. More than one person has asked me why I don’t just download those shows online and cut the cable. You can get Comedy Central programming from their web site. You can get all sorts of things from Hulu.com. All free and legal!

To that end, I’ve hacked my AppleTV with the latest patchstick, a remarkably painless process. My AppleTV now runs Boxee, which is based on the open-source XBMC project. It can play DVD rips from my file server (including DVD menus) and just about anything I download from BitTorrent [see sidebar], and it can get at content from a variety of streaming providers, including Comedy Central and Hulu.com. In theory, that covers enough ground that I could legitimately consider dropping the Comcast subscription altogether.

In practice, the Internet TV experience was a letdown. I’ve got AT&T’s “Elite” DSL package (“up to 6Mb”, which is pretty close to what I see in practice), so I’ve got enough bandwidth for streaming. What I actually get, though, comes nowhere near using that bandwidth. Comedy Central delivers nowhere near 30 frames per second; it’s jumpy and unwatchable. Hulu has moments of greatness (i.e., higher resolution and quality than the non-HD channels that Comcast feeds me, though nowhere near broadcast HD), but Hulu also freezes up, sometimes for seconds at a time. If Boxee implemented TiVo-like Season Passes, it could download my shows in advance and deliver a real winner of an experience. Or TiVo could implement Hulu support; it already does batch downloads of Internet video content, mostly from Amazon, albeit at low SD quality and with unacceptable self-destructing DRM.
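The arithmetic behind that Season Pass suggestion is simple. With assumed bitrates, a show that can’t be streamed smoothly in real time over my DSL line could still be pre-downloaded in not much more than its own running time:

    # Why pre-downloading beats streaming on a thin pipe.
    # All bitrates are assumptions for illustration.
    line_mbps = 6.0        # my "Elite" DSL, as advertised
    hd_stream_mbps = 8.0   # what a true HD stream wants
    show_minutes = 30

    # Real-time streaming: impossible; the pipe is too small.
    print("streamable in real time:", line_mbps >= hd_stream_mbps)  # False

    # Season-Pass-style prefetch: fine, it just arrives a little
    # slower than real time.
    show_megabits = hd_stream_mbps * show_minutes * 60
    download_minutes = show_megabits / line_mbps / 60
    print(f"pre-download time: {download_minutes:.0f} minutes")     # -> 40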

Astute readers will note that I have several other options left to pursue. I could sign up for an unlimited Netflix subscription and have access to their streaming library (either to my TiVo or to my Boxee/AppleTV). I could also “subscribe” to the shows that I care about through Apple’s iTunes Store. (That’s how I’ve been watching Entourage, since I can’t otherwise justify the $20/month that I’d have to pay Comcast for HBO. See also the sidebar.)

Netflix doesn’t have the current TV shows I want, and the iTunes Store is pretty pricey: those Entourage episodes are $2 each for 30 minutes of SD-quality video. iTunes HD content, when available, is pretty much broadcast HD quality. Good stuff. iTunes SD content looks fine on an iPhone but has a variety of problems on a proper HD set, most notably that dark colors are pulled down to 100% black, presumably to improve compression. Very distracting. Regardless, friends of mine with Netflix streaming swear by it, and the iTunes Store clearly provides a good experience, albeit at high prices.

Clearly, Comcast is in deep trouble. Their product is expensive and their customer service is lacking. Similar issues can be expected for the other cable TV vendors, to say nothing of the satellite people. The Internet already has sufficient capacity to deliver the non-broadcast shows that I follow directly to my TV. All the pieces are in place, and they’re starting to work well together. The only missing piece is the business model for the future of online TV delivery. Hulu.com, for example, probably thinks it has to require streaming so it can force you to watch ads; if you could download the video, you could skip the ads, and there goes their revenue.

I figure the one true hope in all of this is the ever-declining cost of serving up content. At some distant point in the future, the cost of delivering tens of megabits per second of video, for several hours every day, to all of the homes that might want it, will be small enough not to matter any more. Once we get there, the people who make shows can sell them directly to consumers, insert occasional and targeted ads, and still come out ahead. It could be a long wait.
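For a sense of the scale involved, here is the data volume that “tens of megabits per second of video, for several hours every day” works out to, under assumed viewing habits:

    # Monthly video data volume per household; all inputs are assumptions.
    video_mbps = 10.0      # take the low end of "tens of megabits"
    hours_per_day = 3.0
    days_per_month = 30

    gigabytes = video_mbps * hours_per_day * 3600 * days_per_month / 8 / 1000
    print(f"{gigabytes:.0f} GB per household per month")  # -> about 405

That’s hundreds of gigabytes per household per month, which is why delivery cost, not technology, is the bottleneck.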

[Sidebar: BitTorrent is a brilliant system from a technical perspective, but it was never designed to provide any anonymity to its users. If you join the torrent for, say, an HBO show, HBO can trivially observe that you (or, at least, your IP address) are there, giving them grounds to go after you in one form or another. From that perspective, you’d have to be insane to download a mainstream movie or TV show from BitTorrent, or you’d have to do something terribly anti-social, like tunneling your entire BitTorrent session through Tor, which Tor was never designed to handle (although there are several designs for improving Tor or for anonymizing BitTorrent). So then, which shows do I feel safe downloading via BitTorrent? So far, only the latest episodes of the BBC’s Top Gear. They air in the U.K. six months to a year ahead of their appearance on BBC America and their availability on the U.S. iTunes Store. If there were a way to get these shows in the U.S. simultaneously with their British release, I’d happily pay for the privilege, even at the iTunes Store’s $2 rate, but I’m not given that option at any price.]
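To see how trivial that observation is: a BitTorrent tracker will tell any client that asks which peers are in a swarm, and in the common “compact” format each peer is just six bytes, four for the IPv4 address and two for the port. A short Python sketch of decoding such a list (the response bytes here are made up):

    import struct

    def decode_compact_peers(blob):
        """Decode a BitTorrent 'compact' peer list: 6 bytes per peer,
        4 for the IPv4 address and 2 for the port, both big-endian."""
        peers = []
        for i in range(0, len(blob), 6):
            ip = ".".join(str(b) for b in blob[i:i + 4])
            (port,) = struct.unpack(">H", blob[i + 4:i + 6])
            peers.append((ip, port))
        return peers

    # A made-up two-peer response; a real tracker returns this blob in
    # the 'peers' field of its bencoded announce response.
    blob = bytes([192, 0, 2, 7, 0x1A, 0xE1,      # 192.0.2.7:6881
                  198, 51, 100, 9, 0x1A, 0xE9])  # 198.51.100.9:6889
    print(decode_compact_peers(blob))

Anyone who announces to the tracker gets this list; that is all the “observation” a content owner needs.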

New Internet? No Thanks.

Yesterday’s New York Times ran a piece, “Do We Need a New Internet?” suggesting that the Internet has too many security problems and should therefore be rebuilt.

The piece has been widely criticized in the technical blogosphere, so there’s no need for me to pile on. Anyway, I have already written about the redesign-the-Net meme. (See Internet So Crowded, Nobody Goes There Anymore.)

But I do want to discuss two widespread misconceptions that found their way into the Times piece.

First is the notion that today’s security problems are caused by weaknesses in the network itself. In fact, the vast majority of our problems occur on, and are caused by weaknesses in, the endpoint devices: computers, mobile phones, and other widgets that connect to the Net. The problem is not that the Net is broken or malfunctioning, it’s that the endpoint devices are misbehaving — so the best solution is to secure the endpoint devices. To borrow an analogy from Gene Spafford, if people are getting mugged at bus stops, the solution is not to buy armored buses.

(Of course, there are some security issues with the network itself, such as vulnerability of routing protocols and DNS. We should work on fixing those. But they aren’t the problems people normally complain about — and they aren’t the ones mentioned in the Times piece.)

The second misconception is that the founders of the Internet had no plan for protecting against the security attacks we see today. Actually, they did have a plan, one that was simple and, if executed flawlessly, would have been effective: endpoint devices would not have remotely exploitable bugs.

This plan was plausible, but it turned out to be much harder to execute than the founders could have foreseen. It has become increasingly clear over time that developing complex Net-enabled software without exploitable bugs is well beyond the state of the art. The founders’ plan is not working perfectly. Maybe we need a new plan, or maybe we need to execute the original plan better, or maybe we should just muddle through. But let’s not forget that there was a plan, and it was reasonable in light of what was known at the time.

As I have said before, the Internet is important enough that it’s worthwhile having people think about how it might be redesigned, or how it might have been designed differently in the first place. The Net, like any large human-built institution, is far from perfect — but that doesn’t mean that we would be better off tearing it down and starting over.

Final version of Government Data and the Invisible Hand

Thanks to the hard work of our patient editors at the Yale Journal of Law and Technology, my coauthors and I can now share the final version of our paper about online transparency, Government Data and the Invisible Hand.

If you have read the first version, you know that our paper is informed by a deep disappointment with the current state of the federal government’s Internet presence. A naive viewer, like we once were, might look at the chaos of clunky sites in .gov and entertain doubts about the webmasters who run those sites. But that would be—was, on our part—a mistake. We’re happy to set the record straight today.

Barack Obama’s web team is certainly one of the best that has ever been assembled. His staff did a fantastic job on the campaign site and produced a transition site at Change.gov that was also excellent, if slightly less dynamic. On its way to the White House, however, a team composed of many of the same people seemed to lose its mojo. The complaints about the new Whitehouse.gov site—slow to be updated, lacking in interactivity—are familiar to observers of other .gov sites throughout the government.

What happened? It’s not plausible to suppose that Obama’s staffers have somehow gotten worse as they have moved from campaign to transition to governance. Instead, they have faced an increasingly stringent and burdensome array of regulations as they have become progressively more official. The transition was a sort of intermediate phase in this respect, and the new team now faces the Presidential Records Act, the Paperwork Reduction Act, and a number of other pre-Internet statutory obligations. This experience teaches that the limitations of the federal web reflect the thicket of rules to which such sites are subject—not the hardworking people who labor under those rules.

One of the most exciting things about the new administration’s approach to online media is the way it seeks to enable federal webmasters to move beyond some of the limitations of dated policies, using their expertise to leverage government data online.

My coauthors and I look forward to continuing to work on these issues. We are humbled to recognize the remarkable reservoir of talent and energy that is being brought to bear on the problem, from both within and beyond government.