November 29, 2024

Burn Notice, season 4, and the abuse of the MacGuffin

One of my favorite TV shows is Burn Notice. It’s something of a spy show, with a fair number of gadgets but generally no James Bond-esque Q supplying equipment beyond the reach of real-world spycraft. Burn Notice instead focuses on the value of teamwork, advance planning, and clever subterfuge to pull off its various operations, leavened with enough humor and romance to keep the story compelling and engaging. You can generally watch along and agree with the feasibility of what they’re doing. Still, when they get close to technology I actually know something about, I start to wonder.

One thing they recently got right, at least in some broad sense, was the ability to set up a femtocell (a small cell phone base station) as a way of mounting a man-in-the-middle attack against a target’s cell phone. A friend of mine has one of these things, and he was able to set it up to serve my old iPhone with nothing more than my phone number. Of course, it changed the service name (from “AT&T” to “AT&T Microcell” or something along those lines), but it’s easy to imagine that, in a spy-vs-spy scenario, that would be straightforward to fix. Burn Notice didn’t show the longer-range antenna or amplifier they would have needed to reach their target, who was inside a building while our wiretapping heroes were out on the street, but I’m almost willing to let them get away with that, never mind having to worry about GSM versus CDMA. Too much detail would detract from the story.

(Real-world analogy: Rop Gonggrijp, a Dutch computer scientist who had some tangential involvement with WikiLeaks, recently tweeted: “Foreign intel attention is nice: I finally have decent T-Mobile coverage in my office in the basement. Thanks guys…”)

What’s really bothered me about this season’s Burn Notice, though, was the central plot MacGuffin. Quoting Wikipedia: “the defining aspect of a MacGuffin is that the major players in the story are (at least initially) willing to do and sacrifice almost anything to obtain it, regardless of what the MacGuffin actually is.” MacGuffins are essential to many great works of drama, but it seems that Hollywood writers haven’t yet adapted the idea of the MacGuffin to stories about data, and it really bugs me.

Without spoiling too much, Burn Notice’s MacGuffin for the second half of season 4 was a USB memory stick which happened to have some particularly salacious information on it (a list of employee ID numbers corresponding to members of a government conspiracy), and which lots of people would (and did) kill to get their hands on. Initially we had the MacGuffin riding around on the back of a motorcycle courier; our heroes had to locate and intercept it. Our heroes then had to decide whether to use the information themselves or pass it on to a trusted insider in the government. Later, after various hijinks wherein our heroes lost the MacGuffin, the bad guy locked it in a fancy safe, which our heroes had to physically find, remove from a cinderblock wall, and later open with an industrial drill press.

When the MacGuffin was connected to a computer, our heroes could read it, but due to some sort of unspecified “cryptography” they were unable to make copies. Never mind that no such “encryption” technology exists; had that essential element been more realistic, the entire story would have changed. For a show whose heroes regularly use pocket digital cameras to photograph computer screens and other sensitive documents, you’d think they would do something similar here. Nope. The problem is that any realistic attempt to model how easy it is to copy data like this would have blown apart the MacGuffin-centric nature of the plot. Our protagonists could have copied the data early on and handed the memory stick over. They could instead have handed over bogus data written to the same stick. They could have created thousands of webmail accounts, each holding copies of the data. They could have anonymously sent the incriminating data to any of a variety of third parties, perhaps borrowing some plot elements from the whole WikiLeaks fiasco. In short, there could still have been a compelling story, but it wouldn’t have followed the standard MacGuffin structure, and it would almost certainly have reached a very different conclusion.
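To make the point concrete: if a computer can read a storage device at all, it can duplicate the contents exactly, and a hash proves the copy is perfect. Here is a minimal Python sketch of what our protagonists could have done in a few seconds (the file paths are, of course, hypothetical):

```python
import hashlib
import shutil

# Hypothetical paths standing in for the MacGuffin and a backup location.
SRC = "/media/usb-stick/conspiracy_list.db"
DST = "/home/agent/backup/conspiracy_list.db"

# Any file the operating system can read can be copied verbatim.
shutil.copyfile(SRC, DST)

def sha256(path: str) -> str:
    """Hash a file in chunks so even large files are cheap to verify."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Identical hashes: the "original" stick is no longer special.
assert sha256(SRC) == sha256(DST)
```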

All in all, it’s probably a good thing I don’t know too much about combat tactics, explosives, or actual spycraft, or I’d be completely unable to enjoy a show like this. I expect James Bond to do impossible things, but I appreciate Burn Notice for its plausibility. I can almost imagine it actually happening.

The Flawed Legal Architecture of the Certificate Authority Trust Model

Researchers have recently criticized the Certificate Authority Trust Model — which involves the issuance and use of digital certificates to authenticate the identity of websites to end-users — because of an array of technical and institutional problems. The criticism is significant not only because of the systemic nature of the noted problems, but also because the Model is universally relied upon by websites offering secure connections (SSL and TLS) to end-users. The Model comes into play in virtually every commercial and business transaction occurring over the Internet, as well as in a wide variety of other confidential and private on-line communications. What has not been addressed to date, however, is the nature of the legal relationships between the parties involved with, or impacted by, the Model.
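To see how deeply the Model is baked into everyday browsing, consider this minimal Python sketch of what every TLS client does implicitly (the hostname is just a placeholder): it accepts any certificate that chains to a CA in the operating system’s root store, and the end-user never sees, much less assents to, any CA’s legal terms along the way.

```python
import socket
import ssl

hostname = "www.example.com"  # placeholder; any HTTPS site works

# create_default_context() loads the platform's trusted CA roots:
# the entire Certificate Authority Trust Model in one line.
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        # The trust decision has already been made, silently, on the
        # end-user's behalf; no Relying Party Agreement was ever shown.
        print("Issued to:", dict(pair[0] for pair in cert["subject"]))
        print("Issued by:", dict(pair[0] for pair in cert["issuer"]))
```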

Steve Schultze and I tackle this topic in our recent article “The Certificate Authority Trust Model for SSL: A Defective Foundation for Encrypted Web Traffic and a Legal Quagmire.” We looked at the standard legal documents issued by the certificate authorities or “CAs,” including exemplar Subscriber Agreements (agreements between CAs and website operators); “Certification Practice Statements” (statements by CAs outlining their business practices); and Relying Party Agreements (purported agreements between CAs and “relying parties,” such as end-users). What we found was surprising:

  • “Relying Party Agreements” that purport to bind end-users to their terms despite the apparent absence of any mechanism either to affirmatively alert the end-user to the existence of the supposed Agreements or to afford the end-user an opportunity to register his or her acceptance or rejection of the Agreements’ terms
  • Certification Practice Statements that suffer from the same problem (i.e., no affirmative notice to the end-user and no meaningful opportunity to accept or reject their terms)

There were other issues as well. For example, the Relying Party Agreements and Certification Practice Statements set forth various obligations for end-users (i.e., “relying parties”), such as: a requirement that end-users make an independent determination of whether it is reasonable to trust a website offering a secure connection (isn’t that the whole point of having a CA, so that the end-user doesn’t have to do that?); a requirement that the end-user be familiar with the cryptographic software and processes used to carry out the authentication process; and a duty for the end-user to indemnify and hold the CA harmless in the event of legal claims by third parties.

Given the absence of notice to the end-user and assent by the end-user, it would appear that many CAs would have a difficult time holding an end-user to the terms of the relying party agreements or certification practice statements. To date, the CA Trust Model’s legal architecture has apparently not been the subject of any published court decision and remains untested.

The bottom line is that the CA Trust Model’s legal architecture inures to the benefit of no one. Neither website operators, certificate authorities, nor end-users can be sure of their rights or exposure. The Model’s legal structure may therefore be just as troubling as its security vulnerabilities.

You can read the full article in PDF form.

[Editor: Steve Roosa gave a followup luncheon talk at CITP entitled The Devil is in the Indemnity Agreements: A Critique of the Certificate Authority Trust Model’s Putative Legal Foundation. Slides and audio are now posted.]

Ninth Circuit Ruling in MDY v. Blizzard

The Ninth Circuit has ruled on the MDY v. Blizzard case, which involves contract, copyright, and DMCA claims. As with the district court ruling, I’ll withhold comment due to my involvement as an expert in the case, but the decision may be of interest to FTT readers.

[Editor: The EFF has initial reactions here. Techdirt also has an overview.]

Court Rules Email Protected by Fourth Amendment

Today, the United States Court of Appeals for the Sixth Circuit ruled that the contents of the messages in an email inbox hosted on a provider’s servers are protected by the Fourth Amendment, even though the messages are accessible to the email provider. As the court puts it, “[t]he government may not compel a commercial ISP to turn over the contents of a subscriber’s emails without first obtaining a warrant based on probable cause.”

This is a very big deal; it marks the first time a federal court of appeals has extended the Fourth Amendment to email with such care and detail. Orin Kerr calls the opinion, at least on his initial read, “quite persuasive” and “likely . . . influential,” and I agree, but I’d go further: this is the opinion privacy activists and many legal scholars, myself included, have been waiting and calling for, for more than a decade. It may someday be seen as a watershed moment in the extension of our Constitutional rights to the Internet.

And it may have a more immediate impact on Capitol Hill, because in its ruling the Sixth Circuit also declares part of the Stored Communications Act (SCA), a component of the Electronic Communications Privacy Act, unconstitutional. 18 U.S.C. 2703(b) allows the government to obtain email messages with less than a search warrant. This section has been targeted for amendment by the Digital Due Process coalition of companies, privacy groups, and academics (I have signed on) for precisely the reason this opinion attacks: it allows warrantless government access to communications stored online. I am sure some congressional staffers are paying close attention to this opinion, and I hope it helps clear the way for an amendment to the SCA that fixes the now-unconstitutional provision, if not during the lame duck session, then early in the next Congressional term.

Update: Other reactions from Dissent and the EFF.

Two Stories about the Comcast/Level 3 Dispute (Part 2)

In my last post I told a story about the Level 3/Comcast dispute that portrays Comcast in a favorable light. Now here’s another story that casts Comcast as the villain.

Story 2: Comcast Abuses Its Market Power

As Steve explained, Level 3 is an “Internet Backbone Provider.” Level 3 has traditionally been considered a tier 1 provider, which means that it exchanges traffic with other tier 1 providers without money changing hands, and bills everyone else for connectivity. Comcast, as a non-tier 1 provider, has traditionally paid Level 3 to carry its traffic to places Comcast’s own network doesn’t reach directly.

Steve is right that the backbone market is highly competitive. I think it’s worth unpacking why in a bit more detail. Let’s suppose that a Comcast user wants to download a webpage from Yahoo!, and that both companies are customers of Level 3. So Yahoo! sends its bits to Level 3, which passes them along to Comcast. And traditionally, Level 3 would bill both Yahoo! and Comcast for the service of moving data between them.

It might seem like Level 3 has a lot of leverage in a situation like this, so it’s worth considering what would happen if Level 3 tried to jack up its prices. There are reportedly around a dozen other tier 1 providers that exchange traffic with Level 3 on a settlement-free basis. This means that if Level 3 overcharges Comcast for transit, Comcast can go to one of Level 3’s competitors, such as Global Crossing, and pay that provider to carry Comcast’s traffic to Level 3’s network. And since Global Crossing and Level 3 are peers, Level 3 gets nothing for the traffic it delivers to Global Crossing that’s ultimately bound for Comcast’s network.
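To make the dynamic concrete, here is a toy model in Python with invented prices; nothing below reflects real transit rates, only the structure of the choice Comcast faces.

```python
# Hypothetical transit prices in dollars per Mbps per month.
TRANSIT_PRICES = {
    "Level 3": 5.00,
    "Global Crossing": 3.00,
}

def cheapest_transit(prices: dict[str, float]) -> tuple[str, float]:
    """Comcast simply buys from whichever tier 1 provider is cheapest."""
    provider = min(prices, key=prices.get)
    return provider, prices[provider]

provider, price = cheapest_transit(TRANSIT_PRICES)
print(f"Comcast buys transit from {provider} at ${price:.2f}/Mbps")

# Because Global Crossing and Level 3 peer settlement-free, the traffic
# still reaches Level 3's network either way; but if Comcast buys
# elsewhere, Level 3's revenue from that traffic is zero.
level3_revenue = price if provider == "Level 3" else 0.0
print(f"Level 3 revenue from Comcast transit: ${level3_revenue:.2f}/Mbps")
```

The point of the sketch is that settlement-free peering turns the backbone market into a commodity: the moment one tier 1 provider overprices, the traffic routes around it and the revenue goes to a competitor.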

A decade ago, when Internet Service Retailers (to use Steve’s terminology) were much smaller than backbone providers, that was the whole story. The retailers didn’t have the resources to build their own global networks, and their small size meant they had relatively little bargaining power against the backbone providers. So the rule was that Internet Service Retailers charged their customers for Internet access, and then passed some of that revenue along to the backbone providers that offered global connectivity. There may have been relatively little competition in the retailer market, but this didn’t have much effect on the overall structure of the Internet because no single retailer had enough market power to go toe-to-toe with the backbone providers.