July 23, 2016

Dan Wallach

Dan Wallach is a professor in the Department of Computer Science and a Rice Scholar at the Baker Institute for Public Policy at Rice University. He has also served as a member of the Air Force Scientific Advisory Board and as a member of the USENIX Board of Directors. His research concerns computer systems security.

On distracted driving and required phone searches

A recent Ars Technica article discussed several U.S. states that are considering a roadside “textalyzer” that would operate analogously to roadside Breathalyzer tests. In the same way that alcohol and drugs can impair a driver’s ability to navigate the road, so can paying attention to your phone rather than the world beyond the windshield. Many states “require” drivers to consent to Breathalyzer tests, where that “requirement” boils down to serious penalties if the driver declines. Vendors like Cellebrite are pushing for analogous requirements for phone searches, for which they just happen to sell products.
[Read more…]

An analogy to understand the FBI’s request of Apple

After my previous blog post about the FBI, Apple, and the San Bernardino iPhone, I’ve been reading many other bloggers and news articles on the topic. What seems to be missing is a decent analogy to explain the unusual nature of the FBI’s demand and the importance of Apple’s stance in opposition to it. Before I dive in, it’s worth understanding what the FBI’s larger goals are. Cyrus Vance Jr., the Manhattan DA, states it clearly: “no smartphone lies beyond the reach of a judicial search warrant.” That’s the FBI’s real goal. The San Bernardino case is just a vehicle toward achieving that goal. With this in mind, it’s less important to focus on the specific details of the San Bernardino case, the subtle improvements Apple has made to the iPhone since the 5c, or the apparent mishandling of the iCloud account behind the San Bernardino iPhone.

Our Analogy: TSA Luggage Locks

When you check your bags in the airport, you may well want to lock them, to keep baggage handlers and other interlopers from stealing your stuff. But, of course, baggage inspectors have a legitimate need to look through bags. Your bags don’t have any right of privacy in an airport. To satisfy these needs, we now have “TSA locks”. You get a combination you can enter, and the TSA gets their own secret key that allows airport staff to open any TSA lock. That’s a “backdoor”, engineered into the lock’s design.
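
In software terms, a TSA lock is a key-escrow design: the same mechanism opens for either of two credentials, and every lock shares one master secret. Here’s a minimal sketch of the idea (a toy model, not any real lock protocol; all names and values are illustrative):

```python
# Toy model of a TSA-style escrowed lock: it opens for the owner's
# combination OR for a master key held by the inspecting authority.
# The master secret is shared across every lock ever made, which is
# exactly why leaking it once breaks all of them at once.

import hmac

TSA_MASTER_KEY = "master-007"  # hypothetical; one secret for all locks

class EscrowedLock:
    def __init__(self, owner_combination: str):
        self.owner_combination = owner_combination

    def open(self, credential: str) -> bool:
        # compare_digest is a constant-time string comparison
        return (hmac.compare_digest(credential, self.owner_combination)
                or hmac.compare_digest(credential, TSA_MASTER_KEY))

lock = EscrowedLock("4-8-15")
assert lock.open("4-8-15")        # the owner opens it
assert lock.open("master-007")    # so does anyone holding the master key
assert not lock.open("wild guess")
```

The shared constant is the whole story: the design’s security reduces to keeping one secret, forever, across every inspector who ever holds a copy.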

What’s the alternative? If you want the TSA to have the technical capacity to search a large percentage of bags, then there really isn’t an alternative. After all, if we used “real” locks, then the TSA would be “forced” to cut them open. But consider the hypothetical case where these sorts of searches were exceptionally rare. At that point, the local TSA could keep hundreds of spare locks, of all makes and models. They could cut off your super-duper strong lock, inspect your bag, and then replace the cut lock with a brand new one of the same variety. They could extract the PIN or key cylinder from the broken lock and install it in the new one. They could even rough up the new one so it looks just like the original. Needless to say, this would be a specialized skill and it would be expensive to use. That’s pretty much where we are in terms of hacking the newest smartphones.

Another area where this analogy holds up is all the people who will “need” access to the backdoor keys. Who gets the backdoor keys? Sure, it might begin with the TSA, but every baggage inspector in every airport, worldwide, will demand access to those keys. And they’ll even justify it, because their inspectors work together with ours to defeat smuggling and other crimes. We’re all in this together! Next thing you know, the backdoor keys are everywhere. Is that a bad thing? Well, the TSA backdoor lock scheme is only as secure as their ability to keep the keys a secret. And what happened? The TSA mistakenly allowed the Washington Post to publish a photo of all the keys, which makes it trivial for anyone to fabricate those keys. (CAD files for them are now online!) Consequently, anybody can take advantage of the TSA locks’ designed-in backdoor, not just all the world’s baggage inspectors.

For San Bernardino, the FBI wants Apple to retrofit a backdoor mechanism where there wasn’t one previously. The legal precedent that the FBI wants would create the capability to convert any luggage lock into a TSA backdoor lock. This would only be necessary if the FBI wanted access to lots of phones, at a scale where its specialized phone-cracking team becomes too expensive to operate. This no doubt becomes all the more pressing for the FBI as modern smartphones get better and better at resisting physical attacks.

Where the analogy breaks down: If you travel with expensive stuff in your luggage, you know well that those locks have very limited resistance to an attacker with bolt cutters. If somebody steals your luggage, they’ll get your stuff, whereas that’s not necessarily the case with a modern iPhone. These phones are akin to luggage with some kind of self-destruct charge inside: force the luggage open and the contents are destroyed. Another important difference is that much of the data the FBI presumably wants from the San Bernardino phone can be obtained elsewhere, e.g., phone call metadata and cellular tower usage metadata. We have very little reason to believe that the FBI needs anything on that phone whatsoever, relative to the mountain of evidence it already has.

Why this analogy is important: The capability to access the San Bernardino iPhone, as the court order describes it, is a one-off thing—a magic wand that converts precisely one traditional luggage lock into a TSA backdoor lock, having no effect on any other lock in the world. But as Vance makes clear in his New York Times opinion piece, the stakes are much higher than that. The FBI wants this magic wand, in the form of judicial orders and a bespoke Apple engineering process, to gain backdoor access to any phone in its possession. If the FBI can demand this of Apple, then so can any other government. Apple will quickly want to get itself out of the business of adjudicating these demands, so it will engineer in the backdoor feature once and for all, albeit under duress, and will share the necessary secrets with the FBI and with every other nation-state’s police and intelligence agencies. In other words, Apple will be forced to install a TSA backdoor key in every phone it makes, and so will everybody else.

While this would be lovely for helping the FBI gather the evidence it wants, it would be especially lovely for foreign intelligence officers, operating on our shores, or going after our citizens when they travel abroad. If they pickpocket a phone from a high-value target, our FBI’s policies will enable any intel or police organization, anywhere, to trivially exercise any phone’s TSA backdoor lock and access all the intel within. Needless to say, we already have a hard time defending ourselves from nation-state adversaries’ cyber-exfiltration attacks. Hopefully, sanity will prevail, because it would be a monumental error for the government to require that all our phones be engineered with backdoors.

Apple, the FBI, and the San Bernardino iPhone

Apple just posted a remarkable “customer letter” on its web site. To understand it, let’s take a few steps back.

In a nutshell, one of the San Bernardino shooters had an iPhone. The FBI wants to root through it as part of its investigation, but it can’t do so effectively because of Apple’s security features. How, exactly, does this work?

  • Modern iPhones (and also modern Android devices) encrypt their internal storage. If you were to just cut the Flash chips out of the phone and read them directly, you’d learn nothing.
  • But iPhones need to decrypt that internal storage in order to actually run software. The necessary cryptographic key material is protected by the user’s password or PIN.
  • The FBI wants to be able to exhaustively try all the possible PINs (a “brute force search”), but the iPhone was deliberately engineered with a “rate limit” to make this sort of attack difficult. (A sketch of why the rate limit is the crux follows this list.)
  • The only other option, the FBI claims, is to replace the standard copy of iOS with something custom-engineered to defeat these rate limits, but an iPhone will only accept an update to iOS if it’s digitally signed by Apple. Consequently, the FBI convinced a judge to compel Apple to create a custom version of iOS, just for them, solely for this investigation.
  • I’m going to ignore the legal arguments on both sides, and focus on the technical and policy aspects. It’s certainly technically possible for Apple to do this. They could even engineer their customized iOS build to measure the serial number of the iPhone on which it’s installed, such that the backdoor would only work on the San Bernardino suspect’s phone, without being a general-purpose skeleton key for all iPhones.
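
Concretely, here’s a minimal sketch of why a 4-digit PIN is worthless without a rate limit. This is emphatically not Apple’s actual key derivation; real iPhones entangle the PIN with a device-unique hardware key, which is why the brute force has to run on the phone itself rather than on a rack of servers.

```python
# Without a rate limit, the entire 4-digit PIN space (10,000 guesses)
# falls in well under a minute on a laptop. Illustrative key
# derivation only, not Apple's.

import hashlib, os

salt = os.urandom(16)  # stand-in for per-device data

def key_from_pin(pin: str) -> bytes:
    # PBKDF2 makes each guess cost something, but 10,000 guesses
    # is a trivial workload no matter how you tune the iterations.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 10_000)

true_key = key_from_pin("4951")  # the key we pretend not to know

for n in range(10_000):          # exhaustive search over all 4-digit PINs
    pin = f"{n:04d}"
    if key_from_pin(pin) == true_key:
        print("recovered PIN:", pin)
        break
```

The iPhone’s escalating delays between failed attempts, plus the optional erase-after-ten-failures setting, are what turn this trivial loop into something impractical; the FBI’s custom iOS would exist precisely to switch them off.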

With all that as background, it’s worth considering a variety of questions.
[Read more…]

On compromising app developers to go after their users

In a recent article by Scahill and Begley, we learned that the CIA is interested in targeting Apple products. I largely agree with the quote from Steve Bellovin, that “spies gonna spy”, so of course they’re interested in targeting the platform that rides in the pockets of many of their intelligence collection targets. What could be a tastier platform for intelligence collection than a device with a microphone, cellular network connection, GPS, and a battery, which your targets willingly carry around in their pockets? Even better, your targets will recharge your spying device for you. Of course you target their iPhones! (And Androids. And BlackBerrys.)

To my mind, the real eyebrow-raising moment was that the CIA is also allegedly targeting app developers by “whacking” Apple’s Xcode tool, presumably so that all subsequent software shipped from the developer to the App Store contains some sort of malicious implant, distributed to users inside that developer’s app. Nothing has been disclosed about how widespread these attacks are (if they were ever used at all), which developers might have been targeted, or how the implants might function.
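
Nothing public says how such an implant would actually work, but the general shape of a build-tool compromise has been understood since Ken Thompson’s “Reflections on Trusting Trust.” Here’s a deliberately toy, hypothetical sketch of the idea (no relation to any real attack code): a wrapper around the real compiler that splices a payload into every build, so the developer ships the implant without ever seeing it.

```python
# Toy sketch of a trojaned build tool. It injects an extra payload into
# a temporary copy of each source file, then invokes the real compiler.
# The developer's file on disk, editor, and version control all stay
# clean; only the shipped binary carries the implant. Hypothetical.

import subprocess, sys, tempfile

IMPLANT = '\nvoid __implant(void) { /* phone home, exfiltrate, etc. */ }\n'

def trojaned_cc(source_path: str, *cc_args: str) -> None:
    with open(source_path) as f:
        code = f.read()
    with tempfile.NamedTemporaryFile("w", suffix=".c", delete=False) as tmp:
        tmp.write(code + IMPLANT)
        patched = tmp.name
    # Hand the patched copy to the genuine compiler.
    subprocess.run(["cc", patched, *cc_args], check=True)

if __name__ == "__main__":
    trojaned_cc(sys.argv[1], *sys.argv[2:])
```

A real implant would hide far better than this, of course; the point is only how little the attacker needs to touch once the toolchain itself is compromised.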
[Read more…]

Android WebView security and the mobile advertising marketplace

Freedom to Tinker readers are probably aware of the current controversy over Google’s handling of ongoing security vulnerabilities in its Android WebView component. What sounds at first like a routine security problem turns out to involve some deep challenges. Let’s start by filling in some background and build up to the big problem they’re not talking about: Android advertising.
[Read more…]

Striking a balance between advertising and ad blocking

In the news, we have a consortium of French publishers, which somehow includes several major U.S. corporations (Google, Microsoft), attempting to sue AdBlock Plus developer Eyeo, a German firm with developers around the world. I have no idea what the legal basis for their case might be, but it’s clearly all about the money. AdBlock Plus and the closely related AdBlock are among the most popular Chrome extensions, by far, and the publishers will no doubt claim huge monetary damages based on presumed “lost income”.
[Read more…]

Your TV is spying on you, and what you can do about it

An observer in the UK recently noticed, with the help of a packet sniffer, that his LG “smart” TV was sending all his viewing habits back to an LG server. This included filenames from an external USB disk. Add this atop observations that Samsung’s 2012-era “smart” TVs were riddled with security holes. (No word yet on the 2013 edition.)

What’s going on here? Mostly it’s just incompetence. Somebody thought it was a good idea to build these TVs with all these features and nobody ever said “maybe we need some security people on the design team to make sure we don’t have a problem”, much less “maybe all this data flowing from the TV to us constitutes a massive violation of our customers’ privacy that will land us in legal hot water.” The deep issue here is that it’s relatively easy to build something that works, but it’s significantly harder to build something that’s secure and respects privacy.
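
Spotting this kind of exfiltration takes nothing exotic. Here’s a minimal sketch of the observer’s setup using the scapy library; the TV’s address is a made-up example, and you’d need to be positioned to see the TV’s traffic (a mirrored switch port, or running this on the router itself):

```python
# Print whatever a "smart" TV sends to the outside world. The LG
# traffic in the original report was reportedly plain HTTP, so viewing
# data and USB filenames show up as readable strings in the payloads.

from scapy.all import Raw, sniff

TV_IP = "192.168.1.50"  # hypothetical LAN address of the TV

def show(pkt):
    if pkt.haslayer(Raw):
        print(pkt.summary())
        print(bytes(pkt[Raw])[:200])  # first 200 bytes of payload

# Requires root/admin privileges to capture packets.
sniff(filter=f"src host {TV_IP} and tcp", prn=show, store=False)
```
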
[Read more…]

Engineering an insider-attack-resistant email system and why you wouldn’t want to use it

Earlier this week, Felten made the observation that the government eavesdropping on Lavabit could be considered an insider attack against Lavabit users. This leads to the obvious question: how might we design an email system that’s resistant to such an attack? The sad answer is that we’ve had this technology for decades, but it never took off. Phil Zimmermann put out PGP in 1991. S/MIME-based PKI email encryption was widely supported by the late 1990s. So why didn’t it become ubiquitous?
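
None of this is exotic technology. As a sketch of how routine it is, here’s an OpenPGP round trip via the python-gnupg wrapper (assuming GnuPG is installed; the email address, passphrase, and keyring path are placeholders):

```python
# Email encryption has been a solved technical problem for decades;
# what never got solved is making key management painless for users.

import os
import gnupg

home = "/tmp/gpg-demo"            # throwaway keyring for the demo
os.makedirs(home, exist_ok=True)
gpg = gnupg.GPG(gnupghome=home)

key = gpg.gen_key(gpg.gen_key_input(
    name_email="alice@example.com", passphrase="demo-passphrase"))

ciphertext = gpg.encrypt("meet at noon", key.fingerprint,
                         always_trust=True)
assert ciphertext.ok

plaintext = gpg.decrypt(str(ciphertext), passphrase="demo-passphrase")
assert str(plaintext) == "meet at noon"
```

Fifteen lines for the crypto. The hard part, then and now, is getting the other party’s key, knowing it’s really theirs, and keeping yours safe, which is the part PGP and S/MIME both left on the user’s shoulders.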
[Read more…]

Lavabit and how law enforcement access might be done in the future

The saga of Lavabit, the now-closed “secure” mail provider, is an interesting object of study. They’re in the process of appealing a court order to produce their SSL private keys, with which a government eavesdropper would have access to all traffic going in and out of Lavabit. You can read Lavabit’s appeal brief and a general summary of their legal situation. What jumps out is that Lavabit tried to propose an alternative: giving access exclusively to metadata from the target of the investigation. Lavabit’s proposal:

  • The government would pay $3500 for Lavabit’s development costs and operations
  • The operations would provide a variety of email headers for the target of the investigation, notably excluding the Subject line (a header-filtering sketch follows this list)
  • This surveillance data would be sent in daily batches to the government
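
Technically, the header-only tap Lavabit proposed is almost trivially simple; the hard questions are all legal. Here’s a minimal sketch with Python’s standard email parser (the header whitelist is my guess for illustration; the actual proposal’s list wasn’t published, beyond the fact that the subject line was excluded):

```python
# Sketch of Lavabit's proposed metadata-only tap: keep routing headers
# for mail touching the target's account, drop the Subject and the
# body, and hand the results over in daily batches.

from email import message_from_string

# Hypothetical whitelist -- the real list was never published.
KEEP = {"From", "To", "Cc", "Date", "Message-ID", "Received"}

def metadata_only(raw_message: str) -> list:
    msg = message_from_string(raw_message)
    # A list of pairs, since headers like Received legitimately repeat.
    return [(k, v) for k, v in msg.items() if k in KEEP]

raw = """From: alice@example.com
To: target@lavabit.com
Subject: this line would NOT be handed over
Date: Mon, 5 Aug 2013 10:00:00 -0500

The body is never touched at all.
"""
print(metadata_only(raw))  # From, To, Date only -- no Subject, no body
```
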

It appears that the government wasn’t interested in negotiating this, instead going for the whole enchilada, which then led Lavabit to pull the plug on its service. The question I want to pursue is how this whole situation could have played out in a way that satisfied the government’s investigative needs without a flagrant violation of the Fourth Amendment’s prohibition against unreasonable search and seizure. Consider whether Lavabit might have adopted Google’s legal procedures. Google clearly spells out what it’s willing to divulge with a subpoena, court order, or warrant (and nicely defines each of those terms). In Google’s process, the government brings a written search warrant, Google’s legal team reviews it, and then they provide access to the targeted account, giving notice to the affected user when they’re allowed to. Seems reasonable, right?

If all the government needed was real-time traces of specific subjects, that would seem to be a reasonable point of negotiation between it and Lavabit. For the right price, Lavabit could certainly have engineered a solution to the government’s needs. It appears that there wasn’t any serious attempt at negotiation; the government wanted much more than this, and that created the dispute. (The Guardian claims that the government also wasn’t willing to pay the $3500, calling it unreasonable. It’s hard to stomach that claim, given all the other expenses involved in a major criminal investigation.)

Lavabit used SSL to protect data in transit, and other cryptography derived from the user’s password to protect data on its hard drives. But when the user logs in, that key material is necessarily available to the server in order to present the data to the user. While users might be able to use stronger cryptographic means to protect their data against legal warrants (e.g., using Thunderbird with the Enigmail OpenPGP plugin), ultimately the lesson of Lavabit is that technology alone cannot solve a legal problem. A future Lavabit needs to have its legal processes sorted out in advance, making reasonable promises to its users and making reasonable access available to the government. Likewise, it’s time for Congress to establish some clear limits on government surveillance to prevent unreasonable search and collection practices in the future.
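
The structural weakness is easy to see in code. Here’s a minimal sketch of the Lavabit-shaped design, using the pyca/cryptography package (the algorithms and parameters are my guesses for illustration, not Lavabit’s actual scheme):

```python
# The Lavabit-shaped problem: stored mail is encrypted under a key
# derived from the user's password, but the derivation happens on the
# SERVER at login -- so a compelled or compromised server sees the key.

import base64, hashlib
from cryptography.fernet import Fernet

def storage_key(password: str, salt: bytes) -> bytes:
    raw = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 key

salt = b"per-user-salt-16"                # per-user, stored server-side
key = storage_key("hunter2", salt)        # derived server-side at login
box = Fernet(key)

stored = box.encrypt(b"Dear Bob, ...")    # at-rest encryption works fine
# ...but right here, `key` is sitting in the server's memory, which is
# exactly what a compelled SSL-key handover or an implant would expose.
print(box.decrypt(stored))
```

The fix, deriving the key on the client so the server only ever holds ciphertext, is the PGP-style alternative, with all the usability baggage that entails.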

On the NSA’s capabilities

Last Thursday brought significant new revelations about the capabilities of the National Security Agency. While the articles in the New York Times, ProPublica, and The Guardian skirted around technical specifics, several broad themes came out.

  • NSA has the capacity to read significant amounts of encrypted Internet traffic.
  • NSA has some amount of cooperation from vendors to weaken cryptographic aspects of their products.
  • NSA has a stable of exploits designed to break into specifically targeted computers (“tailored access”, in NSA parlance).
  • NSA shares this technology with its counterparts at some of our close allies, apparently the “Five Eyes” group (the USA, Canada, the UK, Australia, and New Zealand).

The NSA appears to be taking a holistic approach toward its interception technologies. [Read more…]