October 5, 2024

An analogy to understand the FBI's request of Apple

After my previous blog post about the FBI, Apple, and the San Bernardino iPhone, I’ve been reading many other bloggers and news articles on the topic. What seems to be missing is a decent analogy to explain the unusual nature of the FBI’s demand and the importance of Apple’s stance in opposition to it. Before I dive in, it’s worth understanding the FBI’s larger goal. Cyrus Vance Jr., the Manhattan DA, states it clearly: “no smartphone lies beyond the reach of a judicial search warrant.” That’s the FBI’s real goal. The San Bernardino case is just a vehicle toward achieving it. With this in mind, it’s less important to focus on the specific details of the San Bernardino case, the subtle improvements Apple has made to the iPhone since the 5c, or the apparent mishandling of the iCloud account behind the San Bernardino iPhone.

Our Analogy: TSA Luggage Locks

When you check your bags at the airport, you may well want to lock them, to keep baggage handlers and other interlopers from stealing your stuff. But, of course, baggage inspectors have a legitimate need to look through bags. Your bags don’t have any right of privacy in an airport. To satisfy both needs, we now have “TSA locks”: you set your own combination, and the TSA holds a secret master key that lets airport staff open any TSA lock. That’s a “backdoor”, engineered into the lock’s design.

What’s the alternative? If you want the TSA to have the technical capacity to search a large percentage of bags, then there really isn’t an alternative. After all, if we used “real” locks, then the TSA would be “forced” to cut them open. But consider the hypothetical case where these sorts of searches were exceptionally rare. At that point, the local TSA could keep hundreds of spare locks, of all makes and models. They could cut off your super-duper strong lock, inspect your bag, and then replace the cut lock with a brand new one of the same variety. They could extract the PIN or key cylinder from the broken lock and install it in the new one. They could even rough up the new one so it looks just like the original. Needless to say, this would be a specialized skill and it would be expensive to use. That’s pretty much where we are in terms of hacking the newest smartphones.

Another area where this analogy holds up is all the people who will “need” access to the backdoor keys. Who gets them? Sure, it might begin with the TSA, but every baggage inspector in every airport, worldwide, will demand access to those keys. And they’ll even justify it, because their inspectors work together with ours to defeat smuggling and other crimes. We’re all in this together! Next thing you know, the backdoor keys are everywhere. Is that a bad thing? Well, the TSA backdoor lock scheme is only as secure as the TSA’s ability to keep the keys secret. And what happened? The TSA mistakenly allowed the Washington Post to publish a photo of all the keys, which made it trivial for anyone to fabricate them. (CAD files for the keys are now online!) Consequently, anybody can take advantage of the TSA locks’ designed-in backdoor, not just all the world’s baggage inspectors.

For San Bernardino, the FBI wants Apple to retrofit a backdoor mechanism where there wasn’t one previously. The legal precedent the FBI seeks amounts to a capability to convert any luggage lock into a TSA backdoor lock. This would only be necessary if they wanted access to lots of phones, at a scale where their specialized phone-cracking team becomes too expensive to operate. This no doubt becomes all the more pressing for the FBI as modern smartphones get better and better at resisting physical attacks.

Where the analogy breaks down: If you travel with expensive stuff in your luggage, you know well that those locks offer very limited resistance to an attacker with bolt cutters. If somebody steals your luggage, they’ll get your stuff, whereas that’s not necessarily the case with a modern iPhone. These phones are akin to luggage with a self-destruct charge inside: force the luggage open and the contents are destroyed. Another important difference is that much of the data the FBI presumably wants from the San Bernardino phone can be obtained elsewhere, e.g., phone call metadata and cellular tower usage records. We have very little reason to believe the FBI needs anything on that phone whatsoever, relative to the mountain of evidence it already has.

Why this analogy is important: The capability to access the San Bernardino iPhone, as the court order describes it, is a one-off: a magic wand that converts precisely one traditional luggage lock into a TSA backdoor lock, with no effect on any other lock in the world. But as Vance makes clear in his New York Times opinion piece, the stakes are much higher than that. The FBI wants this magic wand, in the form of judicial orders and a bespoke Apple engineering process, to gain backdoor access to any phone in its possession. If the FBI can demand this of Apple, then so can any other government. Apple will quickly want to get itself out of the business of adjudicating these demands, so it will engineer in the backdoor feature once and for all, albeit under duress, and will share the necessary secrets with the FBI and with every other nation-state’s police and intelligence agencies. In other words, Apple will be forced to install a TSA backdoor key in every phone it makes, and so will everybody else.

While this would be lovely for helping the FBI gather the evidence it wants, it would be especially lovely for foreign intelligence officers operating on our shores or going after our citizens when they travel abroad. If they pickpocket a phone from a high-value target, our FBI’s policies will enable any intelligence or police organization, anywhere, to trivially open any phone’s TSA backdoor lock and access all the intel within. Needless to say, we already have a hard time defending ourselves from nation-state adversaries’ cyber-exfiltration attacks. Hopefully, sanity will prevail, because it would be a monumental error for the government to require that all our phones be engineered with backdoors.

Comments

  1. The govt can compel people to do many things that are deemed to be in the interest of the public. You can be compelled to sell your property and move. You are compelled every year to fill out complicated tax forms, or to hire experts to do it for you. Businesses are compelled to comply with tons of complicated regulations at huge expense. If a judge can be persuaded to approve a warrant to search your rented property, then your landlord may be compelled to go unlock the door for the investigator, and you might not even know that your privacy has been invaded if they do it secretly and carefully. This cellphone issue is covered under the legal authority of the US govt to pursue criminal activity to keep the public safer. Why can’t Apple find a way to cooperate under limited and strictly controlled processes? And without re-writing their code to be hackable by any joe schmo? Isn’t Apple smart enough to do that? I don’t get it. :-/

  2. Poor analogy. I have lost an enormous amount of respect for people who once seemed rational and of presumed integrity. The EFF has placed itself in the toilet. It is odd when Donald Trump makes more sense than self-proclaimed security and privacy experts. Uncomfortable, but these are odd times indeed. The “I” in “CIA” among the self-professed security illuminati is at an all-time low. Maybe they have a bad case of Snowden envy or a lack of state secrets to peddle at the moment.

    What the FBI asked is no more than what every bank has to provide if a safety deposit box needs to be drilled. In this case, the safety deposit box could further be transported off-site by design. The FBI would examine it by Skype connection. OMG, a special drill bit might be required. Oh, horrors, a different drill bit might be used on a different safety deposit box. The VERY principle of drilling safety deposit boxes, which of course nobody has ever done before (just ask the wise “security experts”), is beyond comprehension. This cannot even be contemplated without generating the largest catch of red herrings since Bush II’s WMD stories. Why, it is enough to have Snowden himself bring his FSB colleagues to pay a visit to those so bold as to even think of it. OMG, nobody would ever trust the bank’s brand again, because economic power is more important than criminal justice or counterterrorism. And let’s face it, people are stupid and gullible, or else these “experts” would be seen for the emperor’s clothing they wear.

    Money-driven “free speech” of corporate power and the oligarchs who manipulate the system must not even be questioned. No, these oligarchs must be “heroes” of the common man, a scam a certain political candidate also seems to be good at right now. Obviously, since software is free speech, much like political campaign bribery, as every Apple and elite security fanboy knows, we can now stop patenting software, because you can’t patent free speech. Oh, why do those dense fools who believe in due process not see the wisdom of this? Cue two more rounds of hysteria. Perhaps we can gin up enough frothing at the mouth to be worthy of invading Iraq, or send to Gitmo all who would request such a filthy, disgusting thing as complying with due process in the name of good corporate citizenship.

  3. This is all laughable propaganda. Apple is just another corrupt corporation that fully cooperates with the criminals of the US government. This is all just a stupid marketing show for ignorant joe-six-packs.

  4. “Apple will quickly want to get itself out of the business of adjudicating these demands, so it will engineer in the backdoor feature once and for all”

    I agree with the first part, but not the conclusion. Predicting what Apple will do is naturally fraught with speculation and uncertainty, but it’s difficult to see a path where they would do something like that. At worst, I would expect Apple to focus on recouping its costs, and/or on teaching law enforcement how to do this job better themselves.

    I think the other commenters are more likely correct: Apple will react by making the encryption even harder to break, removing this particular avenue.

    All that said, I am bewildered by law enforcement’s insistence that Apple help with this. I’m not a hacking expert, but my understanding is that having physical access to the hardware opens the door to a number of sophisticated attacks.

    For example, why can’t the FBI simply copy the flash memory independently of the phone hardware itself? Presumably, with access to the hardware, they should be able to make a bit-for-bit copy of the encrypted data. At the very least, this would protect against the risk of auto-deletion: they can run the attack as often as they want and let the phone delete the data as often as it wants; all they have to do is restore the original copy and start over (see the sketch at the end of this comment).

    Similarly, once they have the bit-for-bit copy of the encrypted data, who’s to say they have to run the OS on the phone at all? It seems to me that they should be able to extract the necessary information from the various parts of the phone (including security chips that encrypt/decrypt the OS components and/or related data), and then run that in a virtual machine. With complete control over the machine, they could do things like spoof the clock (negating the protection against rapidly trying many codes), emulate the touch screen (allowing automation of the code entry), or even bypass signing requirements for the OS (allowing them to provide their version of the OS that provides those features directly).

    And having implemented such a solution, they would have the tools and expertise to apply that approach to _any_ phone, made by _any_ manufacturer. They could execute warrants on their own, without dragging third parties into it, avoiding all of the fuss.

    Granted, there is the risk that these tools, and possibly even the expertise to use them, could leak from the law enforcement agency. But hostile entities can develop such tools and expertise independently anyway. The key here is to avoid the constitutional violations inherent in what the FBI wants now.
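
    A minimal sketch of that restore-and-retry loop, in Python. Everything in it is assumed: WIPE_LIMIT reflects iOS’s optional erase-after-ten-failures setting, and try_passcode and reflash are hypothetical stand-ins for the hardware steps of entering a candidate code and rewriting the phone’s flash from the pristine copy.

        # Hypothetical sketch of the restore-and-retry ("NAND mirroring")
        # idea. try_passcode and reflash are stand-ins for hardware
        # operations, not real APIs.
        import itertools

        WIPE_LIMIT = 10  # iOS can erase its keys after 10 failed attempts

        def crack(pristine_image, try_passcode, reflash):
            """Try every 4-digit passcode, restoring the cloned flash
            image before the failure counter reaches the auto-wipe limit."""
            failures = 0
            for digits in itertools.product("0123456789", repeat=4):
                code = "".join(digits)
                if failures == WIPE_LIMIT - 1:
                    reflash(pristine_image)  # roll the counter (and data) back
                    failures = 0
                if try_passcode(code):
                    return code
                failures += 1
            return None

    Run in an emulator with a spoofed clock, as described above, the same loop also sidesteps the escalating delays between attempts: 10,000 four-digit codes at even one second apiece is under three hours.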

  5. A few notes:
    1) Isn’t a search warrant a “magic wand” that unlocks the front door of your house? I don’t understand how a computer is so different from the rest of your physical life (your car, your house, the computer inside your house). Do people who object to search warrants for phones think it’s just fine that your front door can be broken down and all other records searched?
    2) I really object to the term “backdoor.” To me that implies (a) secret and (b) intentional weakening of the product. This is not secret, and it is just a result of design decisions Apple made (you really think there are no nation-states capable of this right now? There is no world in which that is the case).

    I think there’s only one possible resolution here: Apple is forced to cooperate and implement the hacked firmware, and then it builds a phone that is secure from everyone, including Apple. Then the debate moves to the actual question: is it legal to sell an encrypted phone that cannot be broken?

    • “Do people who object to search warrants for phones think it’s just fine that your front door can be broken down and all other records searched?”

      Non sequitur. The issue here is whether the owner or a third party can be *compelled* to assist in the execution of the warrant. Historically, law enforcement has been granted warrants (of course), but those warrants carry a presumption that law enforcement themselves will do the dirty work.

      Frankly, I already object to the general attitude on the part of law enforcement that an executed warrant (properly or even, for that matter, improperly executed!) removes their obligation to restore damaged property after the fact. This movement would extend that offense by forcing otherwise uninvolved persons or companies to incur a cost in time and money to assist in the investigation.

      “I really object to the term “backdoor.” To me that implies (a) secret and (b) intentional weakening of the product.”

      “Backdoor” does not imply “secret”; hence the TSA luggage lock analogy (that backdoor is hardly secret). And both the luggage lock and a provision that would allow law enforcement to force third parties to weaken the security of a product (even on a case-by-case basis) certainly sound like an “intentional weakening of the product” to me.

      “is it legal to sell an encrypted phone that cannot be broken?”

      Define “cannot”. Given sufficient resources, any encryption can be broken. It seems to me that we are already debating the actual question: should third parties be compelled to weaken security in order to reduce the resources required to break encryption?
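
      To put rough numbers on “given sufficient resources” (every figure below is an assumption: the cracking-rig rate is invented, and the ~80 ms per on-device attempt reflects the key-derivation timing Apple has published):

          # Back-of-envelope: the AES key itself is effectively unbreakable;
          # the human-chosen passcode protecting it is not. Rates are guesses.
          SECONDS_PER_YEAR = 3.15e7
          rig_rate = 1e12                  # guesses/sec, hypothetical rig
          aes_years = 2**256 / rig_rate / SECONDS_PER_YEAR
          pin_minutes = 10**4 * 0.08 / 60  # 4-digit PIN, ~80 ms per try
          print(f"raw AES-256 key: {aes_years:.1e} years")       # ~3.7e57
          print(f"4-digit passcode: {pin_minutes:.0f} minutes")  # ~13

      The cheap path to the data runs through the passcode, and the rate limits, delays, and auto-wipe exist to close that path; that gap is what the dispute is actually about.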

  6. peg dash fab says

    “Apple will … engineer in the backdoor feature once and for all, albeit under duress, and will share the necessary secrets with the FBI and with every other nation-state’s police and intelligence agencies.”

    I suspect Apple will not go this route, but will instead engineer the next generation of phones to eliminate the ability to build these bespoke backdoors, e.g., by causing the Secure Enclave to wipe its contents when reflashed.
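
    A minimal sketch of that design, assuming a hypothetical SecureElement class (none of these names are Apple’s):

        # Hypothetical sketch: a secure element that destroys its key
        # material whenever firmware is loaded without the user's passcode.
        class SecureElement:
            def __init__(self) -> None:
                self.key_material = b"\x13" * 32  # entangled with the passcode

            def load_firmware(self, image: bytes, passcode_ok: bool) -> None:
                if not passcode_ok:
                    # A coerced, vendor-signed image pushed without the
                    # passcode erases the very keys it would target.
                    self.key_material = b"\x00" * 32
                self._flash(image)

            def _flash(self, image: bytes) -> None:
                pass  # write the new firmware to the device

    Under that rule, a court could still order Apple to sign and push new firmware, but complying would destroy the very data the order was after.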