Apple just posted a remarkable “customer letter” on its web site. To understand it, let’s take a few steps back.
In a nutshell, one of the San Bernardino shooters had an iPhone. The FBI wants to root through it as part of their investigation, but they can’t do this effectively because of Apple’s security features. How, exactly, does this work?
- Modern iPhones (and also modern Android devices) encrypt their internal storage. If you were to just cut the Flash chips out of the phone and read them directly, you’d learn nothing.
- But iPhones need to decrypt that internal storage in order to actually run software. The necessary cryptographic key material is protected by the user’s password or PIN.
- The FBI wants to be able to exhaustively try all the possible PINs (a “brute force search”), but the iPhone was deliberately engineered with a “rate limit” to make this sort of attack difficult. (A back-of-envelope sketch of why this matters follows this list.)
- The only other option, the FBI claims, is to replace the standard copy of iOS with something custom-engineered to defeat these rate limits, but an iPhone will only accept an update to iOS if it’s digitally signed by Apple. Consequently, the FBI convinced a judge to compel Apple to create a custom version of iOS, just for them, solely for this investigation.
- I’m going to ignore the legal arguments on both sides, and focus on the technical and policy aspects. It’s certainly technically possible for Apple to do this. They could even engineer their customized iOS build to check the serial number of the iPhone on which it’s installed, such that the backdoor would only work on the San Bernardino suspect’s phone, without being a general-purpose skeleton key for all iPhones.
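As promised above, here’s the back-of-envelope math on those rate limits, as a quick Python sketch. The specific numbers are illustrative assumptions on my part, not Apple’s documented parameters, but they capture the shape of the problem: a four-digit PIN has only 10,000 possible values, so without escalating lockout delays (and the optional erase-after-ten-failures setting), guessing every PIN is a matter of minutes.

```python
# Illustrative back-of-envelope only; these figures are assumptions, not
# Apple's actual implementation parameters.
PIN_SPACE = 10_000           # a 4-digit PIN has 10^4 possible values
KEY_DERIVATION_SECS = 0.08   # assumed hardware cost per passcode attempt

# Without software rate limiting, only the per-attempt key derivation matters.
no_limit = PIN_SPACE * KEY_DERIVATION_SECS
print(f"no rate limit: ~{no_limit / 60:.0f} minutes to try every PIN")

# With escalating lockout delays (assume, very roughly, an hour per attempt
# once the limits kick in), the same search stretches past a year, and iOS
# can be configured to erase the phone entirely after 10 failed attempts.
LOCKOUT_SECS = 3600
with_limit = PIN_SPACE * LOCKOUT_SECS
print(f"with rate limits: ~{with_limit / 86400:.0f} days, if the phone never wipes")
```

That gap, between minutes and a year-plus with a wipe hanging over your head, is exactly what the FBI wants Apple to engineer away.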
With all that as background, it’s worth considering a variety of questions.
Does the FBI’s investigation actually need access to the internals of the iPhone in question?
Apple’s letter states:
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
In Apple’s FAQ on iCloud encryption, they describe how most iCloud features are encrypted both in transit and at rest, with the notable exception of email. So, if the San Bernardino suspect’s phone used Apple’s mail services, then the FBI can read that email. It’s possible that Apple genuinely cannot provide unencrypted access to other data in iCloud without the user’s passwords, but it’s also possible that the FBI could extract the necessary passwords (or related authentication tokens) from other places, like the suspect’s laptop computer.
Let’s assume, for the sake of discussion, that the FBI has not been able to get access to anything else on the suspect’s iPhone or its corresponding iCloud account, and that they’ve exhausted all of their technical avenues of investigation. If the suspect used Gmail or some other service, let’s assume the FBI was able to get access to that as well. So what might they be missing? SMS / iMessage. Notes. Photos. Even knowing what other apps the user has installed could be valuable, since many of them have corresponding back-end cloud services, chock full of tasty evidence. Of course, the suspects’ emails and other collected data might already make for a compelling case against them. We don’t know.
Could the FBI still find a way into their suspect’s iPhone?
Almost certainly yes. Just yesterday, the big news was a security-critical bug in glibc that’s been around since 2008. And for every bug like this that the public knows about, our friends in the government have many more that they keep to themselves. If the San Bernardino suspect’s phone is sufficiently valuable, then it’s time to reach into the treasure chest (both figuratively and literally) and engineer a custom exploit. There’s plenty of attack surface available to them, stretching to the suspect’s personal computers and other devices.
The problem with this sort of attack plan is that it’s expensive, it’s tricky, and it’s not guaranteed to work. Since long before the San Bernardino incident, the FBI has wanted a simpler solution. Get a legal order. Get access. Get evidence. The San Bernardino case clearly spells this out.
What’s so bad about Apple doing what the FBI wants?
Apple’s concern is the precedent set by the FBI’s demand and the judge’s order. If the FBI can compel Apple to create a backdoor like this, then so can anybody else. You’ve now opened the floodgates to every small-town police chief, never mind discovery orders in civil lawsuits. How is Apple supposed to validate and prioritize these requests? What happens when they come from foreign governments? If China demands a custom software build to attack a U.S. resident, how is Apple supposed to judge whether that user and their phone happen to be under the jurisdiction of Chinese law? What if the U.S. then passes a law prohibiting Apple from honoring Chinese requests like this? That way lies madness, and that’s where we’re going.
Even if we could somehow make this work as a policy matter, purely as an engineering matter it’s not feasible to imagine a backdoor mechanism that could support the full gamut of seemingly legal requests to exercise it.
Is backdoor engineering really feasible? What are the tradeoffs?
If there’s anything that the computer security community has learned over the years, it’s that complexity is the enemy of security. One highly relevant example is SSL/TLS support for “export-grade cryptography”, a bad design left over from the 1990s, when the U.S. government tried to regulate the strength of cryptographic products. Last year’s FREAK attack boils down to an exploit that forces SSL/TLS connections to fall back to these weak, export-grade keys. The solution? Remove all export-grade cipher suites from SSL/TLS implementations, since they’re not used and not needed any more.
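As a minimal sketch of what “remove it entirely” looks like from an application’s side (using Python’s standard ssl module and an OpenSSL cipher string; any modern TLS stack has an equivalent knob), you can simply refuse to negotiate export-grade and other legacy cipher suites:

```python
# Sketch only: exclude export-grade and other legacy cipher suites outright,
# rather than trusting negotiation logic to steer around them.
import ssl

context = ssl.create_default_context()
context.set_ciphers("HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5")

# No export-grade suite (names like "EXP-RC4-MD5") should survive the filter.
assert not any(c["name"].startswith("EXP") for c in context.get_ciphers())
```

The deeper fix, of course, happens in the TLS libraries themselves: drop the export suites from the code entirely, which is exactly the kind of simplification the FREAK episode argues for.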
The only way that we know how to build secure software is to make it simple, to use state-of-the-art techniques, and to get rid of older features that we know are weak. Backdoor engineering is the antithesis of this process.
What are appropriate behaviors for an engineering organization like Apple? I’ll quote Google’s Eric Grosse:
Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.’s own behavior invited the new arms race.
“I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering.
“No hard feelings, but my job is to make their job hard,” he added.
As a national policy matter, we need to decide what’s more important: backdoor access to user data, or robustness against nation-state adversaries. If you want backdoor access, then the cascade of engineering decisions that will be necessary to support those backdoors will fundamentally weaken our national security posture. On the flip side, strong defenses are strong against all adversaries, including the domestic legal system.
Indeed, the FBI and other law enforcement agencies will need to come to terms with the limits of their cyber-investigatory powers. Yes, the data you want is out there. No, you can’t get what you want, because cyber-defense must be a higher priority.
What are the alternatives? Can the FBI make do without what it’s asking?
How might the FBI cope in a world where Apple, Google, and other engineering organizations build walls that law enforcement cannot breach? I suspect they’ll do just fine. We know the FBI has remarkably good cyber-investigators. For example, the FBI hacked “approximately 1300” computers as part of a child pornography investigation. Likewise, even if phone data is encrypted, the metadata generated just by walking around with a phone is amazingly revealing. For example, researchers discovered that
data from just four, randomly chosen “spatio-temporal points” (for example, mobile device pings to carrier antennas) was enough to uniquely identify 95% of the individuals, based on their pattern of movement.
In other words, even if you use “burner” phones, investigators can connect them together based on your patterns of movement. With techniques like this, the FBI has access to a mountain of data on their San Bernardino suspects, far more than they ever might have collected in the era before smartphones.
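Here’s a toy illustration of that result, with invented data rather than the study’s actual dataset: given a log of per-person (tower, hour) pings, even four observed points typically match exactly one individual.

```python
# Toy example with invented data: how few spatio-temporal points it takes to
# single out one person from a log of cell-tower pings.

# Hypothetical carrier log: person -> set of (tower_id, hour) pings.
pings = {
    "alice":   {("t1", 8), ("t4", 9), ("t7", 13), ("t2", 18), ("t1", 22)},
    "bob":     {("t1", 8), ("t3", 9), ("t7", 12), ("t2", 19), ("t5", 22)},
    "charlie": {("t6", 8), ("t4", 9), ("t7", 13), ("t2", 18), ("t9", 22)},
}

# Four points observed for an "anonymous" device, say a burner phone.
observed = {("t1", 8), ("t4", 9), ("t7", 13), ("t2", 18)}

matches = [person for person, log in pings.items() if observed <= log]
print(matches)  # ['alice']: those four points already pin down one individual
```

Scale that same subset test up to a real carrier’s logs and you get the study’s result: a handful of points is enough to re-identify nearly everyone.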
In short, the FBI’s worries that targets of its investigations are “going dark” are simply not credible, and their attempts to co-opt technology companies into giving them back doors are working against our national interests.
Wow, I think I understand the issue a lot better; however, I am still supporting Apple.
“cc” points out that the suspect’s employer, which owned the phone, has explicitly granted the FBI access to whatever data they’ve got. We don’t know exactly what that entails and what might be left over on the phone. SMS? They might be able to get that from the carrier. Email? On the employer’s server. Other apps? Each has a cloud provider behind it that might well have the data. Maybe even the contacts list and such are backed by the employer’s Exchange server.
This leads directly to “Or Liraz”, who suggests that the FBI could well have used a third-party provider like Cellebrite, which offers “mobile forensics” services to law enforcement. (Puff piece: http://www.officer.com/article/12154412/mobile-device-data-unlocks-the-critical-connections-that-solve-crimes) We can presume that the FBI has comparable skills in-house, but that Apple’s evolution toward improved security features is slowing them down.
For a high-profile case like this, the FBI can almost certainly afford the resources (time and money) to learn what they need to know. It’s the smaller, lower-visibility cases where perhaps they can’t. I suspect that the real agenda here is using San Bernardino as a wedge to go after what the FBI feels is a growing problem.
Not only do I think it’s a wedge, but it’s a hugely dangerous precedent based on a seemingly benign request.
https://lawfareblog.com/not-slippery-slope-jump-cliff
This is “force provider to do malicious update”…
“This is “force provider to do malicious update”…”
… with the backing of a warrant signed off by a judge. Reality.
The FBI has no right to compel Apple to create a back door. Frankly, this is slave labor. It’s the “writ of assistance” that King George had and that the Founding Fathers abolished.
If the FBI wants a back door, they can damn well hack the phone themselves.
Key Q&A: http://blog.erratasec.com/2016/02/some-notes-on-apple-decryption-san.html#.Vsdyu29qz0M
Q: How is this different from the 70 other times the FBI has asked Apple to unlock a phone?
A: Because those times took a few minutes of effort on Apple’s part. This is asking Apple to spend 2000 hours on creating a new technology that could potentially be used to unlock all older phones. In other words, it’s conscripting a man-year’s worth of labor.
It is strange that the FBI is not using a third-party vendor like Cellebrite.
That could have resolved the issue without involving Apple, which makes you (and me) wonder whether all of this discussion is perhaps about something else.
All of these points are well taken, but does the calculus change because the phone is actually the property of San Bernardino County?
“The work phone is property of San Bernardino County Department of Public Health, where Farook worked as an environmental health specialist trainee.”
therefore there is no reasonable expectation of privacy
Privacy protection for the terrorist killers of innocent people?
How do the families feel?
Sell Apple stock? Change to Samsung? Go back to new Microsoft laptops?
I think the point is privacy protection for all of us.
Remember when the USSR was bad, and one of the reasons we all felt so morally superior is that we had freedom and the USSR was a surveillance state?
But, if a terrorist does something bad, we must all immediately panic and surrender all privacy rights, that is, be terrorized. Fold, I say, fold now in the face of adversity!
Or, you know, we could try to be rational and consider the long-term implications of a knee-jerk change in policy.
I, for one, choose not to be terrorized.
“Remember when the USSR was bad, and one of the reasons we all felt so morally superior is that we had freedom and the USSR was a surveillance state?”
There is also the possibility that, though the USSR was horrid, our own rulers lied to us repeatedly and massively, and the communists were for “We, The People” all along and the west was for “The One Percenters” all along. Maybe.
Karl Marx, no. Lech Wałęsa, sure.
Nice write-up, Dan.
I’m unclear on the bit about installing the firmware update. Don’t you have to unlock the phone with the passcode to be able to install an update?
Here’s another blogger with more on the technical details and some theories about how the FBI’s order might work.
http://blog.erratasec.com/2016/02/some-notes-on-apple-decryption-san.html
How is Apple able to even publish this letter? I was under the impression that most law enforcement demands like this were accompanied by a gag order (e.g., Lavabit). I wonder if the executive branch isn’t simply using this as a way to force the legislative branch to take a position on the issue (pitting individual safety/national security against individual rights/privacy) in an election year. But then again, maybe I’ve just been watching too many shows like House of Cards.
I think the secrecy is around National Security Letters.
Actually, Apple asked to keep the request sealed; the government insisted on publishing it. They want to make the fight public.
http://www.nytimes.com/2016/02/19/technology/how-tim-cook-became-a-bulwark-for-digital-privacy.html
I thought NSLs were intended for situations where the people being investigated don’t know (or aren’t supposed to know) they’re being investigated.
Even if the San Bernardino suspects had survived their shootout with the cops, it would be too late for them to not know they’re suspects. NSLs are to keep the element of surprise, so people don’t go hiring lawyers, petitioning courts to get cops out of their lives, and, to be fair to the government, destroying evidence, too.
Once the cat is out of the bag, and _especially_ if they’re _dead_, what’s the point?