Many readers have asked for a primer summarizing the privacy and security issues at stake in the dispute between Apple and the U.S. Justice Department, which last week convinced a judge in California to order Apple to unlock an iPhone used by one of the assailants in the recent San Bernardino massacre. I don’t have much original reporting to contribute on this important debate, but I’m visiting it here because it’s a complex topic that deserves the broadest possible public scrutiny.
A federal magistrate in California approved an order (PDF) granting the FBI permission to access the data on the iPhone 5c belonging to the late terror suspect Syed Rizwan Farook, one of two individuals responsible for a mass shooting in San Bernardino on Dec. 2, 2015 in which 14 people were killed and many others were injured.
Apple CEO Tim Cook released a letter to customers last week saying the company will appeal the order, citing customer privacy and security concerns.
Most experts seem to agree that Apple is technically capable of complying with the court order. Indeed, as National Public Radio notes in a segment this morning, Apple has agreed to unlock phones in approximately 70 other cases involving requests from the government. However, something unexpected emerged in one of those cases — an iPhone tied to a Brooklyn, NY drug dealer who pleaded guilty to selling methamphetamine last year.
NPR notes that Apple might have complied with that request as well, had something unusual not happened: Federal Magistrate Judge James Orenstein did not sign the order the government wanted, but instead went public and asked Apple if the company had any objections.
“The judge seemed particularly skeptical that the government relied in part on an 18th-century law called the All Writs Act,” reports NPR’s Joel Rose. “Prosecutors say it gives them authority to compel private companies to help carry out search warrants.”
Nevertheless, Apple is resisting this latest order, citing the precedent that complying might set.
“We have great respect for the professionals at the FBI, and we believe their intentions are good,” Cook wrote. “Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
Cook continued: “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
In a letter posted to Lawfare.com and the FBI’s home page, FBI Director James Comey acknowledged that new technology creates serious tensions between privacy and safety, but said this tension should be resolved by the U.S. courts — not by the FBI or by Apple.
“We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly,” Comey said. “That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”
According to the government, Apple has the capability to bypass the passcode on some of its devices, and can even disable an iPhone’s optional auto-erase function, which deletes all data on the phone after a set number of failed passcode attempts (the default is 10).
The iPhone at issue was an iPhone 5c, but it was running Apple’s latest operating system, iOS 9 (PDF), which prompts users to create a six-digit passcode for security. Since iOS 9 allows users to set a 4-digit, 6-digit or alphanumeric PIN, cracking the passcode on the assailant’s iPhone could take anywhere from a few hours to 5.5 years if the FBI used tools to “brute-force” the code and wasn’t hampered by the operating system’s auto-erase feature. That’s because the operating system builds in a tiny time delay between each guess, rendering large-scale brute-force attacks rather time-consuming and potentially costly ventures.
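The range quoted above can be reproduced with some back-of-the-envelope arithmetic. The sketch below assumes roughly 80 milliseconds per guess, the per-attempt delay Apple’s iOS security documentation attributes to the passcode key-derivation function, and ignores the auto-erase feature entirely:

```python
# Rough worst-case brute-force time estimates for iOS passcodes,
# assuming ~80 ms per guess (the key-derivation delay described in
# Apple's iOS security documentation) and auto-erase disabled.

SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def worst_case(keyspace):
    """Return (hours, years) needed to try every passcode in the keyspace."""
    total_seconds = keyspace * SECONDS_PER_GUESS
    return total_seconds / 3600, total_seconds / SECONDS_PER_YEAR

for label, space in [("4-digit PIN", 10**4),
                     ("6-digit PIN", 10**6),
                     ("6-char lowercase alphanumeric", 36**6)]:
    hours, years = worst_case(space)
    print(f"{label}: {hours:,.1f} hours (~{years:.2f} years)")
```

A 4-digit PIN falls in well under an hour, a 6-digit PIN in about a day, and a 6-character lowercase alphanumeric passcode takes on the order of 5.5 years, which is where the figure in the paragraph above comes from.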
In an op-ed that ran in The Washington Post on Feb. 18, noted security expert and cryptographer Bruce Schneier notes that the authority the U.S. government seeks is probably available to the FBI if the agency wants to spring for the funding to develop the capability itself, and that the FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.
“There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues,” Schneier wrote. “There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world.”
Schneier said what the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm.
“The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized,” Schneier wrote. “The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.”
Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), said the same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn’t yet in law enforcement’s hand.
“The request to Apple is accurately paraphrased as ‘Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism’,” Weaver wrote.
Apple appears ready to fight this all the way to the Supreme Court. If the courts decide in the government’s favor, the FBI won’t soon be alone in requesting this authority, Weaver warns.
“Almost immediately, the National Security Agency is going to secretly request the same authority through the Foreign Intelligence Surveillance Court (FISC),” Weaver wrote. “How many honestly believe the FISC wouldn’t rule in the NSA’s favor after the FBI succeeds in getting the authority?”
This debate will almost certainly be decided in the courts, perhaps even by the U.S. Supreme Court. In the meantime, lawmakers in Washington, D.C. are already positioning themselves to…well, study the issue more.
In letters sent last week to Apple and the Justice Department, the House Energy & Commerce Committee invited leaders of both organizations to come testify on the issue in an upcoming hearing. In addition, Sen. Mark Warner (D-Va.) and Rep. Michael McCaul (R-Texas) say they plan to unveil legislation later this week to create a “Digital Security Commission” to investigate whether Congress has a bigger role to play here.
Twitter addicts can follow this lively debate at the hashtag #FBIvsApple, although to be fair the pro-Apple viewpoints appear to be far more represented so far. Where do you come down on this debate? Sound off in the comments below.
Recommended further reading: Jonathan Zdziarski’s take on why this case is different from previous requests from the FBI to Apple.
Hey Brian, your link for the recommended reading by Jonathan Zdziarski is not working. Please update it.
Why should we be concerned about the FBI backdoor when the Chinese who build the Apple phones already have a backdoor?
That’s a pretty good question, Dave. A variant of it: why is Apple’s position so deeply loved by the digidweebery when we have managed to get along without such encryption for many years? Another variant: how do ordinary citizens hold Cook and Apple responsible for facilitating crimes?
I’m amazed there isn’t a workaround for cracking the iPhones the FBI is trying to access. Can’t they remove the storage, clone it, and make 1,000 cloned iPhones? With 10 attempts per phone they could put 100 lawyers in a room and have the code in a matter of hours.
I don’t think you understand the scope of the problem.
From a quick search, it appears that Apple uses a 256 bit AES key — possibly multiple, if I’m reading this right — to encrypt the data on the iPhone. So from the beginning, cloning the data storage on the iPhone may not even be possible until you break a DIFFERENT password of even greater (since it’ll be a code and not a phrase) difficulty.
Even presuming that it is, assuming they’re putting in passwords by hand, with people swapping the phones for them, they’d still need… oh… around 150 billion, billion billion billion billion billion billion (insert a few billion billions here) years to crack the code.
To put it another way, from a great Reddit thread on the subject, the fastest supercomputer in the world is the Tianhe-2 in China. It runs at 34 petaflops (stupidly fast).
One of these supercomputers can guess around 1 septillion (1 with 24 zeroes after it) passwords a year.
Even so, that means that it would take around 5e52 (5 with 52 zeroes after it) years to break the code with that supercomputer. That’s many, many, MANY times longer than the estimated lifetime of the entire universe.
If you wanted to merely break it in the time it took the Universe to create our planet, our civilization, and approach the current day (i.e., the entire history of everything, presuming you started at the big bang), you would merely need 1,000,000,000,000,000,000,000,000,000,000,000,000,000 of these computers, running nonstop since the big bang.
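The keyspace arithmetic in the comment above checks out, and is easy to verify in a few lines of Python. The guess rate of one septillion passwords per year is the commenter’s hypothetical figure, not a measured benchmark:

```python
# Sanity-checking the AES-256 keyspace arithmetic: 2**256 possible keys,
# searched at a hypothetical rate of 1 septillion (1e24) guesses per year.

keyspace = 2 ** 256            # ~1.16e77 possible 256-bit keys
guesses_per_year = 10 ** 24    # the commenter's hypothetical supercomputer rate

# On average a brute-force search finds the key halfway through the keyspace.
avg_years = (keyspace // 2) / guesses_per_year

print(f"keyspace:      {float(keyspace):.3e} keys")
print(f"average crack: {avg_years:.1e} years")
```

Even the average case lands around 5.8e52 years, matching the “5e52” figure quoted above and dwarfing the roughly 1.4e10-year age of the universe.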
In short… no. Barring some form of backdoor, there’s no way they’re getting that data. And to be honest, I’d be surprised if a backdoor could even break through it.
If you believe this research, cracking into the iPhone should be easy. But I bet some idiot restarted it or let the battery drain…
What’s the big deal with making a signed update for ONE phone to disable the auto erase feature and then giving it back to the DOJ? It’s not like the update would be pushed OTA to all phones, and it can be deleted after the ONE phone is flashed.
Apple is playing games and pinning this into a PR / marketing stunt.
“We’re not giving a backdoor to the FBI”
That’s ok, the NSA already has one 😉
They can’t even find the original purchase receipt for that iPhone..
Any insights on the latest news, re: this iPhone? It sounds like the feds plan to do a byte-image copy of the phone, and run their cracking software against multiple copies of the phone, maybe in VMs. If after so many tries, that copy wipes itself, they’ll just move on to another copy.
Nevermind. Didn’t see the previous comments.