February 22, 2016

Many readers have asked for a primer summarizing the privacy and security issues at stake in the dispute between Apple and the U.S. Justice Department, which last week convinced a judge in California to order Apple to unlock an iPhone used by one of the assailants in the recent San Bernardino massacre. I don’t have much original reporting to contribute on this important debate, but I’m visiting it here because it’s a complex topic that deserves the broadest possible public scrutiny.

Image: Elin Korneliussen (@elincello)

A federal magistrate in California approved an order (PDF) granting the FBI permission to access the data on the iPhone 5c belonging to the late terror suspect Syed Rizwan Farook, one of two individuals responsible for a mass shooting in San Bernardino on Dec. 2, 2015 in which 14 people were killed and many others were injured.

Apple CEO Tim Cook released a letter to customers last week saying the company will appeal the order, citing customer privacy and security concerns.

Most experts seem to agree that Apple is technically capable of complying with the court order. Indeed, as National Public Radio notes in a segment this morning, Apple has agreed to unlock phones in approximately 70 other cases involving requests from the government. However, something unexpected emerged in one of those cases — an iPhone tied to a Brooklyn, NY drug dealer who pleaded guilty to selling methamphetamine last year.

NPR notes that Apple might have complied with that request as well, had something unusual not happened: Federal Magistrate Judge James Orenstein did not sign the order the government wanted, but instead went public and asked Apple if the company had any objections.

“The judge seemed particularly skeptical that the government relied in part on an 18th-century law called the All Writs Act,” reports NPR’s Joel Rose. “Prosecutors say it gives them authority to compel private companies to help carry out search warrants.”

Nevertheless, Apple is resisting this latest order, with Cook citing the precedent that complying might set.

“We have great respect for the professionals at the FBI, and we believe their intentions are good,” Cook wrote. “Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”

Cook continued: “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

In a letter posted to Lawfare.com and the FBI’s home page, FBI Director James Comey acknowledged that new technology creates serious tensions between privacy and safety, but said this tension should be resolved by the U.S. courts — not by the FBI or by Apple.

“We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly,” Comey said. “That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”

According to the government, Apple has the capability to bypass the password on some of its devices, and can even disable an iPhone’s optional auto-erase function that is set to delete all data on the phone after some number of tries (the default is 10).

The iPhone at issue was an iPhone 5C, but it was running Apple’s latest operating system, iOS 9 (PDF), which prompts users to create a six-digit passcode for security. Since iOS 9 allows users to set a 4-digit, 6-digit or alphanumeric PIN, cracking the passcode on the assailant’s iPhone could take anywhere from a few hours to 5.5 years if the FBI used tools to “brute-force” the code and wasn’t hampered by the operating system’s auto-erase feature. That’s because the operating system builds in a tiny time delay between each guess, rendering large-scale brute-force attacks rather time-consuming and potentially costly ventures.
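Those time estimates follow from simple back-of-the-envelope arithmetic. The sketch below assumes the roughly 80 milliseconds of per-guess key-derivation delay that Apple’s iOS security documentation describes; the actual delay, plus escalating lockout timers, varies by device and settings, so treat these as illustrative worst-case figures rather than exact numbers.

```python
# Rough brute-force time estimates for iPhone passcodes, assuming an
# ~80 ms per-guess key-derivation delay (per Apple's iOS Security
# Guide). Figures are worst case: every code in the keyspace is tried.
DELAY_S = 0.08

def worst_case(keyspace: int, delay_s: float = DELAY_S) -> float:
    """Seconds needed to try every passcode in the keyspace."""
    return keyspace * delay_s

four_digit = worst_case(10 ** 4)   # 0000-9999
six_digit = worst_case(10 ** 6)    # 000000-999999
six_alnum = worst_case(36 ** 6)    # 6 chars, lowercase letters + digits

print(f"4-digit PIN:   {four_digit / 60:.1f} minutes")        # 13.3 minutes
print(f"6-digit PIN:   {six_digit / 3600:.1f} hours")         # 22.2 hours
print(f"6-char alnum:  {six_alnum / (365.25 * 24 * 3600):.1f} years")  # 5.5 years
```

The 5.5-year figure for a six-character alphanumeric passcode matches the ballpark cited in news coverage, and it shows why the FBI wants the delay and auto-erase protections removed: with those gone, even the larger keyspaces become tractable.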

In an op-ed that ran in The Washington Post on Feb. 18, security expert and cryptographer Bruce Schneier argues that the authority the U.S. government seeks is probably already available to the FBI if the agency wants to spring for the funding to develop the capability itself. The FBI, he writes, sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

“There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues,” Schneier wrote. “There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world.”

Schneier said what the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm.

“The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized,” Schneier wrote. “The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.”

Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), said the same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn’t yet in law enforcement’s hands.

“The request to Apple is accurately paraphrased as ‘Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism’,” Weaver wrote.

Apple appears ready to fight this all the way to the Supreme Court. If the courts decide in the government’s favor, the FBI won’t soon be alone in requesting this authority, Weaver warns.

“Almost immediately, the National Security Agency is going to secretly request the same authority through the Foreign Intelligence Surveillance Court (FISC),” Weaver wrote. “How many honestly believe the FISC wouldn’t rule in the NSA’s favor after the FBI succeeds in getting the authority?”

This debate will almost certainly be decided in the courts, perhaps even by the U.S. Supreme Court. In the meantime, lawmakers in Washington, D.C. are already positioning themselves to…well, study the issue more.

In letters sent last week to Apple and the Justice Department, the House Energy & Commerce Committee invited leaders of both organizations to come testify on the issue in an upcoming hearing. In addition, Sen. Mark Warner (D-Va.) and Rep. Michael McCaul (R-Texas) say they plan to unveil legislation later this week to create a “Digital Security Commission” to investigate whether Congress has a bigger role to play here.

Twitter addicts can follow this lively debate at the hashtag #FBIvsApple, although to be fair the pro-Apple viewpoints appear to be far more represented so far. Where do you come down on this debate? Sound off in the comments below.

Recommended further reading: Jonathan Zdziarski’s take on why this case is different from previous requests from the FBI to Apple.


194 thoughts on “The Lowdown on the Apple-FBI Showdown”

  1. Mike Korzen

Apple doth protest too much. They want to look like they are really concerned with privacy, so they are making a loud protest that they will ultimately lose, but then can appear blameless – old PR stuff.

    1. MattyJ

Apple has more legal and financial resources than the FBI; they won’t likely lose.

    2. Foggyworld

      I think Apple is right and they along with other smart phone manufacturers have been working with other US government agencies to turn their smart phones into the tool that will access our bank accounts.

      So security becomes more and more of an issue for Apple and the others and the government in this case is going after that rare thing today known as an outstanding US corporation. Why?

And the State Department didn’t do a thorough job of vetting this young woman before letting her into the US – she didn’t even give her correct street address. Then the California county of San Bernardino, which employed this man and provided him with a county iPhone, ordered all employees to change their passwords but neglected to supervise that activity. So the county made the known password on that phone disappear.

      Doubt sincerely if at this point there is anything worthwhile on that iPhone. Any contacts he might have made are by now certainly AWOL.

      Maybe the FBI should ask the NSA to make a search in that huge data collection center in Utah because for years they were scooping up all of those calls. We have spent billions of dollars on these programs and when they fail, to turn to the manufacturer and ask it to compromise their good product to offset the sloppiness of government is just unfair.

      And should Apple be forced to comply, the government’s enthusiasm for the cashless society may just have to wait because few of us will feel the confidence we did in our iPhones until this.

    3. JCitizen

We, as Apple customers, have demanded this kind of security in our products to keep nosy criminals and meddling nation-state bad actors out of our business. So now we are supposed to stand by and just let the government torpedo yet another otherwise successful design? I don’t think so!

It is high time someone stood up against this – if the NSA or FBI want to crack an Apple they can do that on their own. The Allies had to do it during WW2 with superior computing power, and you can’t tell me there isn’t a self-respecting black hat out there that couldn’t be contracted to do this. I think this is just more unnecessary government intimidation and meddling in the affairs of legal commerce and personal 4th amendment protections. Either they figure out how to crack it themselves or they can just GO FISH!! Let those gumshoes learn how to do good police work to find out how to connect the dots with these splinter terrorist groups and lone wolves. Any good city cop worth his badge could do it without all this undue interference!!

      1. Michael Iger

        Absolute privacy against legal court orders was never part of anyone’s personal right. Up until electronic encryption was invented that right didn’t exist and is not part of our laws. As the article states, the government always had the right to get at one’s private papers and files when it was in the best interest of all society. Apple is making a case for marketing reasons to enhance its international profits as a secure device. This is put into question by the statement that the FBI could develop the code to crack the iPhone with proper funding. What is to stop any other foreign intelligence agency from doing the same? Apple is making a dubious claim at best, sounds good, but won’t hold up.

        1. Jonathan Jaffe

          Michael Iger: A court order to a person to divulge the content of their mind, if doing so might tend to incriminate them, is a violation of the Fifth Amendment. See http://nc3.mobi/references/biometrics/#cc for case law and references.

          The question has nothing to do with personal privacy as the phone belonged to the government and the lack of privacy was properly disclosed to the recipient.

          Sadly this whole issue could have been avoided had San Bernardino county government actually installed Apple mobile device management (MDM) for which their taxpayers had paid. (see http://nc3.mobi/cskim/#20160221 for source)

          Jonathan @NC3mobi

        2. James B

          You seem to think that rights don’t exist until they’re guaranteed in writing. I take the exact opposite stance – I have the absolute right to privacy regardless of the lack of laws on the books to support that right.

          We don’t need legislation to preserve our rights every time a new encryption technology comes out. That right to privacy is implicit to being an American citizen.

          Also, I don’t think Apple made any claim about FBI capability. That came instead from an independent resource, if I’m reading this article correctly.

        3. CJD

          So would you have a safe manufacturer provide a way into a safe they made in this same example? It is NOT illegal to produce tools that enhance security and privacy, and we should not be compelling companies to defeat their own products. Should lock manufacturers be required to provide the govt with a master key to ALL locks?

In the key example, the issue at play here isn’t the creation of the key; it is turning it over to a govt that WILL at some point misuse the key without a warrant, and that key might get copied by someone that is disgruntled or being blackmailed.

The big difference here is that there isn’t a second way in – they can’t just take a torch to the safe, or break a window to get around the door lock. If a safe manufacturer found a way to build a safe so that it would take 5 years of constant drilling to break into it, should we compel them to crack the lock?

          The law states that you can be compelled to turn over something you HAVE – Apple doesn’t have a way in, they must create one. 4th A search and seizure law isn’t the question here, the Writs Act is.

    4. Jason

No, the FBI whines too much.

Apple already gave them the phone’s iCloud backup from 6 weeks prior. The FBI could have had the most recent activity, but they changed the iCloud password and lost that data. Now they want Apple to cover their mistakes by writing them code to break into all iPhones any time? Screw that!

    5. Armando

How can an organization forced to do something “protest too much”?

  2. Peter Weinberger

    Tiny detail: I think the phone belongs to his employer (who failed to put MDM, Apple’s enterprise manager, on it) [or maybe i misremember]

      1. Bob Brown

        … and the county is quite welcome to open up that phone. If they can.

    1. Pat

It does, and I read somewhere that someone screwed up and reset the password after it had already been in police custody, rendering a sync to iCloud impossible and setting off a whole chain of unnecessary trouble. They could have just done what they needed themselves, had they not screwed up.

  3. James

Thanks for the breakdown, Brian. I’m sure the comments page will light up with debate on where everyone stands, but I’m curious where you stand. Do you think companies such as Apple should create a “backdoor” for law enforcement? No shades of grey either. Yes or No.

    1. Ken Robinson

To me, there’s no shade of grey here; Apple should fight this the whole way. This opens a Pandora’s box from which there is no return.

      One element that few columns have yet considered is that computer code is protected under the First Amendment for free speech. If Apple is compelled to write code it deems to violate its stance on freedom, privacy, etc, and also to sign that code, First Amendment rights are being violated.

There are other ways around the issue. We are warned about malware on the various software stores. Along this line, what if the FBI wrote a program to pull the data on the phone and send it to them, and they would have a warrant to compel Apple to add this to the account and have it pushed to the phone. And the FBI could even make the app free to anyone who wants to download it. “Get this app and let the FBI snoop on your life!!!” There would probably be takers. This may not be a perfect solution, but it would still require a warrant, and would be similar to a wiretap.

      But, also, keep in mind that it is unlikely the phone was ever used for anything illicit. At best, location data may reveal places he had gone that could be used, but that information is also available from the carrier, and isn’t needed from the phone. And, even if used, chances are an app with its own encryption was used, and nothing Apple does would allow access into that.

  4. Phyllis

I find irony in all of this. The technology distributors gather and track the user without the user knowing, until it is exposed by someone like Brian or another party, and that’s supposed to be OK. Yet when they have the control they so enjoy and don’t want to give up, control the users of their technology would also like to have, the provider falls back on the oh-it’s-such-a-risk song. How many users have had their information disclosed in a breach, only to find out it was being recorded/captured? 99.9% of these data breach victims didn’t kill anyone…..and ultimately, I find it unfathomable that Apple does not already have the knowledge, skill, ability and tools right in front of them to fulfill this request. They surely do testing of the product before deployment, and they’re on what, the iPhone 6 blah blah blah now; likely the same security approach was repeated/replicated. Stupid debate….

    1. MattyJ

      From what I understand they don’t have the capability because they don’t *want* the capability, and didn’t build it into the system. A system with a backdoor is not a secure system so (at least as far as everyone is willing to disclose) it doesn’t exist. The FBI is asking them to build one.

      That’s part of the high cost of Apple products, which I gladly pay. There’s no magic formula to check the identity of someone using a backdoor to get into your device. It would be as easily exploitable by hackers/thieves as the FBI once people have a chance to scrutinize it.

      1. emaz

Although I agree with you, in this case something should be done. I think this is more hyperbole than anything. There is no reason why both sides can’t come to a reasonable agreement. Instead, this is no longer about privacy; now it is about public bantering.

        1. F

Sure, and so because of “terrorists” we should abandon the Constitution, the Bill of Rights, and all basic Human Rights… cause… you know… just this one case, no big deal! First they came for…

          1. securemyspace

I think the important thing to remember is that the *terrorist* is dead; the *terrorism* has already happened. Unlocking this phone will not make the public any safer. It won’t foil a planned terrorist attack.

Apple are afraid because they rely on their perceived security to help sell devices (especially in the corporate world – they’re the new Blackberry). If a backdoor is intentionally created & publicly known, then they’ll lose that security perception.

In an election year, if this goes to the Supreme Court, my money is on Apple.

      2. timeless

        I was going to write a detailed explanation of why I think it’s possible for Apple to do what’s requested, but I found a good one [1].

        The thing is. Apple shouldn’t do this. Not because the FBI is good or bad, but because if an agency of the US Government can request this, then so can an agency of the Russian, Indian, Chinese, or any other foreign country’s government. And that’s even scarier.

The main point I wanted to make (and hence this reply here) is that while Apple may not have actually written this firmware, I don’t think it’s incredibly hard for them to write (especially for the 5C, which appears to predate some hardware upgrades that could have properly tied Apple’s hands) — The FBI did a fairly good job of outlining what they want Apple to do (see [1] for a breakdown of the request).

        [1] http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/

  5. emaz

My feeling is this. Had they just gone behind closed doors and worked something out, this wouldn’t be such a mess. I mean, think about what this is all based off of: a terrorist attack that happened here in the states. Is there a chance this guy was smart and didn’t leave anything worthwhile on there? Sure. However, is there a chance he didn’t think about what he was putting on his phone and could have placed information that could help the FBI in the future? I have to go with the latter. I get the whole idea behind privacy and keeping that aspect safe. At the same time though, if you mean to tell me that a half-trillion-dollar company along with the federal government can’t come up with a plan to help each other out without compromising the security of the phone, and instead they resort to public bantering, we are very much in scary times. One side has a billion devices floating around the world and the other is supposed to help protect our country from foreign and domestic threats.

  6. Larry Seltzer

    I don’t know how Schneier can blithely claim that the FBI can hack the software themselves and install it and all they have to do is steal Apple’s code signing key. Honestly, in the name of preserving privacy or the law or something he says this would be better than Apple complying with the order. He’s too smart a man to believe this.

    1. Henry Robinson

      I agree. The FBI would need to get the iOS source code and Apple’s private signing key first. At that point it would be possible, but that is a SIGNIFICANT overreach of the FBI’s mandate! There’s no way Apple would agree to that unless forced at gunpoint.

      What they’re _actually_ doing is a far smaller overreach than handing over the keys to the kingdom. The implications are just as significant, though– once the signed iOS firmware with weakened security exists, every police department in the world will immediately request the same thing for every criminal investigation, and more importantly foreign powers like China and Russia will get it for espionage purposes. And at that point, the iPhone is an unlocked door.

      The iPhone versions with touchID and the secure enclave aren’t exempt either. The secure enclave’s firmware can be updated. Once the precedent is set, nobody’s device will be secure.

      The only solution is for Apple to change their hardware in future releases. And I hope they do. But– won’t the feds mandate a hardware backdoor, then, using the same archaic 1700s law?

    2. Berend de Boer

If the FBI has the signing key, then you really have a problem. Then every iPhone would be vulnerable. The government has a terrible record with regard to safeguarding its information. Before you know it, it ends up on Hillary Clinton’s email server.

      1. Notme

        Ha Ha HA too funny the private email server that was not hacked like all my stuff that got lost at the OPM. Ha HA made my day with that one!

  7. ODA155

    Personally, I believe that both or either NSA or CIA may already have the capability to break the encryption on that device, but will not publicize that fact because of the Edward Snowden situation. I could be wrong but I just cannot believe that with all of the other capabilities exposed that this would not be something that could be possible.

    1. JCitizen

      I got to admit it would be a bit of genius in tactics for the FBI to make it look like they were lost in this aspect – but none-the-less, I hope they lose in court on this particular issue.

      1. ODA155

Think about it… the FBI is a law enforcement division of the US government, and although it has an intelligence capability, its main purpose is law enforcement. However, the NSA and CIA both are strictly intelligence and “technically” are outward-facing organizations (regardless of recent history)… so if they do possess the means to break that encryption, I don’t think they could share it legally.

    2. Kostas Kritsilas

No, the NSA/CIA/FBI do NOT have that capability, as of iOS 7 or iOS 8. The phone re-encrypts data every time it is rebooted using a new key, the encryption key is long and variable, and the encryption is strong. That is why the FBI needs to get Apple to write a version of iOS with a “backdoor”. If they could have broken it, they would have.

      For your own edification, look up “Secure Enclave” and iPhone, perhaps in a Google Search. The TV show idea of extracting the memory chips and reading them out with an external piece of equipment won’t help either; the data on the memory chips is encrypted, and stored as encrypted data (i.e. it is never available in a non-encrypted state while not in active use).

      The whole thing comes down to non-technical people who don’t have the slightest clue about the technology trying to tell the people that do understand the technology what to do. Encryption is nothing special; the math is out there, the encryption algorithms are public knowledge. If terrorists get wind of iPhones being decrypted, they will stop using iPhones, or they will use other encryption methods to secure their data. Apple, the US, or the free world DO NOT HAVE A MONOPOLY on encryption. You can get encryption software from Russia, China, North Korea, or some criminal/terrorist organizations can write their own. The only thing the FBI is accomplishing with their court order is damaging Apple. Going forward, please have no doubt that the bad guys are already moving to third party encryption methods that are out of the reach of the US legal system.

      1. ODA155

Kostas Kritsilas… who said anything about a TV show? I certainly did not. I simply stated my opinion, as did you, and mine could be just as plausible as yours. May I suggest toning down your own rhetoric before dismissing other people’s opinions. As for yours, maybe I will take a look at it, but either way, it’s still just your opinion based on something that you read someplace and are taking at face value.

        Anyway, thanks and have a good evening.

        1. Kostas Kritsilas

Firstly, I never meant to imply that you mentioned a TV show; I thought it was germane to the topic at hand.

          Secondly, none of this is opinion. It is documented to work this way by Apple’s own documentation for iOS7. The “secure enclave” exists inside currently supported iPhones, and it was put there for a number of reasons, chief amongst them, Apple Pay. If Apple Pay is anything but absolutely secure, people won’t use it. Over and above all of this, and this part is opinion, is that Apple just got fed up of being accused of tracking people, and the huge number of requests from law enforcement, at all levels, from making requests to extract data from iPhones. End of opinion.

So they came up with a constantly changing key that partially consists of the PIN (4 or 6 digit), the device serial number (which is very long to start with), the time of day (timed to the millisecond), and some other piece, which may or may not come from a random number generator, and all of this is hashed. Every bit of data is then encrypted with this key (not music or applications, just user data). When you reboot the phone, a new key is generated based on the above, while the old key is kept. The old key is then used to decrypt the existing data, the data is re-encrypted with the new key, and when all of the data has been re-encrypted with the new key, the old key is erased. This is why iPhones take so long to reboot; they aren’t just loading iOS, they are going through the decryption/encryption of the data on the phone. The same methodology makes trying to read the data off the memory chips impossible. It is also why, when Apple says they cannot decrypt the data on the phone, they mean it. Nobody can decrypt the data off the iPhone (unless you have a few supercomputers and 10,000 years or so with nothing better to do).

As for how much this is an opinion, as I said, the information is in Apple’s documents. Don’t want to go through that? Steve Gibson did a two-part podcast on iPhone security and the Secure Enclave. Go look it up, listen to it, and then come back and we can discuss how much of my post was opinion. Until such time, we don’t have any reason to discuss this further.

As for the statement about using third party applications to encrypt information, Bruce Schneier did a piece on this; read it and understand where my opinions are coming from. The US Gov’t and US Law Enforcement need to understand that the US doesn’t have a monopoly on encryption (see the German Enigma Machine of WWII as one example), nor on mathematicians. Anybody, meaning any nation, any well-funded organization on the side of good or evil, can create encryption algorithms that are close to impossible to crack, and the methods are public domain. Math is public domain. And there are many, many universities around the world graduating math PhDs, Masters, and Bachelor graduates. An opinion: the terrorists/extremists are already using either third party or in-house encryption technologies after this hit the press. The FBI just shot themselves in the foot, if not in the head.

  8. Scott Schober

    Brian -great summary of events. Below are some thoughts I had in addition.

Apple is being pushed too far in this particular instance. If you read the court order, they are asking for more than to just brute-force the PIN and provide the contents (data). Apple has done this in the past, where Apple provides a ‘service’: they break into the phone & provide the data.

The FBI is requesting that Apple develop a forensics tool. This is considered an ‘instrument’ and in legal terms has to be validated by 3rd parties, accepted by peer review (defense experts), and supported with documentation. This instantly opens the door to numerous hands having access to this ‘hacking tool’. All it takes is one individual who is bribed to compromise this new software. Security for the iPhone will be drastically weakened and extremely vulnerable going forward.

Apple has been put in an awful predicament. They have been working with law enforcement for almost two months providing technical guidance on this specific iPhone, but have been asked to load the gun and shoot themselves in the foot by developing a hacking tool that would enable the FBI, and anyone else who gets their hands on it, to compromise the iPhone.

    Thanks
    Scott Schober
    Pres/CEO of BVS
    http://www.scottschober.com

    1. Henry Robinson

      The FBI’s request specifically says Apple can load the new firmware on their premises, and that the FBI will brute-force it there as well, that the weakened software will never leave Apple property. So I’m not sure it would be counted as a forensics tool, subject to all those controls you mention.

  9. James

    Once a warrant is received, the subject has no privacy rights. Police can even force their way into someone’s home if that’s on the warrant. Therefore, this case is about Apple’s image, not privacy rights. I don’t think Apple’s image is more important than the FBI’s investigation.

    1. MattyJ

      I suggest you read Scott Schober’s comment above. You clearly don’t understand the issue.

    2. Gnar Kill

      There is also an “out” in the court order. Read item 7.

    3. David Thompson

The warrant apparently does not say that nobody has the right to privacy whatever their legal circumstances, yet that appears to be what would result from the FBI getting its way.
Blackberry submitted to governmental intrusion and went from being the serious person’s instrument of choice to not much of anything anymore.

      1. timeless

        This isn’t the reason…

        At the end of the day, the consumer purchased phones that weren’t locked/limited by employers. Their phones were shinier and more flexible (often because they were no longer crippled by their employers). They then insisted they be able to bring them into work. (More than insisted, they started doing it anyway in violation of policy.)

        Their employers caved.

        This broke up the purchasing dynamic that enabled companies that sold to businesses to succeed.

        The other side was that the carriers broke out of contracts which included annual per-device income. Since this was a core part of the business model and was used for development, it significantly impacted development.

  10. Gordon Walker

    Just no. There is no way that attaining such authority will end well for anyone. A prime example is the Patriot Act. Although it was brought into being with “good intentions,” there is nothing really stopping the powers that be from abusing it. If the FBI is allowed this authority, it will only end badly. It is one thing to grant access to one device, but enabling blanket backdoor access to all devices is a stretch too far. Hopefully the courts will see reason.

  11. Mike

    There is nothing on the iPhone that should be considered “private.” Those phone calls and texts are not private any more than the email that people use. Everyone ‘thinks’ they have privacy (often because the word “encrypted” is attached) when it is no such thing.

    Infectious software, malware, and hacks have made their way into the system already. Otherwise, The Fappening would never have happened.

    I would not want government in control of this stuff, and I would certainly trust Apple long before the FBI. But I don’t really trust Apple either, which is why I don’t own an iPhone. Situations like this one serve to validate that sentiment.

    Whatever ends up happening though; Apple has made it quite clear to the world what it would take to bring the entire system down. Don’t think for one moment that this situation hasn’t resulted in a dramatic increase in hacking attempts by people all over the world to create a virus to do exactly what Apple is being asked to do.

    1. Henry Robinson

      Nope. The Fappening content was sourced from hacked iCloud accounts, not iPhones. These terrorists stopped uploading to iCloud over a month before their attacks, and as we know, the local PD changed their iCloud password, so the phone can no longer try to upload.

      1. Mike

        That is so fine a line of distinction that it’s barely worth mentioning. I use that incident as an example simply because the system is not all that secure, certainly not like most people seem to think. The truth is, if these things were ‘private,’ that would not have happened. If it were private, other people would never have access. These things are stored on servers that the user has no authority over and thus no real control.

        1. Henry Robinson

          It’s not a fine distinction at all! It’s what this entire furor is actually about!

  12. NewsJunkieEd

    Two can keep a secret only if one of them is dead. If code is developed to defeat Apple’s protection it will be in the public domain within a few months. The law of unintended consequences would prevail.

  13. LM Mastela

    “Those who surrender freedom for security will not have, nor do they deserve, either one.”
    ― Benjamin Franklin

    Apple is in a difficult position, but its long-term perspective is superior to the short-term want of the FBI: protect privacy. It’s becoming a rare commodity.

  14. Barry Phillips

    I support Apple. The federal government has repeatedly demonstrated it cares little or nothing about the rights of ordinary citizens. I do not trust them one whit.

  15. Curt Wilson

    Lots of tweets on this subject recently. It was said that the shooters had personal iPhones that were physically destroyed. It is of course possible that some type of coordination may have been done with the work phone, but it seems unlikely.

    Some useful reading on the matter:

    https://medium.com/@thegrugq/feeble-noise-pollution-627acb5931a2#.uoln7hlyi

    And more about the forensic instrument that would have numerous opportunities to grow legs and walk away to the upstream nation-state:

    http://www.zdziarski.com/blog/?p=5645

  16. Dennis Davis

    It would not be a real surprise if our FBI has already broken this Apple security barrier and downloaded all and whatever is there, but has to create this public smokescreen, much like the Allies in WW2 could not make too-obvious use of the information they were decoding via their Enigma machine or their breaking of the Japanese naval code, since too-obvious use would just result in the Axis adopting a different code, which would mean having to break the codes all over again. [Haven’t I read that Churchill knew about the scheduled attack on Coventry, against which he could do nothing additional to defend for fear of making it obvious the German code had been broken?]

    If our FBI hasn’t done this, then it should contract with the NSA or CIA or equivalent Israeli secrets people, who certainly should be able to do this, and quickly, without time lost to this court case. This is no time for bureaucratic bluster and smoke.

    These terrorists are dead. It’s not like there’s going to be a court case with its demand for a chain of custody.

    Initially I could see carving out an exemption for terrorist activity, only that can’t be done. A few days ago on NPR a NYC “district attorney” was interviewed who said he had ~150 cellphones he wanted to get into, but was blocked by their software security. Suddenly it was obvious that should Apple open this one, a tremendous number of other jurisdictions with other secured cellphones will want theirs opened too. And there will be other courts at all levels to issue court orders. This FBI court case is the proverbial camel’s nose for jurisdictions of every level and quality, Constitutionally committed or not. It will be a total morass and a loss for all of us. [This is written by an old man who uses no cellphone of any type. Retirement has its gifts.]

  17. Mike

    “According to the government, Apple has the capability to bypass the password on some of its devices, and can even disable an iPhone’s optional auto-erase function that is set to delete all data on the phone after some number of tries (the default is 10).”

    I’d be interested in seeing the Federal Government’s evidence that Apple [currently] has this capability to “even disable the optional auto-erase function…” Is that claim being made because the feature is optional? There was a passcode reset performed on that phone shortly after the shooting took place, at the behest of the county entity that issued the phone. How is it that the Feds can’t work that route and get the new passcode from the County? (I am not familiar with how such a reset works specifically with an iPhone, btw.)

    (Note: have not finished reading article yet.)

  18. DavidD

    “According to the government, Apple has the capability to bypass the password on some of its devices, and can even disable an iPhone’s optional auto-erase function that is set to delete all data on the phone after some number of tries (the default is 10).”

    So there’s an assumption being made by the FBI that Apple has this capability, but based on what? I’m not aware of anything technical that Apple has ever published, acknowledged or even intimated that suggests this can be done. To be clear, I’m not talking about the feasibility of a massive brute-force attack on AES encryption. Do we have a genuine proof-of-concept hack that bypasses this auto-delete feature?
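    For what it’s worth, the auto-erase rule being quoted is a retry policy enforced in software, which is exactly why the order asks for new signed firmware: a build without the check (and without the escalating inter-attempt delays the real system also imposes) would permit unlimited guessing. A toy model in Python (my own sketch, not Apple’s code or anything Apple has published):

```python
class PasscodeGate:
    """Toy model of a lock screen's retry policy.

    Illustrative only; not Apple's implementation.
    """

    MAX_ATTEMPTS = 10  # the iOS default when "Erase Data" is enabled

    def __init__(self, passcode: str, erase_enabled: bool = True):
        self._passcode = passcode
        self._erase_enabled = erase_enabled
        self.failed_attempts = 0
        self.wiped = False  # True once the key material is destroyed

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data is gone; no guess can succeed
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self._erase_enabled and self.failed_attempts >= self.MAX_ATTEMPTS:
            self.wiped = True  # auto-erase: encryption key destroyed
        return False


gate = PasscodeGate("4821")
for i in range(10):
    gate.try_unlock(f"{i:04d}")  # ten wrong 4-digit guesses
print(gate.wiped)  # True: the auto-erase threshold was reached
```

    The FBI’s request, in effect, is for firmware in which the `erase_enabled` branch and the delays are simply absent.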

  19. Dave Howe

    One minor point: while Apple has indeed unlocked 70 (if not more) phones before under legal authority, that isn’t what is being asked here. There is a difference between handing over a designed-in unlock code for a device and writing (and signing) malware. Apple (as far as we know) has never done this before, never mind 70 times.

    1. Danny

      Minor point on a minor point, Apple has not unlocked 70 phones. “On devices running iOS 7 and previous, Apple actually has the capability to extract data, including (at various stages in its encryption march) contacts, photos, calls and iMessages without unlocking the phones. That last bit is key, because in the previous cases where Apple has complied with legitimate government requests for information, this is the method it has used.” (ref: http://techcrunch.com/2016/02/18/no-apple-has-not-unlocked-70-iphones-for-law-enforcement/)

  20. Earl Killian

    When I read about FBI v. Apple it always seems there is an issue that is left undiscussed. If Apple complies with this FBI request, i.e. if Apple writes special software to hack into this particular 5c, then in the future Microsoft, Apple, and Google will be hit with demands from the US, UK, FR, RU, CN, IL, SA, IR, SY, EG, TR, YE, SO, MX, PK, etc. to provide software to hack into the devices seized in those states. In effect every government will have hacking capabilities for every device it wants. If Apple refuses any country’s demand while complying with US demands, then that country is likely to respond by blocking sales of Apple products. This then becomes a life-and-death issue for the tech companies. Perhaps they can afford to give up sales in YE, but not CN and many others.

  21. Earl Killian

    There was an intriguing part of the story that your summary left out. The NYT reported the following:

    ‘Apple had asked the F.B.I. to issue its application for the tool under seal. But the government made it public, prompting Mr. Cook to go into bunker mode to draft a response, according to people privy to the discussions, who spoke on condition of anonymity. The result was the letter that Mr. Cook signed on Tuesday, where he argued that it set a “dangerous precedent” for a company to be forced to build tools for the government that weaken security.’

    So Apple looks like it was willing to comply as long as the rest of world did not find out, but the government wanted to make it public, thereby forcing Apple to balk, lest it be forced to deliver hacking software to every world government or stop sales in those countries (effectively causing it to exit the phone market). Why did the FBI force a public confrontation?

  22. Antonio

    It sounds like the back door already exists, in that Apple is completely capable of unlocking the phone. They just don’t want to.

  23. Not Snowden but....

    Between the FBI, NSA, DHS, CIA, and the various cyber forces, with budgets in the multi-billions, they need to figure this out on their own. All of these “experts” need to step up and earn their paychecks. NSA TAO probably already has a solution; I’m in the camp that believes this is the case the FBI and other members of the IC want to use to dumb down encryption.

  24. Larry Guthrie

    Life is possible and enjoyable without an iPhone. No iPhone = no issue.

  25. Jeff L.

    What strikes me about the FBI’s request is that it’s unnecessary. A locked iPhone is not encrypted – it’s just locked with a passcode. If they can’t get through the passcode screen, there are other ways to get at the data, given physical access to the device. Worst case, the data is sitting unencrypted on memory chips on the circuit board. If the FBI doesn’t have people who know how to pull memory chips from a circuit board and read the contents, then they need to hire some (or work with the NSA, who most certainly do have such people).

    So understanding that their request is unnecessary (and they certainly know it), it leaves me wondering what their true motive is. It could just be laziness, of course – trying the most expedient way first and seeing if Apple would comply; or it could be something more nefarious. I’m inclined to distrust the government more often than not, so I’m betting they have a deeper motive here.

    1. timeless

      Your statement may describe some older version of iOS, but….

      Recent versions actually do leave the data encrypted at rest.
      iOS 8 seems to be the main turning point for this.

  26. Marc Erickson

    The FBI v. Apple isn’t at all the way you think it is
    http://www.cringely.com/2016/02/19/the-fbi-v-apple-isnt-at-all-the-way-you-think-it-is/

    TL;DR: “So if you are a President who is a lawyer and former teacher of constitutional law and you’ve come over time to see that this idea of secret backdoors into encrypted devices is not really a good idea, but one that’s going to come up again and again pushed by nearly everyone from the other political party (and even a few from your own) wouldn’t right now be the best of all possible times to kinda-sorta fight this fight all the way to the Supreme Court and lose?

    If it doesn’t go all the way to the Supremes, there’s no chance to set a strong legal precedent and this issue will come back again and again and again.”

  27. robert stark

    Why don’t they lift the memory chip off of the phone and then decrypt the data directly? Or am I missing something?

    1. timeless

      In at least some models, there’s a tamper resistant module which has pieces of the key. The modules cooperate to produce the (de/en)-cryption key.

      So, while you could remove the storage module and look at it, all you should be able to get is the encrypted data.

      On the right hardware versions, w/o the CPU + special element, you can’t get the key, and thus what you have should be equivalent to noise.
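        A rough sketch of why that is (illustrative Python using PBKDF2 as a stand-in; the names and parameters here are my assumptions, not Apple’s actual key-derivation scheme): the filesystem key depends on both the passcode and a device-unique hardware key, so passcode guesses made off-device produce useless keys.

```python
import hashlib
import os

# Hypothetical stand-in for the device-unique key fused into the secure
# hardware at manufacture; it never leaves the device.
DEVICE_UID = os.urandom(32)


def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # Real iOS "tangles" the passcode with the hardware UID using an
    # iteration count tuned to take tens of milliseconds per guess
    # on-device; PBKDF2 is only an illustrative substitute.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)


# Same passcode, same hardware: the key is reproducible on the device.
key_on_device = derive_key("1234", DEVICE_UID)

# An attacker who copied the flash has the ciphertext but not the UID;
# even the correct passcode yields a useless key off-device.
key_off_device = derive_key("1234", os.urandom(32))

print(key_on_device == derive_key("1234", DEVICE_UID))  # True
print(key_on_device == key_off_device)                  # False
```

        This is why, on the right hardware, pulling the storage chip gets you nothing but noise: every guess has to be tried through the device itself.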

    2. jon trantham

      While it’s true that decrypting the NAND flash’s contents requires the phone’s controller, it would still be a prudent step to remove (by carefully sawing) the NAND and dumping out its contents for preservation. That would at least allow for the phone to be restored to its present state, regardless of what is done to it.

      1. timeless

        The problem is that there are two components:
        A. the encrypted data
        B. the decryption key

        Getting a copy of A isn’t particularly difficult.

        On modern iPhones, trying to extract the decryption key (either by guessing PINs against it, or by using physical machinery to try to extract the bits) is likely to result in the destruction of the key material.

    3. Thomas D Dial

      Yes. The missing piece of information is that there is no known effective way to decrypt the information on the memory chips without the encryption key that the pass code guards. If the FBI is able, with or without Apple’s help, to recover the pass code, they will be able to use the key and the phone together to search the phone. Otherwise there is no reasonable way to do it.

Comments are closed.