February 22, 2016

Many readers have asked for a primer summarizing the privacy and security issues at stake in the dispute between Apple and the U.S. Justice Department, which last week convinced a judge in California to order Apple to unlock an iPhone used by one of the assailants in the recent San Bernardino massacre. I don’t have much original reporting to contribute on this important debate, but I’m revisiting it here because it’s a complex topic that deserves the broadest possible public scrutiny.

Image: Elin Korneliussen (@elincello)

A federal magistrate in California approved an order (PDF) granting the FBI access to the data on the iPhone 5c used by the late terror suspect Syed Rizwan Farook, one of two individuals responsible for a mass shooting in San Bernardino on Dec. 2, 2015, in which 14 people were killed and many others were injured.

Apple CEO Tim Cook released a letter to customers last week saying the company will appeal the order, citing customer privacy and security concerns.

Most experts seem to agree that Apple is technically capable of complying with the court order. Indeed, as National Public Radio notes in a segment this morning, Apple has agreed to unlock phones in approximately 70 other cases involving requests from the government. However, something unexpected emerged in one of those cases — an iPhone tied to a Brooklyn, NY drug dealer who pleaded guilty to selling methamphetamine last year.

NPR notes that Apple might have complied with that request as well, had something unusual not happened: Federal Magistrate Judge James Orenstein did not sign the order the government wanted, but instead went public and asked Apple if the company had any objections.

“The judge seemed particularly skeptical that the government relied in part on an 18th-century law called the All Writs Act,” reports NPR’s Joel Rose. “Prosecutors say it gives them authority to compel private companies to help carry out search warrants.”

Nevertheless, Apple is resisting this latest order, with Cook citing the precedent that complying might set.

“We have great respect for the professionals at the FBI, and we believe their intentions are good,” Cook wrote. “Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”

Cook continued: “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

In a letter posted to Lawfare.com and the FBI’s home page, FBI Director James Comey acknowledged that new technology creates serious tensions between privacy and safety, but said this tension should be resolved by the U.S. courts — not by the FBI or by Apple.

“We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly,” Comey said. “That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”

According to the government, Apple has the capability to bypass the passcode on some of its devices, and can even disable an iPhone’s optional auto-erase function, which deletes all data on the phone after a set number of failed passcode attempts (10, when the feature is enabled).

The iPhone at issue was an iPhone 5c, but it was running Apple’s latest operating system, iOS 9 (PDF), which prompts users to create a six-digit passcode for security. Since iOS 9 allows users to set a 4-digit, 6-digit or alphanumeric passcode, cracking the code on the assailant’s iPhone could take anywhere from a matter of minutes to roughly 5.5 years if the FBI used tools to “brute-force” the code and wasn’t hampered by the operating system’s auto-erase feature. That’s because the operating system builds in a small time delay between each guess, rendering large-scale brute-force attacks time-consuming and potentially costly ventures.
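
To get a rough sense of the arithmetic, the sketch below (in Python) assumes a fixed cost of roughly 80 milliseconds per guess, the key-derivation delay Apple describes in its iOS security documentation, and ignores both the escalating lockout delays and the auto-erase option the FBI wants switched off. The figures are illustrative estimates, not a description of the FBI’s actual tooling:

    # Back-of-the-envelope passcode search times, assuming ~80 ms per guess
    # (Apple's published key-derivation cost) and no lockouts or auto-erase.
    SECONDS_PER_GUESS = 0.08

    def worst_case(keyspace: int) -> str:
        """Worst-case time to try every passcode in a keyspace of the given size."""
        seconds = keyspace * SECONDS_PER_GUESS
        if seconds < 3600:
            return f"{seconds / 60:.0f} minutes"
        if seconds < 86400 * 365:
            return f"{seconds / 3600:.1f} hours"
        return f"{seconds / (86400 * 365):.1f} years"

    print("4-digit PIN:             ", worst_case(10 ** 4))   # ~13 minutes
    print("6-digit PIN:             ", worst_case(10 ** 6))   # ~22 hours
    print("6-char lowercase+digits: ", worst_case(36 ** 6))   # ~5.5 years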

In an op-ed that ran in The Washington Post on Feb. 18, noted security expert and cryptographer Bruce Schneier argues that the capability the U.S. government seeks is probably within the FBI’s own reach if the agency is willing to spring for the funding to develop it in-house, and that the FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

“There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues,” Schneier wrote. “There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world.”

Schneier said what the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm.

“The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized,” Schneier wrote. “The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.”

Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), said the same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn’t yet in law enforcement’s hands.

“The request to Apple is accurately paraphrased as ‘Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism’,” Weaver wrote.
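
Weaver’s description hinges on Apple’s code signing: an iPhone will only accept software whose signature verifies against Apple’s key, which is why the order has to run through Apple at all. The snippet below is a generic sketch of that kind of signature gate, written here with an Ed25519 key pair from the third-party Python “cryptography” package; it illustrates the general mechanism, not Apple’s actual update format or signing infrastructure:

    # Generic sketch of a signature-gated update check (illustrative only;
    # Apple's real boot and update chain is different and more involved).
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    vendor_key = Ed25519PrivateKey.generate()  # stands in for the vendor's signing key
    vendor_pub = vendor_key.public_key()       # the device trusts only this public key

    update_image = b"...firmware bytes..."
    signature = vendor_key.sign(update_image)  # only the key holder can produce this

    def device_will_install(image: bytes, sig: bytes) -> bool:
        """Install an image only if the vendor's signature verifies over it."""
        try:
            vendor_pub.verify(sig, image)
            return True
        except InvalidSignature:
            return False

    print(device_will_install(update_image, signature))        # True
    print(device_will_install(b"tampered image", signature))   # False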

Apple appears ready to fight this all the way to the Supreme Court. If the courts decide in the government’s favor, the FBI won’t soon be alone in requesting this authority, Weaver warns.

“Almost immediately, the National Security Agency is going to secretly request the same authority through the Foreign Intelligence Surveillance Court (FISC),” Weaver wrote. “How many honestly believe the FISC wouldn’t rule in the NSA’s favor after the FBI succeeds in getting the authority?”

This debate will almost certainly be decided in the courts, perhaps even by the U.S. Supreme Court. In the meantime, lawmakers in Washington, D.C. are already positioning themselves to…well, study the issue more.

In letters sent last week to Apple and the Justice Department, the House Energy & Commerce Committee invited leaders of both organizations to come testify on the issue in an upcoming hearing. In addition, Sen. Mark Warner (D-Va.) and Rep. Michael McCaul (R-Texas) say they plan to unveil legislation later this week to create a “Digital Security Commission” to investigate whether Congress has a bigger role to play here.

Twitter addicts can follow this lively debate at the hashtag #FBIvsApple, although to be fair the pro-Apple viewpoints appear to be far better represented so far. Where do you come down on this debate? Sound off in the comments below.

Recommended further reading: Jonathan Zdziarski’s take on why this case is different from previous requests from the FBI to Apple.


194 thoughts on “The Lowdown on the Apple-FBI Showdown”

    1. BrianKrebs Post author

      For what it’s worth, the FBI came out with a statement (I think it’s quoted in the LA Times) saying that this password change/reset/whatever never happened.

  1. Eaglewerks

    There is some developing stench surrounding the FBI/US D.O.J. Government lawsuit concerning encryption. It seems to some that this is simply a “test case” being developed by the FBI or DOJ to create legal precedent to gain access to encrypted personal information.

    1) In the Apple vs. FBI situation, here is a blog post that has some interesting insight:
    http://daringfireball.net/2016/02/san_bernardino_password_reset

    2) The following Reuters story claims a well-known private attorney was contacted a week ago by the Justice Department and local prosecutors about representing the victims of the attack:
    http://www.reuters.com/article/us-apple-encryption-victims-exclusive-idUSKCN0VV00B

  2. Dan Riley

    Apple did not unlock the phones in the previous cases (NPR has a correction up on their story). In the previous cases, Apple read out the device key and decrypted the memory directly, as the device key was used as the encryption key. In the latest versions of iOS, the encryption key involves a hash of the device key and the PIN, so they need to brute force the PIN. This change coincided with secure enclave, Apple Pay, TouchID, etc.
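
    A minimal illustration of the kind of key entanglement described above, using a PBKDF2 stand-in rather than Apple’s actual construction (the real derivation runs inside the hardware AES engine, and the UID never leaves the chip, which is why guessing has to happen on the device itself):

        # Toy sketch: the disk-encryption key depends on both the passcode and a
        # device-unique secret, so copying the flash alone is not enough.
        import hashlib
        import os

        DEVICE_UID = os.urandom(32)  # stand-in for the per-device secret fused into the silicon

        def derive_key(passcode: str) -> bytes:
            # Tangle the user's passcode with the device-unique secret.
            return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

        key = derive_key("123456")
        # Without DEVICE_UID, an attacker who copies the flash cannot recreate
        # the key off-device, no matter how fast their own hardware can guess.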

    AIUI, the normal use of the All Writs act is to produce something the subject already has or can easily obtain. There’s no precedent for using it to force a company to make something they don’t already have.

  3. Hardy

    C’mon you lot. The deal’s already done. The data has already been extracted.
    Everything else you read is part of a plan to make the public believe that things are being done with just the right amount of public scrutiny. Which means Apple will win in the public’s eye and everyone else will be happy.
    So if you’ll excuse me now, I have to jump in my plane to escape the feds knocking on my front door.

  4. Jeff L

    I posted this before but the spam filter must have eaten it. Brian – sorry if this is a duplicate.

    What strikes me about the FBI’s request is that it’s unnecessary. A locked iPhone is not encrypted – it’s just locked with a passcode. If they can’t get through the passcode screen, there are other ways to get at the data, given physical access to the device. Worst case, the data is sitting unencrypted on memory chips on the circuit board. If the FBI doesn’t have people who know how to pull memory chips from a circuit board and read the contents, then they need to hire some (or work with the NSA, who most certainly do have such people).

    So understanding that their request is unnecessary (and they certainly know it), it leaves me wondering what their true motive is. It could just be laziness, of course – trying the most expedient way first and seeing if Apple would comply; or it could be something more nefarious. I’m inclined to distrust the government more often than not, so I’m betting they have a deeper motive here.

    1. Will B

      Everything on iOS 9 is encrypted by default. They have a crypto unit that sits between flash memory and system memory to efficiently encrypt everything. On top of that they use what they call ‘Data Protection Technology’ to protect flash memory.

      And no, they can’t just pull out the drive and decrypt it via brute force. It’s tied to the hardware (‘Secure Enclave’).

      1. Will B

        Correction, just read that the phone in question is a 5c, which doesn’t have a Secure Enclave. But the UID used as the encryption key is still in the hardware and inaccessible. You’d need that to brute-force the key outside of the iPhone.

        1. Mike

          Inaccessible to most attackers. A state-sponsored attacker can still obtain that key sitting in hardware if the attacker has physical control of the phone; if the attacker is lucky and Apple did not adequately obfuscate the electromagnetic emissions, a side-channel attack is possible, so you wouldn’t even know your phone was pwned while you slept.

          Where it gets interesting is when the phone is powered off and that (clear) key isn’t (or shouldn’t be) in hardware. Now the attacker may be reduced to brute-forcing the passphrase.

    2. Dave W

      Jeff L,
      The deeper motive is explained here.
      http://www.zdziarski.com/blog/?p=5645

      The government doesn’t care about the data on that specific phone. What they want is a legally valid ‘instrument’ for obtaining evidence, instead of relying on ‘lab services’ from Apple.

  5. Jose Navarro

    You want privacy? Don’t put your confidential information on the Cloud, the Web, your iPhone, Facebook, etc. We are the masters of our own privacy. We weep for our privacy and curse the Government. We have the luxury of not knowing what they know.
    Sacrificing some of our privacy, as tragic as it may seem to some, probably saves lives, and the existence of the FBI, NSA and any other three-letter agency, as incomprehensible as it may be to some, saves lives. As someone mentioned before, do you really think that the criminal element has not developed a hack for this? How naive can we be? Let’s take care of our own privacy and stop depending on someone else to protect it!

  6. WTKJD

    How can the FTC require companies to provide “privacy by design,” on the one hand, while the FBI seeks to force companies to comply with “broken by design”? The legacy of J. Edgar Hoover, who broke the law with impunity, lives on 40 years after his death.

  7. Stux

    Who has actually read the order? DOJ is not asking anyone to break any encryption. They are asking Apple to disable the security features that delay incorrect PIN entry attempts and wipe the phone after 10 incorrect unlock attempts. They want to use their own PIN brute forcing to unlock the phone. All of the reporting on an order to break their encryption is patently false.

    https://assets.documentcloud.org/documents/2714005/SB-Shooter-Order-Compelling-Apple-Asst-iPhone.pdf

    Tremendous amount of misinformation in the media.

    1. Mike

      Someone at Apple was actually paying attention when Sony gave us the rootkit. They know what will happen. They already know the potential dangers and the potential backlash.

      It’s just a matter of time before we are all reading about a virus or hack attack against the iPhone that no one can stop.

  8. wsmac7

    Personally, I hope this goes to the Supreme Court and the FBI loses… with Apple prevailing totally.
    .
    As has been mentioned already, this is more than just an iPhone/Apple issue.
    Those who want to claim, “…no iPhone, no problem…” are either being intentionally ignorant or really just don’t understand the potential ramifications if the FBI wins.
    .
    This goes beyond a current desire to access a certain cellphone, and sets all citizens/visitors of the United States up for intrusions in the future of any lockable/secure device they may carry/possess within areas of U.S. possession.
    .
    No matter how much ‘hacking’ can be done today to render so many of our devices unsecured, it’s an issue of openly and legally granting any government agency, from the Feds down, full access that they currently cannot, openly and legally, have.
    .
    WAY more than just an Apple issue here!
    I’m in Apple’s corner… in case I hadn’t clarified that bit… 😉

    1. Xuts

      I agree with you about the principle, but I don’t share your confidence about the Supreme Court. They have been making unconstitutional rulings, so it wouldn’t surprise me if they sided with the FBI.

  9. Burton Strauss

    Something I don’t get – how can a company (or an individual) be ordered to produce something (a version of the OS) that doesn’t exist? If the US Government wants in my front door, they either kick it in themselves, OR they HIRE a locksmith and PAY him/her to make a key (think covert FISA entry).

    There are TWO things that they are demanding Apple do:

    1. Make a change or several changes to the iOS code, unit test it, etc.
    2. Sign that iOS build with Apple’s private key so the phone accepts it.

    and maybe

    3. Build special infrastructure to push that update to the phone (at least with the Android devices I’m used to, they don’t auto-update – they ask you to press a button, and that only appears if the phone is already unlocked)…

    1. Eric

      It might be something that does not yet exist, but it is something that Apple could easily produce if they were willing to.

  10. J

    Even though the NSA gobbles up so much data, it didn’t flag the shooter’s scheming texts, email and calls before he struck. But why can’t the NSA just go back over everything the shooter ever emailed, texted or called?

    Based on what I’ve read, the only thing the NSA’s vacuum-everything program facilitated was enabling agents to spy on their own spouses and/or significant others. I’d wager the NSA also spied/spies on non-violent U.S. organizations, a la cross-dressing Edgar Hoover’s spying on civil rights groups, et al. (Hoover’s possibly the worst American in history).

    The iPhone-FBI brouhaha is like the banking collapse in one respect: both the FBI and the banks screwed up before and during crises, then tried to burden the public with the costly cleanup — in the FBI’s/NSA’s case, by trying to burden and weaken a publicly traded company.

    Gratuitous aside:
    No doubt Microsoft would have complied since Windows 10 is like one big backdoor for the federal government.

  11. James

    It seems kind of odd that Apple cannot have a key for each phone. After all, if one can have unique keys for SSH, etc., it should be possible for phones to have a unique key.

    For a lost/stolen phone to be disabled remotely kind of implies each phone is unique. Maybe I am generalizing here as I am no phone expert.

    1. Dan

      Apple deliberately does not want to have a way to access your data. That way they cannot be obliged to hand it over should any government of any nation ask them. It was a way for them to say “we cannot access the data.” The issue is that, even though in theory Apple cannot access the data, it may be possible to auto-update the phone to a new OS allowing anything to be done.

      I am sure Apple considers many governments not to be on the people’s side, and I am sure you could name a few. If this is done, it sets a precedent that any government can compel Apple to follow.

      1. jim

        Sorry, what about the Apple OS for phones given to a non-friendly government so that it may be used to spy on religious groups? I believe that was exposed in November. So I’m looking for a cheap Hankook phone that acts like an Apple now. Just to see the price point.

  12. kit

    Brian, I’ve been following this story closely, and when I heard that NPR story this morning, I knew it was crap, and as others have noted, it was later corrected. It conflated different situations and generally dumbed down what “exactly the same” meant — all based exclusively on biased (law enforcement) sources. “70 times before! And Apple would have done it again, if only the judge had kept quiet!” OMG, can’t you hear the spin?

    Then taking the Comey letter at face-value? Really?

    The big difference between what has happened in the past and now is that the FBI is trying to force Apple to create a tool for them to use to hack, where in the past Apple has been asked to perform a service for them — data extraction. That’s why the, “It’s just one phone,” argument doesn’t hold water.

    Glad you then summarized the Schneier and Weaver articles, but in the end, this post ends up being a he-said-she-said summary, with a “Let’s hear what you think!” at the end. I expect this from journalists with less expertise — heck, they don’t know who is right — but you actually are in a position to catch tech bs when you hear it, yet you report each position as if it’s equally valid.

    Lots of people count on you to explain these complex matters clearly yet precisely, and to do analysis. I was disappointed in this story. You have done better, and you will again. Please!

    Kit

    1. jim

      Actually, this looks to be a who-pays-for-it argument. Each phone done costs X dollars at government expense, so would it be cheaper to get a “tool” to do the job?

  13. John Willis

    I suppose if Apple loses they could throw the switch the other way: basically, none of our phones would have any encryption whatsoever.

    Then everyone would change their behavior to suit the situation, placing the burden of security on the end user.

    It would fiscally destroy the market for secure transactions or identity through a portable computer.

    Or the end user could use whatever third-party or open-source encryption technology they wanted, such that there would not be [one] encryption standard to harvest or harness, but a zoo of competing and constantly mutating encryption toolkits.

    Apple might make it even more interesting by offering a panoply of encryption choices with no one mechanism to defeat, sort of like double hashing with the user’s choice of multiple overlapping tools. Salting the result.

    1. Zach

      That is exactly what I was thinking might be a best case solution to this problem. Individual encryption on top of the OS, chosen and set by the user.

      1. James

        GPG – with its peer-to-peer web of trust rather than a central certificate authority – already works on macOS for email and file system encryption.

        The US government should be spending its time doing the things it can do rather than blaming others for why it can’t. Even if Apple does create something it currently does not have – something its design was intended to prevent – this would only spur new design innovation, or greater adoption of distributed models built on blockchains that allow for no central authority.

  14. Leo

    Ordering Apple to provide “backdoor” software, which doesn’t exist in the first place, is a bit harsh.

    However, if Apple is capable of producing one, then there’s no reason Apple shouldn’t comply if it helps a law-enforcement agency continue its investigation of this terrorism case.

    If Apple is indeed capable and doesn’t comply with court orders, Apple will seriously need to think about its future and brand name!

    1. Leo

      BTW, I attended the Check Point Security Event in Sydney last year and totally loved your presentation and how you eventually tracked down The Fly!

    2. CJ

      “Code” has been deemed speech in past federal court cases. Speech cannot be compelled or ordered, as that would violate First Amendment rights.

      That is the defense Apple is going to use, along with the argument that it would be a burden on their resources to perform this task.

  15. Sam

    I honestly cannot understand what all the bickering is about when it’s quite plain to me that this is the WRONG MOVE by the U.S. authorities in the face of these two FACTS:

    1. According to Bruce Schneier about 2/3 of all cryptographic products are outside the control of the U.S. authorities. https://www.schneier.com/crypto-gram/archives/2016/0215.html#11 If the U.S. authorities would like to show the rest of the world that they can compromise any and all U.S.-based cryptographic products then I say they have just screwed over ALL American technology companies that sell encryption products.

    2. Encryption is public domain mathematics that cannot be controlled by anyone – not even the hubris of the ‘murrican ‘thoritAYZ – so what THIS manoeuvre tells me is that the typical regular law-abiding citizens shall be SPIED upon because they use compromised off-the-shelf encryption products while the criminals – full on knowing this – will simply use and/or develop encryption products that CANNOT be controlled by these same fascists.

    The only way any of this makes any sense is if the end result is the criminalization of the use of any and all encryption that is not certified (e.g., compromised) by the authorities.

    Is that where we’re going, do you suppose?

    This whole thing is absolutely ridonculous. And not even for any of the reasons most people are bickering on about: encryption is MATHS, people, and the genie AIN’T going back in the bottle.

    Am I wrong about this, or what?

    1. RightOrWrong

      You can be right, and still be wrong. How? Being right is often a matter of fact, or opinion based on natural law. If in being right I have a bad attitude, hurt others, or sin, then I am wrong.

      I am a Christian. This is a problem with many Christians, myself included. I know what is right because I have read it and experienced it. If I act on those things without Love I am wrong.

      google:
      1 Corinthians 13:1–13

      1. Sam

        Ok but I’d be more interested in what you thought about the rest of the post. You know, the stuff above the last line?

  16. Peter Pearson

    I suspect that Apple offered to pull the data out of this phone, quietly, but the FBI saw this as an opportunity to demand that the judiciary or (if necessary) Congress publicly endow the FBI with the power to make Apple do it whenever they want. The FBI might also plan on using this unnecessarily adversarial incident to promote legislation requiring built-in backdoors.

    1. NotMe

      I agree with you. It’s just an excuse to get what they want. They don’t want to go the political route and wait for Congress to pass a law. The perpetrator is dead; they may not get any useful information at all. What they really want is the ability to get into encrypted devices. This whole circus act is being staged to put pressure on Apple. It’s not to get evidence. They already have all the call records anyway. So let’s call it what it is: a media zoo and pressure to get a change done.

  17. anonymous

    I am confused. Is encryption legal or not? I mean, when I buy the latest DVD movie, it comes with CSS encryption. Until the code slipped out.

    Now, movies come on Blu-ray with that encryption. And, it is illegal for me to decrypt.

    Games are also encrypted. And if they don’t work… I don’t get my money back. Or a sorry. Maybe a patch, but not a lot else.

    So, if those other guys are using it to protect their goods, then encryption must be legal.

    But when it comes to me using encryption? To protect my selfies? I have private data and don’t want someone just willy-nilly flipping through my stuff. The phone constantly checks in with the cloud. During that check-in, the NSA could scoop up all of the data. Isn’t that enough? No, someone wants the ability to see everything with their back door.

    Before we go and say it’s OK because it will never happen to me….(without thinking about the previous cases where attractive women were pulled over just to find the selfies)…think about what happened to the PS3 or DVD once they were compromised.

    Then they both got new code… but it was too late. The genie couldn’t be put back in. A whole new platform had to be made.

  18. Ed Tomchin

    I’ve seen an article by a hacker saying that there are about a dozen people around, people like Krebs, who could crack that phone and recover the data with no problem. Is that true at all?

    1. CJD

      Yes and no. It would be trivial for an experienced person to create the code needed if they had the iOS source code. Without that, /could/ someone create this? Probably. But the issue is that you can’t load a new version of iOS onto an iDevice unless it is signed with Apple’s certificate, which means you couldn’t do anything. The FBI has the resources to build what is needed, and could probably compel Apple to turn over the source code for iOS, but they still wouldn’t be able to load a new version without that certificate to sign the code as official Apple code.

  19. null

    The US constitution clearly says the government can search via the warrant process.

    There is a political process to change the constitution.

    1. Bob

      The government has the phone. Let them search.

      I have a problem with the government trying to force Apple to create a tool to make it easier for them to search.

        1. null2

          Like the guy you replied to said, it’s not about a search. The constitution is fine as-is.

  20. Josh

    I think if Apple won’t abide by law enforcement requests then law enforcement needs to get the government and politicians involved. If they still won’t comply, then Apple should be shut down and closed, delisted and refused the ability to manufacture phones globally, starting first with the USA. How can companies start calling the shots when it involves serious crime and law enforcement?

    1. Sam

      Great then no one will buy American encryption products. Which is fine since 2/3 of them are not American and therefore not subject to U.S. laws.

  21. Jeff L

    For those who still think Apple should comply, I have a question for you:

    What if terrorists communicated by mailing letters instead of using smart phones, and they ran the papers through a shredder after reading them? What would the government do then? Require manufacturers of paper shredders to build shredders that cut documents in such a way as to allow the pieces to be put back together by law enforcement? Would you buy a paper shredder that was crippled in such a manner? I wouldn’t. I would burn my papers in a fire instead, or shred them by hand, or buy a shredder from a foreign manufacturer who isn’t subject to the US mandate. US paper shredder manufacturers would quickly move overseas to avoid the mandate, or go bankrupt. A black market for imported paper shredders would develop, and paper shredders without a back door would be bought and sold in the shadowy underground. And the Bad Guys would still be communicating and shredding their papers, and after decimating the US paper shredder industry for naught, the government would still be no closer to catching terrorists by intercepting their communications.

    Now apply the same logic to phones and operating systems, and you’ll see why Apple needs to resist.

  22. ipWitan

    Apple has already said that they CAN do this, but said they don’t want to because it will cause others to try and do the same thing illegally. But isn’t Apple’s admission that it can do it enough of an incentive for hackers to try? This admission seems to completely undercut Apple’s argument. Is Apple afraid that if they make the modification they won’t be able to control it and prevent it from being released into the wild? If that is the case, that is on Apple, not the FBI.

    A valid compromise is to permit the FBI to access the phone completely but with a bit of a time delay, say 48 hours. In such a situation when the FBI has a target phone in their possession they can contact Apple to prevent it from being remotely wiped, yet at the same time get the information in a reasonably fast period of time.

    This time delay would prevent a thief from stealing the phone and accessing the data before the owner realized the phone was missing and contacted Apple to report, lock, and potentially wipe the phone.

    1. Eric

      The problem for Apple is exactly this – that it *can* be done in principle, and if Apple were to do it, it probably would not be all that hard.

      If it were the case that it just wasn’t technically feasible, then Apple would be able to say that while they would like to help, there is nothing they can do. And they have not said this.

      The question comes down to what the technical obstacles are to law enforcement creating their own hacked version. And here I have a bit of ignorance as to how exactly Apple does things – do they require digital signatures for updates that are being pushed, or not? If they do, then the challenge for the authorities is getting past this.

  23. DavidFMM

    The conspiracy theorist in me says that the NSA et al already have the contents decrypted and need the blessing of the courts in order to use what they know publicly for further prosecutions.

    Regardless, there are three important players that we need protection against concerning the ability to easily decrypt iPhone contents:
    1) Bad guys in the world. It is inevitable that the software/techniques will get out to the world. Malware writers will be in heaven if (make that when) they can get a copy of the hacking software.
    2) Good guys in the world. The US Government has repeatedly shown that every ability and access they get is ALWAYS abused, both with tacit blessings from their superiors and by rogue government employees.
    3) Other governments of the world. China comes to mind first, followed by Russia, Germany, England, etc., etc., etc., with no end in sight. This could be the precedent that will finally kill Apple as a major player and spur a world wide rebellion in open source circles. Choose your sides now, people.

  24. CR

    And today the WSJ is reporting that the DOJ wants another dozen phones backdoored. But “It was only one phone”.

    Wait until Iran, Russia, Canada etc all want “just one phone” for “dissidents” or “terrorists”.

  25. Dan R

    Brian,

    I appreciate the detailed/unbiased article, however, just out of curiosity where do you stand on this? I’m also interested in your views on the larger question of encryption/privacy. I know you’re a busy man, but I would be delighted to hear from you.

    Respectfully,
    Dan R.

  26. What is the law?

    I think it was a company phone, and the owner is dead. Who decides if it can be unlocked or not?

    Morals and ethics are not the same as what’s legal, but with ties to a terrorist group and the number of people who died, this is a tough one.

    1. Jeff L

      It’s not about the right to access the phone – the phone is owned by San Bernardino County, and the feds have the permission of the owner to search the phone – no warrant or court order needed for that.

      It’s about whether the government has the right to compel a private company to cripple its own product and put itself at a competitive disadvantage, for the sake of furthering an investigation.

      If the feds bang on my door, I can refuse to open it. With the proper authority they can break the door down, but they can’t force me to open it for them.

      1. What is the law?

        > If the feds bang on my door, I can refuse to open it. With the proper authority they can break the door down, but they can’t force me to open it for them.

        I agree but what if the feds have a warrant? Can they open your lockbox, gun safe or iPhone?

        1. Jeff L

          Absolutely. But they can’t force me (or anybody else) to open it for them. They’ll have to find a way to open it themselves. That’s the point.

  27. David

    Apple needs to add a new feature to iOS 10 that would erase the phone if a software or firmware update is initiated while the phone is locked.

    Of course, this would be a configurable option defaulted to On.

  28. LYNN CORRENTY

    PLEASE STICK TO YOUR GUNS APPLE!!!

    RUINING THE cryptographically SECURE APPLE PRODUCTS PUTS US ALL IN PERIL……THE ELECTRIC GRID, HOSPITAL EQUIPMENT WE GET HOOKED UP TO, etc……all can be hacked and would do us irreparable harm!

    Let APPLE figure out how to supply the information our government says it is after to curtail and/or eliminate the deadly behavior of criminals, be they national criminals or international criminals. APPLE HAS AGREED TO QUICKLY SUPPLY THE CRIMINAL DATA FROM THE OFFENDING PHONES. TO ASK MORE IS EXTREMELY SUSPECT!

    KEEP US SAFE APPLE CEO COOK AND Nicholas Weaver, the senior researcher in networking and security for the International Computer Science Institute (ICSI). WE TRUST YOU. YOU HAVE ALREADY WON OUR TRUST THROUGH YOUR WONDERFUL PRODUCTS. YOU HAVE SHOWN AMERICAN ALTRUISM AT ITS BEST.

    WE THANK YOU APPLE.

    1. Scott

      Why all the yelling? And no, Apple has not agreed to quickly supply the criminal data from the offending phone. It’s a bit of a stretch to jump from iPhone encryption to the grid and hospital equipment. Seeing that both have already been hacked, this is nonsense.

      I’m not a “if your doing nothing wrong then what do you have to hide?” kinda guy. I just find it hard to believe that the creator of some of the most technologically advanced equipment and operating systems can’t securely access this data and hand it over to the feds without the need for the feds and the rest of the world to see the magic behind it. This is purely a business move: they did it for 70 other devices, they are pandering to their base to sell more devices, and it’s as un-American as having them built in China.

      1. Mark Mayer

        Apple did it for 70 other devices, up until they released iOS 8, which improved the security safeguards.

        Anyway, while we can consider this case very narrowly, I think it’s important to look at the context. We can perhaps see some patterns by looking at the big picture.

        We know that the FBI has been pushing for mandated-by-law backdoors. They’ve been lobbying Congress, so far with little success. We know that the FBI has been criticizing encryption in general and Apple in particular, making inflammatory claims about “going dark” and how strong encryption will lead to a child’s death.

        It’s not hard to see this case as part of a wider campaign against encryption and security. I don’t think it’s alarmist to see that the legal argument the DOJ is using gives the government very broad powers to dictate how companies operate, nor is it unreasonable to see this case as an end run around the legislative process. In his recent press release, Comey says “the people should decide”, but his agency’s actions don’t match his words.

        Taken very narrowly, the FBI/DOJ case might seem reasonable. However, once you look at the big picture and start asking questions (such as, how else might the FBI use this expanded authority?), it’s hard not to be alarmed. Of course the FBI isn’t trying to weaken hospitals’ IT security. But what if that is the result of them getting what they want?

        Maybe I’ve become overly cynical, but it seems clear to me that the FBI is willing to trade our collective cyber security (which is a national strategic “good”) for a short term tactical advantage (a much more limited good).

        Please note that I haven’t yet mentioned privacy. Privacy is a subsidiary issue. The main issue is our right to secure ourselves from cyber criminals and other bad actors.

        1. SeymourB

          It’s also worth noting that law enforcement has been fighting encryption for years. They tried to force the Clipper chip down everyone’s throats in the ’90s, and you would think that, after that effort burned all their bridges, they would have learned something, but apparently not.

          1. Anonymous Sec Researcher

            “It’s also worth noting that law enforcement has been fighting encryption for years. They tried to force the Clipper chip down everyone’s throats in the ’90s, and you would think that, after that effort burned all their bridges, they would have learned something, but apparently not.”

            We would not have the modern internet we have if they had succeeded.

            They also attempted, and failed at, a similar effort to keep GPS from everyday consumers.

        2. Robert Flowers

          Mark,
          Well stated. This has more to do with securing data than it has to do with privacy rights. This has the potential to set back information security efforts not only to practice good security, but to comply with some current and possibly future security requirements, be they industry (e.g., PCI) or legal requirements.

        3. OhioMC

          Interesting to see how this plays out over different generations and in different industries. 20 some years ago the telecom industry began a changeover to a digital switching system (7?) that had the potential to let the FBI sit in their offices and listen to any phone conversation in the country without having to get a physical tap. Talk about power! The telecoms fought tooth and nail. No way, no how would they mess with millions of lines of code to make this happen. Then the government decided to pay for the software code and I believe they also indemnified the telecoms against any loss due to problems with it. Objections withered and died. President Clinton signed this into law (CALEA). The Israelis purportedly hacked it at one point and gave themselves the same access.

          One day the law enforcement and national security apparatus is likely to get what it wants. Keep the focus on a robust oversight process and separation of duties so political appointees are prohibited from any access to any program like this.

          In the meantime, can Apple assure us that the compromises they made to their offering in China cannot be used as a gateway to users in the U.S. or other parts of the world?

      2. Anonymous Sec Researcher

        “I’m not a “if your doing nothing wrong then what do you have to hide?” kinda guy. I just find it hard to believe that the creator of some of the most technologically advanced equipment and operating systems can’t securely access this data and hand it over to the feds without the need for the feds and the rest of the world to see the magic behind it. ”

        Nothing to do with that. It is all about setting legal precedent and correcting previously very flawed legal precedent.

        The FBI could do this. Apple can do it. Unless the terrorists used a longer-than-four-digit code. That part is theoretical. That either organization could do this is not.

        It will open the floodgates, if the wrong precedent is set.

        Apple has a vested interest in presenting themselves as a secure corporation producing secure products to a global marketplace.

        Especially after the PRISM documents release.

        And especially in a climate where US authorities are clamoring for mandated backdoors in everything. In the context of the revelations of the US indiscriminately hacking all nations, and all citizens of those nations.

        Morality is not about absolutes. It is about lesser of two evils.

        Or the better of two imperfect goods.

    2. jim

      Sorry, you’re confusing your handheld unit over the air with other “safeguard” systems. I agree with you on one point: that there should be a warrant issued to safeguard the public. There should be no way to have “something” planted on the system. It could be anything, and incriminating for some cause. Which could be: they just don’t have proof, but need a throwaway cause.
      The part you are confusing is the encryption that should be on all medical systems, per HIPAA.
      You see, all communication is scanned for ads nowadays. There are cookies that spy on you from all sources. And all three OSes have them; they are platform-independent. And they contain pushware. Damn. So much for your privacy. You have none. Even if you encrypt, you are contaminated. They touch all aspects and equipment on the phone. So unless you need the entertainment value of a smartphone, it’s better to have a dumbphone for communications.

Comments are closed.