February 22, 2016

Many readers have asked for a primer summarizing the privacy and security issues at stake in the dispute between Apple and the U.S. Justice Department, which last week convinced a judge in California to order Apple to unlock an iPhone used by one of the assailants in the recent San Bernardino massacre. I don’t have much original reporting to contribute to this important debate, but I’m visiting it here because it’s a complex topic that deserves the broadest possible public scrutiny.

Image: Elin Korneliussen (@elincello)

A federal magistrate in California approved an order (PDF) granting the FBI permission to access the data on the iPhone 5c belonging to the late terror suspect Syed Rizwan Farook, one of two individuals responsible for a mass shooting in San Bernardino on Dec. 2, 2015 in which 14 people were killed and many others were injured.

Apple CEO Tim Cook released a letter to customers last week saying the company will appeal the order, citing customer privacy and security concerns.

Most experts seem to agree that Apple is technically capable of complying with the court order. Indeed, as National Public Radio notes in a segment this morning, Apple has agreed to unlock phones in approximately 70 other cases involving requests from the government. However, something unexpected emerged in one of those cases — an iPhone tied to a Brooklyn, NY drug dealer who pleaded guilty to selling methamphetamine last year.

NPR notes that Apple might have complied with that request as well, had something unusual not happened: Federal Magistrate Judge James Orenstein did not sign the order the government wanted, but instead went public and asked Apple if the company had any objections.

“The judge seemed particularly skeptical that the government relied in part on an 18th-century law called the All Writs Act,” reports NPR’s Joel Rose. “Prosecutors say it gives them authority to compel private companies to help carry out search warrants.”

Nevertheless, Apple is resisting this latest order, with Cook citing the precedent that complying might set.

“We have great respect for the professionals at the FBI, and we believe their intentions are good,” Cook wrote. “Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”

Cook continued: “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

In a letter posted to Lawfare.com and the FBI’s home page, FBI Director James Comey acknowledged that new technology creates serious tensions between privacy and safety, but said this tension should be resolved by the U.S. courts — not by the FBI or by Apple.

“We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly,” Comey said. “That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”

According to the government, Apple has the capability to bypass the passcode on some of its devices, and can even disable an iPhone’s optional auto-erase function, which deletes all data on the phone after a set number of failed attempts (the default is 10).

The iPhone at issue was an iPhone 5c, but it was running Apple’s latest operating system, iOS 9 (PDF), which prompts users to create a six-digit passcode for security. Since iOS 9 allows users to set a 4-digit, 6-digit or alphanumeric PIN, cracking the passcode on the assailant’s iPhone could take anywhere from a few hours to 5.5 years if the FBI used tools to “brute-force” the code and wasn’t hampered by the operating system’s auto-erase feature. That’s because the operating system builds in a tiny time delay between each guess, rendering large-scale brute-force attacks rather time-consuming and potentially costly ventures.
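The arithmetic behind those estimates is straightforward: worst-case cracking time is just keyspace size times per-guess delay. The sketch below assumes the roughly 80 milliseconds per guess that iOS’s on-device key derivation is often said to enforce (a figure from Apple’s published security documentation, not from this case’s court filings):

```python
# Rough worst-case brute-force times for iOS passcodes, assuming
# ~80 ms per guess enforced by on-device key derivation (assumed figure).

GUESS_TIME_S = 0.08  # ~80 ms per attempt

def worst_case(keyspace: int) -> float:
    """Seconds to try every combination in the keyspace."""
    return keyspace * GUESS_TIME_S

scenarios = {
    "4-digit PIN": 10**4,
    "6-digit PIN": 10**6,
    "6-char lowercase alphanumeric": 36**6,
}

for name, size in scenarios.items():
    secs = worst_case(size)
    print(f"{name}: {secs / 3600:.1f} hours "
          f"({secs / (3600 * 24 * 365):.2f} years)")
```

With those assumptions, a 4-digit PIN falls in minutes, a 6-digit PIN in under a day, and a 6-character lowercase alphanumeric code takes about 5.5 years, which matches the range cited above.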

In an op-ed that ran in The Washington Post on Feb. 18, noted security expert and cryptographer Bruce Schneier notes that the authority the U.S. government seeks is probably available to the FBI if the agency wants to spring for the funding to develop the capability itself, and that the FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

“There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues,” Schneier wrote. “There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world.”

Schneier said what the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm.

“The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized,” Schneier wrote. “The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.”

Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), said the same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn’t yet in law enforcement’s hand.

“The request to Apple is accurately paraphrased as ‘Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism’,” Weaver wrote.

Apple appears ready to fight this all the way to the Supreme Court. If the courts decide in the government’s favor, the FBI won’t soon be alone in requesting this authority, Weaver warns.

“Almost immediately, the National Security Agency is going to secretly request the same authority through the Foreign Intelligence Surveillance Court (FISC),” Weaver wrote. “How many honestly believe the FISC wouldn’t rule in the NSA’s favor after the FBI succeeds in getting the authority?”

This debate will almost certainly be decided in the courts, perhaps even by the U.S. Supreme Court. In the meantime, lawmakers in Washington, D.C. are already positioning themselves to…well, study the issue more.

In letters sent last week to Apple and the Justice Department, the House Energy & Commerce Committee invited leaders of both organizations to come testify on the issue in an upcoming hearing. In addition, Sen. Mark Warner (D-Va.) and Rep. Michael McCaul (R-Texas) say they plan to unveil legislation later this week to create a “Digital Security Commission” to investigate whether Congress has a bigger role to play here.

Twitter addicts can follow this lively debate at the hashtag #FBIvsApple, although to be fair the pro-Apple viewpoints appear to be far more represented so far. Where do you come down on this debate? Sound off in the comments below.

Recommended further reading: Jonathan Zdziarski’s take on why this case is different from previous requests from the FBI to Apple.


194 thoughts on “The Lowdown on the Apple-FBI Showdown”

  1. Thomas D Dial

    The specific case is, indeed, about access to a single device. There is no doubt, however, that if the government wins the case there will be hundreds of additional requests, substantially identical in nature very quickly afterward. The Manhattan district attorney has 175 iPhones for which he has, or would seek, search warrants and, if necessary, similar supporting court orders.

    The case is not, however, about privacy: when a search warrant is issued that entitlement (in the US) vanishes for anything the search warrant covers.

    The case is not about protecting innocent iPhone users. Nothing the order requires would directly affect any iPhone but the one specified in the order, for which the FBI obtained both a search warrant and the owner’s permission. The order requires that the software it directs be usable on only the iPhone it describes. Future use on other devices would require that it be changed based on another court order.

    Because of Apple’s software and firmware installation security, the software the order requires would not increase the risk to other users even if it became public. If Apple’s software and firmware integrity checking is not compromised, the order’s software could not be installed on any other iPhone or changed and used for any other purpose without Apple’s participation. If Apple’s software and firmware integrity checking is, or later becomes, compromised, there are thousands of individuals and organizations in the US and elsewhere that could create the same or more intrusive software. But that risk is the same whether or not Apple honors this request.
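The device-binding argument above can be illustrated with a toy signing scheme: if the signature covers a unique device identifier, an image signed for one phone fails verification on every other phone. This is purely conceptual; the HMAC here stands in for Apple’s actual public-key code-signing, and none of these names are real Apple APIs:

```python
# Toy illustration of a "personalized" firmware signature: the signature
# binds the image to one device ID, so the same signed blob is rejected
# elsewhere. HMAC is a stand-in for real public-key signing.
import hashlib
import hmac

SIGNING_KEY = b"hypothetical-signing-key"  # stand-in for the vendor's private key

def sign_update(image: bytes, device_id: bytes) -> bytes:
    """Sign an image for one specific device."""
    return hmac.new(SIGNING_KEY, image + device_id, hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes, device_id: bytes) -> bool:
    """A device verifies that the signature covers *its own* ID."""
    return hmac.compare_digest(sig, sign_update(image, device_id))

image = b"custom-unlock-firmware"
sig = sign_update(image, b"device-A")
print(device_accepts(image, sig, b"device-A"))  # True: the targeted phone
print(device_accepts(image, sig, b"device-B"))  # False: rejected elsewhere
```

Whether this containment holds in practice depends, as the comment notes, on the signing infrastructure itself never being compromised.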

    1. Jeff

      In this case Apple must create the software. They are being forced to create a malware version of iOS. This is not limited to one phone, as is being argued. Once that version of iOS is created, every other agency can say it has been done before and imposes no hardship on Apple. Has anyone been forced to CREATE for the government before? If anyone knows of a case please post it.

    1. kopecky

      Surveillance State = Police State. See Ben Franklin on security and freedom, and support the EFF to fight this outcome. The password change done by Berdoo County eliminated the best chance for discovery. My read of the press is that this was encouraged by the FBI. Read between the lines for the FBI’s intent in doing this. Does anyone still use self-inflicted breaches of privacy/security such as The Cloud/Facebook/Twitter et al.?

    2. Allan S

      Except it’s not a ‘hacking tool’ that the FBI is asking for, nor is it likely that forensic validity is important being that the guy is kinda dead so he’s probably not going to end up in court.

      1. Jason

        What about the fact that this will set a legal precedent? Then there will be other cases with a defense, and the court will be required to give them the tool for independent verification.

  2. Leslie

    The government got caught with their pants down, they should invest in understanding technology instead of strong-arming Apple into hacking devices. I bet other countries are closer to hacking the software than the US. Invest dollars into understanding technology changes and force the US to compete with other countries.

  3. Mark Mayer

    The FBI and the DOJ are being dishonest when claiming that only Apple will have access to the tool Apple creates. The minute that evidence derived from this tool is used in a court case, the defendant will have the right to inspect and test the tool. In practice this means the legal defense team will hire experts to closely examine and test the source code. We know already that law enforcement agencies are poised to make hundreds of requests. Hundreds of experts will have access to the code. Can the DOJ/FBI or the courts absolutely and credibly guarantee all those experts will honor confidentiality agreements?

    1. Allan S

      F.U.D.

      1) I would imagine that any competent defense attorney would challenge the validity of the data, given that the OS will have been modified by a request of law enforcement.

      2) Even if the source code was open, I somehow doubt that it will allow for a take over of Apple’s over-the-air update methodology.

    1. Sam

      Geral please either remove or fix that horrible orange and grey graphic it’s really distracting me from wanting to continue on reading your story there, thanks.

  4. DG

    One would think that, with all the media coverage of agencies wanting access to devices, people would use more than just a secure passcode on a device. Looking at it from the bad guys’ side: when PRISM was exposed, mobile networks in the conflict zones went silent. What is to stop them from doing the same and no longer keeping information on devices without extra encryption?

    However, that being said, the FBI can gather quite a lot of metadata intel from a device outside of plain-text plans that are encrypted.

    In the words of Helen Lovejoy “Won’t someone please think of the children!”

  6. ethsin

    Newbie question, but why doesn’t the FBI just clone the entire iPhone storage as a virtual copy, load it into AWS Mobile Hub or virtual machines, etc., simultaneously crack the password in thousands of cloned iPhones at once, and reload them with the copy each time the data is wiped?
    It would eliminate the auto-erase function, and probably cost less than taking Apple to court.
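The likely reason this doesn’t work: on iPhones the data-encryption key is derived by tangling the passcode with a device-unique hardware key (the UID) that never leaves the chip, so guesses have to run on the original hardware. A cloned image is just ciphertext under an effectively 256-bit key. A hypothetical sketch of that tangling (not Apple’s actual key-derivation function):

```python
# Why a cloned copy of the flash can't be brute-forced off-device:
# the data key mixes the passcode with a per-device hardware UID key
# that cannot be read out. Hypothetical sketch, not Apple's real KDF.
import hashlib
import os

DEVICE_UID = os.urandom(32)  # fused into the hardware; never extractable

def derive_data_key(passcode: str) -> bytes:
    # Real iOS iterates the derivation through the hardware AES engine
    # keyed with the UID; PBKDF2 here just illustrates the tangling.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# On the device, a 4-digit PIN means at most 10,000 derivations.
# Off the device, without DEVICE_UID, an attacker faces the full
# 256-bit keyspace, so copies of the encrypted data don't help.
key = derive_data_key("1234")
assert len(key) == 32
```

This is why the FBI’s request centers on disabling the auto-erase and delay features on the phone itself rather than on extracting the data for offline cracking.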

  7. unclebunkiefromhell

    And don’t forget the law California is trying to pass to disallow encrypted cellphones. I will give up my cellphone if I have to and go back to a land line.

  8. Jarrad

    The biggest issue I foresee with this is lack of oversight.
    You can be sure that Apple engineers will not be present to install the OS nor be able to confirm that the OS for this device is destroyed properly once it is deployed.

    Or in other words the FBI is pushing for a key that will open doors now and for years to come. You would also have to be really naive if you don’t think the NSA, and consequently the rest of Five Eyes wouldn’t end up with it. And of course from there, every other nation state with good intelligence agencies (let’s start with Russia and China) will gain access to it.

    But the FBI only wants it this once.

    It is exactly this lack of foresight being exhibited by the agency that makes this entire exercise so dangerous if it proceeds.

  9. peter

    Apple says that it NEVER unlocked 70 iPhones. It DID extract data from locked iPhones but it did NOT unlock them. The difference is subtle but significant now.

  10. Allan S

    “In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks ”

    Except that’s not really the case. They are asking for a unique version of the iOS firmware to be OTA-loaded to the single phone, in order to allow for some brute-force number punching.

    While the technique could be used multiple times, each application would be unique. And presumably requiring a court order or some equivalent to a search warrant to put into effect.

    We’re okay with police being allowed to go into someone’s home when searching for evidence of a crime /as long as it comes with a search warrant/. If a safe (thank you DTSR) has potentially criminal evidence within it, the authorities are allowed to hire a safecracker to get into it.
    In this case the Feds are asking for a safe cracker to crack this one item, not to have a baked in all-access number.

    While the Feds have long asked for a back door, this isn’t it.

    1. Jeff L

      But they aren’t “hiring a safe cracker” – they are forcing the designer of the safe to make it crackable.

      There is a difference.

      They can hire Apple to crack the safe for them if they want to. Hell, if they pay enough they can hire ME to crack the safe for them if they want to. But forcing Apple to do it for them, against their will, isn’t something that should happen in America.

    2. piet

      The FBI certainly has a right to hire a safe cracker. But should they be able to force a safe cracker to open a safe for free? And Apple isn’t a safe cracker, but a manufacturer. They are asking Apple to do a cracker’s job (for free), precisely because Apple has a unique way into the safe; by hijacking the update mechanism. That is a backdoor, unavailable to third parties. So yes, the Feds are asking for a backdoor, albeit for one device. For now. Once the tool exists, all it takes is a process to rubber stamp subpoenas (e.g. FISC) and it is effectively a back door by proxy. This is clearly not a good thing. Privacy concerns aside, Apple would be kept quite busy cracking their own customers. Surveillance is also a cost for businesses.

    3. Anonymous Sec Researcher

      “Except that’s not really the case. They are asking for a unique version of the iOS firmware to be OTA-loaded to the single phone, in order to allow for some brute-force number punching.”

      OTA is a pretty technical term for what is being discussed here.

      Glad you know it.

      OTA attacks are something US intelligence (or foreign intelligence) can use against any Apple or Samsung phone, or any other, in their country, at any time. And handsets, in general, remain eminently hackable for a very wide number of reasons.

      All you need are two vulnerabilities: one, userland in any popular, even third party application. Not to even mention first party manufacturer or telco supplied (and forced) application.

      Which, btw, I have certainly contributed to.

      The second vulnerability is merely escalation of privileges to root.

      That is it.

      Most can be especially easily obtained by MITM. Which a person can do with COTS open source software and 200 dollars of equipment.

      If not 50. Can openbts run on rtl-sdr?

      The issue is simply precedent. And domestic law enforcement. We can watch intelligence well enough, but can simply not watch domestic law enforcement as well.

      The legal precedent opens the door to domestic law enforcement, which is also domestic intelligence, going crazy. How do you regulate them? By their own agency?

      FBI counterintelligence is just that. Set around FOREIGN intelligence. They have crap for self-enforcement. And state and local law enforcement have almost zero safeguards.

      If there is an FBI mole working for foreign intelligence, the FBI might, finally, be able to police that. Kind of.

      But, for FBI abuses? Or state and local abuses? Nothing.

      Policing intelligence mandated to operate only on foreign shores is hard enough.

      Opening that door for domestic is insane and one must hate democracy to do so. Or be entirely ignorant of history.

      Not sure which it is. Probably, both.

  11. Nick

    As aggressive as the government is in punching holes in our right to privacy in the name of fighting terrorism, it is equally passive in preventing the flow of guns to terrorists. Al Qaeda is using our lax gun laws to recruit jihadis to kill us.

    See https://www.youtube.com/watch?v=mfj7OxwDIU4.

    Yet the government has done NOTHING to fight this. The Second Amendment is a suicide pact, and we must all risk death to protect the right to bear arms.

    If the Second Amendment is a suicide pact, then so are the First and Fourth. I will accept no more restrictions on my rights to free speech and privacy than the government imposes on gun rights.

  12. Felix Uribe

    This is an interesting dilemma. We can use the most sophisticated technologies in the world (good and bad people) to spy on everybody’s phone in the planet, but we have to ask permission to unlock one dam phone! Isn’t that special? “Collect all”… “Unlock one”!

  13. JD

    The judge should sign a warrant.

    Also, last week, the FBI said this was NOT terrorism. No connections to any terror group were found.

    So – by calling it terrorism – the FBI has tried to bully all of us, not just Apple into doing things that are different from normal.

    Would the FBI and judge make the same request if this was a postal worker who broke down? Basically, that is what this case is. Plus, the husband/wife team are both dead. No more justice to be had. So not only are the FBI and judge wasting time and money, they are being disingenuous about the true reasons.

    Not terror. Just “postal.”

    IMHO.

  14. Anonymous Sec Researcher

    Thanks for the solid synopsis.

    From my perspective, as someone who has worked on cell phone security (having contracted with telco and one manufacturer), and who has a family background from the DoJ — also as someone who has been engaged in “privacy” software creation and implementation:

    This is about bad laws and setting precedents. It is also about correcting precedents which have already been created.

    It is marketing, a political-legal move.

    Bluntly, the “statement”, ‘but this is just for one telephone, come on, and it is a major terrorist case’ is a lie.

    It is one of those polite lies politicians and unelected officials who are public facing make.

    What does the average citizen know about precedent? I think it is very common knowledge that precedent is what makes and breaks cases.

    It certainly does open the flood gates, despite the denials. Which are lies. Or extreme ignorance.

    As stated, not my area of expertise and employment. Legal assessments. So, there, I do have to go by what legal opinions I have read on this case, but have the ability to juxtapose that with plenty of firsthand experience of how DoJ lawyers work.

    From a professional note, I can point out that the FBI, unlike intelligence organizations (pure intelligence organizations), has had dismal security researcher budgets and expertise.

    And unlike other DoD organizations.

    That is a budgetary failure on the part of the FBI. And it is a severe leadership problem.

    So, while, for instance, DoD and intel might have tens and hundreds of vulnerability finding “farms”, the FBI only had one.

    Like OPM, they are continuing to fail in their computer security responsibilities. This should not be tolerated, nor be treated in political fashion.

    The FBI has had very strong growth in their cyber division. However, the culture is very behind intel and DoD organizations, despite their significant forensic advancements.

    No small part of that is an archaic culture problem.

    Is my assessment.

    Historically, the FBI has had very significant problems separating their “nerd” departments from their special agent departments.

    Where “nerd” is divisions like counter-intelligence and computer security.

    Their strong suit is in legal work, being a mere division of many divisions of the DoJ. But, legal work, while important, often engages in political fights, as opposed to getting down to the technical components of computer or physical security.

    Can Apple crack this system? I believe there is significant evidence that they can. Should this capacity be there? No. This is an archaic system under question, and has significant security flaws.

    I have not specifically worked at attempting to break their latest encryption systems, but going from doing research on the issue, I noted that one former Apple engineer who worked at that level pointed out that the firmware controls can be trivially overwritten.

    Put another way: there is some evidence that, if the terrorists did use more than a default four-digit unlock code, neither Apple nor the FBI could crack it. As a security researcher, I tend to view this as unlikely, even if this is the best opinion.

    On Weaver’s point, about intelligence: Intelligence will already be able to remotely “own” any phone they want to. Apple does engage in a signature based firmware update system OTA, however, that firmware can be trivially overwritten. And intelligence will have both userland and root level capacity to do this.

    That is not my concern. My concern is not on “privacy”, but on the security of domestic communications. Bad security means we find ourselves inevitably in the “Hoover situation”, where politicians are trivially surveilled and so controlled.

    This point may sound far fetched to consumers who want to believe otherwise. But, the fact is, like with the problem of stingrays, once that door is open, it is impossible (or nearly so) to properly regulate. And the potential for abuse is enormous.

    Frankly, there is no way to maintain a democracy, with unregulated, domestic surveillance possible. Which is why I put “privacy” in quotes.

    The problem, simply, is the impossibility to regulate that with a domestic intelligence slash law enforcement agency. Which is exactly what the FBI is.

    We, at least, have regulation possibilities with DoD and intelligence.

    I have some confidence in.

    These are, regardless, difficult to explain terms to people outside of these industries. I am open for further arguing these points, however, if anyone wishes to question me.

    (FYI, my initial usage of the term “privacy” was specifically in quotes, because the systems I worked on were engaged specifically for dissidents and potential and real operating agents/moles in totalitarian nations.)

    The term “privacy” is so often used. The reality is? Your average Joe/Jane, is not interesting to anyone. Your domestic VIP very much is. Especially when they control your budgets and overall power and authority.

  15. Darth Bama

    The Obama administration is being hypocritical in the pursuit of Apple unlocking the iPhone. The federal government has laws preventing its own agencies from looking at social media as part of the Visa vetting process. Tashfeen Malik posted on Facebook her support for jihad and ISIS. As such, she was allowed to enter the US. It seems this is about a lot more than security. It’s multi-faceted and at the same time simple. The government doesn’t want to be denied what it wants. And it will politicize a terrorist’s iPhone to force the issue on privacy and encryption.

  16. markg

    FBI has the account name and phone number.
    FBI has access to all phone records incoming and outgoing.
    FBI has access to all text messages.
    FBI has access to all emails.

    Explain why the FBI cannot do simple investigation 101 and extrapolate everything that could or might be on this encrypted cellphone. The various records will have given the FBI all the information about how this cellphone was used. This whole thing is a smoke screen for the purpose of defeating any encryption that may be used to protect us from people stealing or otherwise compromising cellphones.

    Bow to the government or else seems to be the mantra. Hopefully Apple sticks to their guns and refuses to create a backdoor.
    The odds of these nut jobs having anything of value on the cellphone are slim to none. In the meantime maybe law enforcement should learn how to do basic investigations.

  17. JasonZ

    I still think everyone is ignoring crucial parts of the court order… Everyone is focused on the first half of part 3 of the order, where the FBI says that Apple’s assistance “MAY” include writing this software…

    But the court order goes on to say at the end of part 3 that it is allowing Apple to do this at an Apple facility (and not hand the software over to the FBI at all), as long as they give remote access to the FBI to brute-force the passcodes with their forensic software…

    And, part 4 goes on to give Apple a MAJOR out… It says if Apple can achieve the main parts of the order (bypassing the auto-erase and not introducing the delay) by any other means that doesn’t involve writing the code, that would comply with the court order.

    So why is everyone greatly exaggerating the court order by stating that the FBI is forcing Apple to write this code, when that’s simply not true?

  18. yettybiyi

    One question I will keep asking is: why can’t the FBI use other methods or techniques to seek and acquire the information they need? It does not make sense to have Apple create a backdoor tool to hack a device they produce. Did the FBI forget that even the people or Apple security personnel who create this tool could eventually, someday, use this same technique to harm Apple? Did the FBI forget about the effects of disgruntled employees and insider attackers when it comes to security issues?

  19. XxHaimBondxX

    I’m no Apple Genius; however, wouldn’t it be easier for everyone to have Apple unlock the phone, download the data and hand it to the FBI? One criminal, one phone, and it leaves the FBI without this master key.

  20. Bob Puffr

    I don’t want to continually give up my freedom every time an official states it is to protect our security. We’ve had WAY too much abuse of that mandate.

  21. Aye Bee

    Just a few counter thoughts…

    What if the FBI already has all the data and access they want, and this is all a smokescreen to get everyone to believe Apple really truly protects privacy? And, gosh, what a great advertising campaign!

    Meanwhile, the bad guys already have the means to really encrypt their information beyond anyone’s ability to crack. That genie cannot be put back in the bottle.

    And, what if Apple is compelled to write a new system for the FBI, and no one showed up to work on it? Certainly Tim Cook doesn’t have the experience to do it.

    Just random thoughts.

  22. Greg Scott

    We all need for Apple to win this fight. Of course Apple could engineer a firmware update to help the FBI break into that phone, and maybe the FBI might even find a few tidbits of interest. But at what ultimate cost in the real war against terrorism and crime?

    I put together a bunch of thoughts about all this at:
    http://www.infrasupport.com/apple-fbi-encryption/

    – Greg Scott

  23. Jerome

    It is highly unlikely that the phone contains any useful operational information, as the two terrorists had destroyed other cell phones and a hard drive. I find it highly suspect that he forgot about his work cell phone in the operational clean-up, and find it more likely that he used his work phone for work only. All destroyed devices were part of the operation, which explains their destruction. The work cell phone was not destroyed because it was not part of the operation.

    The FBI also has all the metadata for that phone available to them, which would be easily gathered from the carrier with a warrant (likely they have this information already). In addition they could clone this phone’s data, and attempt brute force attacks on an iPhone farm til the cows come home.

    I don’t think the FBI wants the backdoor for this phone only, but is using this event as a pretext for an attack on encryption in general.

  24. Rob

    Frankly, I’m pretty horrified by most of the comments on here. I work in a digital forensics lab for law enforcement. Each year we receive thousands of requests for digital analysis, many of which include smartphones. Every one of these requests includes a warrant signed by a judge making these searches legal. Ever since iOS 8 came out, forensic analysis of the iPhone itself has become impossible. You can get information from iCloud backups, and some phone records, but that’s it. And FYI… most users (apparently most of them being investigated for crimes, anyway) do not usually back up their iPhones, so backups and iCloud data are not usually helpful. There are no longer any ways (such as chip-off or JTAG) to obtain the unencrypted data resident on the phone.

    This shuts down what is usually one of the only sources of evidence in an investigation, thus ending the investigation (because, contrary to what most of the commenters here think, we do NOT live in a police state….if you want an example, look at how Apple does business in China…the LA Times has had some nice write-ups regarding that issue lately).

    Here’s what I’m appalled about: these police investigations involve real people, and none of you seem to care. How would you feel if you or your child were a victim of rape, sexual assault, child porn, etc.? Most of the requests we get involve these crimes, and the phone ends up being the best possible source of any evidence. With an encrypted phone, and no way to get at the data, the investigation ends with a police officer going to the victim and saying “I’m sorry. I understand something horrible happened to you or your child, but we can’t unlock the suspect’s phone, and there’s nothing further we can do to help you.”

    Hopefully none of you ever have to experience that, so you can continue to wear your tin-foil hats and worry about your ‘privacy’. (Spoiler alert: Most digital product/service companies don’t care about your privacy. See: online advertising).

    1. koppecky

      Here is a link to an amici curiae brief in Apple vs. FBI, for all you Badges who want to violate the 4th Amendment and any other right that gets in your way: CIS Technologists Apple Brief Final.pdf

      1. Rob

        Koppecky, the search is not a violation of the 4th Amendment if there is a valid search warrant. That means a judge reviewed the facts of the case, and agreed there was sufficient evidence to proceed with the investigation, and to legally search the iPhone.

  25. jmoney

    Once they outlaw encryption, only outlaws will use encryption.

Comments are closed.