Judge orders Apple to access iPhone belonging to San Bernardino shooter [u]

Posted in General Discussion, edited February 2016
A U.S. magistrate judge on Tuesday ordered Apple to comply with FBI requests to help extract data from an iPhone owned by one of the shooters involved in December's terrorist attack in San Bernardino, Calif.

Judge Sheri Pym informed Apple that it must provide specialized software that will allow law enforcement officials to thwart iPhone's built-in security measures, specifically a feature that automatically erases handset data after a certain number of unsuccessful login attempts, the Associated Press reports (via ABC News).

It is unclear whether the iPhone in question is running iOS 8 or iOS 9, both of which feature so-called "strong encryption" that even Apple can't break. The report is also vague on the level to which Apple must participate. From the AP's wording, it appears Apple could be forced to hand over a software package that might be copied and later applied to similarly locked devices, undermining the company's encryption efforts.

Today's ruling comes less than a week after FBI director James Comey said law enforcement technicians have attempted, but so far failed, to access information stored on an iPhone owned by the county, but used by Syed Rizwan Farook. Farook and his wife Tashfeen Malik fatally shot 14 people in a terrorist attack last year before being killed in an ensuing police shootout.

"We still have one of those killers' phones that we haven't been able to open," Comey said at a hearing of the Senate Intelligence Committee last week. "It has been two months now and we are still working on it."

The iPhone model in question has yet to be identified, but Apple's iOS operating system has for years provided password-based and remote data wipe options as part of its security suite. The ability to erase phone data is just one facet of a comprehensive encryption system built to secure highly sensitive personal information, including passwords, contacts, biometric data, financial data and more. Apple took an extra step with end-to-end encryption in iOS 8, a protocol the company claims even it can't break.

Update: The Washington Post adds detail to today's court order, saying the phone in question is running Apple's latest iOS 9 operating system. Once the auto-wipe feature is deactivated, technicians can conduct a brute-force attack to crack the passcode, but it is not clear whether Apple is capable of such a feat.
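To see why the auto-wipe feature is the crux of the order, consider a minimal sketch (illustrative only; the unlock check below is a toy stand-in, not Apple's actual mechanism): once a device no longer erases itself after ten failures, every numeric passcode can simply be enumerated.

```python
# Hedged sketch: brute-forcing a numeric passcode becomes trivial once the
# auto-wipe limit is gone. The unlock check is a toy placeholder.
from itertools import product

def brute_force_pin(try_unlock, digits=4):
    """Enumerate every numeric passcode of the given length."""
    for combo in product("0123456789", repeat=digits):
        pin = "".join(combo)
        if try_unlock(pin):
            return pin
    return None

secret = "0853"  # hypothetical passcode for illustration
print(brute_force_pin(lambda guess: guess == secret))  # prints 0853
```

A 4-digit code has only 10,000 combinations, which is why the escalating delays and the 10-try wipe, rather than the code itself, carry the security burden.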

Comments

  • Reply 1 of 102
    cnocbui Posts: 3,613 member
    Judge orders Apple to hand over live unicorn or face penalties. The world waits with bated breath.
  • Reply 2 of 102
    dreyfus2 Posts: 1,070 member
    Well, nobody will deny that this is a legitimate case to ask for access. But there is a huge difference between "providing assistance" (which means accessing this particular phone, if at all possible) and "handing over a software", which could then be used on any phone. While I fully agree with the former, I see no reason for the latter.
  • Reply 3 of 102
    Apple will appeal.

    "provide specialized software" sure sounds like demanding a back door. I don't see how a local judge even has the authority to make such a demand. 
  • Reply 4 of 102
    This is setting up a potential Supreme Court showdown: privacy vs. security.  Unfortunately it puts Apple in an awkward P.R. position: take a stand on privacy and risk being painted as blocking the investigation of obviously ruthless murderers.
  • Reply 5 of 102
    The more the US Government sues Apple in order to break into an iPhone, the better I feel about my privacy.
  • Reply 6 of 102
    It's somehow comforting to know that neither the federal government nor Apple can hack into my phone. That's the way it should be. We just need to find other ways to fight terrorism that do not jeopardize the privacy of millions of Americans. Apple is taking the correct stance on this issue, in my opinion.
  • Reply 7 of 102
    apple ][ Posts: 8,650 member
    The title states that the iPhone belongs to the terrorist lowlife, but is that really accurate? Later in the article, it states that the county owns the iPhone.

    I suppose it was a work phone, issued to the terrorist by their employer: one they were using but didn't technically own?


    edited February 2016
  • Reply 8 of 102
    The right to privacy, though not explicitly enumerated in the Constitution, is now considered settled law.
    While this will occasionally cause issues such as this one, I believe that protecting privacy rights for the overwhelming majority of Americans supersedes the handful of situations where information cannot be accessed. I hope Apple sticks to its guns and prevails.
  • Reply 9 of 102
    It sounds like Apple is being asked to install software that disables the built-in code that auto-deletes/destroys the phone's content when too many bad passwords are attempted (a defense against brute-force attacks).

    In other words, it will give the FBI an unlimited number of password retries.

    To paraphrase Mission Impossible:
    "This phone will self-destruct in 5 seconds."
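    The policy being described can be modeled in a few lines. This is a hedged toy sketch of my own, not Apple's code: a counter that destroys the key (here, just a flag) after 10 failures, after which even the correct PIN no longer works.

```python
# Toy model of the 10-try erase policy: once wiped, no guess can succeed,
# because the decryption key itself has been destroyed.
class PasscodeGuard:
    def __init__(self, pin, max_failures=10):
        self._pin = pin
        self._failures = 0
        self.max_failures = max_failures
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False  # key destroyed; data unrecoverable
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.max_failures:
            self.wiped = True  # nuke the key
        return False

guard = PasscodeGuard("1234")
for _ in range(10):
    guard.try_unlock("0000")
print(guard.wiped)               # True
print(guard.try_unlock("1234"))  # False: even the right PIN fails now
```

Removing the wipe is exactly what turns the FBI's 10 guesses into an unlimited supply.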
  • Reply 10 of 102
    lkrupp Posts: 7,061 member
    Now would be a great time for all those asshat security “researchers” who claim OS X and iOS are riddled with security holes and are “child’s play” to hack into to put up or shut up. You say it’s easy to get into an iPhone? Then call the FBI and tell them how to do it. Otherwise you people are blathering hypocritical nincompoops.
    edited February 2016
  • Reply 11 of 102
    Apparently, the magistrate judge doesn't understand words like "cannot" and "never" or even just "encryption."
  • Reply 12 of 102
    It's a slippery slope. They'll start with this case and then demand help decrypting the cell phones of outvoted discrimination victims at underage drinking parties, to find out who bought for them.
  • Reply 13 of 102
    It sounds like Apple is being asked to install software that disables the built-in code that auto-deletes/destroys the phone's content when too many bad passwords are attempted (a defense against brute-force attacks).

    In other words, it will give the FBI an unlimited number of password retries.

    To paraphrase Mission Impossible:
    "This phone will self-destruct in 5 seconds."
    Apple can sort of do this with specialized tools. The phone's memory can be imaged. Then a new phone with the same serial numbers and other parameters can run that copy and make a few tries. It would be labor-intensive, but the retry limit can be gotten around by extreme measures.
  • Reply 14 of 102
    "provide specialized software" sure sounds like demanding a back door. I don't see how a local judge even has the authority to make such a demand. 
    A federally appointed judge? You bet they have such authority. This is why it's a lifetime appointment.
  • Reply 15 of 102
    To the judge and the FBI:


  • Reply 16 of 102
    beltsbear said:
    It sounds like Apple is being asked to install software that disables the built-in code that auto-deletes/destroys the phone's content when too many bad passwords are attempted (a defense against brute-force attacks).

    In other words, it will give the FBI an unlimited number of password retries.

    To paraphrase Mission Impossible:
    "This phone will self-destruct in 5 seconds."
    Apple can sort of do this with specialized tools. The phone's memory can be imaged. Then a new phone with the same serial numbers and other parameters can run that copy and make a few tries. It would be labor-intensive, but the retry limit can be gotten around by extreme measures.

    I don't think so. Apple uses a dedicated chip to store and process encryption keys. They call this the Secure Enclave. The Secure Enclave stores a full 256-bit AES encryption key.

    Within the secure enclave itself, you have the device's Unique ID (UID). The only place this information is stored is within the secure enclave. It can't be queried or accessed from any other part of the device or OS. Within the phone's processor you also have the device's Group ID (GID). Both of these numbers combine to create half of the encryption key. These numbers are burned into the silicon, aren't accessible outside of the chips themselves, and aren't recorded anywhere once they are burned in. Apple doesn't keep records of them. Since these two different pieces of hardware combine to make half of the encryption key, you can't separate the secure enclave from its paired processor.

    The second half of the encryption key is generated using a random number generator chip. It creates entropy using the various sensors on the iPhone itself during boot (microphone, accelerometer, camera, etc.). This part of the key is also stored within the Secure Enclave, which it never leaves. This storage is tamper-resistant and can't be accessed outside of the encryption system. Even if the UID and GID components of the encryption key were compromised on Apple's end, it still wouldn't be possible to decrypt an iPhone, since that's only half of the key.

    The secure enclave is part of an overall hardware-based encryption system that completely encrypts all of the user storage. It will only decrypt content if provided with the unlock code. The unlock code itself is entangled with the device's UID, so all attempts to decrypt the storage must be done on the device itself. You must have all three pieces present: the specific secure enclave, the specific processor of the iPhone, and the flash memory you are trying to decrypt. Basically, you can't pull the device apart to attack an individual piece of the encryption or get around parts of the encrypted storage process. You can't run the decryption or brute-forcing of the unlock code in an emulator. It requires that the actual hardware components be present, and it can only be done on the specific device itself.
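    The entanglement idea can be sketched in a few lines. This is illustrative only and not Apple's actual key-derivation scheme; PBKDF2 stands in for whatever entangling function the hardware really uses, and all the byte values are made up:

```python
# Hedged sketch: mixing the passcode with hardware-unique values (UID/GID)
# that never leave the chip means the key can only be derived on-device.
import hashlib

def derive_key(uid: bytes, gid: bytes, entropy_half: bytes, passcode: str) -> bytes:
    """Derive a 256-bit key from hardware secrets plus the user passcode."""
    material = uid + gid + entropy_half + passcode.encode()
    # PBKDF2 is a stand-in for the real (undisclosed) entangling function.
    return hashlib.pbkdf2_hmac("sha256", material, b"illustrative-salt", 100_000)

key_on_device = derive_key(b"\x01" * 16, b"\x02" * 16, b"\x03" * 16, "1234")
key_elsewhere = derive_key(b"\xff" * 16, b"\x02" * 16, b"\x03" * 16, "1234")
print(key_on_device != key_elsewhere)  # True: same passcode, different hardware
```

The point of the sketch: knowing the passcode alone is useless off-device, because the hardware halves of the input are unreadable anywhere else.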

    The secure enclave also has hardware-enforced time delays and key destruction. You can set the phone to wipe the encryption key (and all the data contained on the phone) after 10 failed attempts. If you have the data-wipe turned on, then the secure enclave will nuke the key that it stores after 10 failed attempts, effectively erasing all the data on the device. Whether the device-wipe feature is turned on or not, the secure enclave still has a hardware-enforced delay between attempts at entering the code: attempts 1-4 have no delay, attempt 5 has a delay of 1 minute, attempt 6 a delay of 5 minutes, attempts 7 and 8 a delay of 15 minutes, and attempts 9 or more a delay of 1 hour. This delay is enforced by the secure enclave and cannot be bypassed, even if you completely replace the operating system of the phone itself. If you have a 6-digit PIN, it will take, on average, nearly 6 years to brute-force the code. A 4-digit PIN will take almost a year. If you have an alphanumeric password, the amount of time required could extend beyond the heat death of the universe. Key destruction is turned on by default.
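    The delay schedule described above is easy to total up. The function below simply encodes the per-attempt delays as stated (the multi-year brute-force estimates are the poster's own figures, not verified here):

```python
def lockout_delay(attempt):
    """Minutes of forced delay before a given attempt, per the schedule
    described above: attempts 1-4 free, then 1, 5, 15, 15, 60, 60, ..."""
    if attempt <= 4:
        return 0
    if attempt == 5:
        return 1
    if attempt == 6:
        return 5
    if attempt in (7, 8):
        return 15
    return 60

def total_delay_minutes(n_attempts):
    """Cumulative forced waiting across the first n attempts."""
    return sum(lockout_delay(a) for a in range(1, n_attempts + 1))

# With the 10-try wipe enabled, an attacker gets at most 10 guesses,
# stretched over 1 + 5 + 15 + 15 + 60 + 60 = 156 minutes of waiting.
print(total_delay_minutes(10))  # 156
```

With the wipe on, the attacker's real budget is 10 guesses; the delays only matter at scale if the wipe is somehow disabled, which is precisely what the court order asks Apple to do.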

    Even if you pull the flash storage out of the device, image it, and attempt to get around key destruction that way, it won't be successful. The key isn't stored in the flash itself; it's stored only within the secure enclave, whose internal storage can't be removed or imaged.

    Each boot, the secure enclave creates its own temporary encryption key, based on its own UID and a random number generator with proper entropy, which it uses to store the full device encryption key in RAM. Since the encryption key is stored encrypted even in RAM, it can't simply be read out of system memory by reading the RAM bus.

    The only way I can possibly see to potentially unlock the phone without the unlock code is to use an electron microscope to read the encryption key from the secure enclave's own storage. This would take considerable time and expense (likely millions of dollars and several months) to accomplish. This also assumes that the secure enclave chip itself isn't built to be resistant to this kind of attack. The chip could be physically designed such that the very act of exposing the silicon to read it with an electron microscope could itself be destructive.

    It comes down to: "Do you want to allow criminals to access your iPhone so that law enforcement can also access a criminal's iPhone?" I certainly don't.

    The feds would get further doing some social engineering on the guy, or building one of these.

  • Reply 17 of 102
    lkrupp said:
    Now would be a great time for all those asshat security “researchers” who claim OS X and iOS are riddled with security holes and are “child’s play” to hack into to put up or shut up. You say it’s easy to get into an iPhone? Then call the FBI and tell them how to do it. Otherwise you people are blathering hypocritical nincompoops.
    Cue the crickets from the gallery of nincompoops.
  • Reply 18 of 102
    I'm really not surprised by this ruling. It was only a matter of time before some judge would force Apple to access a phone. This is actually positive PR for Apple since the FBI can't get past the iPhone's security. I'm surprised the stock didn't tank after this news came out. 
  • Reply 19 of 102
    lkrupp said:
    Now would be a great time for all those asshat security “researchers” who claim OS X and iOS are riddled with security holes and are “child’s play” to hack into to put up or shut up. You say it’s easy to get into an iPhone? Then call the FBI and tell them how to do it. Otherwise you people are blathering hypocritical nincompoops.
    Those security researchers aren't talking about how hard it is to get past a 4-digit or 6-digit code to get IN the phone. They are talking about security holes in the OS when someone is USING the phone: code executed through the web browser, apps with security holes, etc.

    If iOS / OS X were so secure that it had NOTHING to worry about, don't you think the government and the rest of the world would switch instantly?

    Encryption and passcodes that wipe the device after 10 wrong guesses aren't what the black hat hackers are talking about. It's what can happen while you're using the phone where those "researchers'" claims are TRUE.
    edited February 2016
  • Reply 20 of 102
    fallenjt Posts: 3,976 member
    This is utter bullshit. Apple can, by court order, give law enforcement access to a customer's iPhone in a special situation like this one, but not hand over the software for backdoor access.
