Professor proves NAND mirroring attack thwarts iPhone 5c security protocols

Comments

  • Reply 21 of 41
    So this 5c stymies the entire FBI, which has to pay a hacker a zillion dollars to get it done, and this guy does it in 40 hours with $100 of hardware? Am I missing something here?
  • Reply 22 of 41
    Rosyna said:
    hungover said:
    foggyhill said:
    Didn't we ALREADY know that? I actually knew that without even trying (because I'm a computer engineer).
    That's why Apple changed it later.
    It's not an easy attack though; if a person is doing that to your phone, I'm guessing you can spring for an iPhone 7...
    If Apple were aware that the data could be extracted from the 5c using that technique, why did they refuse to help the FBI and insist that the only way to extract the data would require custom code that could fall into the hands of criminals?

    Apple could have helped the FBI and then said "sorry guys, as much as we'd like to, we can't help you with any newer handsets". Doing so would likely have strengthened their case against future law enforcement demands and saved the public purse a shedload of money.
    No, what it proves is that the FBI never needed Apple's help at all. The FBI's attempt to coerce Apple was just to set a precedent so Apple couldn't refuse to help them in the future.

    That's also why the FBI used a phone for its PR push that they knew had absolutely nothing of value on it.
    Sorry, I am having problems following your logic.

    According to Foggyhill, Apple knew that the 5C could be subjected to a NAND attack when they released it (and presumably believed that the 5S, released at the same time, would be protected from the method). He/she seems to be suggesting that it would have been common knowledge.

    By insisting that the FBI never needed help, you appear to be concurring with him/her.

    So why did Apple tell the courts that helping the FBI would require them to create a "backdoor" custom boot code? Did Apple and the FBI both lie to the courts and the public?

    Regarding precedent setting: yes, a phone used by a terrorist is very emotive and the FBI may have assumed that it would swing public favour in their direction (regardless of the value of the data on it). Nevertheless, Apple simply needed to explain that the newer handsets with the Secure Enclave could not be cracked using the same techniques, if at all, thus negating any attempts to set a precedent.

    And why would the FBI then pay (reputedly) a million dollars to a third party to crack the phone and drop the supposed "precedent"-setting case?


  • Reply 23 of 41
    MacPro Posts: 19,718, member
    Rosyna said:
    hungover said:
    foggyhill said:
    Didn't we ALREADY know that? I actually knew that without even trying (because I'm a computer engineer).
    That's why Apple changed it later.
    It's not an easy attack though; if a person is doing that to your phone, I'm guessing you can spring for an iPhone 7...
    If Apple were aware that the data could be extracted from the 5c using that technique, why did they refuse to help the FBI and insist that the only way to extract the data would require custom code that could fall into the hands of criminals?

    Apple could have helped the FBI and then said "sorry guys, as much as we'd like to, we can't help you with any newer handsets". Doing so would likely have strengthened their case against future law enforcement demands and saved the public purse a shedload of money.
    No, what it proves is that the FBI never needed Apple's help at all. The FBI's attempt to coerce Apple was just to set a precedent so Apple couldn't refuse to help them in the future.

    That's also why the FBI used a phone for its PR push that they knew had absolutely nothing of value on it.
    Exactly.  Few seem to ever cotton on to this.
  • Reply 24 of 41
    hungover said:
    Regarding precedent setting: yes, a phone used by a terrorist is very emotive and the FBI may have assumed that it would swing public favour in their direction (regardless of the value of the data on it). Nevertheless, Apple simply needed to explain that the newer handsets with the Secure Enclave could not be cracked using the same techniques, if at all, thus negating any attempts to set a precedent.

    And why would the FBI then pay (reputedly) a million dollars to a third party to crack the phone and drop the supposed "precedent"-setting case?

    I believe the concern there is that once the FBI had successfully required Apple to provide a back door for the affected phone, precedent would be established, and they would then be able to require it for any future phones, as well as phones from other manufacturers, regardless of whether or not the old back door would work on the new phones.  IMO, it's not beyond the realm of possibility that they knew they could eventually break into this phone, and also knew that breaking into future ones would be significantly more difficult, so they decided to try to set a precedent with it.

    As for why they would drop the case, it could be because they saw a decreasing chance of winning, and decided to retreat and fight another day on a stronger case.  One where they don't screw up the potential evidence before asking for help. :)

  • Reply 25 of 41
    hungover said:

    So why did Apple tell the courts that helping the FBI would require them to create a "backdoor" custom boot code? Did Apple and the FBI both lie to the courts and the public?

    And why would the FBI then pay (reputedly) a million dollars to a third party to crack the phone and drop the supposed "precedent"-setting case?

    The FBI wanted the unfettered ability to endlessly brute-force the passcode without resorting to NAND mirroring every ten tries. It doesn't matter which parties knew about NAND mirroring, or when.
  • Reply 26 of 41
    hungover said:
    Rosyna said:
    hungover said:
    foggyhill said:
    Didn't we ALREADY know that? I actually knew that without even trying (because I'm a computer engineer).
    That's why Apple changed it later.
    It's not an easy attack though; if a person is doing that to your phone, I'm guessing you can spring for an iPhone 7...
    If Apple were aware that the data could be extracted from the 5c using that technique, why did they refuse to help the FBI and insist that the only way to extract the data would require custom code that could fall into the hands of criminals?

    Apple could have helped the FBI and then said "sorry guys, as much as we'd like to, we can't help you with any newer handsets". Doing so would likely have strengthened their case against future law enforcement demands and saved the public purse a shedload of money.
    No, what it proves is that the FBI never needed Apple's help at all. The FBI's attempt to coerce Apple was just to set a precedent so Apple couldn't refuse to help them in the future.

    That's also why the FBI used a phone for its PR push that they knew had absolutely nothing of value on it.
    Sorry, I am having problems following your logic.

    According to Foggyhill, Apple knew that the 5C could be subjected to a NAND attack when they released it (and presumably believed that the 5S, released at the same time, would be protected from the method). He/she seems to be suggesting that it would have been common knowledge.

    By insisting that the FBI never needed help, you appear to be concurring with him/her.

    So why did Apple tell the courts that helping the FBI would require them to create a "backdoor" custom boot code? Did Apple and the FBI both lie to the courts and the public?

    Regarding precedent setting: yes, a phone used by a terrorist is very emotive and the FBI may have assumed that it would swing public favour in their direction (regardless of the value of the data on it). Nevertheless, Apple simply needed to explain that the newer handsets with the Secure Enclave could not be cracked using the same techniques, if at all, thus negating any attempts to set a precedent.

    And why would the FBI then pay (reputedly) a million dollars to a third party to crack the phone and drop the supposed "precedent" setting case?


    This is a hack. When you want to hack something, you go to hackers. Apple is not a hacker, so they wouldn't use this technique. What they will do, if forced, is create new software to unlock the phone.
  • Reply 27 of 41
    mike1 Posts: 3,275, member
    macseeker said:
    Soli said:
    A four-digit passcode took about 40 hours to crack, Skorobogatov said, adding that a six-digit code could take hundreds of hours. Apple estimated similar numbers when the FBI obtained a court order forcing Apple to access an iPhone 5c tied to last year's San Bernardino terror attack. 
    1) I wonder how long that takes when you enable a complex passcode that uses the entirety of the Apple keyboard? Let's say a 6-character word using letters, numbers, and punctuation.

    2) Can this attack even discern that the phone expects a complex passcode rather than a simple one? I assume there would be an unprotected marker, since the iOS keyboard changes to account for this.
    Using all the character keys on the keyboard, there are (46 × 2) to the 6th power, or 606,355,001,344, combinations. Wonder how long it would take to find the password. I also included the special characters: 46 keys, each with and without the shift key, thus 46 × 2 = 92.
    A lot longer than the "hundreds of hours" the guy claims. Unless, of course, you got really lucky.
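    A quick way to sanity-check those numbers, for anyone curious (a rough Python sketch; the ~250 guesses/hour rate is only what the article's figures imply, roughly 10,000 four-digit codes in about 40 hours, not a measured value):

    # Back-of-the-envelope keyspace and brute-force estimate (assumptions noted above)
    SYMBOLS = 46 * 2                 # 46 character keys, with and without shift -> 92
    LENGTH = 6                       # six-character complex passcode

    keyspace = SYMBOLS ** LENGTH
    print(f"Keyspace: {keyspace:,}")                 # 606,355,001,344

    guesses_per_hour = 10_000 / 40                   # ~250/hour, implied by the article
    hours = keyspace / guesses_per_hour
    print(f"Worst case: ~{hours:,.0f} hours (~{hours / 8760:,.0f} years)")

    Even allowing for an average-case hit at half that, a complex six-character passcode is far beyond "hundreds of hours" at this attack's rate.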
  • Reply 28 of 41
    radarthekat Posts: 3,842, moderator
    macseeker said:
    Soli said:
    A four-digit passcode took about 40 hours to crack, Skorobogatov said, adding that a six-digit code could take hundreds of hours. Apple estimated similar numbers when the FBI obtained a court order forcing Apple to access an iPhone 5c tied to last year's San Bernardino terror attack. 
    1) I wonder how long that takes when you enable a complex passcode that uses the entirety of the Apple keyboard? Let's say a 6-character word using letters, numbers, and punctuation.

    2) Can this attack even discern that the phone expects a complex passcode rather than a simple one? I assume there would be an unprotected marker, since the iOS keyboard changes to account for this.
    Using all the character keys on the keyboard, there are (46 × 2) to the 6th power, or 606,355,001,344, combinations. Wonder how long it would take to find the password. I also included the special characters: 46 keys, each with and without the shift key, thus 46 × 2 = 92.
    And one wouldn't need to limit himself to the 6 characters Soli suggested.  A terrorist who wants to secure his data/contacts might go with eight, or 11, or 26 characters, switching from a manageable 6- or 7-character passcode to a much longer one just prior to carrying out an attack.
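    Extending the same rough sketch to the longer passcodes mentioned here (still assuming the ~250 guesses/hour implied by the article's 40-hour figure), the worst-case time grows by a factor of 92 for every added character:

    # Worst-case brute-force time vs. passcode length (illustrative assumptions only)
    SYMBOLS = 92                     # 46 keys, with and without shift
    GUESSES_PER_HOUR = 250           # implied by ~10,000 codes in ~40 hours

    for length in (6, 8, 11, 26):
        years = SYMBOLS ** length / GUESSES_PER_HOUR / 8760
        print(f"{length:>2} characters: ~{years:.2e} years worst case")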
  • Reply 29 of 41
    radarthekat Posts: 3,842, moderator
    hungover said:
    Rosyna said:
    hungover said:
    foggyhill said:
    Didn't we ALREADY know that? I actually knew that without even trying (because I'm a computer engineer).
    That's why Apple changed it later.
    It's not an easy attack though; if a person is doing that to your phone, I'm guessing you can spring for an iPhone 7...
    If Apple were aware that the data could be extracted from the 5c using that technique, why did they refuse to help the FBI and insist that the only way to extract the data would require custom code that could fall into the hands of criminals?

    Apple could have helped the FBI and then said "sorry guys, as much as we'd like to, we can't help you with any newer handsets". Doing so would likely have strengthened their case against future law enforcement demands and saved the public purse a shedload of money.
    No, what it proves is that the FBI never needed Apple's help at all. The FBI's attempt to coerce Apple was just to set a precedent so Apple couldn't refuse to help them in the future.

    That's also why the FBI used a phone for its PR push that they knew had absolutely nothing of value on it.
    Sorry, I am having problems following your logic.

    According to Foggyhill, Apple knew that the 5C could be subjected to a NAND attack when they released it (and presumably believed that the 5S, released at the same time, would be protected from the method). He/she seems to be suggesting that it would have been common knowledge.

    By insisting that the FBI never needed help, you appear to be concurring with him/her.

    So why did Apple tell the courts that helping the FBI would require them to create a "backdoor" custom boot code? Did Apple and the FBI both lie to the courts and the public?

    Regarding precedent setting: yes, a phone used by a terrorist is very emotive and the FBI may have assumed that it would swing public favour in their direction (regardless of the value of the data on it). Nevertheless, Apple simply needed to explain that the newer handsets with the Secure Enclave could not be cracked using the same techniques, if at all, thus negating any attempts to set a precedent.

    And why would the FBI then pay (reputedly) a million dollars to a third party to crack the phone and drop the supposed "precedent"-setting case?


    It's because the FBI explicitly demanded a backdoor. They did not ask that Apple use some hardware approach to break into that phone. All parties agreed that doing so could damage the data on the phone, and since the FBI's actual motivation was to secure a backdoor from Apple, they deliberately avoided adding to the conversation any notion of asking Apple to perform this type of NAND mirroring technique. The fact that the press was talking about it as a possibility is a separate issue from the carefully crafted communication between the FBI and Apple.

    As to why Apple might not bring it up, it could be as simple as the fact that Apple doesn't want to get into the business of becoming a forensics lab, with all the heavy burden of documenting processes used to gather evidence and dealing with the challenges brought by defense attorneys, which would require handing over to defense experts the details of any technique used to hack into an iPhone. This is also part of the reason they wouldn't want to provide a backdoor: defense attorneys could legally require them to hand over source code, and then the iPhone's security secrets would be flying around in the hands of those who are assigned, or hired, to represent the world's baddest of bad actors. And every hacker who could hack their servers. And every other bad actor who could pay them for a copy. Nope, Apple avoided that whole slippery slope for very valid reasons.
  • Reply 30 of 41
    lkrupp said:
    And the typical user should be worried about this because...?
    Because the US government lobbied for and wants to impose encryption bypass capabilities. They used the cited case as justification for that, when any decent engineer knew the phone could be bypassed without government-mandated changes - and that even then the content could be encrypted using something besides what the OS offers automatically.
  • Reply 31 of 41
    davidw Posts: 2,036, member

    Sorry, I am having problems following your logic.

    According to Foggyhill, Apple knew that the 5C could be subjected to a NAND attack when they released it (and presumably believed that the 5S, released at the same time, would be protected from the method). He/she seems to be suggesting that it would have been common knowledge.

    By insisting that the FBI never needed help, you appear to be concurring with him/her.

    So why did Apple tell the courts that helping the FBI would require them to create a "backdoor" custom boot code? Did Apple and the FBI both lie to the courts and the public?

    Regarding precedent setting: yes, a phone used by a terrorist is very emotive and the FBI may have assumed that it would swing public favour in their direction (regardless of the value of the data on it). Nevertheless, Apple simply needed to explain that the newer handsets with the Secure Enclave could not be cracked using the same techniques, if at all, thus negating any attempts to set a precedent.

    And why would the FBI then pay (reputedly) a million dollars to a third party to crack the phone and drop the supposed "precedent" setting case?


    The FBI never asked Apple to retrieve the data on the phone. The FBI asked Apple specifically, in the court order, to write a new iOS that was less secure. The new iOS wasn't to include the 10-incorrect-guesses-and-the-data-gets-deleted feature or the time delay between incorrect guesses. Plus, the new iOS had to allow the passcode to be input using a computer connected to the Lightning port. Once it was written, Apple was to replace the iOS on the terrorist's iPhone with the new, less secure one. Once it was installed, Apple was to hand over the iPhone to the FBI so they could brute-force the passcode using a computer, in their own lab. Apple never said that the only way to access the data was to write a backdoor. A backdoor was what the FBI wanted Apple to create.

    Installing the new, less secure iOS on the terrorist's iPhone was something only Apple could do, as the new iOS has to be signed by Apple on the server end. If the FBI could force Apple to do this, then the FBI could conceivably hire their own programmers to write a new iOS for any iPhone and then, using this precedent, force Apple to sign off on it.

    The reason the FBI dropped the case was that it was becoming evident they were going to lose. If they lost, then the precedent would be set in Apple's (and other tech companies') favor, and they could use it any time the FBI tried to force them to hack into their own devices. The FBI couldn't take that chance.
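    For a sense of why removing the delay and the 10-guess wipe was the whole ballgame: Apple's iOS Security Guide has said the passcode key derivation is tuned to take roughly 80 ms per attempt on the device, so with guesses fed in over the Lightning port that per-attempt cost becomes essentially the only limit. A rough sketch (the 80 ms figure is an approximation, not a measurement of this particular phone):

    # Estimated brute-force times once retry delays and the 10-guess wipe are removed
    SECONDS_PER_ATTEMPT = 0.08       # ~80 ms per try, from the key-derivation tuning

    for digits in (4, 6):
        hours = 10 ** digits * SECONDS_PER_ATTEMPT / 3600
        print(f"{digits}-digit passcode: worst case ~{hours:.1f} hours")

    That works out to roughly 13 minutes for a 4-digit code and under a day for a 6-digit one, which is exactly the kind of unfettered brute force the order was after.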
    edited September 2016
  • Reply 32 of 41
    nclman said:
    4a6d65 said:
    I agree with foggyhill and EsquireCats. It's also going to scare people who don't understand that this type of attack can't work on devices with a Secure Enclave.
    Why not? The Secure Enclave is not changed in this case. Hence, the UID/GID used to wrap the protection class keys are still valid, and the data to be read in NAND is unchanged.

    I would argue that unless the Secure Enclave is also using the NAND device ID in its data protection, this method should work as well.
    What am I missing...
    The Secure Enclave is a hardware encryption system. This hack relies on the fact that with software encryption, all the relevant security data (failed-password counter, encryption keys, etc.) live on the NAND. With an encryption chip like the Secure Enclave, these are typically built into the chip itself, so you can't reset the counter simply by swapping the NAND for a copy that doesn't 'know' about the failed attempts. Well-designed encryption chips are typically tamper-resistant so that you can't simply clone them, and even if they weren't, they are custom jobs that would be more difficult to source and clone than simple NAND memory.

    Every failed passcode increments the failure counter on the Secure Enclave (which can't be reset by replacing the NAND), and when you hit the limit, the chip permanently deletes the encryption keys, leaving any information on the NAND totally unrecoverable (using current technology).
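    A purely illustrative model of that difference (the class names and fields are made up, not Apple's design; they just show why re-flashing a cloned NAND rewinds one counter and not the other):

    # Where the failure counter lives decides whether NAND mirroring can reset it.

    class NandCounterPhone:
        """5c-style: the counter sits on NAND, so restoring a NAND image rewinds it."""
        def __init__(self):
            self.nand = {"fail_count": 0}        # lives on the removable flash chip
            self.backup = dict(self.nand)        # attacker's mirrored copy

        def wrong_guess(self):
            self.nand["fail_count"] += 1

        def restore_nand(self):                  # swap in the clone: counter back to 0
            self.nand = dict(self.backup)

    class EnclaveCounterPhone:
        """5s-and-later style: counter and keys stay inside the tamper-resistant chip."""
        def __init__(self):
            self.enclave_fail_count = 0          # inside the chip, not on NAND
            self.enclave_keys_intact = True
            self.nand = {"ciphertext": "..."}    # useless once the enclave's keys go

        def wrong_guess(self):
            self.enclave_fail_count += 1
            if self.enclave_fail_count >= 10:
                self.enclave_keys_intact = False # keys erased; NAND data undecryptable

        def restore_nand(self):
            pass                                 # re-flashing NAND changes nothing

    # Mirror-and-retry works indefinitely on the first design; on the second, the
    # counter survives the swap and the limit still bites.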
  • Reply 33 of 41
    It's because the FBI explicitly demanded a backdoor. They did not ask that Apple use some hardware approach to break into that phone. All parties agreed that doing so could damage the data on the phone, and since the FBI's actual motivation was to secure a backdoor from Apple, they deliberately avoided adding to the conversation any notion of asking Apple to perform this type of NAND mirroring technique. The fact that the press was talking about it as a possibility is a separate issue from the carefully crafted communication between the FBI and Apple.

    As to why Apple might not bring it up, it could be as simple as the fact that Apple doesn't want to get into the business of becoming a forensics lab, with all the heavy burden of documenting processes used to gather evidence and dealing with the challenges brought by defense attorneys, which would require handing over to defense experts the details of any technique used to hack into an iPhone. This is also part of the reason they wouldn't want to provide a backdoor: defense attorneys could legally require them to hand over source code, and then the iPhone's security secrets would be flying around in the hands of those who are assigned, or hired, to represent the world's baddest of bad actors. And every hacker who could hack their servers. And every other bad actor who could pay them for a copy. Nope, Apple avoided that whole slippery slope for very valid reasons.

    For now, the Feds aren't flying the white flag yet, though, not by a long shot. They've retreated to regroup, re-strategise and wait for the next event. But make no mistake, governments will have the backdoor if people can be put in a position where they have little choice. Choose between Trump or Hillary: who is the lesser evil? Who do you hate least? What choice is that? No different than "we've confiscated an iOS device that a real live terrorist in custody has identified as containing information about an imminent attack at some location in New York, but the device is encrypted; there is no time to sit around and debate morality and any such nonsense, Apple, give us that key." Do you think Apple could say no? Safety or encryption? What choice is that? Probably no different than Trump or Hillary: it isn't a choice, it is the mere appearance of one. I think most are missing the game being played. If the government acted at the right time, on the heels of just the right disaster, I don't think this would be a difficult thing. Apple can make the right choice and fight because the law allows it, but laws can be amended if circumstances press. Before posting dislikes, ask yourselves this: which government-level action has ever been taken by first asking the people it affected? How many Americans voted to go to war, or to enact the Patriot Act? People like to go on about how they live in a democratic country where their leaders are accountable, but that's just an illusion for the sake of order. Do people really think the FBI is really filled with such incompetence that they would pursue a backdoor to encryption without realizing that if they got one, terrorists would just stop using iOS or install custom encryption? Really? This is an attempt at something far more unscrupulous: a pursuit of control, pure hegemonic domination of the population. I think terrorism has fucked us all in more ways than the obvious.
  • Reply 34 of 41
    1st Posts: 443, member
    svanthem said:
    So this 5c stymies the entire FBI, which has to pay a hacker a zillion dollars to get it done, and this guy does it in 40 hours with $100 of hardware? Am I missing something here?

    The "guy" is super smart computer geek professor from top notch university in UK.  FBI may not have access to his brain at any price.  As for $100 or so hardware, it depend upon the user/designer, not the hardware... you missing big here.  Knowing who is the best in the field is priceless.  
  • Reply 35 of 41
    Rayz2016 said:
    lkrupp said:
    And the typical user should be worried about this because...?
    The typical user shouldn't be worried about this. However, the typical user should be worried that the FBI is either lying or incompetent.
    Or both.
  • Reply 36 of 41
    davidw said:
    nclman said:
    4a6d65 said:
    I agree with foggyhill and EsquireCats. It's also going to scare people who don't understand that this type of attack can't work on devices with a Secure Enclave.
    Why not? The Secure Enclave is not changed in this case. Hence, the UID/GID used to wrap the protection class keys are still valid, and the data to be read in NAND is unchanged.

    I would argue that unless the Secure Enclave is also using the NAND device ID in its data protection, this method should work as well.
    What am I missing…

    The time delay after every missed guess. That is built into the Secure Enclave, not the OS, and does not reset when the limit is reached and the data is deleted on the NAND, nor does it reset after rebooting the iPhone. When a cloned NAND is put in after 10 guesses, the time delay remains and guesses can only be input at a rate of maybe 1 every hour (or so). So theoretically, this technique may work, but it will take a lot, lot, lot longer to brute-force the code. Even a 4-digit passcode may take years to hack.
    Thanks for the explanation. That would imply that the secure enclave processor has some sort of eeprom? Would you have some links to some supporting documents?
  • Reply 37 of 41
    davidw Posts: 2,036, member
    nclman said:
    davidw said:
    nclman said:
    4a6d65 said:
    I agree with foggyhill and EsquireCats. It's also going to scare people who don't understand that this type of attack can't work on devices with a Secure Enclave.
    Why not? The Secure Enclave is not changed in this case. Hence, the UID/GID used to wrap the protection class keys are still valid, and the data to be read in NAND is unchanged.

    I would argue that unless the Secure Enclave is also using the NAND device ID in its data protection, this method should work as well.
    What am I missing…

    The time delay after every missed guess. That is built into the Secure Enclave, not the OS, and does not reset when the limit is reached and the data is deleted on the NAND, nor does it reset after rebooting the iPhone. When a cloned NAND is put in after 10 guesses, the time delay remains and guesses can only be input at a rate of maybe 1 every hour (or so). So theoretically, this technique may work, but it will take a lot, lot, lot longer to brute-force the code. Even a 4-digit passcode may take years to hack.
    Thanks for the explanation. That would imply that the secure enclave processor has some sort of eeprom? Would you have some links to some supporting documents?
    https://daringfireball.net/linked/2016/02/17/ios-security-guide
  • Reply 38 of 41
    davidw said:
    nclman said:
    Thanks for the explanation. That would imply that the secure enclave processor has some sort of eeprom? Would you have some links to some supporting documents?
    https://daringfireball.net/linked/2016/02/17/ios-security-guide
    I suppose you are referring to this statement:

    "On devices with an A7 or later A-series processor, the delays are enforced by the Secure Enclave. If the device is restarted during a timed delay, the delay is still enforced, with the timer starting over for the current period."

    Doesn't mention how it's done though.
    I suppose it's possible that secure enclave stores the failed attempts/delays in embedded eeprom or register. But I can't find any confirmation.
  • Reply 39 of 41
    davidw Posts: 2,036, member
    nclman said:
    davidw said:
    nclman said:
    Thanks for the explanation. That would imply that the secure enclave processor has some sort of eeprom? Would you have some links to some supporting documents?
    https://daringfireball.net/linked/2016/02/17/ios-security-guide
    I suppose you are referring to this statement:

    "On devices with an A7 or later A-series processor, the delays are enforced by the Secure Enclave. If the device is restarted during a timed delay, the delay is still enforced, with the timer starting over for the current period."

    Doesn't mention how it's done though.
    I suppose it's possible that secure enclave stores the failed attempts/delays in embedded eeprom or register. But I can't find any confirmation.

    Logic would dictate that the Secure Enclave has access to some form of memory storage that isn't reset with each boot, as somewhere there is the stored digital representation of the fingerprint used to unlock the iPhone and the CC number used to generate the one-time-use token. Otherwise it would not know when to unlock the iPhone when the proper fingerprint is detected, or how to generate a token for a CC transaction with Apple Pay. But from what I gather, this data is encrypted by the Secure Enclave and can only be read by it. Whether the storage is embedded internally in the Secure Enclave chip or is part of its supporting external architecture, there's no way to access this data. At least not without Apple's help. And even then, maybe Apple can't even access it.

    Maybe you can figure it out with this; I can't. It's way more info than I ever need or care to know.

    https://www.blackhat.com/docs/us-16/materials/us-16-Mandt-Demystifying-The-Secure-Enclave-Processor.pdf
  • Reply 40 of 41
    davidw said:

    Logic would dictate that the Secure Enclave has access to some form of memory storage that isn't reset with each boot, as somewhere there is the stored digital representation of the fingerprint used to unlock the iPhone and the CC number used to generate the one-time-use token. Otherwise it would not know when to unlock the iPhone when the proper fingerprint is detected, or how to generate a token for a CC transaction with Apple Pay. But from what I gather, this data is encrypted by the Secure Enclave and can only be read by it. Whether the storage is embedded internally in the Secure Enclave chip or is part of its supporting external architecture, there's no way to access this data. At least not without Apple's help. And even then, maybe Apple can't even access it.

    Maybe you can figure it out with this; I can't. It's way more info than I ever need or care to know.

    https://www.blackhat.com/docs/us-16/materials/us-16-Mandt-Demystifying-The-Secure-Enclave-Processor.pdf
    I read this thing weeks ago. Nothing is mentioned about persistent storage in the Secure Enclave, even in the patent.
    Although the patent does contain the following statement:

    "In addition to security peripherals designed to perform specific functions, there may also be security peripherals that are interface units for secure interfaces such as the secure interface unit 36E. In the illustrated embodiment, the secure interface unit 36E may be an interface to an off SOC 10 (“off-chip”) secure memory. For example, the interface may an interface to an off SOC Smart Card."