This might be how law enforcement agencies break into the iPhone

Posted in General Discussion, edited December 2020
A group of cryptography experts has proposed a theory about how law enforcement can still break into iPhones despite continuous iOS patches and layers of safeguards: Apple's strongest encryption now protects less data than it used to.

Credit: Blocks/Unsplash


Matthew Green, an associate professor at Johns Hopkins Information Security Institute, proposed the theory in a Twitter thread on Wednesday in response to news of the ACLU suing for information about iPhone unlocking methods. The theory is based on research from two of his students, Maximilian Zinkus and Tushar M. Jois.

My students @maxzks and Tushar Jois spent most of the summer going through every piece of public documentation, forensics report, and legal document we could find to figure out how police were "breaking phone encryption". 1/ https://t.co/KqkmQ1QrEy

-- Matthew Green (@matthew_d_green)


Green contends that law enforcement agencies no longer need to break the strongest encryption on an iPhone because not all types of user data are protected by it.

The research was prompted by reports that forensic companies no longer have the ability to break Apple's Secure Enclave Processor, which makes it very difficult to crack an iPhone's passcode. Given that law enforcement agencies continue to break into locked devices, Green and his students began researching how that could be possible.

They came up with a possible answer, which Green said would be fully detailed in a report after the holidays. Although it's conjecture, it could explain how government and police entities are still able to extract data from locked iPhones.

It boils down to the fact that an iPhone can be in one of two states: Before First Unlock (BFU) and After First Unlock (AFU). When a user powers on the device and enters the passcode, the iPhone uses it to derive several distinct sets of cryptographic keys, which stay in memory and are used to encrypt and decrypt files; from that point on, the device is in the AFU state.
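The passcode-to-keys step can be sketched with a standard key-derivation function. The following Python snippet is a toy illustration only, not Apple's actual scheme: real iOS entangles the passcode with a device-unique hardware key inside the Secure Enclave, which cannot be reproduced off-device, and the class names here are placeholders.

```python
import hashlib

def derive_class_keys(passcode: str, salt: bytes) -> dict:
    """Derive one key per (hypothetical) protection class from a passcode.

    Toy sketch: real iOS runs the passcode through a KDF entangled with a
    hardware UID key inside the Secure Enclave, so keys can never be
    derived off-device like this.
    """
    classes = ["complete", "until_first_auth", "no_protection"]
    return {
        # Different salt per class yields independent 32-byte keys.
        name: hashlib.pbkdf2_hmac(
            "sha256", passcode.encode(), salt + name.encode(), 100_000
        )
        for name in classes
    }

keys = derive_class_keys("123456", b"per-device-salt")
```

The point mirrored here is that a single passcode yields several independent key sets, which is what lets the OS later discard some of them while keeping others resident in memory.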

When the user locks the device again, it doesn't drop back into BFU but remains in the AFU state. Green notes that only one set of cryptographic keys gets purged from memory, and that set stays gone until the user unlocks the iPhone again.

The purged set of keys is the one used to decrypt a subset of an iPhone's files that fall under a specific protection class. The other key sets, which stay in memory, are used to decrypt all other files.
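This lock-state behavior can be modeled in a few lines of Python. The sketch below is deliberately simplified; the class names and state labels are illustrative stand-ins, not Apple's actual identifiers.

```python
class ToyPhone:
    """Toy model of the BFU/AFU key-purging behavior described above."""

    def __init__(self):
        self.in_memory_keys = set()   # protection classes with a resident key
        self.state = "BFU"            # Before First Unlock: no keys derived yet

    def first_unlock(self, passcode: str) -> None:
        # Entering the passcode derives keys for every protection class.
        self.in_memory_keys = {"complete", "until_first_auth"}
        self.state = "AFU"

    def lock(self) -> None:
        # Locking does NOT return the phone to BFU: only the strongest
        # class's key is purged, and the rest stay resident in memory.
        self.in_memory_keys.discard("complete")

    def can_decrypt(self, protection_class: str) -> bool:
        return protection_class in self.in_memory_keys

phone = ToyPhone()
phone.first_unlock("123456")
phone.lock()
```

In this model, files in the hypothetical "until_first_auth" class stay decryptable after locking because their key is still in memory, while "complete"-class files do not; that gap is exactly what the research suggests forensic tools exploit.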

From here, all a law enforcement entity needs to do is use known software exploits to bypass the iOS lock screen and decrypt most of the files. Using code that runs with normal privileges, it can access data just as a legitimate app would. As Green points out, the important question is which files are protected by the purged set of keys.

Based on Apple's documentation, it appears that the strongest protection class only applies to mail and app launch data.

Apple *sort of* vaguely offers a list of the apps whose files get this special protection even in the AFU state. But notice how vague this language is. I have to actually decode it. 14/ pic.twitter.com/OMIy297605

-- Matthew Green (@matthew_d_green)


Comparing that to the same text from 2012, it seems that the strongest encryption doesn't safeguard as many data types as it once did.

The data types that don't get the strong protection include Photos, Texts, Notes, and possibly certain types of location data. Those are all typically of particular interest to law enforcement agencies.

So this answers the great mystery of "how are police breaking Apple's encryption in 2020". The answer is they probably aren't. They're seizing unlocked phones and using jailbreaks to dump the filesystem, most of which can be accessed easily since keys are in memory. 20/

-- Matthew Green (@matthew_d_green)


Third-party apps, however, can opt in to protecting user data with the strongest protection class.

As far as why Apple seems to have weakened the protections, Green theorizes that the company forfeited maximum security to enable specific app or system features like location-based reminders. Similarly, some apps wouldn't be able to function properly if the strongest encryption class was used for most data.

Green notes that the situation is "similar" on Android. But, for Apple, the cryptography professor says that "phone encryption is basically a no-op against motivated attackers."

If I could tell Apple to do one thing, I would tell them to figure this problem out. Because without protection for the AFU state, phone encryption is basically a no-op against motivated attackers.

Maybe Apple's lawyers prefer it this way, but it's courting disaster. 25/

-- Matthew Green (@matthew_d_green)


The findings, as well as other details and possible solutions, are outlined in a research paper penned by Green, Zinkus, and Jois.

Comments

  • Reply 1 of 28
    Fascinating... His 25-tweet thread is really interesting. It sounds like it comes down to the OS using the weaker encryption option on most of a device's relevant content in order to allow the software to do things in the background while your phone is locked -- using the decryption key stored in memory, which attackers have access to:

    Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things. When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff.
  • Reply 2 of 28
    elijahg Posts: 2,759 member
    I wonder if this is intentional, so Apple can keep telling its users their data is encrypted, which it is, but also turn a blind eye to the hacks law enforcement uses to dump the phone's contents. That way they don't get forced to put in an explicit backdoor, because there is a workaround. Either that, or Apple has been secretly forced to allow access, and these encryption workarounds give the illusion of privacy and non-compliance with law enforcement bigwigs even though Apple is actually bending, with this being the best way they've got to keep the agreement secret.
  • Reply 3 of 28
    Rayz2016 Posts: 6,957 member
    Fascinating... His 25-tweet thread is really interesting. It sounds like it comes down to the OS using the weaker encryption option on most of a device's relevant content in order to allow the software to do things in the background while your phone is locked -- using the decryption key stored in memory, which attackers have access to:

    Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things. When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff.
    Indeed. 

    The location data makes sense, but the other stuff he mentioned doesn’t really need to be unencrypted while no one’s looking at it. I’m wondering if it’s a change that was made to conserve power. 

    Oh wait. Here’s something that happens in the background: indexing and processing for machine learning.  I reckon a lot of that gets done while the phone is locked and probably can’t be done without decrypting the data. 
  • Reply 4 of 28
    dewme Posts: 5,356 member
    elijahg said:
    I wonder if this is intentional, so Apple can keep telling its users their data is encrypted, which it is, but also turn a blind eye to the hacks law enforcement uses to dump the phone's contents. That way they don't get forced to put in an explicit backdoor, because there is a workaround. Either that, or Apple has been secretly forced to allow access, and these encryption workarounds give the illusion of privacy and non-compliance with law enforcement bigwigs even though Apple is actually bending, with this being the best way they've got to keep the agreement secret.
    I think you are actually pointing in the right direction. Apple isn't stupid, and to believe that they are somehow being repeatedly "duped" by US and Israeli security experts despite their proclamations of providing "total security and privacy" for their customers is a little bit more than a stretch or the ultimate "oops." There is probably a game of Chicken going on between Apple and government agencies like the NSA. Apple knows that it could lock down their stuff in ways that would make life miserable for the NSA. At the same time Apple also knows if they actually did this, all pretense of civility and of private industry operating independently without the heavy hand of government slapping them down would vanish. No matter how you want to spin this, there is no way that Apple (or any other private company) would come out as the "winner" in this struggle. The winners and losers in such a conflict are predetermined, so we'll all get to witness these little theatrical performances for as long as it takes to avoid or at least delay the inevitable outcome.
  • Reply 5 of 28
    Gaby Posts: 190 member
    I’d be interested to know whether, when you manually lock the phone down with a long press on sleep/wake + volume (which locks out biometrics and necessitates passcode re-entry), it is considered BFU or AFU. Technically it is AFU, but from what I remember of Apple execs discussing it, that is supposed to lock down the phone. In which case it would still be feasible to lock people out without a power down. Hmmm....
  • Reply 6 of 28
    Is this a possible frog boiling Trojan Horse that has been ongoing since 2011, or long prior ?  
    Why do all roads increasingly seem to lead to iCloud ?
    Has Apple acknowledged they have YOUR key?
    arstechnica.com/tech-policy/2020/01/apple-reportedly-nixed-plan-for-end-to-end-encryption-in-iphone-backups/
    www.reuters.com/article/us-china-apple-icloud-insight-idUSKCN1G8060
    How does the Patriot Act affect those beyond the US border, for what borders and rights might be worth these days...?
    Why was Photos auto tagging upon intro with no off user switch ?
    Do onboard storage and T2 offer a way to verify user data, 'anonymized' or not ?
    Does every update seed further potential 'roots' into personal information to 'improve the user experience' ?
    If there may seem little cost now, more importantly what might such cost be in an unknown future...?
    Is Apple increasingly 'anything for a buck'?
    en.wikipedia.org/wiki/Surveillance_capitalism
    Let the flames begin...
  • Reply 7 of 28
    Is this a possible frog boiling Trojan Horse that has been ongoing since 2011, or long prior ?  
    Why do all roads increasingly seem to lead to iCloud ?
    Has Apple acknowledged they have YOUR key?
    arstechnica.com/tech-policy/2020/01/apple-reportedly-nixed-plan-for-end-to-end-encryption-in-iphone-backups/
    www.reuters.com/article/us-china-apple-icloud-insight-idUSKCN1G8060
    How does the Patriot Act affect those beyond the US border, for what borders and rights might be worth these days...?
    Why was Photos auto tagging upon intro with no off user switch ?
    Do onboard storage and T2 offer a way to verify user data, 'anonymized' or not ?
    Does every update seed further potential 'roots' into personal information to 'improve the user experience' ?
    If there may seem little cost now, more importantly what might such cost be in an unknown future...?
    Is Apple increasingly 'anything for a buck'?
    en.wikipedia.org/wiki/Surveillance_capitalism
    Let the flames begin...
    Please just move into a Faraday cage already.
  • Reply 8 of 28
    Rayz2016 Posts: 6,957 member
    Is this a possible frog boiling Trojan Horse that has been ongoing since 2011, or long prior ?  
    Why do all roads increasingly seem to lead to iCloud ?
    Has Apple acknowledged they have YOUR key?
    arstechnica.com/tech-policy/2020/01/apple-reportedly-nixed-plan-for-end-to-end-encryption-in-iphone-backups/
    www.reuters.com/article/us-china-apple-icloud-insight-idUSKCN1G8060
    How does the Patriot Act affect those beyond the US border, for what borders and rights might be worth these days...?
    Why was Photos auto tagging upon intro with no off user switch ?
    Do onboard storage and T2 offer a way to verify user data, 'anonymized' or not ?
    Does every update seed further potential 'roots' into personal information to 'improve the user experience' ?
    If there may seem little cost now, more importantly what might such cost be in an unknown future...?
    Is Apple increasingly 'anything for a buck'?
    en.wikipedia.org/wiki/Surveillance_capitalism
    Let the flames begin...
    Please just move into a Faraday cage already.
    Well that flame war ended pretty quickly …
  • Reply 9 of 28
    bluefire1 Posts: 1,302 member
    Apple’s first priority should be to us, their customers. 
    The company consistently touts the importance of privacy, yet if these cryptography experts are correct, then privacy has taken a backseat in order to advance other features. I, for one, wish Apple would, at the very least, restore the level of encryption provided in the 2012 iPhone.
  • Reply 10 of 28
    It sounds like the ACLU believes that the "unlocked phone" part of the process isn't happening legally, i.e., some type of coercion that violates rights. 
  • Reply 11 of 28
    Rayz2016 Posts: 6,957 member
    bluefire1 said:
    Apple’s first priority should be to us, their customers. 
    The company consistently touts the importance of privacy, yet if these cryptography experts are correct, then privacy has taken a backseat in order to advance other features. I, for one, wish Apple would, at the very least, restore the level of encryption provided in the 2012 iPhone.

    Well if they want to do the AI stuff, then a lot of that probably happens when you’re not looking at the phone, so they need the keys in memory.

    It’s not a question of customers not being the priority. The problem is weighing up inconvenience against security. I imagine they could fix a lot of problems by dumping the ML and removing the lightning port. 
  • Reply 12 of 28
    blastdoor Posts: 3,276 member
    Three things seem in play here from Apple’s POV:
    1. Apple execs have long claimed that they comply with the law in every country where they operate. (Kind of a ‘no duh’ — they have to do that)
    2. But they have also said they are an American company that values privacy/security, and they use the freedoms in the US to argue for the rights of their users.
    3. They want to provide ML-enabled features, with as much work done on-device as possible to safeguard privacy.
    The big question is whether #1 is in play here or not. If it’s not, then this issue involves trade-offs between #2 and #3, and that’s in Apple’s control. They can choose to strike a different balance between privacy and other features.

    But if #1 is in play, then things are different. And of course different countries have different laws. I wonder if iPhones sold in different countries will end up with different degrees of privacy/security...
  • Reply 13 of 28
    I wouldn't be surprised in the least if Tim Cook and company passively make it easier for governments like China, the US, the EU, etc., to gain access to private phone data. However, I’m mystified that this MIT professor is apparently unable to demonstrate the theory he believes all these alphabet-soup government agencies have already put into practice. We’re supposed to believe that governments can readily break the encryption, but the MIT professor and every non-government security expert in the entire world are either incapable or dupes for Apple. Instead of the ACLU suing Apple for PR, how about just publicly replicating what the governments can allegedly already do.
  • Reply 14 of 28
    zimmie Posts: 651 member
    dewme said:
    elijahg said:
    I wonder if this is intentional, so Apple can keep telling its users their data is encrypted, which it is, but also turn a blind eye to the hacks law enforcement uses to dump the phone's contents. That way they don't get forced to put in an explicit backdoor, because there is a workaround. Either that, or Apple has been secretly forced to allow access, and these encryption workarounds give the illusion of privacy and non-compliance with law enforcement bigwigs even though Apple is actually bending, with this being the best way they've got to keep the agreement secret.
    I think you are actually pointing in the right direction. Apple isn't stupid, and to believe that they are somehow being repeatedly "duped" by US and Israeli security experts despite their proclamations of providing "total security and privacy" for their customers is a little bit more than a stretch or the ultimate "oops." There is probably a game of Chicken going on between Apple and government agencies like the NSA. Apple knows that it could lock down their stuff in ways that would make life miserable for the NSA. At the same time Apple also knows if they actually did this, all pretense of civility and of private industry operating independently without the heavy hand of government slapping them down would vanish. No matter how you want to spin this, there is no way that Apple (or any other private company) would come out as the "winner" in this struggle. The winners and losers in such a conflict are predetermined, so we'll all get to witness these little theatrical performances for as long as it takes to avoid or at least delay the inevitable outcome.
    The contents of messages actually have very limited intelligence value. The NSA and FBI mostly care about communication metadata - specifically, the sender and recipient of messages. Tor is the closest you can get to concealing endpoint data, and breaking that anonymization only takes controlling about 30% of the nodes in the Tor network. It fundamentally isn't possible for Apple to make the FBI's or NSA's lives significantly harder.

    About 95% of the FBI's claimed outrage about device encryption is performative. They are attempting to portray themselves as the good guys to win sympathy from the general public and distract from the creepy things they actually care about. The value of an unlocked phone is mostly that it can be used to correlate the message endpoints which belong to one person.

    Some data stored locally on the device can be useful in certain criminal proceedings (like access to the photos can prove possession of CSAM). Individuals' crimes only rarely catch the attention of the FBI and NSA. They mostly care about groups: terrorist cells, people distributing CSAM, that sort of thing. The photos, notes, messages, and so on stored on a phone are far less useful for that than the communication metadata.
  • Reply 15 of 28
    Fascinating... His 25-tweet thread is really interesting. It sounds like it comes down to the OS using the weaker encryption option on most of a device's relevant content in order to allow the software to do things in the background while your phone is locked -- using the decryption key stored in memory, which attackers have access to:

    Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things. When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff.
    This is super fascinating.  This is the kind of news I enjoy!  And it makes complete sense.  Looking forward to hearing more detail after the holidays.

    Gaby said:
    I’d be interested to know whether, when you manually lock the phone down with a long press on sleep/wake + volume (which locks out biometrics and necessitates passcode re-entry), it is considered BFU or AFU. Technically it is AFU, but from what I remember of Apple execs discussing it, that is supposed to lock down the phone. In which case it would still be feasible to lock people out without a power down. Hmmm....
    I’d be curious if doing this operation would work around this as well.  It would also be nice to know if we disable the convenience of the background app refresh, if the phone could be better protected.  Given the above details, it would make logical sense, but I also suppose there are Siri requests that require access to encrypted data.  I would much prefer to get a notification in my Lock Screen saying something along the lines of: process x needs you to unlock your phone to complete.
  • Reply 16 of 28
    vmarks Posts: 762 editor
    Gaby said:
    I’d be interested to know whether, when you manually lock the phone down with a long press on sleep/wake + volume (which locks out biometrics and necessitates passcode re-entry), it is considered BFU or AFU. Technically it is AFU, but from what I remember of Apple execs discussing it, that is supposed to lock down the phone. In which case it would still be feasible to lock people out without a power down. Hmmm....

    Emergency SOS mode is not the same as BFU.

    Notable differences are that in true BFU, the iPhone does not connect to Wi-Fi. With Emergency SOS mode, Wi-Fi remains connected.

    We have no idea how many keys get discarded when Emergency SOS mode is activated. What we do know is, it disables USB data, so any vulnerability is going to rely on a network attack (sending an iMessage exploit, for example.)
  • Reply 17 of 28
    markbyrn said:
    I wouldn't be surprised in the least if Tim Cook and company passively make it easier for governments like China, the US, the EU, etc., to gain access to private phone data. However, I’m mystified that this MIT professor is apparently unable to demonstrate the theory he believes all these alphabet-soup government agencies have already put into practice. We’re supposed to believe that governments can readily break the encryption, but the MIT professor and every non-government security expert in the entire world are either incapable or dupes for Apple. Instead of the ACLU suing Apple for PR, how about just publicly replicating what the governments can allegedly already do.
    I agree with this. Don’t submit a theory like he’s doing when you can go a step further and actually prove it. I also don’t think Apple’s stupid: they know the ins and outs of their products, and they know what types of data their encryption protects and what it doesn’t.

    @zimmie suggested the data they can capture isn’t of much value to law enforcement. So maybe the professor can tackle that next (if he has any forensic law background). If Joe decides to send a text to someone to sell some coke, how valuable is that information if a) it’s not explicit, and b) no names are mentioned? My guess is that the LEO’s job is that much harder. The most he can allege is that Joe talked about “snow” to someone. There’s no implication there, just inference. And that’s not enough for a court to convict.
  • Reply 18 of 28
    maltz Posts: 453 member
    Wait, is this new?  I've always known (assumed?) that the phone is in a less secure state after first unlock.  However, I did assume that the SOS mode lock put the phone into BFU state - it appears it does not, although with USB disabled, it might be sort of in-between BFU and AFU from a security perspective.  Nor, presumably, is it in BFU state when it asks for the passcode to re-enable Face/Touch ID every week, though I don't know whether USB is enabled or not in that scenario.
  • Reply 19 of 28
    I'm not going to pretend to understand all of this, but I believe that these issues will mostly all be moot once the iPhone, inevitably, goes portless.
  • Reply 20 of 28
    maltz Posts: 453 member
    igorsky said:
    I'm not going to pretend to understand all of this, but I believe that these issues will mostly all be moot once the iPhone, inevitably, goes portless.

    Yes, I'm so looking forward to my next phone rendering useless my CarPlay, ALL my charging adapters, and my portable battery pack, and reducing my charging speed by two-thirds. (Yes, I'm aware that I can spend a few hundred to restore SOME of that.)

    I really hope that never happens - but it's totally something I could see Apple doing.