Apple again rumored to goose iCloud security amid iPhone encryption flap
In a move sure to complicate the ongoing Apple vs. FBI court case, Apple is reportedly developing stronger iCloud encryption methods that would prevent even it from accessing and extracting user data protected by a passcode.
Racks of Apple's iCloud servers in Maiden, NC
Citing sources familiar with Apple's plans, The Wall Street Journal on Tuesday reported that while preparations for a more secure iCloud are underway, executives are still trying to find a workable balance between strong encryption and customer convenience.
Currently, iCloud can be configured to store daily device backups, messages, photos, notes and other data, much of which is accessible by Apple. But the purported plan is to encrypt that data and restrict access to holders of user-created passkeys. Apple's supposed encryption plans were first reported by the Financial Times in late February.
If Apple does enact stronger iCloud security measures, particularly those that would render warrants for data access moot, it could exacerbate an already tense situation. The company is embroiled in a heated court battle over the unlocking of an iPhone used by San Bernardino shooter Syed Rizwan Farook. Apple was compelled by a federal magistrate judge to help in FBI efforts to break into the device, but the company has so far resisted, sparking a contentious debate over privacy rights and national security.
Currently, iCloud backups are Apple's go-to, non-destructive option for law enforcement requests. And for agencies like the FBI, iCloud has quickly become the only way to access data as part of a criminal investigation.
Apple introduced strong on-device encryption with iOS 8, making it nearly impossible to extract usable intel from hardware running the latest OS. Certain information, however, is sent up to the cloud and can potentially be accessed by Apple on behalf of the government. That all ends if Apple puts encryption keys wholly in the hands of its customers.
A version of this all-or-nothing strategy is already up and running in iCloud Keychain. The feature lets users store especially sensitive data like passwords and credit card information that can be accessed remotely, synced and transferred to and from other devices. Apple is unable to decrypt and read the data, but at the same time it can't restore or retrieve the information if a user loses or forgets their password.
It remains unclear when Apple intends to implement the iCloud changes, if at all. However, given the company's intractable stance on strong encryption, consumers could see enhancements roll out sooner rather than later.
Comments
The governments of the world can go figure.
There will always be a way to show them the finger (even if Apple engineers don't do it).
If the nice customer service person can give you back your access, he or she can give the very same access to whoever does a good phishing job.
One way to phish-proof your account? On those security questions? LIE. And write the lies down. That way no amount of research can find the answer.
The biggest setback would be other countries not trusting Apple anymore.
The government cannot and should not win this.
Trust? They'd all be lining up to grab the very same spy tools from Apple and all the rest.
Problem is...
People choose bad passphrases, which can easily be brute-forced offline, since Apple will still hand over the (encrypted) data. People forget them. And if you forget, you lose everything.
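To make the brute-force point concrete, here's a minimal Swift/CryptoKit sketch of an offline wordlist attack against such a blob (everything here is hypothetical; the fast single-SHA-256 key derivation is the worst case, which is exactly why a slow KDF matters):

```swift
import CryptoKit
import Foundation

// Illustrative only: the attacker already holds the encrypted backup blob.
// With a fast key derivation (a single SHA-256 pass, the worst case),
// each guess costs microseconds; a slow KDF would make this impractical.
func candidateKey(for passphrase: String) -> SymmetricKey {
    SymmetricKey(data: SHA256.hash(data: Data(passphrase.utf8)))
}

func crack(blob: Data, wordlist: [String]) -> String? {
    guard let box = try? AES.GCM.SealedBox(combined: blob) else { return nil }
    for guess in wordlist {
        // AES-GCM's authentication tag check fails for every wrong key,
        // so a successful open() means the guess was the passphrase.
        if (try? AES.GCM.open(box, using: candidateKey(for: guess))) != nil {
            return guess
        }
    }
    return nil
}
```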
Of course, one could forget the password but still have their iPhone. Remake the password and resync. If you lose your device, though, you are screwed if you forgot your "key."
Might blow some people's minds here, but Google already does this with Chrome Sync and Mozilla with Firefox Sync. They can't read the browser history you sync with them if you pick a passphrase. Even if that passphrase is the same as your main account password, using it still doesn't grant them access.
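Roughly, the trick is domain separation: one password, two independent derived keys, and only the authentication key ever leaves the device. A hypothetical Swift/CryptoKit sketch (Firefox Sync's real protocol is more involved, and a real design would stretch the password with a slow KDF first):

```swift
import CryptoKit
import Foundation

// One user password, two independent keys via domain separation.
// Only authKey is ever sent to the server; encKey stays on-device,
// so the service can log you in yet still can't read your synced data.
let password = "hunter2"                          // hypothetical
let salt = Data("per-account-salt".utf8)          // hypothetical
let stretched = SymmetricKey(data: SHA256.hash(data: Data(password.utf8)))

let authKey = HKDF<SHA256>.deriveKey(inputKeyMaterial: stretched, salt: salt,
                                     info: Data("auth".utf8), outputByteCount: 32)
let encKey  = HKDF<SHA256>.deriveKey(inputKeyMaterial: stretched, salt: salt,
                                     info: Data("encryption".utf8), outputByteCount: 32)
// Knowing authKey (which the server effectively does) reveals
// nothing about encKey, so the same password doesn't grant access.
```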
Since you have to have access to the phone to know where the files are, that really ups the ante on recovery :-).
Obviously, if you lose your phone you're really, really screwed, unless you can back up the recovery directory in a locally encrypted cache with a long passcode (hopefully not the same as your phone's).
No one knows where the files are but you :-).
LastPass says in big letters "Choose a long master password. Do not lose your password. We do not know your password. We cannot give it to you if you forget it..."
I'm sure Apple could do something similar.
But yeah... the burden is still on the users to be responsible... and that doesn't always happen.
Why? Let's say Apple gives you the option of using a pass phrase that isn't stored anywhere on Apple's servers or even on your iPhone (well, something IS stored on your iPhone, but only a hash, not the original pass phrase).
Whenever your iPhone connects to iCloud to back up data, the data first gets encrypted locally on your iPhone (using your pass phrase) and then gets sent for backup. The iCloud servers would be storing already-encrypted data. If someone got hold of this information it would be useless, as they don't have the pass phrase to decrypt it (they'd have to brute-force it, which could take a long time depending on the pass phrase you chose).
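In Swift/CryptoKit terms, that flow could look something like this (an illustrative sketch only: the pass phrase, salt, and labels are made up, and a shipping design would stretch the pass phrase with a slow KDF such as PBKDF2 or scrypt rather than a bare hash plus HKDF):

```swift
import CryptoKit
import Foundation

// Derive the backup key on-device from the user's pass phrase.
let passphrase = "correct horse battery staple"       // hypothetical
let salt = Data("per-user-random-salt".utf8)          // hypothetical
let ikm = SymmetricKey(data: SHA256.hash(data: Data(passphrase.utf8)))
let backupKey = HKDF<SHA256>.deriveKey(inputKeyMaterial: ikm,
                                       salt: salt,
                                       info: Data("icloud-backup-v1".utf8),
                                       outputByteCount: 32)

// Encrypt locally, then upload only ciphertext: the server never sees
// the key, so a warrant served on the server yields random-looking bytes.
let backup = Data("...device backup bytes...".utf8)   // hypothetical
let sealed = try AES.GCM.seal(backup, using: backupKey)
let uploadBlob = sealed.combined!   // nonce + ciphertext + auth tag
```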
Apple could even take this further. The "username" the data gets stored under wouldn't be your actual Apple ID, but a token derived by hashing your Apple ID together with your iCloud pass phrase. So even Apple employees themselves wouldn't be able to link a set of iCloud data to a specific user or device (as they presumably can now, because there's an identifier tied to your Apple ID and device).
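And that unlinkable "username" could be built the same way (again purely hypothetical; the labels and Apple ID are placeholders):

```swift
import CryptoKit
import Foundation

// Continuing the sketch above: derive a second key, separate from the
// backup key, used only for the storage token.
let passphrase = "correct horse battery staple"       // hypothetical
let salt = Data("per-user-random-salt".utf8)          // hypothetical
let ikm = SymmetricKey(data: SHA256.hash(data: Data(passphrase.utf8)))
let tokenKey = HKDF<SHA256>.deriveKey(inputKeyMaterial: ikm,
                                      salt: salt,
                                      info: Data("storage-token-v1".utf8),
                                      outputByteCount: 32)

// HMAC the Apple ID under that key; the server files backups under this
// opaque token and can't map it back to a user without the pass phrase.
let appleID = Data("user@example.com".utf8)           // hypothetical
let token = HMAC<SHA256>.authenticationCode(for: appleID, using: tokenKey)
let storageID = token.map { String(format: "%02x", $0) }.joined()
```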
Hell, let's forget pass phrases altogether. We know that Apple takes your fingerprint and stores a mathematical representation of it in the Secure Enclave, in a format that can't be reversed back to the original print. What makes it so impressive is that you can use a PORTION of your finger at different angles or orientations on the sensor and Touch ID still somehow manages to match it against that stored representation. So why not use your fingerprint as the pass phrase that encrypts your iCloud data? You can't forget it, and because of the complexity of a fingerprint you'd have the equivalent of a very long pass phrase.
I don't think security rides so much on WHERE the servers are, but on HOW the data is stored. They could be treated as "dumb" servers that simply store information sent to them, without actually knowing anything about the data itself or who it belongs to.
To prevent the iPhone (that the County issued to the shooter) from backing up to iCloud.
So they could cherry-pick this highly emotional case to use as leverage against iPhone security and privacy.
Blatantly obvious. Shameful. And they think they can get away with it.
PS: Of course, this slicing is done through Tor or something like that to up the level of crazy even more :-).
Unless you remove your fingerprint (acid?) or die and get brutally mutilated, the government will get access to your data.
To my mind, the only passkey that can give some security has to exist in your mind only. As soon as you rely on a physical object (your finger), you are hackable by organizations with enough money to spend.
How would Apple protect 500 million iPhone users' privacy... while simultaneously providing law enforcement access to 50,000 terrorists' iPhones?
Imagine what sort of information you could find on people's phones these days: their home address, pictures of their children, their children's school, schedules, emails, access to door locks, garage door openers, health data, etc. Do you really want that stuff to be easily accessible to any common criminal?
I certainly don't. It should be as secure as it can possibly be.
But by keeping that information secure... it also prevents law enforcement from getting into criminals' phones too.
There's no way to selectively make some phones secure while making other phones easy to open.
So what do you think Apple should do?