Apple confirms iOS 12's 'USB Restricted Mode' will thwart police, criminal access
Apple in a statement to AppleInsider on Wednesday said iOS 12's incarnation of "USB Restricted Mode" will thwart criminals and enhance users' privacy.
In regards to law enforcement, the feature was created to protect iPhone owners in countries where the police seize phones at will. The move is aimed at regions with fewer legal protections than the U.S.
"At Apple, we put the customer at the center of everything we design. We're constantly strengthening the security protections in every Apple product to help customers defend against hackers, identity thieves and intrusions into their personal data," Apple said. "We have the greatest respect for law enforcement, and we don't design our security improvements to frustrate their efforts to do their jobs."
Apple decided to make improvements to iOS security after learning of iPhone cracking techniques being used by both criminals and law enforcement agencies. In particular, the company opted to take the USB stack out of the equation, a move that provides enhanced protection without serious detriments to the user experience.
With USB Restricted Mode, those attempting to gain unwarranted access to an iPhone will have an hour or less to reach a cracking device before being locked out.
Under the iOS 12 beta, data access through a Lightning port is cut off if a device hasn't been unlocked in the last hour. That's even tougher than Apple's initial beta versions of USB Restricted Mode, which simply required accessories to be connected to an unlocked device -- or a device to be unlocked with an accessory attached -- at least once per week.
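The one-hour policy described above can be pictured as a simple timeout check. The sketch below is a conceptual model only, not Apple's implementation (which enforces the cutoff deep in the USB stack); the function name and the fixed timestamps are illustrative assumptions.

```python
import datetime as dt

# iOS 12 beta policy: USB data access is cut off one hour after the last unlock.
USB_LOCKOUT = dt.timedelta(hours=1)

def usb_data_allowed(last_unlock: dt.datetime, now: dt.datetime) -> bool:
    """Return True if Lightning data access would still be permitted.

    Conceptual model of the iOS 12 beta behavior: access is allowed only
    while less than an hour has elapsed since the device was last unlocked.
    """
    return now - last_unlock < USB_LOCKOUT

# Illustrative timestamps (hypothetical):
now = dt.datetime(2018, 6, 13, 12, 0)
print(usb_data_allowed(now - dt.timedelta(minutes=30), now))  # True: within the hour
print(usb_data_allowed(now - dt.timedelta(hours=2), now))     # False: locked out
```

Under this model, a forensics device attached 30 minutes after the last unlock could still negotiate a data connection, while one attached two hours later would see charging only.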
The new policy seems bent on disrupting the hacking techniques of digital forensics firms like Cellebrite and GrayShift. Cellebrite in particular is believed to be the firm the FBI used to crack the iPhone 5c of San Bernardino shooter Syed Rizwan Farook, allowing both Apple and the U.S. Department of Justice to avoid a protracted battle over whether the former could be forced to code a backdoor into iOS.
A number of officials in U.S. spy and law enforcement agencies have complained that internet communications are "going dark," thanks to the growing use of end-to-end encryption, which prevents even the companies implementing it from intercepting data. Some politicians have aligned behind mandating backdoors, though nothing has come of those efforts.
Apple and other encryption supporters have countered, saying privacy is a right and that any backdoor is bound to be discovered by malicious criminals and governments. Some critics may include the U.S. in the latter category, given mass surveillance efforts by the FBI and NSA.
Updated with comments from Apple
Comments
Believe it or not Apple has been solving usability challenges for decades and knows what it’s doing. I doubt very much that your scenario is reality.
I wish there was a way to allow legitimate law enforcement access while at the same time maintaining security. The San Bernardino shooting is a prime example - the police had a clear, legitimate and legal need to access the contents of the phone. Unfortunately, back doors and other techniques have virtually universally been hacked and abused, as others have pointed out. Even GrayShift made no attempt to ensure that their device would be limited to legitimate uses.
However, it's a Good Thing that Apple is doing this; the agencies should have (be given) different and better methods. Interestingly, the ex-head of GCHQ (UK's equivalent of the NSA) is on record as opposing backdoors: he says they're technically difficult and open to abuse (https://www.theregister.co.uk/2017/07/10/former_gchq_wades_into_encryption_debate/). Which matches what has been said here in the past.
BTW The original story is here (but it may not be accessible outside the UK): https://www.bbc.co.uk/news/av/technology-40554686/end-to-end-encryption-back-door-a-bad-idea
Just because the ex-head of your GCHQ says backdoors are bad doesn't necessarily mean they haven't tried forcing them in, just like our NSA might be doing. Everyone watches too much TV but you have to wonder how many of the spying movies actually are based on some fact. We'll never know because I'm sure some are run without too much oversight. How much of the Bourne Identity is based on current "lawful" spying? I'd rather Apple continue to try their hardest to protect my data from criminals, advertisers, and governments who don't need to see my legal data.
Just saw this: "Microsoft has issued a Windows 10 security update to prevent hackers from breaking into PCs using Cortana." Was this an honest bug or someone forcing in a backdoor?