Who knows, but one thing is true: Apple can only go so far with security before it starts to affect usability and the customer experience. If you lock something down too tight, it becomes too difficult to use.
Apple should worry about poking the bear. You never know when ill will can cause problems or turn into a media disaster.
What ill will? Protecting your customers isn’t ill will.
Who knows, but one thing is true: Apple can only go so far with security before it starts to affect usability and the customer experience. If you lock something down too tight, it becomes too difficult to use.
Apple should worry about poking the bear. You never know when ill will can cause problems or turn into a media disaster.
Apple made it so you can turn off this security feature.
Of course, sooner or later, Apple will simply remove the Lightning port, particularly with Wi-Fi becoming so fast and magnetic charging becoming a reality.
Exactly how the company managed to defeat the USB lockdown is unclear. Further details of the supposed workaround are unavailable, though a second person responding to the original email noted Grayshift "addressed" USB Restricted Mode in a recent webinar. Whether that session outlined a successful exploit is also unclear.
Other digital forensics firms are working on similar workarounds. ElcomSoft in May suggested it might be possible to extend USB Restricted Mode's window beyond the hour-long restriction by connecting an iPhone to a paired accessory or computer while it is unlocked. The company added that dedicated hardware could potentially disable the feature completely.
USB Restricted Mode is supposed to enforce the hour-long restriction even if connected to an accessory or computer when it is released.
I continue to wonder why companies like GreyKey aren’t treated like hackers and arrested. Unless the NSA is fronting them, they are common criminals not a police force and should be treated as such.
I continue to wonder why companies like GreyKey aren’t treated like hackers and arrested. Unless the NSA is fronting them, they are common criminals not a police force and should be treated as such.
Based on what crime?
There's no reasonable answer that can be had when the OP states that they need to be "treated like hackers and arrested" as if finding a chink in one's armour is inherently a criminal offense.
Well, failing to claim they had a workaround for the iPhone's new security would affect their cash flow. Their product is very expensive. Who would buy it if it didn't work against Apple's latest and greatest software and/or hardware? My guess is they don't yet have a workaround. If they can't continue to sell their product, they have no company. 🚀
Who knows, but one thing is true: Apple can only go so far with security before it starts to affect usability and the customer experience. If you lock something down too tight, it becomes too difficult to use.
While security traditionally comes at the cost of user convenience, that's not always the case, which is something Apple has proven with its biometrics. I see no reason why Apple can't advance its ML to know that a device isn't being unlocked via USB under normal conditions, for example: 1) the attempt isn't happening in a normal location (e.g., a geo-fenced location and/or a Wi-Fi network á la your home), which could mean requiring the passcode immediately even from a known computer; 2) the gyro and accelerometer aren't moving enough (e.g., the phone is lying flat on a table while being unlocked repeatedly); and/or 3) the characters to unlock the device are being entered with precise, digital timing, like a machine (because it is one), instead of like a human moving their fingers, whose presses take a variable amount of time depending on character distance and personal typing pattern (see Google's reCAPTCHA for a simplified example of how detecting whether a human is involved might work).
And those are three things off the top of my head. Add in the potential for new Apple silicon that will act as an extra layer of security, keeping track of these actions even when the core system is reset, in a way that keeps GreyKey's reset mechanism from wiping the device, and I think Apple can make things increasingly harder for hackers without causing the user any additional effort.
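To make the third signal concrete, here is a minimal sketch of timing-based detection. The function name, the timestamps, and the jitter threshold are all hypothetical; the point is only that a brute-force rig enters digits at near-constant intervals while human gaps vary.

```python
# Hypothetical sketch: flag machine-like passcode entry by looking at
# inter-keystroke timing. Human typing shows uneven gaps between key
# presses; a scripted rig tends to be metronomic.
from statistics import pstdev

def looks_robotic(timestamps, min_jitter=0.015):
    """timestamps: seconds at which each key press was registered."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) < 2:
        return False
    # Near-zero spread across the gaps suggests scripted input.
    return pstdev(gaps) < min_jitter

human = [0.00, 0.21, 0.55, 0.73, 1.02, 1.18]   # uneven, human-like
rig   = [0.00, 0.10, 0.20, 0.30, 0.40, 0.50]   # metronomic
print(looks_robotic(human), looks_robotic(rig))
```

A real implementation would of course need tuning against actual typing data; this only illustrates the shape of the signal.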
PS: Regardless, I'd use Apple's full keyboard instead of just a 6-digit PIN to access my device. With the American English keyboard you have nearly 2 billion combinations with just 4 characters if you employ the special characters (á la long press on a key). If and when emoji are ever allowed, the palette opens up to around a base-1000 system, and it may even be easier for people to remember, since ideograms can be more relatable to an individual than individual characters.
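A quick back-of-envelope check of those combination counts. The ~215 reachable characters (with long-press variants) is my assumption, not an Apple-published figure:

```python
# Back-of-envelope check of the passcode-space claims above.
# `chars = 215` is an assumed alphabet size, not an official number.
import math

chars = 215          # assumed iOS keyboard alphabet incl. long-press variants
emoji = 1000         # rough size of an emoji "base-1000" palette

print(f"4 chars:  {chars**4:,}")    # ≈ 2.1 billion
print(f"6 digits: {10**6:,}")       # the default numeric PIN space
print(f"4 emoji:  {emoji**4:,}")    # base-1000 with 4 symbols
print(f"entropy of 4 chars: {4 * math.log2(chars):.1f} bits")
```

So even under this rough assumption, 4 full-keyboard characters beat a 6-digit PIN by three orders of magnitude.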
Who knows, but one thing is true: Apple can only go so far with security before it starts to affect usability and the customer experience. If you lock something down too tight, it becomes too difficult to use.
Apple should worry about poking the bear. You never know when ill will can cause problems or turn into a media disaster.
Apple made it so you can turn off this security feature.
Of course, sooner or later, Apple will simply remove the Lightning port, particularly with Wi-Fi becoming so fast and magnetic charging becoming a reality.
Who knows, but one thing is true: Apple can only go so far with security before it starts to affect usability and the customer experience. If you lock something down too tight, it becomes too difficult to use.
Apple should worry about poking the bear. You never know when ill will can cause problems or turn into a media disaster.
Apple IS the bear.
Hackers should worry about harassing them.
I certainly am not happy that these sneaky peeps are trying to invent ways to access my private data.
I guess the easiest way to defeat the time-based lockout would be to somehow pause, rewind, or reset the clock on the phone, by stopping it from connecting to a time server, or by fiddling with the time server to adjust the time. I'm not suggesting it's actually "easy," but it's probably the path of least resistance.
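That attack only works if the lockout timer trusts wall-clock time. A sketch of the obvious defense, keying the timer to a monotonic counter instead (Apple's real implementation lives in the Secure Enclave; the class below is purely illustrative):

```python
# Illustrative sketch: a lockout timer keyed to a monotonic counter
# cannot be rewound by fiddling with the system clock or time server.
# This class is hypothetical, not how iOS actually implements it.
import time

class Lockout:
    def __init__(self, window_secs=3600):
        self.window = window_secs
        self.armed_at = time.monotonic()  # immune to clock changes

    def usb_data_allowed(self):
        # time.time() could be rewound by changing the clock;
        # time.monotonic() only ever moves forward.
        return (time.monotonic() - self.armed_at) < self.window

lock = Lockout(window_secs=3600)
print(lock.usb_data_allowed())  # True immediately after arming
```

Resetting the phone's date, or blocking NTP, changes `time.time()` but not `time.monotonic()`, so the hour still elapses on schedule.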
This is exactly why Grayshift and others charge so much for their tools: they won't sell many, since the functionality is a moving target entirely reliant on their engineers outsmarting Apple's engineers. No system will be entirely secure, but iOS is getting quite close, and that Secure Enclave is quite impressive.
I would prefer they just always ask for the passcode when connecting to USB if you are going to do anything beyond charging. But if it's required even for charging, meh, I can handle it for the peace of mind. It's not the police I fear; it's future abuse by government. They are already doing too much, and privacy is nonexistent.
I continue to wonder why companies like GreyKey aren’t treated like hackers and arrested. Unless the NSA is fronting them, they are common criminals not a police force and should be treated as such.
0) You say "hackers" like it's a bad thing. Hacking is not illegal; it's part of the innovation cycle. Eli Whitney was a hacker, Tesla was a hacker, Woz was a hacker. 1) Because once Grayshift buys (and/or legally obtains) the phone, it's their phone. You can't be arrested for probing something you own. Apple could sue them for violating a EULA, but that is sketchy space. 2) "Common criminals"... you realize you are treading on the 1st, 2nd, 4th, and probably the 9th Amendments (look up Phil Zimmermann and the history of PGP encryption being classified as "munitions").
I continue to wonder why companies like GreyKey aren’t treated like hackers and arrested. Unless the NSA is fronting them, they are common criminals not a police force and should be treated as such.
What law did they break?
Arguably the DMCA. They exist to circumvent technological protections preventing copying of data which is not authorized by the rights holder. The DMCA makes no exception for scholarly work, so their products aren't allowed to be used in a research setting.
There's also the CFAA. Again, they exist entirely to provide access to password-protected computing systems, access that is not authorized by the owner of the system.
Of course, governments contract out work all the time. I would be very surprised if Grayshift were not somehow legally protected.
Who knows, but one thing is true: Apple can only go so far with security before it starts to affect usability and the customer experience. If you lock something down too tight, it becomes too difficult to use.
Apple should worry about poking the bear. You never know when ill will can cause problems or turn into a media disaster.
In the country of guns and locks and castle doctrine, I really cannot see increased end-user security being a "media disaster" amongst real denizens.
Talking heads and politicians will take sides, but Americans love their freedom from tyranny.
I continue to wonder why companies like GreyKey aren’t treated like hackers and arrested. Unless the NSA is fronting them, they are common criminals not a police force and should be treated as such.
The NSA is too busy collecting all of the internet transmissions to save them for future decryption.
I continue to wonder why companies like GreyKey aren’t treated like hackers and arrested. Unless the NSA is fronting them, they are common criminals not a police force and should be treated as such.
Based on what crime?
I actually wonder if they don't run afoul of some portion of the DMCA laws.
Go Apple!