exceptionhandler

About
- Username: exceptionhandler
- Joined
- Visits: 966
- Last Active
- Roles: member
- Points: 344
- Badges: 0
- Posts: 381

Reactions
iOS 15 adoption rate appears to be lagging behind past updates
StrangeDays said:
exceptionhandler said:
It may “technically” be E2E, as long as you disregard the watchdog at your front door that is not yours. Even then, it will not be true E2E… remove the thresholds and the ability to report to someone outside the sender’s intended parties, and then it will be true E2E.

StrangeDays said:
elijahg said:
crowley said:
neverindoubt said:
“It's likely that the reduced adoption rates are because of a change in Apple's update stance. The company is no longer forcing users to install the latest operating system version to gain important security updates.”
99.9% of the reason.
“Apple's set of iOS 15 features -- including the controversial-but-delayed CSAM detection system -- may also be playing a part.”
0.1% of the reason.
If I had child rape photos and didn’t want them to report them to police, I’d disable iCloud Photos.
I’m much more interested in the E2E encryption I’ll get when this eventually rolls out. That offers me value. Worrying about child rape porn? Not a concern.
The only person who is unhappy in this scenario is somebody with child rape photos on their device. The greater good is clearly on the side of all the customers in the world who will enjoy E2E privacy on all their data, vs the child rape collectors unhappy that they will still get in trouble. Oh well.
Currently, CSAM child rape photo scanning takes place on the servers of Apple, Google, Microsoft, etc. If some nefarious government wants to make them look for Winnie the Pooh images, it can do that more easily server-side, and could force them to do it today. Nothing about on-device scanning makes that any easier; arguably, it makes it more difficult.
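The server-side mechanism described above can be sketched in a few lines of Python. This is a toy illustration only: the blocklist contents and function name are made up, and exact SHA-256 hashes are an assumption for simplicity — real systems (PhotoDNA, NeuralHash) use perceptual hashes so that resized or re-encoded copies of a known image still match.

```python
import hashlib

# Toy sketch of server-side known-image matching (illustrative only).
# Real scanners use perceptual hashes, not exact cryptographic hashes,
# so near-duplicates of a known image still match.
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if an uploaded image matches the known-image blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST

print(scan_upload(b"known-bad-image-bytes"))   # True
print(scan_upload(b"ordinary-holiday-photo"))  # False
```

Because the matching runs on the provider's servers, the provider (or anyone who can compel it) controls what goes into the blocklist — which is the point being made above.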
What to expect from the 'iPhone Fold'
Hank2.0 said:
fallenjt said:
Very unlikely. It’s a gimmick, prone to damage, and adds little or no benefit...
To view the screen, I’d have to pull it out and then flip it open. How many more gymnastics do you want one-handers to go through to be able to use their phones? Bigger screens are hard enough. Not to mention the mechanical wear and tear on the bend and supporting structures that a static chassis does not have. I’d also be curious how dropping the phone would affect the hinge.
If this comes to fruition, I guess I’ll be sticking with my 12 mini till it dies. And if it does, maybe upgrade to a 13 mini if it’s still available?
I wouldn’t mind a bigger screen, but not at the cost of my hands. Maybe I’ll accept it when there are holographic projected screens.
Apple wipes on-device CSAM photo monitoring from site, but plans unchanged
zimmie said:
We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
https://forums.appleinsider.com/discussion/comment/3346154/#Comment_3346154
https://forums.appleinsider.com/discussion/comment/3342666/#Comment_3342666

Many things require access to the data in order to work, but as long as the data remains on device? ¯\_(ツ)_/¯ I don’t care how much scanning is done; if it makes my life easier, great. At some point, the software has to decrypt the data; otherwise we would only see a jumbled mess on our screens. This is also part of why Apple has added neural processing cores to its chips: to enable fast, more complex, and more inherently secure on-device AI that does neat/useful things without needing to send data off to a server (which is how Siri works for a great many things, though with iOS 15 some of that has changed).
Apple wipes on-device CSAM photo monitoring from site, but plans unchanged
zimmie said:
exceptionhandler said:
zimmie said:
exceptionhandler said:
badmonk said:
I suspect CSAM screening will ultimately be performed and confined to the iCloud server side, like every other cloud-based service has been doing for years (without talking about it).
I always thought iCloud screened for CSAM; after all, MSFT and Google have been doing it for years.
Freedom has its limits.
I see no reason for the move. As some people have previously stated, “maybe Apple is going to E2E encrypt iCloud photos”. The rub is that it would not be E2E encrypted either way: scanning and reporting necessitate access to the data. E2E encryption is only E2E encryption if and only if there is no process that circumvents it (including at either end) to send the data to someone outside the recipients intended by the sender. This fact alone means that iCloud Photos will never be E2E encrypted as long as Apple needs to do CSAM scanning.
So, all things stated, I’m fine with the current state of server-side scanning: it’s not on my device, and the scanning and reporting apply only if you use the service. (Some may argue that’s also how it would work on-device, but on-device behavior is subject to change, whereas on the server they can’t scan more than what’s sent to iCloud.)
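The E2E property argued over here can be made concrete with a toy sketch in Python. The cipher below is a made-up illustration (a SHA-256-derived XOR keystream), not a secure or real implementation: the point is only that when sender and recipient alone hold the key, the server stores ciphertext it cannot read — and any hook at either end that reports plaintext (or data derived from it) to a third party breaks that property, whatever the cipher.

```python
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256-derived keystream.
    # Purely illustrative -- not secure, not real iCloud crypto.
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR with the same keystream is its own inverse

# Only the sender and recipient hold the key; the server never does.
shared_key = b"sender-and-recipient-only"
ciphertext = toy_encrypt(shared_key, b"my private photo")

# The server stores only ciphertext: that is the end-to-end property.
assert toy_decrypt(shared_key, ciphertext) == b"my private photo"
```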
To protect against one of those partial directions being used as evidence of possession, they also set the system up to emit fake partial directions which are indistinguishable from real ones until after you have uploaded enough real ones.
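The threshold mechanism described above can be illustrated with classic (t, n) Shamir secret sharing — a sketch of the general technique only, not Apple's actual voucher scheme: any t or more genuine shares recover the secret, while fewer than t reveal nothing about it, which is what makes sub-threshold shares (real or fake) individually meaningless.

```python
import random

# Toy (t, n) Shamir secret sharing over a prime field. This sketches
# the general threshold technique, not Apple's actual voucher scheme.
P = 2**61 - 1  # prime modulus; the secret must be smaller than P

def make_shares(secret, t, n):
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term,
    # provided at least t genuine shares are supplied.
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=42, t=3, n=5)
print(recover(shares[:3]))   # any 3 of the 5 shares recover the secret: 42
print(recover(shares[:2]))   # below threshold: a value unrelated to the secret
```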
The clear solution is to spin all of the iCloud-related functionality out of Photos and into a separate application. CSAM scanning then goes in that application. If you want to upload stuff to iCloud, you need that application, which has the CSAM scanning. If you don't want the CSAM scanning, don't load the application (or remove it if it comes preinstalled). Done. Addresses everybody's concerns.
Right now, Apple employees can view photos you store in iCloud. Presumably they are prohibited from doing this by policy, but they have the technical capability. With the endpoint CSAM scanning as explained in the technical papers Apple presented, they would no longer have that capability. That's because the endpoint CSAM scanning intrinsically involves end-to-end encryption.
We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
Note: I made a typo above: “reshape” should have been “re-shared” (in “but you still have to trust those involved not to reshape it”). And yes, I’m well aware encryption keys need to be safely stored. Now I ask you this: once data is encrypted to be sent, how does one decrypt it? With the corresponding decryption key (which is, of course, different from the encryption key, but the two come as a pair). Apple has used a threshold algorithm to enable review of the contents… remove that, and you will have true E2E encryption; otherwise it’s just “wish it were E2E”. It’s close, but no cigar.
Once a threshold is met, what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities to handle the data correctly? Do we trust individuals at these places not to share it or leak it? Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust that a similar threshold algorithm won’t be applied to text in iMessage, looking for messages of terrorism or “hate” speech? Do we trust that it will only be done for photos uploaded to iCloud?
I, for one, am fine with my images not being E2E encrypted in iCloud, as I consider it public space anyway and act accordingly. I would expect Apple employs encryption for data at rest, with a department (or departments) having access to the keys. So which would you prefer: “wish it were E2E”, or a select few with access to the data at rest? Six one way, half a dozen the other (though one, in my opinion, has far worse implications for the future)… both ways still involve trust, however. ¯\_(ツ)_/¯