exceptionhandler

About

Username: exceptionhandler
Visits: 966
Roles: member
Points: 344
Badges: 0
Posts: 381
  • iOS 15 adoption rate appears to be lagging behind past updates

    elijahg said:
    crowley said:
    “ It's likely that the reduced adoption rates are because of a change in Apple's update stance. The company is no longer forcing users to install the latest operating system version to gain important security updates.”

    99.9% of the reason.

    “Apple's set of iOS 15 features -- including the controversial-but-delayed CSAM detection system -- may also be playing a part.”

    0.1% of the reason.
    Agree. Why would a feature that most people don't know or care about, and that was pulled, be playing a part?
    Because people don't trust that Apple isn't doing it anyway, due to the entirely out-of-the-blue way they announced it? That said, I was on Big Sur until about a week ago. There was nothing compelling enough for me to spend days fixing things after they break as Apple deprecates random core bits of the OS.
    Why on earth would I be worried about CSAM fingerprint scanning for child rape images in iCloud Photos, even if they turned it on tomorrow? There is no reason to worry about this, since Apple already does it server-side on their iCloud servers, as do Google, Microsoft, Dropbox, etc.

    If I had child rape photos and didn’t want Apple to report them to the police, I’d disable iCloud Photos. 

    I’m much more interested in the E2E encryption I’ll get when this eventually rolls out. That offers me value. Worrying about child rape porn? Not a concern.

    It may “technically” be E2E, as long as you disregard the watchdog that is not yours at your front door.  Even then, it will not be true E2E… remove the thresholds and the ability to report to someone outside of the sender’s intended parties, and then it will be true E2E.
    Incorrect. Sorry but you're speaking from ignorance here. Deploying 100% E2E is exactly why Apple designed on-device child rape CSAM scanning. It scans on the device because once it gets synced to iCloud, they can't scan it there because...encrypted. 

    The only person who is unhappy in this scenario is somebody with child rape photos on their device. The greater good is clearly on the side of all the customers in the world who will enjoy E2E privacy on all their data, vs the child rape collectors unhappy that they will still get in trouble. Oh well.

    Currently, CSAM child rape photo scanning takes place on the servers of Apple, Google, Microsoft, etc. If some nefarious government wants to make them look for Winnie the Pooh images, they can do that more easily server-side, and could force them to do it today. Nothing about on-device scanning makes it any easier; it's arguably more difficult.
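    (As a toy illustration of the server-side fingerprint matching described above: a minimal Python sketch, not Apple's or Google's actual pipeline. Real systems use perceptual hashes such as PhotoDNA or NeuralHash rather than SHA-256, precisely so that resized or re-encoded copies still match; the "known bad" set here is hypothetical.)

        import hashlib

        def fingerprint(photo_bytes: bytes) -> str:
            # Toy stand-in for a perceptual hash: only exact copies match.
            return hashlib.sha256(photo_bytes).hexdigest()

        # Hypothetical database of fingerprints of known flagged images
        known_bad = {fingerprint(b"bytes of a known flagged image")}

        def should_report(photo_bytes: bytes) -> bool:
            # Server-side check: runs only on photos uploaded to the service
            return fingerprint(photo_bytes) in known_bad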
    What good is E2E if someone is watching everything going in and reporting to their boss?
    williamlondon elijahg
  • What to expect from the 'iPhone Fold'

    Hank2.0 said:
    fallenjt said:
    Very unlikely. It’s a gimmick and prone to damage but adds zero or little benefit...
    The benefit follows the principle of greater functionality with a smaller size for transporting. Isn't that the basic idea of a cell phone in the first place?
    It’s not “a smaller size” as you describe.  Folding just changes its shape.  Instead of a thin and wide device with a specific weight, it could fold into a thicker and less wide device with the same weight.  I fail to see how it’s smaller; it’s just reconfigured.  Instead of wallet bulge, this is going to be phone bulge with the thickness.

    To view the screen, I’d have to pull it out and then flip it open.  How many more gymnastics do you want one-handers to go through to be able to use their phones… bigger screens are hard enough.  Not to mention the mechanical wear and tear on the bend and supporting structures that a static chassis does not have.  I’d also be curious how dropping the phone could affect the hinge.

    If this comes to fruition, I guess I’ll be sticking with my 12 mini till it dies.  And if it does, maybe upgrade to a 13 mini if it’s still available?

    I wouldn’t mind a bigger screen, but not at the cost of my hands.  Maybe I’ll accept it when there are holographic projected screens.
    williamlondon 12Strangers watto_cobra baconstang FileMakerFeller
  • Apple wipes on-device CSAM photo monitoring from site, but plans unchanged

    zimmie said:

    We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
    I have also said in the past that I am not opposed to scanning; it is a very useful tool for many purposes.  It’s the scanning and then reporting that data to another party that’s the issue.  


    Many things require access to the data in order to work, but as long as the data remains on device? ¯\_(ツ)_/¯  I don’t care how much scanning is done; if it makes my life easier, great.  At some point, the software has to decrypt the data, otherwise we would only see a jumbled mess on our screens.  This is also part of why Apple has added the neural processing cores to its chips: to enable fast, more complex on-device (and more inherently secure) AI to do neat/useful things without the need to send it off to a server (how Siri works for a great many things, though with iOS 15, some of that has changed).
    muthuk_vanalingam
  • Apple wipes on-device CSAM photo monitoring from site, but plans unchanged

    zimmie said:
    zimmie said:
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined to the iCloud server side, like every other cloud-based service has been doing for years (without talking about it).

    I always thought iCloud screened for CSAM; after all, MSFT and Google have been doing it for years.

    Freedom has its limits.
    I think they’ve always been scanning iCloud photos and reporting on the server side.  I think someone previously linked to a document that states so.  So the claims about protecting peds are unwarranted, because scanning will still be done on the server side.  What Apple was trying to do was impose/move their house rule (server side) into our houses (iPhones) for whatever reason.

    I see no reason for the move.  As some people have previously stated, “maybe Apple is going to e2e encrypt iCloud photos”.  The rub here is that it would not be e2e encrypted either way.  Scanning and reporting necessitate access to the data.  E2E encryption is only E2E encryption IFF there is no process to circumvent it (including at either end) to send the data to someone outside of the authorized recipients intended by the sender.  This very fact alone means that iCloud photos will never be e2e encrypted as long as Apple needs to do CSAM scanning.

    So, all things stated, I’m fine with the current state of server-side scanning, as it’s not on my device, and the only way the scanning and reporting applies is IFF you use the service (some may argue that’s the way it would work on device, but that is subject to change, whereas if it’s on the server, they can’t make that change to scan more than what’s sent to iCloud).
    The proposed on-device CSAM scanning intrinsically involves end-to-end encryption. The whole point is to allow your device to encrypt photos with a key which Apple doesn't hold. If it detects CSAM, it also uploads partial directions on how to find the key. Once enough partial directions are uploaded, they can be used to find the key to decrypt the images, but Apple doesn't have the key until then. They also can't create the partial directions on their own.

    To protect against one of those partial directions being used as evidence of possession, they also set the system up to emit fake partial directions which are indistinguishable from real ones until after you have uploaded enough real ones.
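    (The “partial directions” described above behave like threshold secret sharing. Here is a minimal sketch of that idea in Python, assuming a toy Shamir-style scheme rather than Apple's actual construction: below the threshold the shares reveal nothing useful about the key; at the threshold the key falls out.)

        import random

        PRIME = 2**127 - 1  # a large prime field for the toy example

        def make_shares(secret, threshold, count):
            # Random polynomial of degree threshold-1; the secret is the constant term.
            coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
            def f(x):
                return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
            return [(x, f(x)) for x in range(1, count + 1)]

        def recover_secret(shares):
            # Lagrange interpolation at x = 0 recovers the constant term.
            secret = 0
            for xi, yi in shares:
                num = den = 1
                for xj, _ in shares:
                    if xj != xi:
                        num = num * (-xj) % PRIME
                        den = den * (xi - xj) % PRIME
                secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
            return secret

        key = random.randrange(PRIME)              # the decryption key Apple never holds
        shares = make_shares(key, threshold=3, count=5)
        assert recover_secret(shares[:3]) == key   # at the threshold: key recovered
        assert recover_secret(shares[:2]) != key   # below it: almost surely garbage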

    The clear solution is to spin all of the iCloud-related functionality out of Photos and into a separate application. CSAM scanning then goes in that application. If you want to upload stuff to iCloud, you need that application, which has the CSAM scanning. If you don't want the CSAM scanning, don't load the application (or remove it if it comes preinstalled). Done. Addresses everybody's concerns.
    It’s not intrinsic; the encryption is used as a veil of protection.  I get that it’s theoretically impossible to decrypt the offending files until a threshold is met, but in order to report those images, it necessarily circumvents the E2E process between the normal intended sender and receiver(s) to send them for review at Apple and the authorities.  Yes, the process heavily relies on encryption methods, but it allows another originally unintended party to be involved.  True E2E encryption means only the specified sender and receiver have access to the information (but you still have to trust those involved not to reshape it).  To top it off, it’s on device, where it’s easier to “expand” functionality to include other things, whereas when it’s in the service, it’s physically unable to access anything on device unless it’s sent (hopefully over SSL) or already stored in the service.
    It sounds like you misunderstand exactly what end-to-end encryption is. When it's used, either end can betray the other and leak the unencrypted information, or even the key. That doesn't make it not end-to-end. This is one end (your phone) potentially leaking the key. Unless your phone decides to do that, Apple has no way to get at the key or any of the encrypted data. Compare with non-end-to-end cryptosystems, such as Telegram's old encryption method, where Telegram-the-company is a party to the key negotiation, and gets key material they could use to decrypt the conversation without either end being aware.
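    (To make that distinction concrete, a toy finite-field Diffie-Hellman exchange in Python: a relay in the middle sees only the public values, while the two ends alone can compute the shared key. Server-mediated schemes like the Telegram example differ precisely in that the server also holds key material.)

        import secrets

        p = 2**32 - 5   # a toy prime; real systems use curves such as X25519
        g = 5

        a = secrets.randbelow(p - 2) + 1   # Alice's secret, never leaves her device
        b = secrets.randbelow(p - 2) + 1   # Bob's secret, never leaves his device
        A = pow(g, a, p)                   # public values: a relay server may see
        B = pow(g, b, p)                   # these without learning the key

        assert pow(B, a, p) == pow(A, b, p)  # both ends derive the same shared key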

    Right now, Apple employees can view photos you store in iCloud. Presumably they are prohibited from doing this by policy, but they have the technical capability. With the endpoint CSAM scanning as explained in the technical papers Apple presented, they would no longer have that capability. That's because the endpoint CSAM scanning intrinsically involves end-to-end encryption.

    We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
    I’m well aware of what e2e encryption is, and that it is predicated on trust.  “It always boils down to trust,” to quote one of my comp sci professors.  It’s what I alluded to in 

    (but you still have to trust those involved not to reshape it)
    Note: I made a typo: reshape should have been re-shared.  And yes, I’m well aware encryption keys need to be safely stored.  Now I ask you this: once data is encrypted to be sent, how does one decrypt it?  With the corresponding decryption key (which is different from the encryption key, of course, but the two come as a pair). 
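    (A minimal sketch of such a pair using the third-party Python "cryptography" package; nothing here is Apple-specific. The public key encrypts, and only the matching private key decrypts.)

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import padding, rsa

        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        public_key = private_key.public_key()   # the two come as a pair

        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        ciphertext = public_key.encrypt(b"a photo", oaep)            # encrypt with one key...
        assert private_key.decrypt(ciphertext, oaep) == b"a photo"   # ...decrypt with the other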

    Apple has used a threshold algorithm to enable review of the contents… remove that, and you will have true e2e encryption; otherwise it’s just “wish it were e2e”. It’s close, but no cigar.

    Once a threshold is met, what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities to handle the data correctly?  Do we trust individuals at these places not to share it or leak it?  Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust that a similar threshold algorithm won’t be applied to text in iMessage, looking for messages of terrorism or “hate” speech?  Do we trust that it will only be done for photos uploaded to iCloud?

    I for one am fine with my images not being e2e encrypted in iCloud, as I consider it public space anyway and act accordingly.  I would expect Apple is employing encryption for data at rest, with a department (or departments) having access to the keys.  So which would you prefer: “wish it were e2e”, or a select few with access to the data at rest?  Six of one, half a dozen of the other (except one, in my opinion, has far worse implications for the future)… both ways still involve trust, however.

    ¯\_(ツ)_/¯ 
    muthuk_vanalingam