Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...


Comments

  • Reply 21 of 23
    nicholfdnicholfd Posts: 824member
    command_f said:
    jdw said:
    <Snip>.
    I agree.  I was and still am against on-device CSAM detection.  I’m OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates circumventing encryption to report data that violates the rules to a third party that was never an intended recipient of it.
    I don't follow your logic. CSAM detection is OK if it's done on someone else's computer (Apple's Server in the cloud) but not if it's done on your own phone? If it's done on your phone, you then get the warning and no-one else can tell (unless you keep doing it, which is my memory of what Apple proposed). If it's done on someone else's server then the data is present somewhere out of your control.

    I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you? 
    As I’ve said before: I treat anything I send to iCloud Photos as public knowledge (even if it’s meant only for friends and family).  Anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and matching only when photos are sent to iCloud)… so, given that, I would much rather CSAM reporting stay server-side.

    You must have missed the part about how CSAM on-device scanning works.  It only occurs in the pipeline of a photo being uploaded to iCloud Photos, so that data is going to be on their servers anyway.  They are doing you the courtesy of checking it on your device as it's being pushed to their cloud.  This allows the cloud copy to be encrypted without Apple having access to it; if they could scan it in the cloud, they could look at it.

    If you have stuff you don't want scanned, don't turn on iCloud Photos, or store it outside Photos.  The control is in the user's hands - iCloud Photos is off by default on a new/clean device.
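    The gating described in this comment (matching happens only in the upload path, and nothing becomes reportable until a threshold of matches is crossed) can be sketched in a few lines. This is a hedged illustration only: the function names, the threshold value, and the use of SHA-256 are all assumptions for the sketch, not Apple's API; the real design used a perceptual hash (NeuralHash) plus private set intersection, so the server learns nothing below the threshold.

    ```python
    # Hypothetical sketch of threshold-gated, upload-time-only matching.
    # image_hash and upload_pipeline are illustrative names, not Apple's API.
    import hashlib


    def image_hash(data: bytes) -> str:
        """Stand-in for a perceptual hash; SHA-256 is only for illustration.
        A real perceptual hash tolerates resizing/recompression, SHA-256 does not."""
        return hashlib.sha256(data).hexdigest()


    def upload_pipeline(photos, known_hashes, threshold=3):
        """Matching happens only in the upload path; photos that stay
        local are never checked.  Returns the photos to upload (unchanged)
        and whether the match count crossed the reporting threshold."""
        matches = sum(1 for p in photos if image_hash(p) in known_hashes)
        return photos, matches >= threshold
    ```

    Note that in this model the photos themselves pass through untouched; the only decision made on-device is whether the match count crossed the threshold, which is the property the commenter is describing.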
  • Reply 22 of 23
    nicholfd said:
    command_f said:
    jdw said:
    <Snip>.
    I agree.  I was and still am against on-device CSAM detection.  I’m OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates circumventing encryption to report data that violates the rules to a third party that was never an intended recipient of it.
    I don't follow your logic. CSAM detection is OK if it's done on someone else's computer (Apple's Server in the cloud) but not if it's done on your own phone? If it's done on your phone, you then get the warning and no-one else can tell (unless you keep doing it, which is my memory of what Apple proposed). If it's done on someone else's server then the data is present somewhere out of your control.

    I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you? 
    As I’ve said before: I treat anything I send to iCloud Photos as public knowledge (even if it’s meant only for friends and family).  Anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and matching only when photos are sent to iCloud)… so, given that, I would much rather CSAM reporting stay server-side.

    You must have missed the part about how CSAM on-device scanning works.  It only occurs in the pipeline of a photo being uploaded to iCloud Photos, so that data is going to be on their servers anyway.  They are doing you the courtesy of checking it on your device as it's being pushed to their cloud.  This allows the cloud copy to be encrypted without Apple having access to it; if they could scan it in the cloud, they could look at it.

    If you have stuff you don't want scanned, don't turn on iCloud Photos, or store it outside Photos.  The control is in the user's hands - iCloud Photos is off by default on a new/clean device.
    I have not missed that part; I’m well aware of it, and I commented on it in the part you quoted:

    On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and matching only when photos are sent to iCloud)…

    Also note that software is prone to change, and again, this all boils down to trust:

    https://forums.appleinsider.com/discussion/comment/3350611#Comment_3350611

    Once a threshold is met, what happens then? Do we trust Apple to handle the data correctly? Do we trust that the authorities will handle the data correctly? Do we trust individuals at these places not to share or leak it? Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust that a similar threshold algorithm won’t be applied to text in iMessage, looking for messages of terrorism or “hate” speech? Do we trust that it will only be done for photos uploaded to iCloud?


  • Reply 23 of 23
    nicholfd said:
    If you have stuff you don't want scanned, don't turn on iCloud Photos, or store it outside Photos.  The control is in the user's hands - iCloud Photos is off by default on a new/clean device.
    For this very reason, I’ve deliberately not upgraded to iOS 15 and I don’t use iCloud Photos, and won’t until Apple makes an official statement about what’s to be done with on-device CSAM reporting.  If they proceed, I’ll stick with my current version indefinitely.