Apple wipes on-device CSAM photo monitoring from site, but plans unchanged

Posted in iOS, edited December 2021
Apple removed all references to its CSAM initiative from the Child Safety page on its website at some point overnight, but the company has since made it clear that the program is still coming.

Apple announced in August that it would be introducing a collection of features to iOS and iPadOS to help protect children from predators and limit the spread of Child Sexual Abuse Material (CSAM). Following considerable criticism of the proposal, it initially appeared that Apple was pulling back from the effort.

The Child Safety page on Apple's website offered a detailed overview of the upcoming child safety tools until December 10. After that date, MacRumors reports, the references were scrubbed from the page.

It is not unusual for Apple to completely remove mentions of a product or feature that it was unable to release to the public, as it famously did with the AirPower charging pad. Such a move could be read as Apple publicly giving up on the feature, though a second attempt in the future is always plausible.

Apple's proposal consisted of several elements that protected children in different ways. The main CSAM feature would use on-device systems to detect abuse imagery stored within a user's Photos library. With iCloud Photos enabled, the feature would scan a user's photos on-device and compare their hashes against a database of known infringing material.
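For a rough picture of what hash-based matching looks like in practice, here is a minimal Swift sketch: compute a digest for each photo and test it for membership in a set of known hashes. It is a simplified, assumption-laden illustration rather than Apple's implementation; the proposal used a perceptual NeuralHash and a blinded on-device database, whereas the sketch below uses a plain SHA-256 digest and an ordinary set purely to stay self-contained.

```swift
import CryptoKit
import Foundation

// Sketch only: SHA-256 stands in for the perceptual hash Apple described, and
// `knownHashes` stands in for the blinded on-device database. Both are
// illustrative assumptions, not Apple's actual design.
struct HashMatcher {
    let knownHashes: Set<Data>  // digests of known infringing material

    func isMatch(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        return knownHashes.contains(Data(digest))
    }
}

// Usage: only photos whose digest appears in the database are flagged.
let matcher = HashMatcher(knownHashes: [])  // populated from a database in a real system
print(matcher.isMatch(photoData: Data([0x01, 0x02, 0x03])))  // false
```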

A second feature has already been implemented, intended to protect young users from seeing or sending sexually explicit photos within iMessage. Originally, this feature was going to notify parents when such an image was found, but as implemented, there is no external notification.

A third added updates to Siri and Search offering additional resources and guidance.

Of the three, the latter two made it into the release of iOS 15.2 on Monday. The more contentious CSAM element did not.

The Child Safety page on Apple's website previously explained that Apple was introducing on-device CSAM detection, in which image hashes would be compared against a database of known CSAM image hashes. If a sufficient number of files were flagged as CSAM, Apple would contact the National Center for Missing and Exploited Children (NCMEC).
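The threshold described here can be pictured as a gate that keeps individual matches invisible until enough of them accumulate. The small Swift sketch below illustrates that reporting gate in spirit only; Apple's proposal enforced the threshold cryptographically rather than with a counter, and the struct name and the value 30 are placeholders, not Apple's parameters.

```swift
// Illustration only: Apple's design enforced this gate with threshold secret
// sharing, not a simple counter.
struct ReportingGate {
    let threshold: Int          // placeholder value, not Apple's real parameter
    private(set) var matchCount = 0

    mutating func recordMatch() {
        matchCount += 1
    }

    // Only past the threshold would human review, and a possible NCMEC report,
    // even become possible.
    var reviewable: Bool { matchCount >= threshold }
}

var gate = ReportingGate(threshold: 30)
gate.recordMatch()
print(gate.reviewable)  // false until enough matches accumulate
```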

The updated version of the page removes not only the section on CSAM detection, but also the references in the page's introduction, the Siri and Search guidance, and a section offering links to PDFs explaining and assessing the CSAM process.

On Wednesday afternoon, The Verge was told that Apple's plans were unchanged, and the pause for the re-think is still just that -- a temporary delay.

Shortly after announcing its proposed tools in August, Apple had to respond to criticism from security and privacy experts, as well as other critics, about the CSAM scanning itself. Critics viewed the system as a privacy violation and the start of a slippery slope that could lead to governments using an expanded form of it to effectively perform on-device surveillance.

Feedback came from a wide range of critics, from privacy advocates and some governments to a journalist association in Germany, Edward Snowden, and Bill Maher. Despite attempting to put forward its case for the feature, and admitting that it had failed to properly communicate it, Apple postponed the launch in September to rethink the system.

Update December 15, 1:39 PM ET: Updated with Apple clarifying that its plans are unchanged.

Read on AppleInsider

Comments

  • Reply 1 of 18
    Bravo, paranoids. Thanks to you, iCloud and iCloud Drive won’t be end-to-end encrypted, after all. Bingo! /s
  • Reply 2 of 18
    Privacy advocates help perverts. 
  • Reply 3 of 18
badmonk Posts: 1,295 member
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
  • Reply 4 of 18
F_Kent_D Posts: 98 unconfirmed, member
    I myself didn’t have any issues with any of it TBH. I don’t have child pornography on anything I own and neither does anyone I know so I have nothing to worry about. They never were going to physically look at every photo, these are hash scans for particular data in the file details itself, not the actual photos. I also allowed the notifications setting on my 11 year old Daughter’s phone to notify me of potential unacceptable messages being sent or received. Let the paranoid people kill what could actually help end the CP sickness that’s more of an issue today than ever. 
  • Reply 5 of 18
    Great! And the pedos will keep getting away with it.
  • Reply 6 of 18
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
    I think they’ve always been scanning iCloud photos and reporting on the server side.  I think someone previously linked to a document that states so.  So the claims about protecting peds is unwarranted, because it will still be done on the server side.  What apple was trying to do was impose/move their house rule (server side) into our houses (iPhones) for whatever reason.

    I see no reason for the move.  As some people have previously stated, “maybe Apple is going to e2e encrypt iCloud photos”.  The rub here is that it would not be e2e encrypted either way.  Scanning and reporting necessitates access to the data. E2E encryption is only E2E encryption IFF there is no process to circumvent it (including at either end) to send the data to someone else outside of the authorized recipients intended by the sender.  This very fact alone means that iCloud photos will never be e2e encrypted as Apple needs to do CSAM scanning.

    So all things stated, I’m fine with the current state of server side scanning as it’s not on my device and the only way the scanning and reporting applies is IFF you use the service (some may argue that’s the way it would work on device, but that is subject to change, whereas if it’s on the server, they can’t make that change to scan more than what’s sent to iCloud)
  • Reply 7 of 18
    F_Kent_D said:
    I myself didn’t have any issues with any of it TBH. I don’t have child pornography on anything I own and neither does anyone I know so I have nothing to worry about. They never were going to physically look at every photo, these are hash scans for particular data in the file details itself, not the actual photos. I also allowed the notifications setting on my 11 year old Daughter’s phone to notify me of potential unacceptable messages being sent or received. Let the paranoid people kill what could actually help end the CP sickness that’s more of an issue today than ever. 
    It is an interesting balance between privacy concerns and the greater good of combatting the spread of CP online. I take my privacy very seriously and feel I very slightly come down on the side of preferring Apple not routinely scanning all my files and photos, but I can also appreciate the potential benefits of doing so to potentially aid in catching these predators as well. I dunno, it's a tough one but if you parallel this to "stop and frisk" and people saying "If you have nothing to hide, you won't have any reason to be worried" it does seem to contravene reasonable search and seizure without any probable cause to blanket troll through our data. Getting back to the Apple ecosystem, this is also akin to the AirTag privacy concerns vs. effective theft deterrent/tracking capabilities debate, though of course much more serious of a problem.

    But I'll also be real, I am 1000% sure Google, Microsoft, etc. are scanning my data already looking for illicit content and merely ways to sell me more stuff anyhow lol. I agree with an earlier poster that Apple will likely contain their hash scanning to iCloud data stored locally on their own servers and just not publicize this information while leaving CSAM off of people's local hardware (which, as more and more customers move towards 100% storage of their data in the cloud, will be nearly just as effective one would think - I just migrated everything on my MacBook Pro/iPad/iPhone into iCloud fully myself this past summer, with just the occasional Time Machine backup taken to the HDD in my fire safe).

    Either way you look at it, there is a growing percentage of people, of a particular ideological persuasion, that would have us all believe these predators (pedophiles) are just merely "minor attracted persons" and seem to want to normalize this deranged euphemism for them. Sickening.
  • Reply 8 of 18
    Anilu_777 said:
    Privacy advocates help perverts. 
    Ummm, no.  
  • Reply 9 of 18
    Great! And the pedos will keep getting away with it.
    You’re clueless 
  • Reply 10 of 18
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
iCloud has been CSAM screened already… like any other cloud service. 

    People are clueless on this subject. 
  • Reply 11 of 18
zimmie Posts: 651 member
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
    I think they’ve always been scanning iCloud photos and reporting on the server side.  I think someone previously linked to a document that states so.  So the claims about protecting peds is unwarranted, because it will still be done on the server side.  What apple was trying to do was impose/move their house rule (server side) into our houses (iPhones) for whatever reason.

    I see no reason for the move.  As some people have previously stated, “maybe Apple is going to e2e encrypt iCloud photos”.  The rub here is that it would not be e2e encrypted either way.  Scanning and reporting necessitates access to the data. E2E encryption is only E2E encryption IFF there is no process to circumvent it (including at either end) to send the data to someone else outside of the authorized recipients intended by the sender.  This very fact alone means that iCloud photos will never be e2e encrypted as Apple needs to do CSAM scanning.

    So all things stated, I’m fine with the current state of server side scanning as it’s not on my device and the only way the scanning and reporting applies is IFF you use the service (some may argue that’s the way it would work on device, but that is subject to change, whereas if it’s on the server, they can’t make that change to scan more than what’s sent to iCloud)
    The proposed on-device CSAM scanning intrinsically involves end-to-end encryption. The whole point is to allow your device to encrypt photos with a key which Apple doesn't hold. If it detects CSAM, it also uploads partial directions on how to find the key. Once enough partial directions are uploaded, they can be used to find the key to decrypt the images, but Apple doesn't have the key until then. They also can't create the partial directions on their own.

    To protect against one of those partial directions being used as evidence of possession, they also set the system up to emit fake partial directions which are indistinguishable from real ones until after you have uploaded enough real ones.

    The clear solution is to spin all of the iCloud-related functionality out of Photos and into a separate application. CSAM scanning then goes in that application. If you want to upload stuff to iCloud, you need that application, which has the CSAM scanning. If you don't want the CSAM scanning, don't load the application (or remove it if it comes preinstalled). Done. Addresses everybody's concerns.
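The "partial directions" described in the post above behave like shares in a threshold secret-sharing scheme: any number of shares below the threshold reveals nothing about the key, while reaching the threshold allows reconstruction. The Swift sketch below shows a toy Shamir-style scheme over a small prime field to illustrate that property; it is a simplified stand-in with illustrative parameters, not the production construction Apple documented, and the synthetic, indistinguishable "fake" shares the post mentions are not modeled here.

```swift
import Foundation

// Toy (t, n) threshold secret sharing over a small prime field. Illustration
// only: every parameter here is a simplified assumption.
let p = 2_147_483_647  // the Mersenne prime 2^31 - 1

// Modular exponentiation, used to compute inverses via Fermat's little theorem.
func powMod(_ base: Int, _ exp: Int, _ mod: Int) -> Int {
    var result = 1
    var b = base % mod
    var e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % mod }
        b = b * b % mod
        e >>= 1
    }
    return result
}

func inverse(_ a: Int, _ mod: Int) -> Int { powMod(a, mod - 2, mod) }

// Split `secret` into n shares; any t of them reconstruct it, fewer reveal nothing.
func split(secret: Int, t: Int, n: Int) -> [(x: Int, y: Int)] {
    var coefficients = [secret % p]            // constant term is the secret
    for _ in 1..<t { coefficients.append(Int.random(in: 0..<p)) }
    return (1...n).map { x in
        var y = 0
        var xPower = 1
        for c in coefficients {
            y = (y + c * xPower) % p
            xPower = xPower * x % p
        }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from any t shares.
func reconstruct(shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, share) in shares.enumerated() {
        var numerator = 1
        var denominator = 1
        for (j, other) in shares.enumerated() where j != i {
            numerator = numerator * (p - other.x) % p                   // (0 - x_j) mod p
            denominator = denominator * ((share.x - other.x + p) % p) % p
        }
        let term = share.y * numerator % p * inverse(denominator, p) % p
        secret = (secret + term) % p
    }
    return secret
}

// With a threshold of 30, 29 shares give away nothing; 30 recover the key.
let shares = split(secret: 123_456_789, t: 30, n: 100)
print(reconstruct(shares: Array(shares.prefix(30))))  // 123456789
```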
  • Reply 12 of 18
They should move this mess into the cloud. Client-side spyware on a device that I am expected to pay for is not acceptable!
  • Reply 13 of 18
    zimmie said:
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
    I think they’ve always been scanning iCloud photos and reporting on the server side.  I think someone previously linked to a document that states so.  So the claims about protecting peds is unwarranted, because it will still be done on the server side.  What apple was trying to do was impose/move their house rule (server side) into our houses (iPhones) for whatever reason.

    I see no reason for the move.  As some people have previously stated, “maybe Apple is going to e2e encrypt iCloud photos”.  The rub here is that it would not be e2e encrypted either way.  Scanning and reporting necessitates access to the data. E2E encryption is only E2E encryption IFF there is no process to circumvent it (including at either end) to send the data to someone else outside of the authorized recipients intended by the sender.  This very fact alone means that iCloud photos will never be e2e encrypted as Apple needs to do CSAM scanning.

    So all things stated, I’m fine with the current state of server side scanning as it’s not on my device and the only way the scanning and reporting applies is IFF you use the service (some may argue that’s the way it would work on device, but that is subject to change, whereas if it’s on the server, they can’t make that change to scan more than what’s sent to iCloud)
    The proposed on-device CSAM scanning intrinsically involves end-to-end encryption. The whole point is to allow your device to encrypt photos with a key which Apple doesn't hold. If it detects CSAM, it also uploads partial directions on how to find the key. Once enough partial directions are uploaded, they can be used to find the key to decrypt the images, but Apple doesn't have the key until then. They also can't create the partial directions on their own.

    To protect against one of those partial directions being used as evidence of possession, they also set the system up to emit fake partial directions which are indistinguishable from real ones until after you have uploaded enough real ones.

    The clear solution is to spin all of the iCloud-related functionality out of Photos and into a separate application. CSAM scanning then goes in that application. If you want to upload stuff to iCloud, you need that application, which has the CSAM scanning. If you don't want the CSAM scanning, don't load the application (or remove it if it comes preinstalled). Done. Addresses everybody's concerns.
    It’s not intrinsic, the encryption used as a veil of protection.  I get that it’s theoretically impossible to decrypt the offending files until a threshold is met, but in order to report those images, it necessarily circumvents the E2E process between the normal intended sender and receiver(s), to send them for review at Apple and authorities.  Yes, the process heavily relies on encryption methods, but allows for another originally unintended party to be involved.  True E2E encryption means only specified sender and receiver have access to the information (but you still have to trust those involved not to reshape it).  Top it off, it’s on device, where it’s easier to “expand” functionality to include other things, whereas when it’s in the service, it’s physically unable to access anything on device unless it’s sent (hopefully over ssl) or already stored in the service.
  • Reply 14 of 18
    Anilu_777 said:
    Privacy advocates help perverts. 
says the guy using a pseudonym
  • Reply 15 of 18
zimmie Posts: 651 member
    zimmie said:
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
    I think they’ve always been scanning iCloud photos and reporting on the server side.  I think someone previously linked to a document that states so.  So the claims about protecting peds is unwarranted, because it will still be done on the server side.  What apple was trying to do was impose/move their house rule (server side) into our houses (iPhones) for whatever reason.

    I see no reason for the move.  As some people have previously stated, “maybe Apple is going to e2e encrypt iCloud photos”.  The rub here is that it would not be e2e encrypted either way.  Scanning and reporting necessitates access to the data. E2E encryption is only E2E encryption IFF there is no process to circumvent it (including at either end) to send the data to someone else outside of the authorized recipients intended by the sender.  This very fact alone means that iCloud photos will never be e2e encrypted as Apple needs to do CSAM scanning.

    So all things stated, I’m fine with the current state of server side scanning as it’s not on my device and the only way the scanning and reporting applies is IFF you use the service (some may argue that’s the way it would work on device, but that is subject to change, whereas if it’s on the server, they can’t make that change to scan more than what’s sent to iCloud)
    The proposed on-device CSAM scanning intrinsically involves end-to-end encryption. The whole point is to allow your device to encrypt photos with a key which Apple doesn't hold. If it detects CSAM, it also uploads partial directions on how to find the key. Once enough partial directions are uploaded, they can be used to find the key to decrypt the images, but Apple doesn't have the key until then. They also can't create the partial directions on their own.

    To protect against one of those partial directions being used as evidence of possession, they also set the system up to emit fake partial directions which are indistinguishable from real ones until after you have uploaded enough real ones.

    The clear solution is to spin all of the iCloud-related functionality out of Photos and into a separate application. CSAM scanning then goes in that application. If you want to upload stuff to iCloud, you need that application, which has the CSAM scanning. If you don't want the CSAM scanning, don't load the application (or remove it if it comes preinstalled). Done. Addresses everybody's concerns.
    It’s not intrinsic, the encryption used as a veil of protection.  I get that it’s theoretically impossible to decrypt the offending files until a threshold is met, but in order to report those images, it necessarily circumvents the E2E process between the normal intended sender and receiver(s), to send them for review at Apple and authorities.  Yes, the process heavily relies on encryption methods, but allows for another originally unintended party to be involved.  True E2E encryption means only specified sender and receiver have access to the information (but you still have to trust those involved not to reshape it).  Top it off, it’s on device, where it’s easier to “expand” functionality to include other things, whereas when it’s in the service, it’s physically unable to access anything on device unless it’s sent (hopefully over ssl) or already stored in the service.
    It sounds like you misunderstand exactly what end-to-end encryption is. When it's used, either end can betray the other and leak the unencrypted information, or even the key. That doesn't make it not end-to-end. This is one end (your phone) potentially leaking the key. Unless your phone decides to do that, Apple has no way to get at the key or any of the encrypted data. Compare with non-end-to-end cryptosystems, such as Telegram's old encryption method, where Telegram-the-company is a party to the key negotiation, and gets key material they could use to decrypt the conversation without either end being aware.

    Right now, Apple employees can view photos you store in iCloud. Presumably they are prohibited from doing this by policy, but they have the technical capability. With the endpoint CSAM scanning as explained in the technical papers Apple presented, they would no longer have that capability. That's because the endpoint CSAM scanning intrinsically involves end-to-end encryption.

    We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
  • Reply 16 of 18
entropys Posts: 4,166 member
    Oh dear, police will have to get off their bums and hunt the rock spiders down, rather than handing responsibility over to a great big corporation.
  • Reply 17 of 18
    zimmie said:
    zimmie said:
    badmonk said:
    I suspect CSAM screening will ultimately be performed and confined iCloud server side, like every other cloud based service has been doing for years (and not talked about it).

    I always thought iCloud screened for CSAM, after all MSFT and Google have been doing it for years.

    Freedom has its limits.
    I think they’ve always been scanning iCloud photos and reporting on the server side.  I think someone previously linked to a document that states so.  So the claims about protecting peds is unwarranted, because it will still be done on the server side.  What apple was trying to do was impose/move their house rule (server side) into our houses (iPhones) for whatever reason.

    I see no reason for the move.  As some people have previously stated, “maybe Apple is going to e2e encrypt iCloud photos”.  The rub here is that it would not be e2e encrypted either way.  Scanning and reporting necessitates access to the data. E2E encryption is only E2E encryption IFF there is no process to circumvent it (including at either end) to send the data to someone else outside of the authorized recipients intended by the sender.  This very fact alone means that iCloud photos will never be e2e encrypted as Apple needs to do CSAM scanning.

    So all things stated, I’m fine with the current state of server side scanning as it’s not on my device and the only way the scanning and reporting applies is IFF you use the service (some may argue that’s the way it would work on device, but that is subject to change, whereas if it’s on the server, they can’t make that change to scan more than what’s sent to iCloud)
    The proposed on-device CSAM scanning intrinsically involves end-to-end encryption. The whole point is to allow your device to encrypt photos with a key which Apple doesn't hold. If it detects CSAM, it also uploads partial directions on how to find the key. Once enough partial directions are uploaded, they can be used to find the key to decrypt the images, but Apple doesn't have the key until then. They also can't create the partial directions on their own.

    To protect against one of those partial directions being used as evidence of possession, they also set the system up to emit fake partial directions which are indistinguishable from real ones until after you have uploaded enough real ones.

    The clear solution is to spin all of the iCloud-related functionality out of Photos and into a separate application. CSAM scanning then goes in that application. If you want to upload stuff to iCloud, you need that application, which has the CSAM scanning. If you don't want the CSAM scanning, don't load the application (or remove it if it comes preinstalled). Done. Addresses everybody's concerns.
    It’s not intrinsic, the encryption used as a veil of protection.  I get that it’s theoretically impossible to decrypt the offending files until a threshold is met, but in order to report those images, it necessarily circumvents the E2E process between the normal intended sender and receiver(s), to send them for review at Apple and authorities.  Yes, the process heavily relies on encryption methods, but allows for another originally unintended party to be involved.  True E2E encryption means only specified sender and receiver have access to the information (but you still have to trust those involved not to reshape it).  Top it off, it’s on device, where it’s easier to “expand” functionality to include other things, whereas when it’s in the service, it’s physically unable to access anything on device unless it’s sent (hopefully over ssl) or already stored in the service.
    It sounds like you misunderstand exactly what end-to-end encryption is. When it's used, either end can betray the other and leak the unencrypted information, or even the key. That doesn't make it not end-to-end. This is one end (your phone) potentially leaking the key. Unless your phone decides to do that, Apple has no way to get at the key or any of the encrypted data. Compare with non-end-to-end cryptosystems, such as Telegram's old encryption method, where Telegram-the-company is a party to the key negotiation, and gets key material they could use to decrypt the conversation without either end being aware.

    Right now, Apple employees can view photos you store in iCloud. Presumably they are prohibited from doing this by policy, but they have the technical capability. With the endpoint CSAM scanning as explained in the technical papers Apple presented, they would no longer have that capability. That's because the endpoint CSAM scanning intrinsically involves end-to-end encryption.

    We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
I’m well aware of what e2e encryption is, and that it is predicated on trust.  It always boils down to trust, to quote one of my comp sci professors.  It’s what I alluded to in 

    (but you still have to trust those involved not to reshape it)
    Note: I made a typo: reshape should have been re-shared.  And yes, I’m well aware encryption keys need to be safely stored.  Now I ask you this, once data is encrypted, to be sent, how does one decrypt it?  With the corresponding decryption key (which is different from the encryption key of course but the 2 come as a pair). 

    Apple has used a threshold algorithm to enable review of the contents… remove that, and you will have true e2e encryption; otherwise it’s just “wish it were e2e”. It’s close, but no cigar.

    Once a threshold is met what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities will handle the data correctly?  Do we trust individuals at these places not to share it or leak it?  Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust a similar threshold algorithm won’t be applied to text in imessages, looking for messages of terrorism or “hate” speech?  Do we trust that it will only be done for photos uploaded to iCloud?

    I for one am fine with my images not being e2e encrypted in iCloud, as I consider it public space anyway and act accordingly.  I would expect apple is employing encryption for data at rest, which a department(s) has access to the keys.  So which would you prefer, “wish it were e2e” or a select few with access to the data at rest?  6 one way, half a dozen another, (except one in my opinion has far more bad implications for the future)… both ways still involve trust however.

    ¯\_(ツ)_/¯ 
  • Reply 18 of 18
    zimmie said:

    We already trust that Apple won't rework their on-device content scanning. Or did you forget what Spotlight and the object recognition in Photos are? Not like you can disable either of those.
I have also said in the past that I am not opposed to scanning; it is a very useful tool for many purposes.  It’s the scanning and the reporting of that data to another party that’s the issue.  


    Many things require access to the data in order to work, but as long as the data remains on device? ¯\_(ツ)_/¯  I don’t care how much scanning is done; if it makes my life easier, great.  At some point, the software has to decrypt the data, otherwise we would only see a jumbled mess on our screens.  This is also part of why Apple has added the neural processing cores to its chips: to enable fast, more complex on-device (and more inherently secure) AI to do neat/useful things without the need to send it off to a server (how Siri works for a great many things, though with iOS 15, some of that has changed).