Apple's plan to scan iPhone photos for child abuse material is dead

Posted in General Discussion, edited December 2022
While Apple's controversial plan to hunt down child sexual abuse material with on-iPhone scanning has been abandoned, the company has other plans in mind to stop it at the source.

Apple's proposed CSAM detection feature


Apple announced two initiatives in late 2021 aimed at protecting children from abuse. One, already in effect today, warns minors before they send or receive photos containing nudity. It relies on algorithmic detection of nudity and only warns the kids -- the parents aren't notified.

The second, and far more controversial, feature would analyze a user's photos being uploaded to iCloud for known CSAM. The analysis was to be performed locally, on the user's iPhone, by comparing image hashes against a database of hashes of known material.
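In rough terms, the matching step compares a fingerprint of each photo queued for iCloud upload against a database of fingerprints of known material. The sketch below is only a minimal illustration of that idea, not Apple's implementation: it substitutes an exact SHA-256 digest for Apple's perceptual NeuralHash, and the digest value and function name are hypothetical.

    import CryptoKit
    import Foundation

    // Hypothetical fingerprints of known images, distributed as opaque digests.
    // Apple's proposal used a perceptual hash (NeuralHash); SHA-256 is used here
    // only to keep the sketch self-contained.
    let knownImageDigests: Set<String> = [
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
    ]

    // Returns true if the photo's digest appears in the known-image set.
    // In the proposed design, this check ran entirely on the device, and only
    // for photos queued for iCloud upload.
    func matchesKnownImage(_ photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownImageDigests.contains(hex)
    }

    // Example: a placeholder byte buffer standing in for a photo.
    let photo = Data([0x89, 0x50, 0x4e, 0x47])
    print(matchesKnownImage(photo)) // false for this placeholder

A perceptual hash is designed so that resized or re-encoded copies of the same image still produce a matching fingerprint, which an exact digest like the one above cannot do.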

After backlash from privacy experts, child safety groups, and governments, Apple paused the feature indefinitely for review. On Wednesday, Apple released a statement to AppleInsider and other venues explaining that it has abandoned the feature altogether.
"After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021."

"We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."

The statement comes moments after Apple announced new features that would end-to-end encrypt even more iCloud data, including iMessage content and photos. These strengthened protections would have made the server-side flagging system, a primary part of Apple's CSAM detection feature, impossible.

Taking a different approach

Amazon, Google, Microsoft, and others perform server-side scanning as a requirement by law, but end-to-end encryption will prevent Apple from doing so.

Instead, Apple hopes to address the issue at its source -- creation and distribution. Rather than target those who hoard content on cloud servers, Apple aims to educate users and prevent the content from being created and sent in the first place.

Apple provided extra details about this initiative to Wired. While there isn't a timeline for the features, the effort would start with expanding algorithmic nudity detection to video for the Communication Safety feature. Apple then plans to expand these protections to its other communication tools, and eventually to give developers access to them as well.

"Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications," Apple also said in a statement. "Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage."

Other on-device protections exist in Siri, Safari, and Spotlight to detect when users search for CSAM. Such searches are redirected to resources that provide help to the individual.

Features that educate users while preserving privacy have been Apple's goal for decades. All of the existing child safety implementations seek to inform, and Apple never learns when a safety feature is triggered.

Read on AppleInsider

Comments

  • Reply 1 of 15
    maestro64 Posts: 5,043 member
    As I have said before, it does not make sense unless you go after the sources, not the end pervert. Plus, if the image is not in the database to compare the hash against, you are not going to catch new sources. Glad Apple saw the error of its altruistic ways; yes, they thought they were well-meaning and doing it for the right reasons, but at a cost to everyone.
    watto_cobra, baconstang, JaiOh81
  • Reply 2 of 15
    entropys Posts: 4,148 member
    I am amazed that Apple backtracked on what it claimed was such a noble and virtuous cause, eventually accepting that it could also end up being used for another kind of evil itself.

    Unless that’s what “they” want us to think! Bwaahahhhaha!
    beowulfschmidt
  • Reply 3 of 15
    DAalseth Posts: 2,783 member
    A noble idea that was doomed to fail. Glad Apple was willing to admit their mistake and backtrack. 
    JaiOh81
  • Reply 4 of 15
    davidw Posts: 2,028 member
    >."Amazon, Google, Microsoft, and others perform server-side scanning as a requirement by law, but end-to-end encryption will prevent Apple from doing so."<

    That is not true. In the US, there are no laws that require companies like Amazon, Google, Microsoft or others to perform CSAM searches on the data stored in their servers. Our US Constitution would forbid the government from passing such laws. Such laws would violate the 4th Amendment regarding unreasonable and unwarranted searches without probable cause.

    The only law that exists is that companies must report to law enforcement instances of child porn that they find on their servers. But no law exists saying that they must actively search for it.

    However, Amazon, Google, Microsoft and others can voluntarily perform CSAM scanning on the data stored in their servers at the request of foundations against child exploitation, even if no law exists that requires them to do it.

    The gray area is that many foundations against child exploitation often pressure companies to perform CSAM scanning or otherwise face public shaming for not performing it. And many of these foundations are government funded and might be construed as actors working for the government, and thus subject to the US Constitution.

    https://crsreports.congress.gov/product/pdf/LSB/LSB10713

    >Currently, nothing in federal law requires providers to monitor their services or content for CSAM in the first instance. Under the law, although providers must report CSAM to NCMEC, which must then make the reports available to law enforcement, providers are not obligated to “affirmatively search, screen, or scan for” these violations. <

    Citizens in the EU, on the other hand, have no constitution protecting them from unreasonable and unwarranted searches by the government, and thus might be (or soon will be) subject to mandatory CSAM searches of the data they keep on a third-party server, even if it's supposed to be encrypted end to end.

    https://techcrunch.com/2022/05/11/eu-csam-detection-plan/


    https://9to5mac.com/2022/05/12/eu-csam-scanning-law/

    In the US, citizens' privacy is better protected from government intrusion because of the US Constitution, but laws protecting our privacy from third parties tend to be lax. In the EU, citizens' privacy is better protected from third-party intrusion by strict privacy laws, while laws protecting their privacy from government intrusion tend to be lax.


    muthuk_vanalingam, FileMakerFeller
  • Reply 5 of 15
    And the fbi was quick to start crying “tHiNk AbOuT tHe ChIlDrEn”
  • Reply 6 of 15
     A complete non-issue that was entirely blown out of proportion on the internet (yeah...big surprise there). Scanning on the phone is no different than scanning in the cloud. The user chooses which apps make use of iCloud backup and all of those app files are going to be scanned regardless. You can't use iCloud without agreeing to Apple's user terms and those terms include scans for illegal content. 
    edited December 2022
  • Reply 7 of 15
    Given the long dev timelines, surely the CSAM team knew that the end-end team’s work would render their scheme impossible? 
  • Reply 8 of 15
    gatorguy Posts: 24,153 member
    Given the long dev timelines, surely the CSAM team knew that the end-end team’s work would render their scheme impossible? 
    Would E2EE have prevented scanning done on the user's own device? E2EE comes into play when the image leaves your device, not while still on your device AFAIK.
  • Reply 9 of 15
    gatorguy Posts: 24,153 member
      Scanning on the phone is no different than scanning in the cloud. 
    Oh, it's surely different, and no doubt you understand the distinction. It is NOT Apple's phone, even if it's Apple's OS. You earned the money to buy it, just as I earned the money to buy mine. It is your property to use, smash, mistreat, give away, resell, and yes, make use of for things that someone else might find abhorrent. But Apple's servers, where your iCloud data is optionally sent, are unequivocally Apple's property. They can have whatever rules they wish when they allow you to store your data there.
    muthuk_vanalingam
  • Reply 10 of 15
    gatorguy said:
    Given the long dev timelines, surely the CSAM team knew that the end-end team’s work would render their scheme impossible? 
    Would E2EE have prevented scanning done on the user's own device? E2EE comes into play when the image leaves your device, not while still on your device AFAIK.
    Yes, I see what you mean. However, the article says: "These strengthened protections would have made the server-side flagging system, a primary part of Apple's CSAM detection feature, impossible." Which actually maybe doesn't make sense, as the scanning is client side, as you say (and the client uploads the images it detects on a differently encrypted channel outside of the E2EE iCloud Photos).
    edited December 2022
  • Reply 11 of 15
    gatorguy said:
      Scanning on the phone is no different than scanning in the cloud. 
    Oh, it's surely different, and no doubt you understand the distinction. It is NOT Apple's phone, even if it's Apple's OS. You earned the money to buy it, just as I earned the money to buy mine. It is your property to use, smash, mistreat, give away, resell, and yes, make use of for things that someone else might find abhorrent. But Apple's servers, where your iCloud data is optionally sent, are unequivocally Apple's property. They can have whatever rules they wish when they allow you to store your data there.
    In order to use iCloud you have to agree to Apple's terms...which always include scanning iCloud files for illegal content. Once you choose which apps to use with iCloud, all the files from the app are going to be scanned. The on-phone scanning didn't change anything. It scanned the same material that would be scanned in the cloud. If you don't use iCloud, nothing is going to be scanned. If you do use iCloud, it's going to scan the files from the apps that you personally chose to use with the service per the terms that you agreed to for the service.
    edited December 2022
  • Reply 12 of 15
    Yay I can finally upgrade from iOS 14!

    So glad the author of this article was wrong:
    https://appleinsider.com/articles/22/08/26/apples-csam-detection-system-may-not-be-perfect-but-it-is-inevitable


    appleinsideruser
  • Reply 13 of 15
    davidw Posts: 2,028 member
    gatorguy said:
    Given the long dev timelines, surely the CSAM team knew that the end-end team’s work would render their scheme impossible? 
    Would E2EE have prevented scanning done on the user's own device? E2EE comes into play when the image leaves your device, not while still on your device AFAIK.

    End-to-end encryption means that when you hit "send," only you and the person you're sending the message to can read the message. If the message can be scanned by a third party (while still on your device) after you hit "send" and before it's encrypted, then it's not true end-to-end encryption as we know it. And for sure, the third party can't be allowed to scan your message on your device before you send it, even if you don't plan to use end-to-end encryption.
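    (For illustration, here is a minimal sketch of that end-to-end principle in Swift, using CryptoKit key agreement. This is a toy example, not Apple's implementation, and all the names below are hypothetical. The point it shows: the message is encrypted on the sender's device with a key only the recipient can derive, so any scanning would have to happen before this step.)

        import CryptoKit
        import Foundation

        func demo() throws {
            // Each party holds a key pair; only public keys are ever exchanged.
            let sender = Curve25519.KeyAgreement.PrivateKey()
            let recipient = Curve25519.KeyAgreement.PrivateKey()

            // Derive a shared symmetric key from one side's private key and the
            // other side's public key; both sides arrive at the same key.
            func sessionKey(_ mine: Curve25519.KeyAgreement.PrivateKey,
                            _ theirs: Curve25519.KeyAgreement.PublicKey) throws -> SymmetricKey {
                let secret = try mine.sharedSecretFromKeyAgreement(with: theirs)
                return secret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                      salt: Data(),
                                                      sharedInfo: Data("msg".utf8),
                                                      outputByteCount: 32)
            }

            // Encryption happens on the sender's device; a server relaying the
            // message sees only ciphertext.
            let sealed = try ChaChaPoly.seal(Data("hello".utf8),
                                             using: sessionKey(sender, recipient.publicKey))

            // Only the recipient can decrypt, because only they can derive the key.
            let plain = try ChaChaPoly.open(sealed,
                                            using: sessionKey(recipient, sender.publicKey))
            print(String(decoding: plain, as: UTF8.self)) // "hello"
        }

        do {
            try demo()
        } catch {
            print("error: \(error)")
        }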
  • Reply 14 of 15
    davidw Posts: 2,028 member
    gatorguy said:
      Scanning on the phone is no different than scanning in the cloud. 
    Oh, it's surely different, and no doubt you understand the distinction. It is NOT Apple's phone, even if it's Apple's OS. You earned the money to buy it, just as I earned the money to buy mine. It is your property to use, smash, mistreat, give away, resell, and yes, make use of for things that someone else might find abhorrent. But Apple's servers, where your iCloud data is optionally sent, are unequivocally Apple's property. They can have whatever rules they wish when they allow you to store your data there.
    In order to use iCloud you have to agree to Apple's terms...which always include scanning iCloud files for illegal content. Once you choose which apps to use with iCloud, all the files from the app are going to be scanned. The on-phone scanning didn't change anything. It scanned the same material that would be scanned in the cloud. If you don't use iCloud, nothing is going to be scanned. If you do use iCloud, it's going to scan the files from the apps that you personally chose to use with the service per the terms that you agreed to for the service.

    In order to fly on an airline, one must agree to let airport security x-ray any luggage you're planning to put on the plane and allow it to be searched by dogs trained to sniff out drugs, explosives and other contraband. So does it make a difference if airport security brings their x-ray machine and search dogs into your home to do the security search on your luggage, right before you load it into the car that you're going to take to the airport? After all, the luggage is going to end up on an airline anyway and you must agree to allow the search. So it shouldn't make a difference if the search is done inside your home, with the x-ray machine plugged into your home outlets.

    If it makes no difference whether the images are scanned on customers' Apple devices right before they're uploaded to the iCloud servers or when they're already on the iCloud servers, then why did Apple want to do the scanning on the customers' devices? If what you say is true, that it doesn't make any difference: well, even if the scanning on the device uses an imperceptible amount of each customer's CPU power, it's not zero, and the saving of CPU power might not be imperceptible on Apple's end when they do the scanning on hundreds of millions of customers' Apple devices. Apple might be saving a significant amount of money in scanning costs by having it done on their customers' devices. So there might be a difference, at least for Apple.
    muthuk_vanalingam
  • Reply 15 of 15
    davidw said:

    If it makes no difference whether the images are scanned on customers' Apple devices right before they're uploaded to the iCloud servers or when they're already on the iCloud servers, then why did Apple want to do the scanning on the customers' devices?
    The 'saving some CPU' theory falls into the 'greedy/cheap Apple' mythology.

    I think Apple has wanted to do full iCloud encryption for a very long time, and the reason it hasn't is the reported pushback from the federal government. On-device scanning was a valiant attempt to thread the needle -- checking content before it gets uploaded to the <soon-to-be> encrypted cloud.

    The attempt failed to get traction, and Apple has thrown in the towel with regard to satisfying the government and is pushing ahead with its long-term goal of encrypting everything it can (and the EFF's self-serving statement 'Oh, Apple finally listened to us' is gag-inducing).