Apple provides detailed reasoning behind abandoning iPhone CSAM detection


A child safety group pushed Apple over its abandonment of the announced CSAM detection feature, and the company has given its most detailed response yet as to why it backed off its plans.

Apple's scrapped CSAM detection tool



Child Sexual Abuse Material (CSAM) is a severe, ongoing concern that Apple attempted to address with on-device and iCloud detection tools. Those controversial tools were ultimately abandoned in December 2022, leaving more controversy in their wake.

A child safety group known as Heat Initiative told Apple it would organize a campaign against the decision to abandon CSAM detection, hoping to pressure the company into offering such tools. Apple responded in detail, and Wired, which obtained the response, detailed its contents in a report.

The response focuses on consumer safety and privacy as Apple's reason to pivot to a feature set called Communication Safety. Trying to find a way to access information that is normally encrypted goes against Apple's wider privacy and security stance -- a position that continues to upset world powers.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," wrote Erik Neuenschwander, Apple's director of user privacy and child safety.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander continued. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."

"We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago," he finished. "We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users."

Neuenschwander was responding to a request from Sarah Gardner, leader of the Heat Initiative, who asked why Apple backed away from its on-device CSAM identification program.

"We firmly believe that the solution you unveiled not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud," Gardner wrote. "I am a part of a developing initiative involving concerned child safety experts and advocates who intend to engage with you and your company, Apple, on your continued delay in implementing critical technology."

"Child sexual abuse is a difficult issue that no one wants to talk about, which is why it gets silenced and left behind, added Gardner. "We are here to make sure that doesn't happen."

Rather than take an approach that would violate user trust and make Apple a middleman for processing reports, the company wants to help direct victims to resources and law enforcement. Developer APIs that apps like Discord can use will help educate users and direct them to report offenders.
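
The specific API isn't named here, but it is presumably the Sensitive Content Analysis framework introduced alongside iOS 17. As a rough, hedged sketch only (the framework choice is an assumption, and an app also needs the framework's client entitlement), a messaging app receiving an image might check it on device like this:

```swift
import Foundation
import SensitiveContentAnalysis

// Hedged sketch: assumes the developer API referenced above is Apple's
// SensitiveContentAnalysis framework (iOS 17+), which requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
func shouldBlurIncomingImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs if the user has enabled Sensitive Content
    // Warning (or Communication Safety on a child account).
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The image is analyzed entirely on device; nothing is uploaded.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive // caller blurs it and offers resources
    } catch {
        return false
    }
}
```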

Apple doesn't "scan" user photos stored on device or in iCloud in any way. The Communication Safety feature can be enabled for children's accounts but won't notify parents when nudity is detected in chats.

The feature is being expanded to adults in iOS 17, which will enable users to filter out unwanted nude photos from iMessage. Apple hopes to expand these features to more areas in the future.


Comments

  • Reply 1 of 25
    Apple is not a law enforcement organization. That should answer the question. Period.
  • Reply 2 of 25
    Great that Apple heard the outcry of the slippery slope to snooping — they don’t often eat humble pie!
  • Reply 3 of 25
    entropys Posts: 4,168 member
    “Won’t someone please think of the children!”

    Seriously, the world is awash with people who, in the name of good, do great harm. This is a gold medal example.

    And good on Apple for realising that many people had concerns about how this system could create a situation that could be exploited by bad actors, and, further, for adopting those concerns as its own position.
  • Reply 4 of 25
    danox Posts: 2,874 member
    Good luck with the dawn of AI and the ability to manipulate and create millions of pictures without any context or intelligence behind them; just press the button and go. Within 10 years, what's real and what's not?
  • Reply 5 of 25
    badmonk Posts: 1,295 member
    They should also talk to parents who texted pictures of their children’s rash to the pediatrician and got caught up in CSAM investigations.  As bad as CSAM is, larger privacy issues are important as well.  Apple is doing the right thing here.
  • Reply 6 of 25
    Kind of ridiculous that Apple got flak for trying to implement CSAM detection using a method designed to better protect privacy, while Google, Microsoft, and others have had full-blown CSAM scanning for years and hardly anyone talks about it.

    Google is even worse for using machine learning to try and identify images that aren't in the CSAM database, generating additional false positives.
  • Reply 7 of 25
    danox said:
    Good luck with the dawn of AI and the ability to manipulate and create millions of pictures without any context or intelligence behind them; just press the button and go. Within 10 years, what's real and what's not?
    10 years?  We already have deep fakes now and it's only going to get worse as time goes on and technology gets better and easier to use. :(
  • Reply 8 of 25
    Of course, due to UK idiocy with the Online Safety Bill and updates to RIPA, Apple will have to pull iMessage out of the UK; Signal and WhatsApp will also have to pull out.

    The UK politicians behind all of this are arrogant and stupid and refuse to listen to experts in the field.
  • Reply 9 of 25
    Maybe UK people will get different coloured iMessages to denote unencrypted? They could be transparent maybe, to illustrate that anyone can see them...
  • Reply 10 of 25
    blastdoor Posts: 3,308 member
    Maybe UK people will get different coloured iMessages to denote unencrypted? They could be transparent maybe, to illustrate that anyone can see them...
    No, they need to protect their users in other countries — they will need to pull out completely. 

    I’m glad apple is fighting the good fight. I hope they win. But if the EU follows the UK, it could become very hard. And if the US were to fall, that would be the end of it. 

    Bottom line is, apple can’t do this alone. Citizens in democracies need to advocate for their rights. 
  • Reply 11 of 25
    blastdoor said:
    Maybe UK people will get different coloured iMessages to denote unencrypted? They could be transparent maybe, to illustrate that anyone can see them...
    No, they need to protect their users in other countries — they will need to pull out completely. 

    I’m glad apple is fighting the good fight. I hope they win. But if the EU follows the UK, it could become very hard. And if the US were to fall, that would be the end of it. 

    Bottom line is, apple can’t do this alone. Citizens in democracies need to advocate for their rights. 
    Indeed, fighting and winning would be ideal. Yet at the moment Apple can sell into most countries, and those users know green messages are unencrypted.

    Pragmatically, letting the unfortunate still enjoy the convenience of iMessage is better than dumping them to green.
  • Reply 12 of 25
    Apple did the right thing - you can't be privacy orientated and simultaneously scan everyone's pictures. 

    Pretty amazing they listened and turned around.

    I would love to say they did it because it was the right thing to do - but Apple's marketing makes a big deal of privacy, they see it as an important differentiating factor from Google, and obviously if they started getting into everyone's stuff they couldn't say that anymore.


  • Reply 13 of 25
    blastdoor said:
    Maybe UK people will get different coloured iMessages to denote unencrypted? They could be transparent maybe, to illustrate that anyone can see them...
    No, they need to protect their users in other countries — they will need to pull out completely. 

    I’m glad apple is fighting the good fight. I hope they win. But if the EU follows the UK, it could become very hard. And if the US were to fall, that would be the end of it. 

    Bottom line is, apple can’t do this alone. Citizens in democracies need to advocate for their rights. 
    If people don't fight for their rights, they will lose their rights. 

    It's that simple. 

    I like that Apple would pull out - in western countries this can actually work. 
  • Reply 14 of 25
    Kind of ridiculous that Apple got flak for trying to implement CSAM detection using a method designed to better protect privacy, while Google, Microsoft, and others have had full-blown CSAM scanning for years and hardly anyone talks about it.

    Google is even worse for using machine learning to try and identify images that aren't in the CSAM database, generating additional false positives.
    It is ridiculous that you don't understand how it works.  Google, Microsoft, and others scan the photos that USERS upload to their cloud services and photos are scanned on the cloud servers.  Apple wanted to scan every single photo stored on your device whether you like it or not.  Big difference, and big privacy violation.  Apple's attempt at scanning was also known to produce false positives, then a third party company would review those photos.
  • Reply 15 of 25
    badmonk said:
    They should also talk to parents who texted pictures of their children’s rash to the pediatrician and got caught up in CSAM investigations.  As bad as CSAM is, larger privacy issues are important as well.  Apple is doing the right thing here.
    Not really an analogy 
  • Reply 16 of 25
    I am so happy that Apple decided to go this route!
    (And that so many users protested against the initial plan!)
  • Reply 17 of 25
    "Slippery slope" is the absolute weakest form of argument due to the fact that "slippery slope" can be applied to anything and is entirely conjectural. Is the photo/video capability of smartphones a slippery slope towards making child pornography? Is the access to an internet browser a slippery slope to viewing child pornography? Is owning a computer a slippery slope to becoming a hacker who violates people's privacy? 

    I've posted this before but it's instructional when it comes to how bad "slippery slope" arguments are. During the original Edward Snowden hysteria about government surveillance powers, Yahoo! decided to sue the government over FISA subpoenas for their metadata. Yahoo!'s argument was that the FISA subpoenas would lead to abuse of power by the government. When the judge asked Yahoo! to provide examples of the government abusing FISA subpoena power they didn't have any. All they had was conjectural scenarios based on "slippery slope" arguments. Result? The case was thrown out of court. 
  • Reply 18 of 25
    When we read about systems like this, there's a tendency to imagine a perfect operation - even though our life experience gives us many contrary examples. For example, I'm not sure those always advocating for capital punishment would be so enthusiastic if they were more aware of incorrect verdicts (people later proved innocent by DNA testing or a confession by the real criminal), often because of fraudulent or incompetent representation.

    There have been stories of families being contacted about possible sexual crimes because they posted Facebook photos of their very young children innocently playing naked in a home swimming pool. My personal experience in the work environment - a medical clinic - was that once the IT department implemented a "sexual content" filter/reporting system, my colleagues and I would trigger an alert that required a visit to HR while doing work research. When looking up information about a dermatologist, for example, the website might feature cosmetic surgery with an example of breast adjustments, and that image would trigger the filter/alert.

    Years ago, a local childcare facility had to close down because of accusations of sexual abuse. Turned out, one child was upset they didn't get their way about something and made up the story to "get back" at the care center. Once one report was made, parents of the other kids started questioning them - I'm guessing there was a bit of a FOMO frenzy. Soon there were many reports.  But then, once there was time to do an actual investigation, one by one the accusations were withdrawn. Eventually, the facility was cleared of all wrongdoing. By then it was too late. Newspapers print "leading news" in big bold print. They print "oops - sorry" in small print where space will allow.

    So if you are going to have a system, especially an "automatic" system for detecting bad action, you should have a system just as robust - or more so - for corrections when the "evil detector" gets it wrong. Because that can ruin lives.


  • Reply 19 of 25
    badmonk said:
    They should also talk to parents who texted pictures of their children’s rash to the pediatrician and got caught up in CSAM investigations.  As bad as CSAM is, larger privacy issues are important as well.  Apple is doing the right thing here.
    Not a factor here, since their proposal used hash matching against known CSAM, not content analysis of new photos.
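
    For anyone wondering how that differs from "looking at your photos": below is a deliberately simplified, hypothetical sketch of matching against a database of known material. Apple's actual proposal used perceptual NeuralHash values and a private set intersection protocol, not a plain cryptographic hash lookup like this, but the principle is the same: a brand-new photo has nothing in the database to match.

    ```swift
    import Foundation
    import CryptoKit

    // Hypothetical stand-in for the vetted database of hashes of
    // previously identified material supplied by child-safety groups.
    let knownHashes: Set<String> = ["<hash of a previously identified image>"]

    // Simplified illustration: a new photo (a child's rash texted to a
    // pediatrician, say) has no entry in the database, so it cannot match.
    func matchesKnownDatabase(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
    ```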
  • Reply 20 of 25
    Rogue01 said:
    Kind of ridiculous that Apple got flak for trying to implement CSAM detection using a method designed to better protect privacy, while Google, Microsoft, and others have had full-blown CSAM scanning for years and hardly anyone talks about it.

    Google is even worse for using machine learning to try and identify images that aren't in the CSAM database, generating additional false positives.
    It is ridiculous that you don't understand how it works.  Google, Microsoft, and others scan the photos that USERS upload to their cloud services and photos are scanned on the cloud servers.  Apple wanted to scan every single photo stored on your device whether you like it or not.  Big difference, and big privacy violation.  Apple's attempt at scanning was also known to produce false positives, then a third party company would review those photos.
    Incorrect. Only pictures on accounts using iCloud Photos would have been checked, and only as a pre-upload step. The only difference from current iCloud (or Google or Microsoft or Twitter or Dropbox, etc.) is that it would have happened prior to upload rather than after.

    No local scanning for non-iCloud-stored photos. 

    Sounds like you don’t understand how it works (intended to work) either. 
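
    As a minimal, hypothetical sketch of the gating being described (the Photo type and flag name below are illustrative, not Apple's): matching was designed to run only on items already headed to iCloud, never as a sweep of the whole local library.

    ```swift
    import Foundation

    // Illustrative types only; not Apple's implementation.
    struct Photo {
        let data: Data
        let isQueuedForICloudUpload: Bool
    }

    // Only photos already on their way to the server are hashed, as a
    // pre-upload step; local-only photos are never touched.
    func photosToCheck(in library: [Photo]) -> [Photo] {
        library.filter { $0.isQueuedForICloudUpload }
    }
    ```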
    edited September 2023