Apple sued over 2022 dropping of CSAM detection features


A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.

Apple has retained nudity detection in images, but dropped some CSAM protection features in 2022.



Apple originally introduced a plan in 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hash-matching system. It would also warn users before they sent or received photos containing algorithmically detected nudity.
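
As a rough illustration of what hash-based detection means in practice, the sketch below (Swift, with hypothetical names throughout) checks an image's fingerprint against a set of known fingerprints. It is a simplification: Apple's proposal used a perceptual "NeuralHash" and a blinded on-device matching protocol, not a plain SHA-256 lookup.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known fingerprints. In Apple's proposal this came
// from child-safety organizations; here it is just an empty placeholder.
let knownHashes: Set<String> = []

// Hash the raw image bytes and render the digest as lowercase hex.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// A match means the exact same file is already in the database; unlike a
// perceptual hash, this exact-match check would miss resized or re-encoded copies.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}
```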

The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.

A 27-year-old woman who was sexually abused as a child by a relative is suing Apple, under a court-allowed pseudonym, over its decision to drop the CSAM-detection feature. She says she previously received a law-enforcement notice that images of her abuse were being stored in iCloud, via a MacBook seized in Vermont, when the feature was active.

In her lawsuit, she says Apple broke its promise to protect victims like her when it eliminated the CSAM-scanning feature from iCloud, and that by doing so it has allowed that material to be shared extensively.

The suit therefore argues that Apple is selling "defective products that harmed a class of customers" like herself.

More victims join lawsuit



The woman's lawsuit against Apple demands changes to Apple's practices, and potential compensation for a group of up to 2,680 other eligible victims, according to one of the woman's lawyers. The lawsuit notes that the CSAM-scanning features used by Google and Meta's Facebook catch far more illegal material than Apple's anti-nudity feature does.

Under current law, victims of child sexual abuse are entitled to a minimum of $150,000 in compensation. If all of the potential plaintiffs in the woman's lawsuit were to win, damages could exceed $1.2 billion if Apple is found liable.

In a related case, attorneys acting on behalf of a nine-year-old CSAM victim sued Apple in a North Carolina court in August. In that case, the girl says strangers sent her CSAM videos through iCloud links, and "encouraged her to film and upload" similar videos, according to The New York Times, which reported on both cases.

Apple filed a motion to dismiss the North Carolina case, arguing that Section 230 of the Communications Decency Act protects it from liability for material uploaded to iCloud by its users. It also said that it was protected from product liability claims because iCloud isn't a standalone product.

Court rulings soften Section 230 protection

Recent court rulings, however, could work against Apple's claims to avoid liability. The US Court of Appeals for the Ninth Circuit has determined that such defenses can only apply to active content moderation, rather than as a blanket protection from possible liability.

Apple spokesman Fred Sainz said in response to the new lawsuit that Apple believes "child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk."

Sainz added that "we are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."

He pointed to the expansion of the nudity-detection features to the Messages app, along with the ability for users to report harmful material to Apple.

The woman behind the lawsuit and her lawyer, Margaret Mabie, do not agree that Apple has done enough. In preparation for the case, Mabie dug through law enforcement reports and other documents to find cases related to her clients' images and Apple's products.

Mabie eventually built a list of more than 80 examples of the images being shared. One of the people sharing the images was a Bay Area man who was caught with more than 2,000 illegal images and videos stored in iCloud, the Times noted.



Read on AppleInsider


Comments

  • Reply 1 of 31
    Pure self promotion for that group suing Apple. Show me their case against Samsung or Microsoft for doing nothing. Crickets. 
  • Reply 2 of 31
    chasm Posts: 3,654 member
    I'm the last person to hop up to defend Microsoft, but in point of fact it does have CSAM/CSEM detection features:

    Microsoft Joins Thorn and All Tech Is Human to enact strong child ...

    I can't say about Samsung's efforts because I don't own any Samsung products, but those products using Android (from any manufacturer) probably rely on Google's anti-CSAM measures:


    This would have taken you under a minute to investigate yourself, as it did me. Just because APPLEInsider doesn't mention rival companies' CSAM policies doesn't mean they don't have any.
  • Reply 3 of 31
    jimh2 Posts: 679 member
    So deciding not to implement something is now a crime, even if your implementation would be faulty? We can never get out from underneath this madness until class action suits are eliminated. Let's get crazy... Apple had to remove/disable the blood oxygen capability after the ITC ruling, and later chose not to settle (license) with Masimo. Knowing that Apple knew how to implement it, had implemented it, and could continue to implement it: if someone had a heart issue that could have been identified by the blood oxygen app, should they be able to sue because of Apple's decision not to implement it in new watches?

    or

    Luxury cars typically have more safety features, or more advanced ones, than basic cars. A manufacturer adds pedestrian sensing with auto-braking to avoid hitting someone who steps in front of or behind the car when a collision is likely. A person is hit by a car that does not have the pedestrian-sensing auto-braking feature. Should the person who was hit be able to sue the auto manufacturer because it chose to exclude this safety feature for whatever reason (price/cost, model differentiation, design limitations, waiting for a major redesign, supply issues, etc.)? I see no difference.
      
  • Reply 4 of 31
    jdw Posts: 1,465 member
    I hate lawsuits.  Period.  And not just against Apple.  Americans are sue-happy nuts.

    It's usually the blood sucking lawyers who get people to ponder these brilliant ideas.  After all, their main job is to use the law to line their pockets. That's a big part of my strong stance against lawsuits in general.

    However, this particular lawsuit is worse because it potentially harms us all.  I was against the CSAM spyware from the get-go because it cannot be made fail-safe.  Some people would be wrongfully charged with crimes they didn't commit, yes, even with those promised "human reviews."  This is a big reason why many were against it, not just me.

    So basically, someone who was abused is abused yet again, albeit in a different way, by lawyers, who get them involved in these crazy lawsuits which have the potential to later harm a larger percentage of the population, if indeed Apple loses and is strong-armed to implement CSAM spyware.

    Leave it to Americans to ALWAYS cast blame when something very bad happens to them.  We need to eliminate the criminals without potentially making innocent people into criminals.  Stop the endless suing!
  • Reply 5 of 31
    netrox Posts: 1,519 member
    If it's encrypted end to end, how is it possible to detect CSAM? 


  • Reply 6 of 31
    chasm Posts: 3,654 member
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 


    That question is way above my pay grade to know, but my GUESS is that, as Apple mentioned in the original announcement, even an encrypted photo can be algorithmically searched for a certain pattern, in this case human nudity.

    Now lots of perfectly innocent pictures contain some level of nudity, so — again, guessing here — that is combined with where the image was sent and/or uploaded by. If any of the recipients or senders is a child account, that plus the nudity might flag it — and then law enforcement could be contacted to obtain a court order to decrypt.

    As mentioned previously when Apple abandoned the idea, this sadly opens the possibility of a government-mandated “backdoor” to iCloud Photos, which Apple really REALLY does not want (and if we had a sane government, they wouldn’t want it either). So they just dropped the policy and have to take some heat with the public and press when something like this comes up.

    That said, and again just a guess on how this came about, but I don’t think this poor young woman’s lawsuit is going to get very far. End users, even those with criminal intentions, have a right to privacy and the presumption of innocence until proven guilty. That means law enforcement cannot be PRO-ACTIVE but instead must be REACTIVE to suspicious possibilities.
  • Reply 7 of 31
    danox Posts: 3,479 member
    Before the Internet, programs basically all worked on the edge: everything you did stayed on your computer and went nowhere else. Photos and maybe some of the other programs Apple creates should revert to that. You could do whatever you want on your computer, on the edge, but Apple would no longer accept or store outside pictures or word-processor files on its servers. These days, if you want storage capacity on the edge, there are plenty of extremely large drives you can use, so forget hosting or backing up people's pictures or personal word files on Apple servers.

    If this lawsuit succeeds, people playing the victim to third parties will go on and on. The way the Internet currently works, people who host websites and advertisers like being able to send anything and everything back and forth, so naturally they're not gonna want a locked-down, on-the-edge type of solution. But something like this means Apple is gonna have to design something that makes individual users take some type of responsibility (by jumping through hoops and loading up new programs). Not being an iCloud user who stores personal Photos or Pages files online, and instead working on the edge, is fine.

    I look forward to the creation of laser-focused AI agents that will eliminate junk mail, junk messages, and much of the other driftwood that's on the Internet. The scam, advertising, and telephone companies are not gonna support the elimination of that crap. (Such a program won't come from Google, Microsoft, or Meta; however, they will build a pseudo me-too program after Apple's that claims to do something.)

    This will also affect all of the other photo and text-file programs from third parties beyond Apple. Back to the edge may be coming, like it or not.
  • Reply 8 of 31
    elijahg Posts: 2,876 member
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 


    It's scanned on-device before it's sent, or upon reception. Which is also a problem - your device suddenly has unremovable spyware installed, this is not ok. What happens when Putin adds into the signature database a picture of one of his adversaries - or someone who has wronged him? All of a sudden this is flagged up under the guise of someone having CSAM, and that person is smeared and will probably disappear. No amount of "checks" will stop a dictator in his own country from doing something like that. Who controls the signature database? A guarantee that someone hasn't added a false signature in is essentially impossible. 
  • Reply 9 of 31
    jimh2 said:
    So deciding not to implement something is now a crime, even if your implementation would be faulty? We can never get out from underneath this madness until class action suits are eliminated. Let's get crazy... Apple had to remove/disable the blood oxygen capability after the ITC ruling, and later chose not to settle (license) with Masimo. Knowing that Apple knew how to implement it, had implemented it, and could continue to implement it: if someone had a heart issue that could have been identified by the blood oxygen app, should they be able to sue because of Apple's decision not to implement it in new watches?

    I am enraged by the automatic inclusion in a class.  Under no circumstances should a person ever be involuntarily included in anything.  I don't like class action lawsuits, do not support class action lawsuits, but get lumped into them regardless.  Sure I can exclude myself, but I should not be required to spend even a moment of time extracting myself from something I did not ask to be a party to.  It's all about money for the lawyers!
  • Reply 10 of 31
    Apple's decision to abandon CSAM scanning was always dumb. Their iCloud user agreement has always included language that stated they can scan files for illegal content. And the user is in total control of whether or not they use iCloud for backup and which specific apps have files backed up.
  • Reply 11 of 31
    elijahg said:
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 


    It's scanned on-device before it's sent, or upon reception. Which is also a problem - your device suddenly has unremovable spyware installed, this is not ok. What happens when Putin adds into the signature database a picture of one of his adversaries - or someone who has wronged him? All of a sudden this is flagged up under the guise of someone having CSAM, and that person is smeared and will probably disappear. No amount of "checks" will stop a dictator in his own country from doing something like that. Who controls the signature database? A guarantee that someone hasn't added a false signature in is essentially impossible. 
    Slippery slope arguments can be made about anything. And totalitarian countries don't need "evidence" if they want to throw someone in prison or out of a window. That's the whole point of totalitarianism: no checks or balances. 
  • Reply 12 of 31
    hmurchison Posts: 12,441 member
    CSAM scanning is a privacy nightmare. I'm sorry that a small group of people have experienced abuse, but I'm just not into a future where people will store my data only as long as I give them carte blanche to check it for whatever they want. If Apple wants to enable CSAM scanning then it's only right of them to ensure that every iCloud feature is available to 3rd party storage providers.


  • Reply 13 of 31
    CSAM scanning is a privacy nightmare. I'm sorry that a small group of people have experienced abuse, but I'm just not into a future where people will store my data only as long as I give them carte blanche to check it for whatever they want.
    That's not the future. That's the past. Cloud storage companies always have language in their user agreements that allow scans for illegal content. Apple's terms have included that for many years. 
  • Reply 14 of 31
    sbdude Posts: 294 member
    That's not the future. That's the past. Cloud storage companies always have language in their user agreements that allow scans for illegal content. Apple's terms have included that for many years. 
    And that somehow makes it not a privacy nightmare? The right to privacy is enshrined in the California State Constitution. How do you propose Apple, a California Corporation, gets around that?
  • Reply 15 of 31
    williamh Posts: 1,048 member
    chasm said:
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 


    That question is way above my pay grade to know, but my GUESS is that, as Apple mentioned in the original announcement, even an encrypted photo can be algorithmically searched for a certain pattern, in this case human nudity.

    Now lots of perfectly innocent pictures contain some level of nudity, so — again, guessing here — that is combined with where the image was sent and/or uploaded by. If any of the recipients or senders is a child account, that plus the nudity might flag it — and then law enforcement could be contacted to obtain a court order to decrypt.
    The basic concept of CSAM scanning doesn't involve searching for patterns in your images, not nudity or anything else.  The way it worked was to compare hashes of your images to a database of hashes of known CSAM images.  The database came from the National Center for Missing and Exploited Children which maintains hashes of the images that were proven in criminal cases to be of minors.

    The concerns of having CSAM on our devices as part of the detection were unwarranted and based on a misunderstanding of how the system works.  A potential valid concern is the probability of hash collisions.  I recall Apple's response on that was that they weren't going to alert on single matches.
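
    As a rough sketch of that thresholding idea, the snippet below counts matches and only flags past a cutoff, so a single hash collision never trips anything on its own. It is illustrative only: Apple's actual design used cryptographic safety vouchers rather than a plain counter, and the threshold value here is a placeholder, not Apple's number.

    ```swift
    // Illustrative only -- not Apple's safety-voucher scheme.
    struct MatchThreshold {
        var required: Int   // placeholder cutoff, not Apple's actual figure
        var matches = 0

        // Record one scan result; return true only once the cutoff is reached.
        mutating func record(didMatch: Bool) -> Bool {
            if didMatch { matches += 1 }
            return matches >= required
        }
    }

    var tracker = MatchThreshold(required: 30)      // hypothetical threshold
    let flagged = tracker.record(didMatch: true)    // false: one match alone never flags
    ```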
  • Reply 16 of 31
    jSnively Posts: 447 administrator
    williamh said:
    chasm said:
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 
    That question is way above my pay grade to know, but my GUESS is that, as Apple mentioned in the original announcement, even an encrypted photo can be algorithmically searched for a certain pattern, in this case human nudity.

    Now lots of perfectly innocent pictures contain some level of nudity, so — again, guessing here — that is combined with where the image was sent and/or uploaded by. If any of the recipients or senders is a child account, that plus the nudity might flag it — and then law enforcement could be contacted to obtain a court order to decrypt.
    The basic concept of CSAM scanning doesn't involve searching for patterns in your images, not nudity or anything else.  The way it worked was to compare hashes of your images to a database of hashes of known CSAM images.  The database came from the National Center for Missing and Exploited Children which maintains hashes of the images that were proven in criminal cases to be of minors.

    The concerns of having CSAM on our devices as part of the detection were unwarranted and based on a misunderstanding of how the system works.  A potential valid concern is the probability of hash collisions.  I recall Apple's response on that was that they weren't going to alert on single matches.

    In a truly E2E (iCloud Photos is not, unless you turn it on) system, hash scanning doesn't work, as Apple never sees the original content on their servers. The only way to do it is to scan on device before anything goes anywhere. Apple's original announcement around this effort was somewhat convoluted (but actually pretty smart) and didn't trigger on single matches (for the false-positive reasons you mention), but it did have to do with scanning actual content, not just hashes, which is why there's probably confusion.
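
    A toy model of that ordering, just to make the point concrete (this is not Apple's private-set-intersection or voucher protocol, and every name in it is made up):

    ```swift
    import Foundation

    // The match check has to run on the device while the plaintext still exists;
    // the server only ever receives ciphertext it cannot hash against anything.
    struct EncryptedUpload {
        let ciphertext: Data    // the server can store this but not read it
        let matchVoucher: Bool  // stand-in for an opaque cryptographic voucher
    }

    func prepareUpload(imageData: Data,
                       encrypt: (Data) -> Data,
                       matchesKnownHashes: (Data) -> Bool) -> EncryptedUpload {
        // 1. Check locally, before encryption, while the raw bytes are available.
        let matched = matchesKnownHashes(imageData)
        // 2. Encrypt for upload; past this point no server-side comparison is possible.
        return EncryptedUpload(ciphertext: encrypt(imageData), matchVoucher: matched)
    }
    ```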

  • Reply 17 of 31
    chasm Posts: 3,654 member
    CSAM scanning is a privacy nightmare. I'm sorry that a small group of people have experienced abuse, but I'm just not into a future where people will store my data only as long as I give them carte blanche to check it for whatever they want. If Apple wants to enable CSAM scanning then it's only right of them to ensure that every iCloud feature is available to 3rd party storage providers.


    With respect:

    A. You seem to suggest that the number of CSA victims is small, and I assure you that is not the case.

    B. Apple doesn’t want to enable CSAM scanning, that’s made clear in the article. But they DO want to stop criminals using iCloud to distribute illegal material, while respecting the privacy of users. That’s the big problem they haven’t solved yet.

    Essentially, Apple has landed on the position that it will cooperate with law enforcement in existing investigations into CSAM material, but will not be the SOURCE of such an investigation.
  • Reply 18 of 31
    chasm Posts: 3,654 member

    williamh said:
    The basic concept of CSAM scanning doesn't involve searching for patterns in your images, not nudity or anything else.  The way it worked was to compare hashes of your images to a database of hashes of known CSAM images.  The database came from the National Center for Missing and Exploited Children which maintains hashes of the images that were proven in criminal cases to be of minors.
    Thank you for the correction and enlightenment.
  • Reply 19 of 31
    entropys Posts: 4,331 member
    elijahg said:
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 


    It's scanned on-device before it's sent, or upon reception. Which is also a problem - your device suddenly has unremovable spyware installed, this is not ok. What happens when Putin adds into the signature database a picture of one of his adversaries - or someone who has wronged him? All of a sudden this is flagged up under the guise of someone having CSAM, and that person is smeared and will probably disappear. No amount of "checks" will stop a dictator in his own country from doing something like that. Who controls the signature database? A guarantee that someone hasn't added a false signature in is essentially impossible. 
    Slippery slope arguments can be made about anything. And totalitarian countries don't need "evidence" if they want to throw someone in prison or out of a window. That's the whole point of totalitarianism: no checks or balances. 
    Unfortunately slippery slope arguments have been proven right. The activists just move to the next step, as their activism is how they define themselves. 

    And even the totalitarian needs to know who they “need” to defenestrate. Why have one in 300 people working for the Stasi when you can have everyone’s phones do the work for them?
  • Reply 20 of 31
    eriamjh Posts: 1,782 member
    SO... Apple is being sued under some kind of "you should have done something" concept?