Apple accused of using privacy to excuse ignoring child abuse material on iCloud

A proposed class action suit alleges that Apple hides behind privacy claims to avoid stopping the storage of child sexual abuse material on iCloud, as well as alleged grooming over iMessage.

Apple cancelled its major CSAM proposals but introduced features such as automatic blocking of nudity sent to children



Following a UK organization's claims that Apple is vastly underreporting incidents of child sexual abuse material (CSAM), a new proposed class action suit says the company is "privacy-washing" its responsibilities.

A new filing with the US District Court for the Northern District of California has been brought on behalf of an unnamed 9-year-old plaintiff. The complaint, which lists her only as Jane Doe, says that she was coerced into making and uploading CSAM on her iPad.

"When 9-year-old Jane Doe received an iPad for Christmas," says the filing, "she never imagined perpetrators would use it to coerce her to produce and upload child sexual abuse material ("CSAM") to iCloud."

The suit asks for a trial by jury. Damages exceeding $5 million are sought for every person encompassed by the class action.

It also asks in part that Apple be forced to:


  • Adopt measures to protect children against the storage and distribution of CSAM on iCloud

  • Adopt measures to create easily accessible reporting

  • Comply with quarterly third-party monitoring



"This lawsuit alleges that Apple exploits 'privacy' at its own whim," says the filing, "at times emblazoning 'privacy' on city billboards and slick commercials and at other times using privacy as a justification to look the other way while Doe and other children's privacy is utterly trampled on through the proliferation of CSAM on Apple's iCloud."

The child cited in the suit was initially contacted over Snapchat. The conversations were then moved to iMessage before she was asked to record videos.

Snapchat is not named as a co-defendant in the suit.

Apple did attempt to introduce more stringent CSAM detection tools, but it abandoned them in 2022 following allegations that the same tools would lead to surveillance of any or all users.




Comments

  • Reply 1 of 12
    blastdoor Posts: 3,520 member
    I’d say people who want to undermine privacy protections are using children as a cover for their goals.
  • Reply 2 of 12
    Looks like the FBI would like to see end-to-end encryption forbidden. This would be an absolute disaster for privacy!
  • Reply 3 of 12
    Where were the parents when she was uploading to the cloud?
  • Reply 5 of 12
    Yes, it's all Apple's fault for not being a more responsible parent. /s
  • Reply 6 of 12
    blastdoor said:
    I’d say people who want to undermine privacy protections are using children as a cover for their goals.
    This, 100%. We can live in a world that has horrible people abusing children and also privacy for everyone else who needs it, or we can live in a world where we still have horrible people abusing children but privacy is a dream for all but the rich and powerful. It's not a trade-off, at least not in the direction the anti-privacy lobby would have you believe. The world will be far worse on net if we destroy the right to privacy.
  • Reply 7 of 12
    mike1 Posts: 3,388 member
    "When 9-year-old Jane Doe received an iPad for Christmas," says the filing, "she never imagined perpetrators would use it to coerce her to produce and upload child sexual abuse material ("CSAM") to iCloud."

    Probably not, but her parents should have.

    How could all of the alleged actions have taken place without the parents knowing? Apparently it was more than once. She was 9, not 16.
    Parents should be the ones being targeted.

    Hate when people proclaim "My cause is the most important thing in the world ever and nothing else even matters."



  • Reply 8 of 12
    One day, the title will become:

    “Apple accused of using CSAM detection to ignore privacy protection on iCloud”


    To know whether a piece of information is lawful or evil, you have to look into it, investigate it, examine it, check it, or even compare it to some other information, and none of those activities are supposed to be compatible with "privacy". Not to mention that you would need to know what and where the targeted information is: text? Pictures? Video? Voice messages? And when did it happen?
    Scan everything related to iCloud in real time, for all users around the world? Or how about once a month?
    I am not saying that Apple should do nothing, but what should it do, and how?
    You are asked to find something without being allowed to touch it, or even to have a look or a peek.
    Even if a computer or AI looks for CSAM, a human review process still has to make the final judgement before it becomes a case. What if it is a picture of someone just standing next to a child mannequin? False alarm, and that someone's privacy was exposed. What about a wife recording the father changing a diaper for the very first time? False alarm, but the naked baby was exposed. Then what?
  • Reply 9 of 12
    entropys Posts: 4,254 member
    blastdoor said:
    I’d say people who want to undermine privacy protections are using children as a cover for their goals.

  • Reply 10 of 12
    I thought Apple was already scanning stuff for CSAM after it's uploaded to iCloud? Did they stop, or is it alleged that the scanning isn't good enough?

    As I recall, the kerfuffle last year(?) was because Apple was going to use on-device resources to scan material before it was uploaded, using a database of CSAM images that could potentially be compromised by an authoritarian government to include images that have nothing to do with CSAM, but which that government doesn't like.
  • Reply 11 of 12
    I thought Apple was already scanning stuff for CSAM after it's uploaded to iCloud? Did they stop, or is it alleged that the scanning isn't good enough?

    As I recall, the kerfuffle last year(?) was because Apple was going to use on-device resources to scan material before it was uploaded, using a database of CSAM images that could potentially be compromised by an authoritarian government to include images that have nothing to do with CSAM, but which that government doesn't like.
    The kerfuffle was that Apple's CSAM detection tool could be made to detect other things if it were fed something other than a CSAM database. The same way that if an authoritarian government forced them to make your iPhone into a gun that would murder people they didn't like, that would also be a security threat.

    Basically, a bunch of politicians told Apple they'd make their lives hell if they protected kids from the politicians' donors.
  • Reply 12 of 12
    davidw Posts: 2,096 member
    mike1 said:
    "When 9-year-old Jane Doe received an iPad for Christmas," says the filing, "she never imagined perpetrators would use it to coerce her to produce and upload child sexual abuse material ("CSAM") to iCloud."

    Probably not, but her parents should have.

    How could all of the alleged actions have taken place without the parents knowing? Apparently it was more than once. She was 9, not 16.
    Parents should be the ones being targeted.

    Hate when people proclaim "My cause is the most important thing in the world ever and nothing else even matters."




    That's like giving your young kid a Red Ryder BB gun for Xmas and not knowing that your kid can ..... shoot their eye out.

    But really, how difficult would it have been for the parents to find out about and enable "Parental Controls" on the iPad (with internet access) they gave to their young kid? IIRC, even first-generation iPads still have some form of "Parental Controls" (though not nearly as robust as on ones running iOS 12 and later), which might have prevented this.







