Activists rally at Apple Park for reinstatement of child safety features

Posted in General Discussion, edited June 12

Protesters at Apple Park are demanding that the company reinstate its recently abandoned child safety measures.

Group of protesters at Apple Park Visitor Center holding signs advocating against child abuse and for a future without abuse.
Image credit: Brooke Anderson (@MovementPhotog on X)



Nearly three dozen protesters gathered around Apple Park on Monday morning, carrying signs that read, "Why Won't you Delete Child Abuse?" and "Build a future without abuse."

The group, known as Heat Initiative, is demanding that the iPhone maker bring back its child sexual abuse material (CSAM) detection tool.

"We don't want to be here, but we feel like we have to," Heat Initiative CEO Sarah Gardner told Silicon Valley. "This is what it's going to take to get people's attention and get Apple to focus more on protecting children on their platform."

Today, we showed up at #WWDC, with dozens of survivors, advocates, parents, community + youth leaders and allies, calling on Apple to stop the spread of child sexual abuse on their platforms now.

We must build a future without abuse. pic.twitter.com/hky9eh3dFD

-- heatinitiative (@heatinitiative)



In 2021, Apple had planned to roll out a new feature capable of comparing iCloud photos to a known database of CSAM images. If a match were found, Apple would review the image and report any relevant findings to the National Center for Missing & Exploited Children (NCMEC), an agency that serves as a reporting center for child abuse material and works with law enforcement agencies across the U.S.

After significant backlash from multiple groups, the company paused the program in mid-December of the same year, and then ultimately abandoned the plan entirely the following December.

Diagram showing Apple's process for detecting CSAM in photos. On-device matching, followed by safety vouchers, uploading to Apple, threshold check, then review by Apple, and report to NCMEC.
Apple's abandoned CSAM detection tool
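
In rough terms, the flow in the diagram boils down to the short sketch below. This is illustration only, not Apple's implementation: the hash function, database entries, and photos are placeholders, and the real design relied on NeuralHash, blinded on-device matching, encrypted safety vouchers, and a server-side threshold check rather than anything this simple. The roughly 30-match threshold is the figure Apple described publicly.

```python
# A minimal sketch of the diagram's flow: hash on device, match against a known
# database, and only escalate once a threshold of matches is crossed.
# All values below are stand-ins, not real hashes or real photo data.
import hashlib

KNOWN_HASH_DATABASE = {"a1b2c3", "d4e5f6"}  # placeholder entries
MATCH_THRESHOLD = 30  # Apple publicly described a threshold of roughly 30 matches


def perceptual_hash(photo_bytes: bytes) -> str:
    """Placeholder for an on-device perceptual hash such as NeuralHash."""
    return hashlib.sha256(photo_bytes).hexdigest()[:6]


def count_matches(icloud_bound_photos: list[bytes]) -> int:
    """On-device matching: hash each iCloud-bound photo and check the database."""
    return sum(perceptual_hash(p) in KNOWN_HASH_DATABASE for p in icloud_bound_photos)


def should_flag_for_review(icloud_bound_photos: list[bytes]) -> bool:
    """Threshold check: only past this point would human review, and a possible
    report to NCMEC, ever happen."""
    return count_matches(icloud_bound_photos) >= MATCH_THRESHOLD


if __name__ == "__main__":
    photos = [b"example photo 1", b"example photo 2"]
    print(should_flag_for_review(photos))  # False for these stand-in photos
```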



When asked why, Apple said the program would "create new threat vectors for data thieves to find and exploit," and worried that these vectors would compromise security, a topic Apple prides itself on taking very seriously.

While Apple never officially rolled out its CSAM detection system, it did roll out a feature that alerts children when they receive or attempt to send content containing nudity in Messages, AirDrop, FaceTime video messages and other apps.

However, protesters don't feel that this system does enough to hold predators accountable for possessing CSAM, and would rather Apple reinstate the detection system it abandoned.

"We're trying to engage in a dialog with Apple to implement these changes," protester Christine Almadjian said on Monday. "They don't feel like these are necessary actions."

This isn't the first time that Apple has butted heads with Heat Initiative, either. In 2023, the group launched a multi-million-dollar campaign against Apple.



Read on AppleInsider

Comments

  • Reply 1 of 11
    In 2021, Apple had planned to roll out a new feature capable of comparing iCloud photos to a known database of CSAM images.

As I recall, one of the major bones of contention was that the system was going to scan photos destined for iCloud, but not yet actually there, and was using resources on the user's phone itself, not iCloud resources, to do the actual scanning. There were also, if I remember correctly, concerns about mistakes, as happened with a Google attempt to do the same thing, which flagged a file with a single character in it as problematic and locked an account.

  • Reply 2 of 11
beowulfschmidt said: As I recall, one of the major bones of contention was that the system was going to scan photos destined for iCloud, but not yet actually there, and was using resources on the user's phone itself, not iCloud resources, to do the actual scanning. There were also, if I remember correctly, concerns about mistakes, as happened with a Google attempt to do the same thing, which flagged a file with a single character in it as problematic and locked an account.
    It was a controversy that didn't really make much sense. The files that would be scanned were the same regardless of whether the scan happened in the cloud or on the phone itself. The user would choose whether or not to use iCloud for file backup. If they did choose to do so they would have to agree to Apple's terms for iCloud (which always include file scanning) and then choose which applications would have files backed up. Only the files from the applications that the user chose to use with iCloud would be scanned. So there's no actual difference to the files being scanned on the phone or in the cloud. Nothing would change in terms of what files were being scanned. 
edited June 11
  • Reply 3 of 11
DAalseth Posts: 2,966 member
Maybe I’m a cynic, but this protest strikes me as more like astroturfing by some group with an agenda than a real protest.
  • Reply 4 of 11
chasm Posts: 3,502 member
    If these folks are seriously concerned about preventing CSAM distribution, why aren’t they permanently encamped at Meta, Xitter, and Google’s headquarters?

    I mean, if they are serious about trying to trample constitutional limits on the right to privacy and are pro-preemptive state searching without the presumption of innocence, they’d be much more likely to get further with those companies (who handle far more volume of cloud transportation of data than Apple anyway).
  • Reply 5 of 11
gatorguy Posts: 24,584 member
    chasm said:
    If these folks are seriously concerned about preventing CSAM distribution, why aren’t they permanently encamped at Meta, Xitter, and Google’s headquarters?

    I mean, if they are serious about trying to trample constitutional limits on the right to privacy and are pro-preemptive state searching without the presumption of innocence, they’d be much more likely to get further with those companies (who handle far more volume of cloud transportation of data than Apple anyway).
    https://protectingchildren.google/#fighting-abuse-on-our-own-platform-and-services

    Apple wants to do the right thing, and had everything in place to do so (perhaps going one step too far) but is struggling with how to do so now without stirring up the minions.
edited June 11
  • Reply 6 of 11
bharper Posts: 1 member
beowulfschmidt said: As I recall, one of the major bones of contention was that the system was going to scan photos destined for iCloud, but not yet actually there, and was using resources on the user's phone itself, not iCloud resources, to do the actual scanning. There were also, if I remember correctly, concerns about mistakes, as happened with a Google attempt to do the same thing, which flagged a file with a single character in it as problematic and locked an account.
    It was a controversy that didn't really make much sense. The files that would be scanned were the same regardless of whether the scan happened in the cloud or on the phone itself. The user would choose whether or not to use iCloud for file backup. If they did choose to do so they would have to agree to Apple's terms for iCloud (which always include file scanning) and then choose which applications would have files backed up. Only the files from the applications that the user chose to use with iCloud would be scanned. So there's no actual difference to the files being scanned on the phone or in the cloud. Nothing would change in terms of what files were being scanned. 
    I agree, this was impossible to talk about because Apple did a terrible job announcing it. It wasn't clear that this was only for iCloud-bound pictures, ONLY pictures that Apple would store on their servers. By the time they clarified that, it was too late and "Apple will scan photos on your phone and report you" was the only thing people heard.

There was some interesting technology behind it, and it would likely have been the first widespread use of fully homomorphic encryption (the ability to run operations on data without decrypting it first). They came up with a whole host of techniques like private set intersection (a toy sketch of the idea is at the end of this comment) that would have provided security guarantees in the cloud: Apple wouldn't have had the ability to view your photos without a threshold of positive matches being met. And even people who understood the technology compared it against some ideal system that doesn't exist, instead of viewing it as an improvement over the current "cloud provider can see 100% of the pictures you send them" system.

    I actually lost some respect for the EFF during this whole episode. I still think they generally do good advocacy, but in this instance they totally failed to evaluate the proposal honestly.
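
For anyone curious, here is a toy sketch of the private-set-intersection idea using commutative exponentiation. Everything in it (the prime, the hashes, the example sets) is made up for illustration; it is not Apple's protocol and is not secure as written, but it shows how two parties can learn how many items they have in common without revealing the items themselves.

```python
# Toy private set intersection via commutative exponentiation.
# Illustration only: not Apple's protocol, and not secure as written.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime, used here only as a toy modulus


def hash_to_group(item: bytes) -> int:
    """Map an item to a nonzero group element (toy random-oracle stand-in)."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % (P - 2) + 2


def blind(items: set[bytes], secret: int) -> set[int]:
    """Raise each hashed item to a party's secret exponent mod P."""
    return {pow(hash_to_group(x), secret, P) for x in items}


# Each side holds a secret exponent it never reveals.
server_secret = secrets.randbelow(P - 2) + 1
device_secret = secrets.randbelow(P - 2) + 1

known_hashes = {b"hashA", b"hashB", b"hashC"}  # stand-in for a known-image database
device_hashes = {b"hashB", b"hashX"}           # stand-in for a device's photo hashes

# Each side blinds its own set, then the other side blinds it a second time.
# Because (g**a)**b == (g**b)**a mod P, shared items collide after double
# blinding even though neither side ever sees the other's raw values.
server_twice = {pow(x, device_secret, P) for x in blind(known_hashes, server_secret)}
device_twice = {pow(x, server_secret, P) for x in blind(device_hashes, device_secret)}

print("items in common:", len(server_twice & device_twice))  # prints 1
```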
  • Reply 7 of 11
40domi Posts: 138 member
Apple made a mistake by pulling this safety feature; they should have stuck to their guns!
  • Reply 8 of 11
elijahg Posts: 2,825 member
A big part of the resistance was that this would enable less 'friendly' governments to add particular images they didn't approve of to the database, CSAM or not. It could potentially flag images, and therefore people, that had anti-government photos in their library. Imagine a pro-Ukrainian Russian citizen screenshotting some article to show a friend, but Putin had screenshots of that same article pushed into Russia's national CSAM database. All of a sudden that citizen has fallen out of a 10th-story window. It is a lot harder to quietly compel Apple to write spyware for iOS than it is to simply add a signature to a database.
    mjpbuyVictorMortimerwatto_cobra
  • Reply 9 of 11
    elijahg said:
A big part of the resistance was that this would enable less 'friendly' governments to add particular images they didn't approve of to the database, CSAM or not. It could potentially flag images, and therefore people, that had anti-government photos in their library. Imagine a pro-Ukrainian Russian citizen screenshotting some article to show a friend, but Putin had screenshots of that same article pushed into Russia's national CSAM database. All of a sudden that citizen has fallen out of a 10th-story window. It is a lot harder to quietly compel Apple to write spyware for iOS than it is to simply add a signature to a database.

This part. The scanning process would have been under Apple's control. The database of image signatures was not. The probability of compromise by at least one tyrannical and/or fascist government is close enough to 1 not to matter.
  • Reply 10 of 11
jvm156 Posts: 29 member
    No one wants this. Get over it
  • Reply 11 of 11
    Absolutely unacceptable.

    The "but think of the children" crowd needs to get over it.  On-device scanning and reporting of ANYTHING is incredibly dangerous and should NEVER happen.  Today it's kiddy porn, tomorrow it's tracking of women's reproductive choices or criticism of the government.
