Child safety advocacy group launches campaign against Apple


Heat Initiative, a child safety advocacy group, is launching a multimillion-dollar campaign against Apple to pressure the company into reviving its abandoned plans for iCloud CSAM detection.

Apple's abandoned child protection feature

Heat Initiative said it would launch the campaign against Apple after pressing the tech giant on why it had abandoned plans for on-device and iCloud Child Sexual Abuse Material (CSAM) detection tools. The launch comes after Apple gave its most detailed response yet as to why it backed off those plans, citing the need to uphold user privacy.

In response to Apple, Heat Initiative has officially launched its campaign website. The advocacy group displays a statement on the front page that reads, "Child sexual abuse is stored on iCloud. Apple allows it."

"Apple's landmark announcement to detect child sexual abuse images and videos in 2021 was silently rolled back, impacting the lives of children worldwide," the statement continues. "With every day that passes, there are kids suffering because of this inaction, which is why we're calling on Apple to deliver on their commitment."

The website contains case studies detailing multiple alleged instances in which iCloud was used to store sexual abuse material, including photos, videos, and explicit messages.

It calls on Apple to "detect, report, and remove sexual abuse images and videos from iCloud," as well as "create a robust reporting mechanism for users to report child sexual abuse images and videos to Apple."

The group also provides a copy of a letter it sent directly to Tim Cook, saying that it was "shocked and discouraged by your decision to reverse course and not institute" CSAM detection measures.

It also includes a button that allows visitors to send a prewritten email to the entire Apple executive team demanding action.

Child Sexual Abuse Material is a severe, ongoing concern that Apple attempted to address with on-device and iCloud detection tools. The controversial tools were ultimately abandoned in December 2022, leaving more controversy in their wake.

The Heat Initiative is not alone in its quest. As spotted by 9to5Mac, Christian Brothers Investment Services is filing a shareholder resolution on the topic, and Degroof Petercam is backing it.



Read on AppleInsider

Comments

  • Reply 1 of 28
    Apple's use of a "slippery slope" argument is entirely unconvincing and lazy. 
  • Reply 2 of 28
    Honestly I think they're damned if they do and damned if they don't. People will get pissed about privacy on one side which isn't good for Apple and then advocacy groups get pissed on the other side. What are they gonna do....
  • Reply 3 of 28
chasm Posts: 3,306 member
    Again with a group of idiots who think Apple can make a “backdoor” that “only the good guys” have access to.

    You either have privacy, or you don’t. Full stop.

    Apple won’t respond to this any further, I suspect, but if they did — they should demand that the Heat Initiative outline a system that would reveal CSAM images *only* but protect a user’s right to privacy in *all* other ways.

    Good luck with that. If Apple’s own engineers tried and failed, it’s very unlikely anyone else could, and certainly not this fringe outfit.
  • Reply 4 of 28
entropys Posts: 4,168 member
We can compel companies to add a feature now?

    Awesome. I want my next iPhone to make me very handsome, and extraordinarily…gifted.


  • Reply 5 of 28
jdw Posts: 1,340 member
    Pretty horrible for any group to use children to get what they want.  They know people will sympathize only because they say they are doing it to protect the innocents.  And while it is true their entire organization exists to protect kids, what they are trying to achieve with Apple will negatively impact most people, for reasons Apple so eloquently stated.  And this is why their use of kids in this fight is a below the belt punch. They couldn't care less if what Apple does opens the door to hell itself.  They only want what they want "to help kids."  Even negative consequences be darned!

    With that said, Googling "Heat Initiative" and "Sarah Gardner" yields pretty much nothing.  It's like they don't even exist.  Looks very shady to me.  Not sure why Apple even gave such an invisible and seemingly non-existent organization the time of day.  We need Apple Insider to sleuth out more details about this mysterious group that's demanding to remove YOUR privacy protections currently in place.
  • Reply 6 of 28
JonG Posts: 24 unconfirmed, member
    Apple is not the government, nor are they law enforcement, I don't want them trying to act as either.

    Also, these groups can bleat all they want, but are they going to ban Math classes in college?  Anyone with a bit of math knowledge can make an encryption scheme (or simply grab one of a few hundred thousand apps out there) and encrypt the files themselves.  They can put their illegal files into an encrypted archive, or into an encrypted disk image.

    Would these groups like to try to ban encryption?  What they are asking for goes far beyond a "slippery slope", it is willful stupidity about the way the world, and computers, work. There is no silver bullet that will stop the abuse and exploitation of children.  It is a repugnant crime that has gone on throughout history.

    Apple is not facilitating these crimes, people are using their services while they commit a crime.

    Are we going to start ordering car companies to limit the speed a car can travel?  I mean cars can go far faster than the speed limit, and it is super easy for the car to know the speed limit of the road it is on, so why don't we simply prevent cars from going faster than that?  What about all the cameras in cars, should we also demand that those are networked and monitored by the car companies so that if someone injures someone with their car, they can be caught by the car company?

    Never allow private companies to do the job of police, they aren't trained for it, and then when they inevitably turn over a parent for taking a picture of their baby in the bath they will get gutted by a lawsuit for defamation.
  • Reply 7 of 28
    Well let’s just start by openly publishing all of the information any Heat Initiative employees, donors and their families have in iCloud. 
  • Reply 8 of 28
    One more thing: no scanning on my device. Privacy first. Period.
  • Reply 9 of 28
avon b7 Posts: 7,703 member
    macxpress said:
    Honestly I think they're damned if they do and damned if they don't. People will get pissed about privacy on one side which isn't good for Apple and then advocacy groups get pissed on the other side. What are they gonna do....
    Yes.

    I agree that the complexities surrounding the issue mean that there is no solution that will make all sides happy but I will take issue with the crackpot extreme sensationalist approach of things like this:

    "Child sexual abuse is stored on iCloud. Apple allows it"

No matter how concerned you may be about a given situation, you shouldn't be bandying claims around in that fashion.
  • Reply 10 of 28
davidw Posts: 2,053 member
    >"Child sexual abuse is stored on iCloud. Apple allows it."<

This child safety advocacy group should either put up or shut up. If they have evidence that Apple is allowing CSAM to be stored on its servers, then why don't they report their findings to law enforcement? It is illegal to be in possession of any CSAM, even unknowingly. Law enforcement could force Apple to shut down its servers if that were the case.

In the US, there are no laws requiring any service to actively search for CSAM. If there were such laws, then a court-issued search warrant for every account holder would be required to perform any CSAM search, as the service would be performing the search on behalf of the government. So even though these services are not the government, the US Constitution would still apply to them.

All services performing CSAM searches on their customers' data are doing so voluntarily, not because they are required to by any US law. However, if a service finds CSAM while voluntarily performing such a search, finds it while performing other functions of the service, or receives a report that CSAM might be in an account on its servers, then by law it is required to report the finding to law enforcement. So long as Apple is at least reporting to law enforcement any CSAM it finds on its servers (whether it was looking for it or not), Apple is not breaking any laws and is not allowing CSAM on its servers.

Right now (and it's been around for quite a while), technology exists that will keep a car from starting unless the driver passes a built-in breathalyzer. Tens of thousands of people (many of them innocent children) are killed and injured every year (in the US alone) by people driving drunk. We don't see MADD (Mothers Against Drunk Driving) running a million-dollar ad campaign against automakers, claiming automakers allow drivers to drive drunk because they are not installing such a device in the cars they sell. Surely, automakers could help prevent drunk driving by forcing all drivers to breathe into a breathalyzer (and pass) before the car will start. MADD has enough common sense to know that just because automakers are not installing such a device in all their cars, it doesn't mean they are somehow allowing drivers to drive drunk.

And then we have to deal with the Apple haters who will claim Apple allows CSAM on its iCloud servers out of greed, for the profits from selling iCloud storage space.
  • Reply 11 of 28
wood1208 Posts: 2,913 member
I'm not against child advocacy groups; in fact, I support them wholeheartedly. As many have clearly said, it comes down to privacy vs. non-privacy. What about Android, MS Windows, macOS, and all the databases in or out of the cloud storing encrypted child abuse and porn material?
The best way to stop child abuse is to reduce the human footprint on Earth so everyone has the basic necessities and people don't exploit children for money. Make parents responsible, because they produce children without taking responsibility for raising them right. If a child is found sexually abused and in porn, go after his or her parents. Implement strict laws, including the death penalty, for those involved in the production and distribution of child porn.

Going after Apple is not going to solve the child sexual abuse problem. In fact, it can create other problems due to the loss of privacy.
  • Reply 12 of 28
dewme Posts: 5,376 member
    avon b7 said:
    macxpress said:
    Honestly I think they're damned if they do and damned if they don't. People will get pissed about privacy on one side which isn't good for Apple and then advocacy groups get pissed on the other side. What are they gonna do....
    Yes.

    I agree that the complexities surrounding the issue mean that there is no solution that will make all sides happy but I will take issue with the crackpot extreme sensationalist approach of things like this:

    "Child sexual abuse is stored on iCloud. Apple allows it"

No matter how concerned you may be about a given situation, you shouldn't be bandying claims around in that fashion.
    Agree with both of you. No matter what Apple does, the amazing height of their money pile is always going to attract victims. 

    But that’s not a complete picture either. If you can blame Apple for what gets stored on their cloud servers, who’s next, hard drive and SSD manufacturers? 

    Not them, because they just provide the storage media. So you go after the device driver developers who actually move the bits to the media? 

    Still not enough, so you go after the apps that call the APIs that package up the files that get sent to the APIs provided by the device drivers that send the bits to the storage devices? 

    No matter how many layers of what should be opaque technology you start backing up into, it always ends right back where it started - a human with cognition and agency who is commanding it all and deciding what to filter and send down through a processing chain that reacts to the human’s will and intentions, whether good or bad. 

It’s always a real person: not a corporate entity, not a string of technical claptrap, but some amoral pervert with a twisted sense of what’s right and what’s wrong. That’s exactly where the blame lies.

    Unfortunately, to assign blame and enforce appropriate justice where it needs to be dealt with, we need to catch the bad actors doing the bad things they do. That’s where the problem gets messy and requires more thought and consideration. If we’re truly committed as a society to catching the bad guys doing bad things we need to be pushing solutions from the social level, not the technical level. 

It’s one thing to have public agencies deciding that traps need to be put in place to catch the bad guys doing bad things. It’s quite another when a technology provider decides that they are going to take responsibility for trapping and catching the bad guys on their own. That’s where Apple’s good intentions fell off the rails.

They saw an opportunity to intercede and proposed doing something about it. This may have been a defensive move to avoid future blame, but in doing so they set themselves up as an enforcer for something they should not bear sole responsibility for. Apple should instead push those who should take responsibility to actually take it; by stepping in and providing what it saw as a solution, Apple would allow those who need to be in charge to defer responsibility to a corporate entity, one whose profits and fiscal health would be put at great risk.

    Apple’s only reasonable course of action at this time is to do nothing and wait for those who bear responsibility to tell them what to do next. 
  • Reply 13 of 28
    Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was a part of the user agreement long before 2021. You have to agree to Apple's terms to use the service...so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just like it isn't true for all the other mainstream cloud services.

    However, unlike the other major cloud services Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.
  • Reply 14 of 28
jdw Posts: 1,340 member
    Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was a part of the user agreement long before 2021. You have to agree to Apple's terms to use the service...so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just like it isn't true for all the other mainstream cloud services.

    However, unlike the other major cloud services Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.
Please provide proof that Apple actually scans files uploaded to iCloud, as well as evidence showing what files they prohibit you from uploading due to this supposed scanning.  Simply reading a user agreement isn't evidence of what Apple is actually doing.
  • Reply 15 of 28
    The “human trafficking” umbrella has become a fig leaf used by right wing groups to cover their efforts to smear liberals. Who doesn’t want to protect kids, right? The child prostitution ring in the D.C. pizza parlor lunacy sprung from this initiative. If you protest their methods they claim you are protecting pedophiles or worse. Look very carefully at who is funding these groups and initiatives and what other beliefs they espouse. 
  • Reply 16 of 28
    jdw said:
    Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was a part of the user agreement long before 2021. You have to agree to Apple's terms to use the service...so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just like it isn't true for all the other mainstream cloud services.

    However, unlike the other major cloud services Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.
Please provide proof that Apple actually scans files uploaded to iCloud, as well as evidence showing what files they prohibit you from uploading due to this supposed scanning.  Simply reading a user agreement isn't evidence of what Apple is actually doing.

    Section V: Content and Your Conduct
    A. Content
    B. Your Conduct
    C. Removal of Content

    Obviously Apple has to be scanning files in order to remove content. Just click on the link I provided and go to Section V and you will find highly detailed information.
  • Reply 17 of 28
davidw Posts: 2,053 member
    Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was a part of the user agreement long before 2021. You have to agree to Apple's terms to use the service...so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just like it isn't true for all the other mainstream cloud services.

    However, unlike the other major cloud services Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.

The problem of any privacy violation isn't that Apple is already scanning for CSAM on their iCloud servers (and most likely has been for a while), but that they were planning to scan for CSAM on the owner's device. It doesn't matter that the scans were going to take place right before the images are transferred from the owner's device to iCloud. The software for the scanning process is on the owner's device, the scanning process uses that device's resources, and the scanning takes place before the images reach the iCloud server. That is not the same as Apple scanning for CSAM on their servers, which is how the other cloud services do it. This child safety advocacy group is mainly bitching about Apple not implementing their plan to scan for CSAM on owners' devices.

The SCOTUS has ruled many times that electronic devices like smartphones have the same Constitutional 4th Amendment protection as a home. Thus law enforcement must get a search warrant (along with probable cause) to perform a search, unless there's an immediate danger to the public in waiting to obtain the warrant. But just like your landlord cannot enter your rental unit to see if you have any CSAM lying around, Apple should not be able to enter your device and look for CSAM there. It's a whole different matter when your data is already on a third-party server. The SCOTUS has ruled that there is no expectation of privacy when you allow a third party to have possession of your data.

Plus, Apple was going to install software to scan iMessages for adult images and content on minors' accounts, so that parents could use the scanning software to better monitor their kids' online messaging. This had nothing to do with images going to be stored on iCloud, and the software is capable of monitoring all iMessages, not just those of minors.

Let's put it this way: say Apple allowed you to download their CSAM-scanning app for your iPhone/iPad. Would you download and install the app? Why would you give up some of your 4th Amendment privacy rights if you think Apple is going to scan the images you upload to their iCloud anyway? What difference would it make in preventing child abuse to have this app installed on your iPhone/iPad, so that Apple can scan your images before they are uploaded to and stored on their iCloud servers?
  • Reply 18 of 28
    davidw said:
    Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was a part of the user agreement long before 2021. You have to agree to Apple's terms to use the service...so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just like it isn't true for all the other mainstream cloud services.

    However, unlike the other major cloud services Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.

The problem of any privacy violation isn't that Apple is already scanning for CSAM on their iCloud servers (and most likely has been for a while), but that they were planning to scan for CSAM on the owner's device. It doesn't matter that the scans were going to take place right before the images are transferred from the owner's device to iCloud. The software for the scanning process is on the owner's device, the scanning process uses that device's resources, and the scanning takes place before the images reach the iCloud server. That is not the same as Apple scanning for CSAM on their servers, which is how the other cloud services do it. This child safety advocacy group is mainly bitching about Apple not implementing their plan to scan for CSAM on owners' devices.

The SCOTUS has ruled many times that electronic devices like smartphones have the same Constitutional 4th Amendment protection as a home. Thus law enforcement must get a search warrant (along with probable cause) to perform a search, unless there's an immediate danger to the public in waiting to obtain the warrant. But just like your landlord cannot enter your rental unit to see if you have any CSAM lying around, Apple should not be able to enter your device and look for CSAM there. It's a whole different matter when your data is already on a third-party server. The SCOTUS has ruled that there is no expectation of privacy when you allow a third party to have possession of your data.

Plus, Apple was going to install software to scan iMessages for adult images and content on minors' accounts, so that parents could use the scanning software to better monitor their kids' online messaging. This had nothing to do with images going to be stored on iCloud, and the software is capable of monitoring all iMessages, not just those of minors.

Let's put it this way: say Apple allowed you to download their CSAM-scanning app for your iPhone/iPad. Would you download and install the app? Why would you give up some of your 4th Amendment privacy rights if you think Apple is going to scan the images you upload to their iCloud anyway? What difference would it make in preventing child abuse to have this app installed on your iPhone/iPad, so that Apple can scan your images before they are uploaded to and stored on their iCloud servers?
Law enforcement only needs a warrant to compel a search; any company or individual can voluntarily hand data over to law enforcement at any time. This happens all the time in regulated businesses such as banking, where government regulators like the FDIC, the Federal Reserve, and the OCC (which are government agencies but not law enforcement) have full access to all bank records without a warrant and will automatically make a referral to, and provide evidence to, the FBI if they happen to come across anything during a regular bank examination.

So, in other words, if Congress were to pass a law creating a federal agency to regulate CSAM standards, that agency would have full access, without a warrant, to the records of any company that falls under its charter as part of its regulatory examinations, and if it finds anything it will automatically refer it to law enforcement.
  • Reply 19 of 28
    The “human trafficking” umbrella has become a fig leaf used by right wing groups to cover their efforts to smear liberals. Who doesn’t want to protect kids, right? The child prostitution ring in the D.C. pizza parlor lunacy sprung from this initiative. If you protest their methods they claim you are protecting pedophiles or worse. Look very carefully at who is funding these groups and initiatives and what other beliefs they espouse. 
That's actually very on point. The current hysteria about "human trafficking" is really a cover for attacking prostitution. You might argue that the "world's oldest profession" is somehow undesirable, but call this campaign against "human trafficking" what it really is.
  • Reply 20 of 28
If the illegal images have been encrypted, Apple won't be able to determine that they're illegal. That's why some folks want them to scan an individual's phone before the images are encrypted.
    I'll go with the 4th and say show me the search warrant.  Otherwise, bug off.