New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

Posted in General Discussion, edited August 2021
Apple has published a response to privacy criticisms of its new iCloud Photos feature that scans for child abuse images, saying it "will refuse" government pressure to infringe on user privacy.

Apple's new child protection feature


Apple's suite of tools meant to protect children has drawn mixed reactions from security and privacy experts, with some erroneously claiming that Apple is abandoning its privacy stance. Now Apple has published a rebuttal in the form of a Frequently Asked Questions document.

"At Apple, our goal is to create technology that empowers people and enriches their lives -- while helping them stay safe," says the full document. "We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)."

"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution," it continues, "and some have reached out with questions."

"What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?" it asks. "These two features are not the same and do not use the same technology."

Apple emphasizes that the new features in Messages are "designed to give parents... additional tools to help protect their children." Images sent or received via Messages are analyzed on-device "and so [the feature] does not change the privacy assurances of Messages."

CSAM detection in iCloud Photos does not send information to Apple about "any photos other than those that match known CSAM images."
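
To make the "only matching photos" claim concrete, here is a minimal sketch in Swift. It is not Apple's implementation: Apple describes a pipeline built on NeuralHash perceptual hashing, encrypted safety vouchers, and a match threshold, whereas this sketch stands in a plain SHA-256 lookup, and every name in it is hypothetical. The only point it illustrates is that a non-matching photo produces nothing to send.

```swift
import Foundation
import CryptoKit

/// Hypothetical stand-in for a match report. In Apple's described design the
/// equivalent artifact is an encrypted safety voucher that only becomes
/// readable after a threshold of matches is crossed.
struct MatchReport {
    let photoDigest: Data
}

/// Returns a report only when the photo's digest appears in the known set.
/// For any non-matching photo the function returns nil: no digest, no
/// metadata, nothing for Apple to receive.
func checkAgainstKnownHashes(_ photoData: Data, knownHashes: Set<Data>) -> MatchReport? {
    let digest = Data(SHA256.hash(data: photoData))
    return knownHashes.contains(digest) ? MatchReport(photoDigest: digest) : nil
}
```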

Much of the document details what AppleInsider broke out on Friday. However, there are a few points explicitly spelled out that weren't before.

First, a concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for.

"Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

"Let us be clear," it continues, "this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it."

Apple's new publication on the topic comes after an open letter was sent, asking the company to reconsider its new features.

Second, while AppleInsider said this before based on commentary from Apple, the company has clarified in no uncertain terms that the feature does not work when iCloud Photos is turned off.


Comments

  • Reply 1 of 98
    entropys Posts: 4,152 member
    A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for. 

    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

  • Reply 2 of 98
    JaiOh81 Posts: 60 member
    entropys said:
    A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for. 

    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

    Came to say the same thing. I’m really disappointed in Apple. It’s unfortunate that our privacy is being eroded even more. This is a really bad look for Apple, but they know their customers have little choice because the other guys are worse.
  • Reply 3 of 98
    entropys said: Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

    Read the user agreement terms for iCloud and other cloud services: they have always had parameters for what is acceptable use of the service and have always reserved the right to screen files as a result. There has never been any "you can do whatever you want in the cloud and we'll never look at any files in the cloud" promise from any of these companies. 

    The only people who think this is something new are people who never read the user agreements for cloud services. 
    edited August 2021
  • Reply 4 of 98
    Their defense is that they would refuse authoritarian attempts, but we’ve seen two instances so far where Apple couldn’t refuse: with iCloud in China and with FaceTime in Saudi Arabia. Setting the past aside, what’s to say that the political or financial climate for Apple won’t change and make it harder to say no than it is now? There may come a time when they want to say no but can’t. 
  • Reply 5 of 98
    Their defense is that they would refuse authoritarian attempts, but we’ve seen two instances so far where Apple couldn’t refuse: with iCloud in China and with FaceTime in Saudi Arabia. Setting the past aside, what’s to say that the political or financial climate for Apple won’t change and make it harder to say no than it is now? There may come a time when they want to say no but can’t. 
    The problem with hypotheticals is that they can be applied to anything and anyone. That's why lawsuits based on hypotheticals get thrown out of court, like Yahoo! suing the government over subpoenas for data. Yahoo! imagined all kinds of scenarios where handing over the data could be abused, but the court said "where's the proof the government is actually doing any of that with the data" and the case was thrown out. 
  • Reply 6 of 98
    That Apple fails to see the conceptual problem with checking the contents of private accounts is mind-boggling. And the idea, as other commenters mention, that users must rely on their current and future refusal to use this technology for other purposes is the most ridiculous thing ever. I don’t know if they are just too impressed with themselves, or simply stupid when it comes to anything beyond technological cleverness. 
  • Reply 7 of 98
    entropys Posts: 4,152 member
    So don’t create the ability in the first place, and Apple can’t hand people over.
  • Reply 8 of 98
     "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    How quickly Apple has forgotten that it caved to a foreign government mandating changes so it could keep selling in China, sacrificing the privacy and security of its users’ iCloud storage, now hosted in China, for profit over customer privacy - shame, Apple. I think they have proven how ‘steadfast’ they actually are… Maybe this will be my last iOS device.

    And scanning with on-device learning of your photo library for any reason is an invasion of privacy without permission. Would you let someone into your house to look through your photo albums without permission? It’s the same at the end of the day. A warrant is the current requirement to do that! How can a tech giant replace legislation and human rights? Yes, it is just hash scanning now - but tech giants don’t have a good track record on privacy!


    edited August 2021
  • Reply 9 of 98
    crowley Posts: 10,453 member
    bdubya said:

    And scanning with on-device learning of your photo library for any reason is an invasion of privacy without permission

    But Apple's CSAM check doesn't do this! It doesn't "learn" your photo library. It checks individual photos at the point of upload to iCloud, for which the terms and conditions clearly give Apple the right, so by accepting the terms and conditions the user has given their permission. If they don't want to give their permission then they can opt out of iCloud Photos.

    Moreover, the fact that iCloud in China is not totally under Apple's control is probably a significant reason for Apple to rely on on-device processing to do these checks, as it means that it is totally under Apple's control.
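
A rough sketch of the gating crowley describes, with entirely hypothetical names (none of this is Apple API): the check is a step inside the iCloud Photos upload path, so with iCloud Photos disabled a photo is neither hashed nor uploaded.

```swift
import Foundation
import CryptoKit

struct UploadPipeline {
    var iCloudPhotosEnabled: Bool
    var knownHashes: Set<Data>

    /// Returns true only if the photo was hashed and queued for upload.
    /// The scan never runs as a standalone sweep of the library.
    func scanAndUploadIfEnabled(_ photoData: Data) -> Bool {
        guard iCloudPhotosEnabled else { return false }   // opted out: no scan, no upload
        let digest = Data(SHA256.hash(data: photoData))   // stand-in for the real hash
        let matched = knownHashes.contains(digest)
        queueUpload(photoData, flaggedAsMatch: matched)
        return true
    }

    private func queueUpload(_ photoData: Data, flaggedAsMatch: Bool) {
        // Placeholder: in this sketch only the match flag accompanies the upload;
        // no other analysis of the photo happens here.
    }
}
```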
  • Reply 10 of 98
    macplusplus Posts: 2,112 member
    Apple can resist government requests, but if a government makes that scheme into law Apple cannot resist.

    Apple has set a precedent with this. Now many governments will take that precedent as a template and try to turn it into law. But in the past Apple resisted the request to develop a special iOS to break into criminals' phones, precisely because that would have constituted a precedent.
    edited August 2021
  • Reply 11 of 98
    crowley Posts: 10,453 member
    Apple can resist government requests, but if a government makes that scheme into law Apple cannot resist.
    This was true last week too; nothing has changed with regard to Apple's obligation to follow the law in places where it does business.
  • Reply 12 of 98
    davidw Posts: 2,036 member
    bdubya said:
     "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    How quickly Apple has forgotten that it caved to a foreign government mandating changes so it could keep selling in China, sacrificing the privacy and security of its users’ iCloud storage, now hosted in China, for profit over customer privacy - shame, Apple. I think they have proven how ‘steadfast’ they actually are… Maybe this will be my last iOS device.


    There is no customer privacy from the government of China. The Chinese government is not a government of the people, by the people, and for the people, and Chinese citizens are not protected from their own government by the US Constitution. Apple could not have sacrificed any of its Chinese customers' privacy, because they don't have it in the first place. If anything, had Apple been able to keep its Chinese customers' iCloud accounts away from the eyes of their government, those customers would have gained some privacy. I don't think any of them were counting on that.  
  • Reply 13 of 98
    elijahg Posts: 2,753 member
    Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they’re going to refuse? Of course they won’t. If any government said the provided data were hashes of CSAM material, who’s Apple to say it’s not?
  • Reply 14 of 98
    georgie01 Posts: 436 member
    entropys said:

    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

    The pettiness of Apple’s response is obvious.

    Apple has clearly shown governments they are willing to scan data directly on a user’s device. That’s not inconsequential.

    What if Apple decides next year to join the ‘misinformation’ bandwagon, for the ‘good of the people’?  What if the year after that they decide to aid in politics through monitoring communications about political issues and delivering that information to ‘independent’ political organisations, for the ‘good of the people’?

    Those are hypotheticals, of course, but Apple should stay far away from anything that smells of where this could lead. An authoritarian government (which our government is smelling more and more like), or a tech industry controlling people, is worse for a population than CSAM.
  • Reply 15 of 98
    Rayz2016 Posts: 6,957 member
    JaiOh81 said:
    entropys said:
    A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for. 

    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

    Came to say the same thing. I’m really disappointed in Apple. It’s unfortunate that our privacy is being eroded even more. This is a really bad look for Apple, but they know their customers have little choice because the other guys are worse.

    Wellll, the other guy being worse usually depends on where you're standing. From a software developer's point of view, it's not as clear.

    Google (as one example of the other guy) also scans for CSAM pictures, but they do it on the server. What they don't do is install spyware on your phone to scan it before it reaches the server.

    Now, in the future, this may change; Google may have no choice but to do the same thing, because the advantage that Apple's system has for law enforcement is that it is now pretty much impossible to encrypt pictures before they're scanned and uploaded. 

    But if Google does do the same thing, they have a couple of advantages over Apple. 

    Android is open-source, which means the backdoor code can be scrutinised and tested for correctness. The other thing, which will annoy a number of governments, is that folk will be able to build phones which disable it or leave it out altogether, so the only reliable way to find CSAM images (or whatever they're looking for) is to continue scanning server-side.

    The other potential weak point in Apple's scheme is the backdoor itself. One mistake that Cupertino makes time and again is failing to check for data that can cause problems in the operating system. We've had at least two instances of text message strings that crash the phone on arrival. Pretty easy to exploit; just send the message to folk you don't like, not to your mother.

    Apple is now going to be using a database that they are taking from a third party and installing on every iPhone, iPad and Mac. 

    Running the scan server side, the worst that can happen is that the match brings down a server.

    Running the scan on-device, the worst that can happen is that it crashes every phone running it.
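
A sketch of the defensive step this argument implies, treating the third-party database as untrusted input. The 32-byte digest length and the entry cap are made-up values, not Apple's; the idea is just that a malformed blob should fail loudly while being loaded rather than misbehave on every device.

```swift
import Foundation

enum HashDatabaseError: Error {
    case malformedEntry
    case tooManyEntries
}

/// Validates a third-party hash database before use: every entry must be a
/// fixed-length digest and the total count must stay within a sane bound,
/// so a malformed blob is rejected here instead of reaching on-device code.
func loadKnownHashes(from rawEntries: [Data]) throws -> Set<Data> {
    guard rawEntries.count <= 1 << 20 else { throw HashDatabaseError.tooManyEntries }
    var hashes = Set<Data>()
    for entry in rawEntries {
        guard entry.count == 32 else { throw HashDatabaseError.malformedEntry }
        hashes.insert(entry)
    }
    return hashes
}
```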


  • Reply 16 of 98
    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
    Until the government in question passes a law that requires Apple to do so, because as they've said many times, they'll comply with any local laws, even to the detriment of their principles concerning privacy.

    "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily.  And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.


  • Reply 17 of 98
    gatorguy Posts: 24,176 member
    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
    Until the government in question passes a law that requires Apple to do so, because as they've said many times, they'll comply with any local laws, even to the detriment of their principles concerning privacy.

    "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily.  And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.


    I would not expect Apple to necessarily reveal any expansion of it if some country, and in this case I'm thinking of China, were to order them to. They've long soft-pedaled the "iCloud operated by GCBD" handover. Heck, it's not even an Apple-run program there. Apple is simply contractually required to cooperate with the government-controlled cloud provider in whatever way is needed to handle the demands on services and access. It is no longer Apple's to run, and they aren't making the rules.
  • Reply 18 of 98
    Rayz2016 Posts: 6,957 member
    crowley said:
    Apple can resist government requests, but if a government makes that scheme into law Apple cannot resist.
    This was true last week too; nothing has changed with regard to Apple's obligation to follow the law in places where it does business.
    Yeah, I think the trick is to wait for them to demand a backdoor; then at least you can tell your customers that the law says they have to put in a backdoor at the same time as the competition.

    Putting one in before you're told to strikes me as a little bit dumb.

    My guess is that they've been offered a deal: implement the backdoor and the anti-trust/monopoly stuff goes away.

    Then it would be daft of the government to force Apple to allow alternative app stores, because that would let folk install software that might bypass the checks.
  • Reply 19 of 98
    Rayz2016 Posts: 6,957 member
    elijahg said:
    Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they’re going to refuse? Of course they won’t. If any government said the provided data were hashes of CSAM material, who’s Apple to say it’s not?
    That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
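
A sketch of why the matching code itself cannot police this (hypothetical names, with a plain SHA-256 lookup standing in for the real perceptual hash): whatever the provided set contains, the only question the code can answer is membership.

```swift
import Foundation
import CryptoKit

/// The database is just opaque digests to this function: it cannot tell
/// whether an entry came from a CSAM clearinghouse or from anywhere else.
func isFlagged(_ photoData: Data, against providedHashes: Set<Data>) -> Bool {
    let digest = Data(SHA256.hash(data: photoData))
    return providedHashes.contains(digest)
}
```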


  • Reply 20 of 98
    Rayz2016 Posts: 6,957 member
    gatorguy said:
    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
    Until the government in question passes a law that requires Apple to do so, because as they've said many times, they'll comply with any local laws, even to the detriment of their principles concerning privacy.

    "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily.  And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.


    I would not expect Apple to necessarily reveal any expansion of it if some country, and in this case I'm thinking of China, were to order them to. They've long soft-pedaled the "iCloud operated by GCBD" handover. Heck, it's not even an Apple-run program there. Apple is simply contractually required to cooperate with the government-controlled cloud provider in whatever way is needed to handle the demands on services and access. It is no longer Apple's to run, and they aren't making the rules.
    Surprisingly, you haven't actually hit on the worst problem.

    Under the current system, the Chinese can avoid the problem simply by not storing stuff in iCloud. Apple even warned them when they were switching over so they had plenty of time to make other arrangements.

    This is different.

    This piece of software (let's not be coy; it's spyware, plain and simple – it is rifling through your shit looking for other shit) is running on the phone. This means that it can be activated to report on any picture, document or video, regardless of what cloud service it is attached to.

    Now, people will jump in and say, "Well, let's just wait until it happens, shall we?"

    But some things you know are a bad idea without waiting and seeing. I sometimes think it might be fun to lick a lamppost in sub-zero temperatures, just to see what would happen. But then, on second thoughts, I usually just assume the worst without testing the hypothesis.

    You seem to have a finger deep inside Google; do they have something like this, or do they just do the server-side scan? I haven't been able to find any reference to a similar setup at any other tech behemoth.
    edited August 2021