Apple privacy head explains privacy protections of CSAM detection system

Posted in iCloud, edited August 2021
Apple's privacy chief Erik Neuenschwander has detailed some of the protections built into the company's CSAM scanning system that prevent it from being used for other purposes - including clarifying that the system performs no hashing if iCloud Photos is off.

Credit: WikiMedia Commons


The company's CSAM detection system, which was announced with other new child safety tools, has caused controversy. In response, Apple has offered numerous details about how it can scan for CSAM without endangering user privacy.

In an interview with TechCrunch, Apple privacy head Erik Neuenschwander said the system was designed from the start to prevent government overreach and abuse.

For one, the system only applies in the U.S., where Fourth Amendment protections already guard against illegal search and seizure.

"Well first, that is launching only for US, iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren't the US when they speak in that way," Neuenschwander said. "And therefore it seems to be the case that people agree US law doesn't offer these kinds of capabilities to our government."

But even beyond that, the system has baked-in guardrails. For example, the hash list that the system uses to tag CSAM is built into the operating system. It can't be updated from Apple's side without an iOS update. Apple also must release any updates to the database on a global scale -- it can't target individual users with specific updates.
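One way to picture that constraint is as a rough sketch rather than Apple's actual implementation: the blinded hash database is just a read-only resource inside the OS image, identical on every device running the same build, so a per-user or per-region variant would show up as a different database digest. The type, field names, and build string below are invented for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only -- names are invented, not Apple's.
// It models the property described above: the blinded hash database ships
// as a read-only part of the OS image, so every device on the same iOS
// build carries byte-for-byte the same database.
struct BundledMatchDatabase {
    let osBuild: String      // the database can only change with an iOS update
    let blindedHashes: Data  // opaque blob; the device cannot read it as plain hashes

    // Because the database is global, its digest is the same on every device;
    // a per-user or per-region variant would produce a different digest.
    var digest: String {
        SHA256.hash(data: blindedHashes)
            .map { String(format: "%02x", $0) }
            .joined()
    }
}

let db = BundledMatchDatabase(osBuild: "iOS 15.0 (placeholder)",
                              blindedHashes: Data(repeating: 0xAB, count: 64))
print("Database digest for \(db.osBuild): \(db.digest)")
```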

The system also only tags collections of known CSAM. A single image isn't going to trigger anything. More than that, images that aren't in the database provided by the National Center for Missing and Exploited Children won't get tagged either.

Apple also has a manual review process. If an iCloud account gets flagged for a collection of illegal CSAM material, an Apple team will review the flag to ensure that it's actually a correct match before any external entity is alerted.
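Taken together, these guardrails amount to a simple rule: count matches against the OS-shipped database, do nothing below a threshold, and even above the threshold only hand the account to Apple's human reviewers. The minimal sketch below illustrates that rule; the function names, stub hash values, and threshold are placeholders, not Apple's.

```swift
import Foundation

// Stand-in for the NCMEC-derived, blinded hash database baked into the OS image.
// The real database is opaque to the device; plain strings are used here
// purely for illustration.
let databaseShippedWithOS: Set<String> = ["hash-a", "hash-b", "hash-c"]

// Arbitrary placeholder; Apple had not published the real threshold at the time of this article.
let reviewThreshold = 30

// Images whose hashes are not in the database contribute nothing to the count.
func matchCount(of photoHashes: [String], against knownHashes: Set<String>) -> Int {
    photoHashes.filter { knownHashes.contains($0) }.count
}

// A single match never triggers anything; only a collection at or above the
// threshold is escalated, and then only to Apple's manual review step,
// not directly to any outside entity.
func accountNeedsHumanReview(photoHashes: [String]) -> Bool {
    matchCount(of: photoHashes, against: databaseShippedWithOS) >= reviewThreshold
}

print(accountNeedsHumanReview(photoHashes: ["hash-a"]))                              // false
print(accountNeedsHumanReview(photoHashes: Array(repeating: "hash-a", count: 30)))   // true
```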

"And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the US," Neuenschwander said.

Additionally, Neuenschwander added, there is still some user choice here. The system only works if a user has iCloud Photos enabled. The Apple privacy chief said that, if a user doesn't like the system, "they can choose not to use iCloud Photos." If iCloud Photos is not enabled, "no part of the system is functional."
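Neuenschwander's point reduces to a single guard at the top of the pipeline: if iCloud Photos is off, no perceptual hash is ever computed, so no safety voucher can exist and nothing is uploaded. The sketch below illustrates that control flow; the function and type names are invented, and the placeholder hash function merely stands in for NeuralHash.

```swift
import Foundation

struct SafetyVoucher {
    let payload: Data   // in the real design this would carry the encrypted match data
}

// Stand-in for the on-device perceptual hash; not the real NeuralHash.
func placeholderPerceptualHash(of image: Data) -> String {
    String(image.count)
}

// Sketch of the described control flow: when iCloud Photos is disabled,
// the function returns before any hashing or voucher creation happens.
func prepareForUpload(image: Data, iCloudPhotosEnabled: Bool) -> SafetyVoucher? {
    guard iCloudPhotosEnabled else {
        return nil   // no hash is computed and no voucher is generated
    }
    let hash = placeholderPerceptualHash(of: image)
    return SafetyVoucher(payload: Data(hash.utf8))   // uploaded with the photo to iCloud Photos
}

print(prepareForUpload(image: Data(count: 1_024), iCloudPhotosEnabled: false) == nil)   // true
```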

"If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image," the Apple executive said. "None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos is functioning if you're not using iCloud Photos."

Although Apple's CSAM feature has caused a stir online, the company maintains that the system cannot be used for any purpose other than detecting CSAM. Apple clearly states that it will refuse any government attempt to modify or use the system for something other than CSAM detection.

Read on AppleInsider

Comments

  • Reply 1 of 62
    bloggerblog Posts: 2,464 member
    Neuenschwander said "... people agree US law doesn't offer these kinds of capabilities to our government."
    That is not completely true; people got discriminated against due to guilt by association. Imagine if a fan took a photo with a whistleblower like Snowden. So yeah, it can be abused by any country.
  • Reply 2 of 62
    RogueCitizen1 Posts: 2 unconfirmed, member
    Somehow I think this is a case of Apple being pressured by govt to do this.  

    It honestly ruins their supposed privacy protection standards. No one wants AI to go through their stuff. Usually a probable-cause standard would apply; now there's entirely new case law: use the cloud and you assume the risk of having your things looked through? Today it is about this issue, but what issue in the future will be the excuse to go through someone's files? I suppose it is one of the perils of using cloud storage, which is not private to begin with.

    Seems like this will cause many to wake up to the fact that cloud storage is not remotely private or secure.
  • Reply 3 of 62
    Rayz2016 Posts: 6,957 member
    “Okay, guys, listen up. Here’s the brief. We’re going to bake a database of illegal porn image hashes into the operating system.”

    Well, gotta give them credit for thinking different. No one else would’ve thought to do this … not on a full stomach anyway. 




  • Reply 4 of 62
    darkvader Posts: 1,146 member
    In other words "Any government can force us to do whatever they want with this technology, and if they do we won't even be allowed to tell you it's happening."

    Oh, that's not what he said?  Then he's lying.

    Once this is in place, it can be abused for whatever purpose Apple wants, and any government of any country where Apple sells iPhones will be able to force Apple to search those iPhones for any kind of image they want to look for.  In the US, the mechanism is a "National Security Letter" or a "FISA warrant".  And it's not at all comforting to know that they can't push a fresh database without a spyOS update, since they can release one of those at any time.

    Apple can no longer be trusted.

    It remains to be seen if Apple will drop this stupidity.  If they drop it now before it ever hits the public, perhaps they can be trusted again someday.
  • Reply 5 of 62

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

  • Reply 6 of 62
    alanh Posts: 75 member
    Protections or projections?
  • Reply 7 of 62
    Rayz2016 Posts: 6,957 member

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
  • Reply 8 of 62
    So, Apple continues to defend itself and be utterly deaf to any criticism. They continue to act like they can do no wrong. If this is their reaction to privacy issues, then the entire stance of the company is over. I was thankful in the past for all their efforts. How easily they are throwing it all away. 
  • Reply 9 of 62
    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    Great - think hard. I did suggest on another post that maybe Apple could remove iCloud Photos completely, but that may be a bit of an inconvenience. However, after some further thinking, as long as photos could be transferred between devices (via the internet), then maybe that would work for me.
  • Reply 10 of 62
    lkrupp Posts: 10,557 member
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.

    Early in my career I worked in a Ma Bell public office. We always had people calling and demanding we stop their neighbor from tapping their phone. We had one woman who regularly called in absolutely positive the Baptists across her street were listening to her phone calls during Sunday morning services. We had dispatched repairmen several times to find the ‘bug’ she was sure was there. We finally put her on the ‘do not respond’ list.
  • Reply 11 of 62
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.
    And this is exactly the problem. There is no other place to go. This is why I and many other people are angry. We have no other options. 
  • Reply 12 of 62
    elijahg Posts: 2,759 member
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.

    Early in my career I worked in a Ma Bell public office. We always had people calling and demanding we stop their neighbor from tapping their phone. We had one woman who regularly called in absolutely positive the Baptists across her street were listening to her phone calls during Sunday morning services. We had dispatched repairmen several times to find the ‘bug’ she was sure was there. We finally put her on the ‘do not respond’ list.
    So what you're saying is that no matter what any company introduces (presumably that extends to government too), everyone should simply suck it up and not let anyone know how they feel. Got it.
  • Reply 13 of 62
    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM-related images, and nobody knew about it or complained about it.
  • Reply 14 of 62
    macplusplus Posts: 2,112 member
    Old privacy chief:

    "Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.

    As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    New privacy chief:

    "The voucher generation is actually exactly what enables us not to have to begin processing all users' content on our servers which we've never done for iCloud Photos" (from the TechCrunch interview quoted in the article).

    What do you think?

  • Reply 15 of 62
    baconstang Posts: 1,105 member
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.
    And this is exactly the problem. There is no other place to go. This is why I and many other people are angry. We have no other options. 
    Polaroids and a common carrier?
  • Reply 16 of 62
    I am SO fùcking happy to see this many people opposed to this; not because they're opposed to preventing harm to children, but because they understand this is the only way the government will force Apple to begin the process of backdooring their platform. If Apple JUST said all we do is scan iCloud for CSAM -- I think most people would say "sucks... but fine". But the fact they're building in the capability to analyze messages prior to transmission is terribly frightening. All it's going to take is a court order and a gag to get them to inject hashes of words to search for. And big fucking deal if changes require an iOS update, and big fucking deal if they can't (allegedly) target individual users. "Even better," says the government. "Give us every American user that used the following phrase..."


    "oh..we have no idea how the hash of an AR-15 got into the CSAM database...we'll certainly investigate any abuse of this system..."
  • Reply 17 of 62
    peteo Posts: 402 member
    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    we already know what the solution is. Only run this in the cloud on iCloud Photos. Do not run it on the user's device. Of course, I believe they cannot do this since iCloud photos are encrypted when in the cloud?
  • Reply 18 of 62
    macplusplus Posts: 2,112 member
    peteo said:
    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    we already know what the solution is. Only run this in the cloud on iCloud Photos. Do not run it on the user's device. Of course, I believe they cannot do this since iCloud photos are encrypted when in the cloud?
    iCloud Photos are encrypted on the cloud with Apple's keys, not device (user's) keys.
  • Reply 19 of 62
    macplusplus Posts: 2,112 member
    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    They must enable End-to-End Encryption on all iCloud user data at the same time they release that client-side CSAM matching. 

    What about the final human review then? Well, the safety voucher includes a preview image, as I understand. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers will unlock and the team will perform its review on the previews.

    Even with full E2E encryption enabled, that feature should not be implemented in a silent way, even if it is stated in the user agreement. The actual announcement includes nothing about user consent, prior notification and the like. Apple's stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan into iCloud. In order to download a CSAM database onto your device and initiate the CSAM scan on your photos, click Continue. Continue | Let Me Think | More Info..."

    And the result of the scan should be clearly communicated to the user: "13864 photos scanned, 104 photos will be sent with voucher. Show | Discard | Stop Upload"

    If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.

    The user must be in control of that scan and of its results.
  • Reply 20 of 62
    bvwj Posts: 11 unconfirmed, member
    So if some political foe wants to destroy me, all they need to do is hack my phone, deposit a collection of CSAM, and the cops will do the rest.  How convenient.