Apple details user privacy, security features built into its CSAM scanning system

Posted in iCloud, edited August 2021
Apple has published a new document that provides more detail about the security and privacy of its new child safety features, including how it's designed to prevent misuse.

Credit: Apple


For one, Apple says in the document that the system will be auditable by third parties such as security researchers and nonprofit groups. The company says that it will publish a Knowledge Base article containing the root hash of the encrypted CSAM hash database used for iCloud photo scanning.

The Cupertino tech giant says that "users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the Knowledge Base article." Additionally, it added that the accuracy of the database will be reviewable by security researchers.
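
Apple has not published the exact construction behind that root hash, but the idea it describes resembles a Merkle tree: hash every database entry, then repeatedly hash adjacent pairs until a single root remains, so that any change to any entry changes the root. Here is a minimal sketch of that general idea in Swift, with SHA-256 standing in for whatever hash Apple actually uses (the function names and tree layout are illustrative, not Apple's scheme):

```swift
import Foundation
import CryptoKit

// Illustrative only: Apple has not disclosed its exact root-hash construction.
// This sketches the general Merkle-tree idea behind "inspect the root hash of
// the encrypted database and compare it to the published value."

func sha256(_ data: Data) -> Data {
    Data(SHA256.hash(data: data))
}

/// Fold the entry hashes pairwise until a single root hash remains.
func merkleRoot(of entries: [Data]) -> Data {
    precondition(!entries.isEmpty, "database must contain at least one entry")
    var level = entries.map(sha256)
    while level.count > 1 {
        var next: [Data] = []
        for i in stride(from: 0, to: level.count, by: 2) {
            let left = level[i]
            let right = i + 1 < level.count ? level[i + 1] : left // duplicate the last node on odd counts
            next.append(sha256(left + right))
        }
        level = next
    }
    return level[0]
}
```

Under these assumptions, the user-facing check reduces to comparing the root computed over the on-device database with the hex string published in the Knowledge Base article.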

"This approach enables third-party technical audits: an auditor can confirm that for any given root hash of the encrypted CSAM database in the Knowledge Base article or on a device, the database was generated only from an intersection of hashes from participating child safety organizations, with no additions, removals, or changes," Apple wrote.

Additionally, there are mechanisms to prevent abuse from specific child safety organizations or governments. Apple says that it's working with at least two child safety organizations to generate its CSAM hash database. It's also ensuring that the two organizations are not under the jurisdiction of the same government.
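
In other words, a hash should only enter the shipped database if every participating organization, operating under different governments, independently submitted it. A hedged sketch of that intersection rule (the function and the idea of representing each organization's submission as a set of hashes are assumptions for illustration):

```swift
import Foundation

// Hypothetical sketch of the intersection rule Apple describes: an entry must
// appear in the hash lists of ALL participating child safety organizations
// before it can enter the database, so no single organization (or the
// government with jurisdiction over it) can insert entries unilaterally.

func buildDatabase(from orgHashLists: [Set<Data>]) -> Set<Data> {
    guard var database = orgHashLists.first else { return [] }
    for list in orgHashLists.dropFirst() {
        database.formIntersection(list) // drop anything only some orgs submitted
    }
    return database
}
```

With two organizations in different jurisdictions, a hash pushed by only one of them never survives `buildDatabase(from: [orgAHashes, orgBHashes])`.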

If multiple governments and child safety organizations were somehow to collaborate and include non-CSAM hashes in the database, Apple says there's a protection for that, too: the company's human review team would recognize that an account was flagged for something other than CSAM and respond accordingly.

Separately on Friday, it was revealed that the threshold of matched images needed to trigger an alert is 30. Apple says the number is flexible, however, and that it has only committed to that figure at launch.
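
Apple's published design enforces that threshold cryptographically with threshold secret sharing, so the server cannot read any match voucher until an account crosses it. The sketch below models only the simpler counting behavior, not the cryptography; the types and names are assumptions, and 30 is just the reported launch figure:

```swift
import Foundation

// Simplified, illustrative model of the threshold rule. In the real design the
// voucher payloads are cryptographically unreadable below the threshold; here
// we only show that nothing is surfaced until the count reaches it.

struct SafetyVoucher {
    let accountID: String
    let payload: Data // opaque until the threshold is crossed
}

final class MatchTracker {
    private let threshold: Int
    private var vouchers: [String: [SafetyVoucher]] = [:]

    init(threshold: Int = 30) { // the reported launch figure, which Apple says may change
        self.threshold = threshold
    }

    /// Record a matched voucher. Returns the account's vouchers only once the
    /// threshold is crossed, which is when human review would begin.
    func record(_ voucher: SafetyVoucher) -> [SafetyVoucher]? {
        vouchers[voucher.accountID, default: []].append(voucher)
        let matched = vouchers[voucher.accountID]!
        return matched.count >= threshold ? matched : nil
    }
}
```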

Bracing for an expected onslaught of questions from customers, the company instructed retail personnel to use a recently published FAQ as a resource when discussing the topic, according to an internal memo seen by Bloomberg. The memo also said Apple plans to hire an independent auditor to review the system.

Apple's child safety features have stirred controversy among privacy and security experts, as well as regular users. On Friday, Apple software chief Craig Federighi admitted that the messaging had been "jumbled" and "misunderstood."


Comments

  • Reply 1 of 71
lkrupp Posts: 10,557 member
In this brave new world where facts are irrelevant and perception and spin are all that matter, the horse is already out of the barn, and nothing Apple says or does will change the perception being spun in the news media and tech blogs.
  • Reply 2 of 71
Beats Posts: 3,073 member
    Shame on Apple. They are not the police!!
  • Reply 3 of 71
bloggerblog Posts: 2,462 member
User data should only be used to improve that user's experience, not shared with other orgs, no matter what the scenario is.
  • Reply 4 of 71
mobird Posts: 752 member
They don't get it!!! We DO NOT WANT THIS. I don't give a rat's ass about their "white paper".
  • Reply 5 of 71
dk49 Posts: 267 member
I am still wondering why Apple developed this feature. I don't see any reason other than a requirement/request from the government.
  • Reply 6 of 71
Initially well-intentioned, but with the technology in place, soon it'll be hard for Apple and competitors to resist when governments ask to extend from child safety to minorities' safety, porn, crime, copyrighted material, fake news, libel, misinformation, then everything.
  • Reply 7 of 71
It's just unacceptable to build spyware into billions of devices. Now that criminals are warned, who are they going to catch?

I will NEVER accept that my data will be scanned on my OWN DEVICES, and for sure I won't PAY FOR A DEVICE to spy on me.

iOS 15, and with it the iPhone 13, is now dead to me.
  • Reply 8 of 71
F_Kent_D Posts: 98 unconfirmed, member
I 100% agree with everyone's concern and disapproval of this feature. However, I have nothing to hide as far as child pornography or anything of the sort. I have 3 daughters and would rather they not receive any pornographic texts or communication from anyone, and this is meant to help keep that from happening. Today's kids are chatting and messaging who knows who in online games, and I found that one of my daughters was suckered into doing things that shouldn't have been done as a 10 year old. She's been warned, but I'm unable to warn the other party. I'm not 100% happy about all of this scanning, but at the same time I have young girls, and if there's a way to protect them I will accept the protection against sex trafficking and other improper activities via messaging.
  • Reply 9 of 71
Apple is becoming pathetic in their deafness to criticism. This is not a feature. It's an invasion of privacy and the "back door" they said they wouldn't build. I don't buy their devices to become part of the surveillance state. Get the h*** out of my data, Apple. Serve the customer. iCloud is supposed to helpfully sync across devices and helpfully provide off-device storage. Now, suddenly, they are proud to announce that it's become a tool for surveillance.
  • Reply 10 of 71
Rayz2016 Posts: 6,957 member
    dk49 said:
I am still wondering why Apple developed this feature. I don't see any reason other than a requirement/request from the government.
    Why indeed 🤔
  • Reply 11 of 71
sdw2001 Posts: 18,015 member
    F_Kent_D said:
I 100% agree with everyone's concern and disapproval of this feature. However, I have nothing to hide as far as child pornography or anything of the sort. I have 3 daughters and would rather they not receive any pornographic texts or communication from anyone, and this is meant to help keep that from happening. Today's kids are chatting and messaging who knows who in online games, and I found that one of my daughters was suckered into doing things that shouldn't have been done as a 10 year old. She's been warned, but I'm unable to warn the other party. I'm not 100% happy about all of this scanning, but at the same time I have young girls, and if there's a way to protect them I will accept the protection against sex trafficking and other improper activities via messaging.
    I have three daughters, too.  This isn’t going to protect them.  No matter what Apple says, it’s scanning our data and potentially flagging it.   Totally unacceptable.  
  • Reply 12 of 71

I don’t think this is an issue of Apple’s “messaging” or of users’ understanding. I still have three serious concerns that don’t seem to have been addressed.

#1 Apple has acknowledged the privacy impact of this technology if misapplied by totalitarian governments. The response has been, “we’ll say no.” In the past the answer hasn’t been no with China and Saudi Arabia, and that occurred when Apple was already powerful and wealthy. If a government compelled Apple, or if Apple one day is not in a dominant position, they may not be able to say no even if they want to.

#2 We’ve recently observed zero-day exploits being used by multiple private companies to bypass the existing protections in Apple’s platforms. Interfaces like this increase the attack surface that malicious actors can exploit.

#3 Up until this point, the expectation from users has been that the data on their device is private and that on-device processing is used to prevent data from being uploaded to cloud services. The new system turns that expectation around: now on-device processing is being used as a means to upload to the cloud. This system, though narrowly tailored to illegal content at this time, changes the operating system’s role from the user’s perspective and places the device itself in a policing and policy-enforcement role. This breaks the level of trust computer users have had since the beginning of computing: that the device is “yours” in the same way that your car or home is “your” property.

Ultimately I think solving a human-nature problem with technology isn’t a true solution, and I think Apple is burning hard-won reputation with this move. In my opinion, law enforcement and the judicial process should be used to rectify crime, rather than technology providers like Apple.

  • Reply 13 of 71
chadbag Posts: 1,999 member
    F_Kent_D said:
I 100% agree with everyone's concern and disapproval of this feature. However, I have nothing to hide as far as child pornography or anything of the sort. I have 3 daughters and would rather they not receive any pornographic texts or communication from anyone, and this is meant to help keep that from happening. Today's kids are chatting and messaging who knows who in online games, and I found that one of my daughters was suckered into doing things that shouldn't have been done as a 10 year old. She's been warned, but I'm unable to warn the other party. I'm not 100% happy about all of this scanning, but at the same time I have young girls, and if there's a way to protect them I will accept the protection against sex trafficking and other improper activities via messaging.
    The problem is not the Messages feature blocking porn from kids.  The problem is a different component, which is a spyware piece that scans stuff and reports it (in a nutshell -- details are available everywhere).
  • Reply 14 of 71
chadbag Posts: 1,999 member
    The problem is the hubris of Apple and Tim Cook and his EVP and other staff.   They are so convinced that all their "social" wokeness and initiatives are 100% correct and a mission from god (not God).  They are not listening.  They don't care.  They think they are right and just need to convince you of that.


  • Reply 15 of 71
mobird Posts: 752 member
    chadbag said:
    The problem is the hubris of Apple and Tim Cook and his EVP and other staff.   They are so convinced that all their "social" wokeness and initiatives are 100% correct and a mission from god (not God).  They are not listening.  They don't care.  They think they are right and just need to convince you of that.

    Exactly!!

  • Reply 16 of 71
mrstep Posts: 513 member
Apple is becoming pathetic in their deafness to criticism. This is not a feature. It's an invasion of privacy and the "back door" they said they wouldn't build. I don't buy their devices to become part of the surveillance state. Get the h*** out of my data, Apple. Serve the customer. iCloud is supposed to helpfully sync across devices and helpfully provide off-device storage. Now, suddenly, they are proud to announce that it's become a tool for surveillance.
They can repeat their message of “but it’s the good kind of surveillance” as often as they want; it’s still uninvited surveillance running on your own device. They can also keep saying that it will never expand or be abused, but the history of this industry says otherwise. Total hubris, totally tone-deaf, and totally doubling down on it.
  • Reply 17 of 71
coolfactor Posts: 2,239 member
User data should only be used to improve that user's experience, not shared with other orgs, no matter what the scenario is.

No user data is being shared with other orgs until a very concerning threshold has been met, and Apple has full rights to audit and monitor their own servers for abuse. They would be irresponsible to allow anybody to use their servers for any purpose without some protections in place.

    This is not an invasion of privacy, no matter how people want to spin it.

  • Reply 18 of 71
fastasleep Posts: 6,408 member
    sdw2001 said:
    F_Kent_D said:
I 100% agree with everyone's concern and disapproval of this feature. However, I have nothing to hide as far as child pornography or anything of the sort. I have 3 daughters and would rather they not receive any pornographic texts or communication from anyone, and this is meant to help keep that from happening. Today's kids are chatting and messaging who knows who in online games, and I found that one of my daughters was suckered into doing things that shouldn't have been done as a 10 year old. She's been warned, but I'm unable to warn the other party. I'm not 100% happy about all of this scanning, but at the same time I have young girls, and if there's a way to protect them I will accept the protection against sex trafficking and other improper activities via messaging.
    I have three daughters, too.  This isn’t going to protect them.  No matter what Apple says, it’s scanning our data and potentially flagging it.   Totally unacceptable.  
    Does it have to be your daughters who are protected to make it relevant to you?
  • Reply 19 of 71
fastasleep Posts: 6,408 member
    chadbag said:
    The problem is the hubris of Apple and Tim Cook and his EVP and other staff.   They are so convinced that all their "social" wokeness and initiatives are 100% correct and a mission from god (not God).  They are not listening.  They don't care.  They think they are right and just need to convince you of that.
    Yes, the hubris of Apple...and every other company that was already doing this. Google, Microsoft, Facebook, Dropbox, Snapchat, ... Oh look: "To date, over 1,400 companies are registered to make reports to NCMEC’s CyberTipline and, in addition to making reports, these companies also receive notices from NCMEC about suspected CSAM on their servers."

    https://www.missingkids.org/theissues/csam#bythenumbers
  • Reply 20 of 71
This is not enough. Apple has proven on several occasions that their “privacy-first mantra” is just marketing, when you look at China, Russia and other countries.
This system can be abused locally to search or collect data. I want Apple to at the very least state they will never ever do it, and that if they do, they accept the worldwide legal implications/liability.

Secondly, it’s MY device that I paid good money for, and Apple allows me ZERO options to replace Photos with another app. It’s not as if I can seamlessly switch and have my camera and file browsing default to a new app (another topic, but still relevant here).

Of course nobody wants child porn, but Apple is not the police, nor did I choose to have my photos scanned.

Screw this company and their hypocritical culture. I’m deeply invested in their hardware and software, but they are simply no longer the company I used to respect.

