Apple privacy head explains privacy protections of CSAM detection system
Apple's privacy chief Erik Neuenschwander has detailed some of the protections built into the company's CSAM scanning system that prevent it from being used for other purposes, including clarifying that the system performs no hashing if iCloud Photos is off.
Credit: WikiMedia Commons
The company's CSAM detection system, which was announced with other new child safety tools, has caused controversy. In response, Apple has offered numerous details about how it can scan for CSAM without endangering user privacy.
In an interview with TechCrunch, Apple privacy head Erik Neuenschwander said the system was designed from the start to prevent government overreach and abuse.
For one, the system only applies in the U.S., where Fourth Amendment protections already guard against illegal search and seizure.
"Well first, that is launching only for US iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren't the US when they speak in that way," Neuenschwander said. "And therefore it seems to be the case that people agree US law doesn't offer these kinds of capabilities to our government."
But even beyond that, the system has baked-in guardrails. For example, the hash list that the system uses to tag CSAM is built into the operating system. It can't be updated from Apple's side without an iOS update. Apple also must release any updates to the database on a global scale -- it can't target individual users with specific updates.
The system also only flags collections of known CSAM; a single image isn't going to trigger anything. More than that, images that aren't in the database provided by the National Center for Missing and Exploited Children won't get tagged either.
Apple also has a manual review process. If an iCloud account gets flagged for a collection of illegal CSAM material, an Apple team will review the flag to ensure that it's actually a correct match before any external entity is alerted.
"And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the US," Neuenschwander said.
Additionally, Neuenschwander added, there is still some user choice here. The system only works if a user has iCloud Photos enabled. The Apple privacy chief said that, if a user doesn't like the system, "they can choose not to use iCloud Photos." If iCloud Photos is not enabled, "no part of the system is functional."
"If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image," the Apple executive said. "None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos is functioning if you're not using iCloud Photos."
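The gating Neuenschwander describes can be sketched as simple client-side logic. This is only an illustration of the flow the interview outlines, not Apple's implementation: NeuralHash is a proprietary perceptual hash (stubbed here with a placeholder), the known-hash database is a made-up stand-in, and the real voucher payload is cryptographically blinded so the device cannot tell whether a match occurred.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for the known-CSAM hash database that, per the
# article, ships inside the operating system image itself.
KNOWN_HASHES = {"a1b2", "c3d4"}

@dataclass
class SafetyVoucher:
    image_id: str
    payload: str  # the real payload is encrypted; a plain string stands in

def neural_hash(image_bytes: bytes) -> str:
    """Placeholder for Apple's proprietary NeuralHash perceptual hash."""
    return format(sum(image_bytes) % 65536, "04x")

def process_upload(image_id: str, image_bytes: bytes,
                   icloud_photos_enabled: bool) -> Optional[SafetyVoucher]:
    # Per the interview: with iCloud Photos off, no part of the system runs --
    # no hashing, no voucher generation, no upload.
    if not icloud_photos_enabled:
        return None
    h = neural_hash(image_bytes)
    # In the real design the comparison result is hidden inside the
    # encrypted voucher; here it is exposed only for illustration.
    matched = h in KNOWN_HASHES
    return SafetyVoucher(image_id, payload=f"match={matched}")
```

The key point the sketch captures is that the `icloud_photos_enabled` check comes before any hashing at all, matching the claim that "no part of the system is functional" when the feature is off.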
Although Apple's CSAM feature has caused a stir online, the company denies that the system can be used for any purpose other than detecting CSAM. Apple clearly states that it will refuse any government attempt to modify the system or use it for something other than CSAM detection.
Read on AppleInsider
Comments
That is not completely true; people have been discriminated against through guilt by association. Imagine if a fan took a photo with a whistleblower like Snowden. So yeah, it can be abused by any country.
It honestly ruins their supposed privacy protection standards. No one wants AI to go through their stuff. Usually a probable-cause standard would exist; now? Entirely new case law: use the cloud and you assume your own risk of things being looked through? So today it is about this issue; what issue in the future will be the excuse to go through someone's files? I suppose these are the perils of using cloud storage, which is not private to begin with.
Seems like this will cause many to wake up to the fact that cloud storage is not remotely private or secure.
Well, gotta give them credit for thinking different. No one else would’ve thought to do this … not on a full stomach anyway.
It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.
Criticism is easy but solutions are difficult - but let's try.
All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.
Early in my career I worked in a Ma Bell public office. We always had people calling and demanding we stop their neighbor from tapping their phone. We had one woman who regularly called in absolutely positive the Baptists across her street were listening to her phone calls during Sunday morning services. We had dispatched repairmen several times to find the ‘bug’ she was sure was there. We finally put her on the ‘do not respond’ list.
"Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.
As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."
https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
New privacy chief:
"The voucher generation is actually exactly what enables us not to have to begin processing all users' content on our servers, which we've never done for iCloud Photos" - the TechCrunch interview quoted in the article.
What do you think?
"oh..we have no idea how the hash of an AR-15 got into the CSAM database...we'll certainly investigate any abuse of this system..."
What about the final human review then? Well, the safety voucher includes a preview image, as I understand. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers will unlock and the team will perform its review on the previews.
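The threshold behavior described above can be approximated with a plain counter. This is a sketch under stated assumptions only: Apple's actual design uses threshold secret sharing so that nothing is decryptable before the threshold is met, the `THRESHOLD` value here is illustrative, and the preview payload reflects the commenter's understanding rather than a confirmed detail.

```python
# Count-based approximation of threshold unlocking. The real system hides
# match vouchers cryptographically (threshold secret sharing); a plain list
# and counter stand in for that here. THRESHOLD is an illustrative value.
THRESHOLD = 30

class AccountVouchers:
    def __init__(self) -> None:
        # Previews from vouchers whose images matched the known-hash database.
        self._match_previews: list = []

    def add_match(self, preview: bytes) -> None:
        self._match_previews.append(preview)

    def reviewable_previews(self) -> list:
        # Below the threshold the review team can read nothing; at or above
        # it, all accumulated previews unlock at once for human review.
        if len(self._match_previews) < THRESHOLD:
            return []
        return list(self._match_previews)
```

The design choice the sketch mirrors is all-or-nothing disclosure: a single match (or 29 of them) yields nothing reviewable, which is what distinguishes this from per-image scanning.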
Even with full E2E encryption enabled, that feature should not be implemented in a silent way, even if it is stated in the user agreement. The actual announcement includes nothing about user consent, prior notification and the like. Apple's stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan to iCloud. In order to download a CSAM database onto your device and initiate the CSAM scan on your photos, click Continue. Continue | Let Me Think | More Info..."
And the result of the scan should be clearly communicated to the user: "13864 photos scanned, 104 photos will be sent with voucher. Show | Discard | Stop Upload"
If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.
The user must be in control of that scan and of its results.