Think it’s a trade-off that we are looking at here. Right now, Apple’s servers are unencrypted and/or encrypted with Apple holding the keys. This makes anything on the server fair game for authorities with any kind of court order. Remember, you don’t have to be informed when someone has a court order for your files on someone else’s server. Third-party encryption would solve this, but it also leaves Apple open to all kinds of lawsuits that would render its encryption system null and void. So what to do?
Seen from this perspective, Apple’s CSAM gate makes the government get a court order for your phone, which we now know they can crack (thanks, Israel!). This provides just enough information to authorities to get a warrant, without opening your information to a fishing expedition without your knowledge. To me this suggests a quantum leap forward in security. Argue all you like about slippery slopes, but when you’re on the beach and you know a tsunami is on the way, I would be heading for higher ground.
It would be helpful (rather than just trashing the fix Apple has offered) to offer an alternative solution - unless we wish to state that there is no problem to solve.
Criticism is easy but solutions are difficult - but let’s try.
I’ll have a think.
They must enable End-to-End Encryption on all iCloud user data at the same time they release that client-side CSAM matching.
What about the final human review then? Well, the safety voucher includes a preview image, as I understand it. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers unlock and the review team looks at the previews.
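Roughly, the flow I’m describing would look something like the sketch below. This is just my reading of the announcement, not Apple’s actual implementation; the type names and the threshold value are made up for illustration.

```swift
// Hypothetical sketch of the reviewer-side flow: each voucher carries an
// encrypted preview, and previews only become readable once the number of
// matching vouchers crosses a threshold. All names here are illustrative.
struct SafetyVoucher {
    let encryptedOriginalRef: String   // the full photo stays encrypted in iCloud
    let encryptedPreview: Data         // low-res preview locked inside the voucher
}

struct ReviewQueue {
    let threshold = 30                 // assumed value, purely for illustration
    private(set) var vouchers: [SafetyVoucher] = []

    mutating func add(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    // Previews stay opaque below the threshold; only at or above it could a
    // human reviewer decrypt and inspect them.
    func previewsForHumanReview(decrypt: (Data) -> Data) -> [Data]? {
        guard vouchers.count >= threshold else { return nil }
        return vouchers.map { decrypt($0.encryptedPreview) }
    }
}
```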
Even with full E2E encryption enabled, that feature should not be implemented silently, even if it is stated in the user agreement. The actual announcement says nothing about user consent, prior notification, or the like. Apple’s stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan to iCloud. In order to download a CSAM database onto your device and initiate the CSAM scan on your photos, tap Continue. Continue | Let Me Think | More Info..."
And the result of the scan should be clearly communicated to the user: "13,864 photos scanned, 104 photos will be sent with a voucher. Show | Discard | Stop Upload"
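Just to make the idea concrete, here is one way such a consent prompt could look on iOS. This is only an illustration of the wording I’m proposing, using the standard UIAlertController API; nothing here is an actual Apple feature or setting.

```swift
import UIKit

// Illustrative only: the suggested consent prompt, built with UIAlertController.
// The wording and button titles come from the comment above.
func presentCSAMConsentAlert(from viewController: UIViewController,
                             onContinue: @escaping () -> Void) {
    let alert = UIAlertController(
        title: "iCloud Photos",
        message: "We can only accept photos that pass a CSAM scan to iCloud. "
               + "To download the CSAM database onto your device and scan your photos, tap Continue.",
        preferredStyle: .alert
    )
    alert.addAction(UIAlertAction(title: "Continue", style: .default) { _ in onContinue() })
    alert.addAction(UIAlertAction(title: "Let Me Think", style: .cancel, handler: nil))
    alert.addAction(UIAlertAction(title: "More Info…", style: .default, handler: nil))
    viewController.present(alert, animated: true)
}

// A similar summary alert could be shown after the scan completes, e.g.
// "13,864 photos scanned, 104 will be sent with a voucher. Show | Discard | Stop Upload".
```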
If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.
The user must be in control of that scan and of its results.
This is well thought out. I couldn’t agree more. If anything, Apple should move all CSAM matches into an album that is not uploaded to iCloud Photos.
I think that the photo scanning won’t solve anything. What if someone packs CSAM material into a PDF and uploads it to iCloud as a document file? Will Apple consider that OK?
You’re confusing anything with everything. Just because Apple’s approach won’t solve everything doesn’t mean it won’t solve anything.
And Apple has never claimed that their plans will completely rid the world of child abuse.