macplusplus
About
- Username
- macplusplus
- Joined
- Visits
- 293
- Last Active
- Roles
- member
- Points
- 3,141
- Badges
- 1
- Posts
- 2,119
Reactions
-
Researchers who built rudimentary CSAM system say Apple's is a danger
AppleInsider said:
There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out. -
Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'
StrangeDays said:
mfryd said:
NYC362 said:
Come on already. Google and Facebook have been doing this for years. Suddenly when Apple wants to do the same thing, everyone gets twisted.
It is just like the 1,000 gas-powered cars that catch on fire and don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world.
On the other hand, Apple prides itself on protecting the privacy of its customers. A key selling point of buying into the Apple ecosystem is that Apple does everything it possibly can to protect your data. It even fights court orders requiring it to add back doors to iPhone local encryption.
Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images. If you have too many blacklisted images, you will be reported to the authorities.
Initially, the blacklist will only contain child porn images. I can easily imagine a narcissistic leader ordering Apple to add to that list images that are critical of the government. Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie. Such a President would have a strong incentive to add these photos to the list. Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it were for a good cause).
There is nothing new here. You’re not forced to use any of these commercial cloud services to host your images online.
And reporting child porn libraries to the authorities is not some choice Apple makes — they’re required to by law. As are the other cloud storage services. You can’t store child porn, it’s against the law.
Neither you nor Apple can burst into my phone without a search warrant; it is as private as my house.
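The matching mfryd describes above (opaque fingerprints on the device, and a report only after a threshold of matches) can be sketched in a few lines. This is a toy model and not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic thresholding, while the SHA-256 digest and the FingerprintMatcher type below are stand-ins invented purely for illustration.

```swift
import Foundation
import CryptoKit

// Toy model of blacklist matching: the device holds only opaque digests,
// never the blacklisted images themselves, and nothing is flagged until a
// threshold of matches is crossed. NOT Apple's algorithm; SHA-256 is used
// here only because it ships with CryptoKit, whereas a real perceptual hash
// would survive resizing and recompression.
struct FingerprintMatcher {
    let blacklistedFingerprints: Set<Data>   // digests only
    let reportingThreshold: Int

    // Stand-in fingerprint function (hypothetical).
    func fingerprint(of imageData: Data) -> Data {
        Data(SHA256.hash(data: imageData))
    }

    // A single match below the threshold produces no report.
    func exceedsThreshold(library: [Data]) -> Bool {
        let matches = library.filter {
            blacklistedFingerprints.contains(fingerprint(of: $0))
        }
        return matches.count >= reportingThreshold
    }
}
```
-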
Apple privacy head explains privacy protections of CSAM detection system
Rayz2016 said:
mike_galloway said:
It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is no problem to solve.
Criticism is easy but solutions are difficult - but let's try.
What about the final human review then? Well, the safety voucher includes a preview image, as I understand it. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers will unlock and the review team will perform its review on the previews.
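Under that reading of the voucher design, a rough sketch could look like the following. Everything here is a simplification assumed for illustration: SafetyVoucher and VoucherReviewQueue are made-up types, and the real system relies on threshold secret sharing so the server mathematically cannot decrypt below the threshold, rather than the plain count check used here.

```swift
import Foundation
import CryptoKit

// Each match yields a voucher carrying an encrypted preview ("visual
// derivative"), never the full photo. Previews become readable only once
// the number of vouchers reaches the threshold.
struct SafetyVoucher {
    let sealedPreview: AES.GCM.SealedBox
}

struct VoucherReviewQueue {
    let threshold: Int
    private(set) var vouchers: [SafetyVoucher] = []

    mutating func add(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    // Below the threshold, nothing is returned for human review.
    func previewsForHumanReview(accountKey: SymmetricKey) -> [Data]? {
        guard vouchers.count >= threshold else { return nil }
        return vouchers.compactMap { try? AES.GCM.open($0.sealedPreview, using: accountKey) }
    }
}
```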
Even with full E2E encryption enabled, that feature should not be implemented silently, even if it is stated in the user agreement. The actual announcement says nothing about user consent, prior notification, or the like. Apple's stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan to iCloud. In order to download a CSAM database into your device and initiate the CSAM scan on your photos, click Continue. Continue | Let Me Think | More Info..."
And the result of the scan should be clearly communicated to the user: "13864 photos scanned, 104 photos will be sent with voucher. Show | Discard | Stop Upload"
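Purely as a mock-up of the prompt and summary proposed above, a sketch on iOS could look like this; the alert copy, button labels, and the ScanSummary type are hypothetical and not anything Apple has announced.

```swift
import UIKit

// Hypothetical consent prompt and scan summary, mirroring the wording above.
struct ScanSummary {
    let scannedCount: Int
    let voucheredCount: Int
}

func makeConsentAlert(onContinue: @escaping () -> Void) -> UIAlertController {
    let alert = UIAlertController(
        title: "iCloud Photos",
        message: "We can only accept photos that pass a CSAM scan to iCloud. " +
                 "In order to download a CSAM database into your device and " +
                 "initiate the CSAM scan on your photos, click Continue.",
        preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "Continue", style: .default) { _ in onContinue() })
    alert.addAction(UIAlertAction(title: "Let Me Think", style: .cancel))
    alert.addAction(UIAlertAction(title: "More Info...", style: .default))
    return alert
}

func makeSummaryAlert(for summary: ScanSummary) -> UIAlertController {
    let alert = UIAlertController(
        title: "Scan Complete",
        message: "\(summary.scannedCount) photos scanned, \(summary.voucheredCount) photos will be sent with a voucher.",
        preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "Show", style: .default))
    alert.addAction(UIAlertAction(title: "Discard", style: .destructive))
    alert.addAction(UIAlertAction(title: "Stop Upload", style: .cancel))
    return alert
}
```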
If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.
The user must be in control of that scan and of its results. -
Apple privacy head explains privacy protections of CSAM detection system
Old privacy chief:
"Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."
https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
New privacy chief:
"The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos" The TechCruch interview quoted in the article.
What do you think?