macplusplus

About

Username: macplusplus
Joined
Visits: 263
Last Active
Roles: member
Points: 3,111
Badges: 1
Posts: 2,112
  • Apple privacy head explains privacy protections of CSAM detection system

    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    They must enable End-to-End Encryption on all iCloud user data at the same time they release that client-side CSAM matching. 

    What about the final human review then? Well, the safety voucher includes a preview image, as I understand it. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers will unlock and the team will perform its review on the previews.
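    As I understand it, the gating works roughly like this (a minimal Swift sketch with hypothetical names, not Apple's code; the actual design reportedly uses threshold secret sharing so vouchers are cryptographically unreadable below the threshold, which this sketch does not model):

        import Foundation

        // Hypothetical types; "preview" stands for the encrypted visual derivative
        // carried inside the voucher alongside the match result.
        struct SafetyVoucher {
            let preview: Data
            let matchedKnownCSAM: Bool
        }

        struct ReviewGate {
            // The exact threshold is not public; treated as configurable here.
            let threshold: Int

            // Previews become available for human review only once the number of
            // matching vouchers for an account reaches the threshold.
            func previewsForReview(_ vouchers: [SafetyVoucher]) -> [Data] {
                let matches = vouchers.filter { $0.matchedKnownCSAM }
                guard matches.count >= threshold else { return [] }
                return matches.map { $0.preview }
            }
        }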

    Even with full E2E encryption enabled, that feature should not be implemented silently, even if it is stated in the user agreement. The actual announcement says nothing about user consent, prior notification and the like. Apple's stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan to iCloud. In order to download a CSAM database onto your device and initiate the CSAM scan on your photos, click Continue. Continue | Let Me Think | More Info..."

    And the result of the scan should be clearly communicated to the user: "13864 photos scanned, 104 photos will be sent with voucher. Show | Discard | Stop Upload"
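    In code terms, the prompt and summary I am proposing might look like this (a hypothetical UIKit sketch; the strings, action titles and the startScan callback are my own illustration, not anything Apple has announced):

        import UIKit

        // Hypothetical consent prompt shown before any CSAM database is downloaded.
        func presentCSAMConsentAlert(on viewController: UIViewController,
                                     startScan: @escaping () -> Void) {
            let alert = UIAlertController(
                title: "iCloud Photos",
                message: "We can only accept photos that pass a CSAM scan to iCloud. "
                    + "To download a CSAM database onto your device and scan your photos, tap Continue.",
                preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "Continue", style: .default) { _ in startScan() })
            alert.addAction(UIAlertAction(title: "Let Me Think", style: .cancel, handler: nil))
            alert.addAction(UIAlertAction(title: "More Info...", style: .default, handler: nil))
            viewController.present(alert, animated: true)
        }

        // Result summary mirroring the message suggested above.
        func scanSummary(scanned: Int, flagged: Int) -> String {
            "\(scanned) photos scanned, \(flagged) photos will be sent with voucher."
        }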

    If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.

    The user must be in control of that scan and of its results.
  • Apple privacy head explains privacy protections of CSAM detection system

    peteo said:
    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    we already know what the solution is. Only run this in the cloud, on iCloud Photos. Do not run it on the user's device. Of course, I believe they cannot do this since iCloud photos are encrypted when in the cloud?
    iCloud Photos are encrypted on the cloud with Apple's keys, not device (user's) keys.
  • Apple privacy head explains privacy protections of CSAM detection system

    Old privacy chief:

    "Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.

    As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    New privacy chief:

    "The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos" (from the TechCrunch interview quoted in the article).

    What do you think?

  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    All the boneheads claiming Apple created a "backdoor" -- nope. All the tech companies do this (Dropbox, Microsoft, Google), and Apple did 100% server-side CSAM scanning a year ago:

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    ...you dudes are simply panicking and clutching your pearls because you didn't know about it before.
    That article is an argument against Apple, not in favor of it. Since they are already performing that scan on their servers, what is the point of injecting another mechanism into the device itself? I have no authority over iCloud servers, those are Apple's property, but I do have authority over my device, and I don't want it to be used to inspect me. This is no different from planting a camera in your house to monitor whether you abuse your children or your wife.

    Previously, Apple had rejected the government's request to develop a custom iOS in order to break into criminals' iPhones. That would have worked on a case-by-case basis, but Apple still rejected it because it would set a precedent. And now, Apple sets that precedent voluntarily, and not on a case-by-case basis but for the whole ecosystem...
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    lkrupp said:
    All this handwringing and foaming at the mouth, spittle flying everywhere. Fine, but what do you all plan to DO about it? Leave the iOS platform? Where will you go? Android, even though Google has been doing this for a couple of years now and Apple is just playing catch up? Will you extricate yourself from the online universe and go off the grid in a bunker in northern Utah or the Yukon territory, maybe Hudson’s Bay? Of course you can’t change anything at the ballot box because both parties are on board with this. It’s coming and there’s NOTHING you can do about it except bitch on a tiny tech blog. 

    What will you do? Where will you go? Any answers?

    Maybe those bitching on this tiny tech blog can't do much, but [unfortunately] parents can, by keeping the iPhone away from their kids: "iPhone? God forbid! I heard that Apple's cloud is full of paedophiles and Apple is working hard to deal with that..."