macplusplus

About

Username: macplusplus
Joined:
Visits: 293
Last Active:
Roles: member
Points: 3,141
Badges: 1
Posts: 2,119
  • Researchers who built rudimentary CSAM system say Apple's is a danger



    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, there's little chance of such a scam working out.

    The threshold is 30 pictures, according to Federighi. Thirty pictures can be sent in a flash through WhatsApp. If WhatsApp's auto-save media option is active, those 30 pictures may already have been flagged as CSAM and uploaded to iCloud before the user wakes up and deletes them one by one.
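    To make the threshold point concrete, here is a minimal sketch (in Swift) of the behaviour described above. It is not Apple's implementation: the real system uses NeuralHash and a cryptographic threshold scheme, so the plain string hashes, simple counter, and function names here are only illustrative assumptions.

    import Foundation

    // Minimal sketch of the threshold behaviour, NOT Apple's actual system.

    let matchThreshold = 30  // the figure Federighi cited

    /// Counts how many photos uploaded to iCloud Photos match the known-hash list.
    func countMatches(uploadedPhotoHashes: [String], knownHashes: Set<String>) -> Int {
        uploadedPhotoHashes.filter { knownHashes.contains($0) }.count
    }

    /// Nothing is surfaced for human review until the threshold is crossed,
    /// so 29 matching auto-saved images would still sit below the line.
    func accountFlaggedForReview(uploadedPhotoHashes: [String],
                                 knownHashes: Set<String>) -> Bool {
        countMatches(uploadedPhotoHashes: uploadedPhotoHashes,
                     knownHashes: knownHashes) >= matchThreshold
    }

    Under this logic the WhatsApp auto-save scenario only matters once the saved images are actually synced to iCloud Photos and 30 or more of them match the list.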
  • Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like how the 1,000 gas-powered cars that catch on fire get not a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images. I can easily imagine a narcissistic leader ordering Apple to add to that list images that are critical of the government. Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie. Such a President would have a strong incentive to add these photos to the list. Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it were for a good cause).
    You can imagine whatever you like, but the system uses the same NCMEC child porn database utilized by Microsoft, Google, Dropbox, Twitter, etc. Your little fantasy about the POTUS adding images to it would affect every one of those major cloud storage providers, not just Apple. 

    There is nothing new here. You’re not forced to use any of these commercial cloud services to host your images online. 

    And reporting child porn libraries to the authorities is not some choice Apple makes — they’re required to by law. As are the other cloud storage services. You can’t store child porn, it’s against the law. 
    You can't burst into someone's house haphazardly on that pretext; you need a search warrant. 

    Neither you nor Apple can burst into my phone without a search warrant; it is as private as my house.
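    As an aside on the "digital fingerprints" mentioned in the exchange above, here is a rough sketch of hash-list matching in general. SHA-256 is only a stand-in: the real system uses NeuralHash, a perceptual hash designed to survive resizing and recompression, and the function names here are assumptions for illustration.

    import Foundation
    import CryptoKit

    // The provider holds only fingerprints of known images, never the images.

    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    /// Membership test against the NCMEC-supplied fingerprint list.
    func matchesKnownDatabase(_ imageData: Data, knownFingerprints: Set<String>) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }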
  • Apple privacy head explains privacy protections of CSAM detection system

    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    They must enable End-to-End Encryption on all iCloud user data at the same time they release that client-side CSAM matching. 

    What about the final human review then? Well, the safety voucher includes a preview image, as I understand it. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers will unlock and the team will perform its review on the previews.
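    For illustration, here is a sketch of that voucher idea as described in the paragraph above, not Apple's actual format: the real design relies on threshold secret sharing, so the shared AES-GCM key, the plain count, and the type names here are simplifying assumptions.

    import Foundation
    import CryptoKit

    struct SafetyVoucher {
        let encryptedMatchData: Data   // match metadata, opaque to the reviewer
        let encryptedPreview: Data     // the low-resolution preview image
    }

    let reviewThreshold = 30

    /// Reviewers never see full-resolution originals; only once enough matching
    /// vouchers accumulate can the previews inside them be opened at all.
    func previewsForHumanReview(vouchers: [SafetyVoucher],
                                unlockKey: SymmetricKey) throws -> [Data] {
        guard vouchers.count >= reviewThreshold else { return [] }
        return try vouchers.map {
            try AES.GCM.open(AES.GCM.SealedBox(combined: $0.encryptedPreview),
                             using: unlockKey)
        }
    }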

    Even with full E2E encryption enabled, that feature should not be implemented in a silent way, even if it is stated in the user agreement. The actual announcement includes nothing about user consent, prior notification and the like. Apple's stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan into iCloud. In order to download a CSAM database onto your device and initiate the CSAM scan on your photos, click Continue. Continue | Let Me Think | More Info..."

    And the result of the scan should be clearly communicated to the user: "13864 photos scanned, 104 photos will be sent with voucher. Show | Discard | Stop Upload"

    If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.

    The user must be in control of that scan and of its results.
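    A sketch of the consent-first flow proposed above. The dialog wording follows the suggestion in this comment; the type and function names are hypothetical, not any Apple API.

    import Foundation

    enum ConsentChoice { case continueScan, letMeThink, moreInfo }

    /// Presented when the user tries to turn on iCloud Photos, before any hash
    /// database is downloaded to the device and before any photo is scanned.
    func userConsentsToScan(_ choice: ConsentChoice) -> Bool {
        switch choice {
        case .continueScan:          return true   // explicit opt-in
        case .letMeThink, .moreInfo: return false  // nothing downloaded, nothing scanned
        }
    }

    /// Reported after the scan so the user stays in control of the results.
    func scanSummary(scanned: Int, withVoucher: Int) -> String {
        "\(scanned) photos scanned, \(withVoucher) photos will be sent with a voucher. " +
        "Show | Discard | Stop Upload"
    }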
  • Apple privacy head explains privacy protections of CSAM detection system

    peteo said:
    Rayz2016 said:

    It would be helpful (rather than just trashing the offered fix from Apple) to offer an alternative solution - unless we wish to state that there is not a problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    We already know what the solution is: only run this in the cloud, on iCloud Photos. Do not run it on the user's device. Of course, I believe they cannot do this since iCloud Photos are encrypted when in the cloud?
    iCloud Photos are encrypted in the cloud with Apple's keys, not device (user's) keys.
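    To illustrate why key ownership is the crux here, a conceptual sketch (not Apple's actual iCloud design): whoever holds the decryption key decides whether server-side scanning is even possible.

    import Foundation
    import CryptoKit

    /// Standard iCloud Photos, as described above: the service holds the key,
    /// so the ciphertext it stores is decryptable (and scannable) server-side.
    func serverSideScanPossible(photo: Data) throws -> Data {
        let serviceHeldKey = SymmetricKey(size: .bits256)    // lives on the provider's servers
        let stored = try AES.GCM.seal(photo, using: serviceHeldKey)
        return try AES.GCM.open(stored, using: serviceHeldKey)  // server recovers the plaintext
    }

    /// End-to-end encrypted alternative: the key never leaves the device, so the
    /// server stores ciphertext it cannot open, and server-side scanning is out.
    func endToEndEncrypt(photo: Data) throws -> Data {
        let deviceOnlyKey = SymmetricKey(size: .bits256)     // never uploaded
        return try AES.GCM.seal(photo, using: deviceOnlyKey).combined!
    }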
  • Apple privacy head explains privacy protections of CSAM detection system

    Old privacy chief:

    "Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.

    As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    New privacy chief:

    "The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos" The TechCruch interview quoted in the article.

    What do you think?
