What you need to know: Apple's iCloud Photos and Messages child safety initiatives


Comments

  • Reply 161 of 162
    Pascalxx said:
    Apple’s documentation on the new child protection features mentions that Apple will only be able to decrypt and access photos identified by the algorithm as CSAM. What is the status quo? I thought that Apple has the keys to all iCloud backups, including iCloud Photos, and could decrypt and access them. Does that mean that it will no longer have access to any data other than what is marked as CSAM once the feature goes live? If on-device scanning is the cost of real end-to-end encryption in iCloud, that would make it a more worthwhile trade-off to consider.
    Yes, you are right on that point: Apple DOES have the keys to all iCloud backups. As for the other question, whether Apple will implement end-to-end encryption: we just don't know Apple's future plans at this point. If Apple had plans to do it, they would have highlighted that as the key point while trying to sell this on-device CSAM scanning. Since Apple hasn't mentioned anything about it, the "reasonable guess" would be that it won't happen. Makes you wonder why Apple is adamant about doing it on-device when they have the capability to do it in the cloud, to which they already have access.

    And Google/Microsoft/Facebook/everyone else does NOT do this scanning on-device (the user's property) for CSAM. They all do it in the cloud, on their own servers (their own property).
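    A rough sketch of the difference being argued about (illustrative only: SHA-256 and a plain match counter stand in for Apple's perceptual NeuralHash, private set intersection, and threshold secret sharing, none of which are public in implementable detail; all names here are hypothetical):

    ```swift
    import Foundation
    import CryptoKit

    // Hashes of already-known CSAM images, supplied to the provider by a
    // child-safety organization. SHA-256 is a stand-in for a perceptual hash.
    struct KnownHashes {
        let hashes: Set<Data>
        func matches(_ photo: Data) -> Bool {
            hashes.contains(Data(SHA256.hash(data: photo)))
        }
    }

    // Server-side model (Google/Microsoft/Facebook): photos are hashed on the
    // provider's servers after upload, so the provider handles the plaintext.
    func serverSideMatchCount(uploadedPhotos: [Data], known: KnownHashes) -> Int {
        uploadedPhotos.filter { known.matches($0) }.count
    }

    // Apple's announced model, roughly: hashing runs on the device, but only for
    // photos already queued for iCloud Photos upload, and nothing is flagged for
    // human review until the account crosses a match threshold.
    func onDeviceReviewNeeded(queuedForICloud: [Data],
                              known: KnownHashes,
                              threshold: Int) -> Bool {
        queuedForICloud.filter { known.matches($0) }.count >= threshold
    }
    ```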
    edited August 2021 Pascalxx
  • Reply 162 of 162
    fastasleep Posts: 6,420 member
    chadbag said:
    I’ll admit to not reading all 138 (as of now) replies. However, this article is misleading when it claims this is not new tech and Apple is just catching up with others who have been doing this for a decade. The fact is, this IS new technology and Apple is the first to use it. Google, Dropbox, Microsoft, etc. are NOT scanning on my private device. They scan on their servers when you upload to them. The difference is that Apple is putting this on device. And once on device, it can be updated to check anything they want. All the safeguards Apple claims are just policies, and they can easily be changed.
    The same technology on device versus on-server is still the same technology, just in a different place.

    radarthekat said:
    chadbag said:
    I’ll admit to not reading all 138 (as of now) replies. However, this article is misleading when it claims this is not new tech and Apple is just catching up with others who have been doing this for a decade. The fact is, this IS new technology and Apple is the first to use it. Google, Dropbox, Microsoft, etc. are NOT scanning on my private device. They scan on their servers when you upload to them. The difference is that Apple is putting this on device. And once on device, it can be updated to check anything they want. All the safeguards Apple claims are just policies, and they can easily be changed.
    The same technology on device versus on-server is still the same technology, just in a different place.
    So simply stated.  How can so many still be so confused?  Boggles the mind. 
    No, it is NOT as simple as that. There is a difference, and it is an important one. Apple searching for CSAM in iCloud is "searching for illegal content within their own property". Apple searching for CSAM on the end user's phone is "invasion of the individual's privacy". The scope of the former CANNOT be expanded in future to content not uploaded to iCloud. The scope of the latter CAN be expanded in future to other content on the phone.
    You do know it’s applied only to images to be uploaded to iCloud. That’s an important distinction in two respects: 1. a user can avoid this by not uploading images to iCloud, and 2. it allows Apple to ensure that it’s not hosting illegal child abuse images on its servers.
    Agreed, it is applied only to images to be uploaded to iCloud, FOR NOW. There is still a possibility of "scope creep" (scanning other content on the phone that is not marked for upload to iCloud) IN FUTURE, particularly in China/Russia/Saudi Arabia. That is why it is called a "backdoor" by people who think critically about how these features start off (a noble cause) and how they expand in scope (surveillance) in the long run. If Apple were to scan for CSAM content in iCloud, there would be NO possibility of "scope creep" (scanning other content on users' phones) in future. That is an important distinction on this topic.
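    To be concrete about what "just policies" means here: the restriction to iCloud-bound photos is, in the end, a check in software, something like the hypothetical gate below (not Apple's actual code). Whether that check ever widens is a policy question, which is exactly what the two sides of this thread disagree about.

    ```swift
    // Hypothetical gate, not Apple's actual code: today's announced behavior is
    // that photos kept local-only are never hashed.
    func shouldScan(isQueuedForICloudUpload: Bool) -> Bool {
        isQueuedForICloudUpload
    }
    ```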
    "OMG, what if!" literally applies to every possible thing you can think of. What if Russia makes Apple upload Spotlight databases for all Russian users? Sounds scary, right? I can make up a ton of these. It doesn't mean it'll happen.
    So, are you of the type "I don't need privacy because I have nothing to hide. Like me, no one needs privacy, because if they have something to hide, they deserve to be punished anyway"?

    Apple can search within their own property (iCloud) all they want. But they should NOT search the end user's property, their phone, WITHOUT a warrant.
    No, I am not that “type” and I didn’t say that. 

    What you’re asking for is for Apple to decrypt your encrypted data, which is often stored on third-party servers, in order to hash and scan it server-side. To me, that sounds like a far less secure procedure than what’s happening on device.
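    The trade-off being described reduces to something like this (a sketch with hypothetical names; the CryptoKit calls are stand-ins for whatever a provider actually uses server-side):

    ```swift
    import Foundation
    import CryptoKit

    // Server-side scanning: the provider must hold a key for your photos and
    // decrypt them on its servers before it can hash and compare them.
    func serverSideCheck(ciphertext: Data,
                         providerKey: SymmetricKey,
                         knownHashes: Set<Data>) throws -> Bool {
        let box = try AES.GCM.SealedBox(combined: ciphertext)
        let plaintext = try AES.GCM.open(box, using: providerKey) // provider sees the plaintext
        return knownHashes.contains(Data(SHA256.hash(data: plaintext)))
    }

    // On-device scanning: the hash is computed before upload, so the photo could
    // then be encrypted with a key the provider never holds.
    func onDeviceCheck(plaintext: Data, knownHashes: Set<Data>) -> Bool {
        knownHashes.contains(Data(SHA256.hash(data: plaintext)))
    }
    ```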