What you need to know: Apple's iCloud Photos and Messages child safety initiatives


Comments

  • Reply 141 of 162
StrangeDays Posts: 12,877 member
    chadbag said:
I’ll admit to not reading all 138 (as of now) replies. However, this article is misleading when it claims this is not new tech and that Apple is just catching up with others who have been doing this for a decade. The fact is, this IS new technology and Apple is the first to use it. Google, Dropbox, Microsoft, etc. are NOT scanning on my private device. They scan on their servers when you upload to them. The difference is that Apple is putting this on device. And once on device, it can be updated to check anything they want. All the safeguards Apple claims are just policies. And those can easily be changed.
    The same technology on device versus on-server is still the same technology, just in a different place.
Yup. These paranoid sorts keep ignoring that cloud hosting companies could alter their server-side code and do the same thing there. In fact that's arguably easier, since it doesn't require client software updates to be pushed out.
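To make the "same technology, different place" point concrete: a minimal sketch, using a stand-in cryptographic hash rather than Apple's actual NeuralHash/PSI construction, so the example runs. The matching routine is identical whether the client calls it before upload or the server calls it after; only the call site differs.

```python
# Minimal sketch, NOT Apple's actual implementation: a cryptographic
# hash stands in for a perceptual hash so the example is runnable.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash or PhotoDNA.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes, known_hashes: set) -> bool:
    # The same routine works on device (before upload) or on the
    # server (after upload); nothing about the logic changes.
    return image_hash(image_bytes) in known_hashes

known = {image_hash(b"previously flagged image")}  # hypothetical database
print(matches_known_csam(b"previously flagged image", known))  # True
print(matches_known_csam(b"holiday photo", known))             # False
```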
  • Reply 142 of 162
crowley Posts: 10,453 member
I wonder if there's any scope for making the hash list extractable and directly comparable against the hash list from the NCMEC, for increased transparency. Anyone who subscribes to the slippery-slope arguments could be a little placated by the knowledge that any hashes not from the NCMEC would be immediately obvious, and Apple would be held to account.
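As a toy illustration of that kind of audit, assuming the on-device list could be exported and NCMEC published a reference set (neither is the case today), the check would reduce to a set difference:

```python
# Hypothetical transparency audit: any hash shipped on the device that
# is not in the published NCMEC set would need to be accounted for.
def audit_hash_list(device_hashes: set, ncmec_hashes: set) -> set:
    """Return hashes present on the device but absent from the NCMEC set."""
    return device_hashes - ncmec_hashes

device = {"aa11", "bb22", "cc33"}  # extracted from the OS image (hypothetical)
ncmec = {"aa11", "bb22"}           # published reference set (hypothetical)
print(audit_hash_list(device, ncmec))  # {'cc33'} -> Apple would have to explain this
```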
  • Reply 143 of 162
fastasleep Posts: 6,417 member
    GusAgain said:
What you need to know: Yes, Apple will check stuff on YOUR device, opening precedents for totalitarian governments. No matter how you guys or Apple put it, this is the absolute truth.
    Counterpoint: Only a Sith deals in absolutes.
  • Reply 144 of 162
radarthekat Posts: 3,842 moderator
chadbag said: […]
The same technology on device versus on-server is still the same technology, just in a different place.
    So simply stated.  How can so many still be so confused?  Boggles the mind. 
  • Reply 145 of 162
fastasleep Posts: 6,417 member
radarthekat said:
So simply stated. How can so many still be so confused? Boggles the mind.
    People believe feelings are more important than facts.
  • Reply 146 of 162
radarthekat said:
So simply stated. How can so many still be so confused? Boggles the mind.
No, it is NOT as simple as that. There is a difference, and it is an important one. Apple searching for CSAM in iCloud is "searching for illegal content within their own property." Apple searching for CSAM on an end user's phone is "an invasion of the individual's privacy." The scope of the former CANNOT be expanded in the future to other content not uploaded to iCloud. The scope of the latter CAN be expanded in the future to other content on the phone.
  • Reply 147 of 162
radarthekat Posts: 3,842 moderator
No, it is NOT as simple as that. There is a difference, and it is an important one. […]
You do know it’s applied only to images to be uploaded to iCloud. Which is an important distinction in two respects: 1. A user can avoid this by not uploading images to iCloud. 2. It allows Apple to ensure that it’s not hosting illegal child abuse images on its servers.
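A minimal sketch of that gating, with a hypothetical photo-library model (the field name and API here are assumptions, not Apple's): only assets queued for iCloud Photos upload ever reach the hash check.

```python
# Sketch of upload-gated scanning: photos kept local-only are never hashed.
from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes
    icloud_upload_pending: bool  # hypothetical flag set by the Photos app

def assets_to_hash_check(library: list) -> list:
    # Turning iCloud Photos off leaves this list empty, so nothing is scanned.
    return [p for p in library if p.icloud_upload_pending]

library = [Photo(b"local only", False), Photo(b"queued for iCloud", True)]
print(len(assets_to_hash_check(library)))  # 1
```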
  • Reply 148 of 162
radarthekat said:
You do know it’s applied only to images to be uploaded to iCloud. Which is an important distinction in two respects: 1. A user can avoid this by not uploading images to iCloud. 2. It allows Apple to ensure that it’s not hosting illegal child abuse images on its servers.
Until it is changed or hacked, this is applied only to images uploaded to iCloud. If you want to scan images uploaded to a cloud, just do it in the cloud!
But NEVER on my devices.
  • Reply 149 of 162
radarthekat said:
You do know it’s applied only to images to be uploaded to iCloud. Which is an important distinction in two respects: 1. A user can avoid this by not uploading images to iCloud. 2. It allows Apple to ensure that it’s not hosting illegal child abuse images on its servers.
Agreed, it is applied only to images to be uploaded to iCloud, FOR NOW. There is still a possibility of scope creep (scanning other content on the phone that is not marked for upload to iCloud) in the future, particularly in China, Russia, or Saudi Arabia. That is why people who can think critically call it a "backdoor": they understand how these features start off (a noble cause) and how they expand in scope (surveillance) over the long run. If Apple were to scan for CSAM content in iCloud instead, there would be NO possibility of scope creep onto other content on users' phones. That is an important distinction on this topic.
  • Reply 150 of 162
radarthekat said:
You do know it’s applied only to images to be uploaded to iCloud. Which is an important distinction in two respects: 1. A user can avoid this by not uploading images to iCloud. 2. It allows Apple to ensure that it’s not hosting illegal child abuse images on its servers.
On the point that a user can avoid this by not uploading images to iCloud: I have seen this made by multiple members of this forum. Seriously? What happened to "Ecosystem stickiness" and "It just works"? They don't apply???
  • Reply 151 of 162
crowley Posts: 10,453 member
On the point that a user can avoid this by not uploading images to iCloud: Seriously? What happened to "Ecosystem stickiness" and "It just works"? They don't apply???
It does just work. You just need to agree that you won't upload CSAM to iCloud Photos, and agree to have your photos go through a basic scan to check that you're not doing that. It's all user-invisible, so it just works.
  • Reply 152 of 162
crowley Posts: 10,453 member
    xyzzy-xxx said:
Until it is changed or hacked, this is applied only to images uploaded to iCloud. If you want to scan images uploaded to a cloud, just do it in the cloud!
But NEVER on my devices.
    Why?  The TSA don't pat you down for a firearm when you're already on the aeroplane.  Why does the scan being on your phone make so much of a difference?
  • Reply 153 of 162
fastasleep Posts: 6,417 member
Agreed, it is applied only to images to be uploaded to iCloud, FOR NOW. There is still a possibility of scope creep (scanning other content on the phone that is not marked for upload to iCloud) in the future, particularly in China, Russia, or Saudi Arabia. […]
"OMG, what if!" literally applies to every possible thing you can think of. What if Russia makes Apple upload Spotlight databases for all Russian users! Sounds scary, right? I can make up a ton of these. It doesn't mean it'll happen.
  • Reply 154 of 162
"OMG, what if!" literally applies to every possible thing you can think of. What if Russia makes Apple upload Spotlight databases for all Russian users! Sounds scary, right? I can make up a ton of these. It doesn't mean it'll happen.
So, are you of the type "I don't need privacy because I have nothing to hide. Like me, no one needs privacy. Because if they have something to hide, they deserve to be punished anyway"?

Apple can search within their own property (iCloud) all they want. But they should NOT search end users' phones, which are the users' property, WITHOUT a warrant.
  • Reply 155 of 162
crowley Posts: 10,453 member
So, are you of the type "I don't need privacy because I have nothing to hide. Like me, no one needs privacy. Because if they have something to hide, they deserve to be punished anyway"?

Apple can search within their own property (iCloud) all they want. But they should NOT search end users' phones, which are the users' property, WITHOUT a warrant.
The photo is my property whether it's on my phone or in iCloud. But if I want to place it in iCloud Photos, then Apple have the right to check it to make sure it's something they're willing to host. And it makes no difference whether that check is before or after the transfer; it's Apple's service, I'm requesting to use it, and they're negotiating the contract.
  • Reply 156 of 162
    crowley said:
The photo is my property whether it's on my phone or in iCloud. But if I want to place it in iCloud Photos, then Apple have the right to check it to make sure it's something they're willing to host. And it makes no difference whether that check is before or after the transfer; it's Apple's service, I'm requesting to use it, and they're negotiating the contract.
No, it DOES make a difference, which is what you guys are NOT getting. If someone enters an airport with illegal drugs or weapons, the police have every right to arrest that person, because the search happens on the authority's premises. But the same person cannot be arrested by the police for possessing illegal drugs or weapons in his own house without a warrant. The police cannot go and search everyone's house for illegal drugs or weapons without a warrant. Same person, same contraband; but depending on where the search is done, the requirement for a warrant issued by a court becomes relevant or irrelevant.

Apple searching for CSAM on people's phones is similar to police searching for illegal items in everyone's house without a warrant. So when you say "it makes no difference whether the check is before or after the transfer," that is incorrect. There IS a difference, and it is an important one. One is a blatant violation of privacy; the other is not. You would be better off thinking through this distinction before commenting. The good news is that most people DO get it and are upset over it, rightly so.
  • Reply 157 of 162
crowley Posts: 10,453 member
No, it DOES make a difference, which is what you guys are NOT getting. […] Apple searching for CSAM on people's phones is similar to police searching for illegal items in everyone's house without a warrant. […] The good news is that most people DO get it and are upset over it, rightly so.
You're right, I'm not getting it. For one, Apple are not the police, and everything is done by agreement with Apple. If you don't want this to happen, then turn off iCloud Photos and Apple will not hash-check any of your photos. Moreover, this only happens when a photo is in the process of being transferred to iCloud Photos; i.e. Apple are not scanning generic data on your phone, they are explicitly, as part of the transfer process, doing some routine checks, akin to validating datatypes, verifying checksums, or scanning for viruses or otherwise malicious code. The "warrant", as you put it, is invoked by your request to put something into iCloud Photos, and that'll be within Apple's terms and conditions for using the iCloud Photos service.

And ultimately there is no practical difference. You want the photo in iCloud, the photo gets checked, the photo is in iCloud. That's the exact same effect whether the hash check was on device or on server. The ruleset is the same, the technology would be pretty much the same; it's just the location that is different. Zero user impact, and absolutely no difference in impact to privacy. This theoretical distinction is completely arbitrary; it makes no difference.
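A sketch of that framing, with hypothetical names throughout: the hash check runs as one step of the transfer routine, and its result travels as a "safety voucher" alongside the photo, per Apple's published description. The cryptography that keeps vouchers unreadable below the match threshold is omitted here.

```python
# Toy upload pipeline: the hash check is just another transfer-time step.
import hashlib

KNOWN_HASHES = {hashlib.sha256(b"flagged example").hexdigest()}  # hypothetical

def make_voucher(photo: bytes) -> dict:
    # Stand-in for Apple's safety voucher; in the real design the match
    # result is cryptographically unreadable until a threshold of matches.
    return {"matched": hashlib.sha256(photo).hexdigest() in KNOWN_HASHES}

def upload_to_icloud(photo: bytes) -> dict:
    voucher = make_voucher(photo)  # runs on device, at transfer time
    return {"payload": photo, "voucher": voucher}  # both go to the server

print(upload_to_icloud(b"holiday photo")["voucher"])  # {'matched': False}
```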
  • Reply 158 of 162
Mike Wuerthele Posts: 6,861 administrator
No, it DOES make a difference, which is what you guys are NOT getting. […] Apple searching for CSAM on people's phones is similar to police searching for illegal items in everyone's house without a warrant. […] The good news is that most people DO get it and are upset over it, rightly so.
That last claim, that most people get it and are upset over it, is a stunningly inaccurate statement. You're confusing the loudness of the complaints with the number of them. This is an issue that about 10% of Apple's market cares about, and of that 10%, it looks like about a third are strongly against it and "upset over it."

And I understand what you're trying to say. As it pertains to the "without warrant" point, since Apple is requiring iCloud Photos to be on, the lawyers we've spoken to disagree with you, as it literally does not matter where the scan is performed. As it pertains to the distinction: Apple gets no information at all from the photos it is scanning (that aren't CSAM), so I'm not sure how you see this as a privacy violation.

You are obviously welcome to have different privacy lines than I do, because that is the main bone of contention. Well, that and how much any given user weighs the "might" and "could" in the arguments about what the future may bring. However, it's a little strange to present your opinion (or mine) as a universal one, and it remains baffling to me how the folks who are concerned about this now haven't been for the last 13 years, as the technology has rolled out across the internet with most implementers taking much less regard for users' privacy.
  • Reply 159 of 162

No, it DOES make a difference, which is what you guys are NOT getting. […] Apple searching for CSAM on people's phones is similar to police searching for illegal items in everyone's house without a warrant. […] The good news is that most people DO get it and are upset over it, rightly so.
Thank you for this summary, which sums up the problem really well. Whether it makes a difference from a legal point of view I cannot judge, but it makes a difference to me as a user of their product, because it makes me feel differently about my iPhone. The phone will have a built-in mechanism that can scan its user's content against moral criteria (what is considered illegal) if they agree to use iCloud Photos. I do not want such a mechanism in my phone, even turned off. The location of the scanning matters to me. I would not object to the scanning if it were performed off device, on Apple's servers, where it belongs, though I would prefer that ordinary users like me, who do not possess CSAM, were not routinely scanned for something that only a small criminal minority possesses.

Doing it on device is different, and it is something I have not heard of Google doing (though I don't know Android or Pixel phones well enough to say).
  • Reply 160 of 162
Apple’s documentation on the new child protection features says that Apple will only be able to decrypt and access photos identified by the algorithm as CSAM. What is the status quo? I thought Apple held the keys to all iCloud backups, including iCloud Photos, and could decrypt and access them. Does that mean it will no longer have access to any data other than that marked as CSAM once the feature goes live? If on-device scanning is the cost of real end-to-end encryption in iCloud, that would make it a more worthwhile trade-off to consider.
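The threshold behavior in Apple's documentation can be illustrated with a textbook Shamir secret-sharing toy (this is not Apple's actual construction, and the numbers below are made up): each matching photo contributes one share of a decryption key, and below the threshold the key is unrecoverable.

```python
# Toy Shamir secret sharing over a prime field, illustrating why Apple
# says it can read match data only after a threshold (~30) of matches:
# below the threshold the shares reveal (almost) nothing about the key.
import random

PRIME = 2**61 - 1  # Mersenne prime used as the field modulus

def split_secret(secret: int, threshold: int, n_shares: int):
    # Random polynomial of degree threshold-1 with the secret at x=0.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def recover_secret(shares):
    # Lagrange interpolation of the polynomial at x = 0.
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key, threshold = 123456789, 3  # toy values; Apple's stated threshold is ~30
shares = split_secret(key, threshold, n_shares=10)
assert recover_secret(shares[:threshold]) == key      # enough matches: recoverable
assert recover_secret(shares[:threshold - 1]) != key  # below threshold (w.h.p.)
print("threshold demo passed")
```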