Apple backs down on CSAM features, postpones launch


Comments

  • Reply 61 of 158
    lkrupp Posts: 10,557 member
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report it to authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING that is preventing Apple from doing it and NO ONE is against this. The opposition is only for doing the scan ON the device, NOT in iCloud.
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
  • Reply 62 of 158
    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    But this is the back door officials are looking for… what good is e2e encryption if there is any point in the process where data can be transmitted so that it notifies a 3rd party of what it may contain? If there’s any point where data can be transmitted, it renders the purpose of encryption useless.

    While Apple's solution won't transmit anything until a certain threshold is met, the capability is there to transmit info about the data being encrypted, which necessarily circumvents the e2e process.  It becomes a "we promise and absolutely swear we won't do anything else," which is as good as no e2e encryption.  This is a slippery slope, and new "features" could feasibly be added touting the "success" of the on-device scanning, and so it would begin…

    The data may technically be e2e encrypted, but the weak point becomes just outside the front door.  If something or someone is sitting there, watching what is going in or coming out, then what good is having opaque walls?  If I knew someone was watching my front door (which in this case we do), I’d just do my criminal stuff somewhere else.

    This also opens the door for bad actors to exploit this system… people have already started poking and prodding the disabled version of this in iOS 14.

    I personally consider iCloud Photos to be public, and act accordingly, despite any "niceties" Apple may provide in terms of privacy of the service.  While I'd like to see iCloud Photos encrypted so that absolutely the only intended parties (people I've shared with) can see them, I would not accept this on-device scanning to achieve that.

    TL;DR. Stopped reading after 'slippery slope'. Sorry, dude :D You wanna sound serious, get technical first. My guess is you watch too much YouTube - and possibly read too much conspiracy fiction, too.

    Anyway, opt out of iCloud Photos if you're anxious about someone flipping through your cat pics. This generation is just beyond silly.
    edited September 2021
  • Reply 63 of 158
    bluefire1 said:
    People are against a lot of horrible things, but Apple, regardless of how laudable the company’s intentions, has no business intercepting anything that’s on my phone. Privacy doesn’t come with a back door. As for what people post on social media-that’s their choice.
    People need to stop using terms they don't understand.  This is NOT an example of a backdoor.  Also, nobody is intercepting photos on your phone.  Apple's CSAM scan checks your photos JUST BEFORE YOU UPLOAD THEM to Apple's iCloud service.

    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    Just the opposite.  I'm guessing this is a necessary prerequisite for Apple to put in place before it could ever go with end-to-end encryption and still remain compliant with authorities by not holding CSAM material on their cloud. 

    Did Apple ever explain in easy to understand language how a human can review the photos if they are encrypted and private? Which is it? Reviewable by humans or encrypted? It can't be both.
    Encrypted but with keys held by Apple, so that the images can be decrypted on-demand.
    Correct.  Apple doesn't use end-to-end encryption with iCloud photos.  Yet.

    Child Sexual Abuse Material - CSAM - would affect too many powerful people in all walks of life!  I will not list the who's who, since I may be "Cancelled".
    iPhone sales will be down drastically from these people (we're talking about worldwide issues here), plus other paranoias. Apple has good intentions, but their large profits will be eaten.
    Complete nonsense.  The only people crying about this are people who don't understand the technology and, of course, people with a collection of CSAM material, I suppose.  What's more, given that Microsoft, Google, Facebook, etc. and everyone else with cloud storage are already scanning for such photos, where are the paranoid and delusional people going to go?
  • Reply 64 of 158
    lkrupp said:
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
    No, images are scanned on device.  What's funny is that people are just getting worked up about this now.  Apple has been scanning images on our devices for a long time.  This isn't the CSAM hash-type scanning; it's the machine-learning scanning I'm talking about.  That's how we can search for generic things like "dog" or "beach" and get a bunch of relevant pictures from our library.  Where are all the "slippery slope" discussions around that?  Seriously, the level of stupid being raised about this topic is mind-numbing.
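The on-device search described above can be sketched roughly as follows: a local model embeds each photo as a vector, and a text query is ranked by similarity. A toy Python sketch (every vector and filename here is made up; a real system uses a trained image/text encoder):

```python
# Toy sketch of embedding-based photo search, the kind of on-device ML
# that powers searching "dog" or "beach". All vectors are invented.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Pretend each photo was already embedded on-device by a vision model.
photo_embeddings = {
    "IMG_0001.jpg": [0.9, 0.1, 0.0],   # a dog photo
    "IMG_0002.jpg": [0.1, 0.9, 0.2],   # a beach photo
    "IMG_0003.jpg": [0.8, 0.2, 0.1],   # another dog photo
}
query_embeddings = {"dog": [1.0, 0.0, 0.0], "beach": [0.0, 1.0, 0.1]}

def search(term, top_k=2):
    """Rank photos by similarity to the query embedding."""
    q = query_embeddings[term]
    ranked = sorted(photo_embeddings,
                    key=lambda p: cosine(photo_embeddings[p], q),
                    reverse=True)
    return ranked[:top_k]

print(search("dog"))  # ['IMG_0001.jpg', 'IMG_0003.jpg']
```

The point being: this classification runs entirely on the device and nothing leaves it, which is why it never drew the same criticism.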
  • Reply 65 of 158
    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    Just the opposite.  I'm guessing this is a necessary prerequisite for Apple to put in place before it could ever go with end-to-end encryption and still remain compliant with authorities by not holding CSAM material on their cloud. 


    My point exactly. Without CSAM, iCloud Photos will remain unencrypted - all courtesy of noobs talking nonsense on the Internet. I wish the latter were only accessible with a valid sanity certificate. 
    edited September 2021
  • Reply 66 of 158
    That is the mark of a quality corporation, as well as a quality individual: realizing that they are not perfect and that everything they do is not inherently the right thing.

    It's a humility that enables one to admit and correct mistakes -- or at least examine whether they may have been mistakes.

    Was this the right thing or the wrong thing to do?  The mere fact that Apple sees that as a valid question speaks highly of them.
    Good job Apple!

    You give Apple too much credit.
    They had no option. They did not seek (and in fact declined) discussions on this topic with industry leaders, and suddenly barged in with their CSAM solution.
    Now that they have received criticism from many credible sources, ignoring it and releasing their solution would have backfired legally and financially.
    They are doing this for financial reasons, not because they want to make the world a better place.
    That said, despite finding their proposal incredibly stupid given their mantra on privacy and the problems it would introduce down the line, I do think they did this with the best intentions and a genuine desire to find a solution to combat child pornography/abuse.
  • Reply 67 of 158
    That is the mark of a quality corporation, as well as a quality individual: realizing that they are not perfect and that everything they do is not inherently the right thing.

    It's a humility that enables one to admit and correct mistakes -- or at least examine whether they may have been mistakes.

    Was this the right thing or the wrong thing to do?  The mere fact that Apple sees that as a valid question speaks highly of them.
    Good job Apple!

    You give Apple too much credit.
    They had no option. They did not seek (and in fact declined) discussions on this topic with industry leaders, and suddenly barged in with their CSAM solution.
    Now that they have received criticism from many credible sources, ignoring it and releasing their solution would have backfired legally and financially.
    They are doing this for financial reasons, not because they want to make the world a better place.
    That said, despite finding their proposal incredibly stupid given their mantra on privacy and the problems it would introduce down the line, I do think they did this with the best intentions and a genuine desire to find a solution to combat child pornography/abuse.
    Credible sources? I ask you, dude. Paranoid noobs and exiled whistleblowers don't really count as credible.

    >> They are doing this for financial reasons, not because they want to make the world a better place.

    Welcome to the real world, Neo. © This is capitalism, a system where money comes first. Or were you under the impression it was all about making you and me happy? :D 

    Also, in this real world, governments exist, and any private or public entity, such as Apple, must comply with their regulations in order to operate freely in a given country. You can't just tell those guys to screw off or publicly expose their demands. They might not like it at all, you know.
    edited September 2021
  • Reply 68 of 158
    mr. h Posts: 4,870 member
    JBSlough said:
    Just a question on this tech. So, the software scans for known images of child porn from a third-party database. Which would imply that these images were downloaded off the internet or texted to one's phone. Well, what about the person who's actually using his/her iPhone to shoot pics, say in their home? Those wouldn't be in the database. Exactly how does that work? Or are those images "safe"? It seems to me that this tech only solves half the problem - that is, images of known child porn, not new ones. Am I correct in understanding this?
    Yes, that is correct. The proposed CSAM process would only be able to match prior-known images. There is some cleverness to try to ensure that a match will still be generated even if the image is slightly modified (e.g. cropped, mirrored, rotated slightly) from the known version, but it absolutely would not trigger on any kind of "new" photo.
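That robustness to slight modifications is the difference between a perceptual hash and a cryptographic one. A toy "average hash" illustrates the idea (a deliberately simplified sketch, not Apple's actual NeuralHash; the pixel grids are invented):

```python
# Toy perceptual "average hash": tolerant of small brightness tweaks,
# unlike a cryptographic hash, which changes completely on any edit.
# Illustrative only - NOT Apple's NeuralHash.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return ''.join('1' if p > avg else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits; a small distance means a likely match."""
    return sum(x != y for x, y in zip(a, b))

original   = [[10, 200], [30, 250]]
brightened = [[14, 204], [34, 254]]   # slightly modified copy
unrelated  = [[200, 10], [250, 30]]

h1, h2, h3 = map(average_hash, (original, brightened, unrelated))
print(hamming(h1, h2))  # 0 -> the modified copy still matches
print(hamming(h1, h3))  # 4 -> a different image does not
```

Only prior-known images (or near-copies of them) land close in hash space; a genuinely new photo produces an unrelated hash, which is why the system could not flag new material.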
  • Reply 69 of 158
    mr. h Posts: 4,870 member
    muthuk_vanalingam said:
    I don't think your understanding is correct. It is NOT an exact copy of known CSAM that is being searched for/compared against on your phone. There is a level of "pattern matching" involved when the CSAM check is done on the phone before a photo is uploaded to iCloud. So new photos taken will also get flagged IF the pattern matches known CSAM database entries.
    This is not correct. Only known CSAM images, or minor modifications of those known images, would create a match. It would not be able to detect "new" child porn. Only the kind of AI-based scanning they were proposing for the Messages app on children's phones would theoretically be able to do that (and that's not what Apple was planning to do anyway).
    edited September 2021
  • Reply 70 of 158
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
  • Reply 71 of 158
    mr. h Posts: 4,870 member
    henrybay said:
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
    The irony of this post is sky-high.

    Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted, without giving Apple the keys. Neither Apple nor anyone else (whether a hacker who got into iCloud, or law enforcement with a warrant) would have been able to inspect the photos in iCloud, with the exception of individual photos that matched a CSAM hash - and even then, at least 30 photos would have to match known CSAM material before that was possible.

    But now, since they have backed down, all of your photos will continue to be uploaded to iCloud effectively unencrypted (Apple holds the keys), where Apple, law enforcement, and any successful hacker will be able to inspect all of your photos.

    Which of these two scenarios offers more privacy?
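For what it's worth, the 30-match threshold in that design was not merely a policy promise: threshold cryptography can make decryption mathematically impossible below the threshold. A toy sketch of the underlying idea using Shamir secret sharing (the field size, the threshold of 3, and the one-share-per-matching-photo framing are illustrative assumptions, not Apple's actual protocol):

```python
# Toy Shamir secret sharing: the decryption key is split so it can only
# be recovered once at least t shares (think: t matched photos) exist.
# Numbers are made up; Apple's real voucher scheme is more involved.
import random

random.seed(42)                 # fixed seed so the sketch is reproducible
P = 2_147_483_647               # a Mersenne prime, our field modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 recovers the secret (needs t shares)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = 123456789
shares = make_shares(key, t=3, n=10)  # pretend: one share per matched photo
print(recover(shares[:3]) == key)     # True  (threshold reached)
print(recover(shares[:2]) == key)     # False (below threshold: no key)
```

With fewer than t shares, interpolation yields garbage, not the key - there is no "just peek anyway" path, which is the mathematical teeth behind the 30-photo claim.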
    edited September 2021
  • Reply 72 of 158
    Illusive said:
    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    But this is the back door officials are looking for… what good is e2e encryption if there is any point in the process where data can be transmitted so that it notifies a 3rd party of what it may contain? If there’s any point where data can be transmitted, it renders the purpose of encryption useless.

    While Apple's solution won't transmit anything until a certain threshold is met, the capability is there to transmit info about the data being encrypted, which necessarily circumvents the e2e process.  It becomes a "we promise and absolutely swear we won't do anything else," which is as good as no e2e encryption.  This is a slippery slope, and new "features" could feasibly be added touting the "success" of the on-device scanning, and so it would begin…

    The data may technically be e2e encrypted, but the weak point becomes just outside the front door.  If something or someone is sitting there, watching what is going in or coming out, then what good is having opaque walls?  If I knew someone was watching my front door (which in this case we do), I’d just do my criminal stuff somewhere else.

    This also opens the door for bad actors to exploit this system… people have already started poking and prodding the disabled version of this in iOS 14.

    I personally consider iCloud Photos to be public, and act accordingly, despite any "niceties" Apple may provide in terms of privacy of the service.  While I'd like to see iCloud Photos encrypted so that absolutely the only intended parties (people I've shared with) can see them, I would not accept this on-device scanning to achieve that.

    TL;DR. Stopped reading after 'slippery slope'. Sorry, dude :D You wanna sound serious, get technical first. My guess is you watch too much YouTube - and possibly read too much conspiracy fiction, too.

    Anyway, opt out of iCloud Photos if you're anxious about someone flipping through your cat pics. This generation is just beyond silly.
    First part, not helpful! ¯\_(ツ)_/¯ 

    Second part, somewhat helpful, but I'd add: those who don't wish for the scanning on their device should stay on iOS 14 or lower, and those who absolutely want to be sure should stop using iCloud Photos altogether (which the smart criminals will do anyway).

    "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety." - Benjamin Franklin

    "There is no justification for taking away individuals' freedom in the guise of public safety." - Thomas Jefferson

    "Trade liberty for safety or money and you'll end up with neither. Liberty, like a grain of salt, easily dissolves. The power of questioning - not simply believing - has no friends. Yet liberty depends on it." - Thomas Jefferson
  • Reply 73 of 158
    mr. h Posts: 4,870 member
    techconc said:
    lkrupp said:
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
    No, images are scanned on device.  What's funny is that people are just getting worked up about this now.  Apple has been scanning images on our devices for a long time.  This isn't the CSAM hash-type scanning; it's the machine-learning scanning I'm talking about.  That's how we can search for generic things like "dog" or "beach" and get a bunch of relevant pictures from our library.  Where are all the "slippery slope" discussions around that?  Seriously, the level of stupid being raised about this topic is mind-numbing.
    +1
  • Reply 74 of 158
    gatorguy Posts: 24,106 member
    mr. h said:
    henrybay said:
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
    The irony of this post is sky-high.

    Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted, without giving Apple the keys. Neither Apple nor anyone else (whether a hacker who got into iCloud, or law enforcement with a warrant) would have been able to inspect the photos in iCloud, with the exception of individual photos that matched a CSAM hash - and even then, at least 30 photos would have to match known CSAM material before that was possible.

    But now, since they have backed down, all of your photos will continue to be uploaded to iCloud effectively unencrypted (Apple holds the keys), where Apple, law enforcement, and any successful hacker will be able to inspect all of your photos.

    Which of these two scenarios offers more privacy?
    Why are you and a couple of others so convinced this was all because Apple was prepared to E2E-encrypt the whole shebang?  In truth, there is no way they could have done so for half their entire user base, as China would have barred them from the country if they did. You honestly think Apple was willing to cut revenues by a third or more?

    I get that you really REALLY want to paint a glowing picture of "gosh, Apple is doing this for us," but is there even any circumstantial evidence that Apple was ready to make everything end-to-end encrypted in a way where they could not access any of your data even if ordered to? Not as far as I know. It's more of a hope and a prayer, since otherwise it's not for the betterment of us users.
  • Reply 75 of 158
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Yep, old Jared was a pedo. Hope he rots in jail.
     
    Think of the children who will suffer abuse because a few privacy wackos don't want Apple to scan their photos. Fuck those kids, right? And are any of you privacy freaks going to say with a straight face that you are certain your data is not being scanned by Apple, Google, Facebook, the NSA, CIA, Homeland Security, et al., right NOW? When someone can swap your Lightning cable with one that can suck your data from a mile away? Years from now, declassified documents will reveal you were scanned all along without your permission or knowledge.
    I will. I am certain that my data isn't being scanned. Since it's on an air-gapped NAS...
  • Reply 76 of 158
    mr. h Posts: 4,870 member
    <deleted as I may have jumped the gun and misunderstood what I was reading - see later post below>
    edited September 2021
  • Reply 77 of 158
    radarthekat Posts: 3,828 moderator
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report it to authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING that is preventing Apple from doing it and NO ONE is against this. The opposition is only for doing the scan ON the device, NOT in iCloud.
    chadbag said:
    "I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening." — Craig Federighi

    It is what is happening.  How else do they create the “magical” hashes? It is happening on the phone.  So, Craig, why do you say that is not what is happening when that is exactly what is happening?

    So you call the creation of a checksum scanning a file?  Is that what you're saying?  Apple is simply creating a hash from each photo to be uploaded.  It can then compare that hash to hashes of the photos in a CSAM database.  This is pretty innocuous.
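The hash-then-compare step described above amounts to set membership: hash the outgoing photo, check whether that hash appears in the known database. A minimal sketch (SHA-256 stands in for the perceptual hash here, and the database entry and byte strings are hypothetical):

```python
# Sketch of hash-and-compare before upload. SHA-256 is used purely for
# illustration; the real system used a perceptual hash (NeuralHash),
# and the "database" below is an invented stand-in.
import hashlib

known_hashes = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def check_before_upload(photo_bytes):
    """Return True if the photo's hash matches a known database entry."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in known_hashes  # a match would generate a safety voucher

print(check_before_upload(b"known-bad-image-bytes"))  # True
print(check_before_upload(b"holiday-photo-bytes"))    # False
```

Note that the hash reveals nothing about a non-matching photo's content, which is the basis of the "this is pretty innocuous" argument.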
  • Reply 78 of 158
    Illusive said:
    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    But this is the back door officials are looking for… what good is e2e encryption if there is any point in the process where data can be transmitted so that it notifies a 3rd party of what it may contain? If there’s any point where data can be transmitted, it renders the purpose of encryption useless.

    While Apple's solution won't transmit anything until a certain threshold is met, the capability is there to transmit info about the data being encrypted, which necessarily circumvents the e2e process.  It becomes a "we promise and absolutely swear we won't do anything else," which is as good as no e2e encryption.  This is a slippery slope, and new "features" could feasibly be added touting the "success" of the on-device scanning, and so it would begin…

    The data may technically be e2e encrypted, but the weak point becomes just outside the front door.  If something or someone is sitting there, watching what is going in or coming out, then what good is having opaque walls?  If I knew someone was watching my front door (which in this case we do), I’d just do my criminal stuff somewhere else.

    This also opens the door for bad actors to exploit this system… people have already started poking and prodding the disabled version of this in iOS 14.

    I personally consider iCloud Photos to be public, and act accordingly, despite any "niceties" Apple may provide in terms of privacy of the service.  While I'd like to see iCloud Photos encrypted so that absolutely the only intended parties (people I've shared with) can see them, I would not accept this on-device scanning to achieve that.

    TL;DR. Stopped reading after 'slippery slope'. Sorry, dude :D You wanna sound serious, get technical first. My guess is you watch too much YouTube - and possibly read too much conspiracy fiction, too.

    Anyway, opt out of iCloud Photos if you're anxious about someone flipping through your cat pics. This generation is just beyond silly.
    First part, not helpful! ¯\_(ツ)_/¯ 

    Second part, somewhat helpful, but I'd add: those who don't wish for the scanning on their device should stay on iOS 14 or lower, and those who absolutely want to be sure should stop using iCloud Photos altogether (which the smart criminals will do anyway).

    “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” - Benjamin Franklin

    “There is no justification for taking away individuals' freedom in the guise of public safety.” - Thomas Jefferson

    “Trade liberty for safety or money and you'll end up with neither. Liberty, like a grain of salt, easily dissolves. The power of questioning - not simply believing - has no friends. Yet liberty depends on it.” - Thomas Jefferson

    Don’t give me that Uncle Sam stuff, man :D It ain’t technical, so I literally don’t care. As for the update holdouts, they will update sooner or later anyway. We’ve been there already. 

    This whole discussion is a waste of time. You want your liberties, go for them. Just make sure your actions don’t make things worse for those of us who aren’t paranoid. 
    edited September 2021
  • Reply 79 of 158
    mr. h Posts: 4,870 member
    gatorguy said:

    I get that you really REALLY want to paint a glowing picture of "gosh, Apple is doing this for us," but is there even any circumstantial evidence that Apple was ready to make everything end-to-end encrypted in a way where they could not access any of your data even if ordered to? Not as far as I know. It's more of a hope and a prayer, since otherwise it's not for the betterment of us users.
    All I can say about that is that the whole scheme would be totally pointless if they weren't going to encrypt the photos. Why go to all the effort of designing this enormously complicated system, calculating hashes on-device, doing the CSAM hash-matching in a "blind" way so even the device itself doesn't know if there's been a match, and then going to all the convoluted effort of generating doubly-encrypted "vouchers" and associated "image information", if the photo itself was uploaded to iCloud unencrypted?

    Certainly, this system would enable the photos to be uploaded to iCloud encrypted, but I concede that as far as I know, Apple hasn't said that they would do that. It's just that, as I said, the whole scheme seems totally pointless if the photos are uploaded to the server in the clear anyway.

    How about Apple just offers a toggle in iCloud photos settings? The two options would be:

    1. Photos are CSAM-scanned and encrypted before being uploaded to iCloud.
    2. Photos are not CSAM-scanned, but are uploaded to iCloud in the clear. The server then does the CSAM scan.

    Would this solution make everyone happier?
  • Reply 80 of 158
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    I don't.