Apple backs down on CSAM features, postpones launch


Comments

  • Reply 101 of 158
    crowley said:
    A triumph of misinformation and hysteria.
    ...or, it stings to be wrong. 
  • Reply 102 of 158
    crowley said:
    A triumph of misinformation and hysteria.
    ...or, it stings to be wrong. 
    Still much better than being paranoid. 
  • Reply 103 of 158
    mike54 said:
    It’s not about child protection, it’s all about getting a US Gov surveillance software framework onto Apple devices, and Apple's best bet on getting this accepted is to use CSAM as the excuse. Their multi-million dollar privacy campaign, in the attempt to distinguish them from Google Android, is washed down the toilet with this single act.
    You’d have to thank the ‘US Gov’ even if that were true. It’s clear you’ve never run a business. Those guys absolutely aren’t someone you should - or can - mess with. 

    CSAM countermeasures haven’t changed my view of Apple a single bit. Sucking up to simps may, though. Too many of them have access to the Internet nowadays. 

    P.S. Don’t use cloud storage if you’ve got that much to hide. It’s been scanned since day 1 - have you ever been (falsely) charged with anything because of it, though?
  • Reply 104 of 158
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report them to the authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING preventing Apple from doing it and NO ONE is against this. The opposition is only to doing the scan ON the device, NOT in iCloud.
    chadbag said:
    “I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening.” — Craig Federighi 

    It is what is happening.  How else do they create the “magical” hashes? It is happening on the phone.  So, Craig, why do you say that is not what is happening when that is exactly what is happening?

    So you call creating a checksum scanning a file?  Is that what you’re saying?  Apple is simply creating a hash from each photo to be uploaded.  It can then compare that hash against the hashes of the photos in a CSAM database.  This is pretty innocuous. 
    Idi0ts won’t change. Why even reason with them?
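    A minimal sketch of the "hash each photo, then compare against known hashes" flow described in the reply above, for readers who want to picture it. An ordinary SHA-256 digest stands in for Apple's perceptual NeuralHash, and every name here is illustrative rather than Apple's actual API:

    ```swift
    import Foundation
    import CryptoKit

    // Stand-in for NeuralHash: an exact-match SHA-256 digest, used only to
    // illustrate the "hash, then compare" idea, not the real perceptual hash.
    func photoDigest(_ photoData: Data) -> String {
        SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
    }

    // Hypothetical placeholder: in the real design the hash database ships inside
    // the OS, and the device only ever sees opaque hash values, never any images.
    let knownCSAMHashes: Set<String> = []

    // A photo is flagged only if its hash matches an entry already in the database;
    // nothing else about the photo's content is inspected by this check.
    func matchesKnownHash(_ photoData: Data) -> Bool {
        knownCSAMHashes.contains(photoDigest(photoData))
    }
    ```

    In the proposed system the comparison itself was also blinded, so the device could not even tell whether a given photo had matched.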
  • Reply 105 of 158
    mr. h said:
    henrybay said:
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
    The irony of this post is sky-high.

    Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted (without giving Apple the keys), such that neither Apple nor anyone else (should they hack into iCloud, or be law-enforcement with a warrant) would have been able to inspect the photos in iCloud, with the exception of any individual photos that matched a CSAM hash, with the proviso that even then, there would have to be at least 30 photos that matched known CSAM material, before even that was possible.

    But now, since they have backed down, all of your photos will continue to be uploaded to iCloud unencrypted, where Apple, law enforcement, and any hackers will be able to inspect all of your photos.

    Which one of these two scenarios offers the most privacy?

    That’s a no brainer. I choose scenario 2 because I don’t care who sees my photos on iCloud but I care deeply about who can access the content of my iPhone. 

    I consider my iPhone an extension of my private domain, and it is therefore sacrosanct. I don’t consider cloud services in the same way and assume they are open to external scrutiny. 

    It is also naive to believe that Apple’s CSAM approach won’t be exploited by the technical wizards who created spyware programs like Pegasus. They could turn this program against us and sell their service to repressive regimes. Let’s not kid ourselves — this is a back door into our phones. No amount of reassuring techno mumbo jumbo or nuanced obfuscation can disguise this fact. 

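    As a rough illustration of the threshold rule mr. h describes above (matching photos stay unreadable until roughly 30 of them match known material), here is a hedged sketch; the structure and names are invented for illustration and are not Apple's implementation:

    ```swift
    import Foundation

    // One record per uploaded photo, noting whether its hash matched an entry
    // in the known-CSAM database (illustrative only).
    struct MatchRecord {
        let photoID: UUID
        let matchedKnownHash: Bool
    }

    // The approximate threshold Apple cited: below it, even matching photos remain
    // as inaccessible as everything else; at or above it, human review of the
    // matching photos (and only those) becomes possible.
    let reviewThreshold = 30

    func accountEligibleForReview(_ records: [MatchRecord]) -> Bool {
        records.filter { $0.matchedKnownHash }.count >= reviewThreshold
    }
    ```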
  • Reply 106 of 158
    henrybay said:
    That’s a no brainer. I choose scenario 2 because I don’t care who sees my photos on iCloud but I care deeply about who can access the content of my iPhone. 
    When you agree to the iCloud terms of service, you've granted Apple access to scan files coming from apps that use iCloud. You get to choose which apps do that, if any. Once you choose one to send files to the cloud, it doesn't matter whether those files are scanned locally or on the server. They're the same files regardless. Apple isn't scanning anything that you haven't already agreed to allow them to scan. 
  • Reply 107 of 158
    henrybay said:
    mr. h said:
    henrybay said:
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
    The irony of this post is sky-high.

    Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted (without giving Apple the keys), such that neither Apple nor anyone else (should they hack into iCloud, or be law-enforcement with a warrant) would have been able to inspect the photos in iCloud, with the exception of any individual photos that matched a CSAM hash, with the proviso that even then, there would have to be at least 30 photos that matched known CSAM material, before even that was possible.

    But now, since they have backed down, all of your photos will continue to be uploaded to iCloud unencrypted, where Apple, law enforcement, and any hackers will be able to inspect all of your photos.

    Which one of these two scenarios offers the most privacy?

    That’s a no brainer. I choose scenario 2 because I don’t care who sees my photos on iCloud but I care deeply about who can access the content of my iPhone. 

    I consider my iPhone an extension of my private domain, and it is therefore sacrosanct. I don’t consider cloud services in the same way and assume they are open to external scrutiny. 

    It is also naive to believe that Apple’s CSAM approach won’t be exploited by the technical wizards who created spyware programs like Pegasus. They could turn this program against us and sell their service to repressive regimes. Let’s not kid ourselves — this is a back door into our phones. No amount of reassuring techno mumbo jumbo or nuanced obfuscation can disguise this fact. 

    Dude, are you even aware Photos is already scanning your stuff? Has been for years now. That’s exactly how they make those fancy shmancy stacks out of your cat pics. 
  • Reply 108 of 158
    crowley Posts: 10,453 member
    crowley said:
    A triumph of misinformation and hysteria.
    ...or, it stings to be wrong. 
    ?  Who is stinging?
  • Reply 109 of 158
    hagar Posts: 130 member
    mike54 said:
    It’s not about child protection, it’s all about getting a US Gov surveillance software framework onto Apple devices, and Apple's best bet on getting this accepted is to use CSAM as the excuse. Their multi-million dollar privacy campaign, in the attempt to distinguish them from Google Android, is washed down the toilet with this single act.
    If that were the case, wouldn’t Google also be planning this?

    And would the US gov not need to pass a law to make this mandatory? Or keep it a secret so no one knows. 

    But only Apple doing this, combined with a public announcement, makes it completely unlikely the US gov is behind this. 
  • Reply 110 of 158
    MplsP said:
    gatorguy said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    Implying Apple is not any worse than "everyone else" is not a ringing endorsement. 
    xyzzy-xxx said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    I won't use a smart speaker, but regarding Facebook & co. you are comparing apples to oranges – just don't give these apps access to your photos etc. and think about what you are uploading and you will be fine.
    My point was not to compare Apple to any of these other corporations. My point was that it's a bit hypocritical to be completely OK with all of these other 'services' snooping, scraping, monetizing, and otherwise surveilling your personal life and then to start screaming about Apple trying to do something to protect the most vulnerable people in society in a way that preserves people's privacy.

    Everyone makes the obligatory statement that they're against exploiting children, but somehow they're not willing to put their money where their mouth is. But they are willing to give up their privacy for the ability to brag about their vacation, post conspiracy theories and snoop on their neighbors. I find it a very sad commentary on people's values.
    You are trying to make a point that’s flawed. For your point to work, every single person must have an account with Facebook or be subscribed to those other services for the hypocrisy claim to hold. Not everyone has given access to their life on social media or the other services you mentioned. Only a person inside the social media bubble thinks that everyone has a Facebook account or a similar service. Some of us recognize and understand what privacy is, or should be, about. 


  • Reply 111 of 158
    mr. h Posts: 4,870 member
    You have Spotlight indexing everything every day. You have Photos using machine learning to identify faces and pets and objects every day. What's to stop Apple from exfiltrating that data at any point? … Live Text is coming to iOS 15 and macOS Monterey and is going to make a whole lot of image-based text indexable, why aren't people freaking out about that?
    Why are none of the naysayers in this thread addressing this extremely valid point? The above features, which have been in iOS for years (apart from Live Text, obviously) actually fit the profile of what is being complained about, far more than the newly-proposed CSAM detection process.
  • Reply 112 of 158
    mr. h said:
    You have Spotlight indexing everything every day. You have Photos using machine learning to identify faces and pets and objects every day. What's to stop Apple from exfiltrating that data at any point? … Live Text is coming to iOS 15 and macOS Monterey and is going to make a whole lot of image-based text indexable, why aren't people freaking out about that?
    Why are none of the naysayers in this thread addressing this extremely valid point? The above features, which have been in iOS for years (apart from Live Text, obviously) actually fit the profile of what is being complained about, far more than the newly-proposed CSAM detection process.
    Cause they’re clueless trolls, that’s why. 
  • Reply 113 of 158
    robaba Posts: 228 member
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report them to the authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING preventing Apple from doing it and NO ONE is against this. The opposition is only to doing the scan ON the device, NOT in iCloud.
    This is the very issue that Apple is trying to avoid.  They don’t want to be able to scan your data on their services, because they know that if they do, ANYTHING that passes through their servers will eventually be requested by some authoritarian government, and they will have to provide that access, UNLESS that information is already encrypted and Apple doesn’t have the key.  This is the golden ring (golden Apple?) that they have been reaching for and everyone has been lauding them for.  The problem is this: how do they keep their system from becoming a haven for every bad actor in the known ‘verse?  Apple’s solution was to embed the process into the phones in a way that does not open the end user to further degradation of their privacy: only hashes, only looking for known matches, only in the process of uploading to their servers (not just passing through).  Like it or not, that’s now binned.

    My point is, if Apple is to be able to provide users with the golden ring of security from government snooping, it’s going to need some solution for bad actors or completely scrap its services division.  At nearly half of its pre-tax earnings, there’s no way Apple can afford to abandon services.  Even then it will still be blamed for enabling bad actors.  No one will want to be connected to the next 9/11-type incident when it inevitably happens.
  • Reply 114 of 158
    robaba Posts: 228 member
    techconc said:
    gatorguy said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    Implying Apple is not any worse than "everyone else" is not a ringing endorsement. 
    With comments like this, it's clear that people still don't understand the difference.  There is no cloud storage company that wants CSAM material on their service.  Period.  You can either scan for it locally, on device... in a private way, or you can wait until the files are uploaded and scanned in a less private way.  Your choice.  

    I wouldn't be surprised if this step is a prerequisite for Apple going full end-to-end encryption with photos next.  Once they have something like this in place, they can justify to authorities how they know they're not holding CSAM material without invading user privacy in the process. 

    I think it's rather naive to think that Apple isn't going to have to address this with regard to ensuring they don't have CSAM material on their cloud services.  You can either do it your own way, preserving privacy as best you can, or have laws written that force it to be done their way, which will almost certainly be more invasive.  If you wait for the laws to come, you lose your choice on how to implement it. 
    Exactly—thank you for being the ninja we need!  You said this much more succinctly than I.
  • Reply 115 of 158
    robaba Posts: 228 member
    gatorguy said:
    gatorguy said:
    mr. h said:
    henrybay said:
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
    The irony of this post is sky-high.

    Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted (without giving Apple the keys), such that neither Apple nor anyone else (should they hack into iCloud, or be law-enforcement with a warrant) would have been able to inspect the photos in iCloud, with the exception of any individual photos that matched a CSAM hash, with the proviso that even then, there would have to be at least 30 photos that matched known CSAM material, before even that was possible.

    But now, since they have backed down, all of your photos will continue to be uploaded to iCloud unencrypted, where Apple, law enforcement, and any hackers will be able to inspect all of your photos.

    Which one of these two scenarios offers the most privacy?
    Why are you and a couple of others so convinced this was all because Apple was prepared to E2E encrypt the whole shebang?  In truth there is no way they could have done so for half their entire user base as China would have barred them from the country if they did. You honestly think Apple was willing to cut revenues by a third or more? 

    I get that you really REALLY want to paint a glowing picture of "gosh Apple is doing this for us", but is there any even circumstantial evidence Apple was ready to make everything end-to-end encrypted in a way they could not access any of your data even if they were ordered to? Not as far as I know. It's more of a hope and prayer since otherwise it's not for the betterment of us users. 
    So no objection to the Minority Report crowd who so readily projects into the future how Apple will start scanning for all manner of other things, but you object to those who project that Apple might, in the future, make a more secure iCloud.  Gotcha.  
    No objection to it at all if Apple has the courage to do the right thing and thumb their nose at China, take back their iCloud service there, and enact E2EE in order to have an actually secure Cloud service. Do you believe they do? 

    So no, you didn't "get me" at all. 
    They do this and they very quickly run out of governments to thumb their nose at.  Would you be happy if the only place you could legally use (or even own) your Apple device was Switzerland?  Good luck waiting on that kind of “bravery.”
  • Reply 116 of 158
    mr. h said:
    gatorguy said:

    I get that you really REALLY want to paint a glowing picture of "gosh Apple is doing this for us", but is there any even circumstantial evidence Apple was ready to make everything end-to-end encrypted in a way they could not access any of your data even if they were ordered to? Not as far as I know. It's more of a hope and prayer since otherwise it's not for the betterment of us users. 
    All I can say about that is that the whole scheme would be totally pointless if they weren't going to encrypt the photos. Why go to all the effort of designing this enormously complicated system, calculating hashes on-device, doing the CSAM hash-matching in a "blind" way so even the device itself doesn't know if there's been a match, and then going to all the convoluted effort of generating doubly-encrypted "vouchers" and associated "image information", if the photo itself was uploaded to iCloud unencrypted?

    Certainly, this system would enable the photos to be uploaded to iCloud encrypted, but I concede that as far as I know, Apple hasn't said that they would do that. It's just that, as I said, the whole scheme seems totally pointless if the photos are uploaded to the server in the clear anyway.

    How about Apple just offers a toggle in iCloud photos settings? The two options would be:

    1. Photos are CSAM-scanned and encrypted before being uploaded to iCloud.
    2. Photos are not CSAM-scanned, but are uploaded to iCloud in the clear. The server then does the CSAM scan.

    Would this solution make everyone happier?
    Yup, that would make at least 99% of the users happy. There are a few odd ones out, but this would be a workable solution imho. 
    They’ve already been doing that for over a year. 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
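    To make mr. h's description above a little more concrete, here is a very loose structural sketch of the "safety voucher" idea: every upload carries a payload the device itself cannot interpret, and the server can only recover payloads for photos whose hash matched, and only once enough matches exist. The cryptography below (SHA-256, AES-GCM) is a stand-in for Apple's private set intersection and threshold secret sharing, and all of the names are hypothetical:

    ```swift
    import Foundation
    import CryptoKit

    // Hypothetical structure attached to every upload. The device cannot tell
    // from the blinded hash whether the photo matched anything.
    struct SafetyVoucher {
        let blindedHash: Data       // stand-in for the blinded NeuralHash
        let encryptedPayload: Data  // image-derived data the server cannot open on its own
    }

    // Sketch of voucher creation on-device. Real PSI and threshold secret sharing
    // are far more involved; ordinary AES-GCM is used here purely for illustration.
    func makeVoucher(photo: Data, perPhotoKey: SymmetricKey) throws -> SafetyVoucher {
        let hash = Data(SHA256.hash(data: photo))                // stand-in for NeuralHash
        let sealed = try AES.GCM.seal(photo, using: perPhotoKey) // payload the server can't read alone
        return SafetyVoucher(blindedHash: hash,
                             encryptedPayload: sealed.combined ?? Data())
    }
    ```

    In the actual proposal the per-photo keys were themselves protected by a threshold scheme, which is what made the "at least 30 matches" rule enforceable cryptographically rather than by policy.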
  • Reply 117 of 158
    Illusive said:

    Yet… here you are… ¯\_(ツ)_/¯

    Due to the industry I am in, I have dealt with the technical side of this stuff, but I am not sure the actual technical details are relevant… that is the how… Don’t get me wrong, their implementation is super cool, but…

    Another more recent quote I like to use, because I see bad tech designs/decisions all the time is:

    “Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” - Dr. Ian Malcolm
    I see. A quote from an old action movie is certainly of more relevance here :D 
    You’re missing the point, which is:

    There are times when we do things just because we can. We tend to live in the moment without considering the consequences of our actions. Sometimes, the consequences aren’t worth it.

    Which, while it comes from an old sci fi movie, is still applicable, and is more succinct.
  • Reply 118 of 158
    gatorguy Posts: 24,213 member
    robaba said:
    gatorguy said:
    gatorguy said:
    mr. h said:
    henrybay said:
    Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move. 

    Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy. 
    The irony of this post is sky-high.

    Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted (without giving Apple the keys), such that neither Apple nor anyone else (should they hack into iCloud, or be law-enforcement with a warrant) would have been able to inspect the photos in iCloud, with the exception of any individual photos that matched a CSAM hash, with the proviso that even then, there would have to be at least 30 photos that matched known CSAM material, before even that was possible.

    But now, since they have backed down, all of your photos will continue to be uploaded to iCloud unencrypted, where Apple, law enforcement, and any hackers will be able to inspect all of your photos.

    Which one of these two scenarios offers the most privacy?
    Why are you and a couple of others so convinced this was all because Apple was prepared to E2E encrypt the whole shebang?  In truth there is no way they could have done so for half their entire user base as China would have barred them from the country if they did. You honestly think Apple was willing to cut revenues by a third or more? 

    I get that you really REALLY want to paint a glowing picture of "gosh Apple is doing this for us", but is there any even circumstantial evidence Apple was ready to make everything end-to-end encrypted in a way they could not access any of your data even if they were ordered to? Not as far as I know. It's more of a hope and prayer since otherwise it's not for the betterment of us users. 
    So no objection to the Minority Report crowd who so readily projects into the future how Apple will start scanning for all manner of other things, but you object to those who project that Apple might, in the future, make a more secure iCloud.  Gotcha.  
    No objection to it at all if Apple has the courage to do the right thing and thumb their nose at China, take back their iCloud service there, and enact E2EE in order to have an actually secure Cloud service. Do you believe they do? 

    So no, you didn't "get me" at all. 
    They do this and they very quickly run out of governments to thumb their nose at.  Would you be happy if the only place you could legally use (or even own) your Apple device was Switzerland?  Good luck waiting on that kind of “bravery.”
    China, and perhaps Russia, are the only countries where I can see E2EE being problematic, and I think Russia is actually fine with it at the moment. So that leaves China, which granted accounts for roughly half the entire iPhone user base, if I've read it correctly.

    What you probably don't know is that Google did just this a few years ago, end-to-end encrypting Android smartphone backups, and it's been the default setting since 2019. Even Google cannot access your backed-up Android smartphone personal data; they possess no key to do so. (Photos and Google Drive cannot be E2EE due to there being a web interface for them, so there is that exception.) If presented with a legal demand to turn over personal data contained in your cloud backups, all Google can do is offer gobbledegook. They cannot decrypt it no matter what the authority says or demands. No key. But Apple can comply. They have a key. 

    Yet Google hasn't been banned from any western country for cutting themselves out of the process. If the authorities want your messages or contacts or other on-device personal information contained in your cloud backup data, it's going to be up to you, the user, to give it to them. Google has made sure they themselves cannot. Brave? I don't know that I'd call it that.

    Perhaps that's why the US has Google in their antitrust crosshairs on a few fronts now, but is leaving Apple be for the most part. I'm sure there is some danger to their business from doing so. Yet they did it anyway. Google's advantage is of course that they don't have to deal with China, and that would be a huge market for Apple to walk away from. A little tantrum from some government agency is one thing. Giving up several billion dollars in business is altogether different.
    https://daringfireball.net/linked/2020/01/21/android-encrypted-backups
  • Reply 119 of 158
    bb-15 Posts: 283 member
    mr. h said:
    You have Spotlight indexing everything every day. You have Photos using machine learning to identify faces and pets and objects every day. What's to stop Apple from exfiltrating that data at any point? … Live Text is coming to iOS 15 and macOS Monterey and is going to make a whole lot of image-based text indexable, why aren't people freaking out about that?
    Why are none of the naysayers in this thread addressing this extremely valid point? The above features, which have been in iOS for years (apart from Live Text, obviously) actually fit the profile of what is being complained about, far more than the newly-proposed CSAM detection process.
    It’s not a valid point. In the past Tim Cook has stated that Apple has the ability to mine all your data in iOS (Safari, iMessage, Maps, Photos) & either make money from it as Google does (I’m not counting News or the App Store) or turn that information over to law enforcement.
    - In the past Tim Cook has said Apple would not do this.
    That Apple would not mine all your data for money or mine all your data so it could be turned over to law enforcement.
    - Apple specifically took steps to protect user privacy such as encryption which Apple would not break.
    A famous example was the San Bernardino terrorist attack where Apple said they could / would not provide law enforcement with encrypted information. 

    * What is new here with CSAM?
    From Apple;
    ”new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.”

    A user lends their phone to a friend, that friend uploads the wrong kind of image, that is reported to NCMEC, then law enforcement, & then the phone owner has the police barging into their house.

    * Apple has decided to become an extension of law enforcement. Imo it is not Apple’s job to police the world.
    It is the job of parents to safeguard their children from the disturbing & harmful material on the internet.     
    Apple should not become Big Brother & demolish their past history of privacy. 
  • Reply 120 of 158
    crowley Posts: 10,453 member
    bb-15 said:

    A user lends their phone to a friend, that friend uploads the wrong kind of image, that is reported to NCMEC, then law enforcement, & then the phone owner has the police barging into their house. 
    You would need to do that something like 30 times for the police to come barging on your door.  Do you even lend your phone to a friend for anything other than a quick call?  I don't.  These gotchas that people are offering are always such a preposterous stretch.  