Apple backs down on CSAM features, postpones launch


Comments

  • Reply 81 of 157
    jdw Posts: 1,472 member
    Precisely what I asked for and hoped for.  A postponement was the right decision for Apple to make. 

    If it's really the best course of action, Apple now has time to make its case before its users and the tech media.  If ultimately it isn't a good idea, this extra time will allow Apple and everyone else to see that, leading to a cancellation of the idea altogether.
  • Reply 82 of 157
    radarthekat Posts: 3,938 moderator
    gatorguy said:
    Why are you and a couple of others so convinced this was all because Apple was prepared to E2E encrypt the whole shebang?  In truth there is no way they could have done so for half their entire user base as China would have barred them from the country if they did. You honestly think Apple was willing to cut revenues by a third or more? 

    I get that you really REALLY want to paint a glowing picture of "gosh Apple is doing this for us", but is there even any circumstantial evidence Apple was ready to make everything end-to-end encrypted in a way they could not access any of your data even if they were ordered to? Not as far as I know. It's more of a hope and a prayer, since otherwise it's not for the betterment of us users.
    So no objection to the Minority Report crowd who so readily projects into the future how Apple will start scanning for all manner of other things, but you object to those who project that Apple might, in the future, make a more secure iCloud.  Gotcha.  
  • Reply 83 of 157
    elijahg Posts: 2,888 member

    That's true.  Their history of humility (especially since Steve) has not been great.   But I give them credit for it this time.

    I think this clip of Steve says a lot: while he's proud of what he created, he realizes that he doesn't have the perfect answer for everybody.



    Off-topic, but it also shows how Jobs always wanted what's best for the customer. Cook wants what's best for the shareholder, which is a lot of what's wrong with modern Apple.
  • Reply 84 of 157
    mr. h Posts: 4,870 member
    omair said:
    I agree. I have lost faith, and I have been busy divesting from the ecosystem for the last month. I've moved out of Photos and iMessage so far, and will be giving up on iCloud as I replace the services backing into it. I am also looking into getting a personal email account and just giving up on that from Apple. Perhaps it wasn't a good idea to put so much faith in one company for two decades.
    Please could you let us know what alternatives you are using, and how you went about verifying that they don't have any similar (or worse) features? For the avoidance of doubt, this is a genuine question.
  • Reply 85 of 157
    They were utter morons to do this in the first place. 

    Glad that good sense prevailed, seemingly after all other options were exhausted. 
  • Reply 86 of 157
    CSAM scanning would only hurt Apple, with an iPhone launch only a couple of weeks away. There might be more risk than reward in CSAM. Apple should stick to what it does best as opposed to being the CIA.
  • Reply 87 of 157
    jungmark Posts: 6,927 member
    elijahg said:
    That's because people make a choice to use FB/IG/Twitter etc.: they make a choice to lose privacy over the photo they post, and they are posting that photo to someone else's device. Plus, they agreed to scanning for "objectionable material" when they signed up. No service, not even Google or FB, scans the photos on your own device. Apple was going to install spyware on people's own devices without permission and with no choice. No one agreed to Apple scanning devices for CSAM when they bought their phones. 

    Ceasing to use FB/IG/Twitter doesn't cost a penny. Ceasing to use an iPhone could cost a lot of cash, especially if you are deeply invested in the ecosystem. 
    I chose not to use iCloud Photos. That's where the hash-scan feature resides. 
  • Reply 88 of 157
    gatorguy Posts: 24,730 member
    radarthekat said:
    So no objection to the Minority Report crowd who so readily projects into the future how Apple will start scanning for all manner of other things, but you object to those who project that Apple might, in the future, make a more secure iCloud. Gotcha.
    No objection to it at all if Apple has the courage to do the right thing and thumb their nose at China, take back their iCloud service there, and enact E2EE in order to have an actually secure Cloud service. Do you believe they do? 

    So no, you didn't "get me" at all. 
    edited September 2021
  • Reply 89 of 157
    crowley Posts: 10,453 member
    A triumph of misinformation and hysteria.
  • Reply 90 of 157
    Illusive said:
    Yet… here you are… ¯\_(ツ)_/¯

    Due to the industry I am in, I have dealt with the technical side of this stuff, but I am not sure the actual technical details are relevant… that is the how… Don't get me wrong, their implementation is super cool, but…

    Another, more recent quote I like to use, because I see bad tech designs/decisions all the time, is:

    “Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” - Dr. Ian Malcolm
  • Reply 91 of 157
    mike54 said:
    This is not about CSAM or protecting the children at all. This is all about deploying the software framework, the mechanism on every Apple device for future use cases. Privacy is out of the window with this software. Don't only look at today, look at the morrow.

    Apple should not allow this software on Apple devices.
    Idiotic. The "framework" is already running every day on your device. You have Spotlight indexing everything every day. You have Photos using machine learning to identify faces and pets and objects every day. What's to stop Apple from exfiltrating that data at any point? This feature adds absolutely nothing more nefarious than those things, which nobody seems to have a problem with and which you can't even opt out of if you wanted to. Live Text is coming to iOS 15 and macOS Monterey and is going to make a whole lot of image-based text indexable; why aren't people freaking out about that?
  • Reply 92 of 157
    omair said:
    I agree. I have lost faith, and I have been busy divesting from the ecosystem for the last month. I've moved out of Photos and iMessage so far, and will be giving up on iCloud as I replace the services backing into it. I am also looking into getting a personal email account and just giving up on that from Apple. Perhaps it wasn't a good idea to put so much faith in one company for two decades.
    Why iMessage? Nothing even happens there unless you've got a Family setup and you turn on that feature for your under-13 kid. Are you 12?
  • Reply 93 of 157
    On-device spying is wrong. 

    Spying on my files in my server space, whether free or paid, is wrong. 

    It’s not a website with content for others to look at. It’s my personal stuff. 

    If Apple wanted to do this, iCloud would need to always be off by default, and there would need to be a warning and a confirmation when you go to turn it on. 

    No one wants people snooping. 

    Not to mention the potential for political retaliation against different groups' views; the potential for abusing an infrastructure like this is definitely there. 
  • Reply 94 of 157
    omair said:
    I agree. I have lost faith, and I have been busy divesting from the ecosystem for the last month. I've moved out of Photos and iMessage so far, and will be giving up on iCloud as I replace the services backing into it. I am also looking into getting a personal email account and just giving up on that from Apple. Perhaps it wasn't a good idea to put so much faith in one company for two decades.
    ...Also, @omair:
    'Moving out' of the Internet and onto a Nokia 3310 (1st gen) + IBM PC/XT, I guess. Don't forget the tin foil!

    On a more serious note, this does say a lot about the technological literacy of Apple's fan base. 
    edited September 2021
  • Reply 95 of 157
    On-device spying is wrong. 

    Spying on my files in my server space, whether free or paid, is wrong. 

    It’s not a website with content for others to look at. It’s my personal stuff. 

    If Apple wanted to do this, iCloud would need to always be off by default, and there would need to be a warning and a confirmation when you go to turn it on. 

    No one wants people snooping. 

    Not to mention the potential for political retaliation against different groups' views; the potential for abusing an infrastructure like this is definitely there. 
    How come comments are still on for these posts? I genuinely do not understand.
    edited September 2021
  • Reply 96 of 157

    Yet… here you are… ¯\_(ツ)_/¯

    Due to the industry I am in, I have dealt with the technical side of this stuff, but I am not sure the actual technical details are relevant… that is the how… Don't get me wrong, their implementation is super cool, but…

    Another, more recent quote I like to use, because I see bad tech designs/decisions all the time, is:

    “Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” - Dr. Ian Malcolm
    I see. A quote from an old action movie is certainly of more relevance here :D 
    edited September 2021
  • Reply 97 of 157
    crowley said:
    A triumph of misinformation and hysteria.
    Says a lot about society nowadays. Clueless folks with endless reach, right at their fingertips. 
  • Reply 98 of 157
    mr. h said:
    All I can say about that is that the whole scheme would be totally pointless if they weren't going to encrypt the photos. Why go to all the effort of designing this enormously complicated system, calculating hashes on-device, doing the CSAM hash-matching in a "blind" way so even the device itself doesn't know if there's been a match, and then going to all the convoluted effort of generating doubly-encrypted "vouchers" and associated "image information", if the photo itself was uploaded to iCloud unencrypted?

    Certainly, this system would enable the photos to be uploaded to iCloud encrypted, but I concede that as far as I know, Apple hasn't said that they would do that. It's just that, as I said, the whole scheme seems totally pointless if the photos are uploaded to the server in the clear anyway.

    How about Apple just offers a toggle in iCloud photos settings? The two options would be:

    1. Photos are CSAM-scanned and encrypted before being uploaded to iCloud.
    2. Photos are not CSAM-scanned, but are uploaded to iCloud in the clear. The server then does the CSAM scan.

    Would this solution make everyone happier?
    Yup, that would make at least 99% of users happy. There are a few odd ones out, but this would be a workable solution imho. 
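    To make the quoted toggle concrete, here is a rough, purely hypothetical sketch in Python. The function names, the "voucher" stand-in, and the use of Fernet as the cipher are illustrative assumptions, not Apple's actual design or iCloud's API:

        import hashlib
        from cryptography.fernet import Fernet

        def perceptual_hash(photo: bytes) -> bytes:
            # Placeholder only: a real system would use a hash that is robust to
            # resizing and re-encoding, which SHA-256 is not.
            return hashlib.sha256(photo).digest()

        def upload(blob: bytes, voucher: dict) -> None:
            # Stand-in for the actual network upload to the photo service.
            print(f"uploading {len(blob)} bytes, voucher={voucher}")

        def sync_photo(photo: bytes, scan_on_device: bool, key: bytes) -> None:
            if scan_on_device:
                # Option 1: hash locally, then upload only the ciphertext plus a
                # "voucher" stand-in. (The proposal quoted above does the matching
                # blindly and only opens vouchers past a match threshold; none of
                # that machinery is modelled here.)
                voucher = {"image_hash": perceptual_hash(photo).hex()}
                upload(Fernet(key).encrypt(photo), voucher)
            else:
                # Option 2: upload in the clear; the server does the CSAM scan itself.
                upload(photo, {"image_hash": None})

        sync_photo(b"...jpeg bytes...", scan_on_device=True, key=Fernet.generate_key())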
  • Reply 99 of 157
    So you call creating a checksum "scanning" a file? Is that what you're saying? Apple is simply creating a hash from each photo to be uploaded. It can then compare that hash to hashes created from the photos in a CSAM database. This is pretty innocuous. 
    The devil is always in the DETAILS. A simple hash would NEVER match the CSAM database hashes for even trivially "modified" images. BUT Apple is claiming that even "modified" versions of CSAM images would be flagged by the system. How??? Without proper analysis of the images through AI/ML algorithms on the device, this is not going to be feasible. Innocuous - NOT.
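    To make the hashing point concrete, here is a toy comparison in Python, using a trivial average-hash as a stand-in for a perceptual hash (Apple's NeuralHash is a learned perceptual hash; this is only an illustration of the general idea): a small uniform brightness change flips the cryptographic hash completely but leaves the perceptual hash essentially unchanged.

        import hashlib

        def average_hash(pixels: list) -> int:
            # One bit per pixel: set when the pixel is brighter than the image mean.
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits

        def hamming(a: int, b: int) -> int:
            # Number of bits that differ between the two hashes.
            return bin(a ^ b).count("1")

        # An 8x8 grayscale "image" and a slightly brightened copy of it.
        original = [(x * 7 + y * 13) % 256 for x in range(8) for y in range(8)]
        modified = [min(p + 3, 255) for p in original]

        # The cryptographic hashes no longer match at all...
        print(hashlib.sha256(bytes(original)).hexdigest() ==
              hashlib.sha256(bytes(modified)).hexdigest())              # False
        # ...while the perceptual hashes are still (nearly) identical.
        print(hamming(average_hash(original), average_hash(modified)))  # 0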
    edited September 2021
  • Reply 100 of 157
    crowley said:
    A triumph of misinformation and hysteria.
    ...or, it stings to be wrong. 