Apple backs down on CSAM features, postpones launch


Comments

  • Reply 21 of 158
    Hopefully this spyware will never make it into iOS – I would be fine with Apple scanning photos in the cloud, but not on my devices!
    Now I will update to iOS 15 (at least the first version, until it's clear how this evolves)...
  • Reply 22 of 158
    dws-2 Posts: 276 member
    I hope it's the same "additional time" they needed for the magic charging mat.
  • Reply 23 of 158
    jungmark Posts: 6,926 member
    Misinformation rules the day!
  • Reply 24 of 158
    bluefire1 Posts: 1,302 member
    Sanity has (at least temporarily) prevailed.
    edited September 2021
  • Reply 25 of 158
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    That isn't relevant; it's an oranges-to-lemons comparison (not going to say apples: pun avoided). Facebook et al. are services to which you deliver content, where it is stored and processed on their systems. They have every right to scan for anything they want on their systems, in the same way I have the right to censor speech in my home. If I don't like what someone says, I can remove them. Apple's proposal was to use a system I own in a way that created an adversarial relationship with me. It was actively checking on my activity to make sure it complied with a third party's definition of okay. I have to make the obligatory declaration that exploitation of all people, especially children, is reprehensible and morally bankrupt. Apple is perfectly justified in scanning iCloud for CSAM if they want, and I support it 100 percent. Apple should not be free to use a device I own for the purpose of letting law enforcement bypass warrants.
  • Reply 26 of 158
    MplsP Posts: 3,925 member
    gatorguy said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    Implying Apple is not any worse than "everyone else" is not a ringing endorsement. 
    xyzzy-xxx said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    I won't use a smart speaker, but regarding Facebook & co. you are comparing apples to oranges – just don't give these apps access to your photos etc. and think about what you are uploading and you will be fine.
    My point was not to compare Apple to any of these other corporations. My point was that it's a bit hypocritical to be completely OK with all of these other 'services' snooping, scraping, monetizing and otherwise surveilling your personal life, and then to start screaming about Apple trying to do something to protect the most vulnerable people in society in a way that preserves people's privacy.

    Everyone makes the obligatory statement that they're against exploiting children, but somehow they're not willing to put their money where their mouth is. But they are willing to give up their privacy for the ability to brag about their vacation, post conspiracy theories and snoop on their neighbors. I find it a very sad commentary on people's values.
  • Reply 27 of 158
    Doesn't sound like they're backing down at all. They specifically say that the features will be released in the future. 
  • Reply 28 of 158
    elijahg Posts: 2,759 member
    Doesn't sound like they're backing down at all. They specifically say that the features will be released in the future. 
    Hopefully the reasoning behind that wording is that they won't get such a backlash from people who are pro CSAM-scanning.
  • Reply 29 of 158
    aguyinatx said: That isn't relevant; it's an oranges-to-lemons comparison (not going to say apples: pun avoided). Facebook et al. are services to which you deliver content, where it is stored and processed on their systems. They have every right to scan for anything they want on their systems, in the same way I have the right to censor speech in my home.
    And yet the files that will be scanned in either scenario are identical. There's no difference at all in terms of what gets scanned. The user has total control over choosing which apps will have files backed up in iCloud and only files from those apps will be scanned. 
  • Reply 30 of 158
    Just a question on this tech. So, the software scans for known images of child porn from a database maintained by a third party, which would imply that these images were downloaded off the internet or texted to one’s phone. Well, what about the person who’s actually using his/her iPhone to shoot pics, say in their home? Those wouldn’t be in the database. Exactly how does that work? Or are those images “safe”? It seems to me that this tech only solves half the problem, that is, images of known child porn, not new ones. Am I correct in understanding this?
  • Reply 31 of 158
    Apple could make the CSAM features optional, but at the same time provide some tangible benefit for turning them on, so that most people will. There are many possible benefits to choose from, 1GB of extra iCloud storage for example. That way the haters can just leave them off. That's a win for everyone.
  • Reply 32 of 158
    elijahg Posts: 2,759 member
    MplsP said:
    gatorguy said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    Implying Apple is not any worse than "everyone else" is not a ringing endorsement. 
    xyzzy-xxx said:
    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    I won't use a smart speaker, but regarding Facebook & co. you are comparing apples to oranges – just don't give these apps access to your photos etc. and think about what you are uploading and you will be fine.
    My point was not to compare Apple to any of these other corporations. My point was that it's a bit hypocritical to be completely OK with all of these other 'services' snooping, scraping, monetizing and otherwise surveilling your personal life, and then to start screaming about Apple trying to do something to protect the most vulnerable people in society in a way that preserves people's privacy.

    Everyone makes the obligatory statement that they're against exploiting children, but somehow they're not willing to put their money where their mouth is. But they are willing to give up their privacy for the ability to brag about their vacation, post conspiracy theories and snoop on their neighbors. I find it a very sad commentary on people's values.
    That's because people make a choice to use FB/IG/Twitter etc.; they make a choice to lose privacy over the photos they post, and they are posting those photos to someone else's device. Plus, they agreed to scanning for "objectionable material" when they signed up. No service, not even Google or FB, scans the photos on your own device. Apple was going to install spyware on people's own devices without permission, and with no choice. No one agreed to Apple scanning devices for CSAM when they bought their phones. 

    Ceasing to use FB/IG/Twitter doesn't cost a penny. Ceasing to use an iPhone could cost a lot of cash, especially if you are deeply invested in the ecosystem. 
  • Reply 33 of 158
    lkrupp Posts: 10,557 member
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Yep, old Jared was a pedo. Hope he rots in jail.
     
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right? And are any of you privacy freaks going to say with a straight face that you are certain your data is not being scanned by Apple, Google, Facebook, the NSA, CIA, Homeland Security, et al., right NOW? When someone can swap your Lightning cable with one that can suck your data from a mile away? Years from now, declassified documents will reveal you were scanned all along without your permission or knowledge.
    edited September 2021
  • Reply 34 of 158
    elijahg said: No one agreed to Apple scanning devices for CSAM when they bought their phones. 
    Anyone using iCloud has to agree to the terms of service. Apple reserves the right to scan files being backed up in the cloud. And only files that are coming from apps that the user has chosen to be backed up in iCloud would be scanned in either scenario. 
  • Reply 35 of 158
    JBSlough said:
    Just a question on this tech. So, the software scans for known images of child porn from a database maintained by a third party, which would imply that these images were downloaded off the internet or texted to one’s phone. Well, what about the person who’s actually using his/her iPhone to shoot pics, say in their home? Those wouldn’t be in the database. Exactly how does that work? Or are those images “safe”? It seems to me that this tech only solves half the problem, that is, images of known child porn, not new ones. Am I correct in understanding this?
    I don't think your understanding is correct. It is NOT an exact copy of known CSAM that is being searched for/compared against on your phone. There is a level of "pattern matching" involved when the CSAM check is done on the phone, before a photo is uploaded to iCloud. So new photos taken will also get flagged IF the pattern matches known CSAM database entries.
    edited September 2021
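    A minimal sketch, for anyone wondering how this kind of "pattern matching" can work at all: perceptual hashing is designed so that visually similar images produce similar hashes, and a match is declared when the distance between two hashes is small. The toy Swift code below uses a simple average hash and a made-up threshold purely for illustration; it is not Apple's NeuralHash (a learned perceptual hash matched against a blinded database), and every name and number in it is a hypothetical assumption.

        import Foundation

        /// Toy "average hash": 64 bits from an 8x8 grayscale thumbnail (values 0-255).
        /// Each bit is 1 if that pixel is brighter than the thumbnail's mean.
        func averageHash(thumbnail: [UInt8]) -> UInt64 {
            precondition(thumbnail.count == 64, "expects an 8x8 grayscale thumbnail")
            let mean = thumbnail.reduce(0) { $0 + Int($1) } / 64
            var hash: UInt64 = 0
            for (i, pixel) in thumbnail.enumerated() where Int(pixel) > mean {
                hash |= 1 << UInt64(i)
            }
            return hash
        }

        /// Hamming distance: how many bits differ. A small distance means "visually similar".
        func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
            (a ^ b).nonzeroBitCount
        }

        /// True if the photo's hash is close enough to any entry in the known-hash database.
        /// The threshold here is invented for illustration; real systems tune it carefully.
        func matchesKnownDatabase(photoHash: UInt64,
                                  knownHashes: [UInt64],
                                  threshold: Int = 4) -> Bool {
            knownHashes.contains { hammingDistance(photoHash, $0) <= threshold }
        }

    Note what a check like this can and cannot do: it flags near-duplicates of images already in the database, even after resizing or re-encoding, but an unrelated new photo would not normally match – which is essentially the limitation JBSlough is asking about.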
  • Reply 36 of 158
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report it to authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING that is preventing Apple from doing it and NO ONE is against this. The opposition is only for doing the scan ON the device, NOT in iCloud.
    edited September 2021
  • Reply 37 of 158
    muthuk_vanalingam said: Nope. Apple can very well scan the photos in iCloud and report it to authorities.
    It doesn't make a difference when or where Apple scans files. You can't use iCloud without agreeing to Apple's terms of service, and part of that includes Apple reserving the right to scan files the user is backing up in the cloud. Once you select an app to have files backed up in iCloud, you can't object to Apple scanning those files. Your selecting those apps for cloud backup has given Apple the right to scan them.
    edited September 2021
  • Reply 38 of 158
    chadbag Posts: 2,000 member
    "I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening." — Craig Federighi

    It is what is happening.  How else do they create the “magical” hashes? It is happening on the phone.  So, Craig, why do you say that is not what is happening when that is exactly what is happening?

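    To make the disputed point concrete, here is a much-simplified, hypothetical Swift sketch of the ordering being argued about: a fingerprint is computed on the device, but only for photos queued for iCloud Photos upload. It uses an ordinary SHA-256 digest purely as a stand-in; Apple's published design instead uses a perceptual NeuralHash plus blinded matching and threshold secret sharing, so the device itself never learns whether anything matched. Nothing below is an actual Apple API.

        import Foundation
        import CryptoKit

        /// Stand-in fingerprint: an ordinary SHA-256 digest of the photo bytes.
        /// Apple's system uses a perceptual NeuralHash, not a cryptographic hash;
        /// this exists only to show where the computation happens (on the device).
        func onDeviceFingerprint(for photoData: Data) -> String {
            SHA256.hash(data: photoData)
                .map { String(format: "%02x", $0) }
                .joined()
        }

        /// Hypothetical upload step: the fingerprint is attached only when a photo
        /// is queued for iCloud Photos; photos kept local-only never reach this path.
        func queueForICloudUpload(photoData: Data) -> (payload: Data, fingerprint: String) {
            (photoData, onDeviceFingerprint(for: photoData))
        }

    The disagreement in the thread is largely about whether that on-device step counts as "scanning my phone."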
  • Reply 39 of 158
    chadbag said:
    "I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening." — Craig Federighi

    It is what is happening.  How else do they create the “magical” hashes? It is happening on the phone.  So, Craig, why do you say that is not what is happening when that is exactly what is happening?

    Agreeing to the terms of service for iCloud is the only thing that actually matters. Once you do that, files coming from apps that you've chosen to be backed up in iCloud are going to be scanned. There's no difference between that happening on the phone or on the iCloud servers. You agreed to the terms, and the files being scanned are exactly the same in both scenarios.
  • Reply 40 of 158
    muthuk_vanalingam said: Nope. Apple can very well scan the photos in iCloud and report it to authorities.
    It doesn't make a difference when or where Apple scans files. You can't use iCloud without agreeing to Apple's terms of service, and part of that includes Apple reserving the right to scan files the user is backing up in the cloud. Once you select an app to have files backed up in iCloud, you can't object to Apple scanning those files. Your selecting those apps for cloud backup has given Apple the right to scan them.
    It does make a difference, and many people understand that difference, which is why there is outrage. And there is NO technical limitation preventing Apple from performing this scan in iCloud. So why does Apple insist on doing it on-device? That raises significant suspicion about Apple's motive.