Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

245 Comments

  • Reply 21 of 90
    emoeller Posts: 574, member
    As I stated before, this is dystopian, and I'll be canceling iCloud.

    But as an Apple user since 1978 and a shareholder, this is the biggest mistake Apple has made, IMHO.

    Bigger than when Steve Jobs famously responded to Antennagate: 

    "All phones have sensitive areas. Just avoid holding it in this way."

    Why? Because this is yet another loss of trust at a time when "trust is everything". It is a total repudiation of Apple's famous Las Vegas billboard: "What happens on your iPhone, stays on your iPhone"!

    I subscribe to the Wall Street Journal (paywalled) and watched Craig's interview; it was a hot mess. First he claims there is NO SCANNING of photos because Apple uses an algorithm both on the phone and in iCloud. Do they think we are all dumb? Everything done on a computer or iPhone is an algorithm. He could have used a metaphor such as a lock and key, but no. Even he did some eye rolls during the interview.

    So what should Apple do now? Reset and come clean:

    1)  Hold off on rolling this out.
    2)  Explain why THEY are doing this (is it an internal policy change, a demand by the government, or something else?)
    3)  Justify the urgency (why now, and why this must be addressed from a legal and/or public-safety standpoint)
    4)  Set up an independent task group (Apple, other companies, and voices from across ALL political and social groups) to incorporate this into the larger and far more important issue of privacy. Make sure this forum is public and open.
    5)  Come clean about who and what NCMEC is, how it was created, and who manages it. After just a little digging I'm totally confused: it is not clearly a private, public, or truly non-profit body, and there appears to be little oversight of how its data is collected, stored, and secured. Apple has been silent on this, and as a holder of private information (even if it is hashed) it needs to communicate the complete chain of custody and authority for anything shared.

    No good deed goes unpunished.   I have no doubt that Apple was trying to do the right thing, but what they are doing now is wrong.
  • Reply 22 of 90
    Rayz2016 Posts: 6,957, member
    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram ..."
    Er, no one thought that.
  • Reply 23 of 90
    Rayz2016 Posts: 6,957, member
    If this was such a great idea, why are thousands of Apple employees signing an open letter against it? 
    Because they're 'misunderstanding', apparently. Or it could be because they understand it perfectly and just think it's a really bad idea.
  • Reply 24 of 90
    macplusplus Posts: 2,112, member
    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like how 1,000 gas-powered cars catching fire don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images. I can easily imagine a narcissistic leader ordering Apple to add to that list images that are critical of the government. Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie. Such a President would have a strong incentive to add these photos to the list. Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it was for a good cause).
    You can imagine whatever you like, but the system uses the same NCMEC child porn database utilized by Microsoft, Google, Dropbox, Twitter, etc. Your little fantasy about the POTUS adding images to it would affect every one of those major cloud storage providers, not just Apple. 

    There is nothing new here. You’re not forced to use any of these commercial cloud services to host your images online. 

    And reporting child porn libraries to the authorities is not some choice Apple makes — they’re required to by law. As are the other cloud storage services. You can’t store child porn, it’s against the law. 
    You can't burst into someone's house haphazardly on that pretext; you need a search warrant. 

    Neither you nor Apple can burst into my phone without a search warrant either; it is as private as my house.
  • Reply 25 of 90
    bloggerblog Posts: 2,464, member
    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."

    No, Craig, we did not misunderstand; this is exactly what we understood. User data in the cloud should not be scanned to identify individuals for anything other than improving the user experience. Apple's ecosystem should always be a safe environment for the user; otherwise, what's the difference between putting my photos there and in any other cloud service? Also, political climates change: today you scan for this, tomorrow you may scan for something else. This whole thing stinks of government pressure or another gag order.

  • Reply 26 of 90
    It is very offensive when these idiots (excuse me, Senior Vice President, was it?) only talk about the way their repulsive concept has been 'communicated'. Are they truly so blind as to believe they are just sooo special that it's OK when they look at private data, that their surveillance is nice and happy, and not like that other bad kind of surveillance? It's friendly, supportive surveillance, helpful in every way. Everyone just needs to understand how very, really good and clever Apple's brand of surveillance is. 
  • Reply 27 of 90
    roake Posts: 811, member
    I agree that this move by Apple is shocking! No matter what their initial intentions, surveillance in my pocket, on my private device, is an abhorrent idea. They tried to make it sound MORE private because it all took place on the device, when the opposite is true. As many have pointed out, this technology could be used to scan for ANYTHING. It definitely smacks of Orwellian potential, with Apple installing the first tool of the New Order Thought Police.
  • Reply 28 of 90
    Beats Posts: 3,073, member
    Rayz2016 said:
    If this was such a great idea, why are thousands of Apple employees signing an open letter against it? 
    Because they're 'misunderstanding', apparently. Or it could be because they understand it perfectly and just think it's a really bad idea.

    No no no. Apple employees don’t understand what Apple is doing!!

    /s
  • Reply 29 of 90
    rcfa Posts: 1,124, member
    The matter is very simple: at any time, the very same system can be extended to anything else simply by changing or updating the databases, which can happen either as a matter of Apple’s choice or under government pressure.

    A system that can identify child porn can just as well be used to identify the Dalai Lama, Winnie the Pooh, weaponry, gay porn, bongs, Muhammad caricatures, political and religious symbols, etc.

    It’s one thing to scan things that are shared (FB, Twitter), and another to scan things that are private (cloud storage of files and photos).

    It’s one thing to scan on a server, where the government can gain access undetected anyway, and another to scan on the phone “as part of the uploading process”, which is still scanning on the phone, with only minor code changes required to scan preemptively “to improve performance”, etc.

    Hands off our phones, and end-to-end encryption of all iCloud contents, including backups.

    It is NOT the duty of technology to make government snooping easy (regardless of the pretext under which it happens); it’s the duty of technology to protect an owner’s data, regardless of what that data may be.
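
    To make rcfa's point concrete: the matching machinery is content-neutral, and only the database it is handed defines what gets flagged. A minimal sketch (hypothetical names and types, not Apple's actual code):

    ```swift
    import Foundation

    // Hypothetical sketch, not Apple's implementation. The matcher has no idea
    // what the hashes represent: swap the database and the identical code flags
    // CSAM, Winnie the Pooh, political symbols, or anything else an authority
    // has fingerprinted.
    struct FingerprintDatabase {
        let source: String          // e.g. "NCMEC" today; anything tomorrow
        let knownHashes: Set<Data>  // opaque perceptual-hash fingerprints
    }

    func isFlagged(_ imageHash: Data, by db: FingerprintDatabase) -> Bool {
        // Pure set membership: the policy lives entirely in `db`, not the code.
        return db.knownHashes.contains(imageHash)
    }
    ```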
  • Reply 30 of 90
    sflocal Posts: 6,093, member
    If this was such a great idea, why are thousands of Apple employees signing an open letter against it? Craig keeps saying "we". Who exactly are "we"? Apple has had a problem with its executives and their ideas for a long time now. It's why the walled garden has become more like a walled prison for its customers. When Apple has to lobby against common-sense ideas like the right to repair and the right to install whatever apps you want on the devices you own, you know there is a problem. With warrantless spying on personal data, Apple made it clear that they need new management. It is a good thing that there is a pandemic, or Apple execs might find the crowd booing at them during their next live product launch.
    Because those "thousands of Apple employees" may just be as ignorant as the rest of the people who are jumping in without understanding the full story.

    Apple made a spectacular blunder in how they publicized this. No doubt about that. What people refuse to see and understand is that CSAM scanning is the law. All cloud service providers, which includes Apple, are required to run CSAM scanning. My physical iPhone is still as secure as always. Nothing changes on that. Zero back door.

    My iCloud data is something else.  I know it's controlled by Apple and given the choice, I trust Apple to be more secure than any of the other players out there, including Dropbox, Google, and Microsoft.  If they are required by law to scan my cloud data (not my iPhone) for CSAM, there's nothing I can do to stop that.  I could choose to not use iCloud.  Perfectly acceptable and it's my choice.  I don't lose sleep over it.

    However, if my government decides to abuse the CSAM system and insert hashes for non-CSAM objects, then I guarantee there will be countless security research shops that will raise bloody hell, like when Snowden exposed the NSA's abuses.

    I still trust Apple to remain at the top of user-level privacy. People are having hissy fits over a problem that I don't see as really existing yet.

    If you don't want data that is not in your physical possession scanned, then STOP using cloud services.  WTF, people like you just want to make a mountain out of a molehill.
  • Reply 31 of 90
    Rayz2016 Posts: 6,957, member
    sflocal said:
    If this was such a great idea, why are thousands of Apple employees signing an open letter against it? Craig keeps saying "we". Who exactly are "we"? Apple has had a problem with its executives and their ideas for a long time now. It's why the walled garden has become more like a walled prison for its customers. When Apple has to lobby against common-sense ideas like the right to repair and the right to install whatever apps you want on the devices you own, you know there is a problem. With warrantless spying on personal data, Apple made it clear that they need new management. It is a good thing that there is a pandemic, or Apple execs might find the crowd booing at them during their next live product launch.
    Because those "thousands of Apple employees" may just be as ignorant as the rest of the people who are jumping in without understanding the full story.

    Apple made a spectacular blunder in how they publicized this. No doubt about that. What people refuse to see and understand is that CSAM scanning is the law. All cloud service providers, which includes Apple, are required to run CSAM scanning. My physical iPhone is still as secure as always. Nothing changes on that. Zero back door.


    Yup, CSAM scanning is the law, but doing it through spyware loaded on your phone is not. That’s why Google does it on the server, where it’s easier to maintain and can’t have any detrimental effect on the performance of the phone. 

  • Reply 32 of 90

    I don’t think this is an issue of Apple’s “messaging” or of users’ understanding. I still have three serious concerns that don’t seem to have been addressed. 


    #1 Apple has acknowledged the privacy impact of this technology if misapplied by totalitarian governments. The response has been, “we’ll say no”. In the past the answer hasn’t been no with China and Saudi Arabia, and that occurred when Apple was already powerful and wealthy. If a government compelled Apple, or if Apple one day is not in a dominant position, it may not be able to say no even if it wants to. 


    #2 We’ve recently observed zero-day exploits being used by multiple private companies to bypass the existing protections that exist in Apple’s platforms. Interfaces like this increase the attack surface that malicious actors can exploit. 


    #3 Up until this point the expectation from users has been that the data on your device was private and that on-device processing was used to prevent data from being uploaded to cloud services. The new system turns that expectation around and now on-device processing is being used as a means to upload to the cloud. This system, though narrowly tailored to illegal content at this time, changes the operating system’s role from the user’s perspective and places the device itself in a policing and policy enforcement role. This breaks the level of trust that computer users have had since the beginning of computing, that the device is “yours” in the same way that your car or home is “your” property. 


    Ultimately I think solving a human-nature problem with technology isn’t the true solution. I think Apple is burning hard-won reputation with this move. In my opinion, law enforcement and the judicial process should be used to rectify crime, rather than technology providers like Apple. 

  • Reply 33 of 90
    canukstorm Posts: 2,700, member
    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like how 1,000 gas-powered cars catching fire don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images. I can easily imagine a narcissistic leader ordering Apple to add to that list images that are critical of the government. Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie. Such a President would have a strong incentive to add these photos to the list. Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it was for a good cause).
    You can imagine whatever you like, but the system uses the same NCMEC child porn database utilized by Microsoft, Google, Dropbox, Twitter, etc. Your little fantasy about the POTUS adding images to it would affect every one of those major cloud storage providers, not just Apple. 

    There is nothing new here. You’re not forced to use any of these commercial cloud services to host your images online. 

    And reporting child porn libraries to the authorities is not some choice Apple makes — they’re required to by law. As are the other cloud storage services. You can’t store child porn, it’s against the law. 
    You can't burst into someone's house haphazardly on that pretext; you need a search warrant. 

    Neither you nor Apple can burst into my phone without a search warrant either; it is as private as my house.
    100% this. People have this mentality of "if you haven't done anything wrong, you have nothing to worry about" as a pretext for invading privacy without proper due process. It's ludicrous.  
  • Reply 34 of 90
    mattinoz Posts: 2,316, member
    elijahg said:

    Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."
    So they’re actually not analysing photos on-device as they’re uploaded to iCloud? Meaning the content of the white paper Apple published the other day is actually wrong? Or does he think people will be convinced all’s fine by claiming that photos are not scanned, only “analysed” as part of the upload process? That code could easily be expanded into a daemon that scans all data on the phone at any time. That’s the issue. Nothing to do with messaging (though that wasn’t great either).
    There is code that scans all your photos in Photos whether they are in iCloud or not. It powers photo search. No one seemed worried that it was there, but Apple could just as easily expand that system and become a bad actor. No one has suggested they will, and it’s been there for years. 

    It seems they don’t even use that system for the one people are getting upset about. 
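
    For context, the kind of on-device analysis mattinoz describes (the scene classification behind Photos search) is exposed through Apple's Vision framework. A rough sketch, with error handling trimmed and an illustrative file path:

    ```swift
    import Foundation
    import Vision

    // On-device scene classification of the sort that powers Photos search.
    // Everything runs locally; nothing leaves the device.
    let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "photo.jpg"))
    let request = VNClassifyImageRequest()
    try? handler.perform([request])

    for observation in request.results ?? [] where observation.confidence > 0.8 {
        print(observation.identifier)   // e.g. "beach", "dog", "sunset"
    }
    ```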
  • Reply 35 of 90
    M68000 Posts: 725, member
    Just imagine if Apple had turned this on without letting the public know about it, and it was found out later… Imagine what the comments would be like then about how sneaky Apple is… But it's all in the name of attempting to make the world a better place, which a scary number of people are against, judging by their comments.
  • Reply 36 of 90
    mejsric Posts: 152, member
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like how 1,000 gas-powered cars catching fire don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    I haven't heard any news about Google and Facebook scanning photos or files on my computer. 
  • Reply 37 of 90
    coolfactor Posts: 2,241, member
    xyzzy-xxx said:

    - Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?


    One would think that prevention is smart. Apple is not trying to "secretly find" the bad people, but to discourage bad people from using its devices and networks for nefarious purposes going forward.

    There's no retroactive scanning of existing photos, since the entire process starts with the iPhone or iPad and does a "handshake" with the server. No voucher, no scanning. All existing photos are excluded. This only applies to _new_ photos added to devices and iCloud.
  • Reply 38 of 90
    coolfactor Posts: 2,241, member
    Rayz2016 said:

    Apple is determined to drive this through, no matter what; and you have to wonder why. I mean they already scan images on their servers, so why are they so determined to get spyware running on your phone?
    ...
    The message Apple is trying to get across is that your privacy is not compromised, because we're just dealing with a representation of your data, not the data itself. 

    You are posting from a position of knowing better than Apple, yet you are spreading FUD. That's peculiar.

    How do you know that they scan images on their servers? Prove it. Images are stored in secure, encrypted storage.

    The "spy" in spyware suggests that it's hidden, secret software. Apple has come out with a full explanation about this feature before it rolls out. Nothing is hidden, nothing is secret. And they set the threshold really high. Many switches have to be thrown before any follow-through action is taken on a user's images. Very few accounts will ever reach that threshold, based on what they've described so far.
  • Reply 39 of 90
    hexclock Posts: 1,252, member
    elijahg said:

    Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."
    So they’re actually not analysing photos on-device as they’re uploaded to iCloud? Meaning the content of the white paper Apple published the other day is actually wrong? Or does he think people will be convinced all’s fine by claiming that photos are not scanned, only “analysed” as part of the upload process? That code could easily be expanded into a daemon that scans all data on the phone at any time. That’s the issue. Nothing to do with messaging (though that wasn’t great either).
    He’s very clearly referring to the misunderstanding that Apple is analyzing all photos all the time, like in Messages or Telegram. It isn’t. It’s analyzing the photo prior to uploading it to their commercial iCloud servers, so they don’t have to host child porn. 

    These are numeric hash comparisons to the hash signatures of known child porn. The hash is not the photo, it’s a numeric representation of the photo — in the same way that your iPhone doesn’t store your FaceID or TouchID images, but instead stores a numeric hash which is used for authentication. 

    Man, how do you not get this yet?
    Heh…one could argue that the photo itself is just a numeric representation of reality. 
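
    To illustrate the "the hash is not the photo" point above: a fingerprint is a fixed-size number derived from the image, and only the numbers are compared. A sketch using SHA-256 as a stand-in (the real system uses a perceptual hash, NeuralHash, which also survives resizing and re-encoding, whereas SHA-256 matches only byte-identical files):

    ```swift
    import CryptoKit
    import Foundation

    // Derive a fixed-size, irreversible fingerprint from image bytes.
    let imageData = (try? Data(contentsOf: URL(fileURLWithPath: "photo.jpg"))) ?? Data()
    let digest = SHA256.hash(data: imageData)                 // 32 bytes
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()

    // Comparing fingerprints reveals nothing about the pixels themselves.
    let knownBadFingerprints: Set<String> = []                // supplied externally
    print(knownBadFingerprints.contains(fingerprint))         // prints "false" here
    ```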
  • Reply 40 of 90
    fastasleep Posts: 6,417, member
    mattinoz said:
    elijahg said:

    Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."
    So they’re actually not analysing photos on-device as they’re uploaded to iCloud? Meaning the content of the white paper Apple published the other day is actually wrong? Or does he think people will be convinced all’s fine by claiming that photos are not scanned, only “analysed” as part of the upload process? That code could easily be expanded into a daemon that scans all data on the phone at any time. That’s the issue. Nothing to do with messaging (though that wasn’t great either).
    There is code that scans all your photos in Photos whether they are in iCloud or not. It powers photo search. No one seemed worried that it was there, but Apple could just as easily expand that system and become a bad actor. No one has suggested they will, and it’s been there for years. 

    It seems they don’t even use that system for the one people are getting upset about. 
    Or Spotlight, for that matter; it literally searches all of your data. WHAT IF THEY... etc.