Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

Posted in iCloud
Craig Federighi has said that Apple was wrong to release three child protection features at the same time, and wishes the confusion had been avoided.

Apple's Craig Federighi talking to Joanna Stern of the Wall Street Journal


Apple senior vice president of software engineering, Craig Federighi, has spoken out about the controversy over the company's new child protection features. In detailing how the iCloud Photos and Messages features differ, he confirms what AppleInsider previously reported.

But he also talked about how the negative reaction has been seen within Apple.

"We wish that this had come out a little more clearly for everyone because we feel very positive and strongly about what we're doing, and we can see that it's been widely misunderstood," he told the Wall Street Journal in an interview.

"I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," he continued. "It's really clear a lot of messages got jumbled up pretty badly. I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening."

Federighi then stepped through the processes involved in both the iCloud Photos scanning of uploaded CSAM (Child Sexual Abuse Material) images, and what happens when a child is sent such an image over Messages.

Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

"This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."

Asked whether there had been pressure to release this feature now, Federighi said there hadn't.

"No, it really came down to... we figured it out," he said. "We've wanted to do something but we've been unwilling to deploy a solution that would involve scanning all customer data."

Read on AppleInsider

Comments

  • Reply 1 of 90
    Then why not show it in the last keynote simply and clearly, along with a fancy video... or did you intentionally not want to draw attention to it and hope that letting the news get out at the end of a workweek would fit your agenda?
  • Reply 2 of 90
    darkvader Posts: 1,146 member
    The message is not the problem.

    The spyware is the problem.
  • Reply 3 of 90
    NYC362 Posts: 100 member
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch fire and get not a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world.
  • Reply 4 of 90
    iadlib Posts: 116 member
    For a company that prides itself on messaging, y’all really screwed up.
  • Reply 5 of 90
    This just gets more stupid:

    - Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

    - On the device so that every security researcher can see what happens – no, they can't know whether there is more going on in iCloud

    - Since it is on the device, it looks like a first step; the second step could be a neural network detecting images

    To reiterate myself, after buying a new iPhone every year since 2007, I will not update to iOS 15 and will not buy an iPhone 13 Pro until this is sorted out. Same applies to macOS Monterey.
  • Reply 6 of 90
    elijahg Posts: 2,843 member

    Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."
    So they’re actually not analysing photos on-device as they’re uploaded to iCloud? Meaning the content of the white paper Apple published the other day is actually wrong? Or does he think people will be convinced all’s fine by claiming that photos are not scanned, only “analysed” as part of the upload process? That code could easily be expanded out into a daemon that scans all data on the phone at any time. That’s the issue. Nothing to do with messaging (though that wasn’t great either).
  • Reply 7 of 90
    mfryd Posts: 223 member
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch fire and get not a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world.
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images.  I can easily imagine a narcissistic leader ordering Apple to add to that list images which are critical of the government.  Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie.  Such a President would have a strong incentive to add these photos to the list.  Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it was for a good cause).

    Apple fights to keep out back doors because they know that once those tools exist, at some point they will be used inappropriately by the government.  While I applaud Apple's goal of fighting child abuse, I know that once this tool exists, at some point it will be used inappropriately by the government.

    One of the key beliefs in the US Constitution is a healthy distrust of government.  We have three branches of government, each to keep watch on the other two.  We have a free press and free speech so that the public can stay informed as to what government is doing.  We have free and fair elections so the people can act as the ultimate watchdog by voting out representatives that are not doing what they should be doing.

    I strongly urge Apple to protect the privacy of our images by killing this feature.  At best, it will just push criminals to use other services for sharing their photos.  At worst, it is a tool that can be used by the government to prevent free speech.
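
    As a concrete illustration of the scan-and-threshold flow described above: a minimal Swift sketch in which each uploaded photo yields a match/no-match voucher, and nothing is flagged for human review until the match count crosses a limit. The names and the threshold value are illustrative; Apple's published design additionally wraps each voucher in threshold secret sharing so the server cannot read voucher contents below the limit.

```swift
import Foundation

struct SafetyVoucher {
    let isMatch: Bool   // result of the on-upload fingerprint comparison
}

// Flag an account for review only once the number of matching vouchers reaches the threshold.
func shouldFlagForReview(_ vouchers: [SafetyVoucher], threshold: Int) -> Bool {
    vouchers.filter { $0.isMatch }.count >= threshold
}

// Example: a library with a handful of matches stays below an illustrative threshold of 30.
let vouchers = Array(repeating: SafetyVoucher(isMatch: false), count: 500) +
    Array(repeating: SafetyVoucher(isMatch: true), count: 3)
print(shouldFlagForReview(vouchers, threshold: 30))   // false
```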
  • Reply 8 of 90
    davgreg Posts: 1,046 member
    Either Apple is on the side of owner/customer/user privacy or they are not.

    Let the legal system in place handle this stuff.

    Apple has opened a can of worms.
  • Reply 9 of 90
    StrangeDays Posts: 13,086 member
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload them. If you use any of those services, you’re already using CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 
  • Reply 10 of 90
    StrangeDays Posts: 13,086 member
    elijahg said:

    Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."
    So they’re actually not analysing photos on-device as they’re uploaded to iCloud? Meaning the content of the white paper Apple published the other day is actually wrong? Or does he think people will be convinced all’s fine by claiming that photos are not scanned, only “analysed” as part of the upload process? That code could easily be expanded out into a daemon that scans all data on the phone at any time. That’s the issue. Nothing to do with messaging (though that wasn’t great either).
    He’s very clearly referring to the misunderstanding that Apple is analyzing all photos all the time, like in Messages or Telegram. It isn’t. It’s analyzing the photo prior to uploading it to their commercial iCloud servers, so they don’t have to host child porn. 

    These are numeric hash comparisons to the hash signatures of known child porn. The hash is not the photo, it’s a numeric representation of the photo — in the same way that your iPhone doesn’t store your FaceID or TouchID images, but instead stores a numeric hash which is used for authentication. 

    Man, how do you not get this yet?
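
    To illustrate the "hash is not the photo" point in the reply above, here is a minimal Swift sketch in which the fingerprint is a fixed-size digest that can only be compared for equality, never turned back into an image. SHA-256 is used as a stand-in; Apple's NeuralHash is perceptual, so visually identical copies also match, but the one-way nature of the comparison is the same idea.

```swift
import CryptoKit
import Foundation

// A digest is a fixed-size number derived one-way from the image bytes.
func fingerprint(of imageBytes: Data) -> String {
    SHA256.hash(data: imageBytes).map { String(format: "%02x", $0) }.joined()
}

let holidayPhoto = Data("family at the beach".utf8)   // stand-in for real image bytes
let knownBadFingerprints: Set<String> = []            // digests supplied by NCMEC, never the images

let digest = fingerprint(of: holidayPhoto)            // always 64 hex characters, whatever the photo
print("digest:", digest)
print("matches known material:", knownBadFingerprints.contains(digest))   // false
```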
  • Reply 11 of 90
    StrangeDays Posts: 13,086 member
    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch fire and get not a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world.
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images.  I can easily imagine a narcissistic leader ordering Apple to add to that list images which are critical of the government.  Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie.  Such a President would have a strong incentive to add these photos to the list.  Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it was for a good cause).
    You can imagine whatever you like, but the system uses the same NCMEC child porn database utilized by Microsoft, Google, Dropbox, Twitter, etc.. Your little fantasy about the POTUS adding images to it would affect every one of those major cloud storage providers, not just Apple. 

    There is nothing new here. You’re not forced to use any of these commercial cloud services to host your images online. 

    And reporting child porn libraries to the authorities is not some choice Apple makes — they’re required to by law. As are the other cloud storage services. You can’t store child porn, it’s against the law. 
  • Reply 12 of 90
    Federighi and Cook are beginning to look a bit foolish at this point. That would be fine with me, if it wasn't starting to make Apple itself look a bit foolish.

    Fix the problem instead of this mea culpa farce, people.
  • Reply 13 of 90
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload them. If you use any of those services, you’re already using CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 
    Watch the interview! Federighi's word salad is distressing. He simply avoids tough questions, especially the one on backdoors. Stern does an outstanding job, but should have pushed him harder.

    It's not about CSAM.
  • Reply 14 of 90
    ctt_zh Posts: 83 member
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload them. If you use any of those services, you’re already using CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 
    Interesting, so are Microsoft, Google, Dropbox, Twitter and Tumblr also performing the hash checking on user devices? I thought they only performed the checks on their cloud servers (especially Microsoft and Google)... 
  • Reply 15 of 90
    If this was such a great idea, why are thousands of Apple employees signing an open letter against it? Craig keeps saying "we". Who exactly are "we"? Apple has had a problem with its executives and their ideas for a long time now. It's why the walled garden has become more like a walled prison for its customers. When Apple has to lobby against common sense ideas like the right to repair and the right to install whatever apps you want on the devices you own, you know there is a problem. With warrantless spying on personal data, Apple made it clear that they need new management. It is a good thing that there is a pandemic, or Apple execs might find the crowd booing at them during their next live product launch.
  • Reply 16 of 90
    I read the interview.  He said the system/process is protected by high levels of auditability.  There are no technical measures in place to prevent abuse.  That means when hashes of whatever the government wants get inserted into these databases, Apple can't do anything to stop it.

    No one was confused about anything here, Craig.  Everyone knows exactly where this leads.   Audit away, assholes.  
  • Reply 17 of 90
    ikir Posts: 130 member
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    If you read the FAQ or watch Rene's video, you will understand there is no spyware; the rest is paranoia.

  • Reply 18 of 90
    Rayz2016 Posts: 6,957 member
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload them. If you use any of those services, you’re already using CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 
    It's not exactly the same though, is it?

    Apple's system uses their own NeuralHash for matching, not PhotoDNA.

    But the biggest difference is that Apple is running the scan on your phone.
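
    For context on the perceptual-hash point above, here is a toy Swift contrast between a perceptual hash and a cryptographic one. The 8x8 average hash is a textbook example, not Apple's NeuralHash or Microsoft's PhotoDNA; it simply shows why a lightly recompressed copy can still match perceptually while an exact-match digest does not.

```swift
import CryptoKit
import Foundation

// Average hash over an 8x8 grayscale thumbnail: one bit per pixel, above/below the mean.
func averageHash(_ gray8x8: [UInt8]) -> UInt64 {
    let mean = gray8x8.reduce(0) { $0 + Int($1) } / gray8x8.count
    var bits: UInt64 = 0
    for (i, pixel) in gray8x8.enumerated() where Int(pixel) > mean {
        bits |= UInt64(1) << i
    }
    return bits
}

let original: [UInt8] = (0..<64).map { UInt8($0 * 4) }             // stand-in thumbnail
let recompressed = original.map { UInt8(min(255, Int($0) + 2)) }    // slightly brightened copy

// Perceptual match survives the small edit; the cryptographic hash only matches exact bytes.
print(averageHash(original) == averageHash(recompressed))                          // true
print(SHA256.hash(data: Data(original)) == SHA256.hash(data: Data(recompressed)))  // false
```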
  • Reply 19 of 90
    Rayz2016 Posts: 6,957 member
    But things must be getting a bit sticky if they're rolling out Hair Force One.

    It's odd, because with this much of a backlash, Google and Microsoft would've thrown in the towel and sat round the campfire for a rethink.

    Apple keeps on insisting that the problem is the dissenters: we just don't understand. We understand just fine, Craig; we just disagree with you.

    Apple is determined to drive this through, no matter what; and you have to wonder why. I mean they already scan images on their servers, so why are they so determined to get spyware running on your phone?

    I think the reason is that, after a couple of false starts, Cupertino is ready to go all in on its next big product: advertising. But how do you do this while keeping up the 'privacy' mantra? How do you get into user tracking when you've spent the past three or four years crucifying all the other companies who make money doing it?

    Well, to start with, you release a client-side tracker, give it a noble purpose, and then try to convince people that their privacy is protected because it is not moving around a real image; just a hashed representation of the image.

    If you can get people to accept that, then it's a lot easier to get them to accept step 2; a client-side tracker that logs what you're doing on the phone, which developers and marketers can hook into and extract information. But here's the clever part: the info they extract is a machine-learned representation of you that gets a unique number so it can be tracked across applications. But it doesn't contain any real details; not your name, address, health records, nothing; because as long as they know that 884398443894398 exercises three times a week, goes to a lot of cookery classes and has a subscription to PornHub, that's all they really care about. Why do they need to know your real name? They can serve relevant ads to that person without knowing who they are. Only Apple knows that, and they will not allow that information out.  The APIs to access this pseudo-you might even incur a subscription charge.

    But to make this work, they would need the user base to accept loggers running on their phones. And that's where we are now: Step 1. That's why the client-side tool cannot be dropped. Without it, the whole plan is screwed.

    Of course, this would work for apps, but once you get out onto the web then there's no API, so for that to work, Apple would need some kind of private relay that could substitute your details with your avatar when you make web requests.


    The message Apple is trying to get across is that your privacy is not compromised, because we're just dealing with a representation of your data, not the data itself. 


  • Reply 20 of 90
    Rayz2016 Posts: 6,957 member
    ctt_zh said:
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload them. If you use any of those services, you’re already using CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 
    Interesting, so are Microsoft, Google, Dropbox, Twitter and Tumblr also performing the hash checking on user devices? I thought they only performed the checks on their cloud servers (especially Microsoft and Google)... 

    No, they don't.  They perform the checks on their servers. They could certainly build something like that into Android, but it's open-source, so someone else would just build a version without it, or write an app to disable it.