Apple employees express concern over new child safety tools

Posted in General Discussion, edited August 2021
Apple employees are voicing concern over the company's new child safety features set to debut with iOS 15 this fall, with some saying the decision to roll out such tools could tarnish Apple's reputation as a bastion of user privacy.

Pushback against Apple's newly announced child safety measures now includes critics from its own ranks who are speaking out on the subject in internal Slack channels, reports Reuters.

Announced last week, Apple's suite of child protection tools includes on-device processes designed to detect and report child sexual abuse material uploaded to iCloud Photos. Another tool protects children from sensitive images sent through Messages, while Siri and Search will be updated with resources to deal with potentially unsafe situations.

Since the unveiling of Apple's CSAM measures, employees have posted more than 800 messages to a Slack channel on the topic that has remained active for days, the report said. Those concerned about the upcoming rollout cite common worries pertaining to potential government exploitation, a theoretical possibility that Apple deemed highly unlikely in a new support document and statements to the media this week.

The pushback within Apple, at least as it pertains to the Slack threads, appears to be coming from employees who are not part of the company's lead security and privacy teams, the report said. Those working in the security field did not appear to be "major complainants" in the posts, according to Reuters sources, and some defended Apple's position by saying the new systems are a reasonable response to CSAM.

In a thread dedicated to the upcoming photo "scanning" feature (the tool matches image hashes against a hashed database of known CSAM), some workers have objected to the criticism, while others say Slack is not the forum for such discussions, the report said. Some employees expressed hope that the on-device tools will herald full end-to-end iCloud encryption.

Apple is facing down a cacophony of condemnation from critics and privacy advocates who say the child safety protocols raise a number of red flags. While some of the pushback can be chalked up to misinformation stemming from a basic misunderstanding of Apple's CSAM technology, other critiques raise legitimate concerns about mission creep and violations of user privacy that were not initially addressed by the company.

The Cupertino tech giant has attempted to douse the fire by addressing commonly cited concerns in a FAQ published this week. Company executives are also making the media rounds to explain what Apple views as a privacy-minded solution to a particularly odious problem. Despite its best efforts, however, controversy remains.

Apple's CSAM detection tools launch with iOS 15 this fall.

Read on AppleInsider

Comments

  • Reply 1 of 66
    Ofer Posts: 241 unconfirmed, member
    Sadly, even with this new tool, Apple is still the best game in town as far as user privacy is concerned. I’m a big fan of the company and have been using their products since the very first computer I’ve owned (anyone remember the Apple IIgs?). However, if they continue in this trajectory, I may have to start researching other options.
  • Reply 2 of 66
    The folks working in privacy and information security were probably told to dummy up. 
  • Reply 3 of 66
    So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.
  • Reply 4 of 66
    Ofer said:
    Sadly, even with this new tool, Apple is still the best game in town as far as user privacy is concerned. I’m a big fan of the company and have been using their products since the very first computer I’ve owned (anyone remember the Apple IIgs?). However, if they continue in this trajectory, I may have to start researching other options.
    Yes, I remember the IIgs, still have one...  as well as my first Macintosh (a platinum Plus I purchased new back in '87).

    Sadly, due to this CSAM scanning, I will almost certainly be moving away from Apple.  I have already disabled the auto-upgrade on our dozen or so iOS-based devices.  When iOS 15 comes out, that will be the end of updates for us.  Since we don't use iCloud and rarely text, this "feature" would have little impact on us.  However, the very notion that a company (any company) believes it has the right to place what amounts to spyware on devices I own is completely unacceptable!  If this means eventually going back to a "dumb" phone, then so be it.  Even if Apple were to state it will not include this "feature", I would probably not trust them enough to believe them...  perhaps they would just do it anyway without telling anybody.

    In fairness, I grew up consuming more than my fair share of dystopian novels and movies which might be influencing my position a bit.  :-)
  • Reply 5 of 66
    mknelson Posts: 1,125 member
    MisterKit said:
    So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.
    That's not quite the process. It isn't viewing your library in the way some articles imply.

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

    The hash list is provided by the National Center for Missing & Exploited Children (NCMEC). The images, or more likely just the hashes, would have been provided by prosecutors, police agencies, and child-welfare organizations.

    The document linked above describes the hashing algorithm. The algorithm computes a hash value for each image, and the hash values computed for the photos in the iCloud library are compared to the hash values in the list.

    An account is only flagged once a certain number of such hash matches is reached.
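
    To make the flag-after-N-matches idea concrete, here is a minimal sketch in Python. It is only an illustration, not Apple's implementation: SHA-256 stands in for the NeuralHash perceptual hash, the threshold value is made up, and the real system wraps the comparison in cryptographic machinery (safety vouchers and private set intersection) so the device never learns the match count.

    # Simplified sketch of "flag only after N hash matches".
    # Assumptions: SHA-256 stands in for a perceptual hash such as NeuralHash,
    # and MATCH_THRESHOLD is a made-up figure used purely for illustration.
    import hashlib
    from typing import Iterable

    MATCH_THRESHOLD = 30  # hypothetical value, not a published Apple figure

    def image_hash(image_bytes: bytes) -> str:
        """Stand-in for a perceptual hash of one photo."""
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(library: Iterable[bytes], known_hashes: set) -> int:
        """Count photos whose hash appears on the known-CSAM hash list."""
        return sum(1 for photo in library if image_hash(photo) in known_hashes)

    def account_flagged(library: Iterable[bytes], known_hashes: set) -> bool:
        """An account is flagged only once the match count reaches the threshold."""
        return count_matches(library, known_hashes) >= MATCH_THRESHOLD

    # Example (hypothetical data): account_flagged(list_of_photo_bytes, {"<hash1>", "<hash2>"})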
  • Reply 6 of 66
    "Apple Employees"? Multiple tens of thousands voiced concern or 20 did? I don't like these misleading headlines. It allows a few people to control the public message. 

    Apple, as another poster said, is still by far the best game in town. 

    CSAM is probably not the hill to fight the privacy battle. Be sure that government will come to Apple and push for more. That's a better hill to fight the battle. 
  • Reply 7 of 66
    elijahg Posts: 2,759 member
    "Apple Employees"? Multiple tens of thousands voiced concern or 20 did? I don't like these misleading headlines. It allows a few people to control the public message. 

    Apple, as another poster said, is still by far the best game in town. 

    CSAM is probably not the hill to fight the privacy battle. Be sure that government will come to Apple and push for more. That's a better hill to fight the battle. 
    800 posts, so unless a couple of people are spending a lot of time posting on Slack, it's a fair number of people.

    Problem is, at the top of that hill is not only CSAM; behind a chain-link fence is all the legal data that every iOS user owns. Whilst the government may not have the means to climb the hill, they do hold a pair of wire cutters.
  • Reply 8 of 66
    elijahg Posts: 2,759 member
    There really is significant pushback on this, and rightly so. Apple bleating on that "we believe privacy is a fundamental human right" means absolutely nothing now that they've announced this. 

    I don't have much respect for Cook at the best of times, but how he can stand up and tell barefaced lies that "privacy is in our DNA" is nigh on criminal IMO. They've blown people's trust overnight. 
    edited August 2021
  • Reply 9 of 66
    I think the Apple tool is a great idea.

    The iPhone has top notch photo and video capabilities and is very portable.
    As such, it can easily be a portable tool to abuse children.

    Apple wants to stop that.  Go Apple go go go!
  • Reply 10 of 66
    mknelson said:
    MisterKit said:
    So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.
    That's not quite the process. It isn't viewing your library in the way some articles imply.

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

    The hash list is provided by the National Center for Missing & Exploited Children (NCMEC). The images, or more likely just the hashes, would have been provided by prosecutors, police agencies, and child-welfare organizations.

    The document linked above describes the hashing algorithm. The algorithm computes a hash value for each image, and the hash values computed for the photos in the iCloud library are compared to the hash values in the list.

    An account is only flagged once a certain number of such hash matches is reached.
    You misread the original post. The commenter wasn’t saying that someone was viewing your library. The commenter was saying that someone had to view the child porn video in order to add it to the list.
  • Reply 11 of 66
    So I understand the hashes come from a database run by a government? So then that government can easily slip any other hash into the same database depending on what they are looking for.

    This is building in that back door that was requested some years ago.
  • Reply 12 of 66
    I can't wait to read about some Grandmother who gets arrested when they find a picture of naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
  • Reply 13 of 66
    baconstang Posts: 1,105 member
    tommikele said:
    I can't wait to read about some Grandmother who gets arrested when they find a picture of naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
    Oh, come on.  Misidentified hashes may happen, but they say the odds are very slim. 
    Hashes of images of dissidents or foreign terrorists (not domestic of course) might accidentally work their way into the database.  But probably only in countries with authoritarian governments.
    If you happen to be misidentified as a 'bad guy', I'm sure they'll send you a registered letter to come down to the 'office' and help them clarify the situation.  They'd never start surveilling you or break down your door over something like that... Right?
  • Reply 14 of 66
    chadbag Posts: 2,000 member
    tommikele said:
    I can't wait to read about some Grandmother who gets arrested when they find a picture of naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
    It doesn’t work like that.  Unless that infant grandchild pic was put into the child abuse pic database and had a hash made and put into the DB. 

    I am against this because of the way it is being done.  Apple is putting in plumbing that can later be abused when circumstances change, despite their best intentions today. 

    However, relying on misinformation does not help the discussion against this in any way. 
  • Reply 15 of 66
    chadbag Posts: 2,000 member
    Leifur said:
    So I understand the hashes come from a database run by a government? So then that government can easily slip any other hash into the same database depending on what they are looking for.

    This is building in that back door that was requested some years ago.
    I agree that this is a kind of backdoor and am against it for that reason. However, my understanding is that the hash database is NOT from a government or government agency. It is from a special non-profit set up to fight child abuse. 
  • Reply 16 of 66
    baconstang Posts: 1,105 member
    chadbag said:
    Leifur said:
    So I understand the hashes come from a database run by a government? So then that government can easily slip any other hash into the same database depending on what they are looking for.

    This is building in that back door that was requested some years ago.
    I agree that this is a kind of backdoor and am against it for that reason. However, my understanding is that the hash database is NOT from a government or government agency. It is from a special non-profit set up to fight child abuse. 
    NCMEC is funded with private and government funds.

    From Wikipedia:
    The National Center for Missing & Exploited Children (NCMEC) is a private, nonprofit organization established in 1984 by the United States Congress. In September 2013, the United States House of Representatives, the United States Senate, and the President of the United States reauthorized the allocation of $40 million in funding for the National Center for Missing & Exploited Children as part of the Missing Children's Assistance Reauthorization Act of 2013.
    edited August 2021
  • Reply 17 of 66
    Possibly part of the “screeching minority” who simply “don’t understand how this works”?
  • Reply 18 of 66
    baconstang Posts: 1,105 member
    Possibly part of the “screeching minority” who simply “don’t understand how this works”?
    More like the screeching minority who've seen this before...and do know how it works.
    edited August 2021
  • Reply 19 of 66
    entropys Posts: 4,166 member
    mknelson said:
    MisterKit said:
    So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.
    That's not quite the process. It isn't viewing your library in the way some articles imply.

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

    The hash list is provided by the National Center for Missing & Exploited Children (NCMEC). The images, or more likely just the hashes, would have been provided by prosecutors, police agencies, and child-welfare organizations.

    The document linked above describes the hashing algorithm. The algorithm computes a hash value for each image, and the hash values computed for the photos in the iCloud library are compared to the hash values in the list.

    An account is only flagged once a certain number of such hash matches is reached.
    People understand the process alright. CSAM is the MacGuffin used to introduce the ability to check against a hash list built right into the OS and surveil people's data. Surely only purveyors of kid p0rn would be against it!
    What an emotive button to press!
    But say a future government didn't like people who were fans of Winnie the Pooh. Pics of Pooh bear could be put on a hash list and the owners reported to the state. Or Che Guevara? Reagan, Thatcher, a religious dissident.

    Once this pathway is enabled in the OS, how can Apple say no? It won't be able to.
    edited August 2021
  • Reply 20 of 66
    chadbag Posts: 2,000 member
    chadbag said:
    Leifur said:
    So I understand the hashes come from a database run by a government? So then that government can easily slip any other hash into the same database depending on what they are looking for.

    This is building in that back door that was requested some years ago.
    I agree that this is a kind of backdoor and am against it for that reason. However, my understanding is that the hash database is NOT from a government or government agency. It is from a special non-profit set up to fight child abuse. 
    NCMEC is funded with private and government funds.

    From Wikipedia:
    The National Center for Missing & Exploited Children (NCMEC) is a private, nonprofit organization established in 1984 by the United States Congress. In September 2013, the United States House of Representatives, the United States Senate, and the President of the United States reauthorized the allocation of $40 million in funding for the National Center for Missing & Exploited Children as part of the Missing Children's Assistance Reauthorization Act of 2013.
    Yes. That is not a database "from the government".  A lot of private organizations get funds from the government.  Now, the government may have helped set it up and given it certain privileges.  But it is still a private org and not a governmental entity. 