Open letter asks Apple not to implement Child Safety measures

Posted in General Discussion · edited August 2021
An open letter making the rounds online asks Apple to halt plans to roll out new Child Safety tools designed to combat child sexual abuse material, with signatories including industry experts and high-profile names like Edward Snowden.

[Image: iOS 14]

The document, which reads more like an indictment than an open letter, offers a rundown of Apple's Thursday announcement that details upcoming features designed to detect CSAM.

A multi-pronged effort, Apple's system uses on-device processing to detect and report CSAM images uploaded to iCloud Photos, as well as protect children from sensitive images sent through Messages (https://appleinsider.com/inside/imessage).

"While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products," the letter reads.

When it is implemented, Apple's system will hash and match user photos against a hashed database of known CSAM. The process is accomplished on-device before upload and only applies to images sent to iCloud. A second tool uses on-device machine learning to protect children under the age of 17 from viewing sexually explicit images in Messages. Parents can choose to be notified when children under 13 years old send or receive such content.
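The hash-and-match flow described above can be sketched roughly as follows. This is a minimal illustration only: the real system uses a perceptual "NeuralHash" and cryptographic private set intersection rather than a plain cryptographic hash, and every name and value below (the database contents, the threshold, the function name) is hypothetical.

```python
import hashlib

# Hypothetical stand-in database of hashes of known images.
# In reality this would be a database of perceptual hashes,
# not SHA-256 digests of raw bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

# Hypothetical threshold: nothing is reported until at least
# this many photos match the database.
MATCH_THRESHOLD = 2


def scan_before_upload(photos):
    """Hash each photo on-device and count matches against the
    known-hash database; return True only if the match count
    meets the reporting threshold."""
    matches = sum(
        1 for photo in photos
        if hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

Note the design choice the threshold captures: a single incidental match stays below it, and only accounts that cross it would be surfaced for review.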

According to the letter, Apple's techniques pose an issue because they bypass end-to-end encryption.

"Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy," the letter argues.

For its part, Apple has gone on record as saying the new safety protocols do not create a backdoor to its hardware and software privacy features.

The letter goes on to include commentary and criticism from a range of experts including Matthew Green, a cryptography professor at Johns Hopkins University who was among the first to voice concern over the implications of Apple's measures. Green and Snowden are counted among the signatories, which currently lists 19 organizations and 640 individuals who added their mark via GitHub.

Along with a halt to implementation, the letter requests that Apple issue a statement "reaffirming their commitment to end-to-end encryption and to user privacy."


Comments

  • Reply 1 of 91
    DAalseth Posts: 2,783 member
    No matter how well intentioned, this effort will be used to severely damage Apple's reputation. It should be abandoned immediately. 
    Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps? And that was a free thing they did with Google, and they kept no data. Apple has just stuck a big red "Kick Me" sign on its back. 
  • Reply 2 of 91
    Not a big fan of some of the wording in the letter, but I understand the meaning behind the larger message. IMHO Apple made a mistake, at minimum, in how they rolled this out. DAalseth's comment was correct: people will attempt to use this to bludgeon Apple. How well it succeeds is doubtful, IMHO. But the surreal irony is that some of those doing the bludgeoning will be among the worst purveyors of data collection and exploitation.
    No problem with the signers of the letter expressing this very important point. But here's my problem with the signers of this letter: where the hell have they been while the vast majority of smartphone users on the planet were using a platform that was tracking the hell out of them? They've just been given a big platform to condemn a user privacy issue based on Apple's MEC surveilling, so where the hell is page 2, protecting the hundreds of millions of people getting their private data tracked constantly? Speak now, or show yourself to be looking for a few headlines. 

    EFF has been there since day 1, calling out Facebook and Google, but also calling out Apple when it was needed. Where were the rest of these letter signers? Unfortunately, a few of them were probably, I suspect, getting "third-party research" grants. See how that works?
  • Reply 3 of 91
    iadlib Posts: 95 member
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut. 
  • Reply 4 of 91
    Rayz2016 Posts: 6,957 member
    DAalseth said:
    No matter how well intentioned, this effort will be used to severely damage Apple's reputation. It should be abandoned immediately. 
    Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps? And that was a free thing they did with Google, and they kept no data. Apple has just stuck a big red "Kick Me" sign on its back. 
    Apple will insist there is no back door into the system, but what they don't realise is that this is the back door. This is the back door that governments have been asking for. All they need to do is add hashes from other databases (searching for pictures of dissidents, or words from subversive poetry) and tweak the threshold (you have to have four hits instead of eight), and you have an authoritarian government's wet dream. It is the ultimate surveillance tool. 

    More of a back passage than a back door, centrally controlled by Apple and law enforcement, allowing every phone to spy on its user. 

    It’s odd, but I’m typing this message on my iPad, and I have this notion that I no longer trust it, nor my iPhone, nor my Macs. I’m wary of them now. Even if Apple did reverse course (which they won’t), I don’t think that trust is coming back. 

    edited August 2021
  • Reply 5 of 91
    Rayz2016 Posts: 6,957 member
    Yup, this basically. 


  • Reply 6 of 91
    macplusplus Posts: 2,112 member
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
  • Reply 7 of 91
    crowley Posts: 10,453 member
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut. 
    FUD.

    It's a private method for comparing an image hash against a specific database of image hashes.  

    No indication that it'll become a generic API, or a feature for anyone else to use.
    No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
    No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
    No indication that it couldn't be shut down any time Apple wanted to shut it down.
  • Reply 8 of 91
    Rayz2016 Posts: 6,957 member
    crowley said:
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut. 
    FUD.


    It's not FUD if he's playing devil's advocate.

    C'mon, you of all people know how this game's played. :wink: 
  • Reply 9 of 91
    crowley said:
    iadlib said:
    FUD.

    It's a private method for comparing an image hash against a specific database of image hashes.  

    No indication that it'll become a generic API, or a feature for anyone else to use.
    No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
    No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
    No indication that it couldn't be shut down any time Apple wanted to shut it down.
    ”No indication” seems kind of naive. Just because there’s no indication now doesn’t mean it won’t change in the future.
  • Reply 10 of 91
    crowley Posts: 10,453 member
    Rayz2016 said:
    crowley said:
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut. 
    FUD.


    It's not FUD if he's playing devil's advocate.

    C'mon, you of all people know how this game's played. :wink: 
    I don't get it at all.  You can claim to be playing any role you want, it's still spreading FUD.
  • Reply 11 of 91
    Edward "I Fled to Russia Because I Was So Worried About Living in an Authoritarian Country" Snowden
  • Reply 12 of 91
    Rayz2016 said:
    Yup, this basically. 



    Not sure what the EFF expects when it comes to laws passed by authoritarian governments. The only options would be to comply or leave the market. If every private company leaves the market, then state-run companies will fill the void. So on the one hand you can say "we're not participating," but on the other you would have to admit that you haven't changed anything in terms of protections for the people who live in that country. 
  • Reply 13 of 91
    darkvader Posts: 1,146 member
    crowley said:
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut. 
    FUD.

    It's a private method for comparing an image hash against a specific database of image hashes.  

    No indication that it'll become a generic API, or a feature for anyone else to use.
    No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
    No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
    No indication that it couldn't be shut down any time Apple wanted to shut it down.

    It's not FUD.  There is no indication that it can't be used for scanning any type of data, and in fact it's obvious that it absolutely can be.

    Saying things like "oh, but they're just going after evil kiddy porn, think of the children" is a wilful misunderstanding of the technology.  This is incredibly dangerous, and should absolutely be scrapped before it's ever implemented. 
  • Reply 14 of 91
    bobolicious Posts: 1,139 member
    “In a world without digital privacy, even if you have done nothing wrong other than think differently, you begin to censor yourself,” Cook said in a 2019 commencement speech at Stanford University. “Not entirely at first. Just a little, bit by bit. To risk less, to hope less, to imagine less, to dare less, to create less, to try less, to talk less, to think less. The chilling effect of digital surveillance is profound, and it touches everything.” 


    “Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven’t shared with anyone,”


    I remember being shocked when Photos replaced iPhoto, in that it had auto-tagging with no opt-out or off switch...
  • Reply 15 of 91
    GeorgeBMac Posts: 11,421 member
    macplusplus said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.
    edited August 2021
  • Reply 16 of 91
    GeorgeBMac Posts: 11,421 member
    Rayz2016 said:
    Yup, this basically. 



    Not sure what the EFF expects when it comes to laws passed by authoritarian governments. The only options would be to comply or leave the market. If every private company leaves the market, then state-run companies will fill the void. So on the one hand you can say "we're not participating," but on the other you would have to admit that you haven't changed anything in terms of protections for the people who live in that country. 

    What about laws passed by democratic governments? Are they absolved of responsibility? Are they pure and holy and without fault?

    Have you heard of the Patriot Act?
    Or the data Apple supplied to the government on Democratic lawmakers?
  • Reply 17 of 91
    lkrupp Posts: 10,557 member
    So I wonder what the authors of this letter will do if Apple goes ahead. Who will they turn to for privacy? Do they not know that Google is doing this? So if they stop buying Apple, then who will they buy from? Or will they get off the internet?

    Sputtering about privacy, government surveillance, etc. when you are impotent to do anything because the only option is to leave the internet is not productive. Several have threatened or stated that they will no longer buy Apple products because of this. Who are they kidding, where will they go? This will blow over, will be implemented, and life will go on. By the way, both political parties are on board with this, so no matter who you vote for it’s a done deal.

    So how about those of you clamoring for Apple to stop this tell us what you plan to do when it is implemented. What will be your response? Writing your Congressperson? Or will you be hypocrites and go about your business?
  • Reply 18 of 91
    macplusplus said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    GeorgeBMac said:
    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud Photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days). Then the "bad guys'" photos would be stored safely on their own devices.

  • Reply 19 of 91
    crowley Posts: 10,453 member

    Assuming we believe CSAM is something that needs tackling,

    You’re suggesting that it might not be?
  • Reply 20 of 91
    bulk001 Posts: 764 member
    darkvader said:
    crowley said:
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut. 
    FUD.

    It's a private method for comparing an image hash against a specific database of image hashes.  

    No indication that it'll become a generic API, or a feature for anyone else to use.
    No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
    No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
    No indication that it couldn't be shut down any time Apple wanted to shut it down.

    It's not FUD.  There is no indication that it can't be used for scanning any type of data, and in fact it's obvious that it absolutely can be.

    Saying things like "oh, but they're just going after evil kiddy porn, think of the children" is a wilful misunderstanding of the technology.  This is incredibly dangerous, and should absolutely be scrapped before it's ever implemented. 
    There would need to be a database to scan it against. What data are you afraid of them scanning? What do you have that is so private, yet at the same time available in a database important enough to have been created to catch that information? If you are worried about them just reading your text messages, that is readily available to authoritarian governments already -- they will just force you to unlock your phone or stored data, and jail you (or worse) if you don't. 