Open letter asks Apple not to implement Child Safety measures
An open letter making the rounds online asks Apple to halt plans to roll out new Child Safety tools designed to combat child sexual abuse material, with signatories including industry experts and high-profile names like Edward Snowden.
The document, which reads more like an indictment than an open letter, offers a rundown of Apple's Thursday announcement that details upcoming features designed to detect CSAM.
A multi-pronged effort, Apple's system uses on-device processing to detect and report CSAM images uploaded to iCloud Photos, as well as to protect children from sensitive images sent through Messages (https://appleinsider.com/inside/imessage).
"While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products," the letter reads.
Once implemented, Apple's system will hash user photos and match them against a hashed database of known CSAM. The matching happens on device before upload and applies only to images bound for iCloud Photos. A second tool uses on-device machine learning to protect children under the age of 17 from viewing sexually explicit images in Messages. Parents can choose to be notified when children under 13 years old send or receive such content.
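A minimal sketch of that on-device flow may make the mechanics easier to picture. This is not Apple's implementation: the real system uses a perceptual "NeuralHash" and a private set intersection protocol rather than a plain digest lookup, and the `knownImageHashes` set and `matchesKnownDatabase` function below are hypothetical stand-ins.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device database of known CSAM image hashes.
// In Apple's actual design the database ships in blinded form, so the device
// cannot read its contents; a plain Set is used here only for readability.
let knownImageHashes: Set<String> = []  // would be loaded from a bundled resource

// Hex-encoded SHA-256 digest of the raw image bytes. Apple's system uses a
// perceptual NeuralHash instead, so visually identical images still match
// after resizing or recompression; SHA-256 is a readable stand-in only.
func imageDigest(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Runs on device before upload, and only for images headed to iCloud Photos.
// A real pipeline attaches an encrypted "safety voucher" to the upload rather
// than returning a boolean.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownImageHashes.contains(imageDigest(imageData))
}
```

The design point the sketch preserves is that the comparison runs entirely on device and consults only hashes, never the images behind them.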
According to the letter, Apple's techniques pose a problem because they can bypass end-to-end encryption.
"Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy," the letter argues.
For its part, Apple has gone on record as saying the new safety protocols do not create a backdoor to its hardware and software privacy features.
The letter goes on to include commentary and criticism from a range of experts including Matthew Green, a cryptography professor at Johns Hopkins University who was among the first to voice concern over the implications of Apple's measures. Green and Snowden are counted among the signatories, which currently lists 19 organizations and 640 individuals who added their mark via GitHub.
Along with a halt to implementation, the letter requests that Apple issue a statement "reaffirming their commitment to end-to-end encryption and to user privacy."
Comments
Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps? That was a free thing they did with Google, and they kept no data. Apple just stuck a big red "Kick Me" sign on their back.
No problem with the signers of the letter expressing this very important point. But here's my problem with them: where the hell have they been while the vast majority of smartphone users on the planet use a platform that tracks the hell out of them? They've just been handed a big platform to condemn a user-privacy issue based on Apple's CSAM surveillance, so where the hell is page two, protecting the hundreds of millions of people whose data is tracked constantly? Speak now or show yourselves to be chasing a few headlines.
EFF has been there since day one, calling out Facebook and Google but also calling out Apple when it was needed. Where were the rest of these letter signers? A few of them, I suspect, were busy collecting "third party research" grants. See how that works?
It's a private method for comparing an image hash against a specific database of image hashes (see the sketch after this list).
No indication that it'll become a generic API, or a feature for anyone else to use.
No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
No indication that it couldn't be shut down any time Apple wanted to shut it down.
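To make that scoping concrete, here is a hypothetical illustration of what such a closed check looks like: a type whose only operation is an exact-match membership test, with no way to enumerate the database or query anything but a hash. The `HashMatcher` name and shape are invented for illustration, not taken from Apple's code.

```swift
import Foundation

// Illustration of the commenter's point: the matcher exposes a single
// membership test against a fixed, opaque set of hashes. It cannot search
// content, list the database, or be pointed at anything else without new
// code being shipped.
struct HashMatcher {
    private let database: Set<String>  // opaque hashes only; no images, no metadata

    init(database: Set<String>) {
        self.database = database
    }

    // The sole public operation: "is this exact hash in the database?"
    func matches(_ imageHash: String) -> Bool {
        database.contains(imageHash)
    }
}
```

The guarantee here is structural: with membership testing as the only operation, a match can only ever occur for an image whose hash is already in the database.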
It's not FUD if he's playing devil's advocate.
C'mon, you of all people know how this game's played.
Not sure what the EFF expects when it comes to laws passed by authoritarian governments. The only options would be to comply or leave the market. If every private company leaves the market, then state-run companies will fill the void. So on the one hand you can say "we're not participating," but on the other you would have to admit that you haven't changed anything in terms of protections for the people who live in the country.
You have the option of doing that yourself. Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.
Sputtering about privacy, government surveillance, and so on when you are impotent to do anything, because the only alternative is leaving the internet, is not productive. Several have threatened or stated that they will no longer buy Apple products because of this. Who are they kidding? Where will they go? This will blow over, it will be implemented, and life will go on. By the way, both political parties are on board with this, so no matter who you vote for it’s a done deal.
So how about those of you clamoring for Apple to stop this tell us what you plan to do when it is implemented. What will be your response? Writing your Congressperson? Or will you be hypocrites and go about your business?
The “good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.
Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud Photos.
This would have the considerable disadvantage of making us store all our photos on device, but that was how it used to be (in the good old days), and the "bad guys" would then keep their photos safely on their own devices.