German journalism association stokes fear over Apple CSAM initiative
A German journalist's union has demanded that the European Commission step in over Apple's CSAM tools, believing that the system will be used to harvest contact information and perform other intrusions.
Apple's CSAM tools, intended to help fight the spread of illegal images of children, have courted controversy throughout August, as critics proclaim them to be an affront to privacy. The latest group to speak out about the supposed threat is, oddly, journalists in Germany, Austria, and Switzerland.
Journalist union DJV, which represents writers in Germany, believes that Apple "intends to monitor cell phones locally in the future." In a press release, the union calls the tools a "violation of the freedom of the press" and urges the European Commission and the Austrian and German federal interior ministers to take action.
According to Hubert Krech, spokesman for the public broadcasters' editors association AGRA, Apple has introduced "a tool with which a company wants to access other user data on their own devices, such as contacts and confidential documents," which he argues violates GDPR rules.
Frank Uberall, chairman of the DJV, adds that it could be the first step of many. "Will images or videos of opponents of the regime or user data be checked at some point using an algorithm?" Uberall asks.
ORF editors council spokesman Dieter Bornemann offers a bleaker outlook, suggesting a government could search for images that might serve as evidence that a user is part of the LGBT community. It is also feared that totalitarian states could take advantage of the system's supposed capabilities.
The group also dismisses the claim that the system will apply only in the United States, since most European media outlets have correspondents there. "What begins in the USA will certainly follow in Europe as well," the DJV states.
Misplaced concern
While the worry of having smartphones snooped on by governments and security agencies can be well-founded in some cases, as with the Pegasus spyware scandal, the DJV appears to be overreaching with its claims about Apple's CSAM tools.

This is in part due to the nature of Apple's CSAM system in the first place. One part involves checking hashes of images stored in iCloud Photos against a database of hashes of known CSAM images, rather than examining the images themselves.
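The hash-comparison step described above can be sketched in a few lines. This is purely illustrative: Apple's actual system uses its proprietary NeuralHash perceptual hash and on-device cryptographic matching, not SHA-256, and the `image_hash` and `matches_known_database` names here are hypothetical stand-ins to show the idea of comparing hashes against a database without inspecting the image content itself.

```python
import hashlib


def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash: hex digest of the image bytes."""
    return hashlib.sha256(data).hexdigest()


def matches_known_database(data: bytes, known_hashes: set) -> bool:
    """Return True if the image's hash appears in the database of known hashes.

    Only hashes are compared; the image content is never examined
    beyond computing the hash.
    """
    return image_hash(data) in known_hashes


# Usage: a one-entry "database" built from a sample byte string.
database = {image_hash(b"flagged-sample")}
print(matches_known_database(b"flagged-sample", database))  # True
print(matches_known_database(b"holiday-photo", database))   # False
```

Note that a real perceptual hash is designed so that near-duplicate images produce matching hashes, which a cryptographic hash like SHA-256 deliberately does not do; the sketch only captures the match-against-a-list structure.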
The second part is an on-device machine learning system for child accounts that have access to iMessage, one that doesn't compare against CSAM databases. In that element, the system doesn't report to Apple, only to the parental Family Sharing manager account.
Following the initial outcry from the public and critics, and the warped perception that the system could be used by governments for surveillance, Apple has attempted to set the record straight about the tools, with evidently limited success.
Apple privacy chief Erik Neuenschwander explained the CSAM detection system has numerous elements to prevent a single government from abusing it. Apple has also published support documents explaining the system in more detail, what it does, and how it is kept safe from interference.
Apple SVP of software engineering Craig Federighi said on Friday that the company was wrong to release the three child protection features at the same time, which led to a "jumbled" and "widely misunderstood" assessment of the system.
"I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," said Federighi. "It's really clear a lot of messages got jumbled up pretty badly. I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening."
Read on AppleInsider
Comments
Even if not enabled, Apple could do so without being noticed by the user.
So it's a prohibited back door.
And now Apple can't back down despite the considerable blowback, because the "Won't someone think of the children!!" MacGuffin used to implement it has Apple trapped like a fly in amber.
but it can’t claim it protects your privacy anymore.
What this article does is ignore hypothetical situations that are being used by folks. What this article also does is discuss how the system doesn't do anything like what the German journalists (who use Gmail accounts, by the way) claim it does. What it does not do is validate your opinion on the matter, and that's fine -- but not a shortcoming in any way.
Apple controls the hardware, the OS, and the services. There is NOTHING stopping them from adding a service to scan and report on anything they want. They already actually scan and perform image recognition on all your photos in their Photos app to add tags: dog, mountain, cat, car, etc.
All this freaking out over this issue is asinine and short-sighted. It would be a very huge issue if Apple tried to hide what it was doing.
It seems people would rather remain ignorant and be angry, than to actually take the time to educate themselves.
You do not need this CSAM policy to do that!!! Apple already has image recognition they can and do use on your photos!!! CSAM is only used for identifying child pornography. THAT'S IT!!!
It looks at a file’s hash and compares it with a database of other file hashes. The image is not scanned or “looked” at.
People acting like it can be used for something else are MISSING THE BIGGER PICTURE - IT IS NOT NEEDED. If Apple wanted to turn you in for other things, they already have that capability!!!! Your device already scans and tags your photos, describing the image.
And to think that a government might try to co-opt it for other reasons is utterly ridiculous, when again, this system is not needed to scan through photos.
If it changes, we'll report on it.