German journalism association stokes fear over Apple CSAM initiative

A German journalists' union has demanded that the European Commission step in over Apple's CSAM tools, believing that the system will be used to harvest contact information and perform other intrusions.

Apple's CSAM tools, intended to help fight the spread of illegal images of children, have courted controversy throughout August, as critics proclaim them to be an affront to privacy. The latest group to speak out about the supposed threat is, oddly, journalists in Germany, Austria, and Switzerland.

Journalist union DJV, which represents writers in Germany, believes that Apple "intends to monitor cell phones locally in the future." In a press release, the union calls the tools a "violation of the freedom of the press" and urges the European Commission and the German and Austrian federal interior ministers to take action.

According to Hubert Krech, spokesman for the public broadcasters' editors association AGRA, Apple has introduced "a tool with which a company wants to access other user data on their own devices, such as contacts and confidential documents," which the group believes violates GDPR rules.

Frank Überall, chairman of the DJV, adds that it could be the first step of many. "Will images or videos of opponents of the regime or user data be checked at some point using an algorithm?" Überall asks.

ORF editors' council spokesman Dieter Bornemann offers a bleaker outlook, suggesting a government could check for images that might serve as evidence that a user is part of the LGBT community. The group also fears that totalitarian states could take advantage of the system's supposed capabilities.

The group also dismisses the claim that the system will only apply in the United States, since most European media outlets have correspondents in the country. Furthermore, "what begins in the USA will certainly follow in Europe as well," the DJV states.

Misplaced concern

While worries about smartphones being snooped on by governments and security agencies can be well-founded in some cases, as with the Pegasus spyware scandal, the DJV seems to be overreaching with its claims about Apple's CSAM tools.

This is due in part to the nature of Apple's CSAM system itself. One element compares hashes of images being uploaded to iCloud Photos against a database of hashes of known CSAM images, rather than examining the images themselves.
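
To make that distinction concrete, the matching step amounts to a set-membership test on fingerprints. The sketch below is ours and heavily simplified: Apple's actual pipeline uses a perceptual NeuralHash plus a private set intersection protocol rather than a plain SHA-256 lookup, and the hash database here is an empty placeholder.

```swift
import Foundation
import CryptoKit

// Heavily simplified sketch of the matching step. Apple's real pipeline
// uses a perceptual NeuralHash and blinded private set intersection;
// SHA-256 and a plain Set stand in here purely for illustration.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical placeholder: the real database ships in blinded form and
// is never readable on the device.
let knownCSAMHashes: Set<String> = []

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    // Only the fingerprint is compared; the image content itself is
    // never examined.
    knownCSAMHashes.contains(fingerprint(of: imageData))
}

print(matchesKnownDatabase(Data([0x01, 0x02, 0x03])))  // false
```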

The second element is an on-device machine learning system for child accounts that use iMessage, and it doesn't compare images against CSAM databases at all. It also reports nothing to Apple; alerts go only to the parent account that manages the Family Sharing group.
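
A rough sketch of that second flow is below. The function names, classifier, and threshold are all hypothetical, since Apple doesn't publish this as an API; the point is only that the decision happens entirely on the device.

```swift
import Foundation

// Hypothetical sketch of the iMessage child-safety flow. Every name and
// number here is illustrative; the point is that the decision is made
// locally and nothing is reported to Apple or compared against a CSAM
// database.
func sensitivityScore(for imageData: Data) -> Double {
    // Stand-in for an on-device ML classifier (e.g. a Core ML model).
    return 0.0
}

func handleIncomingImage(_ imageData: Data,
                         isChildAccount: Bool,
                         notifyFamilyOrganizer: (String) -> Void) {
    guard isChildAccount, sensitivityScore(for: imageData) > 0.9 else {
        return  // adult accounts and benign images are untouched
    }
    // The image is blurred and the child is warned first; a notification
    // goes only to the parent managing the Family Sharing group.
    notifyFamilyOrganizer("A sensitive image was viewed on a child account.")
}

// Example: nothing is flagged for a benign image.
handleIncomingImage(Data(), isChildAccount: true) { print($0) }
```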

Following the initial outcry from the public and critics, and a warped view of the system's capabilities as a potential tool for government surveillance, Apple has attempted to set the record straight about the tools, with evidently limited success.

Apple privacy chief Erik Neuenschwander explained that the CSAM detection system has numerous safeguards to prevent any single government from abusing it. Apple has also published support documents explaining the system in more detail: what it does, and how it is kept safe from interference.
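
One such safeguard is a match threshold: individual matches reveal nothing, and no human review is possible until an account crosses a bound that Apple has publicly put on the order of 30 images. A toy version of the counting logic follows, with the caveat that the real design enforces the bound cryptographically rather than with a counter.

```swift
// Toy sketch of the threshold safeguard. In Apple's design this bound is
// enforced with threshold secret sharing, so single matches are
// unreadable even to Apple; a plain counter is used here only to
// illustrate the logic.
struct MatchTracker {
    private(set) var matchCount = 0
    let reviewThreshold = 30  // the order of magnitude Apple has cited

    mutating func recordMatch() -> Bool {
        matchCount += 1
        // Human review is only possible once the threshold is crossed.
        return matchCount >= reviewThreshold
    }
}

var tracker = MatchTracker()
for _ in 1...5 {
    precondition(tracker.recordMatch() == false)  // a few matches reveal nothing
}
```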

Apple SVP of software engineering Craig Federighi said on Friday that the company was wrong to release the three child protection features at the same time, which led to a "jumbled" and "widely misunderstood" assessment of the system.

"I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," said Federighi. "It's really clear a lot of messages got jumbled up pretty badly. I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening."


Comments

  • Reply 1 of 33
    chadbag Posts: 2,000 member
    While this journalist org is way jumping the shark on how this particular spyware Apple wants to put on the iPhone can be abused, they are correct that in the future Apple could easily change or enhance it to do other things. Only policies, which are easily changed, stop them. Changes to how it works, technically, are just a release away. All the assurances they give are based on policies. That is no assurance. Apple has lost the trust of people through this misguided CSAM feature (I am not addressing the Messages feature), and rightly so. And this is the result. It is not about the “messaging” or “optics”. It is the feature itself and the opening of the Pandora’s box of on-device spyware.
  • Reply 2 of 33
    Beats Posts: 3,073 member
    People are seeing through the bull****. 
  • Reply 3 of 33
    They are quite right, because scanning private data on a user's device is not allowed in the EU.
    Even if the feature is not enabled, Apple could do so without the user noticing.
    So it's a prohibited back door.
  • Reply 4 of 33
    They're lobbying the EU because they know they would never succeed in court. It's all hypotheticals. 
  • Reply 5 of 33
    Rayz2016 Posts: 6,957 member
    "I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," said Federighi.

    No one is confused, Craig.
  • Reply 6 of 33
    Rayz2016 said:
    "I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," said Federighi.

    No one is confused, Craig.
    It's obvious people are confused because they keep bringing up "search/seizure" when talking about files uploaded to iCloud servers. 
  • Reply 7 of 33
    jungmark Posts: 6,926 member
    You would think journalists would get facts rather than use speculation. These same journalists probably use Google apps and do not see the irony there. 
  • Reply 8 of 33
    jungmark said:
    You would think journalists would get facts rather than use speculation. These same journalists probably use Google apps and do not see the irony there. 
    Well said, you beat me to the punch. "On Phone!".....ohhhh spooky. It's surreal that people make these arguments while using Android/Google and Facebook. What's quite real is that anyone who doesn't touch on the obvious data mining done on those platforms, while clutching their pearls about Apple's CSAM policy, is looking for media time. They want the media spotlight while clearly not wanting data privacy to be the important point.
  • Reply 9 of 33
    Mike Wuerthele Posts: 6,861 administrator
    If you can't see your post, it's time to take a look at the commenting guidelines conveniently linked at the bottom of every forum page and figure out why.
  • Reply 10 of 33
    entropys Posts: 4,166 member
    It doesn’t help when people are confused about what this does. But it also doesn’t help when people ignore what it does because “Think of the children!!”, like this article.

    Bottom line is, it has created a mechanism to identify people who have a list of specified imagery on their phone. Kiddie fiddlers to start, and who could argue against that? Next up, imagery that a less liberal government doesn’t like. And I mean liberal in the traditional sense, not the type the big brother acolytes in America who have usurped that term have claimed.

    And now Apple can’t back down despite the considerable blowback, because the “Won’t someone think of the children!!” McGuffin used to implement it has Apple trapped like a fly in amber.

    But it can’t claim it protects your privacy anymore.

  • Reply 11 of 33
    Mike Wuerthele Posts: 6,861 administrator
    entropys said:
    It doesn’t help when people are confused about what this does. But it also doesn’t help when people ignore what it does because “Think of the children!!”, like this article.

    Bottom line is, it has created a mechanism to identify people who have a list of specified imagery on their phone. Kiddie fiddlers to start, and who could argue against that? Next up, imagery that a less liberal government doesn’t like. And I mean liberal in the traditional sense, not the type the big brother acolytes in America who have usurped that term have claimed.

    And now Apple can’t back down despite the considerable blowback, because the “Won’t someone think of the children!!” McGuffin used to implement it has Apple trapped like a fly in amber.

    But it can’t claim it protects your privacy anymore.

    Considering that unless you have this material on your iPhone, the company still doesn't have any information at all about your photos, it absolutely can.

    What this article does is ignore hypothetical situations that are being used by folks. What this article also does is discuss how the system doesn't do anything like what the German journalists (who use Gmail accounts, by the way) claim it does. What it does not do is validate your opinion on the matter, and that's fine -- but not a shortcoming in any way.
  • Reply 12 of 33
    mjtomlin Posts: 2,673 member
    chadbag said:
    While this journalist org is way jumping the shark on how this particular spyware Apple wants to put on the iPhone can be abused, they are correct that in the future Apple could easily change or enhance it to do other things. Only policies, which are easily changed, stop them. Changes to how it works, technically, are just a release away. All the assurances they give are based on policies. That is no assurance. Apple has lost the trust of people through this misguided CSAM feature (I am not addressing the Messages feature), and rightly so. And this is the result. It is not about the “messaging” or “optics”. It is the feature itself and the opening of the Pandora’s box of on-device spyware.
    Umm… This is an extremely ignorant point of view… I hate to tell you this, but Apple is able, at any time, to add any feature to its OS to spy on people. It DOES NOT need this CSAM feature to do that. The system, once you read how it works, is peer-reviewed, and steps are in place that make it literally impossible to reuse for anything other than looking for child pornography.

    Apple controls the hardware, the OS, and the services. There is NOTHING stopping them from adding a service to scan and report on anything they want. They already scan and perform image recognition on all your photos in the Photos app to add tags: dog, mountain, cat, car, etc.

    All this freaking out over this issue is asinine and short-sighted. It would be a very huge issue if Apple tried to hide what it was doing.

    It seems people would rather remain ignorant and angry than take the time to educate themselves.
  • Reply 13 of 33
    mjtomlin Posts: 2,673 member
    entropys said:
    Bottom line is, it has created a mechanism to identify people who have a list of specified imagery on their phone. Kiddie fiddlers to start, and who could argue against that? Next up, imagery that a less liberal government doesn’t like. And I mean liberal in the traditional sense, not the type the big brother acolytes in America who have usurped that term have claimed.


    You do not need this CSAM system to do that!!! Apple already has image recognition that it can and does use on your photos!!! CSAM detection is only used for identifying child pornography. THAT’S IT!!!

    It looks at a file’s hash and compares it with a database of known file hashes. The image itself is not scanned or “looked” at.

    People acting like it can be used for something else are MISSING THE BIGGER PICTURE: IT IS NOT NEEDED. If Apple wanted to turn you in for other things, it already has that capability!!!! Your device already scans and tags your photos, describing the image.

    And to think that a government might try to co-opt it for other reasons is utterly ridiculous when, again, this system is not needed to scan through photos.

  • Reply 14 of 33
    I think it’s pretty disingenuous to dismiss these concerns as misplaced when multiple respected privacy groups outside this European group are still raising points. I would have appreciated a more neutral and journalistic tone. 
  • Reply 15 of 33
    Mike Wuerthele Posts: 6,861 administrator
    I think it’s pretty disingenuous to dismiss these concerns as misplaced when multiple respected privacy groups outside this European group are still raising points. I would have appreciated a more neutral and journalistic tone. 
    Well, it's pretty disingenuous to ignore the ludicrousness of "BUT WHAT IF IT CHANGES AND GETS WORSE LATER" as it pertains to the system today, or, like the journalists above, to suggest that their contacts will be harvested, somehow violating freedom of the press, considering that's not what's going on in any way, shape, or form.

    If it changes, we'll report on it.
  • Reply 16 of 33
    entropys Posts: 4,166 member
    I don’t see how it is disingenuous (or ludicrous, for that matter) to point out that an algorithm that creates hashes for a list of CSAM-identified imagery can just as easily create hashes for other unapproved imagery. The point being: who does the “unapproving”? Truth is, we only have Apple’s policy promise that it will never be extended.
    Now, I am pretty disposed to believe Apple, which is usually better than everyone else with respect to privacy. But this act dilutes that promise, because they got so focused on child protection that they seemingly forgot the future consequences (actually, that is the charitable reading; a harsher interpretation is that they wanted to create this algorithm and used child protection as the rhetorical enabler).

    Create the tool and it will be used. And you might never know to report on it.
  • Reply 17 of 33
    yojimbo007 Posts: 1,165 member
    BACKDOOR DISGUISED AS VIRTUE!
    Apple is killing its own mantra and breaking its ecosystem at the same time… in one stupid, dangerous, and hypocritical move!

    No iOS 15 or iPhone 13 or new iPad… not for me!
  • Reply 18 of 33
    I wonder if investigative journalists might come into possession of illegal pornographic content as part of their investigations of, say, abuse cases, and would be flagged by the system, thereby hindering their work. However, it does make me wonder why they would use a consumer platform such as iCloud if they work on such sensitive issues.
  • Reply 19 of 33
    roake Posts: 811 member
    entropys said:
    It doesn’t help when people are confused about what this does. But it also doesn’t help when people ignore what it does because “Think of the children!!”, like this article.

    Bottom line is, it has created a mechanism to identify people who have a list of specified imagery on their phone. Kiddie fiddlers to start, and who could argue against that? Next up, imagery that a less liberal government doesn’t like. And I mean liberal in the traditional sense, not the type the big brother acolytes in America who have usurped that term have claimed.

    And now Apple can’t back down despite the considerable blowback, because the “Won’t someone think of the children!!” McGuffin used to implement it has Apple trapped like a fly in amber.

    But it can’t claim it protects your privacy anymore.

    Considering that unless you have this material on your iPhone, the company still doesn't have any information at all about your photos, it absolutely can.

    What this article does is ignore hypothetical situations that are being used by folks. What this article also does is discuss how the system doesn't do anything like what the German journalists (who use Gmail accounts, by the way) claim it does. What it does not do is validate your opinion on the matter, and that's fine -- but not a shortcoming in any way.
    Apple has already bent its policies to China’s will, and will continue to do so. What IS hypothetical is to say, “I’ll bet this or that happens.” What is NOT hypothetical is to note that China and other countries have already bent Apple to their will on Apple Maps, privacy issues (servers have to be physically located in China, where the government can physically access them), and more. Is it more hypothetical to say that China WILL demand abuse of these new capabilities, or to say that it won’t? This will open a door that can’t easily be closed. No amount of gaslighting is going to make intelligent people believe otherwise.

    Apple is not going to abandon sales in China in order to hold true to some policy-based principles.  They have already proven that.
  • Reply 20 of 33
    Rayz2016 Posts: 6,957 member
    Beats said:
    People are seeing through the bull****. 
    Can you imagine what folk would be saying if Google were doing this?