German government wants Tim Cook to reconsider CSAM plans

The German parliament is wading into the Apple CSAM debate, with a Digital Agenda committee chief writing to Apple CEO Tim Cook to express concerns about mass surveillance.




Apple's CSAM scanning system continues to draw criticism over its very existence, much of it stemming from misinterpretation of what the system does and is capable of doing. The criticism has now reached the German Bundestag, the national parliament, which is stepping into the affair.

A letter sent by Digital Agenda committee chairman Manuel Hoferlin claims Apple is treading a "dangerous path" and is undermining "secure and confidential communication," according to Heise Online. The letter urges Cook not to implement the system, both to protect society's data and to avoid "foreseeable problems" for the company itself.

The CSAM tools are considered the "biggest breach of the dam for the confidentiality of communication that we have seen since the invention of the Internet," a translated extract from Hoferlin's letter reads. Without confidential communication, the internet would become "the greatest surveillance instrument in history."

The proposed tools actually perform two separate tasks. The main tool is a scanner that checks hashes of images being uploaded to iCloud Photos against a database of hashes of known CSAM images; it does not examine the content of the files themselves.

The second is in Messages, which will warn young users about harmful material they may see in messages and will inform family account administrators of the incident. While this system uses on-device machine learning to inspect the images, no data about the scans is sent back to Apple.
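In rough terms, the iCloud Photos check is a set-membership test on image fingerprints rather than an inspection of the pictures themselves. The sketch below illustrates that first tool in Python; the plain SHA-256 digest is only a stand-in for Apple's perceptual NeuralHash, and the database contents and folder name are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of hashes of known CSAM images.
# Apple's real system uses perceptual (NeuralHash) values supplied by
# child-safety organizations, not plain SHA-256 digests of file bytes.
KNOWN_BAD_HASHES = {
    "0" * 64,  # placeholder entry
}

def image_fingerprint(path: Path) -> str:
    """Return a fixed-length fingerprint of the file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """True only if the fingerprint is in the known-hash set.

    The check never interprets what the image actually depicts."""
    return image_fingerprint(path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    for photo in Path("upload_queue").glob("*.jpg"):  # hypothetical upload folder
        print(photo.name, matches_known_database(photo))
```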

Despite the narrow scope of the tools and reassurances from Apple, the parliamentarian insists that a narrow backdoor is still a backdoor. Requests to open it to scan other types of content are inevitable, he adds, and Apple could risk losing access to major international markets if it rejects them.

The Bundestag letter arrives one day after a German journalist union called for the European Commission to look into the matter, due to the perceived "violation of the freedom of the press."

Read on AppleInsider

Comments

  • Reply 1 of 42
bluefire1 Posts: 1,302 member
No matter how laudable Apple's motives may be, this is a slippery slope that should definitely be reconsidered. Privacy, which Apple consistently prides itself on and boasts about, shouldn't have any back-door exceptions.
edited August 2021
  • Reply 2 of 42
    bluefire1 said:
No matter how laudable Apple's motives may be, this is an idea which should never have come to pass.
Totally agree. If it's really about the cloud, they would need to scan in the cloud, not put spyware onto a billion devices.

Since scanning data on users' devices is prohibited in many countries, Apple could also be in legal trouble (even though the rollout is officially US-only), since the company plays gatekeeper of this spyware and could change things at any time (even for specific users)!
  • Reply 3 of 42
ne1 Posts: 69 member
    xyzzy-xxx said:
    bluefire1 said:
No matter how laudable Apple's motives may be, this is an idea which should never have come to pass.
Totally agree. If it's really about the cloud, they would need to scan in the cloud, not put spyware onto a billion devices.

Since scanning data on users' devices is prohibited in many countries, Apple could also be in legal trouble (even though the rollout is officially US-only), since the company plays gatekeeper of this spyware and could change things at any time (even for specific users)!
    Agree with both these points. 

Apple needs to pull this “feature” from iOS 15 in North America and everywhere else. It goes against everything Apple stands for.
  • Reply 4 of 42
genovelle Posts: 1,480 member
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as are many others: they have these files themselves and are fearful of being caught.
  • Reply 5 of 42
    genovelle said:
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as are many others: they have these files themselves and are fearful of being caught.
    Sure... everyone that's against this has CSAM... that makes perfect sense...

    Or I don't know, maybe people actually value their privacy and Apple just messed up big time?

    It's not an objection to the scanning, it's an objection to the scanning being done on your device without you having the option to disable it.
edited August 2021
  • Reply 6 of 42

It's quite possible that any detection system that may actually work will receive a lot of criticism. Not because loads of people are guilty of what is being looked for, but because most of us probably have files/photos which we would rather not have looked at, as they may be marginally or actually illegal.

The black economy is no doubt way bigger than child pornography, and we fear its discovery by whatever government we are hiding from.

    Politicians may be extra nervous.

  • Reply 7 of 42
EsquireCats Posts: 1,268 member
I'm reading a lot of hysterical comments from people who have completely missed how this is implemented.

It's no more a slippery slope than Facebook, Instagram, Google Photos, YouTube, et al. each scanning content and hashes upon upload. To address those worried this would devolve into a tool to suppress political dissent: it's trivial to rework a political message to evade a hash by reinventing the content - indeed this happens naturally even without such restrictions - while on the other hand it's difficult to rework a series of photos or videos to evade filters (see also copyright filters).

Now onto the hysterics: there is nothing preventing nefarious governments from requesting nefarious deeds from all of the above-named services, so the suggestion that governments will uniquely lobby Apple to do this is a shaky argument (especially when considering that the above services all have more users than iCloud Photo Library).

I also don't buy the privacy argument: your photo library already hashes your photos; that's why you can search your library descriptively (e.g. type "Cat" to get photos iOS "thinks" are cat photos in your library). This is merely adding a new source of hashes and doing something about matches that should be getting attention. Furthermore, it only occurs when using a specific online photo service provided by Apple.
edited August 2021
  • Reply 8 of 42
zimmie Posts: 651 member
    genovelle said:
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as are many others: they have these files themselves and are fearful of being caught.
    Sure... everyone that's against this has CSAM... that makes perfect sense...

    Or I don't know, maybe people actually value their privacy and Apple just messed up big time?

    It's not an objection to the scanning, it's an objection to the scanning being done on your device without you having the option to disable it.
    You have the option to disable it! Don't sync your photos to iCloud, and your photos won't ever be hashed by this tool!

    Client-side CSAM detection is incontrovertibly better for privacy than server-side CSAM detection, which Apple currently does. To do the scanning on the server side, the server (and by extension, Apple) has to be able to see the photos you send to them. With client-side scanning, your photos can be encrypted before Apple ever gets them, so Apple can't ever see them. There have been several known incidents where employees of photo sync/sharing sites have saved customers' private images and shared them with other employees without the consent of the people in the photos. NSA employees are known to have done the same with photos caught in their surveillance dragnets. Client-side CSAM scanning and sending only encrypted images to Apple is specifically meant to address that type of issue.

Whether the scanning should happen at all is definitely worth debating. The legal teams of every major photo sharing site clearly believe US law currently requires this scanning to happen. Dropbox, Facebook, Flickr, Google, Instagram, OneDrive, and more all do exactly the same scanning server-side that Apple does today.
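To illustrate the distinction in very rough terms: with client-side matching, the device can fingerprint and encrypt a photo before upload, so the server only ever receives ciphertext plus a match voucher. The Python sketch below is a drastic simplification with hypothetical names; the real design uses a perceptual hash, private set intersection, and threshold secret sharing rather than a plaintext boolean.

```python
# Requires: pip install cryptography
import hashlib
from cryptography.fernet import Fernet

KNOWN_BAD_HASHES = {"0" * 64}        # hypothetical on-device hash database (placeholder entry)
DEVICE_KEY = Fernet.generate_key()   # illustrative key; the point is the server never holds it

def prepare_for_upload(photo_bytes: bytes) -> dict:
    """Client side: match locally, then encrypt before anything leaves the device."""
    fingerprint = hashlib.sha256(photo_bytes).hexdigest()   # NeuralHash stand-in
    voucher = {"matched": fingerprint in KNOWN_BAD_HASHES}  # simplified "safety voucher"
    ciphertext = Fernet(DEVICE_KEY).encrypt(photo_bytes)    # server never sees plaintext
    return {"payload": ciphertext, "voucher": voucher}

def server_receives(upload: dict) -> None:
    """Server side: only ciphertext and the voucher are visible here."""
    print("received", len(upload["payload"]), "encrypted bytes; voucher:", upload["voucher"])

server_receives(prepare_for_upload(b"example photo bytes"))
```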
edited August 2021
  • Reply 9 of 42
lkrupp Posts: 10,557 member
    genovelle said:
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as are many others: they have these files themselves and are fearful of being caught.
    Sure... everyone that's against this has CSAM... that makes perfect sense...

    Or I don't know, maybe people actually value their privacy and Apple just messed up big time?

    It's not an objection to the scanning, it's an objection to the scanning being done on your device without you having the option to disable it.
Well, if the user has the option to disable it, then it's worthless as a tool to detect CSAM, isn't it?
  • Reply 10 of 42
zeus423 Posts: 240 member
    I wonder how many people at Apple never saw the uproar that this is creating. Everyone was just sitting around a big apple-shaped table nodding their heads, "Yes, this is a great idea. I don't see how anyone could possibly be against it."
  • Reply 11 of 42
This is a problem in concept. It doesn't matter whether it's hashes or anything else. The concept is that Apple is checking people's private data. They can be endlessly clever and it's still a problem. It's still amazing that they didn't see that at the beginning. Maybe it's arrogance. Maybe it's government pressure. Maybe it's stupidity. Whatever the case, it's wrong.
  • Reply 12 of 42
entropys Posts: 4,166 member
    genovelle said:
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as are many others: they have these files themselves and are fearful of being caught.

  • Reply 13 of 42
entropys Posts: 4,166 member
    lkrupp said:
    genovelle said:
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as are many others: they have these files themselves and are fearful of being caught.
    Sure... everyone that's against this has CSAM... that makes perfect sense...

    Or I don't know, maybe people actually value their privacy and Apple just messed up big time?

    It's not an objection to the scanning, it's an objection to the scanning being done on your device without you having the option to disable it.
Well, if the user has the option to disable it, then it's worthless as a tool to detect CSAM, isn't it?
True. Better not to do this overreach in the first place. It's surveillance, and it just requires a tweak to a different hash list to scan for Winnie the Pooh imagery. Or an Aung San Suu Kyi one.
  • Reply 14 of 42
Ofer Posts: 241 unconfirmed, member
    As others have already said, technically this amounts to illegal search. Law enforcement agencies are required to have reasonable suspicion and to obtain a warrant in order to search for illegal materials. Yet what Apple is doing is searching through every iCloud owner’s account for illegal material. Regardless of whether or not other companies already do this or whether or not it has potential for abuse by authorities, the act of searching people who are supposed to be presumed innocent is not right.
  • Reply 15 of 42
mjtomlin Posts: 2,673 member
Google, Facebook, et al. actually scan and identify all images once they're in the cloud, so they actually know what your images contain… whether it be child pornography or not - and they report those that match CSAM images. I believe Facebook has said they've matched and reported more than 20,000,000 CSAM images on their servers. No one seems to have a problem with that.

Here's how Apple handles the same problem…

    1. Your Apple device (not Apple) hashes a photo before it is uploaded to iCloud Photos and only when it is about to be uploaded to iCloud Photos.
2. That hash is compared against a local database of CSAM hashes and flagged if there's a match.
    3. Apple then looks through iCloud photo libraries on their server for flagged photos.
    4. If a threshold is hit (X number of flagged photos), then Apple manually checks the hashes with a database they keep on the server side. If these are valid CSAM hashes then the account is flagged and reported.
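A minimal sketch of that four-step flow, with hypothetical names, a plain SHA-256 digest standing in for NeuralHash, and an illustrative threshold value:

```python
import hashlib

LOCAL_CSAM_HASHES = {"<known hash 1>", "<known hash 2>"}  # step 2: on-device database (placeholders)
REPORT_THRESHOLD = 30                                     # step 4: illustrative number only

def hash_before_upload(photo_bytes: bytes) -> str:
    # Step 1: the device fingerprints the photo only as it is about to be uploaded.
    return hashlib.sha256(photo_bytes).hexdigest()

def flag_if_matched(photo_bytes: bytes) -> bool:
    # Step 2: compare against the local hash database; flag on a match.
    return hash_before_upload(photo_bytes) in LOCAL_CSAM_HASHES

def review_account(flagged_hashes: list[str], server_side_db: set[str]) -> bool:
    # Steps 3-4: the server only re-checks flagged hashes against its own database,
    # and the account is reported only once enough matches are confirmed.
    confirmed = [h for h in flagged_hashes if h in server_side_db]
    return len(confirmed) >= REPORT_THRESHOLD

if __name__ == "__main__":
    print("flagged:", flag_if_matched(b"example photo bytes"))
```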

Nowhere in any of those steps does Apple ever scan your photos or perform any image recognition. They do not know what any of your images are or contain at any time. They do not look at the images; they only compare the hashes.

There is no way this system can be abused by any third party, as the actual images are unknown to Apple. If the CSAM database on the local device were somehow hacked and filled with hashes of some other type of image, they would not match during the secondary check on the server side unless that hacker were also able to completely replace Apple's CSAM hash database.

If I had to guess why Apple has developed this system, it is because they eventually plan on end-to-end encryption for all user files and data. That would allow photos to be encrypted on device before being sent to the cloud while still enabling them to check for CSAM material - which will probably become law soon, making online hosting/storage services liable for storing such material. This is not a back door… it's a loophole. It allows Apple to be compliant with the law without giving law enforcement (or anyone else) access to all your data.

edited August 2021
  • Reply 16 of 42
jido Posts: 125 member
    Client-side CSAM detection is incontrovertibly better for privacy than server-side CSAM detection.
    THAT!

I am happy with the change; it means Apple doesn't need to see the contents of my pictures to thwart child abuse.

The Messages update may be more nefarious; I don't want to notify authorities about what images I receive.
  • Reply 17 of 42
mjtomlin Posts: 2,673 member
    xyzzy-xxx said:
    bluefire1 said:
No matter how laudable Apple's motives may be, this is an idea which should never have come to pass.
Totally agree. If it's really about the cloud, they would need to scan in the cloud, not put spyware onto a billion devices.

Since scanning data on users' devices is prohibited in many countries, Apple could also be in legal trouble (even though the rollout is officially US-only), since the company plays gatekeeper of this spyware and could change things at any time (even for specific users)!

No data is scanned on user devices. This is similar to how data is encrypted: the encryption scheme doesn't care what the data is, it just encrypts it. Creating a hash of data is the exact same thing. And this only happens when a photo is being uploaded to Apple's iCloud Photos. It's more secure and more privacy-focused to do this before the upload, as it opens the possibility of the photo itself being encrypted on device before it gets to the server. This means Apple (or anyone else) would not be able to look at it.
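As a tiny illustration of that point, a hash function produces the same kind of fixed-length fingerprint regardless of the input, and the fingerprint cannot be reversed to recover the content (plain SHA-256 here, purely as a stand-in for the perceptual hash Apple actually uses):

```python
import hashlib

samples = [("holiday photo", b"\xff\xd8\xff\xe0 fake jpeg bytes"),
           ("text note", b"meet at 6pm")]

for label, data in samples:
    # Same 64-hex-character fingerprint format for any input; the digest
    # reveals nothing about what the original bytes contain.
    print(label, "->", hashlib.sha256(data).hexdigest())
```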

    Furthermore…
How do you think search works on devices? All data is scanned and indexed.
How do you think virus/malware protection works? Everything is scanned for a fingerprint.
How do you think data detectors work? Data is scanned and pattern matched.
How do you think handwriting recognition works? Images are scanned for text.
How do you think facial recognition works? Voice recognition? etc…

If it were illegal to scan data on user devices, most of these devices would be completely worthless paperweights.
  • Reply 18 of 42
Mike Wuerthele Posts: 6,861 administrator
    mjtomlin said:
Google, Facebook, et al. actually scan and identify all images once they're in the cloud, so they actually know what your images contain… whether it be child pornography or not - and they report those that match CSAM images. I believe Facebook has said they've matched and reported more than 20,000 CSAM images on their servers. No one seems to have a problem with that.


    20 million.
  • Reply 19 of 42
crowley Posts: 10,453 member
    Ofer said:
    As others have already said, technically this amounts to illegal search. Law enforcement agencies are required to have reasonable suspicion and to obtain a warrant in order to search for illegal materials. Yet what Apple is doing is searching through every iCloud owner’s account for illegal material. Regardless of whether or not other companies already do this or whether or not it has potential for abuse by authorities, the act of searching people who are supposed to be presumed innocent is not right.
They aren't searching through your account; they're checking the hash of your images on upload. And it's certainly not illegal for a private company to place conditions on your storing data on their servers and to reserve the right to check it.
  • Reply 20 of 42
crowley Posts: 10,453 member

most of us probably have files/photos which we would rather not have looked at, as they may be marginally or actually illegal.

    Most of us? :/ 