Apple & EU slammed for dangerous child abuse imagery scanning plans

Posted in General Discussion
Security researchers say that Apple's CSAM prevention plans, and the EU's similar proposals, represent "dangerous technology" that expands the "surveillance powers of the state."

Apple's new child protection feature


Apple has not yet announced when it intends to introduce its child protection features, after postponing them because of concerns from security experts. Now a team of researchers has published a report saying that similar plans from both Apple and the European Union represent a national security issue.

According to the New York Times, the report comes from more than a dozen cybersecurity researchers. The group began its study before Apple's initial announcement, and says it is publishing despite Apple's delay in order to warn the EU about this "dangerous technology."

Apple's plan was for a suite of tools intended to protect children from the spread of child sexual abuse material (CSAM). One part would block suspected harmful images from being seen in Messages, and another would automatically scan images stored in iCloud.

The latter is similar to how Google has scanned Gmail for such images since 2008. However, privacy groups believed Apple's plan could lead to governments demanding that the firm scan images for political purposes.

"We wish that this had come out a little more clearly for everyone because we feel very positive and strongly about what we're doing, and we can see that it's been widely misunderstood," Apple's Craig Federighi said. "I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion."

"It's really clear a lot of messages got jumbled up pretty badly," he continued. "I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening."

There was considerable confusion, which AppleInsider addressed, as well as much objection from privacy organizations, and governments.

The authors of the new report believe that the EU plans a system similar to Apple's in how it would scan for images of child sexual abuse. The EU's plan goes further in that it also looks for signs of organized crime and terrorist activity.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers wrote in the report seen by the New York Times.

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," said the group's Susan Landau, professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

Apple has not commented on the report.

Read on AppleInsider

Comments

  • Reply 1 of 21
    elijahg Posts: 2,753 member
    "Confusion" was not the issue. Apple of course would say it is because to do otherwise would be an admission that the feature was toxic and entirely contradictory to their public privacy stance. Privacy organisations and governments weren't "confused", they could foresee the potential privacy consequences. Apple knows full well the pushback was due to their public "what's happens on your phone stays on your phone" stance, the polar opposite to scanning phones for CSAM - and the potential for further encroachment on privacy.
  • Reply 2 of 21
    gatorguy Posts: 24,176 member
    Expanding on the article just a bit:

    Documents released by the European Union and a meeting with E.U. officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse but also for signs of organized crime and indications of terrorist ties.

    A proposal to allow the photo scanning in the European Union could come as soon as this year, the researchers believe.

    ... now the EU knows Apple possesses this capability – it might simply pass a law requiring the iPhone maker to expand the scope of its scanning. Why reinvent the wheel when a few strokes of a pen can get the job done in 27 countries?

    “It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens,” the researchers wrote ...

    “Expansion of the surveillance powers of the state really is passing a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group […]

    “It’s allowing scanning of a personal private device without any probable cause for anything illegitimate being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”

    For those who want more details on what the EU may propose, read this report commissioned by the Council of the European Union and the recommended Council Resolution on Encryption: https://data.consilium.europa.eu/doc/document/ST-13084-2020-REV-1/en/pdf

    CSAM scanning as proposed by Apple is a horrible idea IMO.

    Anyone saying "oh it's no different than Google does in the cloud" or "anyone that doesn't want it just don't use the Photos app" needs to read this.
    edited October 2021
  • Reply 3 of 21
    Hopefully this technology will never be installed on users' devices!
  • Reply 4 of 21
    neilm Posts: 985 member
    Prediction: the CSAM horse is dead as a Monty Python parrot — no need to keep beating it. Yes, we know Apple says it’s only resting, and if we believe that, well…
  • Reply 5 of 21
    How Apple didn’t see these privacy and security threats to their users and to society is mind boggling. 
  • Reply 6 of 21
    The EU's plan goes further in that it also looks for organized crime and terrorist activity.
    I hate to be the one to say "I told you so", but I bloody well told you so.

    When the capability exists, overreaching governments will exploit it.
  • Reply 7 of 21
    zimmie Posts: 651 member
    elijahg said:
    "Confusion" was not the issue. Apple of course would say it is because to do otherwise would be an admission that the feature was toxic and entirely contradictory to their public privacy stance. Privacy organisations and governments weren't "confused", they could foresee the potential privacy consequences. Apple knows full well the pushback was due to their public "what's happens on your phone stays on your phone" stance, the polar opposite to scanning phones for CSAM - and the potential for further encroachment on privacy.
    Confusion absolutely is the issue. Today, Apple employees have the technical capability to view images you upload to iCloud. With the CSAM scanning, they would no longer have the technical capability to do so. That would be a substantial improvement to the current situation.

    The EU's plan goes further in that it also looks for organized crime and terrorist activity.
    I hate to be the one to say "I told you so", but I bloody well told you so.

    When the capability exists, overreaching governments will exploit it.
    Spotlight literally exists to index all content on the phone. That capability has existed for over a decade. Why haven't we seen any governmental attempt to force reporting if that index shows any document or message containing "bomb" and "president" or "minister", for example?



    I still think they should carve iCloud photo sync off into a separate tool, and make it clear the CSAM scanning is only part of that tool. If you want no CSAM scanning on your device, just delete the ability to sync photos to iCloud. Done. Solves everybody's complaints.
  • Reply 8 of 21
    StrangeDays Posts: 12,844 member
    neilm said:
    Prediction: the CSAM horse is dead as a Monty Python parrot — no need to keep beating it. Yes, we know Apple says it’s only resting, and if we believe that, well…
    Not so. CSAM scanning for child porn is still happening on iCloud as it has for almost two years. And on Microsoft, and Google, and Dropbox, and Tumblr, etc etc....

    iCloud: 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    MS and Google:

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    ...the only difference was Apple doing the iCloud-photo scan on device prior to syncing the upload to the iCloud server. This actually increased privacy, because it wouldn't record any hits unless the threshold for child porn was reached. As it stands with server-side scans, each hit can be logged.

    There is no functional difference -- if governments try to mandate other types of scans nothing about it being server-side makes that harder. Arguably, it only makes it easier.
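    To make the threshold point concrete, here's a rough Swift sketch of the difference. Everything in it is a placeholder for illustration only: the hash function, the known-hash set and the threshold value are hypothetical stand-ins, not Apple's actual NeuralHash, private-set-intersection or safety-voucher protocol (which keeps individual matches cryptographically unreadable until the threshold is crossed).

    // Illustrative sketch only. perceptualHash(of:), knownHashes and
    // reportThreshold are hypothetical placeholders, not Apple's real design.
    import Foundation

    struct OnDeviceScanner {
        let knownHashes: Set<String>   // hashes of known CSAM (placeholder)
        let reportThreshold: Int       // Apple publicly mentioned roughly 30
        private(set) var pendingMatches: [String] = []

        // Stand-in for a perceptual hash such as NeuralHash or PhotoDNA.
        func perceptualHash(of image: Data) -> String {
            String(image.hashValue)    // NOT a real perceptual hash
        }

        mutating func scanBeforeUpload(_ image: Data) {
            let h = perceptualHash(of: image)
            guard knownHashes.contains(h) else { return }
            // In Apple's design this would be an encrypted "safety voucher";
            // individual matches are never reported or readable on their own.
            pendingMatches.append(h)
            if pendingMatches.count >= reportThreshold {
                flagForHumanReview(pendingMatches)
            }
        }

        func flagForHumanReview(_ matches: [String]) {
            // Placeholder for the human-review pipeline.
            print("Threshold reached: \(matches.count) matches flagged for review")
        }
    }

    A server-side scan, by contrast, sees every image in the clear and can log each individual hit the moment it happens; nothing forces it to wait for a threshold.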

    edited October 2021
  • Reply 9 of 21
    This was never about CSAM at all. If Apple really wanted to solve the problem of sexual images of underage children, they could have just made the signature-adding process open to the public.

    But of course they didn't. It was never the goal. They just wanted to hand more power over to government agencies.
  • Reply 10 of 21
    gatorguy Posts: 24,176 member
    neilm said:
    Prediction: the CSAM horse is dead as a Monty Python parrot — no need to keep beating it. Yes, we know Apple says it’s only resting, and if we believe that, well…
    Not so. CSAM scanning for child porn is still happening on iCloud as it has for almost two years. And on Microsoft, and Google, and Dropbox, and Tumblr, etc etc....

    iCloud: 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    MS and Google:

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    ...the only difference was Apple doing the iCloud-photo scan on device prior to syncing the upload to the iCloud server. This actually increased privacy, because it wouldn't record any hits unless the threshold for child porn was reached. As it stands with server-side scans, each hit can be logged.

    There is no functional difference -- if governments try to mandate other types of scans nothing about it being server-side makes that harder. Arguably, it only makes it easier.

    This EFF article from several weeks ago is making more sense, now that the EU's encryption recommendations have come to light, than many AI members gave it credit for when first published. IMO it deserves another reading.

    https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
    edited October 2021
  • Reply 11 of 21
    How Apple didn’t see these privacy and security threats to their users and to society is mind boggling. 
    Plenty of people on Apple Insider didn't see this as a privacy or security threat. 

    Absolutely mind boggling. 
  • Reply 12 of 21
    zimmie said:
    elijahg said:
    "Confusion" was not the issue. Apple of course would say it is because to do otherwise would be an admission that the feature was toxic and entirely contradictory to their public privacy stance. Privacy organisations and governments weren't "confused", they could foresee the potential privacy consequences. Apple knows full well the pushback was due to their public "what's happens on your phone stays on your phone" stance, the polar opposite to scanning phones for CSAM - and the potential for further encroachment on privacy.
    Confusion absolutely is the issue. Today, Apple employees have the technical capability to view images you upload to iCloud. With the CSAM scanning, they would no longer have the technical capability to do so. That would be a substantial improvement to the current situation.

    The EU's plan goes further in that it also looks for organized crime and terrorist activity.
    I hate to be the one to say "I told you so", but I bloody well told you so.

    When the capability exists, overreaching governments will exploit it.
    Spotlight literally exists to index all content on the phone. That capability has existed for over a decade. Why haven't we seen any governmental attempt to force reporting if that index shows any document or message containing "bomb" and "president" or "minister", for example?



    I still think they should carve iCloud photo sync off into a separate tool, and make it clear the CSAM scanning is only part of that tool. If you want no CSAM scanning on your device, just delete the ability to sync photos to iCloud. Done. Solves everybody's complaints.
    It does not solve the complaints of those who still want to use iCloud.
  • Reply 13 of 21
    Has anyone seen or heard of anything in the iOS 15 code that references CSAM or this technology? Such that during a minor update (15.0.? or 15.?) it could be activated because it was already in the system.
  • Reply 14 of 21
    crowley Posts: 10,453 member
    Has this added anything to the conversation?  Seems like the same points again and again, and they’re no more convincing now.
  • Reply 15 of 21
    mcdave Posts: 1,927 member
    Nonsense. If governments can insist that Apple scan on-device libraries, they're already insisting that Google & Facebook scan cloud libraries.

    Also, I want implicit content rating ASAP. Technology exposes our kids to all manner of information, and I'd like potentially objectionable material flagged, preferably without homosexuality straw-man excuses; we deal with these things as a family.
  • Reply 16 of 21
    mcdave Posts: 1,927 member
    gatorguy said:
    Expanding on the article just a bit:

    Documents released by the European Union and a meeting with E.U. officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse but also for signs of organized crime and indications of terrorist ties.

    A proposal to allow the photo scanning in the European Union could come as soon as this year, the researchers believe.

    ... now the EU knows Apple possesses this capability – it might simply pass a law requiring the iPhone maker to expand the scope of its scanning. Why reinvent the wheel when a few strokes of a pen can get the job done in 27 countries?

    “It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens,” the researchers wrote ...

    “Expansion of the surveillance powers of the state really is passing a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group […]

    “It’s allowing scanning of a personal private device without any probable cause for anything illegitimate being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”

    For those who want more details on what the EU may propose, read this report commissioned by the Council of the European Union and the recommended Council Resolution on Encryption: https://data.consilium.europa.eu/doc/document/ST-13084-2020-REV-1/en/pdf

    CSAM scanning as proposed by Apple is a horrible idea IMO.

    Anyone saying "oh it's no different than Google does in the cloud" or "anyone that doesn't want it just don't use the Photos app" needs to read this.
    There is literally no difference in exposure to legislation. If legislation can be passed, the platform/implementation is irrelevant.

    The one key difference is that server-side scanning could release the precursory hints (any hash hits at all) and associated data, whereas the iPhone would release nothing until multiple hits had been confirmed.
  • Reply 17 of 21
    tylersdad said:
    How Apple didn’t see these privacy and security threats to their users and to society is mind boggling. 
    Plenty of people on Apple Insider didn't see this as a privacy or security threat. 

    Absolutely mind boggling. 
    I believe Apple wanted to do it on device because they intended to have images on their servers fully encrypted - which would make scanning there impossible.

    That way, even if someone could break into Sally's iCloud storage, he couldn't view images Sally sent in confidence to her boyfriend.
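    For what it's worth, here's the architecture that theory implies, as a purely hypothetical Swift sketch (Apple never announced end-to-end encrypted iCloud Photos, and the function names are made up): the match has to happen on the device while the bytes are still readable, and only ciphertext the server can't inspect ever leaves the phone.

    // Hypothetical flow only; Apple never confirmed pairing CSAM scanning
    // with end-to-end encrypted photo storage.
    import Foundation
    import CryptoKit

    func uploadPhoto(_ plaintext: Data, userKey: SymmetricKey) throws -> Data {
        // 1. Any matching must happen here, on-device, against readable bytes.
        scanAgainstKnownHashes(plaintext)      // placeholder for the CSAM check

        // 2. Encrypt with a key held only by the user's devices...
        let sealed = try AES.GCM.seal(plaintext, using: userKey)

        // 3. ...so the blob stored on the server can't be scanned or viewed
        //    there, even by someone who breaks into the account.
        return sealed.combined!
    }

    func scanAgainstKnownHashes(_ data: Data) {
        // Placeholder: see the on-device threshold sketch earlier in the thread.
    }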
  • Reply 18 of 21
    gatorguy Posts: 24,176 member
    tylersdad said:
    How Apple didn’t see these privacy and security threats to their users and to society is mind boggling. 
    Plenty of people on Apple Insider didn't see this as a privacy or security threat. 

    Absolutely mind boggling. 
    I believe Apple wanted to do it on device because they intended to have images on their servers fully encrypted - which would make scanning there impossible.

    That way, even if someone could break into Sally's iCloud storage, he couldn't view images Sally sent in confidence to her boyfriend.
    You're not the only one who wants to believe that. Has Apple ever hinted they planned to do so? Nope. 
  • Reply 19 of 21
    davidw Posts: 2,036 member
    tylersdad said:
    How Apple didn’t see these privacy and security threats to their users and to society is mind boggling. 
    Plenty of people on Apple Insider didn't see this as a privacy or security threat. 

    Absolutely mind boggling. 
    I believe Apple wanted to do it on device because they intended to have images on their servers fully encrypted - which would make scanning there impossible.

    That way, even if someone could break into Sally's iCloud storage, he couldn't view images Sally sent in confidence to her boyfriend.
    Are you talking about this Apple, the same Apple that was planning to encrypt iPhone backups in iCloud?

    https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT

    Don't hold your breath. 
  • Reply 20 of 21
Ironic, because this is exactly the type of software the public or citizenry need to mandate on government officials' mobile phones and electronic devices, and by extension on the top 1% or elite that by no coincidence run in the same circles. This is where the head of the snake resides, and it is the base of operations for big money and the global industrial networks of child sex rings, child sex slavery and mega kiddy porn. The psychopath elite have been essentially immune to law and justice as it applies to the peasants. These monsters want to scan your devices for their own protection. Think about it: if you somehow end up with an incriminating image of one of the elite in the act of a crime, they're giving themselves the permission and power to essentially remotely scan your device, then identify and disappear potentially incriminating images or legal evidence. And think about transnational organ harvesting operations, and the type of people and big money involved there. This CSS so-called safety feature is total BS as it's being sold to the general public. CSS is actually about protecting the elites' interests, entitlement, and power to continue doing as they please.