Researchers who built rudimentary CSAM system say Apple's is a danger

A pair of Princeton researchers claim that Apple's CSAM detection system is dangerous, pointing to a similar system they built and later warned against, but the two systems are far from identical.

Credit: Apple


Jonathan Mayer and Anunay Kulshrestha recently penned an op-ed for the Washington Post detailing some of their concerns about Apple's CSAM detection system. The security researchers wrote that they had built a similar system two years prior, only to warn against the technology after realizing its security and privacy pitfalls.

For example, Mayer and Kulshrestha claim that the service could be easily repurposed for surveillance by authoritarian governments. They also suggest that the system could suffer from false positives, and could be vulnerable to bad actors subjecting innocent users to false flags.

However, the system that the two security researchers developed doesn't contain the same privacy or security safeguards that Apple baked into its own CSAM-detecting technology.

Additionally, concerns that a government could easily swap in a database full of dissident material don't hold up. Apple's database contains only hashes sourced from at least two child safety organizations in completely different jurisdictions, which protects against any single government corrupting a child safety organization.
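
For illustration only, that intersection requirement amounts to something like the sketch below: only hashes vouched for by both providers would ever ship to devices. The provider names and hash strings are invented.

```swift
// Illustrative sketch only: Apple describes the shipped database as the
// intersection of hash lists from at least two child safety organizations
// in different jurisdictions. Provider names and hash strings are invented.
let hashesFromProviderA: Set<String> = ["a1b2c3", "d4e5f6", "0a0b0c"] // hypothetical org, jurisdiction 1
let hashesFromProviderB: Set<String> = ["d4e5f6", "0a0b0c", "ffee99"] // hypothetical org, jurisdiction 2

// Only entries vouched for by *both* organizations make it into the on-device
// database, so no single government (or single corrupted organization) can
// unilaterally insert a hash.
let shippedDatabase = hashesFromProviderA.intersection(hashesFromProviderB)

print(shippedDatabase.sorted()) // ["0a0b0c", "d4e5f6"]
```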

Such a scheme would also require Apple's human reviewers to be collaborators. Apple says flagged accounts are checked for false positives by human reviewers, and if its team found accounts flagged for non-CSAM material, the company says it would suspect something was amiss and stop sourcing the database from those organizations.

False positives are also incredibly rare in Apple's system. The company says there's a one-in-a-trillion chance per year of incorrectly flagging a given account. And even if an account were falsely flagged, the presence of CSAM must be confirmed by human review before any report is sent to child safety organizations.
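
To get a feel for why a match threshold (reportedly around 30 matching images, discussed below) drives the account-level error rate so far down, here is a back-of-the-envelope calculation. Every number in it is assumed for illustration; none are Apple's internal figures.

```swift
// Back-of-the-envelope sketch of why a match threshold pushes the account-level
// false-flag rate far below the per-image rate. All numbers here are assumed
// for illustration only; they are not Apple's internal parameters.
import Foundation

let perImageFalseMatchRate = 1.0 / 1_000_000 // assumed false-match probability per photo
let photosPerYear = 100_000.0                // assumed uploads per account per year
let threshold = 30                           // match threshold figure cited by Federighi

// Expected false matches per account per year (Poisson approximation of the binomial).
let lambda = perImageFalseMatchRate * photosPerYear

// P(X >= threshold) for X ~ Poisson(mean), summed in log space to avoid underflow.
func poissonUpperTail(atLeast t: Int, mean: Double) -> Double {
    var logFactorial = (1..<t).reduce(0.0) { $0 + log(Double($1)) } // log((t-1)!)
    var tail = 0.0
    for k in t...(t + 200) {           // terms shrink so fast that 200 extra terms is plenty
        logFactorial += log(Double(k)) // now log(k!)
        tail += exp(-mean + Double(k) * log(mean) - logFactorial)
    }
    return tail
}

let accountFalseFlagProbability = poissonUpperTail(atLeast: threshold, mean: lambda)
print(accountFalseFlagProbability) // vanishingly small with these assumptions, far below one in a trillion
```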

There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless users save CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, there's little chance of such a scheme working.

Additionally, users will be able to verify that the database of known CSAM hashes stored locally on their devices matches the one maintained by Apple. The database also can't be targeted at specific individuals; the same database applies to all users in a given country.
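
In practice, that kind of verification boils down to comparing a fingerprint of the local database against a value Apple publishes. Here's a minimal sketch of the idea; the file path and the "published" hash are placeholders, not real values.

```swift
// Conceptual sketch of the kind of check the article describes: the root hash
// of the on-device hash database can be compared against a value Apple
// publishes. The file path and "published" hash below are placeholders.
import Foundation
import CryptoKit

let databaseURL = URL(fileURLWithPath: "/path/to/on-device-hash-database.bin") // placeholder path
guard let databaseBlob = try? Data(contentsOf: databaseURL) else {
    fatalError("Could not read the local database blob")
}

// Hex-encoded SHA-256 of the entire blob.
let localRootHash = SHA256.hash(data: databaseBlob)
    .map { String(format: "%02x", $0) }
    .joined()

let publishedRootHash = "<root hash from Apple's support article>" // placeholder

print(localRootHash == publishedRootHash
      ? "Local database matches the published root hash"
      : "Local database does NOT match the published root hash")
```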

Apple's system also relies on client-side matching and on-device intelligence. There's a match threshold as well: an account has to cross that threshold, reportedly around 30 matching images, before it's flagged at all.
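
Stripped of the cryptography (Apple's published design uses private set intersection and threshold secret sharing, so neither the device nor the server learns individual match results), the threshold logic amounts to something like this sketch, with invented hash values and sample data.

```swift
// Deliberately simplified illustration of the threshold logic only; it is not
// how Apple's protocol is implemented. It just shows why a handful of matches,
// including accidental false positives, never flags an account on its own.
// Hash strings and sample data are invented.

let knownHashDatabase: Set<String> = ["d4e5f6", "0a0b0c"] // placeholder contents
let reportedThreshold = 30                                // figure cited by Federighi

func matchCount(of uploadedPhotoHashes: [String]) -> Int {
    uploadedPhotoHashes.filter { knownHashDatabase.contains($0) }.count
}

func crossesThreshold(_ uploadedPhotoHashes: [String]) -> Bool {
    // Only an account whose uploads contain a whole collection of known
    // material reaches the threshold and gets escalated to human review.
    matchCount(of: uploadedPhotoHashes) >= reportedThreshold
}

let sampleUpload = ["123abc", "d4e5f6", "777eee"] // one accidental match
print(crossesThreshold(sampleUpload)) // false: a single match is nowhere near the threshold
```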

In other words, the two security researchers built a system that only vaguely resembles Apple's. Although there might be valid arguments to be made about "mission creep" and privacy, Apple's system was built from the ground up to address at least some of those concerns.

Read on AppleInsider

Comments

  • Reply 1 of 46
    DAalseth Posts: 2,783 member
    LISTEN TO THE EXPERTS. 
    They have been down this road.
    They know what they are talking about.
    They abandoned this line of development because they saw what a massively bad idea it was. 
    It can and will be used by governments to crack down on dissent. It’s not an if but a when. It will produce false positives, it’s not an if but a when. Apple’s privacy safeguards are a fig-leaf that will be ripped off by the first government that wants to. Worst of all it will destroy the reputation Apple has crafted over the last twenty years of being on the individual user’s side when it comes to privacy and security. Once they lose that, in the minds of a huge number of consumers they will then be no better than Google. 
  • Reply 2 of 46
    DAalseth said: It can and will be used by governments to crack down on dissent. It’s not an if but a when. It will produce false positives, it’s not an if but a when. Apple’s privacy safeguards are a fig-leaf that will be ripped off by the first government that wants to.
    Governments that want to crack down on dissent will still do so regardless of whether Apple has CSAM hash scanning or not. They have the money to hire programmers just like Apple does. 
  • Reply 3 of 46
    rcfa Posts: 1,124 member
    The silly exculpatory listing of differences in the systems is useless.

    Did Apple leave the Russian market when Russia demanded the installation of Russian government approved apps? Did Apple leave the Russian and Chinese markets, when Russia and China demanded that iCloud servers be located in their countries where government has physical access? Did Apple leave the Chinese market, when VPN apps were requested to be removed from the Chinese AppStore? Did Apple comply when Russia demanded that Telegram be removed from the Russian AppStore? Did Apple leave the UAE when VoIP apps were outlawed there?

    NO, NO, NO, NO, NO, and NO!

    And NO will be the answer if these countries require additional databases, direct notification (instead of Apple reviewing the cases), etc.

    Once this is baked into the OS, Apple has no leg to stand on when “lawful” requests from governments start coming.
  • Reply 4 of 46
    DAalseth said: It can and will be used by governments to crack down on dissent. It’s not an if but a when. It will produce false positives, it’s not an if but a when. Apple’s privacy safeguards are a fig-leaf that will be ripped off by the first government that wants to.
    Governments that want to crack down on dissent will still do so regardless of whether Apple has CSAM hash scanning or not. They have the money to hire programmers just like Apple does. 
    But they don't have the power to put any of that programming work on EVERY IPHONE IN THEIR COUNTRY.

    You can't tell me that the hash for this image won't be added to the database that is used for phones in China?

  • Reply 5 of 46
    DAalseth Posts: 2,783 member
    DAalseth said: It can and will be used by governments to crack down on dissent. It’s not an if but a when. It will produce false positives, it’s not an if but a when. Apple’s privacy safeguards are a fig-leaf that will be ripped off by the first government that wants to.
    Governments that want to crack down on dissent will still do so regardless of whether Apple has CSAM hash scanning or not. They have the money to hire programmers just like Apple does. 
    There’s no reason to give them the keys to the castle. 
  • Reply 6 of 46
    macplusplus Posts: 2,112 member


    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.

    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
  • Reply 7 of 46
    crowley Posts: 10,453 member


    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
    You think someone hates you so much that they will find and distribute 30 counts of child pornography that they somehow know is on the NCMEC list to you in the hope that you have WhatsApp automatically save images to Photos and you have iCloud Photos switched on?

    All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you?  And if you're smart you'll have reported in advance anyway.

    Be careful you don't strain yourself with those contortions.
  • Reply 8 of 46
    Boring though it may be

    It is indeed gratifying that when there is much dissent over what at least on the surface appears to be a genuine attempt to solve a problem (caused by Apple's highly secure walled system), they may have “marked the right cards”

    The downside of having a very secure system is that it attracts all us undesirables. I guess we are all on notice.

    I guess I have been found out and must remove all my dodgy photos (or at least stop putting them on iCloud)




  • Reply 9 of 46
    rcfa said:
    The silly exculpatory listing of differences in the systems is useless.

    Did Apple leave the Russian market when Russia demanded the installation of Russian government approved apps? Did Apple leave the Russian and Chinese markets, when Russia and China demanded that iCloud servers be located in their countries where government has physical access? Did Apple leave the Chinese market, when VPN apps were requested to be removed from the Chinese AppStore? Did Apple comply when Russia demanded that Telegram be removed from the Russian AppStore? Did Apple leave the UAE when VoIP apps were outlawed there?

    NO, NO, NO, NO, NO, and NO!

    And NO will be the answer if these countries require additional databases, direct notification (instead of Apple reviewing the cases), etc.

    Once this is baked into the OS, Apple has no leg to stand on, once “lawful” requests from governments are coming.

    Apple had to do something, period. CSAM is being used as a bludgeon. Payment systems, banks, etc. are refusing to accept payments and the like if companies don't "do" something. So take your fight to VISA and Bank of America etc. I strongly prefer that privacy be a very important right, but that is not the society we live in (see below). Side item: just stop with the Russian and Chinese nonsense; how can someone actually think the massive, mega US "intelligence" and law enforcement agencies won't do similar?

    So leave Apple (if you are on iOS) and go to Android. Governments then won't need to surveil your phone, they'll just open the door at Google (or Facebook) with the backdoor key they have.
    If data privacy is important to you then take the fight up a couple dozen notches. Your fight targets are listed in the sentence above (and with hundreds of other companies not named Apple). 
  • Reply 10 of 46
    I believe Apple has a hidden agenda for doing this and it has nothing to do with protecting children or child trafficking. For all the bad press and blowback from the public, and the fact that this seems to contradict their own security ideals, there is more to this than meets the eye! 
  • Reply 11 of 46
    crowley said:


    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
    You think someone hates you so much that they will find and distribute 30 counts of child pornography that they somehow know is on the NCMEC list to you in the hope that you have WhatsApp automatically save images to Photos and you have iCloud Photos switched on?

    All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you?  And if you're smart you'll have reported in advance anyway.

    Be careful you don't strain yourself with those contortions.
    Well, Sherlock, be careful not to report your very kind self while trying to report unsolicited CSAM messages. Reporting doesn't save your ass, because you already possess them, since you kept them as evidence. Otherwise what would you report? You save those photos as "evidence" and you expect the cops to buy that! Your only bet is to un-possess them, i.e. delete them ASAP.
  • Reply 12 of 46
    StrangeDays Posts: 12,834 member
    crowley said:

    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
    You think someone hates you so much that they will find and distribute 30 counts of child pornography that they somehow know is on the NCMEC list to you in the hope that you have WhatsApp automatically save images to Photos and you have iCloud Photos switched on?

    All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you?  And if you're smart you'll have reported in advance anyway.

    Be careful you don't strain yourself with those contortions.
    Well, Sherlock, be careful not to report your very kind self while trying to report unsollicited CSAM messages. Reporting doesn't save yor ass, because you already possess them since you guarded them as evidence. Otherwise what would you report? You save those photos as "evidence" and you expect the cops buy that ! Your only bet is to un-possess them i.e. delete them ASAP.
    Your doom scenarios are as stupid as the “Hey, you!” FaceID thefts and the TouchID finger-chopping-off muggings, neither of which ever happened. You guys need to write fiction. 

    Apple, Google, Microsoft, and Dropbox already do server-side CSAM scanning, so you’d already get flagged today if your phone was somehow duped into sending child pornography to your iCloud Photos. 

    You’re inventing self-victim fantasy. 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    edited August 2021
  • Reply 13 of 46
    crowley Posts: 10,453 member
    crowley said:


    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
    You think someone hates you so much that they will find and distribute 30 counts of child pornography that they somehow know is on the NCMEC list to you in the hope that you have WhatsApp automatically save images to Photos and you have iCloud Photos switched on?

    All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you?  And if you're smart you'll have reported in advance anyway.

    Be careful you don't strain yourself with those contortions.
    Well, Sherlock, be careful not to report your very kind self while trying to report unsollicited CSAM messages. Reporting doesn't save yor ass, because you already possess them since you guarded them as evidence. Otherwise what would you report? You save those photos as "evidence" and you expect the cops buy that ! Your only bet is to un-possess them i.e. delete them ASAP.
    So your argument is that if you tell the police that you've been receiving unsolicited offensive material in WhatsApp, they'll demand that you show them what you received, and if you show them then they'll arrest you?

    Do you ever leave the house with that kind of paranoia?
  • Reply 14 of 46
    crowley said:

    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
    You think someone hates you so much that they will find and distribute 30 counts of child pornography that they somehow know is on the NCMEC list to you in the hope that you have WhatsApp automatically save images to Photos and you have iCloud Photos switched on?

    All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you?  And if you're smart you'll have reported in advance anyway.

    Be careful you don't strain yourself with those contortions.
    Well, Sherlock, be careful not to report your very kind self while trying to report unsollicited CSAM messages. Reporting doesn't save yor ass, because you already possess them since you guarded them as evidence. Otherwise what would you report? You save those photos as "evidence" and you expect the cops buy that ! Your only bet is to un-possess them i.e. delete them ASAP.
    Your doom scenarios are as stupid as the “Hey, you!” FaceID thefts and the TouchID finger-chopping-off muggings, neither of which ever happened. You guys need to write fiction. 

    Apple, Google, Microsoft, and Dropbox already do server-side CSAM scanning, so you’d already get flagged today if your phone was somehow duped into sending child pornography to your iCloud Photos. 

    You’re inventing self-victim fantasy. 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    Apple has never admitted in its recent communications that it has CSAM-scanned photos on iCloud servers. Instead it has consistently denied it. Your only reference on server-side scanning is that Sophos article, which, in light of the latest statements from Apple, should be considered outdated at the very least. 

    "The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos."

    "This is an area we’ve been looking at for some time, including current state of the art techniques which mostly involves scanning through entire contents of users’ libraries on cloud services that — as you point out — isn’t something that we’ve ever done; to look through users’ iCloud Photos."

    https://techcrunch.com/2021/08/10/interview-apples-head-of-privacy-details-child-abuse-detection-and-messages-safety-features/

    Besides, many people don't have a coherent view of the operating system; they use their phones by trial and error. They don't know what resides where, and many don't even know what the cloud is. Forgetting that WhatsApp auto-saves photos to the photo library is not a doom scenario, it is a very common usage scenario. Go clean up your Downloads folder before fighting against "conspiracy theories".
  • Reply 15 of 46
    fastasleep Posts: 6,408 member
    I believe Apple has a hidden agenda for doing this and it has nothing to do with protecting children or child trafficking. For all the bad press and blowback from the public, and the fact that this seems to contradict their own security ideals, there is more to this than meets the eye! 
    Wow, sounds very scary and very made up!
  • Reply 16 of 46
    entropys Posts: 4,152 member


    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.

    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a blaze through WhatsApp. If WhatsApp's auto-save media option is active, until the user awakens and deletes them one by one, the 30 pictures may be already flagged as CSAM and uploaded to iCloud.
    Supporters of political dissidents could have any number of official photos of their heroes, memes, or iconic images that a government doesn’t like. 30 would be nothing. And the number would be as easy to change as the hash list.
    edited August 2021
  • Reply 17 of 46
    killroy Posts: 271 member
    DAalseth said:
    LISTEN TO THE EXPERTS. 
    They have been down this road.
    They know what they are talking about.
    They abandoned this line of development because they saw what a massively bad idea it was. 
    It can and will be used by governments to crack down on dissent. It’s not an if but a when. It will produce false positives, it’s not an if but a when. Apple’s privacy safeguards are a fig-leaf that will be ripped off by the first government that wants to. Worst of all it will destroy the reputation Apple has crafted over the last twenty years of being on the individual users side when it comes to privacy and security. Once they lose that, in the minds of a huge number of consumers they will then be no better than Google. 

    With NSO they don't need to use CSAM.
  • Reply 18 of 46
    crowley Posts: 10,453 member
    I believe Apple has a hidden agenda for doing this and it has nothing to do with protecting children or child trafficking. For all the bad press and blowback from the public, and the fact that this seems to contradict their own security ideals, there is more to this than meets the eye! 
    Sounds like a well-thought-out theory.  What's their hidden agenda?  Opening a backdoor to force more U2 albums on everyone?
  • Reply 19 of 46
    omasou Posts: 564 member
    Hummm, sounds like a veiled attempt by the researchers to gain notoriety for their previous work by bad-mouthing others.
  • Reply 20 of 46
    robaba Posts: 228 member
    rcfa said:
    The silly exculpatory listing of differences in the systems is useless.

    (1) Did Apple leave the Russian market when Russia demanded the installation of Russian government approved apps? (2) Did Apple leave the Russian and Chinese markets, when Russia and China demanded that iCloud servers be located in their countries where government has physical access? (3) Did Apple leave the Chinese market, when VPN apps were requested to be removed from the Chinese AppStore? (4) Did Apple comply when Russia demanded that Telegram be removed from the Russian AppStore? (5) Did Apple leave the UAE when VoIP apps were outlawed there?

    NO, NO, NO, NO, NO, and NO!

    And NO will be the answer if these countries require additional databases, direct notification (instead of Apple reviewing the cases), etc.

    Once this is baked into the OS, Apple has no leg to stand on, once “lawful” requests from governments are coming.
    1 - Apple did not end up preloading the software Russia demanded; it only allows users to selectively install those apps at setup if they choose to.
    2 - Apple is quickly moving to end-to-end encryption with an independent, third-party go-between, which would eliminate the threat of Chinese (or Russian, or UAE) access to encrypted files on servers.
    3 - The new security system will be a built-in VPN on steroids (end-to-end encryption, with an intermediate, independent third-party server shielding your identity from web hosts and sniffers, while preventing ISPs from knowing which sites you visit).
    4 - Don't know.
    5 - See 3.

    THIS IS WHY THEY ARE TAKING THE STEP TO SINGLE OUT CSAM NOW—SO THEY CAN STAMP IT OUT, WITHOUT PROVIDING A GATEWAY TO BAD ACTORS, STATE OR PRIVATE ENTERPRISE, WHILE ALLOWING AN UNPRECEDENTED LEVEL OF SECURITY / PRIVACY.