Fed expansion of Apple's CSAM system barred by 4th Amendment, Corellium exec says

Government abuse of Apple's CSAM iCloud detection system in the U.S., such as expanding it to scan for terrorism-related material and similar subjects, would be prevented by the Fourth Amendment, according to security firm Corellium's chief operating officer.

Credit: Apple


In a Twitter thread Monday, Corellium COO and security specialist Matt Tait detailed why the government couldn't simply modify the database maintained by the National Center for Missing & Exploited Children (NCMEC) to find non-CSAM images in Apple's cloud storage. For one, Tait pointed out that NCMEC is not part of the government. Instead, it's a private nonprofit entity with special legal privileges to receive CSAM tips.

Because of that, authorities like the Justice Department can't directly order NCMEC to do something outside of its scope. The department might be able to compel it through the courts, but NCMEC is not within its chain of command. Even if the Justice Department "asks nicely," NCMEC has several reasons to say no.

To illustrate, Tait walks through a specific scenario in which the DOJ gets NCMEC to add a hash of a classified document to its database.

In this scenario, perhaps someone has this photo on their phone and uploaded it to iCloud so it got scanned, and triggered a hit. First, in Apple's protocol, that single hit is not enough to identify that the person has the document, until the preconfigured threshold is reached.

-- Pwn All The Things (@pwnallthethings)


Tait also points out that a single non-CSAM image wouldn't be enough to trip the system. Even if those barriers were somehow surmounted, it's likely that Apple would drop the NCMEC database if it knew the organization wasn't operating honestly. Technology companies have a legal obligation to report CSAM, but not to scan for it.
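
To make the threshold point concrete, here is a minimal sketch of threshold-gated hash matching. It is a deliberately simplified model, not Apple's implementation: the real system uses a perceptual NeuralHash plus cryptographic threshold secret sharing, so Apple learns nothing about individual matches until the threshold is crossed. The database contents, threshold value, and names below are hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Simplification: a cryptographic hash of the raw bytes. Apple's system
    # uses a perceptual hash (NeuralHash) that survives resizing/re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical known-image database and threshold, purely for illustration.
KNOWN_IMAGES = [b"known-image-1", b"known-image-2", b"known-image-3"]
DATABASE_HASHES = {image_hash(img) for img in KNOWN_IMAGES}
MATCH_THRESHOLD = 2

def account_flagged(uploaded_images: list) -> bool:
    """Flag an account only once matches reach the threshold."""
    matches = sum(1 for img in uploaded_images
                  if image_hash(img) in DATABASE_HASHES)
    return matches >= MATCH_THRESHOLD

# The property Tait describes: one stray hit (say, a single planted
# document hash) identifies nothing by itself.
assert not account_flagged([b"known-image-1", b"vacation-photo"])
assert account_flagged([b"known-image-1", b"known-image-2"])
```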

So thanks to DOJ asking and NCMEC saying "yes", Apple has dropped CSAM scanning entirely, and neither NCMEC nor DOJ actually got a hit. Moreover, NCMEC is now ruined: nobody in tech will use their database.

So the long story short is DOJ can't ask NCMEC politely and get to a yes

-- Pwn All The Things (@pwnallthethings)


Whether the government could compel NCMEC to add a hash for non-CSAM images is also a tricky question. According to Tait, the Fourth Amendment probably prohibits it.

NCMEC is not actually an investigative body, and there are blocks between it and government agencies. When it receives a tip, it passes the information along to law enforcement. To actually prosecute a known CSAM offender, law enforcement must gather its own evidence, typically via a warrant.

Although courts have wrestled with the question, CSAM scanning by a tech company is likely compliant with the Fourth Amendment because companies do it voluntarily. If the search isn't voluntary, it becomes a "deputized search" that violates the Fourth Amendment unless a warrant is obtained.

For the conspiracy to work, it'd need Apple, NCMEC and DOJ working together to pull it off *voluntarily* and it to never leak. If that's your threat model, OK, but that's a huge conspiracy with enormous risk to all participants, two of which existentially.

-- Pwn All The Things (@pwnallthethings)


Apple's CSAM detection mechanism has caused a stir since its announcement, drawing criticism from security and privacy experts. The Cupertino tech giant maintains, however, that it won't allow the system to be used to scan for anything other than CSAM.

Read on AppleInsider

Comments

  • Reply 1 of 30
Yeah… No. It’s happened many times in the past that companies were compelled to do something and placed under gag orders, so it would be a serious crime for them to leak that order. Not saying I have a strong opinion on this new feature one way or the other, but I’m not buying this argument.
  • Reply 2 of 30
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#
  • Reply 3 of 30
bluefire1 Posts: 1,302member
For a company like Apple, which is so fervent about privacy rights, it’s both disconcerting and alarming that they’d carve out any exception, regardless of how righteous or noble the cause.
edited August 2021
  • Reply 4 of 30
Didn’t the NSA violate the 4th amendment when it was caught spying on American citizens who were not suspected of terrorism?
  • Reply 5 of 30
jungmark Posts: 6,926member
Just because the govt could force an update to the NCMEC database doesn’t mean Apple would push the updated data out in an iOS update.
  • Reply 6 of 30
    wwinter86 said:
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#
Replace pedophiles with terrorists and you have practically the same argument Apple fought against when the FBI wanted Apple to install a back door on its products.

The problem I see is the size of the net being cast to catch relatively few pedophiles or other criminals. They will hide how they commit their crimes and will not use iCloud or even the Photos app to store incriminating evidence; they use WhatsApp and other third-party apps instead. Meanwhile, there is the risk of mislabeling people with innocent pictures as suspects, and that’s not Apple’s job. Just like it’s not Apple’s job to police whether someone stole your device and wants service: as long as FMI is off, they don’t care and don’t want to get involved. Now, with this, they are.

Law enforcement should be the ones responsible for this, not manufacturers of electronic devices. Apple is not the police.


  • Reply 7 of 30
darkvader Posts: 1,146member
    wwinter86 said:
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#

It's not shocking at all that "but think of the children" types don't understand just how dangerous something like this is for everyone. Sad, but not shocking.
  • Reply 8 of 30
mcdave Posts: 1,927member
Legislation “prevents” nothing; it simply allows legal recourse once it’s all too late, if you have the resources.
  • Reply 9 of 30
davidw Posts: 2,053member
    wwinter86 said:
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#
No one here is keen on protecting the "rights" of pedophiles. They are keen on protecting people's right not to have their data and devices searched for a crime they are not even suspected of committing, and by an entity that is not law enforcement. Some people seem to think that makes it OK, because the Constitution only limits unreasonable searches by the government and its agencies.

Applying your illogical thinking, everyone who is not willing to go along with having their data and devices searched, for a crime they have not committed, never will commit, and are not suspected of committing, must be keen on supporting the "rights" of pedophiles. Your type of thinking is much more dangerous to a free society than that of a pedophile, because there seem to be a lot more people like you than there are pedophiles.

    "Suspicionless surveillance does not become okay simply because it's only victimizing 95% of the world instead of 100%." 

    "Under observation, we act less free, which means we effectively are less free."

    "It may be that by watching everywhere we go, by watching everything we do, by analyzing every word we say, by waiting and passing judgment over every association we make and every person we love, that we could uncover a terrorist plot, or we could discover more criminals. But is that the kind of society we want to live in?"

     Edward Snowden  
  • Reply 10 of 30
hexclock Posts: 1,254member
    Yup, and governors aren’t supposed to be able to change election laws without the state legislature, but did anyway. 
  • Reply 11 of 30
    wwinter86 said:
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#
    “Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety.”

    Benjamin Franklin (1706-1790)

  • Reply 12 of 30
entropys Posts: 4,167member
The modern kiddies don’t care about liberty. They don’t understand what their grandparents’ siblings died for. They even believe government is benevolent. Let’s face it, most of the time it is.

    Until it isn’t. And then it’s too late.

  • Reply 13 of 30
blastdoor Posts: 3,293member
    I'd say China is the far greater worry. 

    If the Chinese government ordered Apple to implement this feature and keep it quiet then Apple would (1) implement this feature and (2) keep it quiet, because Apple needs China vastly more than China needs Apple. 

    Not to be too conspiratorial or anything, but suppose China did order Apple to implement this feature (not just for CSAM, but for everything of interest to CCP), and to keep it quiet. How could Apple alert users without running afoul of the Chinese government? 

    Maybe by doing exactly what they're doing. Announce the feature is being implemented only in the US and only for CSAM, and then let people connect the dots. 
  • Reply 14 of 30
entropys Posts: 4,167member
In a Twitter thread Monday, Corellium COO and security specialist Matt Tait detailed why the government couldn't simply modify the database maintained by the National Center for Missing & Exploited Children (NCMEC) to find non-CSAM images in Apple's cloud storage. For one, Tait pointed out that NCMEC is not part of the government. Instead, it's a private nonprofit entity with special legal privileges to receive CSAM tips.

I don’t recall anyone arguing that, though. There is no need for that. The concern is more that a government actor could force Apple to point the OS feature that uses the hash at other databases, for other purposes, to identify people it doesn’t like. Not CSAM. Whoever the evil bastard was who came up with “won’t someone think of the children” as the McGuffin for this outrageous bit of overreach deserves a medal. A Big Brother medal.
edited August 2021
  • Reply 15 of 30
Beats Posts: 3,073member
Cops aren’t allowed to kill civilians while under arrest. So it will never happen.
  • Reply 16 of 30
chadbag Posts: 2,000member
This guy obviously doesn't understand the issue at hand. It is not that the non-governmental anti-child-abuse org would be compelled to add non-abuse, anti-government stuff to the database. It is that Apple is installing the plumbing so that in the future a totally different database could be used, one not connected with the anti-child-abuse one. And the multi-hit thing is a red herring. It is a policy. One that can be changed tomorrow.
  • Reply 17 of 30
It is relatively easy to conclusively identify pedophile content, compared to identifying terrorist content.
If Apple could detect terrorist content conclusively, I am sure it would be kept off of iCloud as well.

    Wanting to keep identifiably illegal content off of a storage service is the right thing to do.
    I hope other storage services follow suit.
  • Reply 18 of 30
chadbag Posts: 2,000member
It is relatively easy to conclusively identify pedophile content, compared to identifying terrorist content.
If Apple could detect terrorist content conclusively, I am sure it would be kept off of iCloud as well.

    Wanting to keep identifiably illegal content off of a storage service is the right thing to do.
    I hope other storage services follow suit.
I don't think you understand the issue. On its face, very few people would disagree with you. The problem is the method they are using to do it. They are hijacking a user's phone to do the scanning for them through a "back door" they are adding. One that can easily be abused at a later date.
  • Reply 19 of 30
jdw Posts: 1,338member
    wwinter86 said:
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#
Even more shocking are people such as yourself who have the sheer audacity to point such a harsh finger of accusation at those resistant to Apple's plan, as if people opposed to it are somehow enablers of pedophiles. That is absolutely outrageous and utterly unacceptable.
  • Reply 20 of 30
    blastdoor said:
    I'd say China is the far greater worry. 

    If the Chinese government ordered Apple to implement this feature and keep it quiet then Apple would (1) implement this feature and (2) keep it quiet, because Apple needs China vastly more than China needs Apple. 

    Not to be too conspiratorial or anything, but suppose China did order Apple to implement this feature (not just for CSAM, but for everything of interest to CCP), and to keep it quiet. How could Apple alert users without running afoul of the Chinese government? 

    Maybe by doing exactly what they're doing. Announce the feature is being implemented only in the US and only for CSAM, and then let people connect the dots. 
As a foreigner living in China, I believe this would be the case. Every foreigner in China would be in danger.