Fed expansion of Apple's CSAM system barred by 4th Amendment, Corellium exec says
In a Twitter thread on Monday, Corellium COO and security specialist Matt Tait detailed why the government couldn't simply modify the database maintained by the National Center for Missing & Exploited Children (NCMEC) to find non-CSAM images in Apple's cloud storage. For one, Tait pointed out that NCMEC is not part of the government. Instead, it's a private nonprofit with special legal privileges to receive CSAM tips.
Because of that, authorities like the Justice Department can't simply order NCMEC to do something outside its scope. The department might be able to compel NCMEC through the courts, but the organization is not within its chain of command. And even if the Justice Department "asks nicely," NCMEC has several reasons to say no.
Tait illustrates the point with a specific scenario: the DOJ persuades NCMEC to add a hash of a classified document to its database.
In this scenario, suppose someone has that document on their phone and uploads it to iCloud, where it gets scanned and triggers a hit. Under Apple's protocol, that single hit isn't enough to reveal that the person has the document; nothing is flagged until the preconfigured match threshold is reached.
Tait also points out that a single non-CSAM image wouldn't be enough to trip the system. Even if those barriers were somehow surmounted, Apple would likely drop the NCMEC database entirely if it learned the organization wasn't operating honestly. Technology companies have a legal obligation to report CSAM they find, but no obligation to scan for it.
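The threshold behavior Tait describes can be sketched in a few lines. This is a deliberately simplified toy, not Apple's actual design — the real system uses NeuralHash, private set intersection, and threshold secret sharing, so Apple's servers learn nothing at all about sub-threshold matches. The hash values and the helper names here are illustrative assumptions; the threshold of 30 is the initial value Apple cited in its public threat-model documentation.

```python
# Toy sketch of threshold-based matching (hypothetical simplification).
# Apple's real protocol is cryptographic: below the threshold, the server
# cannot even count matches, which this plain counter does not capture.

THRESHOLD = 30  # Apple's stated initial match threshold

def count_matches(upload_hashes, known_hashes):
    """Count how many uploaded image hashes appear in the known database."""
    return sum(1 for h in upload_hashes if h in known_hashes)

def should_flag(upload_hashes, known_hashes, threshold=THRESHOLD):
    """An account is only surfaced for review at or above the threshold.

    A single planted hash -- e.g. one classified document -- produces
    one hit, which reveals nothing on its own.
    """
    return count_matches(upload_hashes, known_hashes) >= threshold

# Example: one matching image out of many does not flag the account.
database = {"hash_a", "hash_b", "hash_c"}
uploads = ["hash_a", "hash_x", "hash_y"]
print(should_flag(uploads, database))  # False: 1 match < threshold of 30
```

The point of the design is visible even in the toy: a one-off hit, the kind a planted document hash would generate, sits silently below the threshold.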
"So thanks to DOJ asking and NCMEC saying 'yes,' Apple has dropped CSAM scanning entirely, and neither NCMEC nor DOJ actually got a hit. Moreover, NCMEC is now ruined: nobody in tech will use their database," Tait wrote. "So the long story short is DOJ can't ask NCMEC politely and get to a yes."
Whether or not the government could compel NCMEC to add a hash for non-CSAM images is also a tricky question. According to Tait, the Fourth Amendment probably prohibits it.
NCMEC is not actually an investigative body, and there are legal separations between it and government agencies. When it receives a tip, it passes the information along to law enforcement. To actually prosecute a CSAM offender, law enforcement must gather its own evidence, typically under a warrant.
Although courts have wrestled with the question, CSAM scanning as technology companies practice it today likely complies with the Fourth Amendment because the companies scan voluntarily. A compelled scan, by contrast, would be a "deputized search" and would violate the Fourth Amendment unless conducted under a warrant.
"For the conspiracy to work, it'd need Apple, NCMEC and DOJ working together to pull it off *voluntarily* and it to never leak," Tait wrote. "If that's your threat model, OK, but that's a huge conspiracy with enormous risk to all participants, two of which existentially."
Apple's CSAM detection mechanism has caused a stir since its announcement, drawing criticism from security and privacy experts. The Cupertino tech giant maintains that it won't allow the system to be used to scan for anything other than CSAM, however.
Read on AppleInsider