Apple's CSAM detection system may not be perfect, but it is inevitable

Posted in General Discussion, edited August 2022
Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) as part of a set of child safety features, some of which later shipped with iOS 15.2. Despite the system's imperfections, and Apple's silence about it since, the technology is inevitable.

Logos for Messages, Siri, and Photos


Apple announced the scanning technology on August 5, 2021, saying it would appear in iCloud Photos, iMessage, and Siri. The tools were designed to improve the safety of children on its platforms.

At the time, Apple said the tools would arrive in updates to watchOS, iOS, macOS, and iPadOS by the end of 2021. The company has since postponed the rollout, posting an update to its Child Safety page and removing any mention of CSAM detection in iCloud Photos.

And then the complaints started, many of them seemingly ignorant of the fact that Microsoft had been scanning uploaded files for about ten years, and Google for eight.

Apple, too, had already been doing so for a few years with a partial, server-side implementation that predates the August 2021 announcement. Its privacy policy from at least May 9, 2019, says that the company pre-screens or scans uploaded content for potentially illegal material, including child sexual abuse material. However, this appears to have been limited to iCloud Mail.

Likely in response to the massive blowback from customers and researchers, Apple said in September 2021 that it would take additional time to collect input and make improvements before releasing its child safety features for iCloud Photos. It kept the other initiatives going, following through with the Messages and Siri features.

Child safety on Apple platforms

In Messages, iOS warns children aged 13 to 17 who are included in an iCloud Family account when potentially sexually explicit content is detected in a received message. For example, if the system detects a nude image, it automatically blurs it and displays a popup with a safety message and an option to unblur the image.

For children under 13, iOS sends parents a notification if the child chooses to view the image. Teens between 13 and 17 can unblur the image without the device notifying their parents.
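To make that policy concrete, here is a minimal sketch of the decision logic described above, written in Python. The type and function names are illustrative assumptions, not Apple's implementation, which runs as part of on-device image analysis in Messages.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int                       # age of the child in the iCloud Family
    communication_safety_on: bool  # feature must be enabled for the family

def handle_flagged_image(account: ChildAccount, child_chose_to_view: bool) -> dict:
    """Illustrative policy: blur first, then decide whether parents are notified.

    This mirrors the behavior described in the article, not Apple's actual code.
    """
    if not account.communication_safety_on:
        return {"blurred": False, "notify_parents": False}

    result = {"blurred": True, "notify_parents": False}
    if child_chose_to_view and account.age < 13:
        # Only children under 13 trigger a parental notification.
        result["notify_parents"] = True
    return result

# Example: a 12-year-old taps through the warning, so parents are notified.
print(handle_flagged_image(ChildAccount(age=12, communication_safety_on=True), True))
```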

Child communication safety in the Messages app


Siri, along with the search bars in Safari and Spotlight, steps in next. It intervenes when a user of any age performs search queries related to CSAM. A popup warns that the material being searched for is illegal and provides resources to "learn more and get help." Siri can also direct people to file a report of suspected child abuse material.

Finally, iCloud Photos would also detect and report suspected CSAM. Apple's plan was to ship a database of image hashes of known abuse material for on-device matching. Because the hashes come from the National Center for Missing & Exploited Children (NCMEC), Apple platforms would only report child abuse material that has already been identified during law enforcement investigations.
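As a rough illustration of what on-device matching against a known-hash database involves, here is a minimal sketch in Python. It is a simplification under stated assumptions: it uses an ordinary SHA-256 over file bytes and a plain set of placeholder hashes, whereas Apple's design uses a perceptual NeuralHash and a blinded database combined with private set intersection, so individual match results are not revealed this directly.

```python
import hashlib
from pathlib import Path

# Placeholder set standing in for the NCMEC-derived hash database shipped with the OS.
KNOWN_CSAM_HASHES = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def image_fingerprint(path: Path) -> str:
    """Hash the raw file bytes. A production system would use a perceptual hash
    so that resized or re-encoded copies of the same image still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """True if this file's fingerprint appears in the known-hash set."""
    return image_fingerprint(path) in KNOWN_CSAM_HASHES
```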

Apple says a false positive match would be rare, putting the odds at one in a trillion for any given account. A human review team also makes the final call on whether to notify law enforcement, so the slope doesn't end immediately with a police report.

The slippery, yet bumpy, slope

The detection tools in iCloud Photos were the most controversial. As one example, an open letter signed by Edward Snowden and other high-profile figures raised concerns that certain groups could use the technology for surveillance. Democratic and authoritarian governments alike could pressure Apple to add hash databases for things other than CSAM, such as images of political dissidents.

Child safety feature in Siri


Indeed, the Electronic Frontier Foundation noted that it had already seen this in action, saying: "One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of 'terrorist' content that companies can contribute to and access for the purpose of banning such content."

The slippery slope does have bumps on it, however. In August 2021, Apple's privacy chief Erik Neuenschwander responded to concerns in an interview, saying that Apple put protections in place to prevent its technology from being used for content other than CSAM.

For example, the system initially applies only to Apple customers in the U.S., a country whose Fourth Amendment bars unreasonable search and seizure. And because the technology is built directly into the operating system, the same software has to ship to all users everywhere; it isn't possible for Apple to quietly limit updates to specific countries or individual users.

A certain threshold of content must also be met before the gears start turning. A single image of known CSAM won't trigger anything; instead, Apple's requirement is around 30 matching images.
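A toy sketch of that threshold behavior follows. It simply counts matches and only signals once roughly 30 accumulate; the real system replaces this counter with threshold secret sharing, so Apple's servers can decrypt nothing about an account's matches below the threshold. The class name and constant here are illustrative assumptions.

```python
MATCH_THRESHOLD = 30  # approximate figure cited by Apple

class AccountMatchState:
    """Toy counter standing in for Apple's threshold secret sharing scheme."""

    def __init__(self) -> None:
        self.match_count = 0

    def record_upload(self, is_known_match: bool) -> bool:
        """Returns True only once the account crosses the review threshold."""
        if is_known_match:
            self.match_count += 1
        # Below the threshold, nothing is reported and nothing is reviewable.
        return self.match_count >= MATCH_THRESHOLD

state = AccountMatchState()
flagged = any(state.record_upload(is_known_match=(i % 100 == 0))
              for i in range(5000))  # simulated stream of uploads
print("escalate to human review:", flagged)
```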

Apple published a document of frequently asked questions about the child safety features in August 2021. If a government tried to force Apple to add non-CSAM images to the hash list, the company says it would refuse the demand. The system is designed to be auditable, and it isn't possible for non-CSAM images to be "injected" into it.




Apple says it will also publish a Knowledge Base article with the root hash of the encrypted database. "[U]sers will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the Knowledge Base article," the company wrote.

Security researchers can also assess the accuracy of the database in their own reviews. If the hash of the database on an Apple device doesn't match the hash in the Knowledge Base article, people will know that something is wrong.
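A minimal sketch of that kind of audit check, assuming the published root hash is a plain SHA-256 over the on-device database file; the path and expected value are hypothetical placeholders, and Apple's actual database format and hashing scheme may differ.

```python
import hashlib
from pathlib import Path

# Hypothetical values for illustration; the real root hash would come from
# Apple's Knowledge Base article, and the real path from the OS.
EXPECTED_ROOT_HASH = "0" * 64
DEVICE_DATABASE_PATH = Path("/example/path/csam_hash_database.bin")

def database_matches_published_hash(path: Path, expected: str) -> bool:
    """Compare the hash of the on-device database against the published value."""
    actual = hashlib.sha256(path.read_bytes()).hexdigest()
    return actual == expected

# A False result would mean the database on the device is not the one Apple published.
```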

"And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the US," Neuenschwander said.

Apple is right to delay the feature and find ways to improve the accuracy of its system, if needed. Some companies that scan for this type of content make errors.

And one of those problems was highlighted recently, by a fairly monumental Google screw-up.

The problem with pre-crime

A major example of flawed scanning came to light on August 21, when The New York Times published a story about Google that highlighted the dangers of such surveillance systems.

A father in San Francisco took a picture of his toddler's genitals at his doctor's request due to a medical problem. He sent the image through the health care provider's telemedicine system, but his Android phone also automatically uploaded it to Google Photos, a setting the company enables by default.

Even though the image wasn't in any known-CSAM database at that point, Google flagged it as CSAM, reported it to law enforcement, and locked every one of the father's accounts associated with its products. Fortunately, police understood the nature of the images and didn't file charges, although Google never restored his account access.

Google Photos on the App Store


Google's detection system doesn't work exactly like Apple's technology. The company's support page mentions hash matching, such as "YouTube's CSAI Match, to detect known CSAM."

But as the medical case shows, Google's algorithms can flag any image of a child's genitals, not just hashes from the NCMEC database. The page also mentions machine learning "to discover never-before-seen CSAM," and a classifier like that obviously can't distinguish between crime and innocence.

It's a big problem and one of the reasons why privacy advocates are so concerned with Apple's technology.

Moving forward

And yet, Apple's implementation of CSAM detection in iCloud Photos is only a matter of time, simply because its system strikes a middle ground. Governments can't force Apple to add terrorist content to the CSAM database.

The delay is only due to public outcry; Apple's mistake was in its initial messaging when announcing the feature, not in errors within the detection system.

In a report from February 2022, security company PenLink said Apple is already "phenomenal" for law enforcement. PenLink earns $20 million annually by helping the US government track criminal suspects, and it sells its services to local law enforcement as well. Leaked presentation slides detailed iCloud warrants, for example.

Apple makes no secret of how it helps law enforcement when presented with a subpoena. Examples of information Apple can share include data from iCloud backups, mail stored on its servers, and sometimes text messages.

Governments worldwide are constantly developing ways to increase online surveillance, like the Online Safety Bill the UK introduced in May 2021. A proposed amendment to the bill would force tech companies such as Apple to detect CSAM even in end-to-end encrypted messaging services. Apple would have to move that scanning to on-device algorithms that screen iMessages before they are encrypted and sent.

Thus far, Apple has managed to fight efforts from the US to build backdoors into its devices, although critics refer to iCloud Photo scanning as a backdoor. The company's famous fight with the FBI has kept Apple customers safe from special versions of iOS that would make it easier to crack into devices.

Whether or not an iOS 16 update brings iCloud Photo scanning is unclear, but it will happen someday soon. After that, Apple customers will have to decide whether to keep using iCloud or move to an end-to-end encrypted alternative. Or they can turn off iCloud Photos, since Apple has assured everyone that the detection process only applies to photos synced through its service.

Read on AppleInsider

Comments

  • Reply 1 of 50
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
  • Reply 2 of 50
    Interesting read but such naivety…

    “Governments can't tell Apple to include terrorist content into the CSAM database.”

    Right now. But that is being pushed step by step. Is the author unaware of the pressure the current government has put on private companies to censor content? Does the author not understand that the obsession with ‘misinformation’ is little more than a means to inject that government control?

    Is the author unaware that ‘misinformation’ has been going on forever and that it’s not a new phenomenon? I watched a TV show on a reputable TV network decades ago on the moon landing being fake. Back then exploring a wide range of ideas, no matter how ridiculous, was considered valuable for mature humans.

    The only thing that has changed is people in society are increasingly immoral, and therefore people are increasingly unable to express self-responsibility and think critically. 

    The more we take the power away from the people the more likely it will end up in the governments’ hands. The government is not an altruistic benign organisation.
  • Reply 3 of 50
    “Fortunately, police understood the nature of the images and didn't file charges, although Google didn't return his account access.“

    That sums this bullshit up: the customer gets criminalised not by the police but by firms of “big data” without any possibility to complain. 

    No thanks. 
  • Reply 4 of 50
    1) If you really want to evade CSAM, I am sure there are a hundred ways. Including using customized Android phones with custom built apps

    2) They have not addressed any way to prevent abuse. How do we ensure that rogue nation states do not weaponize this feature?

    3) While in this one case the police helped clear the allegations made by Google, we all know that this could have easily gone the other way. Then the poor parents would be left to fight a long and arduous legal battle, while the child suffered. 

    4) Finally, what about Google’s or any other provider’s responsibility to fix their mistakes?


  • Reply 5 of 50
    georgie01 said:
    Interesting read but such naivety…

    “Governments can't tell Apple to include terrorist content into the CSAM database.”

    Right now. But that is being pushed step by step. Is the author unaware of the pressure the current government has put on private companies to censor content? Does the author not understand that the obsession with ‘misinformation’ is little more than a means to inject that government control?

    Is the author unaware that ‘misinformation’ has been going on forever and that it’s not a new phenomenon? I watched a TV show on a reputable TV network decades ago on the moon landing being fake. Back then exploring a wide range of ideas, no matter how ridiculous, was considered valuable for mature humans.

    The only thing that has changed is people in society are increasingly immoral, and therefore people are increasingly unable to express self-responsibility and think critically. 

    The more we take the power away from the people the more likely it will end up in the governments’ hands. The government is not an altruistic benign organisation.
    You are misinforming people regarding the reputable network claiming that the Moon landing was fake. There may have been a show where they talked about the Moon landing being fake but the network would not have presented it as a fact. I have seen shows like that a couple of times and it was always presented as “some people believe.” And they go through the so called evidence  and then explain how their theories don’t make sense. One example would be that the flag is supposedly waving even though there is no wind on the Moon and the answer is that the flag is not waving. It just looks that way to some because of the way the flag was set up. And they explain all of this in the show. You would have to be pretty gullible to watch the show and actually come away believing that the Moon landing was fake.
  • Reply 6 of 50
    darkvader Posts: 1,146 member
    It is NOT inevitable.  It is criminal.

    If Apple is still pushing forward with this idiotic idea, then more public pressure MUST be applied.  Apple needs to understand that on-device searches CANNOT be tolerated in any society that even remotely calls itself "free".


  • Reply 7 of 50
    auxio Posts: 2,727 member
    georgie01 said:

    The more we take the power away from the people the more likely it will end up in the governments’ hands. The government is not an altruistic benign organisation.
    And corporations are?  With no government, it would be the groups with the most money who hold the power.  At least in a democratic society, one gets to vote for government and there's a certain level of transparency in their operations.  Not so for corporations.
  • Reply 8 of 50
    auxio said:
    georgie01 said:

    The more we take the power away from the people the more likely it will end up in the governments’ hands. The government is not an altruistic benign organisation.
    And corporations are?  With no government, it would be the groups with the most money who hold the power.  At least in a democratic society, one gets to vote for government and there's a certain level of transparency in their operations.  Not so for corporations.
    No, @georgie01 didn't say corporations are altruistic. However, in a free capitalist society people do get to vote for or against corporations with their wallets, whether that's through Wall Street or simply consuming the product. I would argue money talks a lot louder than voting. That's why almost all politicians are owned by the lobbyists. 
  • Reply 9 of 50
    M68000 Posts: 725 member
    darkvader said:
    It is NOT inevitable.  It is criminal.

    If Apple is still pushing forward with this idiotic idea, then more public pressure MUST be applied.  Apple needs to understand that on-device searches CANNOT be tolerated in any society that even remotely calls itself "free".


    It could be said that freedom ends where crime begins. This CSAM stuff seems to be a significant effort to combat the issue. If Apple tells you in their software and phone license agreements that they will scan devices in an effort to combat crime and help society, and you agree to it, it is what it is. If you don’t like it, don’t use their phone. Yeah, those license agreements that very few read… Why not support this instead of trying to keep some percentage of perpetrators safe? They (Apple) do not want their products associated with crime and have a bad look… They are doing something about it. If you ran a business, would you want that as well?

    On a side note,  I’ve thought for years that Microsoft may have a top secret login or method to get into any windows pc regardless of encryption or not.   If the login or method was compromised they would have a windows update pushed out to change things.  Such a login could be used in extreme cases of national security for example.  It would not surprise me at all if there is a secret back door into every windows pc that Microsoft can do.

  • Reply 10 of 50
    9secondkox2 Posts: 2,702 member
    darkvader said:
    It is NOT inevitable.  It is criminal.

    If Apple is still pushing forward with this idiotic idea, then more public pressure MUST be applied.  Apple needs to understand that on-device searches CANNOT be tolerated in any society that even remotely calls itself "free".


    The justification that Microsoft and Google, two of the sleaziest companies around, have been violating your privacy for years doesn’t equate to “so now Apple has to do this too.” 
  • Reply 11 of 50
    bobolicious Posts: 1,145 member
    ...Apple heralded Photos as an upgrade to iPhoto, yet the most glaring question for me was the inclusion of auto image tagging with no off switch...?  That in concert with all enticements leading to iCloud...?  Think about it.. ?  Even the Apple system restore defaults to Apple servers (what are we downloading today?) and a drive I did a reinstall on recently was reformatted from HFS+ to APFS without warning ... Is there a systemic invasion ongoing with every (since 2011) annual macOS from Apple...?

    Is CSAM inevitable ? Would a study in human nature inform such a suggestion...?
  • Reply 12 of 50
    B-Mc-C Posts: 41 member
    No source quoted in the article. The author is presenting his opinion as if it were factual. This technology from Apple will never see the light of day, especially after the widely publicized epic fail he mentioned, which occurred last week with Google and the innocent parent.
  • Reply 13 of 50
    A year has gone by and it seems people don’t understand the issue here, including the author repeating “iCloud Photos” over and over. The issue with CSAM is not on the cloud side, because scanning content on clouds (Google, Apple, Amazon, Microsoft) is already being done, which is fine because customers willingly upload content to clouds and, whether they are aware of it or not, the terms have explicitly said the content is being scanned for x, y, z. The issue with Apple’s approach is bringing that scanning tool onto the device itself without even explaining if you have the option to opt out. How will they ensure that governments will not force them to modify the scanning beyond CSAM?
  • Reply 14 of 50
    9secondkox2 Posts: 2,702 member
    lam92103 said:
    1) If you really want to evade CSAM, I am sure there are a hundred ways. Including using customized Android phones with custom built apps

    2) They have not addressed any way to prevent abuse. How do we ensure that rogue nation states do not weaponize this feature?

    3) While in this one case the police helped clear the allegations made by Google, we all know that this could have easily gone the other way. Then the poor parents would be left to fight a long and arduous legal battle, while the child suffered. 

    4) Finally, what about Google’s or any other provider’s responsibility to fix their mistakes?


    Or… as was the case with the big iCloud hack years ago, extracting nude photos of famous people…

    Who’s to say bad actors would not add horrible images to a high-profile person’s account? Especially in the crazy political world we live in now. It’s a lot more difficult to add to someone’s actual device. 

    Just a thought. But one worth consideration. Sounds crazy until you consider what passes for political power plays these days. With the “Doesn’t matter if innocent people go to jail so long as we win” type of mentalities going around. 
  • Reply 15 of 50
    9secondkox2 Posts: 2,702 member
    Apple_Bar said:
    A year has gone by and it seems people don’t understand the issue here, including the author repeating “iCloud Photos” over and over. The issue with CSAM is not on the cloud side, because scanning content on clouds (Google, Apple, Amazon, Microsoft) is already being done, which is fine because customers willingly upload content to clouds and, whether they are aware of it or not, the terms have explicitly said the content is being scanned for x, y, z. The issue with Apple’s approach is bringing that scanning tool onto the device itself without even explaining if you have the option to opt out. How will they ensure that governments will not force them to modify the scanning beyond CSAM?
    Great point. We’ve already seen parents who are concerned about inappropriate school curriculum being treated like domestic terrorists by the FBI recently. Simply because they are standing up for their morals and not allowing the government to spoon feed their kids contrary values. 

    “Misinformation” (which often isn’t misinformation at all as we’ve seen over the course of COVID discourse) used to be a “difference of opinion” or difference of interpretation.” Now we have many wanting to make it a crime and big tech already acting on it as such. 

    Personally, I’m all for protecting innocent children from sickening filth masquerading as human beings. I volunteer in helping an organization that rescues kids and young adults from trafficking. 

    But a blank check to scan your stuff with a “just trust us to not see anything but the bad stuff” promise is ridiculous. And no. It won’t stop at that. Things are too political nowadays. 

    Apple needs to be careful. They’ve been on top for a long time now. But that’s not guaranteed. An upstart company can come “out of nowhere” and gain the fans they lose by being non-political and non-invasive. 

    Right now, apple feels it can do whatever it wants as it has genuinely great products and services and an excellent ecosystem to keep customers locked in. This has been built on trust though. And that can be undone so quickly. 

    Hopefully apple does the right thing and bucks the trend. Even better if they can create better systems for protecting kids that don’t involve spying on everyone. 


  • Reply 16 of 50
    I question why any of this is Apple’s responsibility at all. In my opinion Apple’s job isn’t to solve deep rooted issues like this. Their responsibility is to make computers. We already have a democratic system to make laws, law enforcement systems to detect violations of those laws, a justice system to render decisions regarding violations of laws, and a penal system to contain and punish those that have been convicted of breaking the law. Creating these systems took thousands of years of civilization and thought to refine. Why are we now as a society putting the entire burden for this on technology companies? It reminds me of how during the heat of the pandemic we wanted tech to solve COVID-19 and the best they could do was a tracing system that didn’t really make a massive dent in the disease. Tech can’t solve all problems, no matter how much we all wish it could.
  • Reply 17 of 50
    crowley Posts: 10,453 member
    I question why any of this is Apple’s responsibility at all. In my opinion Apple’s job isn’t to solve deep rooted issues like this. Their responsibility is to make computers. We already have a democratic system to make laws, law enforcement systems to detect violations of those laws, a justice system to render decisions regarding violations of laws, and a penal system to contain and punish those that have been convicted of breaking the law. Creating these systems took thousands of years of civilization and thought to refine. Why are we now as a society putting the entire burden for this on technology companies? It reminds me of how during the heat of the pandemic we wanted tech to solve COVID-19 and the best they could do was a tracing system that didn’t really make a massive dent in the disease. Tech can’t solve all problems, no matter how much we all wish it could.
    So your opinion is that tech should be allowed to create problems, but not allowed to offer any solutions?  Because thousands of years of laws based on paper will somehow be able to deal with a technology environment that is changing constantly?

    If Apple want to take responsibility then that's a positive thing.  They aren't being forced into this.  If only other companies would be so responsible.
  • Reply 18 of 50
    davidw Posts: 2,049 member
    >And then, the complaints started. And, they started, seemingly ignorant that Microsoft had been scanning uploaded files for about 10 years, and Google for eight.<

    That is an ignorantly false assumption. Consumers complaining about Apple's upcoming policy of scanning for CSAM were complaining about Apple doing it on the consumer's devices, instead of on its iCloud servers. These consumers were not ignorant of the fact that other companies were already scanning for CSAM, and knew that Apple was doing the same on its own servers. Hell, most of Apple customers' data probably isn't even stored on Apple's own servers; it is stored on cloud servers provided by Amazon and Google. I'm sure they have policies where Apple has to scan for CSAM if it wants to use their cloud servers for storing Apple customers' data. 

    >And yet, Apple's implementation of CSAM detection in iCloud Photos is only a matter of time, simply because its system strikes a middle ground. Governments can't tell Apple to include terrorist content into the CSAM database.<

    What's the point? The US government can't tell Apple to actively scan for CSAM either. It's supposed to be "voluntary" on Apple's part. Under Federal law, Apple is only required to report CSAM if it finds it or is informed of it. Otherwise, Apple would become an agent of the government and must adhere to the US Constitution when performing the search (even on its own servers). And just like any other US government law enforcement agency, Apple must get a search warrant to justify a search for criminal activity on the customer's own device. The SCOTUS has already ruled that a smartphone can be an extension of one's "home" and requires a search warrant to search it without the owner's permission, just like how an automobile can be considered an extension of one's "home." Unless there's some sort of emergency, in order to get a search warrant, the government must show "probable cause".

    https://crsreports.congress.gov/product/pdf/LSB/LSB10713
  • Reply 19 of 50
    Mike Wuerthele Posts: 6,861 administrator
    B-Mc-C said:
    No source quoted in the article. The author is presenting his opinion as if it were factual. This technology from Apple will never see the light of day, specifically after the widely publicized epic fail incident he mentioned, occurring last week with Google and the innocent parent.
    I don't necessarily agree with many of Andrew's points, but there are many, many sources quoted in the article.

  • Reply 20 of 50
    DAalseth Posts: 2,783 member
    So the bad guys will just move their files to somewhere the system won’t touch. 
    The rest of us are stuck with scanning of our private data without our consent.
    Really we’re left with a choice of scanning by Apple, or by Google, which has a record of mining any and all data for profit. 
    Sucks.