Apple reportedly plans to make iOS detect child abuse photos

A security expert claims that Apple is about to announce photo identification tools that would identify child abuse images in iOS photo libraries.

Apple's iPhone


Apple has previously removed individual apps from the App Store over child pornography concerns, but now it's said to be about to introduce such detection system-wide. Using photo hashing, iPhones could identify Child Sexual Abuse Material (CSAM) on device.
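
Neither the report nor Apple has described how any matching would work. In its simplest form, photo hashing means computing a fingerprint of each image and comparing it against a list of fingerprints of known abuse images. The sketch below is purely illustrative: the hash list is a placeholder, and nothing here reflects any actual Apple implementation.

```python
import hashlib

# Placeholder list of known-image hashes. A real system would obtain these
# from a clearinghouse such as NCMEC; the value below is fake.
KNOWN_IMAGE_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def hash_photo(path: str) -> str:
    """Return the SHA-256 hex digest of a photo file's raw bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def photo_matches_known_image(path: str) -> bool:
    """True if the photo's hash appears in the known-image list."""
    return hash_photo(path) in KNOWN_IMAGE_HASHES
```

An exact hash like this fails as soon as an image is re-encoded or resized, which is why systems in this space are usually described as using perceptual ("fuzzy") hashes instead.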

Apple has not confirmed this, and so far the sole source is Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute.

I've had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.

— Matthew Green (@matthew_d_green)


According to Green, the plan is initially to be client-side -- that is, to have all of the detection done on a user's iPhone. He argues, however, that it could be the start of a process leading to surveillance of data traffic sent and received from the phone.

"Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems," continues Green. "The ability to add scanning systems like this to E2E [end to end encryption] messaging systems has been a major 'ask' by law enforcement the world over."

"This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"

Green and his cryptography students have previously reported on how law enforcement may be able to break into iPhones. He and Johns Hopkins University have also previously worked with Apple to fix a security bug in Messages.


Comments

  • Reply 1 of 26
    crowley Posts: 10,453 member
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
    But it's not in the hands of an authoritarian government?  It's in the hands of Apple.  If an authoritarian government wanted to do something like this I have no doubt they'd be capable of doing it, I don't see how Apple going after child abusers is going to affect that.
  • Reply 2 of 26
    tedz98 Posts: 80 member
    I’m totally against child porn. But this capability is a double-edged sword. For Apple, which touts its strong belief in the privacy of its customers using its devices, to begin surveilling customer content sets up an unusual and potentially dangerous precedent. What undesirable content will be next to be monitored? Who gets notified when supposed undesirable content is identified? Who determines what constitutes undesirable content? What if governments demand unique monitoring capabilities if Apple wants to sell its products in their countries? Despite the universally agreed-upon disgust towards child pornography, the freedom of the entire Apple ecosystem will be jeopardized if this capability is deployed. This is as good as giving government agencies “back door” access to iPhones - which Apple has vehemently opposed. Very good intentions with very dangerous side effects.
  • Reply 4 of 26
    jdw Posts: 1,324 member
    No software is perfect.  So if this gets installed on everyone's phones only to call the cops on the iPhone owner when the AI detects what it thinks is a naughty photo, imagine the meltdown when it gets it wrong.  But even if it gets it right, the "concept" of having AI scan photos for naughty content and then secretly call the cops can lead to an Orwellian scenario real fast.

    The biggest problem with this story is how short it is on details.  We really need to know how it will be used and what safeguards are put in place to protect privacy.  We can ASSUME Apple has that covered, but I'll believe it when I see the details.
  • Reply 5 of 26
    crowley said:
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
    But it's not in the hands of an authoritarian government?  It's in the hands of Apple.  If an authoritarian government wanted to do something like this I have no doubt they'd be capable of doing it, I don't see how Apple going after child abusers is going to affect that.
    Coupla few things. 
    1. It's not in the hands of Apple.  It's not in the hands of anyone.  It's a, thus far, unsubstantiated rumor from a security researcher. 

    2. If it comes to fruition that Apple does enable the AI feature, wouldn't they be bound by law to report the info to authorities? (idk, ianal)  If the offending data is stored in iCloud, then it would also be subject to worldwide government data requests.  Requests that Apple has honored ~80% of the time on average.

    3. Keeping in mind this is only a claim by a researcher,  and not Apple, the question would then have to be asked: What constitutes child pornography to the AI?  Is it reviewed by a human for higher level verification?  If so, Apple employee or 3rd party source (like the original voice recordings)?  What triggers reporting to authorities and who bears responsibility for errors?

    A parent sending pics of the kids in the bubble bath to grandparents.  Photo of a young-looking 18-year-old girl topless at a nude beach.  Scouts shirtless around a campfire.
    Would any one of those trigger the AI?  What if all three were on the same phone?  It's entirely possible and not far fetched.  

    I can't stress enough that this isn't Apple going after child abusers.  This is a researcher making a claim.  But if Apple were going to do so, it would most definitely affect that "government access" (authoritarian or otherwise) query made by the researcher, in myriad ways not even addressed in my comment.
  • Reply 6 of 26
    swat671 Posts: 150 member
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?”

    I mean, I’m sorry, but I don’t care WHAT type of government it is. CSM (child sex material) is CSM. Repressive governments definitely don’t have issues “taking out the trash”.

    As far as finding the pics or videos goes, there is “hashing” or “fingerprinting”. Governments, or organizations like the ICAC (Internet Crimes Against Children) Task Force, use these fingerprints to search the web (and I assume the dark web) for images like this. I assume that Apple would just use these hashes.
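
To sketch roughly what that kind of fingerprint check could look like, purely as an illustration: the list below is a placeholder, and real programs such as PhotoDNA use perceptual fingerprints rather than the exact file hashes shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprint list; a real one would come from a task force or
# clearinghouse, and would not be raw SHA-256 digests.
FINGERPRINTS = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def scan_library(library_dir: str) -> list[Path]:
    """Return the photos whose file hash appears in the fingerprint list."""
    matches = []
    for photo in Path(library_dir).rglob("*.jpg"):
        digest = hashlib.sha256(photo.read_bytes()).hexdigest()
        if digest in FINGERPRINTS:
            matches.append(photo)
    return matches

# Hypothetical usage:
# flagged = scan_library("/path/to/photo/library")
```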

  • Reply 7 of 26
    dewme Posts: 5,332 member
    tedz98 said:
    I’m totally against child porn. But this capability is a double-edged sword. For Apple, which touts its strong belief in the privacy of its customers using its devices, to begin surveilling customer content sets up an unusual and potentially dangerous precedent. What undesirable content will be next to be monitored? Who gets notified when supposed undesirable content is identified? Who determines what constitutes undesirable content? What if governments demand unique monitoring capabilities if Apple wants to sell its products in their countries? Despite the universally agreed-upon disgust towards child pornography, the freedom of the entire Apple ecosystem will be jeopardized if this capability is deployed. This is as good as giving government agencies “back door” access to iPhones - which Apple has vehemently opposed. Very good intentions with very dangerous side effects.
    These are all legitimate concerns, i.e., the law of unintended consequences. I’m looking at another side of the “double-edged sword” from a different perspective. What Apple is proposing is in my opinion 100% the right thing to do. Bravo Apple, go right ahead and lower the heel of scrutiny on those lowlifes and societal cockroaches. However, one concern I have is that once Apple steps up and claims that they are going to help stomp out a social, societal, or human behavioral problem for the common good, they’ll be held liable for the cases they miss. We see this every day with respect to the App Store. Despite Apple’s scrutiny of app submissions, a few bad apps still get through. When this happens Apple is skewered and lambasted in both the court of public opinion and in real courts with real class action lawsuits involving real money.

    Still, I don’t think Apple should avoid taking on challenges where the penalty for imperfection is high when it’s the right thing to do. Being forthright up-front about the fallibility of what they are taking on will probably not make a difference later on when they are sued, and they most certainly will be sued at some point. That’s just a cost they’ll have to accept, and the law schools will be grateful to Apple for their high-minded decision.
  • Reply 8 of 26
    rcfa Posts: 1,124 member
    crowley said:
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
    But it's not in the hands of an authoritarian government?  It's in the hands of Apple.  If an authoritarian government wanted to do something like this I have no doubt they'd be capable of doing it, I don't see how Apple going after child abusers is going to affect that.
    Apple complies with all laws in the countries where it sells its products; it is not a company that pulls out of markets when it doesn’t like requests.
    Thus in Russia iPhones will come with the Russian-approved software installation options; servers for Russian and Chinese users are in their respective countries, subject to access by the local authorities; and if laws are passed requiring that (once the capabilities are built) messages be scanned for certain content, Apple will comply, too.
    Or do you seriously think they will drop the Chinese market over human rights concerns? 🤣

    That’s exactly why jailed systems are bad: once something like this is baked in, the user has no chance at modifying the system to prevent it.

    So, no, it’s not in the hands of Apple, at least not in any realistic sense of the word.
  • Reply 9 of 26
    rcfa Posts: 1,124 member
    I hear echoes of Minority Report…

    Going to be fun for security researchers, journalists, whistleblowers, law enforcement officers: how will the system distinguish between holding evidence and peddling?
  • Reply 10 of 26
    shamino Posts: 527 member
    Abuse is more than just child pornography, and non-pornographic abuse photos are not illegal.

    What happens if I visit Amnesty International's web site, read a few articles, and view photos of children that were beaten by an abuser?  Is Apple going to tell law enforcement that I am now abusing children simply because I have the pictures on my phone?  What if I decide to save these photos in order to republish them on a blog or in a newsletter?  What if I work for Amnesty and took the photos myself?

    And what about law enforcement officers, who are obligated to take photos of abuse victims?  Will they also be subject to this automated system?  And if not, does that create a loophole where a crooked cop can trade in this filth without scrutiny?

    I don't see a possible way that this feature could be implemented without it becoming a nightmare scenario.  Both the false positives and false negatives will destroy people's lives, even if the problems are eventually corrected.

    The only way to win this game is to not play.
  • Reply 11 of 26
    crowley Posts: 10,453 member
    shamino said:
    Abuse is more than just child pornography, and non-pornographic abuse photos are not illegal.

    What happens if I visit Amnesty International's web site, read a few articles, and view photos of children that were beaten by an abuser?  Is Apple going to tell law enforcement that I am now abusing children simply because I have the pictures on my phone?  What if I decide to save these photos in order to republish them on a blog or in a newsletter?  What if I work for Amnesty and took the photos myself?
    Then you have excellent, verifiable answers to any questions and there won't be a problem
    shamino said:
    And what about law enforcement officers, who are obligated to take photos of abuse victims?  Will they also be subject to this automated system?  
    Then you have excellent, verifiable answers to any questions and there won't be a problem
    shamino said:
    And if not, does that create a loophole where a crooked cop can trade in this filth without scrutiny?
    As opposed to the zero scrutiny they currently face?  What has been lost here?
    shamino said:
    I don't see a possible way that this feature could be implemented without it becoming a nightmare scenario. Both the false positives and false negatives will destroy people's lives, even if the problems are eventually corrected.
    Nothing you've described is a nightmare, all are very easily solvable with a simple conversation.
  • Reply 12 of 26
    crowley Posts: 10,453 member
    rcfa said:
    crowley said:
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
    But it's not in the hands of an authoritarian government?  It's in the hands of Apple.  If an authoritarian government wanted to do something like this I have no doubt they'd be capable of doing it, I don't see how Apple going after child abusers is going to affect that.
    Apple complies with all laws in the countries where it sells its products; it is not a company that pulls out of markets when it doesn’t like requests.
    Thus in Russia iPhones will come with the Russian-approved software installation options; servers for Russian and Chinese users are in their respective countries, subject to access by the local authorities; and if laws are passed requiring that (once the capabilities are built) messages be scanned for certain content, Apple will comply, too.
    Or do you seriously think they will drop the Chinese market over human rights concerns? 🤣

    That’s exactly why jailed systems are bad: once something like this is baked in, the user has no chance at modifying the system to prevent it.

    So, no, it’s not in the hands of Apple, at least not in any realistic sense of the word.
    Russia or China could pass those laws now.  Absolutely nothing preventing them.
  • Reply 13 of 26
    zimmie Posts: 651 member
    crowley said:
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
    But it's not in the hands of an authoritarian government?  It's in the hands of Apple.  If an authoritarian government wanted to do something like this I have no doubt they'd be capable of doing it, I don't see how Apple going after child abusers is going to affect that.
    Coupla few things. 
    1. It's not in the hands of Apple.  It's not in the hands of anyone.  It's a, thus far, unsubstantiated rumor from a security researcher. 

    2. If it comes to fruition that Apple does enable the AI feature, wouldn't they be bound by law to report the info to authorities? (idk, ianal)  If the offending data is stored in iCloud, then it would also be subject to worldwide government data requests.  Requests that Apple has honored ~80% of the time on average.

    3. Keeping in mind this is only a claim by a researcher,  and not Apple, the question would then have to be asked: What constitutes child pornography to the AI?  Is it reviewed by a human for higher level verification?  If so, Apple employee or 3rd party source (like the original voice recordings)?  What triggers reporting to authorities and who bears responsibility for errors?

    A parent sending pics of the kids in the bubble bath to grandparents.  Photo of a young-looking 18-year-old girl topless at a nude beach.  Scouts shirtless around a campfire.
    Would any one of those trigger the AI?  What if all three were on the same phone?  It's entirely possible and not far fetched.  

    I can't stress enough that this isn't Apple going after child abusers.  This is a researcher making a claim.  But if Apple were going to do so, it would most definitely affect that "government access" (authoritarian or otherwise) query made by the researcher, in myriad ways not even addressed in my comment.
    1. Calling Dr. Green a "security researcher" is significantly understating what he does. He's a mathematician who builds this kind of tech, and his students build this kind of tech.

    2,3. The system Dr. Green talked about isn't AI in any meaningful sense. It's fuzzy hashing. It matches only images which are close to existing known images. The concern is that fuzzy hashing is, by its very nature, imprecise. While normal hashes match an exact chunk of data, fuzzy hashes are more likely to match "similar" data, and our idea of similar may not be the hash's idea.
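
To make the exact-versus-fuzzy distinction concrete, here is a toy average-hash comparison. This is not Apple's algorithm (nothing about it has been published); the hash construction and distance threshold are arbitrary choices for illustration.

```python
def average_hash(gray_pixels: list[list[int]]) -> int:
    """Toy perceptual hash: one bit per pixel of a small grayscale thumbnail,
    set when that pixel is brighter than the thumbnail's mean."""
    flat = [p for row in gray_pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def is_match(candidate: int, known: int, threshold: int = 5) -> bool:
    """Unlike an exact hash, a fuzzy hash still matches when a few bits
    differ, so re-encoded or lightly edited copies can be caught, and so
    can unrelated images that merely happen to hash nearby."""
    return hamming_distance(candidate, known) <= threshold
```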

    These systems are normally used forensically, after someone is already suspected of possession of CSAM. The system directs investigators to specific files, then the investigators confirm. We don't have good studies of false positive rates, because when the systems are used, it's typically on drives which contain thousands of CSAM images. And the drives people use to store CSAM tend to contain little else, limiting false positives. What's a false positive here or there, when you confirm a hundred of the images are CSAM?

    When a false positive can basically destroy your ability to live in modern society, we really need a solid understanding of how likely they are.
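
A back-of-the-envelope calculation shows why that matters at this scale. Every number below is invented for illustration; no real false-positive rate for a system like this has been published.

```python
# All figures are hypothetical, chosen only to illustrate the base-rate issue.
photos_scanned_per_day = 1_000_000_000  # assumed fleet-wide daily volume
false_positive_rate = 1e-6              # assumed: one in a million per photo

expected_false_flags = photos_scanned_per_day * false_positive_rate
print(f"Innocent photos flagged per day (hypothetical): {expected_false_flags:,.0f}")
# With these made-up numbers, about 1,000 innocent photos would be flagged
# every single day, before any human review.
```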



    As for the authoritarian government angle, if such a capability were built, China would definitely demand it be used to report people in China who have a copy of Tank Man.
  • Reply 14 of 26
    Beats Posts: 3,073 member
    Apple scanning our photo library is enough to switch to Android. This seems worse than Google, I don’t care what reason or excuse they use.

    shamino said:
    Abuse is more than just child pornography, and non-pornographic abuse photos are not illegal.

    What happens if I visit Amnesty International's web site, read a few articles, and view photos of children that were beaten by an abuser?  Is Apple going to tell law enforcement that I am now abusing children simply because I have the pictures on my phone?  What if I decide to save these photos in order to republish them on a blog or in a newsletter?  What if I work for Amnesty and took the photos myself?

    And what about law enforcement officers, who are obligated to take photos of abuse victims?  Will they also be subject to this automated system?  And if not, does that create a loophole where a crooked cop can trade in this filth without scrutiny?

    I don't see a possible way that this feature could be implemented without it becoming a nightmare scenario.  Both the false positives and false negatives will destroy people's lives, even if the problems are eventually corrected.

    The only way to win this game is to not play.

    And these are the nice scenarios. I know some cops and a detective who fabricate evidence to get people in trouble. They would love this feature! Send a photo to someone and arrest them.

    Yes this is a bad idea because it will then lead to other back doors. This is how the government operates, they use an excuse to advance to an end goal that isn’t moral.
  • Reply 15 of 26
    Rayz2016 Posts: 6,957 member
    I seriously doubt this is going to happen. Guess we’ll find out tomorrow. 
  • Reply 16 of 26
    crowley Posts: 10,453 member
    Man, people in here certainly enjoy extrapolating their ideas of how things might work to hypothetical situations that might happen in order to find things that are definitely wrong with this.
  • Reply 17 of 26
    StrangeDays Posts: 12,834 member
    crowley said:
    Man, people in here certainly enjoy extrapolating their ideas of how things might work to hypothetical situations that might happen in order to find things that are definitely wrong with this.
    This pattern of handwringing happens with almost every rumor on these sites. People write out a plot/details in their heads, then present that as if it is real.
  • Reply 18 of 26
    markbyrn Posts: 661 member
    If this story is true, Apple would have the worst case of forked tongue ever known. Perpetuating surveillance of user iPhones is a massive privacy violation, and we know it won't stop at just loathsome imagery.
  • Reply 19 of 26
    Rayz2016 Posts: 6,957 member
    crowley said:
    Man, people in here certainly enjoy extrapolating their ideas of how things might work to hypothetical situations that might happen in order to find things that are definitely wrong with this.
    This pattern of handwringing happens with almost every rumor on these sites. People write out a plot/details in their heads, then present that as if it is real.
    The more I read about this, the more ridiculous it sounds. 

    Apple spends the past three years banging on about privacy, then starts scanning your photo library?

    I don’t buy it, and no one would buy another iPhone if they thought their lives could be ruined by buggy software. 

    The only way I see this being acceptable is if this so-called “client-side tool” would allow parents to ensure their children weren’t sending naked pics of themselves to Republican politicians posing as high-school kids. 
    edited August 2021
  • Reply 20 of 26
    baconstang Posts: 1,103 member
    Considering how Photos mislabels my images way more than half the time, I can't see what could possibly go wrong.
This discussion has been closed.