Apple exec said iCloud was the 'greatest platform' for CSAM distribution
Apple's introduction of CSAM detection tools may stem from its strong privacy protections: messages submitted during the Epic Games App Store trial reveal that Apple's anti-fraud chief believed the company's services were the "greatest platform for distributing" the material.
Apple has a major focus on ensuring consumer privacy, which it builds as far as possible into its products and services. However, that focus may have unintended consequences, including enabling some illegal user behavior.
Among documents submitted during the Apple-Epic Games trial, and spotted by The Verge, a 2020 iMessage thread involving Eric Friedman, head of Apple's Fraud Engineering Algorithms and Risk unit, hints at the reasons behind Apple's introduction of CSAM tools.
In the thread about privacy, Friedman points out that Facebook's priorities differ greatly from Apple's, which has some unintended effects. While Facebook works on "trust and safety," in dealing with fake accounts and other elements, Friedman offers the assessment "In privacy, they suck."
"Our properties are the inverse," states the chief, before claiming "Which is why we are the greatest platform for distributing child porn, etc."
An iMessage screenshot from documents uncovered during the Apple-Epic Games lawsuit.
When it is suggested that other file-sharing systems offer more opportunities for bad actors, Friedman responds that "we have chosen to not know enough in places where we really cannot say." He then references a New York Times graph depicting how firms are combating the issue, but suggests "I think it's an underreport."
Friedman also shares a slide from a presentation he was going to make on trust and safety, with "child predator grooming reports" listed as an issue for the App Store. It is also deemed an "active threat," as regulators are "on our case" on the matter.
While the findings aren't directly attributable to the introduction of CSAM tools, they do indicate that Apple has considered the problem for some time, and is keenly aware of the shortcomings that led to the situation.
Despite Apple's efforts on the matter, it has received considerable pushback from critics over privacy. Critics and government bodies have asked Apple to reconsider, over concerns that the systems could lead to widespread surveillance.
Apple, meanwhile, has taken steps to de-escalate arguments, including detailing how the systems work and emphasizing its attempts to maintain privacy.
Comments
It isn’t Apple’s job to know or prevent criminal behavior.
Sony doesn’t have cameras that try to detect what pictures you take, paper vendors aren’t concerned with the fact that you jot down notes of an assassination plot on the paper you bought, the post office doesn’t care if you mail a USB stick with snuff videos on it.
It’s not Apple’s job to know, care, prevent, or report crime, unless as a service at the request of the owner (“find my stolen phone!”)
So Apple’s very thinking is flawed, as they hold themselves responsible for something that’s fundamentally none of their business, literally and figuratively.
If hashing iCloud photos for CSAM on Apple's own servers is already an effective mechanism that has revealed so much material, then what is the point of injecting a warrantless blanket criminal-search mechanism into the user's own property?
Example: if you went on Periscope before it shut down and looked at the live streams, there were regular repeating streams out of eastern Europe (Cyrillic-alphabet countries) offering to sell archives of CSAM ("CP" or "cheeze pizza," as they refer to it), often by contact on Telegram*. It would be easy enough for law enforcement to make that contact and see what platform is being used for distribution.
*often flanked by youth sports streams and church livestreams.
Legal obligation? Probably not, though that's down to the law. No photocopier in circulation will allow you to photocopy currency, so I see no reason why there couldn't or shouldn't be a law saying digital storage providers must proactively prevent the transfer and hosting of CSAM.
Moral obligation? That's entirely up to Apple.
https://www.microsoft.com/en-us/photodna
https://protectingchildren.google/intl/en/
…are they also outraged about it? Are they going to give up Gmail and Dropbox because they too scan hashes for child pornography?
Apple has already been scanning your uploaded and encrypted photos for CSAM images using hashes, just as the on-device system will. That was confirmed early last year (and had probably been done since at least 2019) by Apple's own Chief Privacy Officer, Jane Horvath. So the reason for now doing so on device isn't clear, at least to me and countless others.
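For readers unfamiliar with how hash-based scanning of the kind described in these comments works, the core idea is simply comparing an image's fingerprint against a set of known fingerprints. Here is a minimal, hypothetical sketch in Python; the hash values are placeholders, and real systems such as Microsoft's PhotoDNA or Apple's NeuralHash use perceptual hashes that survive resizing and re-encoding, whereas this sketch uses a cryptographic hash only to stay self-contained:

```python
import hashlib

# Placeholder set standing in for a database of known-CSAM fingerprints.
# (This value is simply the SHA-256 of the bytes b"foo", used for illustration.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_material(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash set."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key design point is that the provider never inspects image content directly; it only checks whether a fingerprint matches a pre-built list, which is why a cryptographic hash (exactly matching only identical bytes) is too brittle in practice and perceptual hashing is used instead.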
Apple seems to have been scanning iCloud email for CSAM but not iCloud Photos and backups:
https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/
Nonetheless, I share your unease about on-device scanning for whatever purpose.