San Francisco doctor charged with possessing child pornography in iCloud
Amid controversy surrounding Apple's CSAM detection system, a San Francisco Bay Area doctor has been charged with possessing child pornography in his Apple iCloud account, according to federal authorities.
Credit: Apple
The U.S. Department of Justice announced Thursday that Andrew Mollick, 58, had at least 2,000 sexually exploitative images and videos of children stored in his iCloud account. Mollick is an oncology specialist affiliated with several Bay Area medical facilities, as well as an associate professor at UCSF School of Medicine.
Additionally, he uploaded one of the images to social media app Kik, according to the recently unsealed federal complaint (via KRON4).
Apple recently announced plans to introduce a system designed to detect child sexual abuse material (CSAM) in iCloud and report it to the National Center for Missing & Exploited Children (NCMEC). The system, which relies on cryptographic techniques to protect user privacy, has caused controversy among the digital rights and cybersecurity communities.
The system does not scan the actual images in a user's iCloud account. Instead, it matches hashes of images stored in iCloud against known CSAM hashes provided by at least two child safety organizations. An account is only flagged once a threshold of at least 30 matches is reached, which helps mitigate false positives.
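As a rough illustration of how that kind of threshold matching could work, here is a minimal Python sketch. It is not Apple's actual NeuralHash or safety-voucher design; the hash function, the known-hash set, and the flagging helper below are simplified assumptions for demonstration only.

```python
# Toy sketch of threshold-based hash matching -- not Apple's NeuralHash or
# safety-voucher protocol. The hash function, known-hash set, and flagging
# logic are simplified stand-ins.
import hashlib

REPORT_THRESHOLD = 30  # minimum number of matches before review is possible

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system would not use SHA-256,
    # which only matches byte-identical files, not visually similar ones.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes supplied by child safety organizations.
known_hashes: set[str] = set()

def match_count(uploaded_images: list[bytes]) -> int:
    # Count how many uploaded images hash to an entry in the known set.
    return sum(1 for img in uploaded_images if image_hash(img) in known_hashes)

def exceeds_threshold(uploaded_images: list[bytes]) -> bool:
    # Only past this per-account threshold would human review, and a report
    # to NCMEC, even become possible in the design Apple described.
    return match_count(uploaded_images) >= REPORT_THRESHOLD
```

In the design Apple described, the matching is also blinded: the device cannot tell whether a given image matched, and the results stay sealed inside encrypted safety vouchers that only become readable once the threshold is crossed.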
Documents revealed during the Epic Games v. Apple trial indicated that Apple anti-fraud chief Eric Friedman thought that the Cupertino tech giant's services were the "greatest platform for distributing" CSAM. Friedman attributed that fact to Apple's strong stance on user privacy.
Despite the backlash, Apple is pressing forward with its plans to debut the CSAM detection system. It maintains that the platform will still preserve the privacy of users who do not have collections of CSAM on their iCloud accounts.
Read on AppleInsider
Comments
With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a partial direction on how to find the key to decrypt the images. This is known as threshold secret sharing. They aren't exactly parts of the key, but once Apple has enough (apparently 30 in this case), they can use the directions to generate their own copy of the key. (Edited to clarify that last sentence.)
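For anyone who wants to see that idea concretely, here is a toy Python sketch of (t, n) threshold secret sharing in the style of Shamir's scheme. The field size, the share counts, and the threshold of 30 are arbitrary choices for illustration; this is not Apple's actual voucher construction.

```python
# Toy (t, n) threshold secret sharing (Shamir-style): any t shares recover
# the secret, fewer reveal essentially nothing about it.
import random

PRIME = 2**127 - 1   # prime field modulus, chosen only for convenience
THRESHOLD = 30       # shares needed to reconstruct the secret

def make_shares(secret: int, n_shares: int, t: int = THRESHOLD):
    """Split `secret` into points on a random degree-(t-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for the decryption key
shares = make_shares(key, n_shares=50)
assert reconstruct(shares[:THRESHOLD]) == key   # any 30 shares recover the key
# 29 or fewer shares interpolate a different polynomial and, with overwhelming
# probability, yield garbage rather than the key.
```

As Apple described it, each safety voucher would carry a share of this kind, so the server can only assemble the decryption key once the 30-voucher threshold is crossed.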
Today, Apple employees can poke through your photos and share ones they find interesting (policy presumably forbids this, but they have the capability). With the announced system in place, they would no longer be able to do so at a technical level.
https://youtu.be/OQUO1DSwYN0
CSAI Match
CSAI Match is our proprietary technology, developed by the YouTube team, for combating child sexual abuse imagery (CSAI) in video content online. It was the first technology to use hash-matching to identify known violative videos, and it allows us to identify this type of violative content amid a high volume of non-violative video content. When a match of violative content is found, it is then flagged to partners to responsibly report in accordance with local laws and regulations. Through YouTube, we make CSAI Match available for free to NGOs and industry partners like Adobe, Reddit, and Tumblr, who use it to counter the spread of online child exploitation videos on their platforms as well.
What “backdoor” in Messages are you talking about?
https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
…what they’re working on now is making the check happen on your phone prior to upload, so that your iCloud stuff is fully E2E, which it isn’t currently. The butthurts don’t comprehend this.
https://www.microsoft.com/en-us/photodna
https://protectingchildren.google/intl/en/
You sure sound paranoid over all this.
Google is doing it now and has been for years.