San Francisco doctor charged with possessing child pornography in iCloud

Posted in General Discussion, edited August 2021
Amid controversy surrounding Apple's CSAM detection system, a San Francisco Bay Area doctor has been charged with possessing child pornography in his Apple iCloud account, according to federal authorities.

Credit: Apple


The U.S. Department of Justice announced Thursday that Andrew Mollick, 58, had at least 2,000 sexually exploitative images and videos of children stored in his iCloud account. Mollick is an oncology specialist affiliated with several Bay Area medical facilities, as well as an associate professor at UCSF School of Medicine.

Additionally, he uploaded one of the images to social media app Kik, according to the recently unsealed federal complaint (via KRON4).

Apple has recently announced plans to introduce a system designed to detect child sexual abuse material (CSAM) in iCloud and provide a report to the National Center for Missing and Exploited Children (NCMEC). The system, which relies on cryptographic techniques to ensure user privacy, has caused controversy among the digital rights and cybersecurity communities.

The system does not scan the actual images in a user's iCloud account. Instead, it relies on matching hashes of images stored in iCloud against known CSAM hashes provided by at least two child safety organizations. There is also a threshold of at least 30 matches before an account is flagged, which helps mitigate false positives.
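As a rough illustration of how hash matching with a reporting threshold could work, here is a minimal sketch. It is not Apple's actual implementation: the real system uses a perceptual "NeuralHash" and a private set intersection protocol, while the hash function, hash set, and helper names below are stand-ins chosen only to keep the example self-contained.

    # Hedged sketch: stand-in hash matching with a 30-match threshold.
    # sha256 and a plain set substitute for Apple's perceptual hash and
    # private set intersection.
    import hashlib

    KNOWN_CSAM_HASHES: set[str] = set()   # hashes supplied by child-safety organizations
    MATCH_THRESHOLD = 30                   # matches required before an account is flagged

    def image_hash(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash of the image.
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(uploaded_images: list[bytes]) -> int:
        # Count how many uploaded images match the known-CSAM hash set.
        return sum(1 for img in uploaded_images
                   if image_hash(img) in KNOWN_CSAM_HASHES)

    def should_flag_account(uploaded_images: list[bytes]) -> bool:
        # Only accounts at or above the threshold are surfaced for review.
        return count_matches(uploaded_images) >= MATCH_THRESHOLD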

Documents revealed during the Epic Games v. Apple trial indicated that Apple anti-fraud chief Eric Friedman thought that the Cupertino tech giant's services were the "greatest platform for distributing" CSAM. Friedman attributed that fact to Apple's strong stance on user privacy.

Despite the backlash, Apple is pressing forward with its plans to debut the CSAM detection system. It maintains that the platform will still preserve the privacy of users who do not have collections of CSAM on their iCloud accounts.

Read on AppleInsider

Comments

  • Reply 1 of 44
    nizzard
    I’m ok with this.   What is NOT ok is them backdooring Messages (to start).

  • Reply 2 of 44
    DAalseth Posts: 2,783 member
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
  • Reply 3 of 44
    baconstang Posts: 1,103 member
    Looks like they caught this jerk without installing spyware on his phone.
  • Reply 4 of 44
    zimmie Posts: 651 member
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    They can ... because images in iCloud aren't encrypted today. Apple's servers have the ability to see them.

    With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a partial direction on how to find the key to decrypt the images. This is known as threshold secret sharing. They aren't exactly parts of the key, but once Apple has enough (apparently 30 in this case), they can use the directions to generate their own copy of the key. (Edited to clarify that last sentence.)

    Today, Apple employees can poke through your photos and share ones they find interesting (presumably policy does not allow this, but they have the capability to). With the announced system in place, they would no longer be able to at a technical level.
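    A rough Python sketch of the general threshold idea described above: this is standard Shamir-style secret sharing over a prime field, not Apple's exact construction, and the prime, share count, and key below are purely illustrative. The point is that any 30 shares recover the key, while fewer reveal nothing about it.

        # Rough sketch of Shamir-style threshold secret sharing
        # (not Apple's exact construction).
        import random

        PRIME = 2**127 - 1   # prime modulus for the finite field
        THRESHOLD = 30       # shares needed to reconstruct the key

        def make_shares(secret: int, n_shares: int, k: int = THRESHOLD):
            # Random polynomial of degree k-1 whose constant term is the secret.
            coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
            def f(x):
                return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
            return [(x, f(x)) for x in range(1, n_shares + 1)]

        def reconstruct(shares):
            # Lagrange interpolation at x = 0 recovers the constant term (the secret).
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num = den = 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * -xj % PRIME
                        den = den * (xi - xj) % PRIME
                secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
            return secret

        key = random.randrange(PRIME)
        shares = make_shares(key, 40)                    # e.g. 40 matched uploads, each carrying a share
        assert reconstruct(shares[:THRESHOLD]) == key    # any 30 shares are enough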
  • Reply 5 of 44
    sflocal Posts: 6,092 member
    baconstang said:
    Looks like they caught this jerk without installing spyware on his phone.
    Where did Apple say they were installing "spyware" on iPhones?  They are scanning iCloud photos in its datacenter, but nothing gets loaded on the phone itself.

    Why do people continue pushing false stories like this?
  • Reply 6 of 44
    narwhal Posts: 119 member
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
  • Reply 7 of 44
    fallenjt Posts: 4,053 member
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    That’s what Craig Federighi already said!
  • Reply 8 of 44
    fallenjt Posts: 4,053 member
    zimmie said:
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    They can ... because images in iCloud aren't encrypted today. Apple's servers have the ability to see them.

    With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a partial direction on how to find the key to decrypt the images. This is known as threshold secret sharing. They aren't exactly parts of the key, but once you have enough (apparently 30 in this case), you can use them to generate your own copy of the key.

    Today, Apple employees can poke through your photos and share ones they find interesting (presumably policy does not allow this, but they have the capability to). With the announced system in place, they would no longer be able to at a technical level.
    They can anyway:
     https://youtu.be/OQUO1DSwYN0

  • Reply 9 of 44
    nizzard said:
    I’m ok with this.   What is NOT ok is them backdooring Messages (to start).

    There’s no back door to Messages. Sexually explicit images are blurred by on-device AI if the account is a child account. 
  • Reply 10 of 44
    lkrupp Posts: 10,557 member
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
    Move to Android, eh? I guess you live in some alternate universe. From Google's own mouth... https://protectingchildren.google/intl/en/

    CSAI Match

    CSAI Match is our proprietary technology, developed by the YouTube team, for combating child sexual abuse imagery (CSAI) in video content online. It was the first technology to use hash-matching to identify known violative videos and allows us to identify this type of violative content amid a high volume of non-violative video content. When a match of violative content is found, it is then flagged to partners to responsibly report in accordance to local laws and regulations. Through YouTube, we make CSAI Match available for free to NGOs and industry partners like Adobe, Reddit, and Tumblr, who use it to counter the spread of online child exploitation videos on their platforms as well.

  • Reply 11 of 44
    dysamoria Posts: 3,430 member
    sflocal said:
    baconstang said:
    Looks like they caught this jerk without installing spyware on his phone.
    Where did Apple say they were installing "spyware" on iPhones?  They are scanning iCloud photos in its datacenter, but nothing gets loaded on the phone itself.

    Why do people continue pushing false stories like this?
    Because they want to believe them.
  • Reply 12 of 44
    crowley Posts: 10,453 member
    baconstang said:
    Looks like they caught this jerk without installing spyware on his phone.
    2000 is a shit-ton of images, and there's no indication in the article of how he was caught or how long he's been hoarding this filth.  And 1 guy caught is hardly a triumph when you have no idea how many others like him are out there.  Given the numbers of reports we know that Facebook and Google make, this one guy could very well be a drop in the ocean of iCloud child porn goblins.
  • Reply 13 of 44
    dysamoria Posts: 3,430 member
    nizzard said:
    I’m ok with this.   What is NOT ok is them backdooring Messages (to start).

    You’re okay with what?

    What “backdoor” in Messages are you talking about?
  • Reply 14 of 44
    StrangeDays Posts: 12,834 member
    nizzard said:
    I’m ok with this.   What is NOT ok is them backdooring Messages (to start).

    Good thing they aren’t. The Messages component is an opt-in thing for parents who don’t want their children sending nudes, and has nothing to do with CSAM hash checks for iCloud Photos uploads. 
  • Reply 15 of 44
    StrangeDays Posts: 12,834 member
    baconstang said:
    Looks like they caught this jerk without installing spyware on his phone.
    That’s old news. All the commercial cloud services have hash checking server-side, including Apple:

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    …what they’re working on now is making the check happen on your phone prior to upload, so that your iCloud stuff is fully E2E, which it isn’t currently. The butthurts don’t comprehend this.

  • Reply 16 of 44
    StrangeDays Posts: 12,834 member
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
    Google, Microsoft, et al. already have this…there just wasn’t any butthurt whining about it. 

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

  • Reply 17 of 44
    love the chaos this is causing to all us innocent people - probably means Apple got it right
  • Reply 18 of 44
    fastasleep Posts: 6,408 member
    nizzard said:
    I’m ok with this.   What is NOT ok is them backdooring Messages (to start).

    They didn’t. 
  • Reply 19 of 44
    fastasleep Posts: 6,408 member
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    Sure sounds like the uploading of an image to Kik is likely what led the feds to subpoena his iCloud data.

    You sure sound paranoid over all this. 
  • Reply 20 of 44
    killroy Posts: 271 member
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.

    Google is doing it now and has been for years.