San Francisco doctor charged with possessing child pornography in iCloud

Comments

  • Reply 21 of 44
    entropys Posts: 4,168 member
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
    Google, Microsoft, et al. already have this…there just wasn’t any butthurt or whining about it.

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    So? Isn’t a reason to be an Apple user that they are not like those companies? Why, I believe Apple even markets itself as thinking different.
  • Reply 22 of 44
    crowley Posts: 10,453 member
    entropys said:
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
    Google, Microsoft, et al. already have this…there just wasn’t any butthurt or whining about it.

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    So? Isn’t a reason to be an Apple user that they are not like those companies? Why, I believe Apple even markets itself as thinking different.
    I don't believe "Think Different, shelter child abusers" is what they had in mind.
  • Reply 23 of 44
    macplusplus Posts: 2,112 member
    Looks like they caught this jerk without installing spyware on his phone.
    That’s old news. All the commercial cloud services have hash checking server-side, including Apple:

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    …what they’re working on now is making the check happen on your phone prior to upload, so that your iCloud stuff is fully E2E, which it isn’t currently. The butthurts don’t comprehend this.

    This is not the E2E encryption you were waiting for, i.e. the one that would fully encrypt "your iCloud stuff". Their technical document mentions some "device-generated keys", but that pertains only to iCloud Photos, since it is tied to CSAM matching.

    Besides, they have never declared openly and clearly that they are bringing E2E encryption to iCloud Photos; correct me if I missed something.
    edited August 2021
  • Reply 24 of 44
    roake Posts: 811 member
    sflocal said:
    Looks like they caught this jerk without installing spyware on his phone.
    Where did Apple say they were installing “spyware” on iPhones?  They are scanning iCloud photos in their datacenters, but nothing gets loaded on the phone itself.

    Why do people continue pushing false stories like this?
    Well, Apple doesn't call it "Spyware," but it is, and it's coming in iOS 15.
  • Reply 25 of 44
    roake Posts: 811 member
    As a physician myself, I have to wonder if the context of these images was interpreted correctly.

    This guy was a medical oncologist.  I have not tried to find the story, but I wonder if he is a pediatric medical oncologist.  Sometimes these doctors take photos of lesions, wounds, and other issues on patients and later post them to the chart, etc.  If the affected areas are near breasts or genitals, I could see how these could be mistaken by a layperson for sexually exploitative images.  These chart photos may be distasteful to laypeople, but they are important for monitoring the progress of these areas.  He could have left iCloud backup of photos turned on, and that's how they ended up on iCloud.  If THAT is the scenario, then a perfectly good man just had his career destroyed by an overzealous tech industry that didn't stay in its own lane.  Even if this is not the case, it's a possibility that could occur to someone else, destroying the very person who is trying to save those kids' lives.

    All that being said, it's possible that these photos had nothing to do with patient treatment and were purely for sexual gratification of some twisted human mind.  If that's the case, then he needs some very serious mental help, and deserves what's coming.

    For those that are about to point out that a doctor shouldn't be using a personal phone for those kinds of things, try looking into what it takes to get a formal by-the-bureaucracy authorized photo of these things.  It's damned near impossible, and can take days for approval, which is useless in monitoring fast-moving treatments such as surgery or certain types of radiation therapy where things change daily or sometimes hourly.

    And yes, high-end EMRs such as Epic have applications for your phone that allow you to take photos directly into the chart without storing them on the phone.  But less expensive EMRs don't necessarily have this option.

    Update:
    After making this post, I spent some time looking for information more specific than "exploitative images" or "pornography," which could be subject to interpretation.  I found a copy of the initial criminal complaint at: https://padailypost.com/wp-content/uploads/2021/08/file0.43959436616036.pdf
    This complaint makes it very clear that the image he uploaded to Kik was indeed sexually explicit, depicting adult penetration of a prepubescent female.  In other words, this guy has one sick mind, and has destroyed his own career.
    edited August 2021
  • Reply 26 of 44
    chadbag Posts: 2,000 member
    AI said: “The system does not scan actual images on a user's iCloud account”

    Pray tell, how do they get a hash for the image without scanning (reading) it?

  • Reply 27 of 44
    bluefire1 Posts: 1,302 member
    Apple constantly boasts about user privacy and then acknowledges access to one’s iCloud pictures? There shouldn’t be any back door. Period. Either you believe in privacy or you don’t. 
    edited August 2021
  • Reply 28 of 44
    GeorgeBMac Posts: 11,421 member
    fallenjt said:
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    That’s what Craig Federighi already said!

    I thought he said "You're holding it wrong!"
  • Reply 29 of 44
    GeorgeBMac Posts: 11,421 member
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    Sure sounds like the uploading of an image to Kik is likely what led the feds to subpoena his iCloud data.

    You sure sound paranoid over all this. 
    I agree with your theory about what triggered this.

    But, paranoia?
    This is one of those areas where the choice is between bad options. 
    Pretty much everybody agrees that child pornography is bad and must be contained or eliminated.  But then you get to "How?" and "Who?"

    If this were China, the government would simply step in and squash it.   But here in the U.S. we ironically fear our government more than we do child pornographers -- so that is not an option.   That leaves private industry to police things like child pornography and disinformation.   But that introduces a whole new set of concerns and questions.   Most would agree that the third option (Do Nothing) is unacceptable.   So, we seem to be back to the first two -- neither of which is a good option -- just a necessary one.

    Somehow, over the past couple of decades, America has shifted to demanding simple, easy, black-and-white solutions.  While that's a good goal, it's not always possible or advisable.

  • Reply 30 of 44
    nizzard said:
    I’m ok with this.   What is NOT ok is them backdooring Messages (to start).
    People need to stop completely misunderstanding the situation and spreading misinformation. iMessage is NOT backdoored; neither Apple nor the parent ever learns the actual contents of the message (the parent would have to physically take the child's device to see it). The parent simply receives a notification saying "your child viewed an explicit photo," and only if the child is under 13 and the parent opted in to receive such notifications. Otherwise, for children aged 13-17, it remains a completely on-device content scan that warns only the child and nobody else.
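    For what it's worth, the opt-in rules described above reduce to a few lines of logic. Here's a minimal sketch in Swift; every name in it is hypothetical (this is not Apple's API), it just encodes the behavior as described:

    enum SafetyAction {
        case warnChildAndNotifyParent  // under 13 with parental opt-in: parent gets a notification only
        case warnChildOnly             // ages 13-17: on-device warning, nobody else is told
        case none                      // otherwise the feature does not apply
    }

    func actionForExplicitImage(childAge: Int, parentOptedIn: Bool) -> SafetyAction {
        switch childAge {
        case ..<13 where parentOptedIn:
            return .warnChildAndNotifyParent  // the image itself never leaves the device
        case ..<18:
            return .warnChildOnly
        default:
            return .none
        }
    }

    print(actionForExplicitImage(childAge: 12, parentOptedIn: true))   // warnChildAndNotifyParent
    print(actionForExplicitImage(childAge: 15, parentOptedIn: true))   // warnChildOnly

    The point being: the only thing that can ever leave the device is that yes/no notification, never the message or the photo itself.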
  • Reply 31 of 44
    rcfa Posts: 1,124 member
    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    Or they have been unofficially testing it already on other OS releases…
  • Reply 32 of 44
    rcfa Posts: 1,124 member
    crowley said:
    entropys said:
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
    Google, Microsoft, et al. already have this…there just wasn’t any butthurt or whining about it.

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    So? Isn’t a reason to be an Apple user that they are not like those companies? Why, I believe Apple even markets itself as thinking different.
    I don't believe "Think Different, shelter child abusers" is what they had in mind.
    Privacy is topic- and value-agnostic, for technical AND ethical reasons.

    Either you have privacy, or you don’t.

    The LAW is NO STANDARD for ETHICS.
    Under communism, being a capitalist was a capital offense.
    Under the Nazi regime, being a Jew was good enough for a death sentence.
    Under certain African countries’ laws, there’s a death penalty for gay sex.

    Exactly how would you want privacy that is both ethical and yet allows governments to catch what they consider criminal, while not restricting Apple’s business to a select few Western countries?

    There’s only one stance: It is not Apple’s problem to fix. Apple is a tool maker, and modern computing devices are in essence brain prosthetics. Apple’s duty is to protect users’ privacy, not to catch bad guys.

    A gun manufacturer should make the most deadly guns; it’s not incumbent on them to decide who may or may not be shot.
    A maker of pen and paper must make sure the pen writes well and the ink is lightfast and smudge-proof; it’s not their task to prevent dubious texts from being written.

    And makers of printers should never have been allowed to print nearly invisible markings on output that allow tracing the origins of printed pages. (Same goes for document formats… which is why critical things should only be done in plain-ASCII, human-readable files without “ASCII-encoded BLOBs”.)

    Under the utterly wrong interpretation that it’s the tech industry’s task to make law enforcement’s job easier, privacy is being eroded small step by small step until it’s gone.

    Time to switch to GrapheneOS…
  • Reply 33 of 44
    chadbag said:
    AI said: “The system does not scan actual images on a user's iCloud account”

    Pray tell, how do they get a hash for the image without scanning (reading) it?

    This article explains what a hash is, in a relatively non-technical way. 
     
    https://newtech.law/en/the-hash-a-computer-files-digital-fingerprint/

    In short, the sequence of ones and zeros that makes up any file (in this case, an image file) is run through a standard hashing algorithm that produces a unique code, or fingerprint, for that file. Change one bit in the file, and the fingerprint changes.

    The organizations that monitor child pornography images maintain a growing collection of hashes, or fingerprints, of those images.  When files are uploaded to iCloud, Apple can generate hashes for them and compare those hashes against the known list, looking for matches.
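    To make that concrete, here is a minimal sketch in Swift using an ordinary SHA-256 hash. (Caveat: Apple's actual system uses NeuralHash, a perceptual hash that survives resizing and re-encoding; this only illustrates the fingerprint-and-compare idea.)

    import CryptoKit
    import Foundation

    // A hex "fingerprint" of a file's bytes. Change one bit of the input
    // and the output changes completely.
    func fingerprint(of data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // The monitoring organizations supply the list of known-image hashes.
    // (Left empty here; purely illustrative.)
    let knownHashes: Set<String> = []

    let file = Data("example image bytes".utf8)
    let hash = fingerprint(of: file)
    print(hash)                        // the file's fingerprint
    print(knownHashes.contains(hash))  // matching is a simple set lookup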

    Hope this helps. 
    edited August 2021
  • Reply 34 of 44
    GeorgeBMac Posts: 11,421 member
    rcfa said:
    crowley said:
    entropys said:
    narwhal said:
    I imagine this will inspire people who have those types of images to either delete them all or move to Android. And I assume Google will implement the same checks at some point so they don't become known as that type of platform. (Maybe they already are.) I can't really see this tech as being helpful to China, Russia, or other dictatorships since people tend not to store self-incriminating photos on their devices. Wait, China might want to find images of Tank Man or Tiananmen Square, since those things never happened. Well, never mind.
    Google, Microsoft, et al. already have this…there just wasn’t any butthurt or whining about it.

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    So? Isn’t a reason to be an Apple user that they are not like those companies? Why, I believe Apple even markets itself as thinking different.
    I don't believe "Think Different, shelter child abusers" is what they had in mind.
    Privacy is topic- and value-agnostic, for technical AND ethical reasons.

    Either you have privacy, or you don’t.

    The LAW is NO STANDARD for ETHICS.
    Under communism, being a capitalist was a capital offense.
    Under the Nazi regime, being a Jew was good enough for a death sentence.
    Under certain African countries’ laws, there’s a death penalty for gay sex.

    Exactly how would you want privacy that is both ethical and yet allows governments to catch what they consider criminal, while not restricting Apple’s business to a select few Western countries?

    There’s only one stance: It is not Apple’s problem to fix. Apple is a tool maker, and modern computing devices are in essence brain prosthetics. Apple’s duty is to protect users’ privacy, not to catch bad guys.

    A gun manufacturer should make the most deadly guns; it’s not incumbent on them to decide who may or may not be shot.
    A maker of pen and paper must make sure the pen writes well and the ink is lightfast and smudge-proof; it’s not their task to prevent dubious texts from being written.

    And makers of printers should never have been allowed to print nearly invisible markings on output that allow tracing the origins of printed pages. (Same goes for document formats… which is why critical things should only be done in plain-ASCII, human-readable files without “ASCII-encoded BLOBs”.)

    Under the utterly wrong interpretation that it’s the tech industry’s task to make law enforcement’s job easier, privacy is being eroded small step by small step until it’s gone.

    Time to switch to GrapheneOS…

    So, if I read that correctly, you are saying that neither government nor private industry should restrict child pornography or anything else that society deems abhorrent.

    As far as I can see, that leaves anarchy -- with anybody free to do what they want.  Through the ages, that has generally been rejected.
  • Reply 35 of 44
    crowley Posts: 10,453 member
    rcfa said:

    Apple’s duty is to protect users’ privacy, not to catch bad guys.
    Nope, Apple's duty is Apple's business; they can choose to make it whatever they want, as long as it's within the law.  They appear to have decided that rooting out child abuse is part of that duty, and I think that's a pretty noble goal.

    The rest of your post seems to be base sociopathy; everything is someone else's problem.
    edited August 2021
  • Reply 36 of 44
    fastasleep Posts: 6,420 member
    bluefire1 said:
    Apple constantly boasts about user privacy and then acknowledges access to one’s iCloud pictures? There shouldn’t be any back door. Period. Either you believe in privacy or you don’t. 
    Apple (and every other online data storage service) has always been able to provide your iCloud data and the decryption keys to the authorities when compelled to do so by the courts. That’s absolutely nothing new. 
  • Reply 37 of 44
    Asprosin
    No, Apple has not been scanning photos for CSAM server-side. I thought the TechCrunch interview would’ve put that little morsel of misinformation to rest, but I guess it just can’t die. If you read the actual federal complaint on PACER, you'll see that law enforcement was first alerted to Mollick’s alleged possession of CSAM by a report from Kik, triggered by PhotoDNA. They then served a subpoena on Apple for subscriber information to confirm he was the same user who uploaded CSAM to Kik. They then served a search warrant on Apple, where they found 2,000 images in a library of 80,000 files.

  • Reply 38 of 44
    crowley Posts: 10,453 member
    bluefire1 said:
    Apple constantly boasts about user privacy and then acknowledges access to one’s iCloud pictures? There shouldn’t be any back door. Period. Either you believe in privacy or you don’t. 
    I’d love to see you refuse a court-issued warrant. See how far it gets you.
  • Reply 39 of 44
    fastasleep Posts: 6,420 member
    Asprosin said:
    No, Apple has not been scanning photos for CSAM server-side. I thought the TechCrunch interview would’ve put that little morsel of misinformation to rest, but I guess it just can’t die. If you read the actual federal complaint on PACER, you'll see that law enforcement was first alerted to Mollick’s alleged possession of CSAM by a report from Kik, triggered by PhotoDNA. They then served a subpoena on Apple for subscriber information to confirm he was the same user who uploaded CSAM to Kik. They then served a search warrant on Apple, where they found 2,000 images in a library of 80,000 files.
    Thanks for confirming. 
  • Reply 40 of 44
    sflocal said:

    Where did Apple say they were installing “spyware” on iPhones?  They are scanning iCloud photos in their datacenters, but nothing gets loaded on the phone itself.

    Why do people continue pushing false stories like this?
    Apple's CSAM initiative is designed to create hashes of photos as they are uploaded to iCloud and compare them against the National Center for Missing and Exploited Children's database of known CSAM hashes.

    That way, they could still encrypt the image files on iCloud.
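    As a rough sketch of that flow (my own illustration in Swift, with made-up names; Apple's real design uses a perceptual hash and a blinded private-set-intersection match rather than a plaintext list like this):

    import CryptoKit
    import Foundation

    struct UploadRecord {
        let payload: Data          // stand-in for the client-encrypted image
        let matchesKnownHash: Bool // in the real design this result is cryptographically blinded
    }

    // Hash the photo on-device at upload time, so the server can learn whether
    // it matches a known hash without ever needing the decrypted photo.
    func prepareForUpload(photo: Data, knownHashes: Set<SHA256.Digest>) -> UploadRecord {
        let digest = SHA256.hash(data: photo)
        // A real client would encrypt `photo` with a device-held key here.
        return UploadRecord(payload: photo, matchesKnownHash: knownHashes.contains(digest))
    }

    The key property is that only the match result needs to be visible server-side; the photo itself can stay encrypted.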