San Francisco doctor charged with possessing child pornography in iCloud


Comments

  • Reply 41 of 44
    rbnetengr said:
    chadbag said:
    AI said: “ The system does not scan actual images on a user's iCloud account”

    pray tell, how do they get a hash for the image without scanning (reading) it?

    This article explains what a hash is, in a relatively non-technical way. 
     
    https://newtech.law/en/the-hash-a-computer-files-digital-fingerprint/

    So the sequence of ones and zeros that makes up any file, in this case an image file, is run through a standard hashing algorithm that produces a unique code, or fingerprint, for that file. Change one bit in the file, and the fingerprint will change. 

    The organizations that monitor child pornography images maintain a growing collection of hashes, or fingerprints, of those images. So when files are uploaded to iCloud, Apple can generate hashes for them and compare those hashes against the known list to look for matches. 
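    In code, that matching step might look roughly like this. It is just an illustrative sketch using an ordinary SHA-256 file hash and a made-up set of known fingerprints; the real databases use specialized image hashes, not this exact scheme.

        import hashlib

        def file_hash(path):
            # Read the file in chunks and return its SHA-256 fingerprint as hex.
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        # Hypothetical set of known fingerprints (placeholder values).
        known_hashes = {"d2a84f4b8b65093...", "9b74c9897bac770..."}

        def is_known(path):
            # A file matches only if its fingerprint is identical to a known one.
            return file_hash(path) in known_hashes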

    Hope this helps. 
    If changing one bit of a file changes the hash, then it seems the system could be pretty easily thwarted by manipulating the image somehow without actually changing what it displays (i.e., resizing the image, adding some sort of visual watermark, etc.). Any change to the original file changes the hash, so now it's not a match?
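    With an ordinary cryptographic hash, that is exactly what happens: flip a single bit and the digest bears no resemblance to the original. A plain SHA-256 sketch (not whatever hash Apple actually uses) shows the effect:

        import hashlib

        original = b"...the exact bytes of the original image..."
        tweaked = bytearray(original)
        tweaked[0] ^= 0x01  # flip a single bit

        print(hashlib.sha256(original).hexdigest())
        print(hashlib.sha256(bytes(tweaked)).hexdigest())
        # The two digests are completely different, so an exact-match lookup fails.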
  • Reply 42 of 44
    2,000 images! This guy needed to get caught, and with that much material he was going to be. That is a problem beyond "mistaken" photo detection.

    Of course, this is most likely a promotion piece for Apple's CSAM detection, given the magnitude of the offense and the public's obvious endorsement. 
  • Reply 43 of 44
    fastasleep, Posts: 6,420, member
    ITGUYINSD said:
    If changing one bit of a file changes the hash, then it seems the system could be pretty easily thwarted by manipulating the image somehow without actually changing what it displays (i.e., resizing the image, adding some sort of visual watermark, etc.). Any change to the original file changes the hash, so now it's not a match?
    Apple’s system is allegedly designed to withstand certain kinds of file manipulation. 
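    For a rough idea of how that can work, here is a toy "average hash": a perceptual hash compared by Hamming distance rather than exact equality, so small edits only move the result by a few bits. This is only an illustrative sketch (it assumes the Pillow imaging library and example file names), not Apple's actual NeuralHash.

        from PIL import Image  # Pillow imaging library

        def average_hash(path, size=8):
            # Shrink to a small grayscale thumbnail, then record one bit per pixel:
            # 1 if the pixel is brighter than the thumbnail's mean, else 0.
            img = Image.open(path).convert("L").resize((size, size))
            pixels = list(img.getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits

        def hamming(a, b):
            # Count how many bits differ between two hashes.
            return bin(a ^ b).count("1")

        # A resized or lightly watermarked copy usually lands within a few bits
        # of the original, so a threshold test can still flag it, e.g.:
        # hamming(average_hash("original.jpg"), average_hash("resized.jpg")) <= 5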