The sequence of ones and zeros that makes up any file (in this case, image files) is run through a standard hashing algorithm that produces a unique code, or fingerprint, for that file. Change one bit in the file, and the fingerprint will change.
The organizations that monitor child pornography images maintain a growing collection of hashes, or fingerprints, for these images. When files are uploaded to iCloud, Apple can generate hashes for them and compare those hashes against the known list to look for matches.
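The fingerprinting described above can be sketched with Python's standard hashlib. The image bytes here are a made-up placeholder, not real file contents; the point is that flipping a single bit produces a completely different digest:

```python
import hashlib

# Hypothetical stand-in for an image file's raw bytes.
original = b"\x89PNG example image data"

# Copy the bytes and flip one bit in the first byte.
modified = bytearray(original)
modified[0] ^= 0x01

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(modified)).hexdigest()

print(h1)
print(h2)
print(h1 == h2)  # False: one flipped bit yields an entirely different fingerprint
```

Matching then reduces to checking whether an uploaded file's digest appears in the known list of digests.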
Hope this helps.
If changing one bit of a file changes the hash, then it seems the system could be pretty easily thwarted by manipulating the image somehow without actually changing what the image displays (i.e., resizing the image, adding some sort of visual watermark, etc.). Any change to the original file changes the hash, so it would no longer be a match?
Apple’s system reportedly uses a perceptual hash (its NeuralHash algorithm) rather than a plain cryptographic hash, so it is designed to tolerate certain kinds of file manipulation, such as resizing or minor edits.
Of course, this is most likely a piece promoting Apple's CSAM scanning, given the seriousness of the offense and the public's obvious support for fighting it.