tylersdad
About
- Username
- tylersdad
- Joined
- Visits
- 58
- Last Active
- Roles
- member
- Points
- 2,020
- Badges
- 2
- Posts
- 310
Reactions
Internal Apple memo addresses public concern over new child protection features
tylersdad said: It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
Wesley Hilliard said: Where does it end? It already has ended. The tool exists, it took years to develop, and it is rolling out this fall. There is no "next."
tylersdad said: Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology.
Mike Wuerthele said: The iCloud pictures are mathematically hashed and then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has the database hash. There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
tylersdad said: They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash.
Mike Wuerthele said: I'm not entirely sure what your argument is. Seeing the 0's and 1's in a file without knowing how to decode it isn't the same as knowing what the file is.
Today they're only looking at 1's and 0's. Tomorrow? Who knows...
The bottom line is that the code MUST open the file. The code MUST read the contents of the file.
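Both sides of this exchange can be illustrated with a toy sketch. Note the simplification: SHA-256 here is a stand-in, since Apple's system actually uses a perceptual hash (NeuralHash) rather than a cryptographic digest. The point survives either way: the hasher reads raw bytes and never decodes the file as an image.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Read raw bytes and return a digest; the file is never decoded as an image."""
    return hashlib.sha256(data).hexdigest()

# Any byte stream hashes equally well -- an image, a Word file, anything.
jpeg_bytes = bytes([0xFF, 0xD8, 0xFF, 0xE0]) + b"...pixel data..."
docx_bytes = b"PK\x03\x04" + b"...document data..."

print(fingerprint(jpeg_bytes))
print(fingerprint(docx_bytes))

# The same input always yields the same digest, which is all that
# database comparison needs -- no source pictures required.
assert fingerprint(jpeg_bytes) == fingerprint(jpeg_bytes)
```

So both posters are describing the same mechanism: the code must read every byte of the file (tylersdad's point), yet what it computes carries no understanding of what the file depicts (Wuerthele's point).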
Internal Apple memo addresses public concern over new child protection features
leehericks said: Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures.
To explain this quickly: this "scanning" applies only (for now) to photos before they are uploaded to iCloud Photos. If you don't use iCloud Photos, then this will never happen (for now).
Your device scans your photos and calculates a photo "fingerprint" ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. If by some chance you had a match, it would put a kind of safety token with the media while uploading it to iCloud Photos. If you reach some number of safety tokens, someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).
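The flow described above (fingerprint locally, compare against a downloaded database, attach a safety token per match, act only past a threshold) can be sketched roughly as follows. Everything here is a simplification invented for illustration: `device_fingerprint`, the SHA-256 stand-in, and the threshold value are all assumptions, since the real system uses NeuralHash plus cryptographic threshold secret sharing so that sub-threshold matches reveal nothing to Apple.

```python
import hashlib

MATCH_THRESHOLD = 3  # illustrative only; not the real (unpublished) threshold

def device_fingerprint(photo: bytes) -> str:
    # Stand-in for NeuralHash: the real system uses a perceptual hash so that
    # near-identical images match; a plain digest suffices for this sketch.
    return hashlib.sha256(photo).hexdigest()

def scan_before_upload(photos: list[bytes], bad_fingerprints: set[str]) -> int:
    """Compare each photo's on-device fingerprint against the downloaded
    database and count matches (each match would carry a safety token)."""
    tokens = sum(1 for p in photos if device_fingerprint(p) in bad_fingerprints)
    if tokens >= MATCH_THRESHOLD:
        # Only past the threshold would human review be triggered.
        print("threshold reached: flag for review")
    return tokens
```

The design choice the poster highlights is that both steps, fingerprinting and comparison, run on the device; the server only ever learns something once the token count crosses the threshold.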
The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.