tylersdad
About
- Username: tylersdad
- Joined
- Visits: 58
- Last Active
- Roles: member
- Points: 2,020
- Badges: 2
- Posts: 310
Reactions
Internal Apple memo addresses public concern over new child protection features
aderutter said: Maybe https://inhope.org/EN/articles/what-is-image-hashing will help the non-techies?

Note that image hashing is not reversible, so one cannot use an image hash to create another image that matches, nor can it be used to modify an existing image so that it matches the original image.

Also, nobody is forced into using iCloud. I for one assumed this kind of system had already been in place for years!

Yes, nobody is forcing people to use iCloud, but this means I have to choose between having my privacy violated or not using the features of iCloud that I'm paying for.
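A minimal sketch of why a hash can't be run backwards, using Python's standard hashlib with an ordinary SHA-256 digest as a stand-in for the image-hashing schemes the linked article describes; Apple's announced system actually uses a perceptual hash (NeuralHash), and the file name below is made up for illustration:

import hashlib

def hash_file(path: str) -> str:
    # Read the file's raw bytes and return a fixed-length fingerprint.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

digest = hash_file("holiday_photo.jpg")  # hypothetical file name
print(digest)  # 64 hex characters that look nothing like the photo

# The digest is one-way: there is no function that turns this string back
# into the image, and changing even a single byte of the file yields a
# completely different digest.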
Internal Apple memo addresses public concern over new child protection features
tylersdad said: It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?

Wesley Hilliard said: Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."

tylersdad said: Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology.

Mike Wuerthele said: The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has the database hash. There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.

tylersdad said: They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file... which they MUST read to generate the hash.

Mike Wuerthele said: I'm not entirely sure what your argument is. Seeing the 0's and 1's in a file without knowing how to decode it isn't the same as knowing what the file is.

tylersdad said: Today they're only looking at 1's and 0's. Tomorrow? Who knows... The bottom line is that the code MUST open the file. The code MUST read the contents of the file.

Would you consent to law enforcement searching your phone without a warrant? Some random cop stops you on the street and wants to look at your pictures to see if you've broken the law. Would you consent? Probably not. I know I wouldn't.

What Apple is doing isn't much different, except that they aren't looking at the physical representation of the file.

A rose by any other name is still a rose. This is a massive violation of privacy that has the potential to go very wrong in the wrong hands.
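A rough sketch of the kind of on-device matching being debated in the exchange above: the code reads the file's bytes, computes a fingerprint, and checks that fingerprint against a database of known values. This is deliberately simplified and hypothetical; Apple's announced system uses a perceptual NeuralHash and blinded, encrypted matching rather than a plain SHA-256 set lookup, and the names and paths below are invented for illustration.

import hashlib

# Placeholder for the database of known fingerprints. In the announced
# system this is derived from NCMEC-provided hashes and is not stored as
# a plain set of hex strings on the device.
KNOWN_HASHES: set[str] = set()

def fingerprint(path: str) -> str:
    # To produce any fingerprint at all, the code must read the file's
    # bytes: this is the step the exchange above is arguing about.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def matches_known(path: str) -> bool:
    # Only fingerprints are compared; the comparison never interprets the
    # image, and a non-image file (e.g. a Word document) hashes just as well.
    return fingerprint(path) in KNOWN_HASHES

print(matches_known("some_photo.jpg"))  # hypothetical path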