Mike Wuerthele
About
- Username: Mike Wuerthele
- Joined:
- Visits: 178
- Last Active:
- Roles: administrator
- Points: 23,992
- Badges: 3
- Posts: 7,272
Reactions
Internal Apple memo addresses public concern over new child protection features
tylersdad said: It starts with examining personal pictures, ostensibly to prevent child exploitation, but where does it lead? Where does it end?

Wesley Hilliard said: Where does it end? It already has ended. The tool exists, it took years to develop, and it is rolling out this fall. There is no "next."

tylersdad said: Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology.

Mike Wuerthele said: The iCloud pictures are mathematically hashed. The hashes are then compared against a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the database of hashes. There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.

tylersdad said: They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. You cannot generate a hash without some sort of input. In this case, the input is the 1s and 0s from the file, which they MUST read to generate the hash.

Mike Wuerthele said: I'm not entirely sure what your argument is. Seeing the 1s and 0s in a file without knowing how to decode them isn't the same as knowing what the file is.

tylersdad said: Today they're only looking at 1s and 0s. Tomorrow? Who knows... The bottom line is that the code MUST open the file. The code MUST read the contents of the file.
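For readers following the hashing back-and-forth, here is a minimal sketch in Python of what generic hash-matching looks like. It is illustrative only: it uses a plain SHA-256 over raw bytes and an in-memory set of known hashes, whereas Apple's system is described as using a perceptual hash (NeuralHash) and a blinded, on-device matching protocol. The file names and the example hash value are hypothetical. It does show the point both sides are circling: the hasher reads every byte of the file but never decodes or "understands" it, and a Word document hashes just as readily as a photo.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of known hashes. In the system under
# discussion this list comes from NCMEC; the value below is made up.
KNOWN_HASHES = {
    "9b74c9897bac770ffc029102a200c5de1bc4e570bccca999f14a3bb343bd2256",
}

def hash_file(path: Path) -> str:
    """Hash a file's raw bytes in chunks.

    The bytes are read but never decoded: a JPEG, a Word document, or
    random noise all produce a digest the same way.
    """
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_database(path: Path) -> bool:
    """Membership test: flag only if this exact hash is already in the set.

    The digest reveals nothing about the file's content unless a matching
    entry already exists in the database.
    """
    return hash_file(path) in KNOWN_HASHES

# Hypothetical usage: both files hash fine; content is never inspected.
for name in ("photo.jpg", "report.docx"):
    p = Path(name)
    if p.exists():
        print(f"{name}: {hash_file(p)} match={matches_database(p)}")
```

One caveat: a cryptographic hash like SHA-256 only matches bit-identical files, while perceptual hashes of the kind Apple described are designed to survive resizing and recompression, which is what makes them useful for this job and also what makes the false-positive odds in the next thread worth arguing about.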
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
elijahg said: Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts; it's 1 per 1 trillion photos. I have 20,000 photos, which brings the chance that I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.

Mike Wuerthele said: And even if it were, one in 50 million is still pretty spectacularly against.
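Spelling out the arithmetic behind both figures (assuming elijahg's per-photo reading of the rate; Apple's published number was framed per account per year, so that reading is itself an assumption):

```latex
% p = per-photo false-positive rate, n = photos in the library
p = 10^{-12}, \quad n = 20{,}000
P(\text{at least one flag}) = 1 - (1 - p)^{n} \approx np
  = 20{,}000 \times 10^{-12} = 2 \times 10^{-8} = \tfrac{1}{50{,}000{,}000}
```

Under that assumption both posts are internally consistent: a 20,000-photo library lands at 1 in 50 million, which is far worse than one in a trillion but still vanishingly unlikely.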
Apple expanding child safety features across iMessage, Siri, iCloud Photos
OutdoorAppDeveloper said: This is tremendously dangerous. Pattern matching can create false hits, triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?

Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos, and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as lifelong criminals is a good reason to stop and think carefully about a plan like this.

Mike Wuerthele said: Wait and see. If there are colossal fuck-ups, they'll be found out.