OutdoorAppDeveloper
About
- Banned
- Username: OutdoorAppDeveloper
- Joined
- Visits: 86
- Last Active
- Roles: member
- Points: 1,898
- Badges: 1
- Posts: 1,293
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
Unintended consequences are guaranteed if Apple starts scanning all photos in iCloud. Here are a few possibilities:
- A trillion sounds like a big number, but modern GPUs perform trillions of operations a second. That means someone with the CSAM database of photos could create false photos that fool the pattern-recognition system. They could even modify your existing photos so they pass Apple's threshold and get looked at by humans (a toy sketch of why follows this list).
- There are a very large number of iOS and macOS apps that have access to your photos. We know this because these apps have to ask for permission to access them. They can view, modify, and save photos. We also know that there are a lot of very sketchy apps on the App Store. What would prevent an app from modifying your photos to trigger the CSAM pattern recognition, or worse, from actually uploading CSAM photos to your photo library, which would automatically be copied to iCloud?
- The humans reviewing flagged photos will be exposed to those photos. We know from workers at Facebook that they suffer post-traumatic stress disorder from their constant exposure to hateful content. How will Apple prevent the same thing from happening to its workers? We know that Apple treats its contractors very poorly compared to its full-time employees.
- Why did Apple state repeatedly that people deserve privacy and then create a program that is the opposite of privacy? Your photos are scanned by an algorithm no one outside of Apple has checked for errors. Humans will review the photos that the algorithm flags as illegal. The entire contents of your iCloud storage will be handed to law enforcement if those humans decide the material is illegal. What is legal and illegal varies from country to country, and Apple is an international company operating all over the planet. If you travel to a different country, or your iCloud photos are stored on a server in another country, what stops that country from arresting you on arrival because you had photos of people hugging, or of women with uncovered faces driving cars?
- Why did Apple return to 1984?
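To make the first point above concrete, here is a minimal sketch of perceptual ("fuzzy") image fingerprinting. It is a classic difference hash over a toy grayscale grid, not Apple's NeuralHash, and every value in it is made up. The point is only that a matcher tolerant enough to catch crops and re-encodes of the same picture is also a matcher an adversary can probe by nudging pixels until a fingerprint crosses the match boundary.

```python
# Toy illustration of perceptual ("fuzzy") image fingerprinting.
# This is NOT Apple's NeuralHash; it is a simple difference hash (dHash)
# over a tiny grayscale grid, just to show why near-duplicate matching
# is sensitive to small pixel edits.

import random

def dhash(pixels):
    """Compute a difference hash from a grid of grayscale values.

    Each row of `pixels` is one pixel wider than the number of hash bits
    per row; each bit records whether brightness rises left-to-right.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if right > left else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# A made-up 8x9 "image" (grayscale values 0-255) standing in for a real photo.
random.seed(1)
original = [[random.randrange(256) for _ in range(9)] for _ in range(8)]

# A lightly edited copy: tiny brightness tweaks, as a filter app might apply.
edited = [[max(0, min(255, v + random.randrange(-3, 4))) for v in row] for row in original]

h_orig, h_edit = dhash(original), dhash(edited)
print(f"bits differing after a subtle edit: {hamming(h_orig, h_edit)} of {len(h_orig)}")
# A system that treats "a few differing bits" as a match (so resized and
# re-encoded copies still hit) is exactly the kind of system that can be
# pushed toward a false match by deliberate, barely visible perturbations.
```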
Apple expanding child safety features across iMessage, Siri, iCloud Photos
In answer to one of my own questions: Apple does not have the database of CSAM images. They only have the pattern-match database (similar to checksums). However, this just makes the situation worse. Apple is going to do pattern matching on your images without your permission, and they have no way to know whether it is working or not. If the pattern-match count passes a threshold, they will turn this information over to the police. The police will get a search warrant, and then Apple will give all your data (the pictures, movies, files, notes, email, everything) to the police.

Let's say there is a little bug somewhere, or that the threshold software triggers on very large collections of pictures of various sorts (but legal ones). Now you have the police desperately searching all your data for any justification for having issued the search warrant in the first place (not to mention justifying the whole program). What gives you any confidence that the police will act justly and do the right thing? Apple would go into cover-their-ass mode, claim none of this was their fault, and you would be hung out to dry with your reputation ruined.

Let's not go down this road. We were buying Apple products to avoid situations like this. Also, I bet hardly anyone actually uses iCloud to store illegal photos. That just makes no sense. I would rather Apple just turn off iCloud photo storage for everyone.
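For what it's worth, here is a minimal sketch of that threshold mechanism as described above: fingerprint matches are counted, and an account is only escalated for human review once the count passes some threshold. The database entries, the threshold value, and the function names below are placeholders, not Apple's implementation.

```python
# Sketch of threshold-based flagging as described above: individual
# fingerprint matches are counted, and an account is only escalated for
# human review once the count passes a threshold. Every value here is a
# placeholder, not Apple's implementation.

KNOWN_FINGERPRINT_DATABASE = {"fp_known_001", "fp_known_002", "fp_known_003"}
MATCH_THRESHOLD = 30   # illustrative number only

def count_matches(photo_fingerprints):
    """Count how many of an account's photo fingerprints hit the database."""
    return sum(1 for fp in photo_fingerprints if fp in KNOWN_FINGERPRINT_DATABASE)

def should_escalate_for_review(photo_fingerprints):
    """True only once the match count reaches the threshold.

    Everything worried about above happens downstream of a True here:
    human reviewers looking at the photos and a referral to law
    enforcement, including for any false matches that made it this far.
    """
    return count_matches(photo_fingerprints) >= MATCH_THRESHOLD

# Example: an account with two colliding fingerprints stays below the threshold.
print(should_escalate_for_review({"fp_known_001", "fp_known_002", "fp_cat_photo"}))  # False
```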
And one more thing. You know all those apps that ask for permission to access your photos? What if one of them had a backdoor that allowed someone to upload photos to your iCloud library without you knowing about it? You know how many crap apps there are on the App Store that Apple lets through all the time. This would not even be a hard thing to implement (a lot easier than building the kind of mass photo pattern matching that Apple is promising will fail only one in a million times, even though they have no way to test it without hosting a lot of illegal photos).
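As a back-of-the-envelope check on that "one in a million" figure, taken at face value as a per-photo false-match rate and paired with an upload volume I am assuming purely for illustration:

```python
# Back-of-the-envelope check on the "one in a million" figure above, taken
# at face value as a per-photo false-match rate. The upload volume is an
# assumption picked purely for illustration, not a real iCloud statistic.

per_photo_false_match_rate = 1 / 1_000_000      # the figure quoted above
assumed_photos_per_day = 1_000_000_000          # assumption: 1 billion photos/day

expected_false_matches_per_day = per_photo_false_match_rate * assumed_photos_per_day
print(f"expected false matches per day: {expected_false_matches_per_day:,.0f}")  # 1,000
# A per-photo error rate by itself says little; what matters is how many
# false matches can pile up on a single account before the review threshold
# is reached, and how that rate was validated in the first place.
```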
Apple expanding child safety features across iMessage, Siri, iCloud Photos
Mike Wuerthele said: OutdoorAppDeveloper said: Mike Wuerthele said: OutdoorAppDeveloper said: This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as lifelong criminals is a good reason to stop and think carefully about a plan like this.
Wait and see. If there are colossal fuck-ups, they'll be found out.
Or we could simply not give the kids matches. By the time they are "found out" it will be too late for some people.
One revelation here is that none of your data is really secure at Apple. How can they scan your photos if they are securely encrypted? They shouldn't even be able to tell they are photos. It should just look like a lot of random data. The first thing I recommend doing is to delete all your data in iCloud.
Courts and investigators don't use the hashes to prove anything. Subpoenas will still be required for the data itself, and then, another set of eyes.
If you want to delete your iCloud data based on pie-in-the-sky "mights" and "maybes" and your own interpretation (big edit here: what I said wasn't fair, and I apologize), I'm certainly not going to try to stop you.
In your specious "your kid" argument, which you didn't mention in your previous post and added to your response while I was composing mine, you're missing the subpoena, discovery, and legal steps that sit between finding the images and registration as a sex offender. And unless said pictures are already in the database (which Apple isn't building; it is being provided by NCMEC), they won't get flagged.
That's what's getting missed here. That's what nearly every venue other than the tech-centric ones is missing: the database checksums being built are of known CSAM images. Not random ones, not vast expanses of skin. Known images, being circulated, and flagged as CSAM by NCMEC.
There's no privacy violation here. Nobody is looking at your pictures. Nobody is going to see or know because of this that you have 20,000 pictures of a Bud can sweating on the bow of your fishing boat.
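To put that distinction in code form: flagging is a membership check against fingerprints of specific, already-catalogued images supplied by NCMEC, not an attempt to classify what a photo shows. A minimal sketch, with placeholder values, noting that the real system uses perceptual fingerprints rather than plain file checksums:

```python
# The distinction being drawn above, in code form: flagging is a lookup
# against fingerprints of specific, already-catalogued images supplied by
# NCMEC, not an attempt to classify what a photo depicts. Values here are
# placeholders; the real system uses perceptual fingerprints, not plain
# file checksums.

ncmec_supplied_fingerprints = {"fp_known_0001", "fp_known_0002"}

def is_flagged(photo_fingerprint: str) -> bool:
    """A photo is flagged only if its fingerprint is already on the list."""
    return photo_fingerprint in ncmec_supplied_fingerprints

# 20,000 pictures of a sweating Bud can produce fingerprints that simply
# are not on the list, so none of them get flagged.
print(is_flagged("fp_bud_can_on_fishing_boat"))   # False
```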
How do people opt out of this program?
Will they tell users when their photos have been flagged and viewed by humans?
Will we get to see the score our own photos got?
Can we see the source code to make sure it was implemented correctly?
Where is this database of known CSAM images? How would Apple possess such a thing legally?
So many unanswered questions. I guess we have to wait for the S to hit the F.
Apple expanding child safety features across iMessage, Siri, iCloud Photos
Mike Wuerthele said: OutdoorAppDeveloper said: This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as lifelong criminals is a good reason to stop and think carefully about a plan like this.
Wait and see. If there are colossal fuck-ups, they'll be found out.
Or we could simply not give the kids matches. By the time they are "found out" it will be too late for some people.
One revelation here is that none of your data is really secure at Apple. How can they scan your photos if they are securely encrypted? They shouldn't even be able to tell they are photos. It should just look like a lot of random data. The first thing I recommend doing is to delete all your data in iCloud.
I notice you didn't mention my final point. Imagine it was your kid. The kid takes photos of themselves they should not have and sends them to a friend, or, alternatively, receives them from a friend the same age. Apple's AI fingerprints the pictures as CSAM and they get turned over to the police. Suddenly your kid has to register as a sex offender for the rest of their life. What could have been dealt with quietly by their parents becomes a public legal nightmare. Thanks, Apple! That's why we buy our kids iPhones, right? How are those kids getting protected?
Apple expanding child safety features across iMessage, Siri, iCloud Photos
This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as lifelong criminals is a good reason to stop and think carefully about a plan like this.