To hell with all the slippery slope nonsense, child abuse is a global pandemic causing misery far worse than COVID-19. This is a fantastic move from Apple.
It's terrifying that Apple is allowing the government to bypass legal restrictions that would have made this type of search unlawful. I am not defending criminals or pedos, but I strongly object to the government having unlimited insight into the photos on personal devices. A list of hashes amounts to exactly that, and that list could be expanded to anything the government would like, since I strongly suspect the government is providing the hashes to Apple in the first place.
Yeah, I can see issues with:
1) Ironically, there's a headline right now: "iTunes Match is not working for a growing number of users." Clearly, matching content hasn't been their strong suit so far, so...
2) Creepy AF for them to start scanning the files on your own device. If they scan content on iCloud, that's kind of par for the course; Dropbox, Google Drive, etc. all do the same, whether it's disturbing or not. (DMCA takedowns of a video you ripped from a DVD at some point, music, etc. - but at least that's something you pushed to an effectively public place.)
3) Definitely a slippery slope. Today it's matches from a CSAM database; tomorrow, is it a PDF of a blank vax card in the database? How about if that card was used to fake a child's vaccination? Think of the children? How about when it's the CCP adding images of the Tiananmen massacre to find dissidents? All it will take is an extra hash or two, right? And with the tech in place, there's no chance it doesn't get used.
4) Is it really Apple's place to scan the content on your computers? Really?
I'm assuming this is the "privacy workaround" they've done to dodge the other encryption-breaking measures the government is pushing for, but it's only a matter of time before this becomes a scan of all files, whether they're mirrored to iCloud or not, and the types of content are bound to expand. A bit of general image recognition and it won't need to match that database at all: just flag whatever gets categorized as problematic (CP, drugs, guns, political content) and upload it to the government for review. Imagine how great this is for "the war of terror", as Borat put it. None of these tools has ever become less invasive over time.
Apple is only scanning images in iCloud. And only comparing the hashes to known images in the CSAM database. So your kid taking a bath won’t get flagged.
Literally not the case: they're running the scan on your own device. Right now it only happens just before uploading to iCloud, and yes, today it should only flag matches to whatever hashes are in the database.
Other than that, you ignored every concern. Why not run recognition on your camera and audio feeds to look for these crimes proactively? You're not going to commit a crime, are you? Honestly, if we just had cameras throughout our homes (why hello, Ring!), we could save everyone eventually.
Having the company that sold you your phone run scans of your local files is seriously creepy, and it just won't stop there. Putting the tech on the phone is total overreach and opens a huge can of worms.
This seems like a bad idea to me for many reasons.
As others have said, who controls the database of image hashes? If it's the government, they can easily add a photo to generate trumped-up charges against someone. Maybe not a problem in a first-world country, but when this expands to countries with less trustworthy governments, it's an easy way to get someone they don't like put away. And once the infrastructure is there, any government can decide which photos are objectionable - which could end up being quite literally anything or anyone they choose. Apple would be handing the governments of Russia, Venezuela, China, etc. a list of people who possess images of a particular person or anti-government event in their library. All of a sudden these innocent people have been exposed.
How is having a particular photo in someone's library proof that they are a child molester, or someone who wants such imagery? What if someone sends them an abhorrent picture over AirDrop, as has happened before, and they don't realise they've hit accept? The photos could be in their library without them even knowing - and then they get home to police waiting. Even in court, that person would somehow have to prove the picture wasn't wanted. How would they do that? How can someone prove that photos arrived without their knowledge? "I don't know how that picture got in my library" isn't going to wash with many juries.
What if there's another "celebgate"-style phishing attack on someone, and a photo is planted in their library without their knowledge? It only has to be dated a few years back and it would likely be buried among other rarely viewed photos. How does someone defend against that?
The algorithm must do some kind of fuzzy matching, or else a one-pixel tweak to an offending image would fool it - which means false positives are more likely.
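A rough sketch of that point, using a toy brightness-based hash compared by Hamming distance (an illustration only, nothing to do with Apple's actual, unpublished matching algorithm): an exact cryptographic hash is defeated by a one-pixel tweak, while a fuzzy hash survives it, and that same fuzziness is what lets an unrelated image collide.

```python
# Toy illustration, not Apple's NeuralHash: why exact hashes can't work here,
# and why the fuzzy matching that replaces them can also hit innocent images.
import hashlib

# Pretend these byte strings are two copies of the same 8-pixel image,
# differing by a single pixel value.
image_a = bytes([10, 200, 30, 40, 50, 60, 70, 80])
image_b = bytes([10, 200, 30, 40, 50, 61, 70, 80])   # one pixel nudged

# A cryptographic hash changes completely, so a one-pixel tweak defeats it.
print(hashlib.sha256(image_a).hexdigest()[:16])
print(hashlib.sha256(image_b).hexdigest()[:16])

# A toy "perceptual" hash: one bit per pixel, set if the pixel is brighter
# than the image's mean brightness.
def toy_phash(pixels: bytes) -> int:
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# The tweaked copy still matches (distance 0 here)...
print(hamming(toy_phash(image_a), toy_phash(image_b)))

# ...but so does an unrelated image that merely has a similar bright/dark
# layout, which is exactly where false positives come from.
image_c = bytes([12, 190, 28, 45, 48, 65, 72, 79])
print(hamming(toy_phash(image_a), toy_phash(image_c)))
```

Real perceptual hashes are far more sophisticated than this, but the trade-off is the same: robustness to small edits means some unrelated images will inevitably hash close together.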
Also, who's to say the person verifying the images that've been flagged isn't a pedophile themselves? They could sit looking all day at exactly the disgusting imagery they wanted, legally, with no suspicion toward them.
Surely this kind of announcement would also just push offenders to a different service? Though I suppose these sorts of people are dumb enough to use Messenger etc. for organising their other crimes, so maybe not. Either way, this is a very slippery slope.
It will never end. iCloud and iTunes Match suck, always have and always will. Apple's downfall, in the end, will be its insertion into areas that don't help the usability of its devices.
Anyone who believes this will be limited to child pornography is beyond naive.
This is how ‘overreach’ is accomplished. You can look at the government’s coronavirus response to see exactly how it’s done. Little by little the authoritarian approach is put in our faces, the initial public outrage mostly dies away, and much of the population accepts it, though not out of independent thought.
Even those who mean well fall into the trap. I do believe Apple means well here. But little by little those seeking power break down resistance. America has done so well that many in the population have little meaningful understanding of the reasons for liberty, freedom, and checks and balances. They believe there aren’t powerful people actively working to effectively enslave them, and succeeding.
For the first time in 28 years I’m considering leaving Apple products. Child pornography is disgusting and an outrage, but overreach of power is currently running rampant in the US and that will eventually ruin an entire country.
In answer to one of my own questions: Apple does not have the database of CSAM images. They only have the pattern-match database (similar to checksums). However, this just makes the situation worse. Apple is going to do pattern matching on your images without your permission, and they have no way to know whether it is working or not. If the match count passes a threshold, they will turn this information over to the police. The police will get a search warrant, and then Apple will hand over all your data (the pictures, movies, files, notes, email, everything).

Now let's say there's a little bug somewhere, or the threshold logic trips on very large collections of pictures of various sorts (all legal). You end up with the police desperately searching all your data for some justification for having issued the search warrant in the first place (not to mention justifying the whole program). What gives you any confidence that the police will act justly and do the right thing? Apple would go into cover-their-ass mode and claim none of this was their fault, and you would be hung out to dry with your reputation ruined.

Let's not go down this road. We were buying Apple products to avoid situations like this. Besides, I bet hardly anyone actually uses iCloud to store illegal photos; that just makes no sense. I would rather Apple just turn off iCloud photo storage for everyone.
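As a back-of-the-envelope sketch of that worry (toy 8-bit hashes, a made-up threshold, and a wildly exaggerated error rate; these are illustrative assumptions, not anything Apple has published): even a small per-photo false-match rate, multiplied across a large and entirely legal library, can push an account over a flagging threshold.

```python
# Illustrative sketch only: toy 8-bit hashes and a made-up threshold, not
# Apple's actual system. It shows how per-photo false matches can add up.
import random

DATABASE = {0b10110010, 0b01101100}   # stand-in for the known-image hash list
THRESHOLD = 5                          # account gets flagged after this many matches

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_match(photo_hash: int, max_distance: int = 1) -> bool:
    """Fuzzy match: within max_distance bits of any database entry."""
    return any(hamming(photo_hash, h) <= max_distance for h in DATABASE)

def account_flagged(photo_hashes) -> bool:
    return sum(is_match(h) for h in photo_hashes) >= THRESHOLD

# A large, entirely innocent library of random "photos": with toy 8-bit hashes
# the false-match rate is grossly exaggerated, but the compounding is the point.
random.seed(1)
library = [random.getrandbits(8) for _ in range(50_000)]
print(account_flagged(library))   # True: the innocent account crosses the threshold
```

Whatever the real per-photo error rate turns out to be, it compounds with library size, and nobody outside Apple can verify it.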
And one more thing: you know all those apps that ask for permission to access your photos? What if one of them had a backdoor that let someone upload photos to your iCloud library without you knowing about it? You know how many crap apps Apple lets through on the App Store all the time. This would not even be a hard thing to implement - a lot easier than building the kind of mass photo pattern matching that Apple is promising will fail only one in a million times, even though they have no way to test it without hosting a lot of illegal photos.
What if parents didn’t give young children mobile phones and then let them use them in their rooms? What if they educated children earlier on the dangers of online grooming? What if chat room operators had better checks? Hell, what if they checked at all?
What if it’s you being advised by your lawyer to just say you did it for a lighter sentence?
What if it’s you that your friends ostracise because there’s no smoke without fire?
This is tremendously dangerous. Pattern matching can create false hits, triggering... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what? Let me say it clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple routinely screws up really simple things, and this is a really hard computer science problem. Apple is going to screw it up. Guaranteed. And when they do, they will certainly have people looking at your private photos, and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as lifelong criminals is a good reason to stop and think carefully about a plan like this.
Given that Google and Facebook have been doing this for about a decade, that both companies are at least as incompetent as you claim Apple is, and that there's been no rash of false positives across their much larger user bases, I believe your fears are unfounded.
Wait and see. If there are colossal fuck-ups, they'll be found out.
Yeah, but here’s the thing: before, when Apple made a colossal fuck-up, such as letting random text strings crash the iPhone or shipping bugs in the login code, it was mildly annoying.
A colossal fuck-up now means that someone’s life is ruined.
Guess you guys are continuing to ignore the part about subpoenas, investigators, and prosecutors. Nobody is given an automatic “Go to jail” card. If it’s not child porn, nobody is going to jail. These systems already exist, right?
So what you’re saying is that the legal system is absolutely foolproof, and that no one will go to prison, or lose friends, family, and their reputation, for a crime they didn’t commit?
The more I read about this, the more ridiculous it sounds. Apple spends the past three years banging on about privacy, then starts scanning your photo library? I don’t buy it, and no one would buy another iPhone if they thought their lives could be ruined by buggy software. The only way I see this being acceptable is if this so-called “client-side tool” would allow parents to ensure their children weren’t sending naked pics of themselves to Republican politicians posing as high-school kids.
Okay, so this is about the kids, fair enough, but I think what we’re seeing here is the very thin edge of very wide edge.
Thin edge of a very wide edge? How so?
Yeah, sorry, I meant “very wide wedge”.
But what is the very wide wedge? What do you suspect this change by Apple potentially leads to?
Well, the middle of the wedge is authoritarian governments telling Apple that they must now use this clever system of theirs to report on other types of images.
When the Chinese government tells Apple that they want to flag for hashes of images of a political opponent in every citizen’s photo library (oh, and they mustn’t tell anyone they’re doing it because that’s the law), do you think Apple will withdraw from the country immediately, or comply with the request?
I’m wondering if this is some sort of deal that Apple has struck to prevent an even more intrusive solution from US law enforcement. If that’s the case then I imagine Google will have to implement something very similar in the near future.
But regardless, what I realise now is that I have been somewhat naive about Apple's stance on privacy. All this clever stuff they bang on about: “We can’t see this, we won’t look at that” – it really is smoke and mirrors. What Apple is doing is walking through the girls’ locker room while promising to keep its eyes shut. Yeah, that’s nice of them, but guess what? They’re still in the locker room.
So the next time GoogleGuy breezes through trying to prove that Google cares just as much about privacy as Apple does, I’m afraid he won’t be wrong: they both kinda suck, but at least Google is honest about it.
A Trojan horse is the best description I’ve seen. As much of a diehard Apple user as I’ve been since Steve Jobs’s early days, my present iPhone will be my last. I guess that goes for my MacBook Pro as well. Punishing the surely 95+% innocent, decent people by spying on EVERYONE, only to catch the slime they’re after, just doesn’t sound like an Apple way of doing things. Catch the scumbags, but do it in a way that doesn’t have the essence of BIG BROTHER. Goodbye, Apple!
And of course, any self-respecting pedophile won’t be sending their collection to the cloud anyway, or will switch off iCloud photos as soon as this lands, so the only folk this is targeting are non-pedophiles.
You may find your choices for an ecosystem that puts privacy first are not as numerous as you think.
Apple has been scanning pictures for a long time as it turns out, and so have Google, Twitter, and Microsoft:
https://blog.google/outreach-initiatives/google-org/our-continued-commitment-to-combating/
https://www.theguardian.com/technology/2013/jul/22/twitter-photodna-child-abuse
So in the scheme of things, nothing has really changed.
The real takeaway from this is that when Apple bangs on about privacy, you now know it’s bullshit.
Worth taking a look at Matthew Green’s Twitter feed. Some fascinating stuff in there.
He points out that Apple is using a fuzzy matching algorithm, and that false matches are possible precisely because of the fuzziness of the match. He also points out that no one has seen Apple’s matching algorithm - a matching algorithm built by the same company that gave you iTunes Match and Siri.
But that’s okay: to make sure there’s no chance of a false match, Apple will have some stranger look at your pictures. So if your photos start showing up in odd places around the internet, this may be why.
That particular blog is about identifying images on the web, which is well known (and not only used for child abuse pics). But, as far as I understand, this has nothing to do with Google backups, which as far as I know are encrypted, and they do not (yet?) perform on-device hashing. But if there are other sources saying otherwise, I'd like to know.
I guess the next move would be to start scanning your browser cache.
Yes, you are right about the Google blog post.
Google has provided the technology for scanning and tagging images found on the web. My guess is that this is the technology behind the CSAM database.
The article does not say that Google is scanning for photos on its customers accounts and notifying the authorities if it finds a match.
I’m not going to change my original post though, as I think it’s important to highlight the difference between Google’s approach (provide the tools to help law enforcement), and Apple’s (assume its customers are guilty until they can prove otherwise).
“What if it’s you being advised by your lawyer to just say you did it for a lighter sentence?”
Yup, that never happens.
What extraordinary times we live in.