EFF protesting Apple CSAM identification programs on Monday evening
On Monday at 6:00 PM local time, the EFF and other privacy groups are protesting Apple's CSAM identification initiatives in person at select Apple Retail stores across the country.
An image of 'Apple surveillance' promoting the EFF protest
The Electronic Frontier Foundation is sponsoring a nationwide protest of the on-device CSAM detection features that Apple announced, then delayed, for iOS 15 and macOS Monterey. The protest is being held in several major US cities, including San Francisco, Atlanta, New York, Washington D.C., and Chicago.
A post from the EFF outlines the protest and simply tells Apple, "Don't scan our phones." The EFF has been one of the most vocal opponents of Apple's CSAM detection system, which was meant to ship with iOS 15, arguing that the technology is no better than mass government surveillance.
"We're winning-- but we can't let up the pressure," the EFF said in a blog post. "Apple has delayed their plan to install dangerous mass surveillance software onto their devices, but we need them to cancel the program entirely."
The in-person protests are only one avenue of attack the EFF has planned. The organization sent 60,000 petitions to Apple on September 7, cites over 90 organizations backing the movement, and plans to fly an aerial banner over Apple's campus during the "California Streaming" event.
We need to interrupt next week's #AppleEvent to say clearly-- it's not OK to put surveillance software on our phones. https://t.co/6EDXAbo0Wn
-- EFF (@EFF)
Those interested can find a protest in their area, sign up for newsletters, and email Apple leadership directly from the EFF's action website.
Cloud providers like Google and Microsoft already search for CSAM within users' photo collections, but they do so in the cloud rather than on users' devices. The concern with Apple's implementation lies with where the processing takes place.
Apple designed it so the hash-matching against the CSAM database would take place on the iPhone before images were uploaded to iCloud. Apple says this implementation is safer and more private than trying to identify users' photos on its cloud servers.
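For readers who want a concrete picture, the rough shape of that design is sketched below. This is only an illustration under assumptions: Apple's actual system uses a perceptual NeuralHash, a blinded on-device database, and encrypted "safety vouchers" rather than the plain byte-level digest and simple set lookup shown here, and every name in the snippet is hypothetical.

```python
# Illustrative sketch only; NOT Apple's implementation. The real system
# uses a perceptual NeuralHash, a blinded database, and private set
# intersection. This only shows the general "hash on the device, compare
# before upload" flow. All names are hypothetical.
import hashlib
from pathlib import Path

# Stand-in for the database of known CSAM hashes shipped to the device.
KNOWN_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

def device_hash(photo: Path) -> str:
    """Placeholder: a byte-level digest instead of a perceptual hash."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def prepare_upload(photo: Path) -> dict:
    """Compute the hash on the device and attach the match result
    (in Apple's design, an encrypted safety voucher) to the upload."""
    matched = device_hash(photo) in KNOWN_HASHES
    return {"photo": photo.name, "matched_known_hash": matched}
```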
A series of poor messaging efforts from Apple, general confusion surrounding the technology, and concern about how it might be abused led Apple to delay the feature.
Comments
https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
https://www.microsoft.com/en-us/photodna
https://protectingchildren.google/intl/en/
As I shared before, this only works for images that are already known and stored in the CSAM database; these are images that sick people trade. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images that are not in the database, it would be looking for features, which could flag other nude images of young kids; that is not acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?
No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.
You are free to give away your privacy, but you're not free to give away everyone else's. I am glad to see the EFF standing up.
It can't.
Apple in the first, most obvious instance. Though you'll need to be clearer about which bad actors you're talking about.
It's been pretty comprehensively documented, so people do know how it works. And if it catches some stupid paedophiles then that's good, I'm not precious about the IQ of my child molesters.
No one's privacy is being given away. If you don't like it, don't use iCloud Photos.
You can reject ANY scanning, server or on device, simply by declining to use iCloud Photos. Same for Google or Microsoft cloud offerings.
The other "feature" built into Messages (parents may opt in for the kids iPhone) seems to detect new (nude) images, there are other implications that civil rights groups mentioned. Maybe Apple can solve this by just issue a warning that can be ignored by the kid.
Keep in mind Apple is going to scan both iCloud and your phone, not just the server side, and Apple has every right to make sure its servers are not contributing to the problem. However, it only works if there is an original source hash in the database; yes, there are idiots who trade and share these photos, which have been in circulation for a long time. Keep in mind this does not catch the abusers, who are the people we really want to catch, since they are the ones creating the content. These are the truly bad people; catching the viewers of the content does not stop the human trafficking. The people who make this content do not store their images on public servers, and they encrypt the files so they cannot be seen or analyzed. They have been doing this for a long time and they know how not to get caught.
If someone is using their phone to take images of kids, the CSAM hash will not catch these, since they are new and the database does not have a hash of the image to compare against. There have been statements that the system will be able to tell whether someone is taking an innocent baby photo versus someone who is involved in human trafficking.
Also, the images contained in the database can be modified to the point that the image hash will not match the one stored in the database. Take any photo you have and create a hash for it, then open the image in Photoshop, run it through a filter, and regenerate the hash, and they will not match; likewise, remove all the metadata from the image and the hash will not match.
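The byte-level fragility described above is easy to demonstrate with an ordinary cryptographic digest, as in the short sketch below. Note this only illustrates exact-match hashing; perceptual hashes such as Apple's NeuralHash are designed to tolerate many common edits.

```python
# Demonstrates that any byte-level change (a filter, re-saving, stripping
# metadata) yields a completely different cryptographic digest. Perceptual
# hashes like NeuralHash are built to tolerate many such edits, so this
# only illustrates the exact-match case described in the comment above.
import hashlib

original = b"pretend these bytes are a JPEG file"
edited = original + b"\x00"  # e.g. metadata removed or image re-encoded

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())  # entirely different digest
```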
Also, as pointed out, any government can place the hash of any image it likes in the database and have the system report who has those images on their phone. Keep in mind that the organizations doing this work get funding from governments, so they are beholden to the people giving them money.
You claim there will be a special team reviewing images; that implies someone will have access to your phone and be able to get the pictures off of it. Let's not talk about how many laws would be broken there. Also, the system is only comparing the hash of the file, so there is nothing for a special team to review; the system is just reporting that someone has an image whose hash matches a hash in the database.
If you think this is not complicated, go check why the music industry stopped going after individuals it claimed had illegal music downloaded onto their computers, based on the claim that an IP address used by a specific house had downloaded music. Someone finally sued the music industry, arguing it could not prove the owner of that household actually downloaded the files just because they had that IP address. You could have had a visitor using your internet access, or someone outside could have accessed your wireless network without your permission and downloaded content. They got sued on the grounds that they invaded people's privacy by accessing their computers without permission.