EFF protesting Apple CSAM identification programs on Monday evening

Posted in iPhone
On Monday at 6:00 PM local time, the EFF and other privacy groups are protesting Apple's CSAM identification initiatives in person at select Apple Retail stores across the country.

An image of 'Apple surveillance' promoting the EFF protest


The Electronic Frontier Foundation is sponsoring a nationwide protest of the on-device CSAM detection features that Apple announced, then delayed, for iOS 15 and macOS Monterey. The protest is being held in several major US cities, including San Francisco, Atlanta, New York, Washington, D.C., and Chicago.

A post from the EFF outlines the protest and simply tells Apple, "Don't scan our phones." The EFF has been one of the most vocal opponents of Apple's CSAM detection system, which was meant to ship with iOS 15, arguing that the technology is no better than mass government surveillance.

"We're winning-- but we can't let up the pressure," the EFF said in a blog post. "Apple has delayed their plan to install dangerous mass surveillance software onto their devices, but we need them to cancel the program entirely."

The in-person protests are only one avenue of attack the EFF has planned. The organization sent 60,000 petitions to Apple on September 7, cites over 90 organizations backing the movement, and plans to fly an aerial banner over Apple's campus during the "California Streaming" event.

We need to interrupt next week's #AppleEvent to say clearly-- it's not OK to put surveillance software on our phones. https://t.co/6EDXAbo0Wn

-- EFF (@EFF)


Those interested can find a protest in their area, sign up for newsletters, and email Apple leadership directly from the EFF's website.

Cloud providers like Google and Microsoft already scan for CSAM in their users' photo collections, but they do so in the cloud rather than on users' devices. The concern with Apple's implementation lies in where the processing takes place.

Apple designed the system so that hash-matching against the CSAM database would take place on the iPhone before images were uploaded to iCloud. The company says this implementation is safer and more private than trying to identify users' photos on its cloud servers.
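
Apple has not published code for this pipeline, but the flow described above can be sketched at a high level. The snippet below is a hypothetical illustration only: an ordinary cryptographic hash and a plain in-memory set stand in for Apple's perceptual hash (NeuralHash) and its encrypted, blinded database, and every name and value is invented for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the on-device table of known CSAM hashes.
# The real database is encrypted and blinded; a plain set is used here
# purely for illustration, and this entry is a made-up value.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Hash an image's raw bytes (a crude stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_before_upload(path: Path) -> bool:
    """Return True if the locally computed hash matches a known entry.

    In the on-device design this check runs on the phone, before the
    photo is handed to the iCloud upload queue; photos that never sync
    to iCloud Photos are never checked.
    """
    return image_hash(path) in KNOWN_HASHES
```

The point of contention is not the matching itself, which server-side scanners also perform, but that this check would run on the user's own device.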

A series of poor messaging efforts, general confusion surrounding the technology, and concern about how it might be abused led Apple to delay the feature.


Comments

  • Reply 1 of 12
Scanning on our devices is bad, but scanning on servers is good? I guess it doesn't matter which way they do it, but doing it on device would have kept the positives count private.

    Either way, CSAM hash scanning for kids being raped has happened for years, and will continue to. Are they going to protest Dropbox, Google, Microsoft, Tumblr, Twitter, etc?

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/
edited September 2021
  • Reply 2 of 12
mcdave Posts: 1,927
I don’t get this. Other scanning systems can do whatever, but Apple's only reports anything if a significant collection of known CSAM is found, and that's worse?
  • Reply 3 of 12
Anyone who thinks Apple scanning your phone for images is a good idea fails to see all the down-the-line bad effects this can have.

As I have shared before, it only works for images that are already known and stored in the CSAM database, the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images not in the database, it would be looking for features, which could flag other nude images of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

You are free to give away your privacy, but you're not free to give everyone else's away. I am glad to see the EFF standing up.
  • Reply 4 of 12
crowley Posts: 10,453
    maestro64 said:
It cannot catch newly created images, since they are not in the database and no hash has been created for them.
    Yes, that's a limitation, but not a reason not to do it.
    maestro64 said:
If it could catch new images not in the database, it would be looking for features, which could flag other nude images of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy.
    It can't.
    maestro64 said:
Once the tech is installed, what is going to stop bad actors from using it for other reasons?
    Apple in the first, most obvious instance.  Though you'll need to be clearer about which bad actors you're talking about. 
    maestro64 said:

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.
It's been pretty comprehensively documented, so people do know how it works. And if it catches some stupid paedophiles then that's good; I'm not precious about the IQ of my child molesters.
    maestro64 said:

You are free to give away your privacy, but you're not free to give everyone else's away. I am glad to see the EFF standing up.
    No one's privacy is being given away.  If you don't like it, don't use iCloud Photos.
  • Reply 5 of 12
    maestro64 said:
Anyone who thinks Apple scanning your phone for images is a good idea fails to see all the down-the-line bad effects this can have.

As I have shared before, it only works for images that are already known and stored in the CSAM database, the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images not in the database, it would be looking for features, which could flag other nude images of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

You are free to give away your privacy, but you're not free to give everyone else's away. I am glad to see the EFF standing up.
Hard to understand what you're saying…but how is it any different scanning for hashes of child rape on the server side? Same exact scanning. The hand-waving about "But what if government adds other images!" doesn't make any sense, since by that logic it could be added to the server-side CSAM scanning just as easily. Arguably easier, in fact, as it doesn't require a software push the way on-device scanning does. So what is the difference?

    You can reject ANY scanning, server or on device, simply by declining to use iCloud Photos. Same for Google or Microsoft cloud offerings. 
edited September 2021
  • Reply 6 of 12
Apple should move CSAM scanning to the cloud. Then it would be clear that this "feature" is not used for any offline content, and there would be fewer ways to exploit it (think Pegasus).
The other "feature," built into Messages (parents may opt in for a kid's iPhone), seems to detect new (nude) images, and there are other implications that civil rights groups have mentioned. Maybe Apple can solve this by just issuing a warning that can be ignored by the kid.
  • Reply 7 of 12
    Apple in the first, most obvious instance.  Though you'll need to be clearer about which bad actors you're talking about. 
    Government
  • Reply 8 of 12
    maestro64 said:
Anyone who thinks Apple scanning your phone for images is a good idea fails to see all the down-the-line bad effects this can have.

As I have shared before, it only works for images that are already known and stored in the CSAM database, the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images not in the database, it would be looking for features, which could flag other nude images of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

You are free to give away your privacy, but you're not free to give everyone else's away. I am glad to see the EFF standing up.
The hashes are obtained from at least two separate groups. The hashes will need to be pushed to your phone in a software update. A human team will review the flagged images only if you reach a threshold, and let's be honest, if you reach that threshold, you're not innocent. Only then will a special team refer you to the authorities. The authorities will investigate, as hashes cannot be used as evidence.

You give up some piece of your privacy every time you walk out of your house. Your neighbors may have Ring doorbells. Traffic cams. ATMs have cameras. Businesses have cameras. People have cell phones.
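
As a rough sketch of the threshold idea described above, the logic might look like the snippet below. This is not Apple's implementation, which uses threshold secret sharing so that nothing is readable below the threshold; the class, method names, and exact constant are assumptions for illustration.

```python
from dataclasses import dataclass, field

# Apple publicly discussed a threshold on the order of 30 matches;
# treat this constant as illustrative.
REVIEW_THRESHOLD = 30

@dataclass
class AccountMatchState:
    """Tracks, per account, which uploads matched a known hash."""
    matched_voucher_ids: list = field(default_factory=list)

    def record_match(self, voucher_id: str) -> None:
        self.matched_voucher_ids.append(voucher_id)

    def eligible_for_human_review(self) -> bool:
        # Below the threshold, nothing would be decryptable or reviewable
        # in the real design; this boolean is a simplification.
        return len(self.matched_voucher_ids) >= REVIEW_THRESHOLD
```

Only an account for which eligible_for_human_review() returns True would ever reach reviewers and, as the comment notes, be referred onward only after that review.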
  • Reply 9 of 12
    mike54 said:
I hope the EFF can change Apple's mind, but unfortunately it's not going to. This is being pushed by the US government, and Apple has willingly obliged by designing a software framework that can be configured for other uses. Currently it's supposedly for CSAM, to get it publicly accepted, but it will be re-purposed at any time unknown to the user. It's a disappointment that most people, even so-called smart people, can't seem to see tomorrow, unquestioningly believe those in power, and give them a free pass.
Ridiculous. ALL of the software on your device can be configured for other uses. Why are you concerned about this one particular thing and not, say, the Spotlight or Photos library indexing that happens every day on your device? How can you state definitively that it "will be re-purposed at any time unknown to the user" without any evidence to support that claim? LOL "so-called smart people" indeed.
  • Reply 10 of 12
This is a complicated issue, and most people fail to see the unintended consequences, or do not care about them until they show up on their doorstep. We live in a country where we err on the side of personal privacy; however, many here think it's okay to invade privacy and cause innocent people to be questioned. This is a system casting a wide net, and it can be used for more than just catching bad people. I am all for catching bad guys, but I do not want to be swept up in the process, even innocently, since it could cost you more than you think.

Keep in mind Apple is going to scan iCloud and your phone, not just the server side, and Apple has every right to make sure its servers are not contributing to the problem. However, it only works if there is an original source hash in the database; yes, you have idiots who trade and share photos that have been in circulation for a long time. Keep in mind this does not catch the abusers, who are the people we really want to catch, since they are the ones creating the content. Those are truly the bad people; catching the viewers of the content does not stop the human trafficking. The people who make this content do not store their images on public servers, and they encrypt the files so they cannot be seen or analyzed. They have been doing this for a long time and they know how not to get caught.

If someone is using their phone to take images of kids, the CSAM hashing will not catch these, since they are new and the database does not have a hash of the image to compare against. There have been statements that the system will be able to know whether someone is taking an innocent baby photo versus someone who is involved in human trafficking.

Also, the images can be modified to the point that their hashes will not match the ones stored in the database. Take any photo you have and create a hash for it, then open the image in Photoshop, run it through a filter, and regenerate the hash: the two will not match. Likewise, remove all the metadata from the image and the hash will not match (see the sketch at the end of this comment).

Also, as was pointed out, any government can place a hash of any image it likes in the database and have the system tell it who has those images on their phone. Keep in mind that the organizations doing this work get funding from governments, so they are beholden to the people giving them money.



edited September 2021
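
The hash-mismatch point made earlier in this comment can be demonstrated with an ordinary cryptographic hash, as in the minimal sketch below: any byte-level change, such as stripping metadata or applying a filter, produces a completely different digest. Whether Apple's perceptual hashing is equally sensitive to such edits is a separate question this snippet does not answer; the byte values are placeholders.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of the exact bytes supplied."""
    return hashlib.sha256(data).hexdigest()

original = b"placeholder bytes standing in for an image file"
edited = original + b"\x00"  # e.g. metadata removed or a filter applied

# With an exact cryptographic hash, the two digests match only if the
# bytes are identical, so a lookup against a table of known digests
# would miss the edited copy.
assert sha256_hex(original) != sha256_hex(edited)
print(sha256_hex(original))
print(sha256_hex(edited))
```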
  • Reply 11 of 12
    jungmark said:
    maestro64 said:
Anyone who thinks Apple scanning your phone for images is a good idea fails to see all the down-the-line bad effects this can have.

As I have shared before, it only works for images that are already known and stored in the CSAM database, the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images not in the database, it would be looking for features, which could flag other nude images of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

You are free to give away your privacy, but you're not free to give everyone else's away. I am glad to see the EFF standing up.
The hashes are obtained from at least two separate groups. The hashes will need to be pushed to your phone in a software update. A human team will review the flagged images only if you reach a threshold, and let's be honest, if you reach that threshold, you're not innocent. Only then will a special team refer you to the authorities. The authorities will investigate, as hashes cannot be used as evidence.

You give up some piece of your privacy every time you walk out of your house. Your neighbors may have Ring doorbells. Traffic cams. ATMs have cameras. Businesses have cameras. People have cell phones.
No one has an expectation of privacy outside their home, whether there are cameras there or not. However, you do have an expectation of privacy inside your home and on your property. This includes your car and things in your house and on your person. You also have an expectation of privacy on the phone, and even when talking to someone in public; this is why you cannot record a third party's conversation when you are not part of the conversation. You can listen to their conversation, but if you try to report to the police that they were talking about something illegal, it may be considered hearsay, since you were not involved in the conversation. The fact that you do not know this is concerning; please stop trying to give people's privacy away.

You claim there will be a special team reviewing images; this implies that someone will have access to your phone and be able to get the pictures off your phone. Let's not talk about how many laws would be broken there. Also, the system is only comparing the hash of the file; there is nothing for the special team to review. The system is just reporting that someone has an image whose hash matches a hash in the database.

If you do not think this is complicated, go check why the music industry stopped going after individuals it claimed had illegally downloaded music onto their computers based on the IP address used by a specific household. Someone finally sued the music industry, claiming it could not prove that the owner of that household actually downloaded the files just because they had an IP address. You could have had a visitor using your internet access, or someone outside could have accessed your wireless network without your permission and downloaded content. They also got sued over the fact that they invaded people's privacy by accessing their computers without their permission.

     
