EFF urges Apple to drop CSAM tool plans completely


  • Reply 21 of 47
    rcfa Posts: 1,124 member
    crowley said:
    crowley said:
    Well that's a surefire way to get yourself sidelined.  If the EFF can't be reasonable and have a dialogue then they'll be treated as unreasonable and unworthy of dialogue.
    This is obviously the first time you heard of the EFF and have no idea of the work they do otherwise you wouldn't have made such a dopey comment. 
    I have heard of the EFF and know what they do.  They've done some decent and worthwhile campaigning in the past, but they're very close to being no-compromise privacy zealots.  And I doubt Apple are going to be much bothered to engage with people who cannot be reasoned with given the concurrent obligations Apple feels that it has.
    Whatever “obligations Apple feels that it has” are irrelevant. They are a gadget manufacturer and their ONLY job is to protect their users’ privacy; they are NOT in the law enforcement business.

    A screwdriver manufacturer, or for that matter a gun manufacturer, has to make sure the tool does what’s expected of it, not concern itself with how the tool might be abused.

    You want screwdrivers with microphones, GPS, and a mobile chip in them, to detect whether they’re being abused in a bomb plot, abduction, or murder? You know, no price too high to “save innocent lives” 🤦🏻‍♂️
  • Reply 22 of 47

    What’s so wrong with constantly scanning everyone’s devices without their consent for content that the government might find objectionable? If they don’t know what’s best for you, who will? We did elect them after all (with a few exceptions like China, Iran, all Arab countries, Russia and a few more). That those countries are also places where Apple has been strong-armed into giving up user privacy is just an evil rumour, and nobody should pay any attention to it. Finally, claims that continuous monitoring of people’s devices might have a dampening effect on people’s human right to express themselves are clearly just right-wing conspiracy theories and should be ignored. So why should we worry?! It’s for the children, goddammit. 

  • Reply 23 of 47
    rcfa Posts: 1,124 member
    Honestly the device side scanning was a privacy thing. They should just not encrypt iCloud photos and scan them for child porn of any kind like Facebook and Google already do. I’m tired of all these people acting like you deserve the right to privately rape kids. 

    Freedom isn’t free, it ALWAYS comes at the cost of others suffering under the irresponsible abuse of freedom. If one isn’t willing to pay that price, there will be no freedom at all.

    A system scanning for CSAM is TECHNICALLY indistinguishable from one scanning for any other type of content; once a system is in place, policy changes are just a matter of time.

    It’s ludicrous to say the criticism is about child porn; it’s about a massive infrastructure being built which, at a moment’s notice and with the stroke of a (foreign) lawmaker’s pen, can be completely changed in its orientation.

    Apple’s word is meaningless, as nobody in their right mind will believe that Apple will drop the US, EU, or Chinese market if unpalatable laws get passed.
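    The “technically indistinguishable” claim above can be sketched in a few lines: a hash-matching scanner never inspects what its blocklist represents, so swapping the database retargets the whole system. This is an illustrative sketch only; a cryptographic hash stands in for a perceptual hash like NeuralHash or PhotoDNA, and none of the names are Apple’s actual implementation.

    ```python
    import hashlib

    def fingerprint(data: bytes) -> str:
        # Stand-in for a perceptual hash; a real system is robust
        # to resizing and re-encoding, which SHA-256 is not.
        return hashlib.sha256(data).hexdigest()

    def scan(files, blocklist):
        # The matcher only compares fingerprints. Nothing here depends
        # on WHAT the blocklist represents: swap the database and the
        # identical code flags entirely different content.
        return [name for name, data in files if fingerprint(data) in blocklist]

    photos = [("a.jpg", b"cat"), ("b.jpg", b"target")]
    blocklist = {fingerprint(b"target")}
    print(scan(photos, blocklist))  # ['b.jpg']
    ```

    The scanning logic is policy-neutral; only the contents of `blocklist` determine what gets reported.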
  • Reply 24 of 47
    rcfa Posts: 1,124 member
    I feel Apple is not going to be able to do that because the whole thing is very un-Apple and reeks of government intervention. 
    Unfortunately it reeks of the new Apple that is too busy being woke instead of simply providing tech gadgets that work and protect users’ privacy.

    Apple has unfortunately been taken over by SJWs.
  • Reply 25 of 47
    rcfa Posts: 1,124 member

    What’s so wrong with constantly scanning everyone’s devices without their consent for content that the government might find objectionable? If they don’t know what’s best for you, who will? We did elect them after all (with a few exceptions like China, Iran, all Arab countries, Russia and a few more). That those countries are also places where Apple has been strong-armed into giving up user privacy is just an evil rumour, and nobody should pay any attention to it. Finally, claims that continuous monitoring of people’s devices might have a dampening effect on people’s human right to express themselves are clearly just right-wing conspiracy theories and should be ignored. So why should we worry?! It’s for the children, goddammit. 

    I hope this was sarcasm, because otherwise it’s beyond help…

    …unfortunately some of the woke people would actually say stuff like that and mean it…🤦🏻‍♂️
  • Reply 26 of 47
    wood1208 Posts: 2,917 member
    Looks like it is getting harder to fight against hidden evil to protect the vulnerable part of the human race. This non-profit EFF entity, which claims to defend civil liberties in the digital world, must have enough (50,000 signatures) members/supporters who are child abusers, molesters, predators, pedophiles, etc. And who are these 90 organizations around the world who asked Apple not to implement CSAM? Child pornography and illegal trafficking organizations?


    edited September 2021
  • Reply 27 of 47
    rcfa Posts: 1,124 member
    This is one idea Apple needs to abandon. Do the scanning server side. Don’t scan on device. Period. 
    Don’t scan. Period.
    Prevent scanning by anyone for any reason as best as technically possible. Period.

    Apple’s business isn’t to help law enforcement, but to guard users’ privacy under all circumstances.

    There’s no such thing as privacy for good people and lack of privacy for bad people; privacy mechanisms cannot distinguish between good and evil.

    The only two options are privacy and absence of privacy.

    On-device scanning is just a particularly insidious way of eroding privacy, one that at a moment’s notice can be turned to search for anything else.

    Got pictures of guns? Bongs? Gay sex? Pooh? Dalai Lama? All of these could land you in jail or get you executed, in various jurisdictions around the globe.
  • Reply 28 of 47
    rcfa said:

    What’s so wrong with constantly scanning everyone’s devices without their consent for content that the government might find objectionable? If they don’t know what’s best for you, who will? We did elect them after all (with a few exceptions like China, Iran, all Arab countries, Russia and a few more). That those countries are also places where Apple has been strong-armed into giving up user privacy is just an evil rumour, and nobody should pay any attention to it. Finally, claims that continuous monitoring of people’s devices might have a dampening effect on people’s human right to express themselves are clearly just right-wing conspiracy theories and should be ignored. So why should we worry?! It’s for the children, goddammit. 

    I hope this was sarcasm, because otherwise it’s beyond help…

    …unfortunately some of the woke people would actually say stuff like that and mean it…🤦🏻‍♂️
    Americans!
    Yes, that was sarcasm. 
  • Reply 29 of 47
    rcfa Posts: 1,124 member
    Scenario A. User agrees to iCloud terms of service which grant Apple the right to scan files from apps the user chooses to back up in iCloud. All the files coming from those user designated apps are scanned on the server side.

    Scenario B. User agrees to iCloud terms of service which grant Apple the right to scan files from apps the user chooses to back up in iCloud. All the files coming from those user designated apps are scanned on the device.

    In both scenarios, the user has complete control over which apps/files are backed up and scanned. The files that get scanned are exactly the same. There is absolutely no difference at all in terms of user control or Apple's ability to scan files. 
    There’s already the problem: Terms of service!

    There should be no terms that allow Apple to do any of it!

    But at least I can turn iCloud off, unsubscribe from it, and wipe it.

    On-device scanning isn’t TECHNICALLY dependent on iCloud; it’s just an Apple POLICY decision to tie it to iCloud uploads, a policy that could change at any time, without user consent or public notice, under the gag of a Patriot Act National Security Letter.

    Apple putting such an infrastructure onto its devices is abuse waiting to happen.
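    The “policy, not technology” point here can be sketched very simply: the only thing tying on-device scanning to iCloud uploads is a conditional, and a one-line change widens it to every photo. All names below are hypothetical, chosen for illustration, and have nothing to do with Apple’s actual code.

    ```python
    # Hypothetical policy flag: this is a decision, not a technical constraint.
    SCAN_ONLY_ICLOUD_UPLOADS = True

    def should_scan(pending_icloud_upload: bool) -> bool:
        # With the flag set, only photos queued for iCloud are scanned.
        if SCAN_ONLY_ICLOUD_UPLOADS:
            return pending_icloud_upload
        # Flip the flag above and every photo on the device gets scanned,
        # with no technical barrier preventing the change.
        return True
    ```

    Nothing in the scanning machinery itself enforces the iCloud restriction; it lives entirely in the value of one flag.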
  • Reply 30 of 47
    crowley Posts: 10,453 member
    rcfa said:
    crowley said:
    Well that's a surefire way to get yourself sidelined.  If the EFF can't be reasonable and have a dialogue then they'll be treated as unreasonable and unworthy of dialogue.
    Haha! In some things, one can’t compromise.

    “I want to shackle you hand and foot, blindfold you and gag you!”
    “I’d like to retain my freedom!”
    “OK, let’s compromise. I’ll only gag you and put you in handcuffs. I’ll also turn off the lights and cover the windows. But you’ll be free to walk about the room.”
    “No, I’d like to retain my freedom!”
    “Well, that’s a surefire way to get yourself sidelined. If you can’t be reasonable and have a dialogue, then you’ll be treated as unreasonable and unworthy of dialogue.”
    🤦🏻‍♂️

    There’s really no middle ground: either you do the on-device scanning, at which point it’s up to Apple’s policies what gets scanned for (and the policies are subject to all sorts of governments’ pressure), or you don’t. There are simply no TECHNICAL boundaries that can be set to limit the scanning to specific content; there are only the soft boundaries of malleable policies. And THAT’s why this has to go, and the EFF is 100% right.
    What is the purpose of shackling, blindfolding and gagging me?  If you've got a good, reasonable one then absolutely there is room for compromise, with a view to balancing my human rights with whatever justified societal imperative is being invoked.

    Apple aren't trying to do something malicious, they're trying to combat child abuse.
  • Reply 31 of 47
    crowley Posts: 10,453 member
    rcfa said:
    crowley said:
    crowley said:
    Well that's a surefire way to get yourself sidelined.  If the EFF can't be reasonable and have a dialogue then they'll be treated as unreasonable and unworthy of dialogue.
    This is obviously the first time you heard of the EFF and have no idea of the work they do otherwise you wouldn't have made such a dopey comment. 
    I have heard of the EFF and know what they do.  They've done some decent and worthwhile campaigning in the past, but they're very close to being no-compromise privacy zealots.  And I doubt Apple are going to be much bothered to engage with people who cannot be reasoned with given the concurrent obligations Apple feels that it has.
    Whatever “obligations Apple feels that it has” are irrelevant. They are a gadget manufacturer and their ONLY job is to protect their users’ privacy; they are NOT in the law enforcement business.

    A screwdriver manufacturer, or for that matter a gun manufacturer, has to make sure the tool does what’s expected of it, not concern itself with how the tool might be abused.

    You want screwdrivers with microphones, GPS, and a mobile chip in them, to detect whether they’re being abused in a bomb plot, abduction, or murder? You know, no price too high to “save innocent lives” 🤦🏻‍♂️
    The obligations Apple feel they have aren't irrelevant to Apple.  What a ridiculous statement, facepalm yourself harder.
  • Reply 32 of 47
    crowley Posts: 10,453 member
    rcfa said:
    Honestly the device side scanning was a privacy thing. They should just not encrypt iCloud photos and scan them for child porn of any kind like Facebook and Google already do. I’m tired of all these people acting like you deserve the right to privately rape kids. 

    Freedom isn’t free, it ALWAYS comes at the cost of others suffering under the irresponsible abuse of freedom. If one isn’t willing to pay that price, there will be no freedom at all.

    A system scanning for CSAM is TECHNICALLY indistinguishable from one scanning for any other type of content; once a system is in place, policy changes are just a matter of time.

    It’s ludicrous to say the criticism is about child porn; it’s about a massive infrastructure being built which, at a moment’s notice and with the stroke of a (foreign) lawmaker’s pen, can be completely changed in its orientation.

    Apple’s word is meaningless, as nobody in their right mind will believe that Apple will drop the US, EU, or Chinese market if unpalatable laws get passed.
    "If one isn’t willing to pay that price, there will be no freedom at all" is absolutist trash, and everything else you've said there is baseless speculation.
  • Reply 33 of 47
    neilm Posts: 989 member
    Does anyone actually think Apple's CSAM scanning isn't dead as a doornail at this point? 

    Yeah, I know what Apple said, but that's about as convincing as an accused politician resigning "to spend more time with my family."
  • Reply 34 of 47
    rcfa said: On-device scanning isn’t TECHNICALLY dependent on iCloud; it’s just an Apple POLICY decision to tie it to iCloud uploads, a policy that could change at any time, without user consent or public notice, under the gag of a Patriot Act National Security Letter.
    A. Your future scenario for both Apple's intent and the government's intent is nothing more than conjecture. 

    B. The Patriot Act was passed in 2001, has been modified by Congress several times since, and in all cases has been publicly available for review. There's nothing "secret" about the parameters of the Patriot Act. If Apple were motivated to spy on its users via the Patriot Act, why wait until 2021? Were they just procrastinating for the last 20 years? 

    C. Any computing device could be theoretically used for surveillance. Any software could be theoretically used for surveillance. If you're truly worried about the theoretical part more than the factual part, you probably shouldn't be using computing devices at all. 
  • Reply 35 of 47
    crowley said:
    I have heard of the EFF and know what they do.  They've done some decent and worthwhile campaigning in the past, but they're very close to being
    no-compromise privacy zealots.  And I doubt Apple are going to be much bothered to engage with people who cannot be reasoned with given the concurrent obligations Apple feels that it has.
    "no-compromise privacy zealots" is how a lot of people would describe Apple.  Remember the San Bernardino case where they refused to help decrypt the shooter's iPhone? What about when the FBI asked for a backdoor to help fight crime?  In both cases, Apple clearly said "NO, we will not help you hack our phones because it would compromise our users' privacy."  Beginning in macOS 10.8, Apple added privacy checks that required applications to ask permission to read your personal data.  In Mojave (10.14), they ramped it up with the requirement to ask permission to use the camera and microphone, and in Catalina (10.15) they made apps ask permission to use screen recording or scan most files on your disk.  They have made an entire series of commercials about privacy.

    The one place, sadly, where Apple has "compromised" is in their dealing with China, where they contracted iCloud to GCBD, a company that is capable of being influenced by the Chinese Communist Party.  Without this arrangement, the CCP would have embargoed ALL iPhone sales in mainland China.  Period.  This set a terrible precedent, and the EFF and others continue to give them flak for it.  The CSAM image scanning would be a bridge too far, because scanning and reporting rules could be enforced by foreign governments looking to silence dissidents for sharing memes or pictures that match a "known database" of images.
    Apple didn’t simply refuse, they don’t have the ability to decrypt encrypted devices. You get that right?

    As for CSAM scanning, Apple, Google, Microsoft, Dropbox, etc. already do it, server-side. Have for years. Why aren't you upset about that, then? Should they all stop? If your reasons were valid, then it wouldn't matter whether it's server or client side. 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    edited September 2021
  • Reply 36 of 47
    rcfa said:
    Honestly the device side scanning was a privacy thing. They should just not encrypt iCloud photos and scan them for child porn of any kind like Facebook and Google already do. I’m tired of all these people acting like you deserve the right to privately rape kids. 

    Freedom isn’t free, it ALWAYS comes at the cost of others suffering under the irresponsible abuse of freedom. If one isn’t willing to pay that price, there will be no freedom at all.

    A system scanning for CSAM is TECHNICALLY indistinguishable from one scanning for any other type of content; once a system is in place, policy changes are just a matter of time.

    It’s ludicrous to say the criticism is about child porn; it’s about a massive infrastructure being built which, at a moment’s notice and with the stroke of a (foreign) lawmaker’s pen, can be completely changed in its orientation.

    Apple’s word is meaningless, as nobody in their right mind will believe that Apple will drop the US, EU, or Chinese market if unpalatable laws get passed.
    Yeah buddy, child porn scanning has existed for years. You were just ignorant about it. 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

  • Reply 37 of 47
    chasm Posts: 3,347 member
    The amount of misinformation upon which the EFF draws their conclusions strongly suggests that they are a) completely unaware that Google, Microsoft, Facebook, Yahoo, Twitter, and nearly every photo-hosting or sharing site has been doing server-side CSAM scanning for YEARS, which IS personally identifiable, and b) do not understand Apple’s implementation. It’s shocking to see the group so completely uninformed on such an important issue.

    The use of the phrase “backdoor in their encryption”, which is easily proven not to be the case, shows that for once the EFF is misusing its authority and trust to spread FUD and simply doesn’t know what it is talking about. I’ve already sent them an email with the Apple documents they clearly did not read, and informed them that I’m withdrawing my support over this.
  • Reply 38 of 47
    payeco Posts: 581 member
    neilm said:
    Does anyone actually think Apple's CSAM scanning isn't dead as a doornail at this point? 

    Yeah, I know what Apple said, but that's about as convincing as an accused politician resigning "to spend more time with my family."
    That was the impression I got. They put so much effort into defending it over the last month that my guess is they feel they can’t just say “it’s completely dead now, you guys won.” They announced they’re still working on it and need more time to consult, but we’ll never hear about it again. 
  • Reply 39 of 47
    This is one idea Apple needs to abandon. Do the scanning server side. Don’t scan on device. Period. 
    The CSAM scanning is entirely server side. The message scanning is on device without leaving the device, and it only happens on a phone with a child account, once you turn 18 or aren't listed as a child account under someone else's iCloud there is no more on device scanning. 
  • Reply 40 of 47
    crowley Posts: 10,453 member
    scdundas said:
    This is one idea Apple needs to abandon. Do the scanning server side. Don’t scan on device. Period. 
    The CSAM scanning is entirely server side. The message scanning is on device without leaving the device, and it only happens on a phone with a child account, once you turn 18 or aren't listed as a child account under someone else's iCloud there is no more on device scanning. 
    Apple's planned solution for checking the hash of photos being uploaded to iCloud Photos is not server side.  You are misinformed.
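    For readers unsure what "not server side" means here: in the design Apple described, the hash comparison happens on the phone before upload, with the match result sealed into a "safety voucher". The sketch below is only illustrative; the function names are placeholders, a cryptographic hash stands in for NeuralHash, and the real voucher is cryptographically blinded rather than a plain dict.

    ```python
    import hashlib

    def perceptual_hash(photo: bytes) -> str:
        # Placeholder for NeuralHash; SHA-256 is NOT robust to image
        # edits, so this is purely for illustrating WHERE matching runs.
        return hashlib.sha256(photo).hexdigest()

    def make_voucher(photo: bytes, known_hashes: set) -> dict:
        # On-device step: the hash is computed and compared against the
        # shipped database before anything leaves the phone. The server
        # only ever receives the packaged result.
        return {"matched": perceptual_hash(photo) in known_hashes}

    db = {perceptual_hash(b"known-image")}
    print(make_voucher(b"fresh-photo", db))  # {'matched': False}
    ```

    The key point the sketch shows is that the comparison itself runs client-side; the server's role is limited to acting on vouchers it receives.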