Civil rights groups worldwide ask Apple to drop CSAM plans

Posted in iOS, edited August 19
More than 80 civil rights groups have sent an open letter to Apple, asking the company to abandon its child safety plans in Messages and Photos, fearing expansion of the technology by governments.

Apple's new child protection feature


Following the German government's description of Apple's Child Sexual Abuse Material (CSAM) plans as surveillance, 85 organizations around the world have joined the protest. The groups, including 28 based in the US, have written to CEO Tim Cook.

"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM)," says the full letter, "we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children."

"Once this capability is built into Apple products, the company and its competitors will face enormous pressure -- and potentially legal requirements -- from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," it continues.

"Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them," says the letter.

"And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis."

Signatories on the letter have separately been promoting its criticisms, including the Electronic Frontier Foundation.

Apple's latest feature is intended to protect children from abuse, but the company doesn't seem to have considered the ways in which it could enable abuse https://t.co/FjIJN8bKaL

-- EFF (@EFF)


The letter concludes by urging Apple to abandon the new features. It also urges "Apple to more regularly consult with civil society groups" in the future.

Apple has not responded to the letter. However, Apple's Craig Federighi has previously said that the company's child protection message was "jumbled" and "misunderstood."

Read on AppleInsider

Comments

  • Reply 1 of 48
    cjlacz Posts: 42 member
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.
  • Reply 2 of 48
    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

  • Reply 3 of 48
    FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless. 
  • Reply 4 of 48
    FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless. 
    So what you are saying is that since China, Russia, and Turkey have their populations under surveillance, countries under a democracy shouldn't express their VALID privacy concerns about the implementation of this scanning mechanism?

    People, don't forget that none of the people arguing about this implementation are against protecting children or against scanning for CSAM material, which btw already occurs on most cloud services…

    It's about what else, in the wrong hands or under government pressure, could be scanned on that device…

    And again, I'm not talking about totalitarian governments, because they just do it. I am talking about every other government.

    It doesn’t matter how many times Apple will try to explain it. It has that feeling of being a slippery slope for privacy in general. 
  • Reply 5 of 48
    Apple can do no wrong in the eyes of many, but this new feature that Apple has developed is wrong. It's a bad capability put to good use. The objective of reducing the transmission of CSAM is good, but it's like plugging leaks in the proverbial dike: it makes the transmission of illicit content more difficult, but if implemented it will just force the use of other pathways to move the content about.

    The byproduct of this action, the scanning of content on people's devices, will be disastrous. Now that governments know Apple has the ability to interrogate the content on people's devices, it won't be long before they require Apple to perform other types of content scanning. Governments routinely require Apple to divulge iCloud content, which is not end-to-end encrypted. Users had the option of keeping content secure from government eyes by keeping it on their devices and out of iCloud. This capability will mark the beginning of the end of that security, and it is totally at odds with Apple's heretofore emphasis on the privacy and security of content on its devices.

    The law of unintended consequences is going to have a significant impact if this capability is implemented. This is an example of the old Ben Franklin adage about giving up some freedom to gain better security and ending up with neither. I'm surprised that Apple leadership hasn't thought this decision through better, and I'm fairly sure the marketing department at Apple somehow sees this as beneficial to the company and revenues, which I think is decidedly wrong.
  • Reply 6 of 48
    Apple_Bar said:
    FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless. 
    So what you are saying is that since China, Russia and Turkey have their population under surveillance. Then countries under a Democracy shouldn’t express their VALID privacy concerns about the implementation of this scanning mechanism. 
    No, I'm responding to EFF's claim that Apple is somehow harming global freedom and privacy rights with the CSAM hash scanning. Hash scanning isn't something Apple invented. Any government from any country could hire programmers to create hash scanning programs for phones or computers. Whether or not Apple includes the CSAM hash scanning functionality doesn't change anything in that regard. 
    sireofsethkillroy
  • Reply 7 of 48
    tedz98 said: However the byproduct of this action- the scanning of content of people’s devices- will be disastrous. Now that governments know there is an ability for Apple to interrogate the content on people’s devices it won’t be long before governments require Apple to perform other types of content scanning on devices.
    The origins of hash scanning began in the 1940s. It's too late to be worried about the implications for electronic search that involves hashes.

    https://spectrum.ieee.org/hans-peter-luhn-and-the-birth-of-the-hashing-algorithm
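    For what it's worth, the matching step being argued about here can be sketched in a few lines. This is a hypothetical illustration, not Apple's system: NeuralHash (like Microsoft's PhotoDNA) is a *perceptual* hash designed to survive resizing and recompression, whereas the cryptographic SHA-256 used below only matches byte-identical files. All names and data are made up.

    ```python
    import hashlib

    # Hypothetical database of known-bad content hashes (placeholder values,
    # not real). Real systems store perceptual hashes, not SHA-256 digests.
    KNOWN_HASHES = {
        hashlib.sha256(b"example-flagged-content").hexdigest(),
    }

    def matches_known_hash(data: bytes) -> bool:
        """Return True if the data's hash appears in the known-hash set."""
        return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

    print(matches_known_hash(b"example-flagged-content"))  # True
    print(matches_known_hash(b"holiday-photo"))            # False
    ```

    The lookup itself is trivial; the hard (and contested) parts are who curates the hash database and what content gets added to it.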
    edited August 19
  • Reply 8 of 48

    …our ability to say, post or store online what we want without consequence has gone too far.

    I wish people had enough knowledge of history to see how scary what you’ve said sounds. Yes there are problems, some big ones (such as CSAM), but the US and its principles have done so well that people have no meaningful awareness of the cost, and of what made things so (relatively) good. The idea of living under a dictatorship doesn’t even concern many people as long as they agree with what is being dictated. The past 17 months prove that about many people in the US.

    The reality is that freedom must be fiercely protected. While there are many people who don’t understand this, there are many people who do. And that’s why we’re seeing the pushback against Apple’s surveillance technology. I’m glad there are still people in higher profile organisations who see it.
  • Reply 9 of 48
    rcfa Posts: 1,107 member
    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

    Nothing has gone too far! It’s not going far enough. Rights are not defined by how they can be abused. It’s also not the tech companies’ job to “fix” something they didn’t even break. Law enforcement had a lucky break with analog telecommunications, which were trivial to intercept. Before that, people talked in person, or sent trusted messengers, or sent messages that would then be burnt. There was never a right to intercept messages, which is why the laws against intercepting mail are very strict.

    The idea that a temporary technological weakness (analog transmissions) leads to a permanent right to unfettered access to people’s data is ridiculous.

    Does the constitution make any exceptions about search and seizure? No, it doesn’t. Police can’t just randomly enter your house and say: “Oh, we’re just looking for child porn.”
    Does the constitution allow torture or “truth serums”? No, it doesn’t. Modern computing devices, especially phones, are in essence brain prosthetics. Access to these is like tapping someone’s brain.

    Our ability to say, post, or store online what we want without consequences by far doesn’t go far enough, as cancel culture, political correctness, executed atheists, incarcerated regime critics, or even the cases of Snowden and Assange amply prove.
  • Reply 10 of 48
    mrstep Posts: 478 member
    tedz98 said: However the byproduct of this action- the scanning of content of people’s devices- will be disastrous. Now that governments know there is an ability for Apple to interrogate the content on people’s devices it won’t be long before governments require Apple to perform other types of content scanning on devices.
    The origins of hash scanning began in the 1940s. It's too late to be worried about the implications for electronic search that involves hashes.

    https://spectrum.ieee.org/hans-peter-luhn-and-the-birth-of-the-hashing-algorithm
    Concentration camps were pioneered by the British during the Boer War, so it's a little bit late to oppose them? Murder and theft have always been around too, so why bother making laws against them either?

    Apple is adding client-side file scanning on customers' devices, both with hashes (photos) and with ML matching (messages), and everyone's point is that it's almost certain to be abused. Once that 'protection' is added, governments will start the push for matching content on the device regardless of whether it's being uploaded to iCloud, and push to expand the definition to include political content, copyrighted content, and anything else they want to control. You have humble Apple defending this idea, and rights groups, governments, technical experts, and regular people questioning it. It shifts the definition of "privacy" to include searching your content on your own device when there's no warrant and no reasonable suspicion.

    That cloud-side content has been scanned for a while is well known, even if it seems like a questionable practice as well.  If you put something in a bank vault, it typically at least takes a search warrant to get that opened.  If a bank decided it was going to install cameras in your home - but would only use the cameras to inspect items you were going to bring to their vault, pinky promise that it wouldn't be abused! - most people would see that as overreach.

    Others would explain that cameras aren't a new technology and that it's too late to worry about the implications of how they're used. 🤦‍♂️
  • Reply 11 of 48
    rcfa Posts: 1,107 member
    The organizations that know how dangerous this is are putting pressure on Apple. Bravo!

    Apple can get out of this: yank the feature and implement E2E encryption for all data, including iCloud backups.

    Apple can say that they were emotionally swayed in their attempt at solving this problem, but were convinced that the feature could and likely would be abused, and that the resulting discussion persuaded them to double down on user privacy.

    They can. The question is: will they?
  • Reply 12 of 48
    rcfa Posts: 1,107 member
    Apple_Bar said:
    FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless. 
    So what you are saying is that since China, Russia and Turkey have their population under surveillance. Then countries under a Democracy shouldn’t express their VALID privacy concerns about the implementation of this scanning mechanism. 
    No, I'm responding to EFF's claim that Apple is somehow harming global freedom and privacy rights with the CSAM hash scanning. Hash scanning isn't something Apple invented. Any government from any country could hire programmers to create hash scanning programs for phones or computers. Whether or not Apple includes the CSAM hash scanning functionality doesn't change anything in that regard. 
    Yes, it changes EVERYTHING whether Apple includes hash scanning functionality or not, because for hash scanning to work, things have to be unencrypted. And if Apple’s on-device data is encrypted with keys to which only the user has access, and is e2e encrypted, including at rest as it’s stored in the cloud, there’s no hash scanning, either.

    The only time hash scanning could possibly work is if someone opens up an album for cloud sharing with friends, at which time it must be unencrypted or encrypted with a key to which Apple has access. But such scanning would then be stupid, because only the dumbest of the dumb would share questionable content directly in an open photo album, rather than in an encrypted zip file via file sharing.

    In other words, it’s rather clear: unless there’s on-device scanning, people’s data can be kept safe by e2e encryption, something that GrapheneOS will do even if Apple won’t; so if Apple wants to remain competitive, they should do it too.

    On-device scanning is opening the door to all sorts of devils; it’s utterly irrelevant how old the basics of hash scanning are. Anyone who pretends otherwise either demonstrates a lack of understanding of security, or is a government/law enforcement troll who wants to spread FUD about the whole thing, in an effort to save and later expand on Apple’s efforts.
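    The point that hash scanning requires plaintext access can be illustrated with a toy sketch. The XOR "cipher" and the key below are stand-ins for illustration only, not real cryptography: once data is encrypted with a key only the user holds, hashing the ciphertext tells a server nothing about the plaintext.

    ```python
    import hashlib

    def toy_encrypt(key: bytes, data: bytes) -> bytes:
        """Toy XOR stream cipher for illustration only -- NOT real crypto."""
        keystream = hashlib.sha256(key).digest()
        while len(keystream) < len(data):
            keystream += hashlib.sha256(keystream).digest()
        return bytes(b ^ k for b, k in zip(data, keystream))

    photo = b"some photo bytes"
    key = b"demo key held only by the user"  # hypothetical user-held key

    plain_hash = hashlib.sha256(photo).hexdigest()
    cipher_hash = hashlib.sha256(toy_encrypt(key, photo)).hexdigest()

    # A server that only ever receives ciphertext computes cipher_hash,
    # which bears no relation to plain_hash -- so any hash comparison
    # has to happen where the plaintext exists: on the device, before
    # encryption, or on a server holding the decryption key.
    print(plain_hash != cipher_hash)
    ```

    This is the crux of the trade-off the thread is debating: server-side scanning and user-only-key E2E encryption are mutually exclusive, which is why the scanning was proposed on the device.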
  • Reply 13 of 48
    lkrupp Posts: 9,552 member
    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

    What you say, post, or store online DOES have consequences, especially if it incites violence or promotes racism, sexism, xenophobia, child abuse, and so on. We DO NOT have freedom of speech ANYWHERE except on public property. That’s ALL the First Amendment covers. If a private platform doesn’t like what you say, post, or store online, they can censor you. So all this blathering about being able to say whatever you want anywhere you want is nonsense. That freedom does not, and never has, existed.

    I took your response as sarcasm. If it wasn’t, I apologize.
    edited August 19
  • Reply 14 of 48
    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

    Uh...so you're against the First Amendment? Freedom of expression? And to what end would you allow law enforcement to monitor content to make sure people are facing consequences for posting inappropriate online content? 
  • Reply 15 of 48
    lkrupp Posts: 9,552 member
    I just wonder if these civil rights groups are also asking Google and Microsoft to cease their CSAM activities? Anyone know?
  • Reply 16 of 48
    gatorguy Posts: 23,252 member
    lkrupp said:
    I just wonder if these civil rights groups are also asking Google and Microsoft to cease their CSAM activities? Anyone know?
    They aren't doing so on a user's own private smartphone. They wait until you put the data on the company's "property" rather than your own. That's the difference, and I'm certain you know that. 
  • Reply 17 of 48
    lkrupp Posts: 9,552 member

    tylersdad said:
    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

    Uh...so you're against the First Amendment? Freedom of expression? And to what end would you allow law enforcement to monitor content to make sure people are facing consequences for posting inappropriate online content? 
    You clearly don’t understand the First Amendment. 

    “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

    Congress shall make no laws... I have the right to censor speech in my own home. I get to decide who says what in my business. If you work for me and use my internet access, computer, phone, email, I decide what’s appropriate. I get to censor people on the social media platform I own. Only the government is prohibited by the First Amendment from doing so. And by the way, the government does control speech when it comes to certain situations, like incitement to riot, violence, etc.
  • Reply 18 of 48
    payeco Posts: 455 member
    The pressure on this issue keeps rising. I’m curious to see if Apple relents. 
  • Reply 19 of 48
    It’s too late now. They’ve just proven they can build something that helps governments, so governments will demand it now. What an utterly stupid move. The only solution is to INCREASE security by providing end-to-end encryption, regardless of territory. Which they won’t.

    I agree with the objections, especially given that Apple has demonstrated it will let commercial interests take priority over its privacy mantra (forgoing end-to-end encryption after FBI objections, and making concessions after objections from the Saudi, Chinese, and Russian governments).

    At least I hope Apple opens up the operating system for equal integration opportunities (e.g. storage of photos on third-party cloud storage solutions that actually provide 100% privacy and security).

    Bunch of hypocrites. 


  • Reply 20 of 48
    Here is the link for the Electronic Frontier Foundation Petition to stop CSAM scanning:
