Civil rights groups worldwide ask Apple to drop CSAM plans


Comments

  • Reply 21 of 48
    maximara Posts: 409 member
    It’s too late now. They’ve just proven they can build something that helps governments, so they’ll demand it now. What an utterly stupid move. The only solution is to INCREASE security by providing end-to-end encryption, regardless of territory. Which they won’t.
    Actually, if you think about it, it is a brilliant move. Odds are this thing is based on complying with laws already on the books (18 U.S. Code § 2258 and its four sub-laws here in the US), allowing Apple to take the moral high ground against government efforts to open up their iPhones.

    "But if we allow sideloading they will be able to bypass this measure.  Think of the children that will be harmed!" and to their customers they will point to those very same laws and likely say "But the law effectively requires us to do this - to protect the children."

    Then they break out the popcorn and watch the political free-for-all.

    CheeseFreeze
  • Reply 22 of 48
    maestro64 Posts: 5,043 member
    I think the better solution is to not allow the device to store or transmit these kinds of images, or even take those kinds of photos in the first place - like the color photocopier industry did when the government found out how easy it was to photocopy money and have it look real. Today, if you try to photocopy currency, you just get a black image, or it will not produce a copy at all.

    Apple is not in the business of reporting laws being broken or spying on people to figure out whether they have broken some law. If they care about kids, then prevent the device from contributing to the crime.
    muthuk_vanalingam
  • Reply 23 of 48
    eriamjh Posts: 1,644 member
    Won’t somebody think of the children?!?!

    /Mrs. Lovejoy
    prismatics, killroy, entropys
  • Reply 24 of 48
    DAalseth Posts: 2,783 member
    I can’t believe how badly Apple has bungled this. They used to be masters of controlling the message, but this has been pure amateur hour. They should have foreseen the worries, the criticism, the accusations. Heck, people here saw the possible ramifications within minutes of the first article being posted. But no, they dropped it and let the narrative get away from them. I won’t comment on whether these organizations’ worries are correct, but Apple has put themselves in a no-win situation. If they go ahead with it, they will be the company that works with dictators to spy on dissidents and activists. If they kill the project, they will be the company that supports predators.

    Apple has screwed themselves utterly on this one. 
    xyzzy-xxx, mobird, baconstang, entropys, macplusplus
  • Reply 25 of 48
    jungmark Posts: 6,926 member
    maestro64 said:
    I think the better solution is to not allow the device to store or transmit these kinds of images, or even take those kinds of photos in the first place - like the color photocopier industry did when the government found out how easy it was to photocopy money and have it look real. Today, if you try to photocopy currency, you just get a black image, or it will not produce a copy at all.

    Apple is not in the business of reporting laws being broken or spying on people to figure out whether they have broken some law. If they care about kids, then prevent the device from contributing to the crime.
    How can Apple do that if no one wants them to “scan images”? 
    killroy
  • Reply 26 of 48
    9secondkox2 Posts: 2,710 member
    George Orwell would be proud. 

    This is somebody snooping on you and controlling you, no matter which way it goes. 

    Sure, the premise is something we all want: protect children from evil - even that which they could bring in themselves. 

    But it really is an open door to massive abuse, spying, censorship, and control. 

    In the madman era we live in now where evil is called good, good is called evil, and common sense doesn’t matter if it’s unpopular, this is really scary stuff. 

    This can and WILL be used against unpopular opinion (whether right or wrong), political dissent, religious conviction and liberty, etc., and it will be a tool to silence those not deemed politically “correct,” regardless of the freedom of speech and expression afforded us in the USA and regardless of the merit of any disagreement.

    We’ve already seen what the media does with its reality distortion field, and what social media companies (and even Apple Podcasts) do with opinions they disagree with. They simply cancel them.

    Enough is enough. Come on, Apple. You made the wrong choice here.

    Simply bake an image/text viewing option into parental controls, with a notification on the child’s phone that the image they are about to send or receive is monitored (and make it a condition for other communications apps to support this feature), and be done with it.


    entropys
  • Reply 27 of 48
    killroy Posts: 276 member
    lkrupp said:
    I just wonder if these civil rights groups are also asking Google and Microsoft to cease their CSAM activities? Anyone know?

    You know they’re not. On some boards, if you bring that up, they go deaf, dumb, and blind.
    edited August 2021
  • Reply 28 of 48
    killroy Posts: 276 member
    tylersdad said:
    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

    Uh...so you're against the First Amendment? Freedom of expression? And to what end would you allow law enforcement to monitor content to make sure people are facing consequences for posting inappropriate online content? 

    And the child's freedom.
  • Reply 29 of 48
    zimmie Posts: 651 member
    Spotlight already indexes everything on your phone. If these totalitarian nightmares were going to happen, why wouldn't they have started a decade ago with the technology which has always done what people are afraid this new tech might be extended to do?
    dee_dee, maximara
  • Reply 30 of 48
    EsquireCats Posts: 1,268 member
    "Backdoor" doesn't mean what these people think it means. (That's likely also why Craig wasn't able to address the question.)

    Most of this stuff is just hysterics from technophobes - scanning for CP content has long been out of the bag; it's in every major social media service, and Apple would be the last large provider to enable such a technology on their cloud photo service. Apple's implementation is arguably the best at preserving privacy, minimising false positives, and addressing the potential for abuse by having human intervention as part of the review process.

    The only merit I can give is to the example of unfriendly relations between abusive parents and children; however, it's such a drawn-out series of requirements, and the child can still prevent any notification from being sent to the abusive parent - in the worst-case scenario, the child merely uses a different chat app.
    maximara
  • Reply 31 of 48
    I don‘t get it: if what is being said is true, then this kind of tech is already running on many file-hosting servers, which would mean that the door of concern is already wide open.
  • Reply 32 of 48
    They have turned this into a bigger fiasco than the Maps screw up.
    DAalseth, macplusplus
  • Reply 33 of 48
    Apple makes mistakes. The butterfly keyboard was one of them. But this mistake was trivial compared to Apple’s decision to start scanning our phones for images. It’s the equivalent of someone entering your house, rifling through your photo albums, taking a few away, and delivering them to the authorities. 

    Of course we must do all we can to prevent child abuse. But not by abusing our fundamental rights. Surely a better way to prevent CSAM is to scan photo collections in the cloud, like Google does. But this process should NOT occur on our phones which should be sacrosanct. 

    Yes, I know that Apple says only half the process is done on the phone and that it’s only looking for digital tags. But that’s just techno mumbo jumbo. The point is that the privacy of our iPhone is being breached and being used to monitor us. It’s a back door to a bleak future. 
    DAalseth, macplusplus, CheeseFreeze, baconstang, muthuk_vanalingam
  • Reply 34 of 48
    maximara said:
    It’s too late now. They’ve just proven they can build something that helps governments, so they’ll demand it now. What an utterly stupid move. The only solution is to INCREASE security by providing end-to-end encryption, regardless of territory. Which they won’t.
    Actually, if you think about it, it is a brilliant move. Odds are this thing is based on complying with laws already on the books (18 U.S. Code § 2258 and its four sub-laws here in the US), allowing Apple to take the moral high ground against government efforts to open up their iPhones.

    "But if we allow sideloading they will be able to bypass this measure.  Think of the children that will be harmed!" and to their customers they will point to those very same laws and likely say "But the law effectively requires us to do this - to protect the children."

    Then they break out the popcorn and watch the political free-for-all.

    Very interesting point you’re making here. Indeed, this might be much more political than most of us think, with Apple protecting its self-interest in keeping a closed ecosystem alive.
  • Reply 35 of 48
    maestro64 said:
    I think the better solution is to not allow the device to store or transmit these kinds of images, or even take those kinds of photos in the first place - like the color photocopier industry did when the government found out how easy it was to photocopy money and have it look real. Today, if you try to photocopy currency, you just get a black image, or it will not produce a copy at all.

    Apple is not in the business of reporting laws being broken or spying on people to figure out whether they have broken some law. If they care about kids, then prevent the device from contributing to the crime.
    It’s practically impossible to manually detect when an image is contextually child porn.

    A big part of what makes these images illegal is context/situation. You can’t really accurately differentiate between what is completely inappropriate and unacceptable, and a photo of your kid running in the yard naked. 
    The false positives would be terrible. 
    baconstang
  • Reply 36 of 48
    payeco Posts: 581 member
    maestro64 said:
    I think the better solution is to not allow the device to store or transmit these kinds of images, or even take those kinds of photos in the first place - like the color photocopier industry did when the government found out how easy it was to photocopy money and have it look real. Today, if you try to photocopy currency, you just get a black image, or it will not produce a copy at all.

    Apple is not in the business of reporting laws being broken or spying on people to figure out whether they have broken some law. If they care about kids, then prevent the device from contributing to the crime.
    This is completely infeasible. How is Apple supposed to determine that a photo shows the breast of a 15-year-old girl, and therefore block someone from taking it, versus the breast of an 18-year-old woman, and allow it?
    fastasleep
  • Reply 37 of 48
    I see 3 problems with Apple’s CSAM detection:

    1) Proof of concept

    Remember when the FBI tried to use the San Bernardino shooter case to compel Apple to put a back door in iOS? They couldn’t, because in an American courtroom code is speech, and the First Amendment protects Apple from being compelled to speak.

    That protection comes into question once they’ve already shipped code which the government would like to repurpose. There’s already a working proof of concept: the capability now exists for Apple to scan your content and report you to the authorities.

    2) Lack of foresight

    Ostensibly, this tech exists to detect and report CSAM. But by publicizing it well in advance and issuing documentation about the detection process, much of this content will either become better hidden or leave the platform. That doesn’t mean it will go away; it will just go somewhere else. So the problem Apple is solving is not truly “how can we detect and report CSAM,” it’s “how can we get CSAM off our platform.”

    I don’t see how this solves the real issue, and regrettably, it may make it worse.

    3) Unintended consequences

    We like to think we know what the future holds and the net effect of what we’re building. But our best intentions often backfire. Apple execs now limit their kids’ use of devices. Facebook execs now keep their children off the same platforms they’re building. Things evolve from their original design.

    The only protection we have right now against Apple using this tech against people is a promise. “We won’t. And if we ever did, programmers could find out.” These promises don’t protect anyone from the legal, regulatory, and economic pressure that could compel or coerce Apple to do otherwise.

    We’ve seen Apple publicly flip on all sorts of things (iPods with video, anyone?) - normally when it’s in their customers’ best interests, but also when it’s technically, legally, financially, or ethically in Apple’s OWN best interest (the way they continue to capitulate to China feels like a harbinger here).

    I fear for the future of this tech.
    edited August 2021
  • Reply 38 of 48
    Oh my God, how are people not understanding this feature?
    To all of you worried about on-device image scanning: how do you think you can search for images of people on your phone? That has been done on device for years.
    To those who think this breaks e2e encryption: this is actually Apple's way of preserving it, by scanning images on device instead of in the cloud.
    Basically all other companies offering cloud storage - Microsoft, Google, Facebook, Amazon, etc. - are already doing this as part of the service. It would be much easier for nefarious governments to force them to scan for more things; if they agreed to it, you would not even know.
    For Apple to expand this functionality, it would have to happen on device, which would be much harder to do in the dark.
    There is one new thing here: Apple will receive information about the on-device scan when you upload your pictures. A positive CSAM scan result is uploaded with a key that allows Apple to decrypt that image (a rough sketch of that flow follows below).
    So there IS a privacy concern, but it is not at all what people are howling about.
    I think that Apple has gone to great lengths to preserve privacy while also attempting to offer less implicit support for child abuse.
    There is definitely no Pandora's box here. The novelty is not surveillance, but how to avoid it as much as possible while also preventing abuse of their services.
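    To make that concrete, here is a rough sketch in Python of the flow described above. Everything in it is a hypothetical stand-in: the real system uses a perceptual hash (NeuralHash), a blinded on-device database, private set intersection, and threshold secret sharing, none of which are modeled here - a plain SHA-256 digest and a plaintext set of "known" digests take their place.

    ```python
    import hashlib
    import secrets

    # Hypothetical stand-in for the on-device database of known-CSAM hashes.
    # In the real system this database is blinded, so the device cannot read it.
    KNOWN_HASHES = {
        hashlib.sha256(b"example-known-bad-image-bytes").hexdigest(),
    }

    def image_hash(image_bytes: bytes) -> str:
        """Stand-in for a perceptual hash of the image content."""
        return hashlib.sha256(image_bytes).hexdigest()

    def make_upload_record(image_bytes: bytes) -> dict:
        """Build the record that would accompany a photo upload.

        Every photo is uploaded encrypted. Only a photo whose hash matches the
        known-hash database also carries a 'voucher' with key material the
        server could use to decrypt and review that one image.
        """
        digest = image_hash(image_bytes)
        record = {"encrypted_image": b"<ciphertext>", "voucher": None}
        if digest in KNOWN_HASHES:
            # Hypothetical key share; the real design only becomes decryptable
            # after a threshold number of matches, which is not modeled here.
            record["voucher"] = {"hash": digest, "key_share": secrets.token_hex(16)}
        return record

    if __name__ == "__main__":
        print(make_upload_record(b"an ordinary holiday photo"))      # voucher: None
        print(make_upload_record(b"example-known-bad-image-bytes"))  # voucher attached
    ```

    The point of the sketch is the one made above: the server only gains material tied to images that matched the known-hash list, not access to the rest of the library.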

    CheeseFreeze, fastasleep, Detnator
  • Reply 39 of 48
    DAalseth said:
    I can’t believe how badly Apple has bungled this. They used to be masters of controlling the message, but this has been pure amateur hour. They should have foreseen the worries, the criticism, the accusations. Heck, people here saw the possible ramifications within minutes of the first article being posted. But no, they dropped it and let the narrative get away from them. I won’t comment on whether these organizations’ worries are correct, but Apple has put themselves in a no-win situation. If they go ahead with it, they will be the company that works with dictators to spy on dissidents and activists. If they kill the project, they will be the company that supports predators.

    Apple has screwed themselves utterly on this one. 
    Perhaps the problem with Apple's apparent inability to see the issue here is due to a lack of diversity.  Not the irrelevant view of diversity where we categorize people based upon skin color, gender, religion, and the like, but true diversity: diversity of thought.  Denise Young Smith was right on target with her comments on diversity, so Apple ousted her.  Could this be an example of what happens when you fail to have diversity of thought and perspective?  When senior management is working in lockstep rather than asking questions?  Having seen Cook's response to shareholder questions about his leadership, would an employee dare to rock the boat?

    Perhaps?
    edited August 2021
  • Reply 40 of 48
    gatorguy Posts: 24,213 member
    "Backdoor" doesn't mean what these people think it means. (That's likely also why Craig wasn't able to address the question.)

    Most of this stuff is just hysterics from technophobes - scanning for CP content has long been out of the bag; it's in every major social media service, and Apple would be the last large provider to enable such a technology on their cloud photo service. Apple's implementation is arguably the best at preserving privacy, minimising false positives, and addressing the potential for abuse by having human intervention as part of the review process.

    The only merit I can give is to the example of unfriendly relations between abusive parents and children; however, it's such a drawn-out series of requirements, and the child can still prevent any notification from being sent to the abusive parent - in the worst-case scenario, the child merely uses a different chat app.
    Read this article about a pair of researchers who designed a system essentially identical to Apple's CSAM on-device scanning and found it had a HUGE issue and was simply too dangerous to put to use (a rough sketch of the repurposing concern follows the excerpt):


    Our research project began two years ago, as an experimental system to identify CSAM in end-to-end-encrypted online services. As security researchers, we know the value of end-to-end encryption, which protects data from third-party access. But we’re also horrified that CSAM is proliferating on encrypted platforms. And we worry online services are reluctant to use encryption without additional tools to combat CSAM.

    We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn’t read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection. 

    After many false starts, we built a working prototype. But we encountered a glaring problem.

    Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

    A foreign government could, for example, compel a service to out people sharing disfavored political speech. That’s no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident materials. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook, and Twitter for not removing pro-democracy protest materials.

    We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We’d planned to discuss paths forward at an academic conference this month.

    That dialogue never happened. The week before our presentation, Apple announced it would deploy its nearly identical system on iCloud Photos, which exists on more than 1.5 billion devices. Apple’s motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours. But we were baffled to see that Apple had few answers for the hard questions we’d surfaced.

    https://www.washingtonpost.com/opinions/2021/08/19/apple-csam-abuse-encryption-security-privacy-dangerous/
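    To make the researchers' repurposing concern concrete, here is a rough Python sketch with the same simplifications as the one earlier in the thread (a plain SHA-256 digest standing in for perceptual hashing, a plaintext set standing in for a blinded database; all names are hypothetical). The matcher only ever sees opaque digests, so swapping in a different database silently changes what gets flagged, exactly as the researchers warn:

    ```python
    import hashlib

    def matches(database: set, content: bytes) -> bool:
        """Content-agnostic matcher: it compares opaque digests, so nothing in
        the code reveals whether the database describes CSAM or political speech."""
        return hashlib.sha256(content).hexdigest() in database

    # Hypothetical databases - the matching code is identical for both.
    csam_db    = {hashlib.sha256(b"known-abuse-image").hexdigest()}
    dissent_db = {hashlib.sha256(b"banned-protest-flyer").hexdigest()}

    flyer = b"banned-protest-flyer"
    print(matches(csam_db, flyer))     # False
    print(matches(dissent_db, flyer))  # True: same code, different database
    ```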

    edited August 2021
    muthuk_vanalingam