German government wants Tim Cook to reconsider CSAM plans

Comments

  • Reply 21 of 42
    fastasleep Posts: 6,408 member

    It’s quite possible that any detection system that may actually work will receive a lot of criticism. Not because loads of people are guilty of what is being looked for, but because most of us probably have files/photos which we would rather not have looked at, as they may be marginally or actually illegal.

    Nobody is looking at your photos. You should probably read about this feature a bit more since you clearly don't understand how it works.
  • Reply 22 of 42
    fastasleep Posts: 6,408 member
    jido said:

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Well, since it doesn't do that, you have nothing to worry about. If you were under 13 years old and your parent had opted in to this feature in your Family account, you'd have to worry about your parents (not authorities) getting notified, and your device would warn you first that that's about to happen if you proceed to send/receive nudes. Are you twelve? Is your parent going to turn this on? No? Then don't worry about it.
  • Reply 23 of 42
    baconstang Posts: 1,104 member
    The Cats. Esq. sez:

    "It's no more a slippery slope than Facebook, Instagram, Google Photos, Youtube, et. al. each scanning content and hashes upon upload. To address the topic of people worried this would devolve into a tool to suppress political dissent: it's trivial to repurpose a political message to evade a hash by reinventing the content - indeed this happens naturally even without such restrictions, while on the other hand it's difficult to repurpose a series of photos or videos to evade filters (see also copyright filters.)

    That's why I don't use any of that BS.
  • Reply 24 of 42
    M68000 Posts: 719 member
    Ofer said:
    As others have already said, technically this amounts to illegal search. Law enforcement agencies are required to have reasonable suspicion and to obtain a warrant in order to search for illegal materials. Yet what Apple is doing is searching through every iCloud owner’s account for illegal material. Regardless of whether or not other companies already do this or whether or not it has potential for abuse by authorities, the act of searching people who are supposed to be presumed innocent is not right.
    It’s legal if Apple has it in their terms of use or legal disclaimers that nearly everybody clicks through without reading when installing software.   It’s really their computer and their software and OS.  They can and will change their legal agreement for their terms of use for their customers. 
  • Reply 25 of 42
    entropys Posts: 4,152 member
    M68000 said:
    Ofer said:
    As others have already said, technically this amounts to illegal search. Law enforcement agencies are required to have reasonable suspicion and to obtain a warrant in order to search for illegal materials. Yet what Apple is doing is searching through every iCloud owner’s account for illegal material. Regardless of whether or not other companies already do this or whether or not it has potential for abuse by authorities, the act of searching people who are supposed to be presumed innocent is not right.
    It’s legal if Apple has it in their terms of use or legal disclaimers that nearly everybody clicks through without reading when installing software.   It’s really their computer and their software and OS.  They can and will change their legal agreement for their terms of use for their customers. 
    Yes unfortunately that is true.
  • Reply 26 of 42
    bbh Posts: 134 member
    After much research, I personally have come to the conclusion that this is much ado about nothing. The doomsday "what if" scenarios can go on ad infinitum. It does seem that Apple, smart folks that they are, has come up with a scheme that can keep CSAM off their servers, WITHOUT LOOKING AT YOUR PHOTOS. 

    Yes, part of it is resident on your device. Yes, endless potential for abuse by foreign or domestic Fascists exists, but it does already. There are Congressmen who even now don't think you have the right to any secrets. This sets the stage for Apple to offer E2EE while still hosting your photos on their servers. 

    Take a deep breath and consider your options if you are still bummed by this. Synology has a nice variety of NAS devices that would allow you to remain totally "local". No cloud presence at all. I've got a 2 drive one, but am not concerned with Apple's approach to CSAM. 
  • Reply 27 of 42
    Well, I guess it does not matter whether Apple is right or wrong. Based on the letter, it seems the German Parliament (and hence the government) is against this. It looks like this will be discussed in the EU as well, and the situation may not be different there. I doubt whether China, Russia, Iran, and other countries would agree to an American company installing official spyware on their phones, literally giving America a tool to spy on their citizens. The German Parliament has issued a veiled threat (based on what is in the article) that Apple is bound to lose access to large markets if it continues with this harebrained idea. I think Apple will backtrack on this issue and will be left with egg on its face. However, its usual PR machine will cover it up in a few months.
  • Reply 28 of 42
    jido Posts: 125 member
    jido said:

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Well, since it doesn't do that, you have nothing to worry about. If you were under 13 years old and your parent had opted in to this feature in your Family account, you'd have to worry about your parents (not authorities) getting notified,
    Luckily nobody forces me to join their Family account, that's right.
     and your device would warn you first that that's about to happen
    Does the device notify that I received suspicious content or does it notify that I bypassed the warning and loaded it? Not that it makes a huge difference.
    if you proceed to send/receive nudes.
    Who knows what it will flag, do I trust Apple that their AI only looks for nudes?
    Are you twelve? Is your parent going to turn this on? No? Then don't worry about it.
    Yeah but still...
  • Reply 29 of 42
    crowley Posts: 10,453 member
    jido said:
    jido said:

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Well, since it doesn't do that, you have nothing to worry about. If you were under 13 years old and your parent had opted in to this feature in your Family account, you'd have to worry about your parents (not authorities) getting notified,
    Luckily nobody forces me to join their Family account, that's right.
     and your device would warn you first that that's about to happen
    Does the device notify that I received suspicious content or does it notify that I bypassed the warning and loaded it? Not that it makes a huge difference.
    if you proceed to send/receive nudes.
    Who knows what it will flag, do I trust Apple that their AI only looks for nudes?
    Are you twelve? Is your parent going to turn this on? No? Then don't worry about it.
    Yeah but still...
    But still what?
  • Reply 30 of 42
    ikir Posts: 127 member
    genovelle said:
    Since the images are not scanned, but rather specific hash markers are used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out against it, as many others are: they have these files themselves and are fearful of being caught.
    Sure... everyone that's against this has CSAM... that makes perfect sense...

    Or I don't know, maybe people actually value their privacy and Apple just messed up big time?

    It's not an objection to the scanning, it's an objection to the scanning being done on your device without you having the option to disable it.
    Or users just didn’t understand how it works. Since Google and others are already doing it without respecting your privacy, Apple found a way to check for CSAM without scanning photos. There are two kinds of people who are against these features: those who have CSAM images, and those who didn’t understand how it works and listened to others’ paranoia.
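    To make the distinction concrete, here is a toy sketch of on-device hash matching in Swift. It is only an illustration under made-up names: SHA-256 and a plain Set stand in for Apple's perceptual NeuralHash and blinded database matching, which work quite differently, but the privacy-relevant point is the same.

    ```swift
    import CryptoKit
    import Foundation

    // Toy sketch only: SHA-256 and a plain Set stand in for the real system's
    // perceptual hash (NeuralHash) and blinded, on-device hash database.
    func photoMatchesKnownHashes(_ imageData: Data, knownHashes: Set<String>) -> Bool {
        // The fingerprint is computed locally; no person and no server
        // inspects the photo itself for this comparison.
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }

    // Hypothetical usage with placeholder data.
    let database: Set<String> = []                 // would ship with the OS
    let photo = Data("example image bytes".utf8)
    print(photoMatchesKnownHashes(photo, knownHashes: database))   // false
    ```

    Only a derived fingerprint is compared against a list of known material; nothing "looks at" the photo.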
  • Reply 31 of 42
    ikir Posts: 127 member
    jido said:
    Client-side CSAM detection is incontrovertibly better for privacy than server-side CSAM detection.
    THAT!

    I am happy about the change; it means Apple doesn't need to see the contents of my pictures to thwart child abuse.

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Apple doesn’t see your photos. See, that’s paranoia. Messages doesn’t function this way either; it is for children who are in a family group. 

    These comments are proof users don’t have a clue.
  • Reply 32 of 42
    bbh said:
    After much research, I personally have come to the conclusion that this is much ado about nothing. The doomsday "what if" scenarios can go on ad infinitum. It does seem that Apple, smart folks that they are, has come up with a scheme that can keep CSAM off their servers, WITHOUT LOOKING AT YOUR PHOTOS. 

    Yes, part of it is resident on your device. Yes, endless potential for abuse by foreign or domestic Fascists exists, but it does already. There are Congressmen who even now don't think you have the right to any secrets. This sets the stage for Apple to offer E2EE while still hosting your photos on their servers. 

    Take a deep breath and consider your options if you are still bummed by this. Synology has a nice variety of NAS devices that would allow you to remain totally "local". No cloud presence at all. I've got a 2 drive one, but am not concerned with Apple's approach to CSAM. 
    "This sets the stage for Apple to offer E2EE while still hosting your photos on their servers." - This is pure SPECULATION from you, nothing more. Apple has NOT mentioned it anywhere in their explanation. If this IS their goal, they SHOULD have called this out explicitly in their announcement for CSAM scanning, which they have NOT done YET. Let Apple make an announcement on this FIRST, then they will get a little bit of credibility on this topic.
  • Reply 33 of 42

    I applaud Apple’s initiative to monitor our private content on our private devices without our consent or a search warrant or even suspicion of wrongdoing. In fact I don’t think they went far enough. They should do what they did in 1984 and install cameras and microphones in every single wall in every single room so that whoever is on the other side of the line can monitor our every move all the time. Thanks to Apple we are almost there anyway. The slogan can remain ‘If you have nothing to hide, you have nothing to fear’ or maybe ‘It is for the children!’. 

  • Reply 34 of 42
    jido Posts: 125 member
    ikir said:
    jido said:
    Client-side CSAM detection is incontrovertibly better for privacy than server-side CSAM detection.
    THAT!

    I am happy about the change; it means Apple doesn't need to see the contents of my pictures to thwart child abuse.

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Apple doesn’t see your photos. See, that’s paranoia. Messages doesn’t function this way either; it is for children who are in a family group. 

    These comments are proof users don’t have a clue.
    Feel free to attack my comments. Do you know how CSAM detection works?
    Even if no Apple employee has seen my pictures, the Apple (or Facebook, or Google, ...) servers need access to unencrypted images to produce a hash, which is definitely worse for privacy than scanning on-device (rough sketch below).

    I will not comment further on Messages and keep my concerns to myself.
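    Here is the rough sketch mentioned above of why client-side hashing changes the privacy picture (hypothetical types only, nothing like Apple's actual safety-voucher design):

    ```swift
    import CryptoKit
    import Foundation

    // Server-side scanning: the provider must receive the plaintext image to hash it.
    struct ServerSideUpload {
        let plaintextImage: Data        // readable by the provider
    }

    // Client-side scanning: only ciphertext plus a locally derived hash leave the device.
    struct ClientSideUpload {
        let encryptedImage: Data        // provider stores ciphertext only
        let imageHash: String           // reveals a possible match, not the image content
    }

    func makeClientSideUpload(image: Data, key: SymmetricKey) throws -> ClientSideUpload {
        let sealedBox = try AES.GCM.seal(image, using: key)
        let hash = SHA256.hash(data: image).map { String(format: "%02x", $0) }.joined()
        // .combined is non-nil for the default 12-byte nonce used here.
        return ClientSideUpload(encryptedImage: sealedBox.combined!, imageHash: hash)
    }

    // Hypothetical usage:
    let key = SymmetricKey(size: .bits256)
    if let upload = try? makeClientSideUpload(image: Data("photo bytes".utf8), key: key) {
        print(upload.imageHash)
    }
    ```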
  • Reply 35 of 42
    zimmie Posts: 651 member
    jido said:

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Well, since it doesn't do that, you have nothing to worry about. If you were under 13 years old and your parent had opted in to this feature in your Family account, you'd have to worry about your parents (not authorities) getting notified, and your device would warn you first that that's about to happen if you proceed to send/receive nudes. Are you twelve? Is your parent going to turn this on? No? Then don't worry about it.
    Strictly, the first question should be "Are your parents going to tell Apple you're 12?" To the best of my knowledge, Apple doesn't do any verification of the actual age of child accounts. That part is definitely open to abuse, and is likely to wind up outing some LGBT teens to their less-than-understanding parents.

    Only applies to Messages, though, and there are plenty of other services.
  • Reply 36 of 42
    IreneW Posts: 303 member
    entropys said:
    M68000 said:
    Ofer said:
    As others have already said, technically this amounts to illegal search. Law enforcement agencies are required to have reasonable suspicion and to obtain a warrant in order to search for illegal materials. Yet what Apple is doing is searching through every iCloud owner’s account for illegal material. Regardless of whether or not other companies already do this or whether or not it has potential for abuse by authorities, the act of searching people who are supposed to be presumed innocent is not right.
    It’s legal if Apple has it in their terms of use or legal disclaimers that nearly everybody clicks through without reading when installing software.   It’s really their computer and their software and OS.  They can and will change their legal agreement for their terms of use for their customers. 
    Yes unfortunately that is true.
    Actually, in many parts of the world it is not enough to put something in a EULA that most people click away. 
  • Reply 37 of 42
    M68000 said:
    Ofer said:
    As others have already said, technically this amounts to illegal search. Law enforcement agencies are required to have reasonable suspicion and to obtain a warrant in order to search for illegal materials. Yet what Apple is doing is searching through every iCloud owner’s account for illegal material. Regardless of whether or not other companies already do this or whether or not it has potential for abuse by authorities, the act of searching people who are supposed to be presumed innocent is not right.
    It’s legal if Apple has it in their terms of use or legal disclaimers that nearly everybody clicks through without reading when installing software.   It’s really their computer and their software and OS.  They can and will change their legal agreement for their terms of use for their customers. 
    Point of clarification on terms and conditions of use:  It is legal only if doing so does not violate law.  Contracts (agreements) which violate law are either invalid in their entirety or the portions which violate law are invalid.  This is not to say the terms of use being discussed here are illegal, just that there are limits to terms and conditions of use.
  • Reply 38 of 42
    mSak Posts: 22 member
    It’s a total slippery slope. While Apple promises not to let rogue governments force it to do anything that isn’t palatable, let’s look at the U.S. The Trump administration was one of the least palatable US governments in recent history, and its effects are still ongoing and greatly influence the Republican party. Do we trust even a government like the Trump administration not to find a way to get Apple to misuse CSAM detection?

    For the sake of privacy, Apple should drop this silly CSAM move.
  • Reply 39 of 42
    fastasleep Posts: 6,408 member
    jido said:
    jido said:

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Well, since it doesn't do that, you have nothing to worry about. If you were under 13 years old and your parent had opted in to this feature in your Family account, you'd have to worry about your parents (not authorities) getting notified,
    Luckily nobody forces me to join their Family account, that's right.
     and your device would warn you first that that's about to happen
    Does the device notify that I received suspicious content or does it notify that I bypassed the warning and loaded it? Not that it makes a huge difference.
    if you proceed to send/receive nudes.
    Who knows what it will flag, do I trust Apple that their AI only looks for nudes?
    Are you twelve? Is your parent going to turn this on? No? Then don't worry about it.
    Yeah but still...
    It notifies YOU that you are sending or receiving nudes and that if you continue, it will notify your parents, IF this feature is enabled by the parent. Why would it scan for something else? It doesn’t notify the parents unless you proceed. This is all documented by Apple. 

    “Yeah but still” isn’t a compelling argument. 
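    For anyone still unsure, the documented behaviour boils down to roughly this decision logic (a toy model with made-up names, not Apple's code):

    ```swift
    // Toy model of the Messages communication-safety flow described above.
    enum MessageOutcome {
        case deliveredNormally              // feature off: nothing happens
        case childWarnedOnly                // warning shown; no one else is told
        case childWarnedThenParentNotified  // parent told only because the child proceeded
    }

    func outcomeForFlaggedImage(featureEnabledByParent: Bool,
                                childIsUnder13: Bool,
                                childChoseToProceed: Bool) -> MessageOutcome {
        guard featureEnabledByParent else { return .deliveredNormally }
        // Older children get the warning but never trigger a parental notification.
        guard childIsUnder13 else { return .childWarnedOnly }
        return childChoseToProceed ? .childWarnedThenParentNotified : .childWarnedOnly
    }

    // Hypothetical usage:
    print(outcomeForFlaggedImage(featureEnabledByParent: true,
                                 childIsUnder13: true,
                                 childChoseToProceed: false))   // childWarnedOnly
    ```

    The parent is only ever notified on the last branch: an under-13 account, the feature turned on by the parent, and the child choosing to proceed past the warning.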
  • Reply 40 of 42
    fastasleep Posts: 6,408 member
    zimmie said:
    jido said:

    The Messages update may be more nefarious, don't want to notify authorities about what images I receive.
    Well, since it doesn't do that, you have nothing to worry about. If you were under 13 years old and your parent had opted in to this feature in your Family account, you'd have to worry about your parents (not authorities) getting notified, and your device would warn you first that that's about to happen if you proceed to send/receive nudes. Are you twelve? Is your parent going to turn this on? No? Then don't worry about it.
    Strictly, the first question should be "Are your parents going to tell Apple you're 12?" To the best of my knowledge, Apple doesn't do any verification of the actual age of child accounts. That part is definitely open to abuse, and is likely to wind up outing some LGBT teens to their less-than-understanding parents.

    Only applies to Messages, though, and there are plenty of other services.
    There are all sorts of ways parental controls or other mechanisms can be potentially abused by bad actors. Does that mean we shouldn’t build any controls? No. 

    Also, the child in question is given a choice by the system that allows them to avoid the situation you describe by not proceeding.