Epic Games CEO slams Apple 'government spyware'

Comments

  • Reply 81 of 108
    macplusplus Posts: 2,112 member
    Rayz2016 said:
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/


    Google found the child porn by scanning the file on their servers. This is nothing new; Microsoft, Apple and Google have been doing this for quite some time.

    The problem is that this will not find porn if the files are encrypted before they are sent to the server.

    So Apple will get around this by installing a spy program that will scan photos, hash them, compare them against the database of child porn image hashes Apple has downloaded to your phone, and report on them before they are sent to iCloud.

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry. 

    Apple has basically told the world that they're happy to build a surveillance network for any country that asks them to.

    That's the problem.
    I’m not positive here but I think you are conflating two similar but different initiatives. 

    The first is the “scanning” of images uploaded to iCloud Photos for CSAM. If a threshold is met, an Apple employee will verify whether the algorithm correctly identified CSAM. If so, it is reported to the authorities. None of this happens on an iPhone; it is all done on Apple’s servers.

    The second is the scanning of images in Messages, which happens entirely on-device and doesn’t send any report to Apple, only a notification to the parent account that manages the child’s device. Also, this feature is opt-in.

    So again I ask, how is this different from what has already been happening? The iCloud Photos scanning is, as far as I can tell, no different from the Gmail scanning.

    Also, if a government wants to inject its own images into a hashed library to get positive hits on CSAM, the positive hits will still be reviewed. When there is no CSAM involved, it will end there. 

    The first is the labeling of user content at the request of an external entity. If you open the gate to such attempts, you can't control where it will end. You can't control which entities may request the labeling of your content, or according to what criteria. This is not much different from installing a camera in your house to monitor whether you abuse your children, one that records only when a child screams!
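
To make the mechanism being argued over above a little more concrete, here is a minimal sketch of the on-device matching step as the quoted posts describe it. It is a simplification built on stated assumptions: Apple's actual system uses a perceptual NeuralHash and a blinded database rather than the plain SHA-256 set lookup shown here, and loadKnownHashes() is a hypothetical helper, not a real API.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device table of known-image hashes.
// In Apple's design this is a blinded NeuralHash database shipped with the OS;
// here it is just a set of hex strings for illustration.
func loadKnownHashes() -> Set<String> {
    return []  // placeholder; the real table is supplied by Apple, not assembled by the user
}

// Hash a photo and look it up in the local table before upload.
// SHA-256 is an exact hash, so unlike a perceptual hash it only matches
// byte-identical files; this sketches the flow, not Apple's matching algorithm.
func photoMatchesKnownImage(_ photoData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

The disagreement in the rest of the thread is not about this lookup itself but about where it runs (on the phone, just before an iCloud Photos upload) and who controls the table it is matched against.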
  • Reply 82 of 108
    anonymouse Posts: 6,857 member

    M68000 said:
    Why are people so up in arms about this? If you have nothing to hide, why are you fighting Apple’s attempt to make the world safer from criminals and sick people? This “ceo” made the amazingly stupid comment about “presumption of guilt”? Well, at least in the United States a person is innocent until proven guilty.

    I’m frankly amazed by the feedback in this forum over the last few days: people who trust and love Apple now don’t want to help Apple try to make the world safer.

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    If you want total privacy, get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
    They're upset about it because, like you, they've mistakenly concluded, or been told, that this reduces privacy, when it does not. You're defending it because you think the loss of privacy is justified for this purpose, but the real reason to defend it is that it will interfere with the distribution of child pornography without threatening privacy.
    edited August 2021
  • Reply 83 of 108
    crowley Posts: 10,453 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    edited August 2021
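
For what it's worth, here is a small sketch of the flow exactly as macplusplus describes it above: the photo is accepted either way, and the check only decides whether a "safety voucher" rides along with it. The types and the prepareUpload function are hypothetical, and in Apple's published design a voucher actually accompanies every upload with the match result hidden inside it cryptographically, so treat this as an illustration of the argument rather than of the real protocol.

```swift
import Foundation

// Hypothetical types for illustration only; these are not Apple APIs.
struct SafetyVoucher {
    let photoID: UUID
    let encryptedMatchPayload: Data  // in the real design, readable by the server only past the threshold
}

struct UploadRequest {
    let photoID: UUID
    let photoData: Data
    let voucher: SafetyVoucher?  // present here only when the on-device check found a match
}

// The point under debate: the photo is uploaded either way.
// The check merely decides whether a voucher is attached to it.
func prepareUpload(photoID: UUID, photoData: Data, matchesKnownCSAM: Bool) -> UploadRequest {
    let voucher: SafetyVoucher? = matchesKnownCSAM
        ? SafetyVoucher(photoID: photoID, encryptedMatchPayload: Data())  // placeholder payload
        : nil
    return UploadRequest(photoID: photoID, photoData: photoData, voucher: voucher)
}
```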
  • Reply 84 of 108
    macplusplus Posts: 2,112 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    OK, the refusing-on-the-server versus refusing-on-the-client distinction is what caused the confusion. The check, as proposed, seems to be for a good cause, but it is new; it signals the introduction of a new paradigm. That new paradigm is what causes people's concerns and questions about where it can go. No such labeling of user content at the request of a third party has occurred before. Today pictures, tomorrow what? Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you've long forgotten exists somewhere on your iPhone?
  • Reply 85 of 108
    crowley Posts: 10,453 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    OK, the refusing-on-the-server versus refusing-on-the-client distinction is what caused the confusion. The check, as proposed, seems to be for a good cause, but it is new; it signals the introduction of a new paradigm. That new paradigm is what causes people's concerns and questions about where it can go. No such labeling of user content at the request of a third party has occurred before. Today pictures, tomorrow what? Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you've long forgotten exists somewhere on your iPhone?
    Deal with tomorrow's problems tomorrow.
  • Reply 86 of 108
    anonymouse Posts: 6,857 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    OK, the refusing-on-the-server versus refusing-on-the-client distinction is what caused the confusion. The check, as proposed, seems to be for a good cause, but it is new; it signals the introduction of a new paradigm. That new paradigm is what causes people's concerns and questions about where it can go. No such labeling of user content at the request of a third party has occurred before. Today pictures, tomorrow what? Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you've long forgotten exists somewhere on your iPhone?
    Go to Daring Fireball, read about it, read Apple's own technical description of how it works. The scenario you present as a concern will not happen. The problem is that a bunch of people who either don't understand it, didn't take the time to understand it, or, like the Epic CEO, want to deliberately misrepresent it for personal gain, have shaped your idea of how it works, and they're all wrong. Take the time to actually learn about it and understand it before you condemn it.
  • Reply 87 of 108
    n2itivguy Posts: 103 member
    @anonymouse is right on about this — too many folks here haven’t bothered to read & understand the technical description of how it works. 
  • Reply 88 of 108
    Rayz2016 said:
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/


    Google found the child porn by scanning the file on their servers. This is nothing new; Microsoft, Apple and Google have been doing this for quite some time.

    The problem is that this will not find porn if the files are encrypted before they are sent to the server.

    So Apple will get around this by installing a spy program that will scan photos, hash them, compare them against the database of child porn image hashes Apple has downloaded to your phone, and report on them before they are sent to iCloud.

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry. 

    Apple has basically told the world that they're happy to build a surveillance network for any country that asks them to.

    That's the problem.
    I’m not positive here but I think you are conflating two similar but different initiatives. 

    The first is the “scanning” of images uploaded to iCloud Photos for CSAM. If a threshold is met, an Apple employee will verify whether the algorithm correctly identified CSAM. If so, it is reported to the authorities. None of this happens on an iPhone; it is all done on Apple’s servers.

    The second is the scanning of images in Messages, which happens entirely on-device and doesn’t send any report to Apple, only a notification to the parent account that manages the child’s device. Also, this feature is opt-in.

    So again I ask, how is this different from what has already been happening? The iCloud Photos scanning is, as far as I can tell, no different from the Gmail scanning.

    Also, if a government wants to inject its own images into a hashed library to get positive hits on CSAM, the positive hits will still be reviewed. When there is no CSAM involved, it will end there. 

    The first is the labeling of user content at the request of an external entity. If you open the gate to such attempts, you can't control where it will end. You can't control which entities may request the labeling of your content, or according to what criteria. This is not much different from installing a camera in your house to monitor whether you abuse your children, one that records only when a child screams!
    There isn’t any labeling going on and there are no requests from external entities, from what I have read. Have you seen something I haven’t?
  • Reply 89 of 108
    macplusplus Posts: 2,112 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    OK, the refusing-on-the-server versus refusing-on-the-client distinction is what caused the confusion. The check, as proposed, seems to be for a good cause, but it is new; it signals the introduction of a new paradigm. That new paradigm is what causes people's concerns and questions about where it can go. No such labeling of user content at the request of a third party has occurred before. Today pictures, tomorrow what? Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you've long forgotten exists somewhere on your iPhone?
    Go to Daring Fireball, read about it, read Apple's own technical description of how it works. The scenario you present as a concern will not happen. The problem is that a bunch of people who either don't understand it, didn't take the time to understand it, or, like the Epic CEO, want to deliberately misrepresent it for personal gain, have shaped your idea of how it works, and they're all wrong. Take the time to actually learn about it and understand it before you condemn it.
    Gruber admits that this is new:
    "The difference going forward is that Apple will be matching fingerprints against NCMEC’s database client-side, not server-side"

    He then continues:

    "
    This slippery-slope argument is a legitimate concern." and 

    "
    Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.

    We shall see. The stakes are incredibly high, and Apple knows it. Whatever you think of Apple’s decision to implement these features, they’re not doing so lightly."

    That doesn't support your argument much.
    edited August 2021
  • Reply 90 of 108
    jungmark Posts: 6,926 member
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their  vault boxes without the customer’s knowledge to monitor for illegal images.
    Banks do report transfers over $10000 to the Feds. 
  • Reply 91 of 108
    wow who knew tim sweeney was so concerned about 'spyware' now
  • Reply 92 of 108
    jungmark said:
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their  vault boxes without the customer’s knowledge to monitor for illegal images.
    Banks do report transfers over $10000 to the Feds. 
    Well, they were referring to safety deposit boxes in particular. But I used to work for the FDIC in the early 90s, and I know that in the US the federal government can basically get access to all your bank records without a court order. That is because federally chartered banks have to be members of the Federal Reserve and the FDIC, and they have to obtain that charter from the Office of the Comptroller of the Currency. So the Federal Reserve, the FDIC and the OCC all have direct access to bank records, since they regularly examine banks as part of “safety and soundness” exams. If an examiner from one of those agencies sees anything suspicious when reviewing an account, they will automatically make a criminal referral to the FBI. For state banks it was mostly the same, since all states at that time (except Louisiana) required their chartered banks to be members of the FDIC. State banks that were members of the Federal Reserve were also subject to exams from the Fed, but not all state banks were members. The OCC wouldn’t apply to state banks; instead, the state would send in its own team of examiners.
  • Reply 93 of 108
    jungmark Posts: 6,926 member
    jungmark said:
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their  vault boxes without the customer’s knowledge to monitor for illegal images.
    Banks do report transfers over $10000 to the Feds. 
    Well, they were referring to safety deposit boxes in particular. But I used to work for the FDIC in the early 90s, and I know that in the US the federal government can basically get access to all your bank records without a court order. That is because federally chartered banks have to be members of the Federal Reserve and the FDIC, and they have to obtain that charter from the Office of the Comptroller of the Currency. So the Federal Reserve, the FDIC and the OCC all have direct access to bank records, since they regularly examine banks as part of “safety and soundness” exams. If an examiner from one of those agencies sees anything suspicious when reviewing an account, they will automatically make a criminal referral to the FBI. For state banks it was mostly the same, since all states at that time (except Louisiana) required their chartered banks to be members of the FDIC. State banks that were members of the Federal Reserve were also subject to exams from the Fed, but not all state banks were members. The OCC wouldn’t apply to state banks; instead, the state would send in its own team of examiners.
    So this process is like that. No images are reviewed until a threshold is reached. Then someone will look at the images that are flagged. If it warrants further investigation, the police/FBI will get involved. 
  • Reply 94 of 108
    anonymouse Posts: 6,857 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    OK, the refusing-on-the-server versus refusing-on-the-client distinction is what caused the confusion. The check, as proposed, seems to be for a good cause, but it is new; it signals the introduction of a new paradigm. That new paradigm is what causes people's concerns and questions about where it can go. No such labeling of user content at the request of a third party has occurred before. Today pictures, tomorrow what? Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you've long forgotten exists somewhere on your iPhone?
    Go to Daring Fireball, read about it, read Apple's own technical description of how it works. The scenario you present as a concern will not happen. The problem is that a bunch of people who either don't understand it, didn't take the time to understand it, or, like the Epic CEO, want to deliberately misrepresent it for personal gain, have shaped your idea of how it works, and they're all wrong. Take the time to actually learn about it and understand it before you condemn it.
    Gruber admits that this is new:
    "The difference going forward is that Apple will be matching fingerprints against NCMEC’s database client-side, not server-side"

    He then continues:

    "This slippery-slope argument is a legitimate concern." and 

    "Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.

    We shall see. The stakes are incredibly high, and Apple knows it. Whatever you think of Apple’s decision to implement these features, they’re not doing so lightly."

    That doesn't support your argument much.
    His commentary not supporting what I said, and the technical description not matching what I said are two different things. 

    Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you've long forgotten exists somewhere on your iPhone?
    This would absolutely not happen, even if someone sent you an image tagged as child pornography, even if it were extended to music. (And there is zero reason to expect it to be.)

    His commentary also says the EFF thinks it's a slippery slope, not that he does; he only thinks it's a legitimate concern, a point I disagree on because, as he says, "it’ll prove disastrous to Apple’s reputation for privacy protection," and I think if there is anything Apple is trying to avoid here, it's just that.

    But the way this works is not Apple "scanning" your images, as has been reported. The images are simply hashed, the hash is compared against a database of hashes, a single hit doesn't get you reported, and even multiple hits don't get you automatically reported (a rough sketch of this flow follows at the end of this comment). Since this only comes into play if you store images on Apple's servers, I think it's entirely justified for Apple to do this to keep that material off those servers, and that it's something they have a moral obligation to do, to not be a conduit for child pornography. And no one's privacy is going to be harmed in the process, well, except someone keeping an extensive collection of child pornography on Apple's servers. But, to repeat, Apple is not scanning, or looking at, your pictures in any way that compromises your privacy one bit.

    And, I'll also say that the Epic dude knows this very well, but doesn't care and is just using this as yet another opportunity to try to make Apple look bad for his own personal gain. And, it's not like he gives a sh** about your privacy.
    edited August 2021
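
Roughly, the server-side gate described in the comment above could be sketched like this. Everything here is assumed for illustration: the names, the structure, and especially the threshold value, which Apple had not published.

```swift
import Foundation

// Assumed figure for the sketch only; the real threshold is not stated in this thread.
let reviewThreshold = 30

struct AccountState {
    var voucherCount = 0
    var queuedForHumanReview = false
}

// Individual matches do nothing on their own; only an accumulated count past
// the threshold queues the account for human review.
func recordVoucher(for account: inout AccountState) {
    account.voucherCount += 1
    if account.voucherCount >= reviewThreshold {
        account.queuedForHumanReview = true
    }
}

// Nothing is reported unless a human reviewer confirms the flagged images are CSAM.
func outcomeOfReview(confirmedCSAM: Bool) -> String {
    return confirmedCSAM ? "report to the authorities" : "no action; treated as a false positive"
}
```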
  • Reply 95 of 108
    coolfactor Posts: 2,239 member
    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!

    Do photos sent via these messaging apps automatically get added to iCloud Photos simply by being sent and not viewed with one's eyes?

    We don't know Apple's thresholds. It could be 1,000 vouchers. It could be 10,000 vouchers. If someone has manually viewed 1,000+ photos by choice, and each photo they view is uploaded to iCloud, then let them go to jail, as they are willfully participating in the exchange of such material.

    The technology that Apple has designed and implemented is not the problem. It's strongly designed with privacy in mind. The main concern is that Apple could be forced to let that same technology be used for other purposes, which may border on an invasion of privacy. It's the potential for abuse that is getting everyone up in arms.

    I trust that Apple has implemented this system really well. Innocent until *proven* guilty, after all, right?
  • Reply 96 of 108
    longfang Posts: 445 member
    Sweeney can go jump off a bridge. 
  • Reply 97 of 108
    crowley Posts: 10,453 member

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!
    Just for the sake of fun let's imagine that this would ever happen (it wouldn't).  Your decent guy didn't report someone sending him images of child abuse to the police?  Then he's not such a decent guy.  And if he did then there's a record of that and he can use that in the appeal to Apple if it causes him any issues.
  • Reply 98 of 108
    macplusplus Posts: 2,112 member
    crowley said:

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!
    Just for the sake of fun let's imagine that this would ever happen (it wouldn't).  Your decent guy didn't report someone sending him images of child abuse to the police?  Then he's not such a decent guy.  And if he did then there's a record of that and he can use that in the appeal to Apple if it causes him any issues.
    What country do you live in? Not reporting a crime doesn't make one a criminal; you must explicitly prove that he's involved in a crime. Innocent until proven guilty.
    edited August 2021
  • Reply 99 of 108
    macplusplus Posts: 2,112 member
    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!

    Do photos sent via these messaging apps automatically get added to iCloud Photos simply by being sent and not viewed with one's eyes?

    We don't know Apple's thresholds. It could be 1,000 vouchers. It could be 10,000 vouchers. If someone has manually viewed 1,000+ photos by choice, and each photo they view is uploaded to iCloud, then let them go to jail, as they are willfully participating in the exchange of such material.

    The technology that Apple has designed and implemented is not the problem. It's strongly designed with privacy in mind. The main concern is that Apple could be forced to let that same technology be used for other purposes, which may border on an invasion of privacy. It's the potential for abuse that is getting everyone up in arms.

    I trust that Apple has implemented this system really well. Innocent until *proven* guilty, after all, right?
    Whether photos are added automatically or only after viewing doesn't matter. Most people use their phones by trial and error, without a coherent view of the operating system; they don't know what resides where. Besides, it is not always easy to qualify an image as child porn: if the subject is below the age of consent it is child porn, but the age of consent varies between jurisdictions. You defend Apple with the argument "innocent until proven guilty" but you explicitly refuse to apply it to the decent individual in our fictitious case.
  • Reply 100 of 108
    crowley Posts: 10,453 member
    crowley said:

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!
    Just for the sake of fun let's imagine that this would ever happen (it wouldn't).  Your decent guy didn't report someone sending him images of child abuse to the police?  Then he's not such a decent guy.  And if he did then there's a record of that and he can use that in the appeal to Apple if it causes him any issues.
    What country do you live in? Not reporting a crime doesn't make one a criminal; you must explicitly prove that he's involved in a crime. Innocent until proven guilty.
    I didn't say he was a criminal; I said he wasn't a decent guy. I live in a country where, if you receive an indecent image or in any other way suspect that child abuse is happening, you should report it to the authorities. It's not criminal to not do so, but if you're later found to have such images in your possession, it won't do your defence any favours that you failed to be forthcoming and act to prevent child abuse.  

    Also, I live in a country where possession of indecent images of child abuse is a crime.