Epic Games CEO slams Apple 'government spyware'


Comments

  • Reply 61 of 108
    bluefire1 Posts: 1,301 member
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their vault boxes without the customer’s knowledge to monitor for illegal images.
    FileMakerFeller
  • Reply 62 of 108
    Can’t believe I am agreeing with this Sweeney guy. For a company whose brand is privacy, this is a weird move. Once something like this is done - regardless of the cause - the genie cannot be put back in the bottle. 
    p-dog
  • Reply 63 of 108
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their vault boxes without the customer’s knowledge to monitor for illegal images.
    You can look it up, but the only thing protecting your safe deposit box is the terms that come from the bank itself. There aren't any federal or state laws that protect the contents. I did a copy/paste of Apple's iCloud terms from a couple of years ago in this thread and they explicitly said that they had the right to screen the contents.
    edited August 2021
  • Reply 64 of 108
    Gaby Posts: 190 member
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide 

    Is this post for real? An actual serious point? Wow...

    FYI to the rest of the world. Privacy is one of the fundamental tenants of any level of freedom. When the powerful get to know everything you do/think/say, they will always (and I do mean always) make sure you abide by what they dictate. The very fact that the powerful can't see and know means their power, at least to some degree, is in check. 

    One of the most pernicious arguments the powerful use is "As long as you have nothing to hide" or "as long as you have done nothing wrong". Everyone has something to hide, everyone has done something "wrong" (though it may not be illegally wrong), everyone has some level of thoughts and words that go against what others believe is acceptable. And everyone has done something actually illegal (there are many thousands of laws and regulations on the books; it would be impossible to live a life without crossing something in the massive volumes known as law and regulation). 
    The powerful use a very interesting tactic of equivocation. They try to lump ordinary people in with grossly illegal behaviors like kiddie porn peddlers -- i.e. well those despicable people have something to hide. Now why do you have something to hide. 
    See how that works?
    I do believe you meant tenets, not tenants…..
    p-dog, FileMakerFeller
  • Reply 65 of 108
    wood1208 Posts: 2,905 member
    So indirectly Epic supports child abuse!
  • Reply 66 of 108
    Beats Posts: 3,073 member
    “2024 won’t be like 1984.”
  • Reply 67 of 108
    Beats Posts: 3,073 member
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide 

    Is this post for real? An actual serious point? Wow...

    FYI to the rest of the world. Privacy is one of the fundamental tenants of any level of freedom. When the powerful get to know everything you do/think/say, they will always (and I do mean always) make sure you abide by what they dictate. The very fact that the powerful can't see and know means their power, at least to some degree, is in check. 

    One of the most pernicious arguments the powerful use is "As long as you have nothing to hide" or "as long as you have done nothing wrong". Everyone has something to hide, everyone has done something "wrong" (though it may not be illegally wrong), everyone has some level of thoughts and words that go against what others believe is acceptable. And everyone has done something actually illegal (there are many thousands of laws and regulations on the books; it would be impossible to live a life without crossing something in the massive volumes known as law and regulation). 
    The powerful use a very interesting tactic of equivocation. They try to lump ordinary people in with grossly illegal behaviors like kiddie porn peddlers -- i.e. well those despicable people have something to hide. Now why do you have something to hide. 
    See how that works?

    I mean, North Koreans have nothing to hide. 🤷 
  • Reply 68 of 108
    omasou Posts: 562 member
    omasou said:
    Sweeney, no one cares about what you have to say or your opinion.
    I care. I don't like his Epic Games app store case, but he's RIGHT ON THE MONEY on this one. "Authorities" will lie to gain access to your private data. This technology will expand to other areas. This is a terrible move by Apple. Can you imagine if you are one of the poor fools who was a false positive? Then law enforcement digs through your stuff with a warrant and finds out you cheated on your taxes. This is a mess. Cloud storage can no longer be trusted as a source for privacy.
    Is anyone taking the time to read how this works?!

    There is no access to anyone’s data. It’s comparing two numbers.

    I imagine all of the data privacy Chicken Little people here have Facebook and use the platform to speak their minds. Think about that; it’s laughable.

    If you have something to hide, they still manufacture HDDs and NAS drives, and your iCloud data is still encrypted and Apple cannot decrypt it.
    edited August 2021
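    To make the "comparing two numbers" point concrete, here is a minimal Swift sketch of a hash-list membership check. The names are invented and SHA-256 is used only for illustration; the announced system matches blinded perceptual hashes (NeuralHash) on-device, not a plain digest of the file bytes.

        import CryptoKit
        import Foundation

        // Minimal sketch only: compare an image's digest against a set of known digests.
        // Placeholder data: in the announced design the hash list ships inside the OS.
        func loadKnownDigests() -> Set<String> {
            return []
        }

        let knownDigests = loadKnownDigests()

        // Returns true if the photo's digest appears in the known list.
        // "Comparing two numbers": a membership test, with no inspection of content.
        func matchesKnownList(_ photoData: Data) -> Bool {
            let digest = SHA256.hash(data: photoData)
            let hex = digest.map { String(format: "%02x", $0) }.joined()
            return knownDigests.contains(hex)
        }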
  • Reply 69 of 108
    killroykillroy Posts: 271member
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first step onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!

    If he got a safety voucher, there's no problem. No safety voucher will be given if it shows up in a hash.
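    For readers following the voucher and threshold argument above, the reporting logic being described can be sketched in a few lines of Swift. Everything here is illustrative: the type names and the threshold value are invented, and the announced design uses cryptographic threshold secret sharing rather than a plain counter like this.

        import Foundation

        // Illustrative sketch of "report only after N matches" logic.
        struct SafetyVoucher {
            let photoID: UUID          // which uploaded photo produced the match
        }

        struct AccountMatchState {
            private(set) var vouchers: [SafetyVoucher] = []
            let reviewThreshold = 30   // arbitrary example value, not Apple's actual number

            mutating func record(_ voucher: SafetyVoucher) {
                vouchers.append(voucher)
            }

            // A single match does nothing on its own; human review is triggered
            // only once the accumulated matches cross the threshold.
            var needsHumanReview: Bool {
                vouchers.count >= reviewThreshold
            }
        }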
  • Reply 70 of 108
    jungmark Posts: 6,926 member
    I guess Epic’s games would be a good place to hide child porn. 
    williamlondon
  • Reply 71 of 108
    Mike Wuerthele Posts: 6,858 administrator
    killroy said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first step onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!

    If he got a safety voucher, there's no problem. No safety voucher will be given if it shows up in a hash.
    And images sent to somebody with Messages are not automatically saved to the iCloud library.
    killroy
  • Reply 72 of 108
    omasou Posts: 562 member
    I hope Apple adds a hash/compare to all their apps and offers the API in the iOS and macOS SDKs so that developers can run images through it before uploading to their apps/platforms: Facebook, Snapchat, Dropbox, etc.

    This would hopefully put a dent in those using privacy tools to hide in plain sight.
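    Taken purely as a thought experiment, the suggested developer-facing API might look roughly like the Swift sketch below. No such framework exists in the iOS or macOS SDK; every type and method name here is invented.

        import Foundation

        // Hypothetical API sketch only; nothing like this ships in the iOS/macOS SDK.
        enum KnownImageCheckResult {
            case noMatch
            case match(voucher: Data)   // opaque evidence blob a platform could require
        }

        protocol KnownImageChecking {
            // A host app (Facebook, Snapchat, Dropbox, ...) would call this before
            // uploading user images to its own servers.
            func check(imageData: Data) async throws -> KnownImageCheckResult
        }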
  • Reply 73 of 108
    macplusplus Posts: 2,112 member
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first step onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple," so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
  • Reply 74 of 108
    Rayz2016 Posts: 6,957 member
    crowley said:
    srice said:
    Rayz2016 said:

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries, this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry. 

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.
    This!
    Except Apple reviews the images before raising any alarm. If it’s not child abuse, no alarm.
    Except, of course, in countries such as China, where Apple won’t be the only ones reviewing the files. 
  • Reply 75 of 108
    crowley Posts: 10,453 member
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first step onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple," so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.  
  • Reply 76 of 108
    crowley Posts: 10,453 member
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their vault boxes without the customer’s knowledge to monitor for illegal images.
    Sure.
  • Reply 77 of 108
    macplusplus Posts: 2,112 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first step onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple," so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.  
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
  • Reply 78 of 108
    macplusplus Posts: 2,112 member
    omasou said:
    I hope Apple adds a hash/compare to all their apps and offers the API in the iOS and macOS SDKs so that developers can run images through it before uploading to their apps/platforms: Facebook, Snapchat, Dropbox, etc.

    This would hopefully put a dent in those using privacy tools to hide in plain sight.
    That would be the correct implementation, other concerns aside. If they manage to reach a unanimous understanding in both the developer and user communities, why not? In its current form, it is badly implemented, doesn't protect the user from malicious attacks but has the potential to make him a criminal, doesn't bring any competitive edge to Apple, and just scares parents: "No, no iPhone. I heard that Apple's cloud is full of paedophiles and Apple struggles to kick them out!"
  • Reply 79 of 108
    Rayz2016 said:
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/


    Google found the child porn by scanning the file on their servers. This is nothing new; Microsoft, Apple and Google have been doing this for quite some time.

    The problem is that this will not find porn if the files are encrypted before they are sent to the server.

    So Apple will get around this by installing a spy program that will scan photos, hash them, compare them with the database of child porn image hashes they have downloaded to your phone, and report on them before they are sent to iCloud.

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries, this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry. 

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.
    I’m not positive here but I think you are conflating two similar but different initiatives. 

    The first is the “scanning” of images uploaded to iCloud Photos for CSAM. If a threshold is met, an Apple employee will verify whether the algorithm correctly identified CSAM. If so, it is reported to authorities. None of this happens on an iPhone; it is all done on Apple’s servers.

    The second is scanning of images in Messages, which happens entirely on-device and doesn’t send any report to Apple, only to the owner (the parent) of the child’s device. Also, this feature is opt-in.

    So again I ask, how is this different than what has already been happening? The iCloud Photos scanning is, as far as I can tell, no different than the GMail scanning.

    Also, if a government wants to inject its own images into a hashed library to get positive hits on CSAM, the positive hits will still be reviewed. When there is no CSAM involved, it will end there. 


    dewme
  • Reply 80 of 108
    anonymouse Posts: 6,857 member
    iadlib said:
    He’s not wrong. The perception of this easily skews this way. It’s a slippery slope. 
    He is wrong. He's part of the bunch trying to skew the perception "this way". (And for blatantly obvious reasons.) And it is not actually a slippery slope.