Epic Games CEO slams Apple 'government spyware'

Comments

  • Reply 61 of 109
    fordee said:
    It's funny that everyone is calling Apple out on this but doesn't mention that this is already happening on other major cloud platforms. Why not call Amazon out as well? Also, Tim hasn't mentioned that Epic's part-owner, Tencent, already surveils Chinese citizens. Having said that, I can appreciate the slippery slope concerns.
    Maybe because Amazon and Google don't shout "how great we are regarding privacy."

    iCloud and other cloud content is already being scanned, so why implement something shady like this? It tarnishes their reputation for protecting their customers' data.
  • Reply 62 of 109
    crowley Posts: 9,124 member
    srice said:
    Rayz2016 said:

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries this won't be child porn; it'll be images of wanted dissidents, government protestors, or subversive poetry. 

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.
    This!
    Except Apple reviews the images before raising any alarm. If it's not child abuse, no alarm.
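    A rough way to picture the concern raised above: the matching list is just a set of opaque numbers, so whoever supplies the list, not the code doing the matching, decides what gets flagged. A minimal sketch in Python, purely illustrative (Apple's actual system uses a perceptual NeuralHash and blinded tables, not an ordinary cryptographic hash and a plain set like this):

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 is used only to keep
    # this example self-contained and runnable.
    return hashlib.sha256(image_bytes).hexdigest()

# Whoever maintains this set controls what is matched. To the code below,
# a hash of abuse imagery and a hash of, say, a protest photo are
# indistinguishable: both are just opaque hex strings.
blocklist = {
    image_hash(b"hypothetical CSAM image bytes"),
    image_hash(b"hypothetical injected dissident photo bytes"),
}

def is_flagged(image_bytes: bytes) -> bool:
    return image_hash(image_bytes) in blocklist
```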
  • Reply 63 of 109
    bluefire1 Posts: 1,189 member
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their  vault boxes without the customer’s knowledge to monitor for illegal images.
  • Reply 64 of 109
    Can’t believe I am agreeing with this Sweeney guy. For a company whose brand is privacy, this is a weird move. Once something like this is done - regardless of the cause - the genie cannot be put back in the bottle. 
  • Reply 65 of 109
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their  vault boxes without the customer’s knowledge to monitor for illegal images.
    You can look it up, but the only thing protecting your safe deposit box is the terms that come from the bank itself. There aren't any federal or state laws that protect the contents. I did a copy/paste of Apple's iCloud terms from a couple of years ago in this thread and they explicitly said that they had the right to screen the contents.
    edited August 8
  • Reply 66 of 109
    Gaby Posts: 180 member
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide 

    Is this post for real? An actual serious point? Wow...

    Fyi to the rest of the world. Privacy is one of the fundamental tenants of any level of freedom. When the powerful get to know everything you do/think/say, they will always (and I do mean always) make sure you abide by what they dictate. The very fact that the powerful can't see and know means their power, at least to some degree, is in check. 

    One of the most pernicious arguments the powerful use is "as long as you have nothing to hide" or "as long as you have done nothing wrong". Everyone has something to hide, everyone has done something "wrong" (though it may not be illegal), everyone has some level of thoughts and words that go against what others believe is acceptable. And everyone has done something actually illegal (there are many thousands of laws and regulations on the books; it would be impossible to live a life without running afoul of something in the massive volumes known as law and regulation). 
    The powerful use a very interesting tactic of equivocation. They try to lump ordinary people in with grossly illegal behaviors like kiddie porn peddlers -- i.e. "well, those despicable people have something to hide; now why do you have something to hide?"
    See how that works?
    I do believe you meant tenets, not tenants…
  • Reply 67 of 109
    wood1208 Posts: 2,582 member
    So indirectly Epic supports child abuse!
  • Reply 68 of 109
    Beats Posts: 2,642 member
    “2024 won’t be like 1984.”
  • Reply 69 of 109
    Beats Posts: 2,642 member
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide 

    Is this post for real? An actual serious point? Wow...

    Fyi to the rest of the world. Privacy is one of the fundamental tenants of any level of freedom. When the powerful get to know everything you do/think/say, they will always (and I do mean always) make sure you abide by what they dictate. The very fact that the powerful can't see and know means their power, at least to some degree, is in check. 

    One of the most pernicious arguments the powerful use is "as long as you have nothing to hide" or "as long as you have done nothing wrong". Everyone has something to hide, everyone has done something "wrong" (though it may not be illegal), everyone has some level of thoughts and words that go against what others believe is acceptable. And everyone has done something actually illegal (there are many thousands of laws and regulations on the books; it would be impossible to live a life without running afoul of something in the massive volumes known as law and regulation). 
    The powerful use a very interesting tactic of equivocation. They try to lump ordinary people in with grossly illegal behaviors like kiddie porn peddlers -- i.e. "well, those despicable people have something to hide; now why do you have something to hide?"
    See how that works?

    I mean, North Koreans have nothing to hide. 🤷 
  • Reply 70 of 109
    omasou Posts: 264 member
    omasou said:
    Sweeney, no one cares about what you have to say or your opinion.
    I care. I don't like his Epic Games App Store case, but he's RIGHT ON THE MONEY on this one. "Authorities" will lie to gain access to your private data. This technology will expand to other areas. This is a terrible move by Apple. Can you imagine if you were one of the poor fools who was a false positive? Then law enforcement digs through your stuff with a warrant and finds out you cheated on your taxes. This is a mess. Cloud storage can no longer be trusted as a source for privacy.
    Is anyone taking the time to read how this works?!

    There is no access to anyone’s data. It’s comparing two numbers.

    I imagine all of the data-privacy Chicken Little people here have Facebook and use the platform to speak their minds. Think about that; it's laughable.

    If you have something to hide, they still manufacture hard drives and NAS boxes, and your iCloud data is still encrypted and Apple cannot decrypt it.
    edited August 8
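    For anyone wondering what "comparing two numbers" means in practice, here is a toy sketch, assuming a simple exact-match digest lookup (the real system uses a perceptual NeuralHash so that resized or recompressed copies still match; the values and names below are made up):

```python
import hashlib

# Placeholder digests; in the real system the list is supplied by
# child-safety organizations and is not human-readable on the device.
KNOWN_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def digest(photo_bytes: bytes) -> str:
    # Computed on the device; only this number takes part in the comparison.
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_image(photo_bytes: bytes) -> bool:
    # The check is a number-to-number lookup; no pixels are inspected here.
    return digest(photo_bytes) in KNOWN_DIGESTS
```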
  • Reply 71 of 109
    killroy Posts: 167 member
    crowley said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of "oh, your data is only yours. Well, unless we find a reason for it not to […]"
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.

    Just for the sake of argument, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he's keeping his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!

    If he got a Safety voucher there's no problem. No Safety voucher will be given if it shows up in a hash.
  • Reply 72 of 109
    jungmark Posts: 6,883 member
    I guess Epic's games would be a good place to hide child porn. 
  • Reply 73 of 109
    Mike Wuerthele Posts: 6,321 administrator
    killroy said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of "oh, your data is only yours. Well, unless we find a reason for it not to […]"
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.

    Just for the sake of argument, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps; he thinks he's keeping his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!

    If he got a Safety voucher there's no problem. No Safety voucher will be given if it shows up in a hash.
    And images sent to somebody with Messages are not automatically saved to the iCloud library.
  • Reply 74 of 109
    omasou Posts: 264 member
    I hope Apple adds a hash/compare step to all their apps and offers the API in the iOS and macOS SDKs so that developers can run images through it before uploading to their apps/platforms: Facebook, Snapchat, Dropbox, etc.

    This would hopefully put a dent in the ability of those using privacy tools to hide in plain sight.
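    For what it's worth, a sketch of what such a pre-upload check might look like. No such API exists in Apple's SDKs today; every name below is hypothetical, and the hash function stands in for a perceptual hash:

```python
import hashlib

class KnownImageMatch(Exception):
    """Raised when an image matches an entry in the supplied hash list."""

def screen_before_upload(image_bytes: bytes, known_hashes: set) -> bytes:
    # Hypothetical helper an app could call before handing the image to
    # its own upload code (Dropbox, Snapchat, etc. in the suggestion above).
    if hashlib.sha256(image_bytes).hexdigest() in known_hashes:
        raise KnownImageMatch("image matches a known hash; refusing upload")
    return image_bytes
```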
  • Reply 75 of 109
    macplusplus Posts: 2,091 member
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of "oh, your data is only yours. Well, unless we find a reason for it not to […]"
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple," so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
  • Reply 76 of 109
    Rayz2016 Posts: 6,957 member
    crowley said:
    srice said:
    Rayz2016 said:

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries this won't be child porn; it'll be images of wanted dissidents, government protestors, or subversive poetry. 

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.
    This!
    Except Apple reviews the images before raising any alarm. If it's not child abuse, no alarm.
    Except, of course, in countries such as China, where Apple won’t be the only ones reviewing the files. 
  • Reply 77 of 109
    crowley Posts: 9,124 member
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of "oh, your data is only yours. Well, unless we find a reason for it not to […]"
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple," so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.  
  • Reply 78 of 109
    crowley Posts: 9,124 member
    bluefire1 said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    By that logic, banks should be able to check what’s in their  vault boxes without the customer’s knowledge to monitor for illegal images.
    Sure.
  • Reply 79 of 109
    macplusplus Posts: 2,091 member
    crowley said:
    crowley said:
    crowley said:
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and they should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of "oh, your data is only yours. Well, unless we find a reason for it not to […]"
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple," so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.  
    No, the check is only for producing the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless.
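    A sketch of the flow being described here: the upload is always accepted, a "safety voucher" is attached only on a match, and nothing is escalated until the voucher count crosses a threshold, at which point human review happens before any report. The threshold value and all names are illustrative placeholders, not Apple's actual implementation:

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 30  # placeholder value, purely illustrative

@dataclass
class Account:
    voucher_count: int = 0  # matched uploads so far

def handle_upload(account: Account, photo_hash: str, known_hashes: set) -> str:
    # The photo is stored regardless of the outcome of the check.
    if photo_hash in known_hashes:
        account.voucher_count += 1  # a safety voucher is attached to this upload
    if account.voucher_count >= REVIEW_THRESHOLD:
        return "stored; queued for human review"  # only after review would a report be made
    return "stored"
```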
  • Reply 80 of 109
    macplusplus Posts: 2,091 member
    omasou said:
    I hope Apple adds a hash/compare to all their apps and offers the API in the iOS and MacOS SDK so that developers can to run images through it before uploading to their apps/platforms, Facebook, SnapChat, Dropbox, etc.

    This would hopefully put a dent into those using privacy tools to hide in plain sight.
    That would be the correct implementation, other concerns aside. If they manage to reach a unanimous understanding in both the developer and user communities, why not? In its current form it is badly implemented: it doesn't protect the user from malicious attacks but has the potential to make them a criminal, it doesn't bring any competitive edge to Apple, and it just scares parents: "No, no iPhone. I heard that Apple's cloud is full of paedophiles and Apple struggles to kick them out!"