Epic Games CEO slams Apple over 'government spyware'

Posted in General Discussion
Tim Sweeney, the chief of Epic Games, has attacked Apple over its iCloud Photos and Messages child safety initiatives, suggesting they could serve as a way for governments to conduct surveillance.




On Thursday, Apple launched a suite of tools to help protect children online and to reduce the spread of child sexual abuse material (CSAM). As part of the initiative, Apple would introduce features to iMessage, Siri, and Search, as well as a mechanism for scanning iCloud Photos for known CSAM imagery.

As part of the outpouring of criticism against Apple, Epic CEO Tim Sweeney took to Twitter once again to complain about Apple's initiative. Following an earlier stint on the microblogging service on Friday, Sweeney's Saturday proclamations framed the tools as Apple potentially enabling the future surveillance of user data for governments.

"I've tried hard to see this from Apple's point of view," starts Sweeney's thread, "but inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government."

"This is entirely different from a content moderation system on a public forum or social medium," the CEO continued. "Before the operator chooses to host the data publicly, they can scan it for whatever they don't want to host. But this is peoples' private data."

Sweeney's accusations about personal data scanning are somewhat at odds with how Apple's system actually works. Rather than looking at the images themselves, the scanning system compares mathematical hashes of files stored in iCloud.

I've tried hard to see this from Apple's point of view. But inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to government. https://t.co/OrkfOSjvS1

-- Tim Sweeney (@TimSweeneyEpic)


The hashes generated from the files are checked against known CSAM image hashes, and the National Center for Missing & Exploited Children (NCMEC) is informed of the flagged accounts.

Furthermore, the scanning applies only to iCloud Photos; images stored solely on the device with iCloud Photos turned off cannot be examined in this way.
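In simplified terms, the matching step works like this (a hypothetical Python sketch; Apple's actual system uses its NeuralHash perceptual-hashing algorithm and private set intersection, not the plain cryptographic hash shown here):

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash: a real system uses NeuralHash,
    # which maps visually similar images to the same value, whereas
    # SHA-256 only matches byte-identical files.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known CSAM imagery (placeholder entry)
known_hashes = {image_hash(b"known-flagged-image-bytes")}

def matches_known_image(data: bytes) -> bool:
    # The check compares hashes against the database without ever
    # inspecting or "looking at" the image content itself.
    return image_hash(data) in known_hashes
```

Because only hashes are compared, an ordinary photo that is not in the database produces no match, no matter what it depicts.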

Sweeney goes on to claim Apple uses "dark patterns" to turn on iCloud uploads by default, forcing people to "accumulate unwanted data," and alludes to how iCloud.com email accounts cannot be deleted without a user losing everything they purchased in the Apple ecosystem.

Alluding to the Epic-Apple antitrust trial and testimony that Apple has to comply with all applicable laws wherever it does business, Sweeney mentions "presumably Apple will now be an arm of state surveillance wherever it's required," before referencing Apple's dealings with the Chinese government.

Sweeney concludes his tweet thread by writing "Liberty is built on due process and limited government. The existential threat here is an unholy alliance between government and the monopolies who control online discourse and everyone's devices, using the guise of private corporations to circumvent constitutional protections."

While the Epic CEO is known for attacking Apple on Twitter throughout and after the major App Store lawsuit, his comments on state surveillance are notable given that Chinese tech giant Tencent owns a 40% stake in the game developer and publisher.

Tencent has previously allegedly complied with the Chinese government to perform surveillance on non-Chinese users of its WeChat messaging app. In 2020, it was believed that the surveillance was used to better censor content for user accounts registered in China.

WeChat is also believed to be a tool used to monitor dissidents, including censoring speech and punishing political opponents who speak ill of the government.


Comments

  • Reply 1 of 108
iadlib Posts: 95 member
    He’s not wrong. The perception of this easily skews this way. It’s a slippery slope. 
  • Reply 2 of 108
M68000 Posts: 726 member
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
edited August 2021
  • Reply 3 of 108
    Can't stand Sweeney. He's revolting, but right on this one.

    I will get rid of cloud storage.
  • Reply 4 of 108
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
  • Reply 5 of 108
crowley Posts: 10,453 member
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
  • Reply 6 of 108
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
    Your employer has the right to monitor your activities at work - but certainly not at home.

    My general concern is if things were to go wrong, and things can and do go wrong.  We're Apple users, we expect things to go wrong!  Due to a bug or hash mismatch (okay - the odds of it triggering a false positive are very low), it could be possible for completely innocent images to be flagged up incorrectly.  Apple hasn't exactly the most marvellous reputation for dealing with sensitive and urgent problems when accounts are closed for something the account isn't responsible for.

    But, as many other people have said, it doesn't have to stop there.  The same tech can be used (or expanded) to check for other content that, say, governments can enforce on Apple to weed out and notify them of any infraction.  This has the capability (mind you, most things do) for abuse.

    HOWEVER..

    Adobe already do this with their cloud services.  This is outlined here:

    https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html

    So those using Lightroom Desktop/CC or Lightroom Classic which syncs photos to the Adobe Creative Cloud are already having their photos scanned with CSAM technology when it's uploaded to their servers.  I've not seen any articles that mention this, or any feedback that Adobe has to say on it.

    I can definitely see why Apple wants to implement CSAM on the iPhone (and perhaps give Apple a chance to say to law enforcement - hey, you don't need to crack the phone - we do the job for you!) - and it'd be one of the few companies that aren't already doing so (Google already do it through many of their products and services already - https://transparencyreport.google.com/child-sexual-abuse-material/reporting?hl=en), but it does somewhat go against their privacy matters mantra.
edited August 2021
  • Reply 7 of 108
9secondkox2 Posts: 2,710 member
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
  • Reply 8 of 108
    What a sad pathetic man.
  • Reply 9 of 108
JFC_PA Posts: 932 member
    Scanning their own servers. Nothing more. No one using an iPhone is required to store their photos in iCloud. A nice, cheap, external drive on your desktop and they’re backed up (with my smaller previous laptop I used two external drives, one the primary Photos library, the other the backup for the internal and other external drives). I happen to use both but that’s my choice, others are free to never go near iCloud for photo storage. 

    I find iCloud convenient but that’s because with 1.3 tb of images even my 1 TB iPad Pro can’t hold them all and I like having everything. But had I bought a 2 TB? Maybe a different story. 

edited August 2021
  • Reply 10 of 108
crowley Posts: 10,453 member
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    Your data is still being stored and your privacy is still very important.  Apple just do a quick once over to make sure you aren't a fan of child abuse first.  Because there are a lot of child abusers out there, and catching some of them is worth a quick hash that doesn't harm anyone else.
  • Reply 11 of 108
macplusplus Posts: 2,112 member
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
edited August 2021
  • Reply 12 of 108
DAalseth Posts: 2,783 member
    And it starts. First the EFF. Then competitors like EPIC, soon the very lawmakers who were complaining about Apple not helping the FBI will be out for blood. This was a monumentally stupid move on Apple’s part.

    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people? 
    So you’d be happy having the FBI, IRS, and any other government agency looking through your phone? Computer? Home? All of your business records? 

    That is the same excuse all police forces, from the FBI under the last administration to the KGB, to the PRC use. Here’s something to chew on. If the right to privacy is gone then you have no civil rights. Every article in the Bill of Rights is grounded in a solid right to privacy. From self incrimination, to illegal search, to the right to assemble and all the rest. None of it means a GD thing without a rock solid right to privacy. This back door for the police, puts all of your rights and freedoms in peril. Today it’s porn they are hunting for, tomorrow it will be abortion providers, the next day info on how you vote, and the day after, who owns a gun, and then who supports and opposes the administration in power. 
  • Reply 13 of 108
crowley Posts: 10,453 member
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
  • Reply 14 of 108
omasou Posts: 572 member
    Sweeney no one cares about what you have to say or your opinion.
edited August 2021
  • Reply 15 of 108
macplusplus Posts: 2,112 member
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apples entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. 
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No they don't refuse anything. If their intent were to refuse something they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold they report you.

    Just for the sake of fun let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and alike. Our decent guy duly deletes every image he finds in those chat apps, he thinks he keeps his iPhone clean of such things. But what he doesn't know is that everytime such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail even without figuring out what has happened to him !..
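As a rough sketch, the voucher-and-threshold flow described in that comment could look like this (hypothetical Python; the real system attaches cryptographic safety vouchers and uses threshold secret sharing so Apple learns nothing until the threshold is crossed, and the threshold value here is an assumption, not a published figure):

```python
REPORT_THRESHOLD = 30  # hypothetical value; Apple did not initially publish the exact number

def account_flagged(photos, matches_known_image):
    # Every uploaded photo that matches the known-image database adds
    # one safety voucher; the account is reported only when the number
    # of vouchers reaches the threshold, not on the first match.
    vouchers = sum(1 for photo in photos if matches_known_image(photo))
    return vouchers >= REPORT_THRESHOLD
```

Under this scheme, a handful of matches (including unwanted images auto-saved from chat apps) would sit below the threshold, but a steady accumulation would eventually trigger a report.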
edited August 2021
  • Reply 16 of 108
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
    You are one of those people that screams “if you have  nothing to hide”

    This is not about that… don’t you get it? 

    As much as I hate the guy when it comes to the Epic issue. He is not wrong on this one. 

    Of course employers (large or small) monitor computers work phones etc… What does that have to do with anything?

    this is not about total privacy which doesn’t exist btw. 
  • Reply 17 of 108

    I agree one hundred percent with this article. Someone commented that when you work for a company your data can be scanned. I don't use Facebook or Twitter and never post anything online. When I saw this comment I had to create an account and respond. This article has nothing to do with companies monitoring employees' computers. Apple has no business scanning customers' personal files. Should they also scan your text messages and send your call logs to the feds? Would that make the world even safer? The Apple cloud services are for private storage and to share files with friends and family. Should the government open every storage locker and closet and make sure there is nothing illegal in there?

    As an IT Pro with over 20 years experience I've been advising clients about the pros and cons of cloud storage for decades. Cloud storage is supposed to make data management and backup easier for the end user. If you store your personal data on your hard drive it is private and you should expect the same when storing in the cloud. I have never stored any of my data in the cloud and still back up my iPhone to my computer using a USB cable. Cloud storage has been pushed on people over the past decade. Companies offer free storage to try and get you interested and then when it fills up they can start charging you for more space. Cloud storage is not meant to encroach on people's privacy. If someone is arrested for something illegal their devices and accounts can be analyzed at that time.

     When your private data is being stored no one should have access to it. Google is the largest online advertising medium in the world. I always advise against using Google Drive and Microsoft OneDrive. Box and Dropbox are not interested in what data you store on their servers and is only accessible by you and anyone you grant access. But no matter where you store your files online the government can always get a subpoena to gain access to your account. If Apple wants to scan everyone's photos there should be a warning message stating such every time you upload a photo to the cloud.

  • Reply 18 of 108
    omasou said:
    Sweeney no one cares about what you have to say or your opinion.
    I care.  I don't like his Epic Games app store case, but he's RIGHT ON THE MONEY on this one.  "Authorities" will lie to gain access to your private data.  This technology will expand to other areas.  This is a terrible move by Apple.  Can you imagine if you are one of the poor fools who was a false positive?  Then law enforcement digs through your stuff with a warrant and found out you cheated on your taxes.  This is a mess.  Cloud storage can no longer be trusted as a source for privacy.
  • Reply 19 of 108
Kuyangkoh Posts: 838 member
    JFC_PA said:
    Scanning their own servers. Nothing more. No one using an iPhone is required to store their photos in iCloud. A nice, cheap, external drive on your desktop and they’re backed up (with my smaller previous laptop I used two external drives, one the primary Photos library, the other the backup for the internal and other external drives). I happen to use both but that’s my choice, others are free to never go near iCloud for photo storage. 

    I find iCloud convenient but that’s because with 1.3 tb of images even my 1 TB iPad Pro can’t hold them all and I like having everything. But had I bought a 2 TB? Maybe a different story. 

    Very good point having locally saved on your external HD….now im worried that if something happened where i kept my HD like fire or thief then im F____. I lost everything 
  • Reply 20 of 108
    One of the worst angles of this Apple uproar is that main players of Surveillance Capitalism are the first to pile on about data privacy. 
    I think it's good Apple is being questioned about this. Any move that could impinge privacy should be. If this story could make a small turn to include data privacy from corporations?
    But people reading these stories are reading them on Twitter, or on Facebook, or from a web search from Google -- surveillance capitalism's big players. These players don't want you to read that angle to the story.
    See how that works?