Epic Games CEO slams Apple over 'government spyware'
Tim Sweeney, the chief of Epic Games, has attacked Apple over its iCloud Photos and messages child safety initiatives, putting forward the idea of it being a way for governments to conduct surveillance.

On Thursday, Apple launched a suite of tools to help protect children online and to reduce the spread of child sexual abuse material (CSAM). As part of the tools, the initiative would introduce features to iMessage, Siri, and Search, as well as a mechanism for scanning iCloud Photos for known CSAM imagery.
As part of the outpouring of criticism against Apple, Epic CEO Tim Sweeney took to Twitter once again to complain about Apple's initiative. Following an earlier stint on the microblogging service on Friday, Sweeney's Saturday proclamations framed the tools as Apple potentially enabling the future surveillance of user data for governments.
"I've tried hard to see this from Apple's point of view," starts Sweeney's thread, "but inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government."
"This is entirely different from a content moderation system on a public forum or social medium," the CEO continued. "Before the operator chooses to host the data publicly, they can scan it for whatever they don't want to host. But this is people's private data."
Sweeney's accusations about personal data scanning are somewhat at odds with how Apple's system actually works. Rather than looking at the image itself, the scanning system compares mathematical hashes of files stored in iCloud.
The hashes generated from the files are checked against known CSAM image hashes, and the National Center for Missing & Exploited Children (NCMEC) is informed of the flagged accounts.
Furthermore, the scanning only applies to iCloud Photos; images stored only on the device, with iCloud Photos turned off, cannot be examined in this way.
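The matching step described above can be sketched as a simple hash-set lookup. This is a deliberately simplified illustration, not Apple's implementation: the real system uses a perceptual "NeuralHash" and cryptographic private set intersection rather than a plain cryptographic hash or a client-readable database, and every name and value below is hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# In the real system this set is never visible to the client.
KNOWN_HASHES = {
    "a3f5placeholder",  # placeholder value, not a real hash
}

def file_hash(data: bytes) -> str:
    """Hash the raw bytes of an image file (simplified: SHA-256)."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Check a file's hash against the known-hash set."""
    return file_hash(data) in KNOWN_HASHES
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is designed to also match visually similar, re-encoded, or resized copies, which is why Apple built NeuralHash instead of reusing an off-the-shelf digest.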
Sweeney goes on to claim Apple uses "dark patterns" to turn on iCloud uploads by default, forcing people to "accumulate unwanted data," and alludes to how iCloud.com email accounts cannot be deleted without a user losing everything they purchased in the Apple ecosystem.
Alluding to the Epic-Apple antitrust trial and testimony that Apple has to comply with all applicable laws wherever it does business, Sweeney mentions "presumably Apple will now be an arm of state surveillance wherever it's required," before referencing Apple's dealings with the Chinese government.
Sweeney concludes his tweet thread by writing "Liberty is built on due process and limited government. The existential threat here is an unholy alliance between government and the monopolies who control online discourse and everyone's devices, using the guise of private corporations to circumvent constitutional protections."
While the Epic CEO is known for attacking Apple on Twitter throughout and after the major App Store lawsuit, his references to China are notable, as Chinese tech giant Tencent owns a 40% stake in the game developer and publisher.
Tencent has previously allegedly complied with the Chinese government to perform surveillance on non-Chinese users of its WeChat messaging app. In 2020, it was believed that the surveillance was used to better censor content for user accounts registered in China.
WeChat is also believed to be a tool used to monitor dissidents, including censoring speech and punishing political opponents who speak ill of the government.
Read on AppleInsider

Comments
if you want total privacy, get rid of cloud storage and maybe go back to using a typewriter LOL
Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
I will get rid of cloud storage.
My general concern is if things were to go wrong, and things can and do go wrong. We're Apple users, we expect things to go wrong! Due to a bug or hash mismatch (okay, the odds of triggering a false positive are very low), it could be possible for completely innocent images to be flagged incorrectly. Apple doesn't exactly have the most marvellous reputation for dealing with sensitive and urgent problems when accounts are closed for something the account holder isn't responsible for.
But, as many other people have said, it doesn't have to stop there. The same tech can be used (or expanded) to check for other content that, say, governments can enforce on Apple to weed out and notify them of any infraction. This has the capability (mind you, most things do) for abuse.
HOWEVER..
Adobe already do this with their cloud services. This is outlined here:
https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html
So those using Lightroom Desktop/CC or Lightroom Classic, which sync photos to the Adobe Creative Cloud, are already having their photos scanned with CSAM-detection technology when they're uploaded to Adobe's servers. I've not seen any articles that mention this, or any feedback from Adobe on it.
I can definitely see why Apple wants to implement CSAM detection on the iPhone (and perhaps it gives Apple a chance to say to law enforcement: hey, you don't need to crack the phone, we do the job for you!), and it'd be one of the few companies that aren't already doing so (Google already do it through many of their products and services: https://transparencyreport.google.com/child-sexual-abuse-material/reporting?hl=en), but it does somewhat go against their privacy-matters mantra.
But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
So you’d be happy having the FBI, IRS, and any other government agency looking through your phone? Computer? Home? All of your business records?
Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps, so he thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what has happened to him!
This is not about that… don’t you get it?
this is not about total privacy which doesn’t exist btw.
I agree one hundred percent with this article. Someone commented that when you work for a company, your data can be scanned. I don't use Facebook or Twitter and never post anything online. When I saw this comment I had to create an account and respond. This article has nothing to do with companies monitoring employees' computers. Apple has no business scanning customers' personal files. Should they also scan your text messages and send your call logs to the feds? Would that make the world even safer? The Apple cloud services are for private storage and to share files with friends and family. Should the government open every storage locker and closet and make sure there is nothing illegal in there?
As an IT pro with over 20 years' experience, I've been advising clients about the pros and cons of cloud storage for decades. Cloud storage is supposed to make data management and backup easier for the end user. If you store your personal data on your hard drive it is private, and you should expect the same when storing in the cloud. I have never stored any of my data in the cloud and still back up my iPhone to my computer using a USB cable. Cloud storage has been pushed on people over the past decade. Companies offer free storage to get you interested, and then when it fills up they can start charging you for more space. Cloud storage is not meant to encroach on people's privacy. If someone is arrested for something illegal, their devices and accounts can be analyzed at that time.
When your private data is being stored, no one else should have access to it. Google is the largest online advertising medium in the world. I always advise against using Google Drive and Microsoft OneDrive. Box and Dropbox are not interested in what data you store on their servers, and it is only accessible by you and anyone you grant access. But no matter where you store your files online, the government can always get a subpoena to gain access to your account. If Apple wants to scan everyone's photos, there should be a warning message stating so every time you upload a photo to the cloud.
I think it's good Apple is being questioned about this. Any move that could impinge privacy should be. If this story could make a small turn to include data privacy from corporations?
But people reading these stories are reading them on Twitter, or on Facebook, or from a web search from Google -- surveillance capitalism's big players. These players don't want you to read that angle to the story.
See how that works?