Epic Games CEO slams Apple 'government spyware'

Comments

  • Reply 21 of 108
    cpsro Posts: 3,198 member
    “Sweeney's accusations about personal data scanning are somewhat at odds with how Apple's system actually works. Rather than actually looking at the image itself, the scanning system compares mathematical hashes of files stored on iCloud.”
    On the contrary, the images are being looked at, in a way that computers (or iPhones) can manage to do quickly. And despite the examinations occurring on one’s own devices, the owner can’t disable it.
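    As a rough sketch of what this kind of fast, machine-driven "looking" amounts to in practice: derive a compact perceptual-style hash from the pixels and compare it against a list of known hashes. The toy average-hash and made-up blocklist below are illustrative stand-ins only, not Apple's NeuralHash or its database.

    # Toy perceptual-hash comparison. NOT Apple's NeuralHash: a simple
    # "average hash" stands in for it, and the blocklist entries are invented.
    from PIL import Image  # pip install Pillow

    def average_hash(path, size=8):
        """Derive a 64-bit hash from image content (tolerant of small edits)."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = "".join("1" if p > avg else "0" for p in pixels)
        return int(bits, 2)

    def hamming(a, b):
        """Number of differing bits between two 64-bit hashes."""
        return bin(a ^ b).count("1")

    KNOWN_HASHES = {0x123456789ABCDEF0}  # hypothetical blocklist entry
    MATCH_DISTANCE = 5                   # hypothetical match tolerance

    def matches_known(path):
        h = average_hash(path)
        return any(hamming(h, k) <= MATCH_DISTANCE for k in KNOWN_HASHES)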
    Next thing you know, Apple won’t be selling iPhones… it will be leasing them 😝

    Is Apple paying AI to spin this matter in a pro-Apple direction?
  • Reply 22 of 108
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide 

    Is this post for real? An actual serious point? Wow...

    Fyi to the rest of the world: privacy is one of the fundamental tenets of any level of freedom. When the powerful get to know everything you do/think/say, they will always (and I do mean always) make sure you abide by what they dictate. The very fact that the powerful can't see and know everything means their power, at least to some degree, is held in check. 

    One of the most pernicious arguments the powerful use is "as long as you have nothing to hide" or "as long as you have done nothing wrong". Everyone has something to hide, everyone has done something "wrong" (though it may not be illegal), everyone has some level of thoughts and words that go against what others believe is acceptable. And everyone has done something actually illegal (there are many thousands of laws and regulations on the books; it would be impossible to live a life without crossing some line in the massive volumes known as law and regulation). 
    The powerful use a very interesting tactic of equivocation. They try to lump ordinary people in with grossly illegal behaviors like peddling child porn: "those despicable people have something to hide, so why do you have something to hide?" 
    See how that works?
  • Reply 23 of 108
    JaiOh81 Posts: 60 member
    Kuyangkoh said:
    JFC_PA said:
    Scanning their own servers. Nothing more. No one using an iPhone is required to store their photos in iCloud. A nice, cheap, external drive on your desktop and they’re backed up (with my smaller previous laptop I used two external drives, one the primary Photos library, the other the backup for the internal and other external drives). I happen to use both but that’s my choice, others are free to never go near iCloud for photo storage. 

    I find iCloud convenient, but that's because with 1.3 TB of images even my 1 TB iPad Pro can't hold them all, and I like having everything. But had I bought a 2 TB model? Maybe a different story. 

    Very good point about saving locally on your external HD… now I'm worried that if something happened where I keep my HD, like a fire or theft, then I'm F____. I'd lose everything. 
    Or what happens when Apple gets rid of the USB port altogether and forces cloud storage on everyone? 
  • Reply 24 of 108
    baconstang Posts: 1,105 member
    TechPro said:
    " But no matter where you store your files online the government can always get a subpoena to gain access to your account."

    If a court subpoenaed the files on your phone, or there was some sort of probable cause to examine your phone, fine; that's how our laws work.
    But blanket scanning of all images on your phone, just because you wish to store them in a cloud system you are paying for, is in conflict with the 4th Amendment.

    Then of course there is mission creep.  'Will this backdoor be abused?' isn't the question; 'when?' is.  And the answer is: immediately.
  • Reply 25 of 108
    I don't see why Tim Sweeney is all bent out of shape about protecting children.
    If you're not into spreading child sexual abuse material, then you have nothing to worry about; but if you are then ...
  • Reply 26 of 108
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/
  • Reply 27 of 108
    williamh Posts: 1,033 member
    As long as Epic doesn’t mind the use of hyperbole or slippery slopes, I think one could make the case that Epic sells software that glorifies violence and encourages or trains mass shooters and other terrorists. 
  • Reply 28 of 108
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
    Let’s say it’s not about child abuse: what if Apple decides to do other things?

    For example, if the Chinese government wants Apple to match Tiananmen, Hong Kong, or Muslim Uighur-related photos and report them to the Chinese government in exactly the same way, what can you do in that situation?

    And if Apple’s algorithm goes wrong and matches a perfectly innocent photo of yours against one in the CSAM database, do you want to suddenly be under investigation? Because although the process uses hashes, if you are under investigation there will be officers actually looking at your photos. And no algorithm will ever be perfect. Didn’t Facebook ever warn you about violence when you posted a cute dog? Photo matching shouldn’t be that clumsy, but if you have ever studied this kind of algorithm, it is based on a confidence threshold, which is not the same thing as 100% accuracy.
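    To put that "confidence threshold" point in concrete terms: perceptual matching declares two images "the same" when their hashes fall within some distance of each other, and wherever that cut-off is placed the false-positive rate is non-zero, it is only tuned to be small. The 64-bit hashes and thresholds below are invented for illustration and are not Apple's parameters.

    # Toy estimate of how loosening a match threshold raises the odds that two
    # completely unrelated images are declared a match. Illustrative only.
    import random

    def hamming(a, b):
        return bin(a ^ b).count("1")

    def false_positive_rate(threshold, trials=100_000):
        """Fraction of random, unrelated 64-bit hash pairs that still 'match'."""
        hits = 0
        for _ in range(trials):
            if hamming(random.getrandbits(64), random.getrandbits(64)) <= threshold:
                hits += 1
        return hits / trials

    for t in (0, 10, 20, 26):
        print(f"threshold={t:2d}  estimated false-positive rate={false_positive_rate(t):.5f}")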

    In summary, this means only one thing: Apple has a 🔑 key that it can, and is willing to, use to look into your private data. Saying "don’t use this or that feature" is about as useful as saying "don’t buy an iPhone" or "go back to using a typewriter".
  • Reply 29 of 108
    Rayz2016 Posts: 6,957 member
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?

    Yeah, the reason folk are up in arms is that they're thinking of others, not themselves.

    Apple's system will lead to the persecution of human rights activists and the LGBTQ+ community in authoritarian countries. 

    Not one person I've come across is against CSAM scanning; indeed, Apple has been scanning your photo library since January this year at the latest, and no one batted an eyelid.

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    The problem is that Apple has now put the scanner on your phone. That means encryption can no longer protect your subversive poetry, your anti-government literature, or the pictures of your gay rights parade float: the material can be scanned before it is ever encrypted, and the results sent to whichever server the government tells Apple to send them to. 

    And Apple knows this, as shown by the response to the question on MacRumours:

    Apple did admit that there is no silver bullet answer as it relates to the potential of the system being abused, but the company said it is committed to using the system solely for known CSAM imagery detection.

    And some of the more extreme anti-Apple brigade reckon that this is really a signal from Apple to authoritarian regimes the world over: come on in; this is the phone you want to 'recommend' to your oppressed population. Me personally? I think this is simply Apple being honest: yes, this system will be open to abuse, and there's not really anything they can do about it. A bad look all round really.

  • Reply 30 of 108
    Rayz2016 Posts: 6,957 member
    mike54 said:
    Not a fan of Sweeney, but he is on point here, I totally agree with him.

    While Tim Cook was conducting his privacy advertising campaign not long ago, he had already signed off on this scanning project. While he was standing up smiling and spouting about privacy he was mostly believed, but now it's clear he has deceived everyone. We can understand now the timing of that advertising campaign. Tim Cook has lost all credibility.

    I'd have to agree; he's shown himself to be something of a hypocrite now. 
  • Reply 31 of 108
    Rayz2016 Posts: 6,957 member
    omasou said:
    Sweeney no one cares about what you have to say or your opinion.
    I care. I don't like his Epic Games app store case, but he's RIGHT ON THE MONEY on this one. "Authorities" will lie to gain access to your private data. This technology will expand to other areas. This is a terrible move by Apple. Can you imagine if you are one of the poor fools who turns out to be a false positive? Then law enforcement digs through your stuff with a warrant and finds out you cheated on your taxes. This is a mess. Cloud storage can no longer be trusted as a source of privacy.
    iCloud has never been trusted as a source of privacy, because the backups are not end-to-end encrypted: Apple holds the keys.

    What Apple has done is extend that lack of privacy to the device you have in your hand.
  • Reply 32 of 108
    Rayz2016 Posts: 6,957 member
    Apple_Bar said:
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
    You are one of those people who screams "if you have nothing to hide".

    This is not about that… don’t you get it? 

    As much as I hate the guy when it comes to the Epic issue, he is not wrong on this one. 

    Of course employers (large or small) monitor computers, work phones, etc. What does that have to do with anything?

    This is not about total privacy, which doesn’t exist, by the way. 

    This is the point Apple is hoping to hide.

    This is not about my photo collection. Look if you want; I'm not a great photographer, so you'll be bored out of your skull.

    This is about protecting the lives of the persecuted in oppressed countries. Apple has already admitted that the system has the potential to be abused, so I'm not really sure why folk are still arguing for this.
  • Reply 33 of 108
    Coming from the man whose Epic Games Store is notorious for its spyware sniffing and snooping around people's computers and logs.  As with everything, Timmy tries a scheme and looks like a joke with an obvious agenda.
  • Reply 34 of 108
    Rayz2016 Posts: 6,957 member
    DAalseth said:
    And it starts. First the EFF. Then competitors like Epic. Soon the very lawmakers who were complaining about Apple not helping the FBI will be out for blood. This was a monumentally stupid move on Apple’s part.

    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people? 
    So you’d be happy having the FBI, IRS, and any other government agency looking through your phone? Computer? Home? All of your business records? 

    That is the same excuse all police forces, from the FBI under the last administration to the KGB, to the PRC use. Here’s something to chew on. If the right to privacy is gone then you have no civil rights. Every article in the Bill of Rights is grounded in a solid right to privacy. From self incrimination, to illegal search, to the right to assemble and all the rest. None of it means a GD thing without a rock solid right to privacy. This back door for the police, puts all of your rights and freedoms in peril. Today it’s porn they are hunting for, tomorrow it will be abortion providers, the next day info on how you vote, and the day after, who owns a gun, and then who supports and opposes the administration in power. 

    Sweeney can smell blood in the water, but he's made a mistake. Here's what he's thinking:

    Apple is now scanning stuff coming into your phone and reporting it. That means any notion of Apple devices supporting privacy has gone out the window. So why can't I have my app store on Apple's App Store, since privacy is no longer an issue?

    What he hasn't realised is that the iPhone is now the government's favoured phone, since they now have a backdoor that can bypass any attempt at encrypting data: it can scan files before the phone's owner has a chance to encrypt them. So in return, it's time for the government to scratch Apple's back…


  • Reply 35 of 108
    Rayz2016 Posts: 6,957 member
    TechPro said:
    " But no matter where you store your files online the government can always get a subpoena to gain access to your account."

    If a court subpoenaed the files on your phone, or there was some sort of probable cause to examine your phone, fine; that's how our laws work.
    But blanket scanning of all images on your phone, just because you wish to store them in a cloud system you are paying for, is in conflict with the 4th Amendment.

    Then of course there is mission creep.  'Will this backdoor be abused?' isn't the question; 'when?' is.  And the answer is: immediately.
    Apple has already admitted that there is no silver bullet that will prevent the system being abused. Good that they had the foresight to cover their asses as early as possible.
  • Reply 36 of 108
    Rayz2016 Posts: 6,957 member
    Coming from the man whose Epic Games Store is notorious for its spyware sniffing and snooping around people's computers and logs.  As with everything, Timmy tries a scheme and looks like a joke with an obvious agenda.
    Yeah, but unfortunately, he's still right.
  • Reply 37 of 108
    Rayz2016 Posts: 6,957 member
    mbdrake76 said:
    M68000 said:
    Why are people so up in arms about this?  If you have nothing to hide why are you fighting Apple’s attempt to make the world safer from criminals and sick people?   This “ceo” made the amazingly stupid comment about “presumption of guilt”?   Well,  at least in the United States a person is innocent until proven guilty.   

    I’m frankly amazed by the feedback in this forum in the last few days,  people who trust and love Apple and now don’t want to help Apple try to make the world safer.  

    If you work for any large company I hope you know that any email, chat messages and files on company computers can be looked at and scanned by security and network admins.

    if you want total privacy,  get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it - what is so hard to understand?
    Your employer has the right to monitor your activities at work - but certainly not at home.

    My general concern is what happens if things were to go wrong, and things can and do go wrong.  We're Apple users; we expect things to go wrong!  Due to a bug or a hash mismatch (okay, the odds of triggering a false positive are very low), it could be possible for completely innocent images to be flagged incorrectly.  And Apple hasn't exactly got the most marvellous reputation for dealing with sensitive and urgent problems when accounts are closed for something the account holder isn't responsible for.

    But, as many other people have said, it doesn't have to stop there.  The same tech can be used (or expanded) to check for other content that governments could force Apple to weed out, with any infraction reported to them.  This has the capability for abuse (mind you, most things do).

    HOWEVER..

    Adobe already do this with their cloud services.  This is outlined here:

    https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html

    So those using Lightroom Desktop/CC or Lightroom Classic, which sync photos to the Adobe Creative Cloud, are already having their photos scanned with CSAM-detection technology when they're uploaded to Adobe's servers.  I've not seen any articles that mention this, or any feedback from Adobe on it.

    I can definitely see why Apple wants to implement CSAM detection on the iPhone (and perhaps it gives Apple a chance to say to law enforcement: hey, you don't need to crack the phone, we do the job for you!), and it'd be one of the few companies not already doing so (Google already do it through many of their products and services: https://transparencyreport.google.com/child-sexual-abuse-material/reporting?hl=en), but it does somewhat go against their 'privacy matters' mantra.

    Adobe, Microsoft, Apple and Google scan servers for CSAM images and report the ones found to the authorities. 

    Apple, as far as I can tell, is the only one installing spyware on your device to scan files before they reach the servers.

  • Reply 38 of 108

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps, and thinks he is keeping his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what has happened to him!
    Is getting randomly bombarded with child porn images even a thing? I’m seriously asking because I’ve never heard of it.
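    Whatever the answer, the mechanic behind the scenario quoted above comes down to a per-account threshold: each upload that matches a known hash adds a "safety voucher", and nothing is surfaced for human review until the voucher count crosses that threshold. The threshold value and the names below are assumptions for illustration, not Apple's published parameters.

    # Toy model of per-account voucher accumulation with a review threshold.
    # The threshold of 30 and all names here are illustrative assumptions.
    from dataclasses import dataclass

    REVIEW_THRESHOLD = 30  # hypothetical number of matches before human review

    @dataclass
    class Account:
        vouchers: int = 0
        flagged_for_review: bool = False

        def record_upload(self, matches_known_hash: bool) -> None:
            """Called once per photo saved to the cloud photo library."""
            if matches_known_hash:
                self.vouchers += 1
            if self.vouchers >= REVIEW_THRESHOLD:
                self.flagged_for_review = True

    # Unsolicited images auto-saved from chat apps still count toward the total.
    acct = Account()
    for _ in range(35):
        acct.record_upload(matches_known_hash=True)
    print(acct.flagged_for_review)  # True once the threshold is crossed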
  • Reply 39 of 108
    Rayz2016 Posts: 6,957 member
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/


    Google found the child porn by scanning the files on their servers. This is nothing new; Microsoft, Apple and Google have been doing this for quite some time.

    The problem is that this will not find porn if the files are encrypted before they are sent to the server.

    So Apple will get around this by installing a spy program that will scan photos, hash them, compare the hashes with the database of child porn image hashes that has been downloaded to your phone, and report on them before they are sent to iCloud.
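    As a sketch, that order of operations is the whole point: the hash check runs on the device before the photo is encrypted and uploaded, which is why encrypting the upload doesn't defeat it. The function names and the opaque blocklist below are illustrative assumptions, not Apple's actual implementation.

    # Illustrative client-side flow: scan first, then encrypt and upload.
    # perceptual_hash(), encrypt(), upload() and report_match() are stand-ins;
    # the blocklist the device holds is just a set of opaque digests that the
    # phone (and its owner) cannot turn back into images.
    import hashlib

    OPAQUE_BLOCKLIST = {"hypothetical-digest-1", "hypothetical-digest-2"}

    def perceptual_hash(image_bytes: bytes) -> str:
        # Stand-in: a real system would use a perceptual hash so that
        # near-duplicates still match, not a cryptographic hash like SHA-256.
        return hashlib.sha256(image_bytes).hexdigest()

    def sync_photo(image_bytes: bytes, encrypt, upload, report_match):
        digest = perceptual_hash(image_bytes)   # 1. scan on-device, pre-upload
        if digest in OPAQUE_BLOCKLIST:
            report_match(digest)                # 2. attach a safety voucher / report
        upload(encrypt(image_bytes))            # 3. encryption only happens after the scan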

    The problem is that Apple has already admitted that the system is open to abuse.  Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries, this won't be child porn; it'll be images of wanted dissidents, government protestors, or subversive poetry. 

    Apple has basically told the world that they're happy to build a surveillance network for any country that asks them to.

    That's the problem.
  • Reply 40 of 108
    Rayz2016 Posts: 6,957 member
    Stop press!

    RayzInsider has just read a note slipped under the lavatory door by the not-so-well-connected iPundit, Ming I DontReallyKnow.


    In light of its ongoing commitment to protecting user privacy, Apple will be rebranding its ubiquitous smartphone.

    Apple says that all the changes are under the hood. To ensure that the software is installed correctly, say:

    Hey, Siri

    If Siri does not respond with 

    Blessed day, Citizen

    then return the phone immediately to your nearest Apple authorised repair and re-education centre.

    #UnderHisEyePhone