Epic Games CEO slams Apple over 'government spyware'


Comments

  • Reply 101 of 108
    My understanding is that there will be several layers of safeguards in place - if the algorithm produces a false positive, it will be checked by a human before anything gets off the ground.

    Hash comparisons can be incredibly reliable - enterprises use deduplication-based storage, which compares hashes of data written to disk to decide whether a file's data even needs to be transmitted or can simply be flagged as a duplicate - and enterprises don't like to lose data. A rough sketch of the idea is below.
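    As a minimal sketch of that deduplication idea (names and storage layout invented for illustration, not any vendor's actual API):

    ```python
    # Hash-based deduplication: identical content yields identical hashes,
    # so a chunk whose hash is already known never needs storing (or
    # transmitting) again. Illustrative only.
    import hashlib

    store: dict[str, bytes] = {}  # hash -> chunk; stands in for the backing store

    def write_chunk(chunk: bytes) -> str:
        """Persist a chunk only if its hash is new; return the hash either way."""
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:   # new data: actually store it
            store[digest] = chunk
        return digest             # known hash: just record the reference

    # Two writes of the same bytes share one stored copy.
    h1 = write_chunk(b"quarterly-report-final.pdf contents")
    h2 = write_chunk(b"quarterly-report-final.pdf contents")
    assert h1 == h2 and len(store) == 1
    ```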
  • Reply 102 of 108
    I don't see why Tim Sweeney is all bent out of shape about protecting children.
    If you're not into spreading child sexual abuse material, then you have nothing to worry about; but if you are then ...
    Tim Sweeney couldn't care less about protecting children.

    His bent has been and always will be to hurt Apple in any way imaginable.
  • Reply 103 of 108
    Not surprised by the epic jerk Sweeney taking every opportunity to jump down Apple's throat over anything perceived as negative, but I am surprised by how many others are doing the same without knowing the facts of this issue. This kind of pre-judgment has become far too great a part of our culture! Ready, Fire, Aim.
  • Reply 104 of 108
    SuperSkunkTXKSE said:

    I care.  I don't like his Epic Games app store case, but he's RIGHT ON THE MONEY on this one.  "Authorities" will lie to gain access to your private data.  This technology will expand to other areas.  This is a terrible move by Apple.  Can you imagine if you are one of the poor fools who gets a false positive?  Then law enforcement digs through your stuff with a warrant and finds out you cheated on your taxes.  This is a mess.  Cloud storage can no longer be trusted as a source for privacy.
    Actually, in order for a hash to match, they'd have to have the original to generate the hash.

    And if they took, say, a photo you posted on Facebook and generated that hash, then when the hash collision was detected a human would inspect the photo to see if it contained child porn; if it didn't, the error would be reported back to NCMEC as a potential error, where the hash could be back-checked to the original source photo. If the source were governmental in nature and the photo was not child porn, NCMEC would raise a stink, because having their database polluted by a government source would degrade the credibility of and trust in their database (which is used by a lot of folks - not just Apple).

    If it were child porn, then the information would make its way to the government for further action.
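    As a hypothetical sketch of the review flow described above (all names invented for illustration; this is not Apple's or NCMEC's actual process or API):

    ```python
    # A hash match alone triggers nothing: a human reviewer decides, and
    # non-CSAM matches are reported back so the source hash can be audited.
    from dataclasses import dataclass

    @dataclass
    class Match:
        photo_id: str
        matched_hash: str

    def handle_match(match: Match, reviewer_confirms_csam: bool) -> str:
        if reviewer_confirms_csam:
            # Confirmed match: escalate for further action.
            return f"escalate {match.photo_id} to NCMEC"
        # False positive: flag the hash so its source entry can be back-checked.
        return f"report hash {match.matched_hash} back for database audit"

    print(handle_match(Match("IMG_0042", "a3f9..."), reviewer_confirms_csam=False))
    ```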
  • Reply 105 of 108
    Kolvorok said:
    Gosh, what a misstep from Apple, likely to cost them millions... TOTALLY wrong to analyse data on phones without consent. That store of "hashed" data could contain people's faces, flags, locations, car plates, nudes, screenshots with text... anything a government may be interested in intercepting. Imagine Hong Kong right now, China, Iran, Hungary, or even Western states with a penchant for constantly surveilling and policing their citizens for political reasons... this is UNACCEPTABLE, and I will, heavy-heartedly, after 35 years of non-stop, almost evangelical Apple ownership, have to switch to another phone and/or computer if this function ever comes online.
    I’m pretty sure Apple will ask you for consent, if only for legal reasons. If you refuse, you cannot back up your photos in iCloud. That’s apparently the deal. By the way, I do agree with the ‘slippery slope’ arguments.
  • Reply 106 of 108
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try to excuse looking into private data by pointing at the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and they should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of “Oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private, but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys - practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data at any time by presenting a search warrant. (A toy sketch of this provider-key vs. end-to-end distinction follows at the end of this reply.)

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
    Interesting. If Apple were to provide "hardware as a service" (which I and others have speculated about before) then it would probably qualify as a lease rather than a purchase, and Apple would retain more rights. This is worth further thought.
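    As a toy illustration of the provider-key vs. end-to-end distinction raised above (using symmetric encryption as a stand-in; this is not how iCloud is actually implemented):

    ```python
    # When the provider generates and holds the key, it can decrypt stored
    # data on demand, e.g. for a warrant. With end-to-end encryption the key
    # never leaves the user's device, so the provider holds only ciphertext.
    from cryptography.fernet import Fernet

    # Provider-held key: the server can always read what it stores.
    provider_key = Fernet.generate_key()       # lives on the provider's servers
    blob = Fernet(provider_key).encrypt(b"user photo bytes")
    print(Fernet(provider_key).decrypt(blob))  # provider (or a warrant) can read it

    # End-to-end: the key stays on the device; the server sees only ciphertext.
    device_key = Fernet.generate_key()         # never uploaded
    e2e_blob = Fernet(device_key).encrypt(b"user photo bytes")
    # Without device_key, the provider cannot decrypt e2e_blob.
    ```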
  • Reply 107 of 108
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/
    Two reasons:
    1. Apple is doing a portion of it on the phone. That alone raises at least an element of concern. Prior case law suggests that ownership of data in a third-party cloud - your pics and texts sitting on someone else's servers - is up for debate. A phone you bought, you own.
    2. Because it is Apple. Apple has been, and has marketed itself as, a company that provides much greater privacy. Fair game that it will be looked at more skeptically when that privacy appears negotiable. Look, outside of the trolls and the angry, adamantly-Android crowd, no one actually believes Google is a data privacy company. They are a secure-comms company - end-to-end encryption they are good at. But they collect as much of your personal data as they can; LOL, no one except the aforementioned would be silly enough to claim otherwise.

    IMHO the question isn't about what Google does about CSAM. CSAM will probably always make privacy rights negotiable (for better or worse) when it is in the cloud. The better question about on-phone surveillance is: why is on-phone watching for CSAM an outrage, but on-phone watching of where you go, who you know, who you talk to most, what you buy (even with cash!), common phrases you type, and more and more, not an ongoing privacy uproar? Seriously, this is not up for debate; Google has shared this ongoing surveillance with government. So to me, what's odd here is the singular anger at scanning for kiddie porn, while scanning and recording on phone (and back to servers) of who you know and where you go (etc etc etc etc) is just A-OK, business as usual, not worth any discussion. On that, I'm very disappointed that EFF and Snowden and others didn't use the big stage of the Apple discussion to also rope Surveillance Capitalism into the public discussion. JHFC, Facebook! is taking the privacy high ground (Facebook!). That's surreal.
    Privacy advocates are using the anti-Apple frenzy of the moment to get media airplay - at the expense of making a real difference for privacy by also roping in Google and Facebook. What they both do has been, and is, as slippery a slope as it gets: the more they can find out about you, the more money they make, so the more they are incentivized to have your phone record and report back to corporate servers. Gee, how could that be a slippery slope?