Epic Games CEO slams Apple's 'government spyware'


Comments

  • Reply 101 of 109
    macplusplus Posts: 2,091 member
    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps and thinks he is keeping his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what happened to him!

    Do photos sent via these messaging apps automatically get added to iCloud Photos simply by being received, without ever being viewed?

    We don't know Apple's thresholds. It could be 1,000 vouchers. It could be 10,000 vouchers. If someone has chosen to view 1,000+ such photos, and each photo they view is uploaded to iCloud, then let them go to jail: they are willfully participating in the exchange of such material.

    The technology that Apple has designed and implemented is not the problem; it's strongly designed with privacy in mind. The main concern is that Apple could be forced to turn that same technology to other purposes, which may border on invasion of privacy. It's the potential for abuse that is getting everyone up in arms.

    I trust that Apple has implemented this system really well. Innocent until *proven* guilty, after all, right?
    Whether photos are added automatically or only after viewing doesn't matter. Most people use their phones by trial and error, without a coherent picture of the operating system; they don't know what resides where. Besides, it is not always easy to classify an image as child porn. If the subject is below the age of consent it is child porn, but the age of consent varies between jurisdictions. You defend Apple with the argument "innocent until proven guilty," yet you explicitly refuse to apply it to the decent individual in our fictitious case.
    baconstang
  • Reply 102 of 109
    crowley Posts: 9,113 member
    crowley said:

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps and thinks he is keeping his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what happened to him!
    Just for the sake of fun let's imagine that this would ever happen (it wouldn't).  Your decent guy didn't report someone sending him images of child abuse to the police?  Then he's not such a decent guy.  And if he did then there's a record of that and he can use that in the appeal to Apple if it causes him any issues.
    What country do you live in? Not reporting a crime doesn't make one a criminal; you must explicitly prove that he's involved in the crime. Innocent until proven guilty.
    I didn't say he was a criminal, I said he wasn't a decent guy.  I live in a country where if you receive an indecent image or in any other way suspect that child abuse is happening then you should report it to the authorities.  It's not criminal to not do so, but if you're later found to have such images in your possession it won't do your defence any favours that you failed to be forthcoming and act to prevent child abuse.  

    Also, I live in a country where possession of indecent images of child abuse is a crime.
  • Reply 103 of 109
    My understanding is that there will be several layers of safeguards in place - if the algorithm produces a false positive, it will be checked by a human before anything gets off the ground.

    Hash comparisons can be incredibly reliable: enterprises use deduplication-based storage, which compares hashes of data written to disk to determine whether a file's data even needs to be transmitted or can simply be flagged as a duplicate, and enterprises don't like to lose data.
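    For what it's worth, here is a minimal sketch of that deduplication idea. It is illustrative only: the class and method names are made up, and real deduplication systems work on blocks with far more machinery. But it shows why identical content can be detected cheaply by comparing digests rather than the data itself:

        import hashlib

        class DedupStore:
            """Toy content-addressed store: identical bytes are kept only once."""
            def __init__(self):
                self._blocks = {}  # SHA-256 digest -> stored bytes

            def write(self, data: bytes) -> str:
                digest = hashlib.sha256(data).hexdigest()
                if digest in self._blocks:
                    # Same digest means same content: flag it as a duplicate
                    # and skip storing (or transmitting) it again.
                    return digest
                self._blocks[digest] = data
                return digest

        store = DedupStore()
        first = store.write(b"holiday-photo-bytes")
        second = store.write(b"holiday-photo-bytes")  # duplicate, not stored twice
        assert first == second and len(store._blocks) == 1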
  • Reply 104 of 109
    I don't see why Tim Sweeney is all bent out of shape about protecting children.
    If you're not into spreading child sexual abuse material, then you have nothing to worry about; but if you are then ...
    Tim Sweeney couldn't give a crap about protecting children.

    His aim has been, and always will be, to hurt Apple in any way imaginable.
    thinkman100000000 baconstang
  • Reply 105 of 109
    Not surprised by the epic jerk Sweeney taking every opportunity to jump down Apple's throat over anything perceived as negative, but I am surprised by how many others do the same without knowing the facts of this issue. This kind of pre-judgment has become far too great a part of our culture! Ready, fire, aim.
  • Reply 106 of 109
    SuperSkunkTXKSE said:

    I care. I don't like his Epic Games app store case, but he's RIGHT ON THE MONEY on this one. "Authorities" will lie to gain access to your private data. This technology will expand to other areas. This is a terrible move by Apple. Can you imagine being one of the poor fools who turns up as a false positive? Then law enforcement digs through your stuff with a warrant and finds out you cheated on your taxes. This is a mess. Cloud storage can no longer be trusted as a source of privacy.
    Actually, in order for a hash to match, they'd have to have the original to generate the hash.

    And if they took, say, a photo you posted on Facebook and generated a hash from it, then when that hash matched, a human would inspect the photo to see whether it contained child porn; if not, it would be reported back to NCMEC as a potential error, and the hash could be traced back to the original source photo. If the source were governmental in nature and the photo was not child porn, NCMEC would raise a stink, because having their database polluted from a government source would degrade credibility and trust in that database (which is used by a lot of folks, not just Apple).

    If it were child porn, then the information would make its way to the government for further action.
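    To make the flow just described concrete, here is a heavily simplified, hypothetical sketch. It is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and encrypted safety vouchers, the actual threshold has not been published, and every name and number below is made up for illustration:

        import hashlib

        # Hashes supplied by NCMEC; the scanner never sees the source images.
        KNOWN_CSAM_HASHES = {"<digest supplied by NCMEC>"}
        MATCH_THRESHOLD = 1000  # illustrative only; the real value is unknown

        def photo_hash(photo_bytes: bytes) -> str:
            # Stand-in for a perceptual hash: only content whose digest is
            # already in the database can ever match.
            return hashlib.sha256(photo_bytes).hexdigest()

        def human_reviewer_confirms(photos: list) -> bool:
            # Placeholder for the manual inspection step described above.
            return False

        def review_account(uploaded_photos: list) -> str:
            matches = [p for p in uploaded_photos if photo_hash(p) in KNOWN_CSAM_HASHES]
            if len(matches) < MATCH_THRESHOLD:
                return "no action"  # below the threshold, nothing is even reviewed
            if not human_reviewer_confirms(matches):
                return "reported back to NCMEC as a potential error"
            return "referred for further action"

        print(review_account([b"an ordinary photo"]))  # -> "no action"

    The key property this is meant to show: a photo that is not already in the hash database can never produce a match, no matter what it depicts.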
    thinkman100000000
  • Reply 107 of 109
    basjhj Posts: 93 member
    Kolvorok said:
    Gosh, what a mis-step from Apple, likely to cost them millions... TOTALLY wrong to analyse data on phones without consent. That store of "hashed" data could contain people's faces; it could contain flags, locations, car plates, nudes, screenshots with text... anything a government may be interested in intercepting. Imagine Hong Kong right now, China, Iran, Hungary, or even Western states with a penchant for constantly surveilling and policing their citizens for political reasons... this is UNACCEPTABLE, and I will, heavy-heartedly, after 35 years of non-stop, almost evangelical Apple ownership, have to switch to another phone and/or computer if this function ever comes online.
    I’m pretty sure Apple will ask you for consent, if only for legal reasons. If you refuse, you cannot back up your photos in iCloud. That’s apparently the deal. By the way, I do agree with the ‘slippery slope’ arguments.
    baconstang
  • Reply 108 of 109
    crowley said:
    Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try to excuse looking into private data by pointing to the method of the technology. Apple's entire stance before now is something I have supported. With this ridiculous step they've utterly failed, and they should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of "oh, your data is only yours... well, unless we find a reason for it not to be."
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
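    As an aside, here is a toy sketch of the key-custody distinction drawn two paragraphs up: when the service holds the encryption key it can decrypt data on demand (for example under a warrant), whereas with end-to-end encryption only the user's device can. It uses the third-party Python cryptography package's Fernet purely as a stand-in; the names and mechanics are illustrative, not Apple's actual design:

        from cryptography.fernet import Fernet

        # Server-side encryption: the service generates and keeps the key.
        service_key = Fernet.generate_key()
        stored = Fernet(service_key).encrypt(b"user photo bytes")
        # When legally compelled, the service can decrypt what it stores:
        print(Fernet(service_key).decrypt(stored))

        # End-to-end encryption: the key never leaves the user's device.
        device_key = Fernet.generate_key()
        stored_e2e = Fernet(device_key).encrypt(b"user photo bytes")
        # The service holds only ciphertext; without device_key it cannot decrypt.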
    Interesting. If Apple were to provide "hardware as a service" (which I and others have speculated about before) then it would probably qualify as a lease rather than a purchase, and Apple would retain more rights. This is worth further thought.
  • Reply 109 of 109
    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/
    Two reasons:
    1. Apple is doing a portion of it on the phone. That is, at minimum, cause for concern. Prior case law suggests that ownership of picture and text data held in a third-party cloud is up for debate. But a phone you bought, you own.
    2. Because it is Apple. Apple has positioned and marketed itself as a company that provides much greater privacy, so it's fair game that it gets looked at more skeptically when that privacy appears negotiable. Look, outside of the trolls and the adamantly angry Android crowd, no one actually believes Google is a data-privacy company. They are a secure-communications company; end-to-end encryption they are good at. But they collect as much of your personal data as they can, and nobody except the aforementioned would be silly enough to claim otherwise.

    IMHO the question isn't about what Google does about CSAM. CSAM will probably always make privacy rights negotiable (for better or worse) when it is in the cloud. The better question about on-phone surveillance is: why is on-phone scanning for CSAM an outrage, while on-phone tracking of where you go, who you know, who you talk to most, what you buy (even with cash!), the phrases you commonly type, and more is not an ongoing privacy uproar? Seriously, this is not up for debate; Google has shared this ongoing surveillance with governments. So to me what's odd here is the singular anger at scanning for kiddie porn, while on-phone scanning and recording (and reporting back to servers) of who you know and where you go (etc., etc.) is just A-OK, business as usual, not worth any discussion. On that front I'm very disappointed that the EFF and Snowden and others didn't use the big stage of the Apple discussion to also rope surveillance capitalism into the public conversation. JHFC, Facebook! is taking the privacy high ground (Facebook!). That's surreal.
    Privacy advocates are using the anti-Apple frenzy of the moment to get media airplay, at the expense of making a real difference for privacy by also roping in Google and Facebook. What they both do has been, and is, as slippery a slope as it gets (the more they can find out about you, the more money they make, so the more they are incentivized to have your phone record and report back to corporate servers; gee, how could that be a slippery slope?).