Epic Games CEO slams Apple 'government spyware'

Comments

  • Reply 41 of 108
    JFC_PA Posts: 932 member
    A cable and port for local iPhone backup are unnecessary now. Just back up over Wi-Fi to your desktop machine.

    For offsite security, I’d say simply buy a second hard drive and swap them weekly. Store the spare drive outside your home, even if only in your car, or wherever. Before remote servers, that was long the recommendation for small businesses.
  • Reply 42 of 108
    Sweeney is only out to hurt Apple, whatever side he needs to take. So stop commenting on whether he’s right or not here. You are only feeding the troll.

    Instead, take the discussion of whether Apple’s filtering function is a good thing or bad censorship to one of the other discussion threads here at AI.

    And Mr Sweeney, you should just stay out of any CSAM discussions wherever they are taking place. Because just this weekend you invited every single Fortnite player to your biggest event of the year — a face fuck fest with minors, teenagers, and adults. Oral sex is still child abuse even if taking place in the party life of Fortnite. Your own star artist, Ariana Grande, even turned her ass on you after this was taking place between millions of kids all over the party:



    So, Mr Swiney man … STFU!
    edited August 2021
  • Reply 43 of 108
    fordee Posts: 31 member
    It’s funny that everyone is calling Apple out on this but doesn’t mention that it’s already happening on other major cloud platforms. Why not call Amazon out as well? Also, Tim Sweeney hasn’t mentioned that Epic’s owner, Tencent, already surveils Chinese citizens. Having said that, I can appreciate the slippery-slope concerns.
  • Reply 44 of 108
    Rayz2016 Posts: 6,957 member
    Sweeney is only out to hurt Apple, whatever side he needs to take. So stop commenting on whether he’s right or not here. You are only feeding the troll.

    Instead, take the discussion of whether Apple’s filtering function is a good thing or bad censorship to one of the other discussion threads here at AI.
    Sorry, no.

    If I'm happy enough to tear strips off him when I think he's wrong, then I should be man enough to say so when I think he's right, no matter what his motives are (which I don't imagine have changed for even a minute).


  • Reply 45 of 108
    Rayz2016 Posts: 6,957 member
    Mmmm.

    I think Apple was hoping that by slapping the ‘protect the children!’ label on their surveillance technology, no one would dare question what they were doing. Not sure that worked out for them, especially since they’ve already been scanning iCloud for CSAM files, same as Google, same as Microsoft.
  • Reply 46 of 108
    Rayz2016 said:
    Sweeney is only out to hurt Apple, whatever side he needs to take. So stop commenting on whether he’s right or not here. You are only feeding the troll.

    Instead, take the discussion of whether Apple’s filtering function is a good thing or bad censorship to one of the other discussion threads here at AI.
    Sorry, no.

    If I'm happy enough to tear strips off him when I think he's wrong, then I should be man enough to say so when I think he's right, no matter what his motives are (which I don't imagine have changed for even a minute).


    I just updated my comment to better reflect what kind of person you would be backing, Rayz2016. In principle, I understand you. But in practice we must ignore trolls like Swiney.
  • Reply 47 of 108
    Rayz2016 Posts: 6,957 member
    From the same thread:

    And here's something else I learned.

    I have been unfair to the Apple employees who complained about wanting to work from home. Although I still disagree that every company can work successfully with all of its employees at home full-time, reading back over my posts I gave the impression that they shouldn't be complaining. That was wrong. They have every right to complain.

    Google's employees complained that Project Dragonfly should be shut down, and Google shut it down. That was good, because Dragonfly was a terrible idea.
  • Reply 48 of 108
    Rayz2016 Posts: 6,957 member
    Rayz2016 said:
    Sweeney is only out to hurt Apple, whatever side he needs to take. So stop commenting on whether he’s right or not here. You are only feeding the troll.

    Instead, take the discussion of whether Apple’s filtering function is a good thing or bad censorship to one of the other discussion threads here at AI.
    Sorry, no.

    If I'm happy enough to tear strips off him when I think he's wrong, then I should be man enough to say so when I think he's right, no matter what his motives are (which I don't imagine have changed for even a minute).


    I just updated my comment to better reflect what kind of person you would be backing, Rayz2016. In principle, I understand you. But in practice we must ignore trolls like Swiney.

    It's not a question of 'backing' him; it's simply agreeing with him when he's right, just as I agree with Apple when I think they are right and everyone else thinks they're wrong.

    Sweeney made a good point; I agree with it. 
  • Reply 49 of 108
    Rayz2016 said:
    Rayz2016 said:
    Sweeney is only out to hurt Apple, whatever side he needs to take. So stop commenting on whether he’s right or not here. You are only feeding the troll.

    Instead, take the discussion of whether Apple’s filtering function is a good thing or bad censorship to one of the other discussion threads here at AI.
    Sorry, no.

    If I'm happy enough to tear strips off him when I think he's wrong, then I should be man enough to say so when I think he's right, no matter what his motives are (which I don't imagine have changed for even a minute).


    I just updated my comment to better reflect what kind of person you would be backing, Rayz2016. In principle, I understand you. But in practice we must ignore trolls like Swiney.

    It's not a question of 'backing' him; it's simply agreeing with him when he's right, just as I agree with Apple when I think they are right and everyone else thinks they're wrong.

    Sweeney made a good point; I agree with it. 
    Ok.
    I’m sure Hitler was right about a few things too. But personally I’m quite happy not shouting out my praise for him just because of that. I just ignore it and take my debate elsewhere.
    edited August 2021
  • Reply 50 of 108
    jll Posts: 2,713 member
    a hawkins said:

    Let’s say it’s not just about child abuse: what if Apple decide to do other things?

    For example, if the Chinese government wants Apple to match Tiananmen, Hong Kong, or Muslim Uighur-related photos and report them to the Chinese government in exactly the same way, what can you do in that situation?
    What's stopping them from doing that today? Apple already scan images in your iCloud Photo Library server-side. The point is to move that from the server to the phone, and I would guess it's a step towards enabling end-to-end encrypted photo backups, which would mean Apple has less data to hand over when the authorities ask.
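    To illustrate the key-custody difference that makes this matter, here is a toy sketch (my own, using the Python `cryptography` package's Fernet purely as a convenient stand-in; nothing here is Apple's actual design):

    ```python
    # Toy contrast between provider-held keys and end-to-end encryption.
    # Fernet is used only as a simple symmetric cipher for illustration.
    from cryptography.fernet import Fernet

    # "Encrypted, but with Apple's keys": the provider holds the key, so the
    # provider (or anyone with a warrant served on the provider) can decrypt.
    provider_key = Fernet.generate_key()
    blob = Fernet(provider_key).encrypt(b"photo bytes")
    print(Fernet(provider_key).decrypt(blob))  # provider can always read this

    # End-to-end: the key never leaves the user's device, so a warrant served
    # on the provider yields only ciphertext.
    device_key = Fernet.generate_key()  # stays on the device
    e2e_blob = Fernet(device_key).encrypt(b"photo bytes")
    # Without device_key, e2e_blob is unreadable to the provider.
    ```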
  • Reply 51 of 108
    jll Posts: 2,713 member
    Rayz2016 said:

    Apple, as far as I can tell, is the only one installing spyware on your device to scan files before they reach the servers.

    On files that otherwise would reach the servers anyway.
  • Reply 52 of 108
    jll Posts: 2,713 member
    Rayz2016 said:

    Apple is now scanning stuff coming into your phone and reporting it.
    That's not what the system will do. They scan hashes of "stuff" going out of your phone into your iCloud Photo Library: stuff they already scan today.
  • Reply 53 of 108
    mbdrake76 said:
    M68000 said:
    Why are people so up in arms about this? If you have nothing to hide, why are you fighting Apple’s attempt to make the world safer from criminals and sick people? This “CEO” made the amazingly stupid comment about a “presumption of guilt”? Well, at least in the United States a person is innocent until proven guilty.

    I’m frankly amazed by the feedback in this forum over the last few days: people who trust and love Apple but now don’t want to help Apple try to make the world safer.

    If you work for any large company, I hope you know that any email, chat messages, and files on company computers can be looked at and scanned by security and network admins.

    If you want total privacy, get rid of cloud storage and maybe go back to using a typewriter. LOL.

    Apple does not want its products used for crime and is making an attempt to do something about it. What is so hard to understand?
    Your employer has the right to monitor your activities at work, but certainly not at home.

    My general concern is about things going wrong, and things can and do go wrong. We're Apple users; we expect things to go wrong! Due to a bug or hash mismatch (okay, the odds of triggering a false positive are very low), it could be possible for completely innocent images to be flagged incorrectly. And Apple doesn't exactly have the most marvellous reputation for dealing with sensitive and urgent problems when accounts are closed for something the account holder isn't responsible for.

    But, as many other people have said, it doesn't have to stop there. The same tech could be used (or expanded) to check for other content that governments could force Apple to weed out, notifying them of any infraction. This has the capability (mind you, most things do) for abuse.

    HOWEVER...

    Adobe already do this with their cloud services.  This is outlined here:

    https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html

    So those using Lightroom Desktop/CC or Lightroom Classic, which sync photos to the Adobe Creative Cloud, are already having their photos scanned with CSAM-detection technology when they're uploaded to Adobe's servers. I've not seen any articles that mention this, or any feedback from Adobe on it.

    I can definitely see why Apple wants to implement CSAM detection on the iPhone (perhaps giving Apple the chance to say to law enforcement: hey, you don't need to crack the phone, we do the job for you!), and it'd be one of the few companies not already doing so (Google already does it through many of its products and services: https://transparencyreport.google.com/child-sexual-abuse-material/reporting?hl=en), but it does somewhat go against their "privacy matters" mantra.
    Tim Sweeney is being completely self-serving and is actually causing huge harm to an important discussion by adding misinformation and a false narrative.

    There is a definite discussion to be had about this technology and how it could be repurposed, but these extreme responses on either end are missing key points. 

    I’m ONLY talking about the iCloud-upload CSAM hash scanning here… not to be confused with the separately proposed underage-image messaging feature; the two are being mixed together to conflate the issue.

    Apple are not ‘scanning your photos’ and have no access to the images with this technology.
    The ‘hash’ generated ON DEVICE is simply a string of numbers and letters that cannot be interpolated back into an image. It is then compared to hashes of known CSAM pictures, and ONLY if you upload them to Apple’s servers. The chance of a single false positive is inconsequentially small; the chance of many false matches is infinitesimal.
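    To make the matching step concrete, here is a deliberately simplified sketch. Apple's system actually uses a perceptual NeuralHash plus private set intersection; SHA-256 and a plain set stand in here purely to show that the comparison involves one-way digests, never the photos themselves, and the digest in the list is a made-up placeholder:

    ```python
    # Simplified sketch of hash-list matching; NOT Apple's NeuralHash/PSI
    # protocol. SHA-256 stands in for the perceptual hash, and the digest
    # below is a hypothetical placeholder, not a real database entry.
    import hashlib

    KNOWN_BAD_DIGESTS = {
        "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
    }

    def image_digest(image_bytes: bytes) -> str:
        """One-way digest; it cannot be turned back into pixels."""
        return hashlib.sha256(image_bytes).hexdigest()

    def matches_known_material(image_bytes: bytes) -> bool:
        """True only on an exact digest match against the known list."""
        return image_digest(image_bytes) in KNOWN_BAD_DIGESTS

    print(matches_known_material(b"an ordinary holiday photo"))  # False
    ```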
    This is NOT a privacy issue. 

    It IS a civil liberty issue. Can this tool be adapted for nefarious uses by governments? Yes. Does the technology already exist? Yes. Are photos stored and transmitted on other platforms scanned for CSAM? Yes. Do they do it with any due diligence for your privacy, like Apple are proposing? No.

    Even if multiple positive matches are identified, a subpoena is required to access your photos. At no point are your photos being ‘scanned’ or your privacy being broken.
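    Some rough numbers behind the "many false matches is infinitesimal" claim. The per-image false-match rate and the threshold below are my illustrative assumptions, not Apple's published figures:

    ```python
    # Back-of-envelope estimate of reaching a reporting threshold by chance.
    # Assumed inputs (illustrative only): n photos, per-image false-match
    # probability p, reporting threshold t. With lam = n * p much smaller
    # than t, the Poisson tail P(X >= t) is dominated by its first term.
    from math import exp, lgamma, log

    def tail_estimate(n_images: int, p: float, threshold: int) -> float:
        lam = n_images * p  # expected number of false matches
        log_term = threshold * log(lam) - lam - lgamma(threshold + 1)
        return exp(log_term)

    # 10,000 photos, a one-in-a-million per-image rate, threshold of 30:
    print(tail_estimate(10_000, 1e-6, 30))  # ~4e-93, i.e. effectively zero
    ```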

    The big risk here is governments forcing Apple to use this tech for ‘anti-terrorism’ and killing free speech. Will that happen? Probably not. BUT THE TECHNOLOGY ALREADY EXISTS. Apple are not the enemy here.

    Can you turn it off? Yes.

    With respect to the other technology, notifying parents of possible nude photos being sent to underage children: your phone already ‘scans’ your photos with AI, which is how searching for ‘car’ or for a person finds the right photo.
    Simplifying things, they are using the same on-device tech to spot, before an image is sent by a child or received by one, whether it could be tagged as ‘nude’ or with other relevant tags.
    The parent is then notified. (A rough sketch of that gating logic follows.)
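    The sketch below is a toy model of the gating step as I understand it; the classifier is a hypothetical placeholder, since the real feature runs an on-device CoreML model whose details aren't public:

    ```python
    # Toy model of the Messages gating step; nothing leaves the device for
    # this check. `classify_on_device` is a hypothetical placeholder for the
    # same kind of on-device tagger that lets photo search find "car".
    def classify_on_device(image_bytes: bytes) -> set:
        """Placeholder tagger returning labels for an image."""
        return {"beach", "car"}  # hypothetical tags

    def should_intervene(image_bytes: bytes, is_child_account: bool) -> bool:
        """Warn the child (and possibly a parent), only for child accounts."""
        return is_child_account and "nude" in classify_on_device(image_bytes)

    print(should_intervene(b"...", is_child_account=True))  # False for these tags
    ```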

    If an underage child were sending or receiving messages containing the word ‘sex’, or other words or phrases, and Apple allowed parents to be informed, it would not be an issue.
    Yet it is the same result. The technology is already there and already being implemented; they have just joined the dots.

    Again, can this be repurposed? Yes. Could it cause some children to be harmed by abusive parents? Possibly. Will it stop many children being groomed or abused? Undoubtedly. Is it a risk worth taking? That is the discussion needed.
  • Reply 54 of 108
    FYI: years ago, Yahoo filed a lawsuit against the government over subpoenas for data. This was at the height of the Snowden-era hysteria about hypothetical surveillance. The case was dismissed by the judge because Yahoo had no evidence of the government abusing the procedure; all they had were hypotheticals about what the government "could do", not anything the government was actually doing.

    I would recommend that example to all the people in this thread dealing in hypotheticals about the hash scanning. People can imagine all kinds of scenarios that might happen, but at the end of the day you need more than that if you really think your rights are being violated.
  • Reply 55 of 108
    kimberly Posts: 427 member
    With any programming algorithm trying to identify child abuse material, being wrong some of the time is inevitable. Imagine getting flagged for child abuse when the source is perfectly legit: the accusation is proven false, but too bad, mud sticks. Apple should concentrate on making its voice assistant more than comic relief (who was going to buy a HomePod at that price when Siri was a laughing stock?), or reviving Wi-Fi with mesh devices, or... you know, useful stuff.
  • Reply 56 of 108
    Another FYI: text from the 2019 version of the iCloud user agreement. Apple has obviously already told people using the service that their files can be screened on iCloud.

    ----

    B. Your Conduct

    You agree that you will NOT use the Service to:

    a. upload, download, post, email, transmit, store or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy, hateful, racially or ethnically offensive, or otherwise objectionable;

    b. stalk, harass, threaten or harm another;

    c. if you are an adult, request personal or other information from a minor (any person under the age of 18 or such other age as local law defines as a minor) who is not personally known to you, including but not limited to any of the following: full name or last name, home address, zip/postal code, telephone number, picture, or the names of the minor’s school, church, athletic team or friends;

    d. pretend to be anyone, or any entity, you are not — you may not impersonate or misrepresent yourself as another person (including celebrities), entity, another iCloud user, an Apple employee, or a civic or government leader, or otherwise misrepresent your affiliation with a person or entity (Apple reserves the right to reject or block any Apple ID or email address which could be deemed to be an impersonation or misrepresentation of your identity, or a misappropriation of another person’s name or identity);

    e. engage in any copyright infringement or other intellectual property infringement (including uploading any content to which you do not have the right to upload), or disclose any trade secret or confidential information in violation of a confidentiality, employment, or nondisclosure agreement;

    f. post, send, transmit or otherwise make available any unsolicited or unauthorized email messages, advertising, promotional materials, junk mail, spam, or chain letters, including, without limitation, bulk commercial advertising and informational announcements;

    g. forge any TCP-IP packet header or any part of the header information in an email or a news group posting, or otherwise putting information in a header designed to mislead recipients as to the origin of any Content transmitted through the Service (“spoofing”);

    h. upload, post, email, transmit, store or otherwise make available any material that contains viruses or any other computer code, files or programs designed to harm, interfere or limit the normal operation of the Service (or any part thereof), or any other computer software or hardware;

    i. interfere with or disrupt the Service (including accessing the Service through any automated means, like scripts or web crawlers), or any servers or networks connected to the Service, or any policies, requirements or regulations of networks connected to the Service (including any unauthorized access to, use or monitoring of data or traffic thereon);

    j. plan or engage in any illegal activity; and/or

    k. gather and store personal information on any other users of the Service to be used in connection with any of the foregoing prohibited activities.

    C. Removal of Content

    You acknowledge that Apple is not responsible or liable in any way for any Content provided by others and has no duty to screen such Content. However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.

    https://www.apple.com/legal/internet-services/icloud/


    edited August 2021
  • Reply 57 of 108
    crowley Posts: 10,453 member
    crowley said:
    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try to excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. With this ridiculous step they’ve utterly failed, and they should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of “Oh, your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" attached if there is a CSAM match. And if those vouchers reach a certain threshold, they report you. (See the sketch after this exchange.)
    They're refusing to even take it without first checking if it matches a known child abuse picture.
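    For what it's worth, the "safety voucher" flow described in this exchange can be modelled roughly as follows. This sketches only the accounting: Apple's actual design uses threshold secret sharing, so the server can read nothing below the threshold, and the threshold value here is an assumption:

    ```python
    # Simplified model of per-account safety vouchers; illustrative only.
    # The real system keeps voucher contents cryptographically sealed until
    # the threshold is met, and a human reviews matches before any report.
    from collections import defaultdict

    REPORT_THRESHOLD = 30  # assumed value, for illustration

    vouchers = defaultdict(list)  # account id -> vouchers from matched uploads

    def flag_for_human_review(account: str) -> None:
        print(f"account {account} queued for manual review")

    def upload(account: str, photo_id: str, matched: bool) -> None:
        """Every upload is accepted; a voucher is attached only on a match."""
        if matched:
            vouchers[account].append(photo_id)
            if len(vouchers[account]) >= REPORT_THRESHOLD:
                flag_for_human_review(account)

    # e.g. simulating matched uploads; prints once the threshold is reached:
    for i in range(30):
        upload("acct-1", f"photo-{i}", matched=True)
    ```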
  • Reply 58 of 108
    srice Posts: 120 member
    Rayz2016 said:

    The problem is that Apple has already admitted that the system is open to abuse.  Since the images are hashed, then Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries, this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry. 

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.
    This!
  • Reply 59 of 108
    fordee said:
    It’s funny that everyone is calling Apple out on this but doesn’t mention that it’s already happening on other major cloud platforms. Why not call Amazon out as well? Also, Tim Sweeney hasn’t mentioned that Epic’s owner, Tencent, already surveils Chinese citizens. Having said that, I can appreciate the slippery-slope concerns.
    Maybe because Amazon and Google don’t shout about how great they are regarding privacy.

    iCloud and other cloud content is already being scanned server-side, so why implement something shady like this, tarnishing their reputation for protecting their customers’ data?
  • Reply 60 of 108
    crowley Posts: 10,453 member
    srice said:
    Rayz2016 said:

    The problem is that Apple has already admitted that the system is open to abuse.  Since the images are hashed, then Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries, this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry. 

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.
    This!
    Except Apple review the images before raising any alarm. If it’s not child abuse, no alarm.