Apple details user privacy, security features built into its CSAM scanning system

Comments

  • Reply 21 of 71
    Rayz2016 Posts: 6,957member
    User data should only be used to improve that user's experience, not shared with other orgs, no matter what the scenario is.

    No user data is being shared with other orgs — until a very concerning threshold has been met, and Apple has full rights to audit and monitor their own servers for abuse. They would be irresponsible to allow anybody to use their servers for any purpose without some protections in place.

    This is not an invasion of privacy, no matter how people want to spin it.

    They’re running spyware on your phone that reports on you without informing you, and along the way they’ve failed to answer the questions of overreach. When a government demands by law that this “feature” be extended, Apple’s best response is that they’ll resist, which hasn’t worked at all in China or Saudi Arabia. 

    Replacing “scanning” with “analysing” doesn’t help. And having strangers look at your pictures when the system fails? … The most egregious example of privacy-breaking I’ve seen in years. 

  • Reply 22 of 71
    Rayz2016 Posts: 6,957member
    chadbag said:
    The problem is the hubris of Apple and Tim Cook and his EVP and other staff.   They are so convinced that all their "social" wokeness and initiatives are 100% correct and a mission from god (not God).  They are not listening.  They don't care.  They think they are right and just need to convince you of that.
    Yes, the hubris of Apple...and every other company that was already doing this. Google, Microsoft, Facebook, Dropbox, Snapchat, ... Oh look: "To date, over 1,400 companies are registered to make reports to NCMEC’s CyberTipline and, in addition to making reports, these companies also receive notices from NCMEC about suspected CSAM on their servers."

    https://www.missingkids.org/theissues/csam#bythenumbers
    Google, Facebook, Snapchat and Microsoft aren’t running spyware on your phone. This is the difference that detractors of this move bring up every time, and that supporters of this move are desperate to ignore. 

    Oh, and Apple already scans their servers for CSAM images, so why move it to the phone? 

    And this is what this is all about, in my opinion: the spyware. Apple needs you to accept the argument that with spyware running on your phone, your privacy is safe. Wrapping it in the noble cause of child protection was a good move; they hoped that if anyone criticised them then their supporters would use cries of “think of the children!” to silence them. 

    So why are they doing it?

    So when they introduce a client-side logger to record the apps you run and the sites you visit, they can tell you your privacy is safe even though this representation of your activity is sold on to advertisers. And you will, of course, support them and agree: “No, look; it’s not an invasion of privacy! They’re only sending this bit of data. The photo of you is blurred and they’ve used AI to scrub out your face!” You will agree because you were fooled the first time round, but rather than admit it, you’ll carry on desperately ignoring the obvious fact that this is spyware Apple is running on your phone. You’ll ignore the fact that Apple is trying to redefine what privacy is so they can sell yours. 

    Over the years, we’ve been throwing around the phrase, “With Google, you’re the product.”  Google monetises your data. 

    With Apple, it’s different: access to you is the product. They’ve spent years and billions of dollars cultivating a user base of affluent people who actually buy stuff … and if you want access to that user base, Apple thinks you should pay them. 

  • Reply 23 of 71
    If multiple high-ranking executives at Apple have to come out with damage control PR in the span of one week, this should be a clue that the initiative, however well-intentioned, is deeply unpopular and should be abandoned.

    Apple is not law enforcement or a government agency.  They can't say no forever.  They've already said yes to exemptions for China, Russia, and Saudi Arabia just to keep selling the phones in those countries.  This CSAM process can and will be abused by foreign governments to coerce Apple to carve out exemptions under the threat of having Apple devices banned, embargoed, or cut off from the Internet.
  • Reply 24 of 71
    jdw Posts: 1,324member
    Apple is still not making the majority of people feel comfortable with its plan, and as such Apple has the moral obligation to at the very least DELAY its plan until it can better convey to the public why they will be 100% protected.  That remains true even if some contend this is being blown out of proportion.

    Apple needs to explain IN DETAIL how it will proactively help anyone falsely accused of engaging in illicit activities seeing that Apple would be primarily responsible for getting those falsely accused people in that situation in the first place.  That discussion must include monetary compensation, among other things.

    Apple needs to explain how it intends to address the mental toll on its own employees or contracted workers who will be forced to frequently examine kiddy porn to determine if the system properly flagged an account or not.  THIS IS HUGE and must not be overlooked!  On some level it is outrageous that the very content we wished banned from the world will be forced upon human eyes in order to determine if a machine made a mistake or not.

    Only when these points have been adequately addressed should Apple begin implementing their plan or a variant of it.  The next operating system release is too soon.
  • Reply 25 of 71
    xyzzy-xxx said:
    It's just unacceptable to build spyware into billions of devices – now that criminals are warned, who are they going to catch?

    I will NEVER accept that my data will be scanned on my OWN DEVICES and for sure I won't PAY FOR A DEVICE to spy on me.

    iOS 15 and thereby iPhone 13 is now dead for me.
    I totally agree. I don’t want this on my iPhone. Period. 
  • Reply 26 of 71
    If they push this through, I am done with Apple. I will never, ever accept this on my phone. My data are mine and I don’t need any tech company scanning it. Period. I expect Apple to cancel this feature.
  • Reply 27 of 71
    citpeks Posts: 246member
    The irony is that if Apple had not made a big deal of these features, it probably would have been subject to a lot less scrutiny and criticism.  As noted, other companies already take measures against CSAM, and have for a long time.  A bit of opaqueness, like the way it has handled subpoenas for iCloud data and such, wouldn't have generated as much attention or blowback.  San Bernardino was only a thing because the FBI made it a thing.  Apple wasn't the instigator in that dispute, and fought back only because it had to.

    Traditionally, Apple has been deft at avoiding hot button issues, and other dilemmas, by "not going there," but the company has made a rare misstep this time, and worse, is doubling down on its stance.

    Even worse, it has proven the doubters and skeptics right, has made a mockery of that infamous ad it placed in Vegas, and damaged its credibility, which is hard to build but easy to lose.  Nobody to blame but itself for this one.

    However well-intentioned, or morally justified, it has taken a philosophical leap from its former position.  People are getting bogged down arguing the hows, but it's the principle that is at the heart of the dispute, not the tech or its execution.

    It’s baffling.  Perhaps the company's success has caused the leadership to become too myopic, to believe that their way is the right way and the only way, and to forget how to read a room.  No company stays on top forever; that's the cyclical nature of things.  It will be interesting to see if this chapter marks the start of the inevitable decline the company will experience.
  • Reply 28 of 71
    Rayz2016 Posts: 6,957member
    As usual, Rene Ritchie nails the problem and comes up with a solution:

    https://youtu.be/4poeneCscxI

    Scan the photos on an intermediate server before they’re sent to iCloud. That would remove the on-device scanner that everyone is concerned about. 

    Why they won’t do this:

    That would require more servers. It’s much cheaper for Apple to use the resources on your device to do the scanning for them. 

    If they want to add other goodies, such as activity tracking, later on, then they can only really do this on-device. 
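    Roughly, the difference being proposed looks like the sketch below. This is a hypothetical illustration in Swift, not Apple's code: all of the names and types are made up, and a real system would compare perceptual hashes such as NeuralHash rather than SHA-256, which only matches byte-identical files. The point is simply that the same comparison can run on the device before upload or on a server after upload.

    ```swift
    import CryptoKit
    import Foundation

    // Hypothetical stand-ins; these names and types are illustrative only.
    struct Photo { let data: Data }

    func loadHashDatabase() -> Set<String> { [] }          // opaque list of known-image hashes
    func sendToICloud(_ photo: Photo, flagged: Bool) { }   // device-side upload
    func storeInICloud(_ photo: Photo) { }                 // server-side storage
    func flagForHumanReview(_ photo: Photo) { }            // server-side escalation

    // Stand-in for a perceptual hash. SHA-256 is used here only to keep the
    // sketch self-contained; it is not what a real matching system would use.
    func contentHash(of photo: Photo) -> String {
        SHA256.hash(data: photo.data).map { String(format: "%02x", $0) }.joined()
    }

    let knownHashes = loadHashDatabase()

    // On-device flow: the comparison runs on the phone before the photo leaves it.
    func uploadFromDevice(_ photo: Photo) {
        let matched = knownHashes.contains(contentHash(of: photo))
        sendToICloud(photo, flagged: matched)
    }

    // Intermediate-server flow (Ritchie's suggestion): the phone uploads
    // unconditionally and the comparison runs on Apple's hardware instead.
    func receiveOnServer(_ photo: Photo) {
        if knownHashes.contains(contentHash(of: photo)) {
            flagForHumanReview(photo)
        }
        storeInICloud(photo)
    }
    ```

    Either way the matching logic is identical; the argument is only about which machine runs it, and who pays for the compute.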
  • Reply 29 of 71
    Rayz2016 said: Scan the photos on an intermediate server before they’re sent to iCloud. That would remove the on-device scanner that everyone is concerned about. 
    And what is the "concern" based on? Absolutely nothing beyond unfounded hypotheticals. It's like claiming that someone buying a computer is a slippery slope to criminal hacking, so computer purchasing shouldn't be allowed. Anyone can come up with hypothetical negative results. It's meaningless. 
  • Reply 30 of 71
    flydog Posts: 1,123member
    lkrupp said:
    In this brave new world, where facts are irrelevant and perception and spin are all that matter, the horse is already out of the barn and nothing Apple says or does will change the perception being spun in the news media and tech blogs.
    Perception and facts are one and the same. Apple, the company that put up billboards proclaiming that what is on your phone is no one else’s business AND alleged that allowing a backdoor for government would compromise security, will be scanning its users’ photos to determine whether they committed a crime. 
  • Reply 31 of 71

    Apple, this CSAM detection of yours is crazy! Even if it is used as advertised - people are going to be scared. Scared that you will yet again break your promise and introduce a ‘special version’ for the totalitarian regimes you so often submit to. On behalf of the crazy ones, the misfits, the rebels, the troublemakers, the round pegs in the square holes… the ones who see things differently yadi yada - I urge you to please please reconsider. People should not have to live in fear because their devices are under constant and warrantless surveillance. 

  • Reply 32 of 71
    flydog said: Perception and facts are one and the same. Apple, the company that put up billboards proclaiming that what is on your phone is no one else’s business AND alleged that allowing a backdoor for government would compromise security, will be scanning its users’ photos to determine whether they committed a crime. 
    Will be? Cloud services have always reserved the right to screen the contents uploaded to the cloud. It's in the user agreement. If iCloud is turned off, then there isn't any screening.
  • Reply 33 of 71
    ErikSomething said: We do not want to live in constant fear because the content that is on our private devices is under constant and warrantless surveillance. 
    If you believe your device is really under warrantless surveillance, go to court and prove it. If you can't prove it, then it's a meaningless hypothetical. See how that works? Anything and anyone can be used to dream up hypothetical crimes. Your home? Someone could hypothesize that it's a front for selling drugs or that your basement is a torture chamber. But that's just b.s. if there's no proof. 
  • Reply 34 of 71
    To quote Apple in 2019, "What happens on your iPhone, stays on your iPhone."  If Cook and company are willing to break that promise, it's safe to assume they won't honor their promise to only scan for CSAM.  As much as Apple's biggest defenders try to spin this betrayal as being noble, CSAM is just the proverbial Trojan Horse.   In 2021, Apple's new mantra will be, "You must surrender privacy under the guise of protecting children."
  • Reply 35 of 71
    markbyrn said:
    To quote Apple in 2019, "What happens on your iPhone, stays on your iPhone."  If Cook and company are willing to break that promise, it's safe to assume they won't honor their promise to only scan for CSAM.  As much as Apple's biggest defenders try to spin this betrayal as being noble, CSAM is just the proverbial Trojan Horse.   In 2021, Apple's new mantra will be, "You must surrender privacy under the guise of protecting children."
    To quote the iCloud user agreement, "However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable."
  • Reply 36 of 71
    Apple sees all of the legislation developing in Washington that is, from their perspective, undesirable (this week’s proposed App Store legislation, for example), and wants to do things that gain it favor with the legislators. They want to say to the DC crowd: look at all of the benefits our ecosystem brings, so you really shouldn’t start mucking around with what we’ve put together. Sure, child porn is an abhorrent problem, but the ends don’t necessarily justify the means here. This is a classic example of “if you aren’t breaking the law you have nothing to worry about.” And so we go deeper down the road of the surveillance state.
  • Reply 37 of 71
    dewme Posts: 5,332member
    flydog said: Perception and facts are one and the same. Apple, the company that put up billboards proclaiming that what is on your phone is no one else’s business AND alleged that allowing a backdoor for government would compromise security, will be scanning its users’ photos to determine whether they committed a crime. 
    Will be? Cloud services have always reserved the right to screen the contents uploaded to the cloud. It's in the user agreement. If iCloud is turned off, then there isn't any screening.
    Yes, you are correct about the terms and conditions of iCloud storage. It’s all there in clear text for everyone to see in the end user license agreement, which 99.9999% of users, or at least non-lawyers, never read. I expect most people are like me: they click on the Agree checkbox to make that damn dialog box go away so they can get on with whatever it was they were trying to do, like installing a software update on their device. Yeah yeah yeah, just do it, and leave me happily wallowing in my world of blissful ignorance. You do it, I do it, we all do it.

    But anyone who’s been within arm’s length of a computer in the past 40 years knows in their heart of hearts and constantly withering brain that there’s some crazy shit in that EULA that they’ve just agreed to that they hope will never come back to bite them in the ass. That’s where the “perception is reality” term applies, inside of one’s tiny brain, where reality can be suspended or morphed to fit whatever occasion and rationalization one desires at the time. It’s a marvelous hunk of head meat that keeps us about as happy as we need to be for the most part.

    I can understand some folks being perplexed by the discrepancy between Advertising and Reality, because in some pseudo universe, they’ve never encountered a case where advertising was even a little bit different than reality, you know, they actually did grow wings and start flying after downing a Red Bull, or were suddenly surrounded by the Swedish Bikini Team after buying some cheap ass watery beer.  Personally, I’ve never seen either of these outcomes, but I’ll keep chasing the promises.

    What I’d like to see come out of this frothfest is an updated privacy nutrition label sort of thing on all Apple devices that peels back the onion on the obscure EULAs and simply states exactly what Apple is free to do on your device and the hard and soft linkages between your device and cloud services. Putting it in worst case and pessimistic form might be good. Since the iCloud services on your device are inextricably bound to the iCloud servers in the cloud, whether Apple scans your content on your device or in the cloud doesn’t really matter. It’s an end-to-end service. If you want to stop the scanning, you have to stop the service entirely.

    If Apple goes down this “pessimistic” privacy nutrition label path, there’s no reason other vendors of similar cloud storage services, like Amazon, DropBox, Google Drive, and OneDrive, should not follow.
  • Reply 38 of 71
    I can’t believe that so many other people ever believed Apple’s claims about privacy.
  • Reply 39 of 71
    mobird Posts: 752member
    I thought that WWDC was in part to announce the forthcoming new OSes and all of their new features and capabilities?
    Why was the CSAM "feature" not mentioned?
    Deliberate deception on Apple's part?

    I wonder how well the iOS 15 beta is being adopted.
  • Reply 40 of 71
    mfryd Posts: 216member
    Apple has previously taken the reasonable position that if it is technically possible to abuse a system, then it will be abused.

    Consider the idea of a "back door" on iPhone encryption.  Obviously, it would only be used in cases where there was a court order.  That would require a sworn law enforcement officer to present legally obtained evidence to a judge showing reasonable cause to believe that a crime had been committed and that a search warrant should be issued.  There are a number of checks and balances in this system, yet Apple knows that it can be abused.

    Therefore Apple has refused to create a back door, so that if a court orders them to unlock a phone, they can legitimately claim that this is not something they are capable of doing.  Apple knows that it is technically possible for a back door to be abused, and therefore Apple understands that if it exists, it will be abused at some point.

    When it comes to a back door for searching private photo libraries, Apple is trying to create their own system of checks and balances for this new "back door".  The problem is that once the back door exists, they can no longer refuse a government order on the grounds that the request is not possible.  If Apple receives a court order to add additional items to the blacklist, they cannot legally refuse.   Thus any government can order Apple to scan private photo libraries for forbidden images.   Imagine if a Muslim country ordered Apple to add hashes of images of Mohamed to the blacklist.  If Apple has a presence in that country, they are bound to follow the laws in that country.

    Apple is wrong if they think they can create a backdoor that only Apple controls, and that they can prevent governments from tapping into this backdoor.
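    To make the technical side of that point concrete, here is a minimal, hypothetical sketch in Swift (invented names and types, not Apple's implementation): the matching logic only ever sees opaque hash digests, so nothing on the device can tell whether a given entry targets CSAM, an image of Mohamed, or anything else, and extending the list is purely a data change rather than a new, user-visible capability.

    ```swift
    import Foundation

    // Hypothetical types for illustration only.
    struct HashListEntry: Hashable {
        let digest: Data   // an opaque perceptual-hash digest; what it depicts is invisible here
    }

    struct BlindedHashList {
        private(set) var entries: Set<HashListEntry> = []

        // Growing the list is just data shipped to the device, not a new
        // feature that would require a software update a user could notice.
        mutating func add(_ digest: Data) {
            entries.insert(HashListEntry(digest: digest))
        }
    }

    // The device-side check is the same regardless of what the hashes represent.
    func matches(_ photoDigest: Data, against list: BlindedHashList) -> Bool {
        list.entries.contains(HashListEntry(digest: photoDigest))
    }
    ```

    That is the crux of the objection: the checks and balances live in who curates the list, not in the code that does the matching.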
