Apple details user privacy, security features built into its CSAM scanning system

Comments

  • Reply 41 of 71
    robaba Posts: 228 member
    markbyrn said:
    To quote Apple in 2019, "What happens on your iPhone, stays on your iPhone."  If Cook and company are willing to break that promise, it's safe to assume they won't honor their promise to only scan for CSAM.  As much as Apple's biggest defenders try to spin this betrayal as being noble, CSAM is just the proverbial Trojan Horse.   In 2021, Apple's new mantra will be, "You must surrender privacy under the guise of protecting children."
    Sorry, but that’s BS. They aren’t running checks on what you keep on your phone, only what you send in an uplink to your iCloud account. This is in preparation, I believe, for the complete end-to-end encryption of everything that leaves your phone. This is also not a case where they are scanning and recording the photos on your phone. They hash all the content and compare it to hashes of known material. Surveillance state this is not!
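
    For anyone who wants the distinction in concrete terms, here is a minimal sketch of that hash-and-compare idea, with SHA-256 standing in for Apple's perceptual NeuralHash and every name invented for illustration (this is not Apple's actual implementation):

    ```python
    # Illustrative only: SHA-256 stands in for a perceptual hash, and the
    # "database" below is a made-up placeholder. Photos that never enter the
    # iCloud upload queue are never hashed or compared at all.
    import hashlib

    known_hashes = {hashlib.sha256(b"example-known-image").hexdigest()}  # stand-in database

    def photo_hash(photo_bytes: bytes) -> str:
        """Fingerprint the photo; the photo content itself is never stored or sent."""
        return hashlib.sha256(photo_bytes).hexdigest()

    def matches_in_upload_queue(upload_queue: list[bytes]) -> int:
        """Only photos headed to iCloud are checked against the known-hash list."""
        return sum(photo_hash(p) in known_hashes for p in upload_queue)
    ```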
  • Reply 42 of 71
    robaba said:
    markbyrn said:
    To quote Apple in 2019, "What happens on your iPhone, stays on your iPhone."  If Cook and company are willing to break that promise, it's safe to assume they won't honor their promise to only scan for CSAM.  As much as Apple's biggest defenders try to spin this betrayal as being noble, CSAM is just the proverbial Trojan Horse.   In 2021, Apple's new mantra will be, "You must surrender privacy under the guise of protecting children."
    Sorry, but that’s BS. They aren’t running checks on what you keep on your phone, only what you send in an uplink to your iCloud account. This is in preparation, I believe, for the complete end-to-end encryption of everything that leaves your phone. This is also not a case where they are scanning and recording the photos on your phone. They hash all the content and compare it to hashes of known material. Surveillance state this is not!
    Apple needs to pull the check from the device, even if it only occurs when uploading to iCloud. I will never accept this kind of spyware on one of my devices!
    If they really believe this is necessary, they should make it part of their cloud infrastructure (it would cost them some $$$ to calculate all the hashes)!
  • Reply 43 of 71
    The cause is good and just, and Apple's solution is glitzy.  I'm just not going to be a perpetual suspect for the possession and transmission of child pornography by having blinded hashes from NCMEC on my phone.

    That's it. Because that is the same kind of treatment real-life pedos receive when they get out of prison and want to have a computer and smartphone: the FBI installs software to keep an eye on what they're doing.

    Sorry Apple, the pedo experience isn't happening here.
  • Reply 44 of 71
    mfryd said:
    Apple has previously taken the reasonable position that if it is technically possible to abuse a system, then it will be abused.

    Consider the idea of a "back door" on iPhone encryption. Obviously, it would only be used in cases where there was a court order. That would require a sworn law enforcement officer to present legally obtained evidence to a judge showing reasonable cause to believe that a crime had been committed and that a search warrant should be issued. There are a number of checks and balances in this system, yet Apple knows that it can be abused.
    Apple has never claimed that it's the U.S. government that would be abusing a "backdoor" for encryption. Apple has also never filed a lawsuit claiming they had evidence of the U.S. government abusing subpoenas for data in the cloud. I'm sure they wish that they had evidence of that because it would make their objections to the encryption backdoor easier. 
  • Reply 45 of 71
    bbh Posts: 134 member
    jdw said:
    Apple is still not making the majority of people feel comfortable with its plan, and as such Apple has the moral obligation to at the very least DELAY its plan until it can better convey to the public why they will be 100% protected. That remains true even if some contend this is being blown out of proportion.

    Apple needs to explain IN DETAIL how it will proactively help anyone falsely accused of engaging in illicit activities seeing that Apple would be primarily responsible for getting those falsely accused people in that situation in the first place.  That discussion must include monetary compensation, among other things.

    Apple needs to explain how it intends to address the mental toll on its own employees or contracted workers who will be forced to frequently examine kiddy porn to determine if the system properly flagged an account or not.  THIS IS HUGE and must not be overlooked!  On some level it is outrageous that the very content we wished banned from the world will be forced upon human eyes in order to determine if a machine made a mistake or not.

    Only when these points have been adequately addressed should Apple begin implementing their plan or a variant of it. The next operating system release is too soon.

    Apple truly has done a terrible job of explaining what is actually happening and what actually happens downstream of a photo being flagged. First, nobody is actually looking at your photo or any other photo. They are comparing a CSAM hash with a hash generated when your photo is uploaded. If your "photos" cross the threshold of too many suspect photos, a human confirms that the HASH of the suspect photos matches the CSAM hash. Nobody ever looks at your photo. This is not a case of someone looking at your pictures and declaring that "to me, this is child pornography".
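
    As a rough sketch of that threshold step (the names below are invented; the figure of 30 matches is the number Apple has cited, mentioned elsewhere in this thread):

    ```python
    # Hypothetical sketch of a per-account match threshold. Nothing is surfaced
    # for human confirmation until enough independent matches have accumulated.
    MATCH_THRESHOLD = 30  # Apple's stated ballpark before any review is triggered

    class AccountMatchState:
        def __init__(self) -> None:
            self.matched_hashes: set[str] = set()

        def record_match(self, suspect_hash: str) -> None:
            # Only the hash of the matching photo is recorded, never the photo.
            self.matched_hashes.add(suspect_hash)

        def ready_for_review(self) -> bool:
            # Below the threshold, the account is never flagged at all.
            return len(self.matched_hashes) >= MATCH_THRESHOLD
    ```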

    And, if you are still unconvinced and wrapped up in the hysteria surrounding this issue, all you have to do is go to Settings > Photos > iCloud Photos and slide the button to the OFF position. There are numerous ways to back up your photos and have them available on numerous platforms anywhere in the world without using iCloud Photos. But unless you totally rely on your own personal backup solution (I have a 2-drive Synology NAS), chances are any other place you choose to remotely store your photos is most likely using CSAM scanning already.
  • Reply 46 of 71
    bbh Posts: 134 member
    The motivation behind this, in addition to the outright moral repugnancy of kiddy porn, is that Apple simply does not want their servers to host that stuff. What's wrong with that? If you want to use their servers (your choice), you must go through the CSAM filter.

    YOU still control your data. Choose to use their servers or don’t. This hysteria is unbelievably overwrought.  
  • Reply 47 of 71
    Marvin Posts: 15,486 moderator
    Rayz2016 said:
    As usual, Rene Ritchie nails the problem and comes up with a solution:

    https://youtu.be/4poeneCscxI

    Scan the photos on an intermediate server before they’re sent to iCloud. That would remove the on-device scanner that everyone is concerned about. 

    Why they won’t do this:

    That would require more servers. It’s much cheaper for Apple to use the resources on your device to do the scanning for them. 

    If they want to add other goodies, such as activity tracking, later on, then they can only really do this on-device. 
    Apple is trying to make a system where nobody but the user has access to their photos; an intermediate server has the same flaws as doing it on Apple's server. It's even worse if that server is run by a 3rd party that can be trusted less.

    One way to move the database off the user-side would be to encrypt the images using the hash of the image. That way the image can only be decrypted using a matching hash in the database. However, to find if there's a match with only encrypted images, they'd have to brute force every image uploaded against every one of millions of hashes in the database to see if it decrypted. If they bundled the full hash with every image, they'd be able to decrypt every image.

    They could bundle a portion of the hash of each image like 25% of the hash and check if that part of the hash exists in the database. This way they can't decrypt any image as they have no idea what the rest of the hash is but they can quickly find a matching hash in the database and see if that decrypts the image.

    - on uploading a photo to iCloud, generate hash 00a8f7...8927g
    - encrypt the photo using the hash as the key
    - crop the hash to 25% on the device (e.g. 00a8f7…) and send the cropped hash to iCloud along with the encrypted image
    - for full restoration of images on a user's other device, the full hashes would also have to be stored in iCloud, encrypted with the user's account key

    - server receives encrypted image and 25% of the unencrypted hash and 100% of encrypted hash
    - server checks the 25% against the hashes in the database for matches
    - on finding a match (or multiple), it tries to decrypt the image using the full hash in the database
    - on successful decryption, add it to the threshold score for eventual human verification

    This also has the advantage that it doesn't require keeping the databases updated on users' devices. It would be harder to do partial matches, but if they are only looking for exact matches, it would be ok.
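
    A minimal sketch of that partial-hash scheme, assuming SHA-256 in place of a perceptual hash and AES-GCM for the encryption (all names are invented for illustration; this is the idea described above, not Apple's design):

    ```python
    # The image is encrypted with its own hash as the key, only 25% of the hash
    # travels in the clear, and the server can decrypt a photo only if its full
    # hash is already in the database. Requires the "cryptography" package.
    # SHA-256 only demonstrates exact matches, per the caveat above.
    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    NONCE = b"\x00" * 12  # a fixed nonce is tolerable here only because each key is unique per image

    def device_prepare_upload(image: bytes) -> tuple[str, bytes]:
        full_hash = hashlib.sha256(image).hexdigest()
        ciphertext = AESGCM(bytes.fromhex(full_hash)).encrypt(NONCE, image, None)
        partial_hash = full_hash[: len(full_hash) // 4]  # send only 25% of the hash
        return partial_hash, ciphertext

    def server_check(partial_hash: str, ciphertext: bytes, database: set[str]) -> bool:
        """Return True if the upload decrypts with a full hash from the database."""
        for candidate in (h for h in database if h.startswith(partial_hash)):
            try:
                AESGCM(bytes.fromhex(candidate)).decrypt(NONCE, ciphertext, None)
                return True   # decryption succeeded: exact match, add to the threshold score
            except Exception:
                continue      # partial-hash collision, not actually the same image
        return False          # no match: the server cannot read the photo at all
    ```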

    Taking the database, scanning, and flagging off the user devices is largely psychological; the same process is happening overall, but it might help put people's minds at ease.

    It's easy to see why Apple would want to have this system. They get requests for iCloud account access for suspects. An abuser would be taking images/videos with their iPhone, so they go directly into the library. They share these online, and law enforcement finds them during investigations and adds them to the database. They don't always know who it is, though. By the time they eventually follow leads to an iCloud account request, there could have been months of abuse in between. Apple would see that if their system had flagged the iCloud account as soon as the database was updated, they could have prevented months of abuse.
  • Reply 48 of 71
    entropys Posts: 4,306 member
    It seems Apple has realised it’s poked a hornets’ nest and is making the hash algorithm more complex as a PR disaster management move. Too late.

    It doesn’t matter. The issue is killing its privacy reputation regardless of how much they want to pretend it’s a “won’t someone think of the children!!!!” move.  

    Now of course, if Apple backs down from its hubristic overreach and removes this bit of Big Brother enablement it will be portrayed as protecting kiddie fiddlers. What a completely self inflicted disaster!

    But if it does include this hash algorithm, will it be even 24 months before the algorithm gets some minor tweaks and it is pointed at a separate hit list containing hashes of photos of some or all of the following:

    Mohamed
    Winnie the Pooh
    a certain incident in Tiananmen Square
    A political dissident of the establishment.  Maybe even in the USA.

    Have a few well-known pics of them in your Photos library (well, 30 according to Apple. Never, ever less, guffaw), and you get on a list yourself.



  • Reply 49 of 71
    hogman Posts: 25 member
    F_Kent_D said:
    I 100% agree with everyone's concern and disapproval of this feature. However, I have nothing to hide as far as child pornography or anything of the sort. I have 3 daughters and would rather they not receive any pornographic texts or communication from anyone, and this is to help keep that from happening. Today's kids are chatting and messaging with no telling who on online games, and I've found that one of my daughters was suckered into doing things that shouldn't have been done as a 10-year-old. She's been warned, but I'm unable to warn the other party. I'm not 100% happy about all of this scanning, but at the same time I have young girls, and if there's a way to protect them, I will accept the protection against sex trafficking and other improper activities via messaging.
    Protect your daughters by treating them the way the billionaires who built this stuff treat their own kids: don't let them use it.
  • Reply 50 of 71
    mobird Posts: 759 member
    Where is Tim Cook? This is his doing.

    I don't care about the technology or the methods that Apple is using; it should not be on my phone. They can check their own iCloud servers for CSAM. No end user was asking for this.

    Why was this not announced at WWDC? I guess they can quit that farce going forward.

    Mentioned in a previous post-
    "People are getting bogged down arguing the hows, but It's the principle that is the heart of the dispute, not the tech or its execution."
  • Reply 51 of 71
    mobird said:
    They don't get it!!! We DO NOT WANT THIS. I don't give a rat's ass on their "white paper".
    Who is “we”?
  • Reply 52 of 71
    coolfactor Posts: 2,338 member
    Technology is needed to keep the technology landscape safe for everyone.

    It's not good enough to let anybody do whatever they want wherever they want. That's not freedom, that's an invitation for abuse.

    We all call 9-1-1 when we need emergency services. We expect that help to be there. We expect that "system" to work to protect us. Apple creates a system to protect [young] people that only runs on their devices and their network (which they are fully entitled to do) and suddenly everyone is up in arms.

    Millions install and run antivirus software on their non-Apple computers every day. What do you think that software is doing? Yes, scanning _all_ of your files, looking for matches to their own virus database. This is expected and "normal".
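
    For comparison, this is roughly the shape of a signature-based antivirus scan (a toy sketch with invented names; real engines add heuristics, but the file-versus-database matching is the same idea):

    ```python
    # Toy illustration of signature scanning: every file under a directory is
    # read and its fingerprint is checked against a local signature database.
    import hashlib
    from pathlib import Path

    virus_signatures = {hashlib.sha256(b"example-malware-sample").hexdigest()}  # stand-in DB

    def scan_tree(root: str) -> list[Path]:
        """Return the files whose fingerprints match the signature database."""
        flagged = []
        for path in Path(root).rglob("*"):
            if path.is_file():
                if hashlib.sha256(path.read_bytes()).hexdigest() in virus_signatures:
                    flagged.append(path)
        return flagged
    ```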

    I trust that Apple has built this system entirely around privacy. Nobody is looking at your photos unless your content crosses very concerning thresholds. I'd much prefer that Apple build this system than be forced to install a technology built by a government or third-party.
  • Reply 53 of 71
    coolfactor Posts: 2,338 member
    This is not enough. Apple has proven on several occasions that their “privacy-first mantra” is just marketing, when you look at China, Russia and other countries.
    This system can be abused locally to search or collect data. I want Apple to at the very least state that they will never, ever do it, and that if they do, they are fine with the worldwide legal implications/liability.

    Secondly, it’s MY device that I paid good money for, and Apple allows me ZERO options to replace Photos with another app. It’s not as if I can seamlessly switch and have my camera and file browsing default to this new app (another topic, but still relevant here).

    Of course nobody wants child porn, but Apple is not the police, nor did I choose to have my photos scanned.

    Screw this company and their hypocritical culture. I’m heavily invested in their hardware and software, but they are simply not the company I used to respect anymore.


    Wrong. You have many non-Apple choices for camera apps, photo management and cloud storage. Those options may not be as deeply integrated, but they exist. You just need to do your research.

    Furthermore, you don't need to use iCloud Photo storage (which this system requires to even operate). You could set up a Shortcut to sync your on-device photo library with some third-party system. You have options. Apple is not the enemy here, no matter how you want to perceive it. There's no privacy invasion unless you're abusing their cloud storage, and then you deserve what you get. :wink: 
  • Reply 54 of 71
    Rayz2016 Posts: 6,957 member
    John Kheit thinks the press has given Apple an easy ride with this. Not sure I agree. The press has reported what they’re seeing. 

    https://www.macobserver.com/columns-opinions/devils-advocate/apple-broke-privacy-promises-hearts/

  • Reply 55 of 71
    longfang said:
    mobird said:
    They don't get it!!! We DO NOT WANT THIS. I don't give a rat's ass on their "white paper".
    Who is “we”?
    Going by the current polls some sites are running on Twitter, "we" is about 50%, with 15% considering not using Apple devices anymore!
  • Reply 56 of 71
    Ok. So basically, Apple has decided to do an illegal search and seizure of our photos, have their system analyze them for what a government-sponsored non-profit determines might be criminal (which of course can't ever be hacked or manipulated, as that's never happened to any company or database /s), and then a non-police-officer human will look at your photos and make a legal determination about them before taking you to the police. This is completely unconstitutional, as laid out in the Fourth Amendment:

    4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."

    Where's the warrant that gives anyone the right to look at your personal photos, be it on your phone, in your server storage, or in your personal home? This is also such a clear violation of the precept of innocent until proven guilty, as everyone is suspected guilty and therefore searched. Apple is not law enforcement, nor have they been empowered by law enforcement, nor do they have a legal warrant issued by a judge to search your property.

    Apple has really opened the door here for so many violations it's not even funny. And this is the company that would not help unlock the phone of a terrorist, but will now scan every single innocent person's phone, you know, "for the children". That excuse has always covered up so many sins.

    You know, since Apple has decided to look at our photos, and one of their employees may end up looking at your personal photos should their magical system of identification accidentally flag one of them, how about we all get a view of the personal photos of Tim Cook and all other Apple employees? It's only fair, isn't it?

    Just pondering...
  • Reply 57 of 71
    Ok. So basically, Apple has decided to do an illegal search and seizure of our photos
    Apple's User Agreement for iCloud already states that they have the right to screen the files being uploaded to iCloud servers. That's been in place for years. If you think that's illegal, file a lawsuit. Just keep in mind that all the mainstream cloud services have terms like that. Also keep in mind that voluntarily choosing to use iCloud servers for your files means that they aren't restricted to your own property anymore. That makes it unlikely that a search/seizure claim would apply. 
  • Reply 58 of 71
    Business-wise, this "feature" makes sense for maintaining access to the declining Chinese market (and other totalitarian markets). I just wish they would've made it a PRC-only "feature", like they did with the iCloud encryption keys of Chinese customers.
  • Reply 59 of 71
    Ok. So basically, Apple has decided to do an illegal search and seizure of our photos
    Apple's User Agreement for iCloud already states that they have the right to screen the files being uploaded to iCloud servers. That's been in place for years. If you think that's illegal, file a lawsuit. Just keep in mind that all the mainstream cloud services have terms like that. Also keep in mind that voluntarily choosing to use iCloud servers for your files means that they aren't restricted to your own property anymore. That makes it unlikely that a search/seizure claim would apply. 
    They should just move this whole mess into the cloud; then there is no way it can be exploited on your device, and people will not fear that some day this feature will be expanded to local photos.
    Personally, I will not update my iPhone 12 Pro to iOS 15 (I just restored back to 14.7.1 from the latest beta) and will hold off on purchasing the iPhone 13 Pro (I have purchased a new one every single year until now) until this is sorted out.
  • Reply 60 of 71
    Ok. So basically, Apple has decided to do an illegal search and seizure of our photos, have their system analyze them for what a government-sponsored non-profit determines might be criminal (which of course can't ever be hacked or manipulated, as that's never happened to any company or database /s), and then a non-police-officer human will look at your photos and make a legal determination about them before taking you to the police. This is completely unconstitutional, as laid out in the Fourth Amendment:

    4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."

    Where's the warrant that gives anyone the right to look at your personal photos, be it on your phone, in your server storage, or in your personal home? This is also such a clear violation of the precept of innocent until proven guilty, as everyone is suspected guilty and therefore searched. Apple is not law enforcement, nor have they been empowered by law enforcement, nor do they have a legal warrant issued by a judge to search your property.

    Apple has really opened the door here for so many violations it's not even funny. And this is the company that would not help unlock the phone of a terrorist, but will now scan every single innocent person's phone, you know, "for the children". That excuse has always covered up so many sins.

    You know, since Apple has decided to look at our photos, and one of their employees may end up looking at your personal photos should their magical system of identification accidentally flag one of them, how about we all get a view of the personal photos of Tim Cook and all other Apple employees? It's only fair, isn't it?

    Just pondering...
    There's nothing illegal about it. Apple is a private entity that you have given access to your records. They can voluntarily give any and all of your information to law enforcement, even if you have a “contract” with them not to, because that contract would be void. The 4th Amendment only protects you if the government is attempting to get information and the entity that has it resists. They can voluntarily give whatever information they have to law enforcement at any time.