New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM


Comments

  • Reply 81 of 98
    nizzard said:
    nizzard said:
Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now), and no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens, but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future?

    No.

And they'll still get up on stage in September and boast about their "end to end" encryption and all their bullshit privacy advocacy. It's all bullshit now. I don't trust them any more than Google or Facebook.
    What back door? Tell me how my privacy is compromised when I have 0 CSAM. 

Your assertion that nobody will upload CSAM to iCloud isn't backed up by how things have worked to date. Apple has already reported instances of it. It was a relatively low number, in the mid-200s I think. Meanwhile, Facebook reported over 20 million instances in the same year. So those numbers don't prove that the people sharing or storing CSAM are particularly clever.

By the way, you say they won't be saving their photos to iCloud, which backs up that this is an opt-in implementation. If you're worried, turn it off.

They are building in the capability to scan the contents of messages prior to encryption and transmission. THAT'S what people are MOST upset about. Now that the capability to read/analyze messages prior to secure transmission exists, the fear is that ANY gov org can/will pressure Apple to allow them access to that capability to surveil people. THAT'S the concern here.
    Now I don’t know what you’re referring to. 

    There are two separate but similar initiatives here. One is comparing hashes to known CSAM before photos get uploaded to iCloud Photo Library. The other is analyzing photos (only photos) in Messages for content that may be inappropriate for children under 13. 

Both of these are opt-in. One of them (the first) may flag photos for review by a human if a certain threshold of CSAM matches is met. The other doesn't notify anyone, including Apple, except the owner of the account.

Which of these do you think is a back door into my phone, and how do you think it will affect people who don't have CSAM (keeping in mind that nothing will be reported and everything will be as it is right now as long as there's no CSAM)?

There are lots of theories going around in this and other threads about what could happen. All of those scenarios could happen right now, server-side, and haven't.
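For anyone trying to picture that threshold mechanism, here is a toy Python sketch of the idea only. It is emphatically not Apple's actual protocol (the real system uses NeuralHash, a blinded hash database, private set intersection, and threshold secret sharing, so neither the device nor Apple can count matches the way this plain version does); every name and number below is invented for illustration.

```python
# Toy sketch of threshold-based flagging -- NOT Apple's actual protocol.
# All hash values and the threshold are placeholders for illustration only.

KNOWN_IMAGE_HASHES = {"hash_001", "hash_002", "hash_003"}  # hypothetical blocklist
REVIEW_THRESHOLD = 30  # placeholder; Apple sets the real value

def should_flag_for_human_review(upload_queue_hashes):
    """Return True only when the number of blocklist matches reaches the threshold."""
    matches = sum(1 for h in upload_queue_hashes if h in KNOWN_IMAGE_HASHES)
    # Below the threshold nothing is reported to anyone, including Apple.
    return matches >= REVIEW_THRESHOLD

# A single match does not trigger anything:
print(should_flag_for_human_review({"vacation_photo", "hash_001", "cat_picture"}))  # False
```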
  • Reply 82 of 98
mejsric Posts: 152 member
    macplusplus said:

That article is an argument against Apple, not in favor of it. Since they are already performing that scan on their servers, what is the point of injecting another mechanism into the device itself? I have no authority over iCloud servers, those are Apple's property, but I do have authority over my device and I don't want it to be used to inspect me. This is no different from planting a camera in your house to monitor whether you abuse your children or your wife.


    This!!!

It's only a matter of time before always-listening Siri can be used to monitor potential abuse she hears, and Face ID is always watching you now.
edited August 2021
  • Reply 83 of 98
baconstang Posts: 1,103 member
    So they're going to install this spyware on your iPhone and Mac to stop child porn.  It won't be used for anything else, they say.
    I don't buy the lack of mission creep at all, but I have another question.

    The jerks making new porn don't have to worry about matching hashes of extant material, because they just created it.   So how does this protect "the children"?   Or is there something else they're not talking about?

    This part of the plan reminds me a little of the South Park gnomes that stole underwear.    
    1)   Steal underwear.
    2)   ???
    3)   Make money
    edited August 2021
  • Reply 84 of 98
Rayz2016 Posts: 6,957 member
    nizzard said:
I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It's all Dept of Justice. "Build this in and you won't get smoked in that case."
    That thought had crossed my mind. But my guess is that Apple offered this to the DOJ, and the DOJ said, “Yeah! Gimme! Gimme!”

    Then while showing them how it works, Apple mentioned that it could only work if Apple had complete control of the apps that are loaded on to the phone: “If you allow alternative app stores then how can we check for apps that try to bypass it?”
  • Reply 85 of 98
crowley Posts: 10,453 member
    Rayz2016 said:
    nizzard said:
I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It's all Dept of Justice. "Build this in and you won't get smoked in that case."
    That thought had crossed my mind. But my guess is that Apple offered this to the DOJ, and the DOJ said, “Yeah! Gimme! Gimme!”

    Then while showing them how it works, Apple mentioned that it could only work if Apple had complete control of the apps that are loaded on to the phone: “If you allow alternative app stores then how can we check for apps that try to bypass it?”
    It doesn't work for other apps on the iPhone that are from the App Store.
  • Reply 86 of 98
Rayz2016 Posts: 6,957 member
    elijahg said:
    elijahg said:
    Rayz2016 said:
    elijahg said:
Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they're going to refuse? Of course they won't. If any government said the data it provided were hashes of CSAM material, who's Apple to say it's not?
That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

Let's not forget that US law requires tech companies to report instances of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don't need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone they want in China and then, under the guise of CSAM, find all the photos they want of this person?

There is literally no point in encrypting backups if Apple has betrayed the trust of their customers by inserting this spyware. What's the point of end-to-end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time?
That isn't how hashes work. Hashes find the exact same photograph, not a photograph that is similar. So your imagined scenario, where the US government uploads a hash of a photo of someone they are looking for and in return gets all photos of that person, is not how it works. The uploaded hash would only help to find positive matches of that exact same photo.

    Also, as has been mentioned several times already, everyone can opt out.
Try reading up on what you're trying to defend, because otherwise you make yourself look pretty stupid. The matching is fuzzy; it looks for similar photos: ones that have been cropped, mirrored, had a couple of pixels changed, blurred, etc.
Which is why Apple needs to have a universal key to decrypt your image file so it can be checked manually by someone you neither know nor trust.
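To make the point about fuzzy matching concrete, here is a minimal Python comparison of an exact cryptographic hash with a toy perceptual "average hash". Apple's NeuralHash is a neural-network-based perceptual hash and far more robust than this; the tiny 4x4 "image" and the hashing scheme below are invented purely to show why a slightly edited photo can still match.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200, 30, 220],
            [15, 210, 25, 215],
            [12, 205, 35, 225],
            [11, 198, 28, 230]]

# The "same" picture after a minor edit (e.g. re-compression nudging two pixels).
edited = [row[:] for row in original]
edited[0][0] = 14
edited[3][3] = 226

# A cryptographic hash changes completely after any edit...
print(hashlib.sha256(str(original).encode()).hexdigest()[:16])
print(hashlib.sha256(str(edited).encode()).hexdigest()[:16])

# ...while the perceptual hash barely moves, so similar images still match.
print(hamming_distance(average_hash(original), average_hash(edited)))  # 0 bits differ here
```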


Hany Farid, one of the people who helped develop PhotoDNA, wrote an article for Wired saying:

    Recent advances in encryption and hashing mean that technologies like PhotoDNA can operate within a service with end-to-end encryption. Certain types of encryption algorithms, known as partially or fully homomorphic, can perform image hashing on encrypted data. This means that images in encrypted messages can be checked against known harmful material without Facebook or anyone else being able to decrypt the image. This analysis provides no information about an image’s contents, preserving privacy, unless it is a known image of child sexual abuse.


Sounds like a much better idea. What it has in common with Apple's idea is that if you don't want to be scanned then you don't send it up to the server. What it has over Apple's idea is that you know for sure there's nothing nefarious going on, because it's not running spy software on the client.
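For what it's worth, the "partially homomorphic" property Farid mentions can be illustrated with textbook RSA, which happens to be multiplicatively homomorphic. The sketch below uses tiny primes and no padding, so it is wildly insecure and is neither PhotoDNA nor anything Apple has described; it only shows what "performing a computation on encrypted data" means.

```python
# Toy illustration of a *partially homomorphic* scheme: textbook RSA is
# multiplicatively homomorphic.  Tiny primes, no padding -- insecure on purpose,
# purely to show a computation carried out on ciphertexts.

p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e, d = 17, 2753                      # e * d == 1 (mod phi)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
product_of_ciphertexts = (encrypt(a) * encrypt(b)) % n

# Multiplying the ciphertexts multiplied the hidden plaintexts:
print(decrypt(product_of_ciphertexts))  # 42 == a * b, computed without decrypting a or b
```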


  • Reply 87 of 98
Rayz2016 Posts: 6,957 member
baconstang said:
So they're going to install this spyware on your iPhone and Mac to stop child porn.  It won't be used for anything else, they say.
    I don't buy the lack of mission creep at all, but I have another question.

    The jerks making new porn don't have to worry about matching hashes of extant material, because they just created it.   So how does this protect "the children"?   Or is there something else they're not talking about?

    This part of the plan reminds me a little of the South Park gnomes that stole underwear.    
    1)   Steal underwear.
    2)   ???
    3)   Make money
I imagine the thinking is that when law enforcement receives new images, they go into the hash database.

This new database (or a subset) is then downloaded to every iPhone, iPad, and Mac, which will then do a check against images going up to iCloud.

    What they haven’t mentioned yet is how often the local database is going to be downloaded to your phone. I guess they’d want to do it as quickly as possible after a new set of images is uncovered. 

    I imagine that Apple will carry on scanning images on their servers so they don’t miss ones that are distributed before the hash database is updated. 
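A rough sketch of that flow is below, with every class and function name invented for illustration. In the design Apple has actually described, the database is blinded and matching happens through private set intersection, so a device never holds readable hashes or learns its own match results the way this toy version does.

```python
# Hypothetical sketch of the client-side flow described above; all names invented.

class OnDeviceMatcher:
    def __init__(self, known_hashes):
        self.known_hashes = set(known_hashes)

    def update_database(self, new_hashes):
        """Merge a newly distributed batch of known-image hashes."""
        self.known_hashes |= set(new_hashes)

    def check_before_upload(self, photo_hash):
        """Called per photo queued for iCloud Photos; returns whether it matches."""
        return photo_hash in self.known_hashes

matcher = OnDeviceMatcher({"hash_a", "hash_b"})
matcher.update_database({"hash_c"})                  # e.g. after a database refresh
print(matcher.check_before_upload("hash_c"))         # True
print(matcher.check_before_upload("holiday_photo"))  # False: nothing happens
```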
  • Reply 88 of 98
crowley Posts: 10,453 member
    mike54 said:

    Also, as has been mentioned several times already, everyone can opt out.
    It seems so. Today. Tomorrow? What about tomorrow? Do you think about tomorrow? I hope so.
    What about the inevitable mission creep, other government requests, etc.... tomorrow? It might pay to think about tomorrow, if not for you, for your children and their children.
Literally anything could happen tomorrow.  Apple could send the entire contents of my phone to the NSA tomorrow.  Apple have an incredible amount of potential power over my data that they can use in any number of ways, and what they've done is a highly restricted method of checking the iOS estate for evidence of child abuse in a way that still manages to protect my privacy about as much as it can while still achieving its purpose.

Apple have shown a lot of good faith by being very open, publishing detailed process documentation, and being forthright about their plans to resist any government requests for expansion.  My expectation is that if there are any changes then they'll go through this exact same process, and there will be plenty of opportunity to ditch my iPhone if they go too far.  So maybe tomorrow they'll do a complete about-face, but maybe tomorrow the world will end from an asteroid impact.  I don't waste my time hypothesising about unlikely tomorrows.
  • Reply 89 of 98
ireland Posts: 17,798 member
    Let’s all say it together: hahahahahaha!! Yeah, right.
  • Reply 90 of 98
Rayz2016 Posts: 6,957 member
    crowley said:
    mike54 said:

    Also, as has been mentioned several times already, everyone can opt out.
    It seems so. Today. Tomorrow? What about tomorrow? Do you think about tomorrow? I hope so.
    What about the inevitable mission creep, other government requests, etc.... tomorrow? It might pay to think about tomorrow, if not for you, for your children and their children.
Literally anything could happen tomorrow.  Apple could send the entire contents of my phone to the NSA tomorrow.  Apple have an incredible amount of potential power over my data that they can use in any number of ways, and what they've done is a highly restricted method of checking the iOS estate for evidence of child abuse in a way that still manages to protect my privacy about as much as it can while still achieving its purpose.

Apple have shown a lot of good faith by being very open, publishing detailed process documentation, and being forthright about their plans to resist any government requests for expansion.  My expectation is that if there are any changes then they'll go through this exact same process, and there will be plenty of opportunity to ditch my iPhone if they go too far.  So maybe tomorrow they'll do a complete about-face, but maybe tomorrow the world will end from an asteroid impact.  I don't waste my time hypothesising about unlikely tomorrows.
    No problem. 

    That’s what other people such as the EFF are doing: looking ahead to protect people’s rights and lives in the future. Because it’ll be too late in the future if they’re ignored now. 
  • Reply 91 of 98
crowley Posts: 10,453 member
    Rayz2016 said:
    crowley said:
    mike54 said:

    Also, as has been mentioned several times already, everyone can opt out.
    It seems so. Today. Tomorrow? What about tomorrow? Do you think about tomorrow? I hope so.
    What about the inevitable mission creep, other government requests, etc.... tomorrow? It might pay to think about tomorrow, if not for you, for your children and their children.
Literally anything could happen tomorrow.  Apple could send the entire contents of my phone to the NSA tomorrow.  Apple have an incredible amount of potential power over my data that they can use in any number of ways, and what they've done is a highly restricted method of checking the iOS estate for evidence of child abuse in a way that still manages to protect my privacy about as much as it can while still achieving its purpose.

Apple have shown a lot of good faith by being very open, publishing detailed process documentation, and being forthright about their plans to resist any government requests for expansion.  My expectation is that if there are any changes then they'll go through this exact same process, and there will be plenty of opportunity to ditch my iPhone if they go too far.  So maybe tomorrow they'll do a complete about-face, but maybe tomorrow the world will end from an asteroid impact.  I don't waste my time hypothesising about unlikely tomorrows.
    No problem. 

    That’s what other people such as the EFF are doing: looking ahead to protect people’s rights and lives in the future. Because it’ll be too late in the future if they’re ignored now. 
    Why?
  • Reply 92 of 98
    crowley said:
Apple have stated that they will only work with data provided by "NCMEC and other child safety groups", nothing about government (NCMEC is government-funded, but independent).  And in the system being introduced, Apple checks the photos as the second-tier review, not the government: "There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC", so you're plain wrong about that one.

    Nothing funded by government, even indirectly, is "independent", no matter what they might say.

And as for Apple checking the photos at the second tier instead of the government, all I have to say is "for now".  At some point, some fascist government will claim that such a sensitive task cannot be trusted to a corporation and must be done by the government to properly ensure privacy.  And as soon as that law is passed, Apple will comply, as they do in all such cases.
  • Reply 93 of 98
maestro64 Posts: 5,043 member
Their defense is that they would refuse authoritarian attempts, but we've seen two instances so far where Apple couldn't refuse: with iCloud in China and with FaceTime in Saudi Arabia. Setting the past aside, what's to say that the political or financial climate for Apple won't change and make it harder to say no than it is now? There may come a time when they want to say no but can't.
    The problem with hypotheticals is that they can be applied to anything and anyone. That's why lawsuits based on hypotheticals get thrown out of court, like Yahoo! suing the government over subpoenas for data. Yahoo! imagined all kinds of scenarios where handing over the data could be abused, but the court said "where's the proof the government is actually doing any of that with the data" and the case was thrown out. 
Actually, the Supreme Court just ruled on a so-called hypothetical. California had a law that required non-profits to turn over the contact information of all their donors. The stated reason for the law was that the government could look for laws being broken, such as money laundering. The argument against it was that it could be abused in any number of ways, while California argued that if you're not doing anything wrong you shouldn't have an issue. Issues like this are not clear-cut, and everyone should be worried about their privacy and their freedoms.
edited August 2021
  • Reply 94 of 98
maestro64 Posts: 5,043 member
I think everyone agrees CSAM is very bad and anyone involved in it is a bad person who should be locked up in a deep, dark place. I think everyone also agrees that invading everyone's privacy in the pursuit of bad people is a very slippery slope, open to abuse. Some people think reading a hash of a file is not an invasion, especially if someone chooses to store those files on a third party's service.

Simple analogy: think about safe deposit boxes in banks and elsewhere. Banks have clear rules about the use of those boxes, but people put things in them all the time that the government would love to know about. No one is allowed to peek in those boxes without probable cause and a warrant. Now, should banks begin to scan those boxes with some technology that doesn't require them to be opened but tells the government what's in them? Extend this to your home: what if they could scan your home and know what you have inside? This is not hypothetical, since most people 20 years ago did not believe you could write a program that reads an electronic photo and recognizes it for what it is.

As much as some would like to believe that what Apple is doing is altruistic and for the betterment of society (i.e., lots of very bad CSAM individuals locked up in dark places), I could be wrong, but hashes can be manipulated. NCMEC has a dataset of known hashes of images and files that are currently known to be circulating. I may be oversimplifying this, but all you need to do is filter the image in question and the hash is different and won't be caught by the algorithm. In the meantime, everyone's privacy is being compromised in the hope of catching someone who may never get caught, because they found a new way to avoid detection.

Privacy is a far more complicated issue than can be solved with technology. And there are people who are willing to give up everyone's privacy in pursuit of their altruistic view of the world.
edited August 2021
  • Reply 95 of 98
baconstang Posts: 1,103 member
    Rayz2016 said:
    So they're going to install this spyware on your iPhone and Mac to stop child porn.  It won't be used for anything else, they say.
    I don't buy the lack of mission creep at all, but I have another question.

    The jerks making new porn don't have to worry about matching hashes of extant material, because they just created it.   So how does this protect "the children"?   Or is there something else they're not talking about?

    This part of the plan reminds me a little of the South Park gnomes that stole underwear.    
    1)   Steal underwear.
    2)   ???
    3)   Make money
I imagine the thinking is that when law enforcement receives new images, they go into the hash database.

This new database (or a subset) is then downloaded to every iPhone, iPad, and Mac, which will then do a check against images going up to iCloud.

    What they haven’t mentioned yet is how often the local database is going to be downloaded to your phone. I guess they’d want to do it as quickly as possible after a new set of images is uncovered. 

    I imagine that Apple will carry on scanning images on their servers so they don’t miss ones that are distributed before the hash database is updated. 
Yeah, but the images have to be discovered and tagged as CSAM first.  Unless a CSAM creator is stupid enough to keep uploading the files and/or keep them on the device, only the schmucks who keep passing the images around get caught.  The ones doing this shit to kids are on to the next new photos.
  • Reply 96 of 98
    Rayz2016 said:
    gatorguy said:
    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
    Until the government in question passes a law that requires Apple to do so, because as they've said many times, they'll comply with any local laws, even to the detriment of their principles concerning privacy.

    "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily.  And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.


I would not expect Apple to necessarily reveal any expansion of it if some country, and in this case I'm thinking of China, were to order them to. They've long soft-pedaled the "iCloud operated by GCBD" handover. Heck, it's not even an Apple-run program there. Apple is simply contractually required to cooperate with the government-controlled cloud provider in whatever way is needed for handling the demands on services and access. It is no longer Apple's to run, and they aren't making the rules.
You seem to have a finger deep inside Google; do they have something like this, or do they just do the server-side scan? I haven't been able to find any reference to a similar setup at any other tech behemoth.
Seriously? Did you check? Yes, Google and Microsoft both do something similar. PhotoDNA is Microsoft's tool. They, like Dropbox and Twitter and Tumblr, all scan for CSAM images using hash checks, and notify the police. Just like this.

    https://protectingchildren.google/intl/en/

    https://www.microsoft.com/en-us/PhotoDNA/CloudService



...so what's different here? Here Apple does the hash compare on-device, prior to uploading the image to its commercial cloud server. This allows them to 1) not host the CSAM, and 2) actually offer more privacy by not being aware of whether you have any CSAM on-device until a certain threshold number of matches has been met.

    How come you guys weren't having fits about Google, Microsoft, and Dropbox scanning images? How come re-tooling it by authoritarians wasn't a concern then?
    I don’t throw a fit because I don’t rely on/use those services and/or software.
  • Reply 97 of 98
    Rayz2016 said:
    entropys said:
    A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for. 

    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

The skeptic in me agrees with this.  I've long said there's a huge difference between not being capable of doing something (intentionally or not) and promising not to do something when the capability is there.  While at this point in time they may very well not acquiesce to government requests, what about several years down the road? What about China, where they have already seemingly bent over backwards to maintain a presence? Being a software engineer myself, I'm sure this went through rigorous review and testing, but any new code added may potentially introduce another attack vector to be exploited.
    I'm a software engineer too, and one thing I've always been a little wary of is Apple's testing strategy. They have allowed some seriously weird bugs out into the wild. 
Lol, no kidding.  I can see how it can and does happen (nobody is perfect), but when the same or similar things happen over and over, users lose their trust, which is hard to earn back.