New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

Comments

  • Reply 61 of 98
hexclock Posts: 1,291 member
    How about we scan all the phones in Congress first, and see how far we get. 
    Seems fair to me. 
  • Reply 62 of 98
Rayz2016 Posts: 6,957 member
    cpsro said:
    Of course Apple would never expand the tools. Apple merely provides the backdoor. Governments will be the ones to walk through it.
    Apple will be opening the tools to third parties, apparently. 



  • Reply 63 of 98
Rayz2016 Posts: 6,957 member
    Rayz2016 said:
    gatorguy said:
    Rayz2016 said:
    gatorguy said:
    Rayz2016 said:
    crowley said:
Apple can resist government requests, but if a government makes that scheme into law, Apple cannot resist.
This was true last week too; nothing has changed with regard to Apple's obligation to follow the law in places where they do business.

    My guess is that they've been offered a deal: implement the backdoor and the anti-trust/monopoly stuff goes away.
    Huh.

You know another big tech, Google, is in the antitrust crosshairs. It also coincides with a decision by Google to no longer give themselves a key to user cloud data, so that they can't turn over certain private information even if compelled by court order. They simply can't decrypt it, period. There have been two other recent Google policy changes that will restrict authorities' access to data and communications too, both here and abroad. Is there any connection between privacy and antitrust action? I'm not so sure there isn't.

    I actually meant Apple had been offered a deal, but now I'm intrigued.

    Google, throwing away the keys? 

    Where's the link for this? 
    https://www.androidcentral.com/apple-may-have-ditched-encrypted-backups-google-hasnt


    Hmmm. 

    That is very interesting. There’s a theory floating around that Apple is running the back door in the client so they can implement encrypted backups on iCloud. This seems to blow that idea out of the water. 
Gruber states that, but he's more cautious than optimistic about it.
    But if Google can encrypt backups without building back doors in the client, then why can’t Apple?

    I suppose running it on their servers for millions of files is quite expensive. Makes more sense to shift the resource hit for a handful of files to each individual customer. 
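
If that androidcentral piece is right, the basic shape of "we can't decrypt it" is easy to sketch: encrypt on the device with a key derived from something only the user knows, so the server only ever stores ciphertext. Below is a toy Python sketch of that general idea, not Google's actual design (which anchors keys in Titan security hardware); derive_key, encrypt_backup and restore_backup are made-up names for the example, and it needs the third-party cryptography package.

import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(user_secret: str, salt: bytes) -> bytes:
    # Stretch a low-entropy user secret (e.g. a lock-screen PIN) into a key.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(user_secret.encode()))

def encrypt_backup(user_secret: str, backup: bytes):
    # Runs on the device. Only (salt, ciphertext) ever leave it.
    salt = os.urandom(16)
    return salt, Fernet(derive_key(user_secret, salt)).encrypt(backup)

def restore_backup(user_secret: str, salt: bytes, ciphertext: bytes) -> bytes:
    # Runs on the (new) device. The server cannot perform this step:
    # it never sees the user secret, so it never has the key.
    return Fernet(derive_key(user_secret, salt)).decrypt(ciphertext)

salt, blob = encrypt_backup("1234", b"contacts, photos, settings...")
assert restore_backup("1234", salt, blob) == b"contacts, photos, settings..."

Under a scheme like this, a court order gets the provider nothing but ciphertext, which is presumably the point of Google's change.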
  • Reply 64 of 98
Rayz2016 Posts: 6,957 member
    Rayz2016 said:
    gatorguy said:
    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
    Until the government in question passes a law that requires Apple to do so, because as they've said many times, they'll comply with any local laws, even to the detriment of their principles concerning privacy.

    "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily.  And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.


I would not expect Apple to necessarily reveal any expansion of it if some country, and in this case I'm thinking of China, were to order them to. They've long soft-pedaled the "iCloud operated by GCBD" handover. Heck, it's not even an Apple-run program there. Apple is simply contractually required to cooperate with the government-controlled cloud provider in whatever way is needed for handling the demands on services and access. It is no longer Apple's to run, and they aren't making the rules.
You seem to have a finger deep inside Google; do they have something like this, or do they just do the server-side scan? I haven't been able to find any reference to a similar setup at any other tech behemoth.
Seriously? Did you check? Yes, Google and Microsoft both do something similar. PhotoDNA is Microsoft's tool. They, like Dropbox and Twitter and Tumblr, all scan for CSAM images using hash checks, and notify the police. Just like this.

    https://protectingchildren.google/intl/en/

    https://www.microsoft.com/en-us/PhotoDNA/CloudService



...so what's different here? Here Apple does the hash-compare on-device, prior to uploading the image to its commercial cloud server. This allows them to 1) avoid hosting the CSAM, and 2) actually offer more privacy, by not being aware of whether you have any CSAM on-device until a certain threshold number of matches has been met. 

    How come you guys weren't having fits about Google, Microsoft, and Dropbox scanning images? How come re-tooling it by authoritarians wasn't a concern then?
Because as has been said time and time again, the scanning isn't the problem; the problem is where the scan is being done. Doing it on the server means there is no weak point on the phone: if an errant bit of data crashes the scanner, the worst that can happen is the server crashes. Moving it client-side means the worst that can happen is that you crash a few million phones. Apple has a poor track record with handling errant data. 

    I think CSAM scanning is a great idea, but when you put it in the client then you have to comply when foreign powers tell you to use their version of the database with hash values that could contain anything. 

    I know one thing for sure. If Google announced Android phones now had a piece of software installed that would check and report on files before the user could encrypt it, Apple fans would be screaming “spyware!”    all over the interwebs. 

Folk are tying themselves in knots to describe a client-side scan that checks and reports on files on your device as anything other than spyware, simply because it's Apple. 
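
For what it's worth, the hash-and-threshold mechanics gatorguy describes are simple to sketch. Below is a toy Python version of the on-device counting logic. It is not Apple's implementation (the real system uses NeuralHash plus threshold secret sharing, so the server can read nothing below the threshold), and image_hash, UploadScanner and the threshold value are all stand-ins invented for the sketch.

import hashlib

THRESHOLD = 3  # toy value for the demo; the real threshold isn't given in the FAQ

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash: a real perceptual hash survives re-encoding
    # and resizing, which SHA-256 does not. Used only to keep this runnable.
    return hashlib.sha256(image_bytes).hexdigest()

# The hash list ships with the OS; the images themselves never do.
known_hashes = {image_hash(b"known-image-%d" % i) for i in range(3)}

class UploadScanner:
    # Counts matches on-device; nothing is reported below the threshold.
    def __init__(self) -> None:
        self.match_count = 0

    def scan_before_upload(self, image_bytes: bytes) -> bool:
        if image_hash(image_bytes) in known_hashes:
            # In the real design this fact is sealed inside an encrypted
            # "safety voucher" that cannot be opened below the threshold.
            self.match_count += 1
        return self.match_count >= THRESHOLD  # only then is review triggered

scanner = UploadScanner()
for i in range(3):
    flagged = scanner.scan_before_upload(b"known-image-%d" % i)
print(flagged)  # True only once the threshold is crossed

And the crash-surface worry is exactly that scan_before_upload, unlike a server-side equivalent, runs on every phone.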
edited August 2021
  • Reply 65 of 98
Rayz2016 Posts: 6,957 member
    elijahg said:
    Rayz2016 said:
    elijahg said:
Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they're going to refuse? Of course they won't. If any government said the provided data were hashes of CSAM material, who's Apple to say it's not?
That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

Let's not forget that US law requires tech companies to report instances of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don't need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone they want in China, and then, under the guise of CSAM, find all the photos they want of this person? 

    There is literally no point in encrypting backups if Apple has defied the trust of their customers by inserting this spyware. What's the point in end to end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time? 
    Quite. 

    It’s still end to end, but we just scan and log it before the end begins … so to speak. 
  • Reply 66 of 98
    Rayz2016 said:
    Rayz2016 said:
    elijahg said:
Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they're going to refuse? Of course they won't. If any government said the provided data were hashes of CSAM material, who's Apple to say it's not?
That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

Let's not forget that US law requires tech companies to report instances of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don't need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
    You should read GoogleGuy’s response (and if I’m not mistaken, that’s probably the first time I’ve said that on this forum). Google has been encrypting Android backups for quite some time. 

    And yes, the database that Apple will be receiving from NCMEC will be just hashes. And we’re not just talking about them, we’re also talking about the equivalents in foreign countries. 
1. Google does a good job of preventing outside intrusions from seeing unencrypted data or getting to the raw data.
2. Google encrypting data is a good thing. It helps lead the industry to compete for greater protection of data and data comms.
3. Speaking of Google in terms of data privacy (not that you did) is laughably surreal. 
  • Reply 67 of 98
Rayz2016 Posts: 6,957 member
    elijahg said:
    Rayz2016 said:
    entropys said:
    A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for. 

    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

The skeptic in me agrees with this. I've long said there's a huge difference between not being capable of doing something (intentional or not) and promising not to do something when the capability is there. At this point in time they may very well not acquiesce to government requests, but what about several years down the road? What about China, where they have already seemingly bent over backwards to maintain a presence? Being a software engineer myself, I'm sure this went through rigorous review and testing, but any new code added may potentially introduce another attack vector to be exploited.
    I'm a software engineer too, and one thing I've always been a little wary of is Apple's testing strategy. They have allowed some seriously weird bugs out into the wild. 
Apple's software quality has been going downhill for a long time. There are a lot of bugs that should never have passed testing; they obviously don't fuzz the user inputs on any of their software. And considering how many user inputs have been found to have critical bugs, I don't hold out much hope that the non-user-facing inputs (streams, executables, document parsing, etc.) have been tested too well either.

    Not sure I agree that the whole stack is going downhill, but there are definitely some recurring problems in a few areas. 

    I’ve mentioned before that one SMS message string that locks up the OS is bad, but when it happens a second time?

    It’s almost as if someone just dropped in a bit of code to trap that occurrence, without thinking that perhaps the whole block needs reviewing and rewriting because it’s unsafe. 

    Hope it’s not the same chap writing the spyware. 
edited August 2021
  • Reply 68 of 98
macplusplus Posts: 2,114 member
    Rayz2016 said:
    Rayz2016 said:
    gatorguy said:
    Rayz2016 said:
    gatorguy said:
    Rayz2016 said:
    crowley said:
Apple can resist government requests, but if a government makes that scheme into law, Apple cannot resist.
This was true last week too; nothing has changed with regard to Apple's obligation to follow the law in places where they do business.

    My guess is that they've been offered a deal: implement the backdoor and the anti-trust/monopoly stuff goes away.
    Huh.

You know another big tech, Google, is in the antitrust crosshairs. It also coincides with a decision by Google to no longer give themselves a key to user cloud data, so that they can't turn over certain private information even if compelled by court order. They simply can't decrypt it, period. There have been two other recent Google policy changes that will restrict authorities' access to data and communications too, both here and abroad. Is there any connection between privacy and antitrust action? I'm not so sure there isn't.

    I actually meant Apple had been offered a deal, but now I'm intrigued.

    Google, throwing away the keys? 

    Where's the link for this? 
    https://www.androidcentral.com/apple-may-have-ditched-encrypted-backups-google-hasnt


    Hmmm. 

    That is very interesting. There’s a theory floating around that Apple is running the back door in the client so they can implement encrypted backups on iCloud. This seems to blow that idea out of the water. 
Gruber states that, but he's more cautious than optimistic about it.
    But if Google can encrypt backups without building back doors in the client, then why can’t Apple?

    I suppose running it on their servers for millions of files is quite expensive. Makes more sense to shift the resource hit for a handful of files to each individual customer. 
Their response to that is so vague and evasive that one can't help wondering how such people can work at Apple:
    "Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM." The same FAQ document quoted above.

What is the association between "all user photos stored in the cloud" and "privacy risk for all users"? What is the point of emphasizing "all"? How does your creative method achieve "less" over the totality of iCloud? An incoherent narrative that creates more questions than it answers... Apple already confirmed to the media that they perform the CSAM scan on iCloud servers, so why is there no mention of that in the FAQ?
    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
    edited August 2021
  • Reply 69 of 98
davidw Posts: 2,085 member
    elijahg said:
    Rayz2016 said:
    elijahg said:
Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they're going to refuse? Of course they won't. If any government said the provided data were hashes of CSAM material, who's Apple to say it's not?
That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

Let's not forget that US law requires tech companies to report instances of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don't need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone they want in China, and then, under the guise of CSAM, find all the photos they want of this person? 

    There is literally no point in encrypting backups if Apple has defied the trust of their customers by inserting this spyware. What's the point in end to end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time? 
    That isn’t how hashes work. Hashes find the exact same photograph, not a photograph that is similar. So, your imagined scenario where the US government uploads a hash of a photo of someone they are looking for and in return get all photos of that person is not how it works. The uploaded hash would only help to find positive matches of that exact same photo.

    Also, as has been mentioned several times already, everyone can opt out.
I don't think that's how it works. If someone cropped a photo that is in the NCMEC database, it would have a different hash when scanned by Apple, but I bet it would still come up as a match. Same if someone were to place a "Smiley" sticker on one corner of the photo, or make a mirror image of it, or change its color or contrast to come up with a different hash. The detecting software is set to come up with a match if the photo is visually similar to one in the database. It doesn't have to be anywhere near an "exact" same photo to come up with a match.

PhotoDNA, developed by Microsoft around 2009, does this. 

    https://en.wikipedia.org/wiki/PhotoDNA
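
To make the "visually similar, different bytes" point concrete, here's a toy Python comparison using a simple "average hash". The average_hash function and the 8x8 grid are made up for the example; real systems like PhotoDNA and NeuralHash are far more sophisticated (and robust to crops and mirroring, which this toy is not). It only shows the principle: a tiny edit scrambles a cryptographic hash completely but barely moves a perceptual one.

import hashlib

def average_hash(pixels) -> int:
    # 64-bit perceptual hash: one bit per pixel, set if above the mean.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two hashes.
    return bin(a ^ b).count("1")

# An 8x8 "image" and a slightly brightened copy of it.
original = [[(x * y * 37) % 256 for x in range(8)] for y in range(8)]
brightened = [[min(255, p + 25) for p in row] for row in original]

# The cryptographic hashes share nothing...
print(hashlib.sha256(bytes(sum(original, []))).hexdigest()[:16])
print(hashlib.sha256(bytes(sum(brightened, []))).hexdigest()[:16])
# ...but the perceptual hashes differ by at most a few bits.
print(hamming(average_hash(original), average_hash(brightened)))

A match is then declared when the Hamming distance falls under some cutoff, rather than requiring byte-for-byte equality.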

    And interestingly, cited under "History" in the link ....... 

    >In 2016, Hany Farid proposed that the technology could be used to stem the spread of terror-related imagery, but little interest was initially shown by social media companies.[21] In December 2016, Facebook, Twitter, Google and Microsoft announced plans to use PhotoDNA to tackle extremist content such as terrorist recruitment videos or violent terrorist imagery,[22] which was done e.g. to automatically remove al-Qaeda videos.[23]<


Also interesting is that the software can detect the hash of a photo spliced into a video. Which means a video you receive might look harmless but can get you in trouble if just one frame of it, a frame you can't even see when playing the video, matches the hash of a photo in the NCMEC database. 
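
Conceptually the video case is just the photo check run once per decoded frame, something like the sketch below. decode_frames is a made-up placeholder (a real scanner would decode actual video), and frame_hash stands in for a perceptual hash like PhotoDNA.

import hashlib

def frame_hash(frame: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 just keeps the sketch runnable.
    return hashlib.sha256(frame).hexdigest()

def decode_frames(video: bytes, frame_size: int = 64):
    # Placeholder "decoder": treats the byte stream as fixed-size frames.
    for i in range(0, len(video), frame_size):
        yield video[i:i + frame_size]

def video_matches(video: bytes, known_hashes: set) -> bool:
    # One matching frame anywhere in the video is enough to flag it.
    return any(frame_hash(f) in known_hashes for f in decode_frames(video))

bad_frame = b"\x00" * 64
video = b"\x7f" * 640 + bad_frame + b"\x7f" * 640  # one spliced-in frame
print(video_matches(video, {frame_hash(bad_frame)}))  # -> True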
     
edited August 2021
  • Reply 70 of 98
iadlib Posts: 101 member
Negative, Ghost Rider, the pattern is full. 
  • Reply 71 of 98
I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It's all Dept of Justice. “Build this in and you won't get smoked in that case.”
  • Reply 72 of 98
darkvader Posts: 1,146 member
Their defense is that they would refuse authoritarian attempts, but we've seen two instances so far where Apple couldn't refuse: with iCloud in China and with FaceTime in Saudi Arabia. Setting the past aside, what's to say the political or financial climate for Apple won't change and make it harder to say no than it is now? There may come a time when they want to say no but can't. 

    They can't even say no in the US.  And with "National Security Letters" they can't even tell us that they couldn't say no.

    Any protestations to the contrary from Apple are simply lies.  If they have the ability to spy, they can be forced to spy.

    The ONLY acceptable option is for them to remove their own ability to spy.  Otherwise your device is spyware that can and will be used against you, for whatever purpose a government wants.
  • Reply 73 of 98
    nizzard said:
I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It's all Dept of Justice. “Build this in and you won't get smoked in that case.”
I think this is an excellent guess. Apple has been a bit of a thorn in the side of certain gov agencies. Apple, rightly or wrongly, was pretty adamant about privacy of the iPhone. So there's no doubt Apple, to some extent, is bowing to pressure from the government on this. Whether it is a quasi quid pro quo is ???. I think it's a solid guess, however. For better or worse, this is how deal-making works in the seedy back room.
  • Reply 74 of 98
    nizzard said:
I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It's all Dept of Justice. “Build this in and you won't get smoked in that case.”
I think this is an excellent guess. Apple has been a bit of a thorn in the side of certain gov agencies. Apple, rightly or wrongly, was pretty adamant about privacy of the iPhone. So there's no doubt Apple, to some extent, is bowing to pressure from the government on this. Whether it is a quasi quid pro quo is ???. I think it's a solid guess, however. For better or worse, this is how deal-making works in the seedy back room.
I want so badly to just attribute this to the conspiracy theorist in me. But the timing of this… and dare I say, even the right-to-repair bill, seems SO suspicious. And of course, no one believes an intel agency won't be able to exploit this, or pressure Apple to allow them to do so. It was one thing when there was no mechanism for it and Apple pushed back, but the next time they get the All Writs Act shoved up their ass, I fear their "policy" of not bending to gov pressure will go out the window.
  • Reply 75 of 98
So say the guys who are already negotiating user privacy with the Chinese government. Pandora's box has been opened; Cupertino is only concerned about the public relations fire they themselves unleashed.
  • Reply 76 of 98
elijahg Posts: 2,790 member
    elijahg said:
    Rayz2016 said:
    elijahg said:
Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they're going to refuse? Of course they won't. If any government said the provided data were hashes of CSAM material, who's Apple to say it's not?
That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

Let's not forget that US law requires tech companies to report instances of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don't need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone they want in China, and then, under the guise of CSAM, find all the photos they want of this person? 

    There is literally no point in encrypting backups if Apple has defied the trust of their customers by inserting this spyware. What's the point in end to end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time? 
    That isn’t how hashes work. Hashes find the exact same photograph, not a photograph that is similar. So, your imagined scenario where the US government uploads a hash of a photo of someone they are looking for and in return get all photos of that person is not how it works. The uploaded hash would only help to find positive matches of that exact same photo.

    Also, as has been mentioned several times already, everyone can opt out.
Try reading up on what you're trying to defend, because otherwise you make yourself look pretty stupid. The matching is fuzzy; it looks for similar photos: ones that have been cropped, mirrored, had a couple of pixels changed, blurred, etc. 
  • Reply 77 of 98
    elijahg said:
    elijahg said:
    Rayz2016 said:
    elijahg said:
Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they're going to refuse? Of course they won't. If any government said the provided data were hashes of CSAM material, who's Apple to say it's not?
That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

Let's not forget that US law requires tech companies to report instances of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don't need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone they want in China, and then, under the guise of CSAM, find all the photos they want of this person? 

    There is literally no point in encrypting backups if Apple has defied the trust of their customers by inserting this spyware. What's the point in end to end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time? 
    That isn’t how hashes work. Hashes find the exact same photograph, not a photograph that is similar. So, your imagined scenario where the US government uploads a hash of a photo of someone they are looking for and in return get all photos of that person is not how it works. The uploaded hash would only help to find positive matches of that exact same photo.

    Also, as has been mentioned several times already, everyone can opt out.
Try reading up on what you're trying to defend, because otherwise you make yourself look pretty stupid. The matching is fuzzy; it looks for similar photos: ones that have been cropped, mirrored, had a couple of pixels changed, blurred, etc. 
    That is an incorrect characterization. The fuzziness is still only going to match the same photo, perhaps modified, but not a different photo of the same or similar subject. Try reading up. 

    Your imaginary scenario of the government uploading a hash to find different photos of the same subject is still imaginary. 
edited August 2021
  • Reply 78 of 98
Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens... but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future?

    No.

And they'll still get up on stage in September and boast about their "end to end" encryption and all their bullshit privacy advocacy. It's all bullshit now. I don't trust them any more than Google or Facebook.
  • Reply 79 of 98
    nizzard said:
Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens... but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future?

    No.

And they'll still get up on stage in September and boast about their "end to end" encryption and all their bullshit privacy advocacy. It's all bullshit now. I don't trust them any more than Google or Facebook.
    What back door? Tell me how my privacy is compromised when I have 0 CSAM. 

Your assertion that nobody will upload CSAM to iCloud isn't backed up by how things have worked to date. Apple has already reported instances of it. It was a relatively low number, in the mid-200s I think. Meanwhile, Facebook reported over 20 million instances in the same year. So, those numbers aren't proving that the people sharing or storing CSAM are particularly clever. 

By the way, you say they won't be saving their photos to iCloud, which backs up that this is an opt-in implementation. If you're worried, turn it off. 
  • Reply 80 of 98
    nizzard said:
Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens... but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future?

    No.

And they'll still get up on stage in September and boast about their "end to end" encryption and all their bullshit privacy advocacy. It's all bullshit now. I don't trust them any more than Google or Facebook.
    What back door? Tell me how my privacy is compromised when I have 0 CSAM. 

Your assertion that nobody will upload CSAM to iCloud isn't backed up by how things have worked to date. Apple has already reported instances of it. It was a relatively low number, in the mid-200s I think. Meanwhile, Facebook reported over 20 million instances in the same year. So, those numbers aren't proving that the people sharing or storing CSAM are particularly clever. 

By the way, you say they won't be saving their photos to iCloud, which backs up that this is an opt-in implementation. If you're worried, turn it off. 

They are building in the capability to scan the contents of messages prior to encryption and transmission. THAT'S what people are MOST upset about. Now that the capability to read/analyze messages prior to secure transmission exists, the fear is that ANY gov org can/will pressure Apple to allow them access to that capability to surveil people. THAT'S the concern here.