What you need to know: Apple's iCloud Photos and Messages child safety initiatives


Comments

  • Reply 61 of 162
crowley Posts: 9,130 member
    amar99 said:
Last Apple product I purchase, last software update I install. Now that they're in bed with the government, their politically-charged "excuse" for invading every user's privacy cannot outweigh the long-term consequences. Governments want in, and Apple has just given them an avenue. The future is dark for Apple if they go down this path of complicity.
    Let us know what phone you buy that has a better privacy policy.
  • Reply 62 of 162
crowley Posts: 9,130 member
    Rayz2016 said:

     this can be done without telling anyone because the code isn't open source. 
    Incidentally, while the code isn't open source, it's probably amongst the most exposed closed source code Apple has ever committed, since they've published a significant description of the PSI protocol, with all of its constraints and logical operators: https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf.

    Apple are being uncharacteristically open about what they're doing.
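For anyone curious what a private set intersection actually looks like, here is a toy Diffie-Hellman-style PSI sketch in Python. To be clear, this is not Apple's protocol: the linked paper adds thresholds, synthetic vouchers, and elliptic-curve groups, while the modulus, item names, and hash-to-group trick below are illustrative assumptions only. The core idea it shows is real, though: each side masks its items with a secret exponent, so matches can be detected without either side revealing non-matching items.

```python
import hashlib
import secrets

# Toy modulus: the curve25519 base-field prime, used here only as a
# convenient large prime. This toy group is NOT cryptographically sound
# as-is; real PSI deployments use elliptic-curve groups.
P = 2**255 - 19

def h2g(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P (toy construction)."""
    return pow(int.from_bytes(hashlib.sha256(item).digest(), "big"), 2, P)

server_set = {b"known-image-1", b"known-image-2"}  # hypothetical hash list
client_set = {b"cat-meme", b"known-image-2"}       # hypothetical photo library

a = secrets.randbelow(P - 2) + 1  # client's secret exponent
b = secrets.randbelow(P - 2) + 1  # server's secret exponent

# Client sends H(x)^a; server raises each to b, returning H(x)^(a*b)
# without ever learning x itself.
from_client = {x: pow(h2g(x), a, P) for x in client_set}
replied = {x: pow(v, b, P) for x, v in from_client.items()}

# Server also publishes H(y)^b for its own set; client raises those to a.
server_masked = {pow(h2g(y), b, P) for y in server_set}
double_masked = {pow(v, a, P) for v in server_masked}

# H(x)^(a*b) == H(y)^(b*a) exactly when x == y, so only common items
# are revealed; everything else stays masked.
matches = {x for x, v in replied.items() if v in double_masked}
print(matches)  # only the common item
```

The exponent-commutativity trick (g^(ab) == g^(ba)) is what lets the match happen without either party decrypting the other's set, and it is the same basic shape the published PSI analysis builds on.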
edited August 7
  • Reply 63 of 162
bulk001 Posts: 665 member
    elijahg said:
    elijahg said:
    Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts - it's 1 per 1 trillion photos. I have 20,000 photos, that brings the chances I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
    One in a trillion over 20,000 photos is not 1 in 50 million. It's one in a trillion, 20,000 times. The odds do not decrease per photo, as your photo library increases in size. There is not a 1:1 guarantee of a falsely flagged photo in a trillion-strong photo library.

    And even if it was, one in 50 million is still pretty spectacularly against.
Unfortunately it is - 1 in 1 trillion becomes 2 in 1 trillion with two photos. Or 1 in 500 billion. That then halves again with 4 photos, 1 in 250 billion, and so on. It's little more than simplified fractions. Punch 1,000,000,000,000/20,000 into a scientific calculator and it'll be simplified to 50,000,000/1. The odds do decrease because there is a higher likelihood you have a matching photo with 2 photos than with 1 photo. And yes, statistically speaking, 1 in 1 trillion means that in a trillion-strong library there will be one false match.

    Also, it's massively more likely someone will get their password phished than a hash collision occurring - probably 15-20% of people I know have been "hacked" through phishing. All it takes is a couple of photos to be planted, with a date a few years ago so they aren't at the forefront of someone's library and someone's in very hot water. You claim someone could defend against this in court, but I fail to understand how? "I don't know how they got there" isn't going to wash with too many people. And unfortunately, "good security practices" are practised only by the likes of us anyway, most people use the same password with their date of birth or something equally insecure for everything. 
    1 in 50 million is not the same statistically as one in a trillion tried 20,000 times, no matter how much you want it to be so, I'm afraid. Regardless, your 1 in 50 million is still a very large number.

One in a trillion tried a trillion times does not guarantee a match, although it is likely, as you're saying. There may even be two or three. You're welcome to believe what you want, and you can research it with statisticians if you are so inclined. This is the last I will address this point here.

    And, in regards to the false positive, somebody will look at the image, and say something like: Oh, this is a palm tree. It just coincidentally collides with the hash. All good. Story over.

    In regards to your latter point, this is addressed in the article.
    Correct! If the lottery odds are 1:150 million, buying two tickets does not increase your odds of winning to 1:75 million. You now have two tickets, each with the odds of 1:150 million of winning. Or to put it another way, if you have a whole collection of photos of child pornography on iCloud, you are going to jail. 
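For what it's worth, the disagreement above can be checked numerically. Taking the commenters' premise of a 1-in-a-trillion false-positive rate per photo at face value (an assumption for illustration; Apple's published figure was stated per account per year, not per photo), the exact probability of at least one false positive in 20,000 independent trials is 1 - (1 - p)^n, and for tiny p this is almost exactly n*p:

```python
import math

p = 1e-12   # assumed per-photo false-positive rate (the commenters' premise)
n = 20_000  # photos in the library

# Exact probability of at least one false positive in n independent trials:
# 1 - (1 - p)^n, computed stably via log1p/expm1 to avoid cancellation.
exact = -math.expm1(n * math.log1p(-p))

# First-order approximation n*p, which is where "1 in 50 million" comes from.
approx = n * p

print(exact)   # ~2.0e-08, i.e. roughly 1 in 50 million
print(approx)  # 2.0e-08
```

So both sides are partly right: "tried 20,000 times" is not literally the same distribution as a single 1-in-50-million draw, but when n*p is this small the two answers agree to about eight significant figures, and the difference only becomes visible as n*p approaches 1.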
  • Reply 64 of 162
gatorguy Posts: 23,304 member
    crowley said:
    Rayz2016 said:

    Now, Google, Microsoft and Twitter have been scanning images for years, but I can't find any references (which may be my poor BingFu)  that they install software on devices that scans your data and then uploads it to another server. From what I can tell, they scan images once they hit the servers then report them. 
    Which Apple can't do if photos are user-encrypted in iCloud, which I imagine is the plan for the future.  They're doing the exact same thing as the others, just on the client end.  Absolutely no difference of any significance.
FWIW Apple may have believed this would fly under the radar or at least quickly be forgotten, since they've been scanning through other private data of yours for illegal content for years, deleting and/or reporting it to law enforcement as they deem fit, yet without the groundswell of indignation that this one is causing. Off-device compared to on-device is the difference-maker.

    That "leaked internal memo" was no leak IMHO but just an initial PR response framing it as an amazing piece of hard work from assorted Apple teams on your family's behalf, intended to get out ahead of any complaints.
edited August 7
  • Reply 65 of 162
GeorgeBMac Posts: 10,731 member
    The thing is:  This is less about Apple than it is an ongoing debate about government and what it can and should be doing regarding its ability to gather information with the intent of protecting the country and its citizens.

It's not a new debate. It gained front-page headline status 20 years ago when it was revealed the U.S. was gathering information by torturing prisoners. The argument against it was obvious: it's against international law and American ethics. The argument for the torture was equally obvious: the victim could reveal information that would save the nation from another disaster killing and maiming thousands of Americans.

    Also stemming from that era was the Patriot Act that legalized government snooping -- again to protect America and Americans.

Today though, the same groups who advocated for the collection of data say that government cannot be trusted and privacy is key.

    With regard to Apple the argument is:  Where is the line?  Where does this stop?
    Should Apple be scanning for terrorist plots?   Or, what kind of sex should Apple be scanning for? 

    Apple didn't create these debates -- debates that have no single right answer.   But they did step into the middle of them.

    But, maybe Apple stepped into the middle of it long ago when they told people that the government would never be able to collect their information.   That they could store child pornography or plans to make an atomic bomb safely and securely on Apple's servers.
  • Reply 66 of 162
darkvader Posts: 670 member

    What you REALLY need to know: Apple's iCloud Photos and Messages child safety initiatives

    1.  Spying on your customers is EVIL, even when it's done for ostensibly noble purposes.

    2.  If this technology is implemented, it WILL be used for purposes other than what it originally was intended to do.

    3.  This can easily be used to target people, both by state and non-state actors, simply by sending your target texts or emails with matchable images.

    4.  This WILL be used by authoritarian governments for things other than its original design purpose.

    5.  It must be stopped.

What I'd propose is that if Apple continues down this dark path, the way to defeat it is going to be overwhelming the system.  The hash database exists, therefore it's going to be obtainable.  What needs to happen is the creation of a large number of image files that, while being innocuous cat memes or even just line noise, will match the hashes.  Then those harmless images should be so widely distributed that systems like this are utterly overwhelmed with false positives, making them useless.


  • Reply 67 of 162
bulk001 Posts: 665 member
    darkvader said:

    What you REALLY need to know: Apple's iCloud Photos and Messages child safety initiatives

    1.  Spying on your customers is EVIL, even when it's done for ostensibly noble purposes.

    2.  If this technology is implemented, it WILL be used for purposes other than what it originally was intended to do.

    3.  This can easily be used to target people, both by state and non-state actors, simply by sending your target texts or emails with matchable images.

    4.  This WILL be used by authoritarian governments for things other than its original design purpose.

    5.  It must be stopped.

    What I'd propose is that if Apple continues down this dark path, the defeat is going to be overwhelming the system.  The hash database exists, therefore it's going to be obtainable.  What needs to happen is the creation of a large number of image files that, while being innocuous cat memes or even just line noise, will match the hashes.  Then those harmless images should be so widely distributed that systems like this are utterly overwhelmed with false positives, making them useless.


Considering that authoritarian governments can already access your iPhone (Pegasus is one example,
and there are many more no doubt out there), this is nonsense. I am sure that democratic governments can too. In China and Saudi Arabia they are not going to politely ask you for permission to access your phone or iCloud data and, if you decline, let you go with an apology! Child pornographers would be thrilled to have the database corrupted, and your suggestion only helps them and no one else.
  • Reply 68 of 162
lkrupp Posts: 9,639 member
    amar99 said:
Last Apple product I purchase, last software update I install. Now that they're in bed with the government, their politically-charged "excuse" for invading every user's privacy cannot outweigh the long-term consequences. Governments want in, and Apple has just given them an avenue. The future is dark for Apple if they go down this path of complicity.
    Histrionic much? Who do you plan to do business with in the future? You know, of course, that other platforms are already “in bed” with the government and have been for years, right? You know that Google has been using the same CSAM databases to detect child porn, right? Your abject ignorance on the subject is amazing. Try reading before inserting foot in mouth. 

I’d really like to read your response but I know you just came in here to drop your turd. If you bought your last Apple product then you will be off the grid shortly, because you have nowhere to go. You know that, right?

    And you know that Apple will only be scanning photos uploaded to iCloud, right? So if you turn iCloud off and keep your data on your iPhone only it won't be scanned. You know this, right?
  • Reply 69 of 162
crowley Posts: 9,130 member
    darkvader said:
    1.  Spying on your customers is EVIL, even when it's done for ostensibly noble purposes.
This isn't spying.  It's verifying against a known dataset of child abuse images.  No match, no problem.
    darkvader said:
    2.  If this technology is implemented, it WILL be used for purposes other than what it originally was intended to do.
    Baseless conjecture, and not even necessarily a negative one.
    darkvader said:
    3.  This can easily be used to target people, both by state and non-state actors, simply by sending your target texts or emails with matchable images.
    Except that wouldn't work, because the check only happens when the image is being sent to iCloud Photos.
    darkvader said:

    4.  This WILL be used by authoritarian governments for things other than its original design purpose.
    Authoritarian governments are not in control, Apple are. 
    darkvader said:

    5.  It must be stopped.
    I disagree, and counter that rampant child abuse must be stopped.
    darkvader said:

    What I'd propose is that if Apple continues down this dark path, the defeat is going to be overwhelming the system.  The hash database exists, therefore it's going to be obtainable.  What needs to happen is the creation of a large number of image files that, while being innocuous cat memes or even just line noise, will match the hashes.  Then those harmless images should be so widely distributed that systems like this are utterly overwhelmed with false positives, making them useless.
    Great, so you're helping out those who share child abuse images.  Good to know whose side you're on.
  • Reply 70 of 162
dewme Posts: 3,950 member
    thrang said:
    My take is this.

Apple saw the writing on the wall regarding government-mandated back doors. Doors that would in all likelihood be much more open, with unfettered access to much if not all of your information. Perhaps behind the scenes, the pressures were growing immense.

Perhaps they decided to develop content and technology around very narrow and focused one-way beacons (to avoid a mandated back door), initially to identify illegal and abhorrent possession and behavior. Perhaps evidence of murders, rapes, extortion, or terrorism may be other beacons that are sent out in the future.

I know, there will be discussions over who makes the decisions, how it is vetted, errors, misuse, hacking, etc. But essentially, to me, Apple is seeking to control the process around what they see as inevitable laws that they would need to comply with, laws with much worse outcomes for users. One-way beacons that need to be further vetted to ensure no false positives are an effort to ameliorate law enforcement concerns while still protecting legitimate private information, and perhaps a very good approach.

While it feels icky upon initially hearing about it, the more you think about it this way (and about what government-enforced alternatives might be), it may begin to make sense.

In terms of what defines when these future beacons will be sent out, Apple will likely ask for pertinent laws to govern such beaconing, leaving it up to elected leaders to clarify/legislate/vote on what kind of content is considered severely harmful and how it is to be legally obtained, and ultimately leaving it up to voters to support or oust those politicians in future elections. So in this case, there is a well-defined hash database for this particular category. Apple then implements an on-device methodology that is designed to keep the rest of your data protected and unavailable for unrelated sniffing about, while beaconing when there is a match.

As to other categories of hash matching, governments will need to figure that out, which would be subject to immense scrutiny and public debate I'm sure...

There are caveats of course, but in principle this is how I see what has happened.
    This largely echoes my sentiment expressed in another AI thread on the same topic. Those who wish to have unfettered access to the private information of ordinary citizens have long used the argument “what about the abused children?” to justify their requests that Apple open a back door for intrusive government spying on its own citizens. Apple is trying to take that card off the table, probably in hopes that it will quell the onslaught of requests. 

    I think it’ll buy Apple some time, but not much. As more and more countries including the United States continue to radicalize against not only remote outsiders, but their fellow citizens who they now consider outsiders because they don’t embrace reality bending authoritarian despots, the requests for back doors will transform into demands for back doors that cannot be denied. 

    I’m very much in favor of what Apple is proposing, but I’m equally concerned that what they are proposing will not be enough to keep the bigger issue of massive government intrusion through mandated back doors at bay. At some point we’ll all have to assume that privacy as we used to know it no longer exists. Nothing Apple is doing will change the eventual outcome if the embrace of authoritarianism and demonization of fellow citizens is allowed to grow. 
edited August 7
  • Reply 71 of 162
bulk001 Posts: 665 member
    dewme said:
    thrang said:
    My take is this.

Apple saw the writing on the wall regarding government-mandated back doors. Doors that would in all likelihood be much more open, with unfettered access to much if not all of your information. Perhaps behind the scenes, the pressures were growing immense.

Perhaps they decided to develop content and technology around very narrow and focused one-way beacons (to avoid a mandated back door), initially to identify illegal and abhorrent possession and behavior. Perhaps evidence of murders, rapes, extortion, or terrorism may be other beacons that are sent out in the future.

I know, there will be discussions over who makes the decisions, how it is vetted, errors, misuse, hacking, etc. But essentially, to me, Apple is seeking to control the process around what they see as inevitable laws that they would need to comply with, laws with much worse outcomes for users. One-way beacons that need to be further vetted to ensure no false positives are an effort to ameliorate law enforcement concerns while still protecting legitimate private information, and perhaps a very good approach.

While it feels icky upon initially hearing about it, the more you think about it this way (and about what government-enforced alternatives might be), it may begin to make sense.

In terms of what defines when these future beacons will be sent out, Apple will likely ask for pertinent laws to govern such beaconing, leaving it up to elected leaders to clarify/legislate/vote on what kind of content is considered severely harmful and how it is to be legally obtained, and ultimately leaving it up to voters to support or oust those politicians in future elections. So in this case, there is a well-defined hash database for this particular category. Apple then implements an on-device methodology that is designed to keep the rest of your data protected and unavailable for unrelated sniffing about, while beaconing when there is a match.

As to other categories of hash matching, governments will need to figure that out, which would be subject to immense scrutiny and public debate I'm sure...

There are caveats of course, but in principle this is how I see what has happened.
    This largely echoes my sentiment expressed in another AI thread on the same topic. Those who wish to have unfettered access to the private information of ordinary citizens have long used the argument “what about the abused children?” to justify their requests that Apple open a back door for intrusive government spying on its own citizens. Apple is trying to take that card off the table, probably in hopes that it will quell the onslaught of requests. 

    I think it’ll buy Apple some time, but not much. As more and more countries including the United States continue to radicalize against not only remote outsiders, but their fellow citizens who they now consider outsiders because they don’t embrace reality bending authoritarian despots, the requests for back doors will transform into demands for back doors that cannot be denied. 

    I’m very much in favor of what Apple is proposing, but I’m equally concerned that what they are proposing will not be enough to keep the bigger issue of massive government intrusion through mandated back doors at bay. At some point we’ll all have to assume that privacy as we used to know it no longer exists. Nothing Apple is doing will change the eventual outcome if the embrace of authoritarianism and demonization of fellow citizens is allowed to grow. 
Child abuse and child pornography are the very definition of “what about the children”! And after you buy Apple some time and they don’t agree that their servers should host this content, then what? Are you going to sign up for Jitterbug service and use an old flip phone? I remember a Walmart or Walgreens reporting a mother who took in film to be developed that included a picture of her partially naked child; she was arrested and possibly flagged as a sex offender. That is not what is going on here. Unless your pictures match those in the database, no one is going to see them. While false positives are rare, they will happen, and if there is a criticism, it would be that Apple should explain better how the image will be reviewed and what action taken. To turn your “what about the children” thesis around, though, what I don’t understand is the support on this board for the very worst of society in the name of their privacy.
edited August 7
  • Reply 72 of 162
StrangeDays Posts: 11,764 member
    darkvader said:

    What you REALLY need to know: Apple's iCloud Photos and Messages child safety initiatives

    1.  Spying on your customers is EVIL, even when it's done for ostensibly noble purposes.

    2.  If this technology is implemented, it WILL be used for purposes other than what it originally was intended to do.

    3.  This can easily be used to target people, both by state and non-state actors, simply by sending your target texts or emails with matchable images.

    4.  This WILL be used by authoritarian governments for things other than its original design purpose.

    5.  It must be stopped.

    What I'd propose is that if Apple continues down this dark path, the defeat is going to be overwhelming the system.  The hash database exists, therefore it's going to be obtainable.  What needs to happen is the creation of a large number of image files that, while being innocuous cat memes or even just line noise, will match the hashes.  Then those harmless images should be so widely distributed that systems like this are utterly overwhelmed with false positives, making them useless.
Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc., etc. They all use the CSAM database for the same purpose. Nobody is looking at your pictures; the unique hash signature is matched against the database of known images.

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
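The membership check being described can be sketched in a few lines. This is a deliberately simplified stand-in: real systems use perceptual hashes such as PhotoDNA or Apple's NeuralHash (which match visually similar images, not just byte-identical files), and the hash list comes from NCMEC rather than a local constant. The names and sample bytes below are hypothetical.

```python
import hashlib

# Hypothetical stand-in for the database of known-image digests.
known_hashes = {hashlib.sha256(b"known-abuse-image-bytes").hexdigest()}

def should_flag(photo_bytes: bytes) -> bool:
    """Compare only a digest against the list; the photo itself is never viewed."""
    return hashlib.sha256(photo_bytes).hexdigest() in known_hashes

print(should_flag(b"holiday-photo-bytes"))      # False: no match, no report
print(should_flag(b"known-abuse-image-bytes"))  # True: digest is in the list
```

The point of the sketch is the shape of the check: the service stores and compares fingerprints, and an unmatched photo produces no signal at all.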
edited August 7
  • Reply 73 of 162
avon b7 Posts: 5,959 member
Thank you, Mike, for your effort in this article, even though you are probably wasting your time on this particular forum - all of us (including myself) are usually entrenched in our opinions despite any evidence to the contrary.
Worth pointing that out.

It is a valid article for starting to get to grips with Apple's plan, the general state of affairs, some of the underlying issues, etc., explained in a clear, easy-to-digest manner and relatively concisely given the 'prickly pear' nature of the issue.

  • Reply 74 of 162
macplusplus Posts: 2,091 member
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
And this is exactly what criminals actually do: they are not stupid enough to use iCloud; they have the dark web, and they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not end-to-end encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
edited August 7
  • Reply 75 of 162
bulk001 Posts: 665 member
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s  attempt to adopt self-declared law enforcement role to “prevent crimes before they occur” is Orwellian !
Ever watched the TV show Dumbest Criminals? Besides, a pretext to what? What do you have that is so private that the government should not see it? Your affair? Your own dick pic? Nobody cares, and if the NSA looked at them, so what? If you are a terrorist, 1/6 insurrectionist, child pornographer, teen slashing tires in a neighborhood, etc., I want you to be caught. If anything, the pretext to me seems to be that privacy is being used as an excuse to exploit children.
edited August 7
  • Reply 76 of 162
crowley Posts: 9,130 member
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s  attempt to adopt self-declared law enforcement role to “prevent crimes before they occur” is Orwellian !
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
  • Reply 77 of 162
macplusplus Posts: 2,091 member
    bulk001 said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s  attempt to adopt self-declared law enforcement role to “prevent crimes before they occur” is Orwellian !
Ever watched the TV show Dumbest Criminals? Besides, a pretext to what? What do you have that is so private that the government should not see it? Your affair? Your own dick pic? Nobody cares, and if the NSA looked at them, so what? If you are a terrorist, 1/6 insurrectionist, child pornographer, teen slashing tires in a neighborhood, etc., I want you to be caught. If anything, the pretext to me seems to be that privacy is being used as an excuse to exploit children.
    You have law enforcement for that. I don't want Apple to become a cop, that's it.

The pretext is a long-developed, deep issue involved with Tim Apple's political engagements that apparently leaves him now in a very delicate situation. Won't discuss that further here...
edited August 7
  • Reply 78 of 162
macplusplus Posts: 2,091 member
    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it could not be interpreted in such an Orwellian sense. Citizens would fight; courts would interpret.
    bulk001
  • Reply 79 of 162
    dewmedewme Posts: 3,950member
    bulk001 said:
    dewme said:
    thrang said:
    My take is this.

    Apple saw the writing on the wall regarding government-mandated back doors. Doors that would, in all likelihood, be much more open, with unfettered access to much if not all of your information. Perhaps, behind the scenes, the pressures were growing immense.

    Perhaps they decided to develop a concept and technology around very narrow and focused one-way beacons (to avoid a mandated backdoor), initially to identify illegal and abhorrent possession and behavior. Perhaps evidence of murders, rapes, extortion, or terrorism may trigger other beacons sent out in the future.

    I know, there will be discussions over who makes the decisions, how it is vetted, errors, misuse, hacking, etc. But essentially, to me, Apple is seeking to control the process around what they see as inevitable laws that they would need to comply with, laws with much worse outcomes for users. One-way beacons that are further vetted to ensure no false positives are an effort to ameliorate law enforcement concerns while still protecting legitimate private information, which is perhaps a very good approach.

    While it feels icky upon first hearing about it, the more you think about it this way (and about what government-enforced alternatives might be), the more it may begin to make sense.

    In terms of what defines when these future beacons will be sent out, Apple will likely ask for pertinent laws to govern such beaconing, leaving it up to elected leaders to clarify, legislate, and vote on what kind of content is considered severely harmful and how it is to be legally obtained, and ultimately leaving it up to voters to support or oust those politicians in future elections. So in this case, there is a well-defined hash database for this particular category. Apple then implements an on-device methodology that is designed to keep the rest of your data protected and unavailable for unrelated sniffing about, while beaconing when there is a match.

    As to other categories of hash matching, governments will need to figure that out, which would be subject to immense scrutiny and public debate, I'm sure...

    There are caveats of course, but in principle this is how I see what has happened.
    This largely echoes my sentiment expressed in another AI thread on the same topic. Those who wish to have unfettered access to the private information of ordinary citizens have long used the argument “what about the abused children?” to justify their requests that Apple open a back door for intrusive government spying on its own citizens. Apple is trying to take that card off the table, probably in hopes that it will quell the onslaught of requests. 

    I think it’ll buy Apple some time, but not much. As more and more countries, including the United States, continue to radicalize not only against remote outsiders but against fellow citizens whom they now consider outsiders because they don’t embrace reality-bending authoritarian despots, the requests for back doors will transform into demands for back doors that cannot be denied. 

    I’m very much in favor of what Apple is proposing, but I’m equally concerned that what they are proposing will not be enough to keep the bigger issue of massive government intrusion through mandated back doors at bay. At some point we’ll all have to assume that privacy as we used to know it no longer exists. Nothing Apple is doing will change the eventual outcome if the embrace of authoritarianism and demonization of fellow citizens is allowed to grow. 
    Child abuse and child pornography are the very definition of “what about the children”! And after you buy Apple some time and they don’t agree that their servers should host this content, then what? Are you going to sign up for Jitterbug service and use an old flip phone? I remember a Walmart or Walgreens reporting a mother who took her film in to be developed with a picture of her child partially naked; she was arrested and possibly flagged as a sex offender. That is not what is going on here. Unless your pictures match those in the database, no one is going to see them. While false positives are rare, they will happen, and if there is a criticism, it would be that Apple should explain better how the image will be reviewed and what action will be taken. To turn your “what about the children” thesis around, though, what I don’t understand is the support on this board for the very worst of society in the name of their privacy. 
    I’m 100% in favor, as I said in my post, of what Apple is doing. They are taking off the table an argument that the government has been trying to use to justify forcing Apple to open back doors, by helping to solve the child abuse problem. This is not a deflection; this is the way that people who truly care about solving problems go about solving them. They aren’t just saying, “I don’t buy into your argument” and ending up at a stalemate. They are saying, “we will negate your argument by solving that problem … so now let’s talk about why you still think you need that back door.”  

    I’m totally behind Apple on this because they are doing the right thing wrt child abuse and they are removing an impediment that’s causing a stalemate on the larger issue of forced government surveillance. The inextricable linkage between the two is very evident in the posts. Doing nothing, as most posters are suggesting, and keeping the standard mode of operation is not a valid option in my opinion. Whether you agree with what Apple is doing or not, these issues need to be discussed openly and not enter into an interminably entrenched ideological stalemate with no progress made on anything. Pragmatism has its time and place, and Apple is saying that time is now.


    StrangeDays radarthekat watto_cobra dysamoria
  • Reply 80 of 162
    crowleycrowley Posts: 9,130member
    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud. They have the dark web; they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not end-to-end encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the subsequent human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it could not be interpreted in such an Orwellian sense. Citizens would fight; courts would interpret.
    No idea what you're even talking about.  

    You said criminals are "not stupid enough to use iCloud", which is obviously untrue, since they're stupid enough to use Facebook.

    You said Apple are attempting to "prevent crimes before they occur", which doesn't seem to be true or even relevant.  Images of child abuse are definitely crimes that have already occurred.

    Stop using "Orwellian" like a trump card.  It isn't one.
    StrangeDays killroy radarthekat watto_cobra dysamoria