What you need to know: Apple's iCloud Photos and Messages child safety initiatives


Comments

  • Reply 121 of 162
fastasleep Posts: 6,408 member
    aguyinatx said:

Apple isn't magic; they have to open the image in order to analyze it. Full stop. No binary or user on a Unix system can hash a file without opening it.

Next up... This computer is one that I paid for and I own. Only parties I consent to should have the right to open files to analyze them (see the example above). No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and they aren't asking users if they want to submit to surveillance. And no, scanning a photo to see if a dog is in it is not surveillance. Additionally, Apple is clearly adopting a vigilante role that is extra-judicial. Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.
    Nobody is "opening" the image. Apple is not accessing your computer. Scanning a photo to look for a dog with ML is far more invasive than what is happening here. The "party" that is accessing your files is the computer itself. You're going to freak out when you find out what Spotlight is doing right now on your device if this is your understanding of this technology.

    Came to the comments for the idiotic takes and am not disappointed. 
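For anyone who wants to see what "hashing" actually means here, a minimal sketch in Python (hypothetical and illustrative only; Apple's NeuralHash is a perceptual hash computed by the on-device Photos pipeline, not a SHA-256):

import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    # The process running this must have read permission on the file
    # (otherwise a PermissionError is raised); the digest it returns
    # reveals nothing about the pixels to anyone who only sees it.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

print(sha256_of(Path("example.jpg")))  # "example.jpg" stands in for any local photo you can read

Both points in the exchange above hold: the code doing the hashing has to be able to read the file, and the only thing that comes out of it is a number.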
  • Reply 122 of 162
radarthekat Posts: 3,842 moderator
    aguyinatx said:
    mjtomlin said:
    Just thought I’d chime in after reading so many misguided complaints about this subject. Are all of you context deaf? Did you read what is actually happening or are you just incapable of comprehending it?

    They are not scanning photos for specific images, they’re simply counting the bits and creating a hash… all they see are numbers… this in no way scans photos for offensive images, nor does it in any way violate your privacy.

    It amazes me that so many are complaining about trying to restrict/thwart child pornography?!

    it’s even more ridiculous when you consider every photo you save to the Photos app is automatically scanned through an image recognition engine to identify the contents of the photo.

    Creating a hash requires scanning the image and absolutely requires the file to be opened.  Do this on a Mac and it's pretty easy to demonstrate.  Also DO NOT RUN COMMANDS YOU DO NOT UNDERSTAND FROM FORUM POSTS.  Ask someone or research the commands.

    Now.... In a terminal....

sudo -i and enter your account password. This elevates your privileges to root. That's a lowercase i, by the way.
echo "foo" > /opt/root-owned-file.txt  This creates a file in /opt called root-owned-file.txt with the word "foo" as its only content.
chmod 0600 /opt/root-owned-file.txt  This ensures that only the root user can read and write the file.
exit  and hit return.

Now you're running as the user you logged in with.
shasum -a 256 /opt/root-owned-file.txt  should give you a hash (those numbers you were talking about), but it doesn't. You get "Permission denied" because you can't hash a file that you can't open. Apple isn't magic; they have to open the image in order to analyze it. Full stop. No binary or user on a Unix system can hash a file without opening it.

Okay, clean up the file: sudo rm /opt/root-owned-file.txt

Next up... This computer is one that I paid for and I own. Only parties I consent to should have the right to open files to analyze them (see the example above). No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and they aren't asking users if they want to submit to surveillance. And no, scanning a photo to see if a dog is in it is not surveillance. Additionally, Apple is clearly adopting a vigilante role that is extra-judicial. Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.

The article is primarily an ad hominem fallacy without many facts. "Hey, the other guys are doing it too!" is a baseless argument. I do not have a Facebook account so I don't care what they do. I'm not given a choice with Apple suddenly, and I am perfectly justified in getting my ire up when they insist that they have the uninvited right to open files that I create.
Of course you have a choice. This is implemented only on photos you upload to iCloud. You have no more obligation to upload your photos to iCloud than you have to open a Facebook account.
  • Reply 123 of 162
JFC_PA Posts: 932 member
"Additionally, the iCloud Photos "scanner" isn't actually scanning or analyzing the images on a user's iPhone. Instead, it's comparing the mathematical hashes of known CSAM to images stored in iCloud. If a collection of known CSAM images is stored in iCloud, then the account is flagged and a report is sent to the National Center for Missing & Exploited Children (NCMEC)."

So it is, in effect, opt-in. Don't want to be scanned? Then simply don't use iCloud for photo storage. On. Apple. Servers.

A local external hard drive, or a pair swapped each week with one kept offsite to dodge catastrophic loss if your home setup burns down, floods, etc., and you're as covered as a remote Apple server, at a bit more effort.
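As a rough illustration of what "comparing the mathematical hashes of known CSAM" means, here is a toy sketch in Python. It is deliberately simplified and hypothetical: the real system uses NeuralHash (a perceptual hash), private set intersection, and a match threshold, none of which is reproduced here.

import hashlib

# Placeholder digests standing in for the hash database supplied by NCMEC.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known(image_bytes: bytes) -> bool:
    # Only set membership is learned; the image is never classified
    # or "looked at" for subject matter.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

The point being argued in the thread is exactly this distinction: a lookup against a fixed list of known images is a very different operation from analyzing what a photo depicts.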
  • Reply 124 of 162
GeorgeBMac Posts: 11,421 member
    aguyinatx said:
    mjtomlin said:
    Just thought I’d chime in after reading so many misguided complaints about this subject. Are all of you context deaf? Did you read what is actually happening or are you just incapable of comprehending it?

    They are not scanning photos for specific images, they’re simply counting the bits and creating a hash… all they see are numbers… this in no way scans photos for offensive images, nor does it in any way violate your privacy.

    It amazes me that so many are complaining about trying to restrict/thwart child pornography?!

    it’s even more ridiculous when you consider every photo you save to the Photos app is automatically scanned through an image recognition engine to identify the contents of the photo.

    Creating a hash requires scanning the image and absolutely requires the file to be opened.  Do this on a Mac and it's pretty easy to demonstrate.  Also DO NOT RUN COMMANDS YOU DO NOT UNDERSTAND FROM FORUM POSTS.  Ask someone or research the commands.

    Now.... In a terminal....

sudo -i and enter your account password. This elevates your privileges to root. That's a lowercase i, by the way.
echo "foo" > /opt/root-owned-file.txt  This creates a file in /opt called root-owned-file.txt with the word "foo" as its only content.
chmod 0600 /opt/root-owned-file.txt  This ensures that only the root user can read and write the file.
exit  and hit return.

Now you're running as the user you logged in with.
shasum -a 256 /opt/root-owned-file.txt  should give you a hash (those numbers you were talking about), but it doesn't. You get "Permission denied" because you can't hash a file that you can't open. Apple isn't magic; they have to open the image in order to analyze it. Full stop. No binary or user on a Unix system can hash a file without opening it.

Okay, clean up the file: sudo rm /opt/root-owned-file.txt

Next up... This computer is one that I paid for and I own. Only parties I consent to should have the right to open files to analyze them (see the example above). No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and they aren't asking users if they want to submit to surveillance. And no, scanning a photo to see if a dog is in it is not surveillance. Additionally, Apple is clearly adopting a vigilante role that is extra-judicial. Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.

The article is primarily an ad hominem fallacy without many facts. "Hey, the other guys are doing it too!" is a baseless argument. I do not have a Facebook account so I don't care what they do. I'm not given a choice with Apple suddenly, and I am perfectly justified in getting my ire up when they insist that they have the uninvited right to open files that I create.
Of course you have a choice. This is implemented only on photos you upload to iCloud. You have no more obligation to upload your photos to iCloud than you have to open a Facebook account.
See there? There you go again! (Trying to use logic!)

Both sides of this debate (those "protecting innocent children" and those "protecting our privacy rights") are radicalized and can only see evil.

  • Reply 125 of 162
dysamoria Posts: 3,430 member
    elijahg said:
    elijahg said:
    Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts - it's 1 per 1 trillion photos. I have 20,000 photos, that brings the chances I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
    No, that's still pretty spectacular. This is addressed in the piece.
It is still extremely low odds - but as you point out in your article (thank you by the way!) the odds of winning the lottery are 1 in 150m - and some people do actually win. In other words there is a not infinitesimally small chance some photos will be falsely flagged - and 1 in 50m is nowhere near the headline 1 in 1 trillion, and better odds than winning the lottery.
I've made some clarifying edits to my response while you were posting yours. The 1 in 50m isn't accurate. What happens with a flag is also addressed in the article. It isn't automatically handed off to law enforcement.

    And the "humans aren't good at risk assessment" is clearly on display here. 
    The question is, are Apple good at determining the chances of false-positives? What I see with every form of “smart” software around image & language processing is that software SUCKS at doing what humans do very easily. While you’re promoting this “1 in 1 trillion” statement, how have Apple validated this claim themselves?

    I am not critiquing this system, per se; I am critiquing the discussion of it, because there are two types of rhetoric here:

    1. “This is dystopian nightmare shit, and everyone should be freaking out.”
    2. “Apple knows best; don’t be so paranoid and reactionary, you silly goat.”

    This article seems to want to be neutral and fact-based, but it comes off more like number 2, especially when so much depends on the validity of info presented as facts, when they lack qualification as to how they’ve been determined to BE factual in the first place (ie, the “1 in 1 trillion chance of false positive” claim presented by Apple).

    Also, what does this sentence mean: “...some experts worry that having this type of mechanism in place could make it easier for authoritarian governments to clamp down on abuse.”
  • Reply 126 of 162
dysamoria Posts: 3,430 member
    crowley said:
That scheme can easily be made into legislation, approved, then applied to citizens' daily lives: a database for drug-related photo hashes, a database for terrorism-related photo hashes, another one for immigration-related photo hashes...
You think there are photos of drugs being circulated amongst drug users and suppliers?  Or photos of terrorism being circulated amongst terror groups?  For the most part that's not really how those groups work.  Child abuse networks, however, have been shown many times over to be disseminators of vast amounts of media of child abuse.
    So what? Forget drug or terrorism or immigration related photos and replace it with CSAM or <insert any database here>.

The point is the statutory nature such a scheme may easily acquire after Apple elaborately implements it and the Government more elaborately makes it into law, for whatever hash database at their discretion. The citizens won't have any control over these statutory databases and will then have to fight in the courts for civil liberties.
    “So what” is an... interesting rebuttal. It refuses to actually address the content of the argument. You’re presenting hypotheticals and then refusing to engage when someone calls out the requisite details inherent in discussing the hypothetical scenario. You’re not arguing an issue; you’re arguing, indirectly, for an ideology, but it ends up appearing more like FUD.
  • Reply 127 of 162
dysamoria Posts: 3,430 member
    lkrupp said:
    Okay, here we go with the misinformation train. I’m watching the 5 o’clock local news in St. Louis. Next story up, after the commercials, “Apple will start scanning your photos, how you could wind up in jail”. Can’t wait to listen to the spin they put on this.
I assume that's Fox News, with a title like that. Maybe... don't watch Fox News. Remember, they have stated in court that "no reasonable person would believe their entertainment commentary as statements of fact", or something like that...
  • Reply 128 of 162
dysamoria Posts: 3,430 member
    crowley said:
    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
And this is exactly what criminals actually do: they are not stupid enough to use iCloud; they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not end-to-end encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person's unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple's surveillance, based on the accumulation of some nasty tokens and the following human review.

So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple's attempt to adopt a self-declared law enforcement role to "prevent crimes before they occur" is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it cannot be interpreted in such an Orwellian sense. Citizens will fight, courts will interpret.
    No idea what you're even talking about.  

    You said criminals are "not stupid enough to use iCloud", which is obviously untrue, since they're stupid enough to use Facebook.

    You said Apple are attempting to "prevent crimes before they occur", which doesn't seem to be true or even relevant.  Images of child abuse are definitely crimes that have already occurred.

    Stop using Orwellian like a trump word.  It isn't.
    He also keeps saying “Tim Apple”...
  • Reply 129 of 162
dysamoria Posts: 3,430 member
    crowley said:
    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
And this is exactly what criminals actually do: they are not stupid enough to use iCloud; they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not end-to-end encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person's unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple's surveillance, based on the accumulation of some nasty tokens and the following human review.

So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple's attempt to adopt a self-declared law enforcement role to "prevent crimes before they occur" is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it cannot be interpreted in such an Orwellian sense. Citizens will fight, courts will interpret.
    No idea what you're even talking about.  

    You said criminals are "not stupid enough to use iCloud", which is obviously untrue, since they're stupid enough to use Facebook.

    You said Apple are attempting to "prevent crimes before they occur", which doesn't seem to be true or even relevant.  Images of child abuse are definitely crimes that have already occurred.

    Stop using Orwellian like a trump word.  It isn't.
This is why preventive Orwellian surveillance is not a solution. How will you distinguish a mother's baby shower photo from a child abuse photo? Not AI, I mean human interpretation. You need a context to qualify it as child abuse. The scheme as described will not provide that context. "Images of child abuse are definitely crimes that have already occurred", agreed, but if and only if they are explicit enough to provide an abuse context. What about innocent looking non-explicit photos collected as a result of long abusive practices? So, the number of cases Apple can flag will be extremely limited, since such explicit context will mostly reside elsewhere, on the dark web or some other media.
    Have you even bothered to read these articles? Like even bothered? They do NOT evaluate the subject of your photos. They are specific hash matches to *known* child pornography, cataloged in the CSAM database. 

    Seriously fucking educate yourself before clutching your pearls. If you can’t read the article you’re commenting on, try this one:

    https://daringfireball.net/2021/08/apple_child_safety_initiatives_slippery_slope
    Apparently you fucking educated yourself enough to still not understand that an innocent looking photo may still point to child abuse but Apple’s scheme will miss it thus it is ineffective. Crime is a very complex setup, it cannot be reduced to a couple of hashes.
    Wait, was that an anti-education slam? 🙄
  • Reply 130 of 162
dysamoria Posts: 3,430 member
    Domestic terrorists are a greater threat than some guy saving a picture he found on the internet.  Shouldn't Apple be scanning for them?

    Likewise, an angry, disaffected person with an AR15 is a greater threat to kids in school than some guy saving a picture he found on the internet.  Shouldn't Apple be scanning for them?
    Are you proposing a method, or suggesting that one particular method of fighting one particular type of crime is not worth pursuing because it does not deal with 100% of all crime?
  • Reply 131 of 162
dysamoria Posts: 3,430 member
    aguyinatx said:
    mjtomlin said:
    Just thought I’d chime in after reading so many misguided complaints about this subject. Are all of you context deaf? Did you read what is actually happening or are you just incapable of comprehending it?

    They are not scanning photos for specific images, they’re simply counting the bits and creating a hash… all they see are numbers… this in no way scans photos for offensive images, nor does it in any way violate your privacy.

    It amazes me that so many are complaining about trying to restrict/thwart child pornography?!

    it’s even more ridiculous when you consider every photo you save to the Photos app is automatically scanned through an image recognition engine to identify the contents of the photo.

    Creating a hash requires scanning the image and absolutely requires the file to be opened.  Do this on a Mac and it's pretty easy to demonstrate.  Also DO NOT RUN COMMANDS YOU DO NOT UNDERSTAND FROM FORUM POSTS.  Ask someone or research the commands.

    Now.... In a terminal....

sudo -i and enter your account password. This elevates your privileges to root. That's a lowercase i, by the way.
echo "foo" > /opt/root-owned-file.txt  This creates a file in /opt called root-owned-file.txt with the word "foo" as its only content.
chmod 0600 /opt/root-owned-file.txt  This ensures that only the root user can read and write the file.
exit  and hit return.

Now you're running as the user you logged in with.
shasum -a 256 /opt/root-owned-file.txt  should give you a hash (those numbers you were talking about), but it doesn't. You get "Permission denied" because you can't hash a file that you can't open. Apple isn't magic; they have to open the image in order to analyze it. Full stop. No binary or user on a Unix system can hash a file without opening it.

Okay, clean up the file: sudo rm /opt/root-owned-file.txt

Next up... This computer is one that I paid for and I own. Only parties I consent to should have the right to open files to analyze them (see the example above). No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and they aren't asking users if they want to submit to surveillance. And no, scanning a photo to see if a dog is in it is not surveillance. Additionally, Apple is clearly adopting a vigilante role that is extra-judicial. Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.

The article is primarily an ad hominem fallacy without many facts. "Hey, the other guys are doing it too!" is a baseless argument. I do not have a Facebook account so I don't care what they do. I'm not given a choice with Apple suddenly, and I am perfectly justified in getting my ire up when they insist that they have the uninvited right to open files that I create.
Of course you have a choice. This is implemented only on photos you upload to iCloud. You have no more obligation to upload your photos to iCloud than you have to open a Facebook account.
See there? There you go again! (Trying to use logic!)

Both sides of this debate (those "protecting innocent children" and those "protecting our privacy rights") are radicalized and can only see evil.

    Please don’t equivocate. There’re more than just two “sides” (extremes). I am interested in good tech doing useful things, as well as not being abused or let down by badly implemented tech that fails to work as intended or which can be abused to do something not intended. That makes this topic of interest because I want to understand it.

I think the logic at Apple of "let's come up with something to address a legit problem AND make the argument for backdoors an unnecessary one" is smart on their part. The laissez-faire types keep saying industry self-regulates... well here it effing is!! But oh no, there's big bad gub'ment getting notifications when criminal content is found, and suddenly they're screaming "ORWELLIAN!"

    Then we have smarmy pro-Apple editorials-as-explainers, without detailing why we should accept the Apple-provided content as verified fact (citing sources and explanations for how claims were devised is important; coming to this conversation hoping to get what’s NOT in an article makes an article problematic).

    BAH. I don’t want “DOOM DOOM DOOM” commentary, and I don’t want “WE LOVES THE PRECIOUS!!!” commentary either.
  • Reply 132 of 162
What you need to know: Yes, Apple will check stuff on YOUR device, setting a precedent for totalitarian governments. It doesn't matter how you guys or Apple spin it; this is the absolute truth.
  • Reply 133 of 162
Rayz2016 Posts: 6,957 member
    dysamoria said:
    aguyinatx said:
    mjtomlin said:
    Just thought I’d chime in after reading so many misguided complaints about this subject. Are all of you context deaf? Did you read what is actually happening or are you just incapable of comprehending it?

    They are not scanning photos for specific images, they’re simply counting the bits and creating a hash… all they see are numbers… this in no way scans photos for offensive images, nor does it in any way violate your privacy.

    It amazes me that so many are complaining about trying to restrict/thwart child pornography?!

    it’s even more ridiculous when you consider every photo you save to the Photos app is automatically scanned through an image recognition engine to identify the contents of the photo.

    Creating a hash requires scanning the image and absolutely requires the file to be opened.  Do this on a Mac and it's pretty easy to demonstrate.  Also DO NOT RUN COMMANDS YOU DO NOT UNDERSTAND FROM FORUM POSTS.  Ask someone or research the commands.

    Now.... In a terminal....

sudo -i and enter your account password. This elevates your privileges to root. That's a lowercase i, by the way.
echo "foo" > /opt/root-owned-file.txt  This creates a file in /opt called root-owned-file.txt with the word "foo" as its only content.
chmod 0600 /opt/root-owned-file.txt  This ensures that only the root user can read and write the file.
exit  and hit return.

Now you're running as the user you logged in with.
shasum -a 256 /opt/root-owned-file.txt  should give you a hash (those numbers you were talking about), but it doesn't. You get "Permission denied" because you can't hash a file that you can't open. Apple isn't magic; they have to open the image in order to analyze it. Full stop. No binary or user on a Unix system can hash a file without opening it.

Okay, clean up the file: sudo rm /opt/root-owned-file.txt

Next up... This computer is one that I paid for and I own. Only parties I consent to should have the right to open files to analyze them (see the example above). No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and they aren't asking users if they want to submit to surveillance. And no, scanning a photo to see if a dog is in it is not surveillance. Additionally, Apple is clearly adopting a vigilante role that is extra-judicial. Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.

The article is primarily an ad hominem fallacy without many facts. "Hey, the other guys are doing it too!" is a baseless argument. I do not have a Facebook account so I don't care what they do. I'm not given a choice with Apple suddenly, and I am perfectly justified in getting my ire up when they insist that they have the uninvited right to open files that I create.
Of course you have a choice. This is implemented only on photos you upload to iCloud. You have no more obligation to upload your photos to iCloud than you have to open a Facebook account.
See there? There you go again! (Trying to use logic!)

Both sides of this debate (those "protecting innocent children" and those "protecting our privacy rights") are radicalized and can only see evil.

    Please don’t equivocate. There’re more than just two “sides” (extremes). I am interested in good tech doing useful things, as well as not being abused or let down by badly implemented tech that fails to work as intended or which can be abused to do something not intended. That makes this topic of interest because I want to understand it.

I think the logic at Apple of "let's come up with something to address a legit problem AND make the argument for backdoors an unnecessary one" is smart on their part. The laissez-faire types keep saying industry self-regulates... well here it effing is!! But oh no, there's big bad gub'ment getting notifications when criminal content is found, and suddenly they're screaming "ORWELLIAN!"

    Then we have smarmy pro-Apple editorials-as-explainers, without detailing why we should accept the Apple-provided content as verified fact (citing sources and explanations for how claims were devised is important; coming to this conversation hoping to get what’s NOT in an article makes an article problematic).

    BAH. I don’t want “DOOM DOOM DOOM” commentary, and I don’t want “WE LOVES THE PRECIOUS!!!” commentary either.
    Well … that certainly was a lot of words. 
  • Reply 134 of 162
lkrupp Posts: 10,557 member
    aguyinatx said:
    mjtomlin said:
    Just thought I’d chime in after reading so many misguided complaints about this subject. Are all of you context deaf? Did you read what is actually happening or are you just incapable of comprehending it?

    They are not scanning photos for specific images, they’re simply counting the bits and creating a hash… all they see are numbers… this in no way scans photos for offensive images, nor does it in any way violate your privacy.

    It amazes me that so many are complaining about trying to restrict/thwart child pornography?!

    it’s even more ridiculous when you consider every photo you save to the Photos app is automatically scanned through an image recognition engine to identify the contents of the photo.

    Creating a hash requires scanning the image and absolutely requires the file to be opened.  Do this on a Mac and it's pretty easy to demonstrate.  Also DO NOT RUN COMMANDS YOU DO NOT UNDERSTAND FROM FORUM POSTS.  Ask someone or research the commands.

    Now.... In a terminal....

sudo -i and enter your account password. This elevates your privileges to root. That's a lowercase i, by the way.
echo "foo" > /opt/root-owned-file.txt  This creates a file in /opt called root-owned-file.txt with the word "foo" as its only content.
chmod 0600 /opt/root-owned-file.txt  This ensures that only the root user can read and write the file.
exit  and hit return.

Now you're running as the user you logged in with.
shasum -a 256 /opt/root-owned-file.txt  should give you a hash (those numbers you were talking about), but it doesn't. You get "Permission denied" because you can't hash a file that you can't open. Apple isn't magic; they have to open the image in order to analyze it. Full stop. No binary or user on a Unix system can hash a file without opening it.

Okay, clean up the file: sudo rm /opt/root-owned-file.txt

Next up... This computer is one that I paid for and I own. Only parties I consent to should have the right to open files to analyze them (see the example above). No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and they aren't asking users if they want to submit to surveillance. And no, scanning a photo to see if a dog is in it is not surveillance. Additionally, Apple is clearly adopting a vigilante role that is extra-judicial. Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.

The article is primarily an ad hominem fallacy without many facts. "Hey, the other guys are doing it too!" is a baseless argument. I do not have a Facebook account so I don't care what they do. I'm not given a choice with Apple suddenly, and I am perfectly justified in getting my ire up when they insist that they have the uninvited right to open files that I create.
Of course you have a choice. This is implemented only on photos you upload to iCloud. You have no more obligation to upload your photos to iCloud than you have to open a Facebook account.
See there? There you go again! (Trying to use logic!)

Both sides of this debate (those "protecting innocent children" and those "protecting our privacy rights") are radicalized and can only see evil.

Just like the country's politics these days. Snowflakes and Deplorables, each seeing the other as evil.
  • Reply 135 of 162
Beats Posts: 3,073 member
    elijahg said:
    elijahg said:
    Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts - it's 1 per 1 trillion photos. I have 20,000 photos, that brings the chances I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
    One in a trillion over 20,000 photos is not 1 in 50 million. It's one in a trillion, 20,000 times. The odds do not decrease per photo, as your photo library increases in size. There is not a 1:1 guarantee of a falsely flagged photo in a trillion-strong photo library.

    And even if it was, one in 50 million is still pretty spectacularly against.
Unfortunately it is - 1 in 1 trillion becomes 2 in 1 trillion with two photos. Or 1 in 500 billion. That then halves again with 4 photos, 1 in 250 billion and so on. It's little more than simplified fractions. Punch 1,000,000,000,000/20,000 into a scientific calculator and it'll be simplified to 50,000,000/1. The odds do decrease because it is more likely you have a matching photo with 2 photos than with 1 photo. And yes, statistically speaking 1 in 1 trillion means that in a trillion-strong library there will be one false match.

    Also, it's massively more likely someone will get their password phished than a hash collision occurring - probably 15-20% of people I know have been "hacked" through phishing. All it takes is a couple of photos to be planted, with a date a few years ago so they aren't at the forefront of someone's library and someone's in very hot water. You claim someone could defend against this in court, but I fail to understand how? "I don't know how they got there" isn't going to wash with too many people. And unfortunately, "good security practices" are practised only by the likes of us anyway, most people use the same password with their date of birth or something equally insecure for everything. 

    I know cops and a detective who plant evidence on people. Some of it digital. They’re gonna love this!

    If anyone says “if you know these cops, then why not report them to the police?”
    i say 🤣.
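For what it's worth, the arithmetic being argued over above is easy to check. A hypothetical Python sketch, assuming (as elijahg does) a per-photo false-match rate of 1 in 1 trillion; note that Apple states its 1-in-1-trillion figure per account per year, not per photo:

import math

p = 1e-12   # assumed per-photo false-match probability
n = 20_000  # photo library size from the example above

# P(at least one false match among n photos) = 1 - (1 - p)^n, computed stably.
p_any = -math.expm1(n * math.log1p(-p))
print(p_any)   # about 2.0e-08, i.e. roughly 1 in 50 million
print(n * p)   # the linear approximation used in the thread; same ballpark

Under that per-photo reading, both sides are partly right: the per-library chance is far larger than 1 in 1 trillion, yet still only about 1 in 50 million.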
  • Reply 136 of 162
lkrupp Posts: 10,557 member
    Hey, when this is implemented let’s start a betting pool of how long it will take for some innocent person to be caught up in the wheels. Some of you think it will happen instantaneously. I say it will NEVER happen in our lifetime, that this will be forgotten and no longer news, becoming standard practice.

    And the ignorance shown in this thread about statistics and chance is astounding, much like the ignorance of how tech works.
  • Reply 137 of 162
crowley Posts: 10,453 member
    lkrupp said:
    Hey, when this is implemented let’s start a betting pool of how long it will take for some innocent person to be caught up in the wheels. Some of you think it will happen instantaneously. I say it will NEVER happen in our lifetime, that this will be forgotten and no longer news, becoming standard practice.

    And the ignorance shown in this thread about statistics and chance is astounding, much like the ignorance of how tech works.
Tbh my greater worry is that Apple is being so overly cautious and respectful of people's privacy that this new system won't ever flag anyone, and it'll all be a colossal waste of time and energy.  I've seen it mentioned that a user needs 30 flags against their name before Apple will review any photos. 30!
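To put a number on how much a 30-flag threshold changes things, a hypothetical back-of-the-envelope in Python. It assumes independent per-photo false matches at an assumed rate p; the real NeuralHash false-match rate isn't public, so the figure is illustrative only:

import math

n, p, k = 20_000, 1e-12, 30   # library size, assumed per-photo rate, review threshold

# log10 of P(exactly k false matches); for tiny p this term dominates
# P(at least k), so it is a fair proxy for the tail probability.
log10_p_k = (math.log10(math.comb(n, k))
             + k * math.log10(p)
             + (n - k) * math.log10(1 - p))
print(round(log10_p_k))   # about -263: under these assumptions, effectively never

Which is crowley's point: with a threshold that high, an innocent library essentially cannot be flagged by chance alone under these assumptions.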
  • Reply 138 of 162
A German press union is trying to stop Apple, saying the plan is illegal in the EU (even if the rollout is not planned for the EU yet, the existence of the backdoor alone seems to be illegal):

  • Reply 139 of 162
chadbag Posts: 1,999 member
I'll admit to not reading all 138 (as of now) replies.  However, this article is misleading when it claims this is not new tech and Apple is just catching up with others who have been doing this for a decade.  The fact is, this IS new technology and Apple is the first to use it.  Google, Dropbox, Microsoft, etc. are NOT scanning on my private device.  They scan on their servers when you upload to them.  The difference is that Apple is putting this on device.  And once on device, it can be updated to check anything they want.  All the safeguards Apple claims are just policies.  And can easily be changed.
  • Reply 140 of 162
Mike Wuerthele Posts: 6,858 administrator
    chadbag said:
I'll admit to not reading all 138 (as of now) replies.  However, this article is misleading when it claims this is not new tech and Apple is just catching up with others who have been doing this for a decade.  The fact is, this IS new technology and Apple is the first to use it.  Google, Dropbox, Microsoft, etc. are NOT scanning on my private device.  They scan on their servers when you upload to them.  The difference is that Apple is putting this on device.  And once on device, it can be updated to check anything they want.  All the safeguards Apple claims are just policies.  And can easily be changed.
It's the same technology whether it runs on-device or on-server, just in a different place.