Internal Apple memo addresses public concern over new child protection features

Comments

  • Reply 21 of 60
    dysamoria Posts: 3,430 member
    lorca2770 said:

    And who defends children from the mental and cultural castration produced by the gods' lunacy?

    What.
  • Reply 22 of 60
    chadbag Posts: 2,000 member
    Beats said:
    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    Well in that case we can stretch it to collaborating with me (government) because I’m allowing them to snoop through my photos.

    From the article:
    “In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…”
    -Tim Cook

    Maybe I’m reading it wrong but “legal” is a government entity. I would bet some looneys from Congress or the FBI twisted Apple’s arm to provide this.
    NO NO NO NO NO

    "legal" is shorthand for the the company's legal dept.  In this case, Apple's legal department.  The lawyers that work for Apple on staff (usually) that vet everything Apple does to  make sure it is legal (in their opinion) and what legal consequences of stuff could be.  In addition to any other day to day legal type work the company needs.

  • Reply 23 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Sorry, but you’re ignorant. A hash is a representation of the image but is not the image. Much like TouchID stores a numeric string that represents your fingerprint, but is *not* your fingerprint. FaceID does this too.
    I'm not ignorant. This is what I do for a living and have done for the last 29 years. Tell me, how do you generate a hash without having some input as the source?
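    For readers following the hash argument, here is a minimal sketch of what "generating a hash from an input" means, in Python. It uses a generic SHA-256 digest, not Apple's actual NeuralHash (which is a perceptual hash), and the file name is hypothetical.

    ```python
    import hashlib

    def file_hash(path: str) -> str:
        """Return a fixed-size hex digest computed from a file's raw bytes."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # The code does read every byte of the file (tylersdad's point),
            # yet the result is a 64-character number that neither contains
            # nor can reconstruct the image (Wes's point), much like a
            # TouchID template versus the fingerprint itself.
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    print(file_hash("photo.jpg"))  # hypothetical file; prints 64 hex characters
    ```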
  • Reply 24 of 60
    dysamoria Posts: 3,430 member
    chadbag said:
    Beats said:
    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    Well in that case we can stretch it to collaborating with me (government) because I’m allowing them to snoop through my photos.

    From the article:
    “In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…”
    -Tim Cook

    Maybe I’m reading it wrong but “legal” is a government entity. I would bet some looneys from Congress or the FBI twisted Apple’s arm to provide this.
    NO NO NO NO NO

    "legal" is shorthand for the the company's legal dept.  In this case, Apple's legal department.  The lawyers that work for Apple on staff (usually) that vet everything Apple does to  make sure it is legal (in their opinion) and what legal consequences of stuff could be.  In addition to any other day to day legal type work the company needs.
    The conspiracy theorist thinking is maddening, isn’t it? I just about blew up reading that one’s comment. I’m glad two other people responded to it because I have no patience any more.
  • Reply 25 of 60
    lkrupp said:
    tylersdad said:
    This is monumentally bad for privacy. It's making me reconsider my investments in Apple products. My entire family belongs to the Apple ecosystem. We all have some version of the iPhone 12, iPads and Apple Watches. It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end? 


    If it makes the perverts switch to Android, so be it and good riddance. And I doubt you will leave the platform when you realize you’re still better off with iOS than Android. Google will be doing exactly the same thing shortly as it usually follows Apple in these matters. Who will you go to? Your statement is just a fart in a wind storm. Good luck with it making any difference.
    Google's already doing it, as is Microsoft.   See Charles Arthur's comments here:

    https://theoverspill.blog/2021/08/06/apple-scanning-child-abuse-imagery-start-up-1609/
  • Reply 26 of 60
    dysamoria Posts: 3,430 member
    If you’re going to use a possibly uncommon acronym or abbreviation in your article, you should ALSO define it at least once in said article. I had to look up CSAM (Child Sexual Abuse Material) elsewhere. It isn’t universal or exclusive to this topic. The lack of a definition in the article felt... conspicuous.
  • Reply 27 of 60
    I don’t see any concerns being addressed in the memo, only mentioned.
  • Reply 28 of 60
    Mike Wuerthele Posts: 6,861 administrator
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
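    As an illustration of the comparison step described above, with everything invented for the sketch: the real system matches NCMEC-provided NeuralHash values, not SHA-256 digests, and the placeholder entries below are not real database hashes.

    ```python
    import hashlib

    # Invented placeholder digests standing in for the hash database.
    known_hashes = {
        "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03",
        "e258d248fda94c63753607f7c4494ee0fcbe92f1a76bfdac795c9d84101eb317",
    }

    def matches_database(path: str) -> bool:
        """Hash the file's raw bytes and test for exact set membership.

        No image content is interpreted: a non-match reveals nothing about
        the photo, and a match only says "this hash is in the database".
        """
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return digest in known_hashes

    print(matches_database("photo.jpg"))  # hypothetical file name
    ```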
  • Reply 29 of 60
    Kuyangkoh Posts: 838 member
    I don't care… Apple can look at and examine all my photos in iCloud or on my phone, got nothing to hide… Haha
  • Reply 30 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
  • Reply 31 of 60
    Mike Wuerthele Posts: 6,861 administrator
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
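    In code, the "any input hashes the same way" point looks like this; the byte strings are made up, and their prefixes merely mimic JPEG and .docx magic numbers.

    ```python
    import hashlib

    jpeg_like = b"\xff\xd8\xff\xe0" + b"not a real image"
    docx_like = b"PK\x03\x04" + b"not a real Word file"

    for label, data in (("jpeg-like", jpeg_like), ("docx-like", docx_like)):
        # The hasher consumes raw bytes; it never decodes pixels or parses text.
        print(label, hashlib.sha256(data).hexdigest())  # fixed-size digest either way
    ```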
  • Reply 32 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
  • Reply 33 of 60
    Mike Wuerthele Posts: 6,861 administrator
    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.

    I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode them isn't the same as knowing what the file is.
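    A toy illustration of the decoding point: the eight bytes below happen to be the PNG file signature, chosen purely for the example, and nothing in the code knows or cares what they are.

    ```python
    data = bytes([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])  # PNG signature
    print(list(data))                   # [137, 80, 78, 71, 13, 10, 26, 10]: just numbers
    print(int.from_bytes(data, "big"))  # the same bytes read as one big integer
    # Without a decoder that understands the format, the stream carries no picture.
    ```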
  • Reply 34 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.

    I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode them isn't the same as knowing what the file is.
    Read my whole quote. My concern is not that someone at Apple is looking at the visual representation of a file on my phone. My concern is that they are opening a file without my permission and examining the contents: even if it's just the 1's and 0's. 

    Today they're only looking at 1's and 0's. Tomorrow? Who knows...

    The bottom line is that the code MUST open the file. The code MUST read the contents of the file. 
  • Reply 35 of 60
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Ah!  But can I feed it corned beef?  If so, would it still generate a hash?
  • Reply 36 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Ah!  But can I feed it corned beef?  If so, would it still generate a hash?
    No. Not without the potatoes. 
  • Reply 37 of 60
    Mike Wuerthele Posts: 6,861 administrator
    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.

    I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode them isn't the same as knowing what the file is.
    Read my whole quote. My concern is not that someone at Apple is looking at the visual representation of a file on my phone. My concern is that they are opening a file without my permission and examining the contents: even if it's just the 1's and 0's. 

    Today they're only looking at 1's and 0's. Tomorrow? Who knows...

    The bottom line is that the code MUST open the file. The code MUST read the contents of the file. 
    I guess all I can say here is Apple knowing that a file has 0s and 1s in it is not the same as knowing those 0s and 1s are a picture of a palm tree, and then a picture of a dog, and then a living room shot.
  • Reply 38 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algo. There is no contextual evaluation or assessment of the file, or even a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.

    I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode them isn't the same as knowing what the file is.
    Read my whole quote. My concern is not that someone at Apple is looking at the visual representation of a file on my phone. My concern is that they are opening a file without my permission and examining the contents: even if it's just the 1's and 0's. 

    Today they're only looking at 1's and 0's. Tomorrow? Who knows...

    The bottom line is that the code MUST open the file. The code MUST read the contents of the file. 
    I guess all I can say here is Apple knowing that a file has 0s and 1s in it is not the same as knowing those 0s and 1s are a picture of a palm tree, and then a picture of a dog, and then a living room shot.
    Let me try again. Apple is looking at your phone and the files on your phone to determine if you broke the law. They are doing this without your consent. 

    Would you consent to law enforcement searching your phone without a warrant? Some random cop stops you on the street and wants to look at your pictures to see if you've broken the law. Would you consent? Probably not. I know I wouldn't. 

    What Apple is doing isn't much different except that they aren't looking at the physical representation of the file.

    A rose by any other name is still a rose. This is a massive violation of privacy that has the potential to go very wrong in the wrong hands. 


  • Reply 39 of 60
    aderutter Posts: 605 member
    Maybe https://inhope.org/EN/articles/what-is-image-hashing will help the non-techies?

    Note that image hashing is not reversible, so an image hash cannot be used to create another image that matches, or to modify an existing image so that it matches the original.

    Also, nobody is forced into using iCloud. I for one assumed this kind of system had already been in place for years! 
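    For the curious, a toy "average hash" shows the flavor of perceptual image hashing the linked article describes: similar images yield similar bit-strings, yet the hash cannot be reversed into an image. This is not Apple's NeuralHash (a neural-network perceptual hash); the sketch assumes Pillow is installed, and the file names are hypothetical.

    ```python
    from PIL import Image  # Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """Toy perceptual hash: one bit per pixel of an 8x8 grayscale thumbnail."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > avg)  # brighter than average -> 1, else 0
        return bits  # 64-bit fingerprint; the original pixels are unrecoverable

    def hamming(a: int, b: int) -> int:
        """Count differing bits; a small distance means visually similar images."""
        return bin(a ^ b).count("1")

    # hamming(average_hash("original.jpg"), average_hash("resized_copy.jpg"))
    ```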
  • Reply 40 of 60
    tylersdad Posts: 310 member
    aderutter said:
    Maybe https://inhope.org/EN/articles/what-is-image-hashing will help the non-techies?

    Note that image hashing is not reversible, so an image hash cannot be used to create another image that matches, or to modify an existing image so that it matches the original.

    Also, nobody is forced into using iCloud. I for one assumed this kind of system had already been in place for years! 
    It seems like the non-techies (as you put it) are the ones who have the fewest problems with this. Probably because they don't understand how hashes are generated.

    Yes, nobody is forcing people to use iCloud, but this means I have to choose between having my privacy violated and not using the features of iCloud that I'm paying for.