tylersdad

About

Username
tylersdad
Joined
Visits
58
Last Active
Roles
member
Points
2,020
Badges
2
Posts
310
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, and that number is compared against a database of numbers representing known images, looking for an exact match. False positives are one in a trillion. This is overly simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse? 

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel to pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algorithm. There is no contextual evaluation or assessment of the file, or even of a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.

    I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode it isn't the same as knowing what the file is.
    Read my whole quote. My concern is not that someone at Apple is looking at the visual representation of a file on my phone. My concern is that they are opening a file without my permission and examining the contents, even if it's just the 1's and 0's. 

    Today they're only looking at 1's and 0's. Tomorrow? Who knows...

    The bottom line is that the code MUST open the file. The code MUST read the contents of the file. 
    baconstang, darkvader
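    (A rough illustration of what both sides in this exchange are describing: the scanning code does have to read every byte of the file, but the only thing it produces is a fixed-size number that is checked for an exact match against a database. The sketch below uses an ordinary SHA-256 hash with made-up file names and database entries; Apple's actual system reportedly uses a perceptual hash called NeuralHash plus encrypted safety vouchers, not a plain lookup like this.)

        # Illustrative sketch only, not Apple's implementation.
        import hashlib
        from pathlib import Path

        # Hypothetical database of known-image hashes (placeholder values).
        KNOWN_HASHES = {
            "placeholder_not_a_real_hash",
        }

        def hash_file(path: Path) -> str:
            """Read the file's raw bytes and return a SHA-256 digest.

            The hasher consumes the bytes as an opaque stream: it never decodes
            pixels or 'sees' the picture, but it does read every byte.
            """
            h = hashlib.sha256()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def photo_matches_database(path: Path) -> bool:
            """True only if the digest exactly matches a database entry."""
            return hash_file(path) in KNOWN_HASHES

        # Hypothetical usage:
        # photo_matches_database(Path("IMG_0001.jpg"))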
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, and that number is compared against a database of numbers representing known images, looking for an exact match. False positives are one in a trillion. This is overly simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse? 

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel to pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algorithm. There is no contextual evaluation or assessment of the file, or even of a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    darkvader
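    (To illustrate the point made in the quoted reply about the hasher being content-agnostic: the sketch below feeds a few different kinds of bytes through SHA-256 and gets a fixed-size digest every time. The byte strings are fabricated stand-ins, not real file data.)

        # Toy demonstration: a hash function accepts any byte stream and always
        # emits a fixed-size digest, regardless of what the bytes "mean".
        import hashlib

        samples = {
            "jpeg-like bytes": b"\xff\xd8\xff\xe0" + b"\x00" * 100,  # fake JPEG header
            "docx-like bytes": b"PK\x03\x04" + b"\x00" * 100,        # a .docx is a zip
            "plain text":      b"hello world",
        }

        for label, data in samples.items():
            digest = hashlib.sha256(data).hexdigest()
            print(f"{label:16} -> {digest}")   # always 64 hex characters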
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, and that number is compared against a database of numbers representing known images, looking for an exact match. False positives are one in a trillion. This is overly simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse? 

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel to pixel comparison.

    The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    baconstang, darkvader
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, and that number is compared against a database of numbers representing known images, looking for an exact match. False positives are one in a trillion. This is overly simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse? 

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Sorry, but you’re ignorant. A hash is a representation of the image but is not the image. Much like TouchID stores a numeric string that represents your fingerprint, but is *not* your fingerprint. FaceID does this too. 
    I'm not ignorant. This is what I do for a living and have for the last 29 years. Tell me, how do you generate a hash without having some input as the source? 
    chemengin1, baconstang, patchythepirate, jdw, darkvader
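    (A small sketch of the analogy in the quoted post, with a plain SHA-256 standing in for whatever representation TouchID or the photo scanner actually uses: the digest is a fixed-size number derived from the input, the input cannot be reconstructed from it, and flipping a single bit of the input yields a completely different digest. That last property is also why an ordinary cryptographic hash only matches byte-identical files; Apple's system reportedly uses a perceptual hash so that re-encoded copies of the same image still match.)

        # The digest is not the data: it is fixed-size, one-way, and changes
        # completely if even one bit of the input changes.
        import hashlib

        original = b"stand-in image bytes " * 1000   # fabricated input
        altered = bytearray(original)
        altered[0] ^= 0x01                            # flip a single bit

        d1 = hashlib.sha256(original).hexdigest()
        d2 = hashlib.sha256(bytes(altered)).hexdigest()

        print(len(d1))       # 64 hex characters, no matter how large the input
        print(d1 == d2)      # False: a one-bit change produces a new digest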
  • Internal Apple memo addresses public concern over new child protection features

    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures.

    To explain this quickly, this “scanning” applies only (for now) to photos before they are uploaded to iCloud Photos. If you don’t use iCloud Photos, then this will never happen (for now).

    Your device scans your photos and calculates a photo “fingerprint” ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. 

    If by some chance you had a match, it would attach a kind of safety token to the media while uploading it to iCloud Photos. If you reach some number of safety tokens, someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).

    The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.
    Just because it's happening ON MY DEVICE doesn't mean the image isn't being examined. You can't compare the image to another reference image without opening the image. You need to look at the 1's and 0's, and to do that, you need to open the file. 
    baconstang, patchythepirate, Japhey, darkvader
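    (A heavily simplified sketch of the flow described in the quoted explanation: the device compares a photo's fingerprint against the downloaded database, attaches a safety voucher to the upload, and only when enough matching vouchers accumulate does the account get flagged for human review. Every name and the threshold value below are hypothetical; the real design reportedly uses private set intersection and threshold secret sharing, so the server cannot read, or even count, individual match results below the threshold.)

        # Hypothetical, heavily simplified model of the on-device match plus
        # server-side threshold flow; not Apple's actual cryptographic design.
        from dataclasses import dataclass
        from typing import List, Set

        THRESHOLD = 30  # made-up number; the real threshold is set by Apple

        @dataclass
        class SafetyVoucher:
            photo_id: str
            matched: bool  # in the real system this bit is cryptographically hidden

        def scan_before_upload(photo_id: str, fingerprint: str,
                               known_fingerprints: Set[str]) -> SafetyVoucher:
            """On-device step: compare the fingerprint to the downloaded database
            and attach a voucher to the upload either way."""
            return SafetyVoucher(photo_id, fingerprint in known_fingerprints)

        def review_needed(vouchers: List[SafetyVoucher]) -> bool:
            """Server-side step: flag the account for human review only once
            enough matching vouchers have accumulated."""
            return sum(v.matched for v in vouchers) >= THRESHOLD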