Mike Wuerthele

About

Username: Mike Wuerthele
Joined:
Visits: 178
Last Active:
Roles: administrator
Points: 23,992
Badges: 3
Posts: 7,272
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out; that number is compared to a database of numbers representing known images, looking for an exact match. False positives are one in a trillion. This is overly simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
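    (To make that concrete: the matching step amounts to a set-membership check on fingerprints. Below is a minimal sketch using plain SHA-256 as a stand-in; the real system reportedly uses Apple's NeuralHash, a perceptual hash computed on-device, and the hash list comes from NCMEC. The names and values are hypothetical.)

```python
import hashlib

def fingerprint(path: str) -> str:
    # Reduce the file's raw bytes to a single opaque number (a hex string).
    # Stand-in only: the production system reportedly uses a perceptual
    # hash (NeuralHash), not a cryptographic hash like SHA-256.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical database of known-image fingerprints (placeholder value) --
# just numbers, no source pictures anywhere.
known_hashes = {
    "4c2a904bafba06591225113ad17b5cea2772995c33fa815878f45b9a4efbd397",
}

def flagged(path: str) -> bool:
    # The only question ever asked: is this exact number on the list?
    # Nothing in this path decodes or classifies the photo.
    return fingerprint(path) in known_hashes
```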
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to the pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Wes is not the one that misunderstands. There is no pixel-to-pixel comparison.

    The iCloud pictures are mathematically hashed. The hashes are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
    How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
    Because you're doing it one 0 and one 1 at a time. A single pixel is many 0s and many 1s, depending on the bit depth and compression algorithm. There is no contextual evaluation or assessment of the file, or even of a single pixel, beyond the generation of the hash.

    There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
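    (A minimal sketch of that point, hypothetical code rather than Apple's: a hash function consumes a raw byte stream and emits a digest whether the bytes came from a photo, a Word document, or random noise; nothing decodes pixels or interprets content along the way.)

```python
import hashlib
import os

def hash_bytes(data: bytes) -> str:
    # Walks the raw 0s and 1s and emits a digest; there is no decoding
    # step and no notion of "image" or "document" anywhere in the process.
    return hashlib.sha256(data).hexdigest()

print(hash_bytes(b"\xff\xd8\xff\xe0 fake JPEG bytes"))  # an "image"
print(hash_bytes(b"PK\x03\x04 fake .docx bytes"))       # a "Word file"
print(hash_bytes(os.urandom(64)))                       # random noise
```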
    Which means they have to open the file, read the contents, generate the hash, transmit the hash. 

    They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash. 

    You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash. 
    Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.

    I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode it isn't the same as knowing what the file is.
    Read my whole quote. My concern is not that someone at Apple is looking at the visual representation of a file on my phone. My concern is that they are opening a file without my permission and examining the contents: even if it's just the 1's and 0's. 

    Today they're only looking at 1's and 0's. Tomorrow? Who knows...

    The bottom line is that the code MUST open the file. The code MUST read the contents of the file. 
    I guess all I can say here is Apple knowing that a file has 0s and 1s in it is not the same as knowing those 0s and 1s are a picture of a palm tree, and then a picture of a dog, and then a living room shoot.
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    elijahg said:
    Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts - it's 1 per 1 trillion photos. I have 20,000 photos, that brings the chances I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
    One in a trillion over 20,000 photos is not 1 in 50 million. It's one in a trillion, 20,000 times. The odds do not decrease per photo as your photo library increases in size. There is not a 1:1 guarantee of a falsely flagged photo in a trillion-strong photo library.

    And even if it were, one in 50 million is still pretty spectacular odds against.
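    (For anyone who wants to run the numbers themselves: the standard way to combine independent per-item odds is the "at least one" formula 1 - (1 - p)^n. A quick sketch, taking the one-in-a-trillion figure as a per-photo rate purely for the sake of argument; the function and variable names are illustrative.)

```python
p = 1e-12  # per-photo false-positive rate, taken at face value

def prob_any_flag(n_photos: int, rate: float = p) -> float:
    # Chance that at least one of n independent photos is falsely flagged.
    return 1 - (1 - rate) ** n_photos

print(prob_any_flag(20_000))             # ~2e-8: library-wide chance; each photo stays at 1 in a trillion
print(prob_any_flag(1_000_000_000_000))  # ~0.63: even a trillion photos is not a guaranteed false flag
```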
  • Apple expanding child safety features across iMessage, Siri, iCloud Photos

    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as lifelong criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, that both companies are at least as incompetent as you claim Apple is, and that there's been no rash of false positives across their larger user bases, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.