tylersdad

About

Username
tylersdad
Joined
Visits
58
Last Active
Roles
member
Points
2,020
Badges
2
Posts
310
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Sorry, but you’re ignorant. A hash is a representation of the image, but it is not the image. Much like TouchID stores a numeric string that represents your fingerprint, but is *not* your fingerprint. FaceID does this too.
    I'm not ignorant. This is what I do for a living and have for the last 29 years. Tell me, how do you generate a hash without having some input as the source? 
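The hash-matching described in this exchange can be sketched in a few lines. This is a simplified stand-in, not Apple's implementation: the real system uses NeuralHash, a perceptual hash robust to resizing and re-encoding, while SHA-256 below gives only exact-byte matches, and the database entry is a made-up placeholder.

```python
import hashlib

# Hypothetical database of digests of flagged images (placeholder bytes).
KNOWN_BAD_HASHES = {hashlib.sha256(b"flagged-image-bytes").hexdigest()}

def fingerprint(image_bytes: bytes) -> str:
    """Read the image bytes once to compute a fixed-length digest;
    only the digest is stored or compared afterwards."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Exact-match lookup against the hash database."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES

print(matches_database(b"vacation-photo-bytes"))  # False
print(matches_database(b"flagged-image-bytes"))   # True
```

Note that both sides of the argument are visible here: the bytes *are* read to compute the digest, but only the digest — never the image content — is compared against the database.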
  • Internal Apple memo addresses public concern over new child protection features

    Beats said:
    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    Well in that case we can stretch it to collaborating with me (government) because I’m allowing them to snoop through my photos.

    From the article:
    “In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…”
    -Tim Cook

    Maybe I’m reading it wrong but “legal” is a government entity. I would bet some looneys from Congress or the FBI twisted Apple’s arm to provide this.
    "legal" likely means Apple legal counsel, not some government entity. Most companies (especially large ones) have their own legal department. 
  • Internal Apple memo addresses public concern over new child protection features

    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures.

    To explain this quickly: this scanning applies only (for now) to photos before they are uploaded to iCloud Photos. If you don’t use iCloud Photos, then this will never happen (for now).

    Your device scans your photos and calculates a photo “fingerprint” ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. 

    If by some chance you had a match, it would put a kind of safety token with the media while uploading it to iCloud Photos. If you reach some number of safety tokens someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).

    The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.
    Just because it's happening ON MY DEVICE doesn't mean the image isn't being examined. You can't compare the image to another reference image without opening the image. You need to look at the 1's and 0's, and to do that, you need to open the file. 
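The "safety token" flow described above can be sketched as follows. Everything here is an illustrative assumption — the class, method names, and threshold value are invented for the sketch (public reporting put the real threshold around 30 matches), and SHA-256 stands in for the perceptual hash.

```python
import hashlib

THRESHOLD = 3  # hypothetical; chosen small so the sketch is easy to follow

class UploadScanner:
    """On-device sketch: attach a 'safety voucher' to each matching
    upload, and flag for review only once the count crosses a threshold."""

    def __init__(self, bad_hashes, threshold=THRESHOLD):
        self.bad_hashes = bad_hashes
        self.threshold = threshold
        self.vouchers = 0  # count of matched uploads so far

    def upload(self, image_bytes):
        # Fingerprint the photo on-device against the downloaded database.
        digest = hashlib.sha256(image_bytes).hexdigest()
        matched = digest in self.bad_hashes
        if matched:
            self.vouchers += 1
        return {"digest": digest,
                "voucher_attached": matched,
                "review_triggered": self.vouchers >= self.threshold}

# Usage: three matching uploads trip the review threshold.
bad = {hashlib.sha256(b"bad-%d" % i).hexdigest() for i in range(3)}
scanner = UploadScanner(bad)
scanner.upload(b"holiday-photo")  # no match, no voucher
results = [scanner.upload(b"bad-%d" % i) for i in range(3)]
print(results[-1]["review_triggered"])  # True: third match crosses threshold
```

The design point the commenters are arguing over is visible in `upload`: the bytes are read to compute `digest`, but what accumulates across uploads is only the voucher count, and nothing is surfaced for human review below the threshold.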
  • Internal Apple memo addresses public concern over new child protection features

    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is overly-simplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
  • Internal Apple memo addresses public concern over new child protection features

    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    How? By looking at private information on a privately owned phone without the consent of the owner or a warrant to search such information. 

    This is essentially a warrantless search of a person's personal property. 

    We've already seen that the government is colluding with social media to censor posts which they deem unfavorable to them. We've already seen the government asking people to rat out their family members and friends for what they perceive as "extremist" behavior. Is it really that much of a stretch to think the government would ask tech companies to monitor emails and messages for extremist content?