nizzard

About

Username: nizzard
Joined:
Visits: 14
Last Active:
Roles: member
Points: 235
Badges: 1
Posts: 58
  • Apple privacy head explains privacy protections of CSAM detection system

    mknelson said:
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.
    No, it needs to be an exact picture *today*. Tomorrow the technology can EASILY be expanded, either of Apple's own volition or under government pressure; once the mechanism exists, widening it is trivial. This was Apple's moral high ground a few years ago, when they REFUSED to write "the software equivalent of cancer," if you remember. They just wrote it.
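
    A minimal sketch, not Apple's implementation: a Swift-flavored illustration of why the reach of an on-device hash-matching system is defined entirely by the list it ships with. It uses plain SHA-256 and made-up placeholder content instead of a perceptual hash like NeuralHash and a real hash database; the point is that the matching code never changes, only the database does.

    import Foundation
    import CryptoKit

    // Fingerprint an arbitrary blob of data. A real system would use a
    // perceptual hash so visually identical images collide even after
    // re-encoding; plain SHA-256 here only matches exact bytes.
    func fingerprint(of data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // Hypothetical database of "known" fingerprints, supplied by whoever
    // curates the list. The matching code below never knows what they depict.
    var knownFingerprints: Set<String> = [
        fingerprint(of: Data("example known image".utf8)),   // placeholder entry
    ]

    // On-device check: flag any item whose fingerprint is in the database.
    func isFlagged(_ data: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: data))
    }

    // Expanding what gets flagged requires no new mechanism at all:
    // whoever controls the database just adds more entries to it.
    knownFingerprints.insert(fingerprint(of: Data("any other content".utf8)))
    print(isFlagged(Data("any other content".utf8)))   // true after the new entry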

  • Apple privacy head explains privacy protections of CSAM detection system

    I am SO fùcking happy to see this many people opposed to this; not because they're opposed to preventing harm to children, but because they understand this is how the government will force Apple to begin backdooring their platform. If Apple JUST said all we do is scan iCloud for CSAM -- I think most people would say "sucks.. but fine". But the fact that they're building in the capability to analyze messages prior to transmission is terribly frightening. All it's going to take is a court order and a gag order to get them to inject hashes of words to search for. And big fucking deal if changes require an iOS update, and big fucking deal if they can't (allegedly) target individual users. "Even better," says the government. "Give us every American user that used the following phrase..."


    "oh..we have no idea how the hash of an AR-15 got into the CSAM database...we'll certainly investigate any abuse of this system..."
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now), and no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens, but at a scale worth backdooring a secure platform over?)? Is that worth subverting global secure communications, when future exploitation of the mechanism is CERTAIN!?

    No.

    And they'll still get up on stage in September and boast about their "end-to-end" encryption and all their bullshit privacy advocacy. It's all bullshit now. I don't trust them any more than Google or Facebook.