nizzard

About

Username
nizzard
Joined
Visits
13
Last Active
Roles
member
Points
231
Badges
1
Posts
58
  • Apple privacy head explains privacy protections of CSAM detection system

    mknelson said:
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.

    No, it needs to be an exact picture *today*. Tomorrow the technology can EASILY be expanded, either of their own volition or under government pressure. Once the mechanism exists, broadening it is trivial. This was Apple's moral high ground a few years ago, when they REFUSED to write "the software equivalent of cancer," if you remember. They just wrote it.
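
    A minimal sketch of what "mathematically so close as to be indistinguishable" means in practice: perceptual hashes map visually similar images to nearly identical bit strings, and a match is declared when the bit difference (Hamming distance) falls under a threshold. The toy "average hash" below is purely illustrative; Apple's actual NeuralHash is a neural network, and every name and threshold here is an assumption, not Apple's algorithm:

```python
# Toy perceptual-hash matching (NOT Apple's NeuralHash; a simple
# "average hash" stands in for the real neural network).

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints):
    each bit is 1 if that pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A 4x4 "image", a slightly re-encoded copy (small pixel noise),
# and a visually different image.
original     = [[10, 200, 30, 220], [15, 210, 25, 215],
                [12, 205, 35, 225], [11, 198, 28, 230]]
recompressed = [[12, 198, 33, 218], [14, 212, 24, 217],
                [13, 203, 36, 224], [10, 200, 27, 231]]
unrelated    = [[200, 10, 220, 30], [210, 15, 215, 25],
                [205, 12, 225, 35], [198, 11, 230, 28]]

THRESHOLD = 2  # illustrative match threshold, in bits

h = average_hash(original)
print(hamming(h, average_hash(recompressed)) <= THRESHOLD)  # near-duplicate matches: True
print(hamming(h, average_hash(unrelated)) <= THRESHOLD)     # different image: False
```

    The point both sides of the thread are arguing over is that the match is *fuzzy*: tolerance to re-encoding is exactly what makes the threshold, and the hash list itself, the levers that could later be retuned.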

  • Apple privacy head explains privacy protections of CSAM detection system

    I am SO fùcking happy to see this many people opposed to this; not because they're opposed to preventing harm to children, but because they understand this is the only way the government could force Apple to begin the process of backdooring their platform. If Apple had JUST said "all we do is scan iCloud for CSAM," I think most people would say "sucks... but fine." But the fact that they're building in the capability to analyze messages prior to transmission is terribly frightening. All it's going to take is a court order and a gag order to get them to inject hashes of words to search for. And big fucking deal if changes require an iOS update, and big fucking deal if they (allegedly) can't target individual users. "Even better," says the government. "Give us every American user that used the following phrase..."


    "oh..we have no idea how the hash of an AR-15 got into the CSAM database...we'll certainly investigate any abuse of this system..."
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    nizzard said:
    Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens… but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future!?

    No.

    And they’ll still get up on stage in September and boast about their “end-to-end” encryption and all their bullshit privacy advocacy. It’s all bullshit now. I don’t trust them any more than Google or Facebook.

    What back door? Tell me how my privacy is compromised when I have zero CSAM.

    Your assertion that nobody will upload CSAM to iCloud isn’t backed up by how things have worked to date. Apple has already reported instances of it, a relatively low number (in the mid-200s, I think), while Facebook reported over 20 million instances in the same year. So those numbers aren’t proving that the people sharing or storing CSAM are particularly clever.

    By the way, you say they won’t be saving their photos to iCloud, which backs up that this is an opt-in implementation. If you’re worried, turn it off.

    They are building in the capability to scan the contents of messages prior to encryption and transmission. THAT’S what people are MOST upset about. Now that the capability to read/analyze messages prior to secure transmission exists, the fear is that ANY government agency can and will pressure Apple for access to it in order to surveil people. THAT’S the concern here.
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: no fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What's it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I'm sure this happens… but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future!?

    No.

    And they’ll still get up on stage in September and boast about their “end-to-end” encryption and all their bullshit privacy advocacy. It’s all bullshit now. I don’t trust them any more than Google or Facebook.
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    nizzard said:
    I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It’s all the Dept. of Justice: “Build this in and you won’t get smoked in that case.”

    I think this is an excellent guess. Apple has been a bit of a thorn in the side of certain government agencies. Apple, rightly or wrongly, was pretty adamant about the privacy of the iPhone, so there's no doubt Apple is, to some extent, bowing to pressure from the government on this. Whether it is a quasi quid pro quo is anyone's guess, but I think it's a solid one. For better or worse, this is how deal making works in the seedy back room.
    I want so badly to just attribute this to the conspiracy theorist in me. But the timing of this… and dare I say, even the right-to-repair bill… seems SO suspicious. And of course, no one believes an intel agency won’t be able to exploit this, or pressure Apple into letting them do so. It was one thing when there was no mechanism for it and Apple pushed back, but the next time they get the All Writs Act shoved up their ass, I fear their “policy” of not bending to government pressure will go out the window.