nizzard
About
- Username: nizzard
- Joined
- Visits: 14
- Last Active
- Roles: member
- Points: 235
- Badges: 1
- Posts: 58
Reactions
-
Apple privacy head explains privacy protections of CSAM detection system
mknelson said: First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.
"SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.
And yes, Apple totally mishandled this. This is a bad solution.
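For what it's worth, below is a rough sketch of the kind of hash-list matching the quoted comment describes. This is not Apple's actual NeuralHash or its private set intersection protocol; the hash values, bit width, and distance threshold are invented for illustration. The only point is that a match requires a hash identical, or nearly identical, to one already on an externally supplied list.

```python
# Illustrative sketch only -- not Apple's NeuralHash or its private set
# intersection protocol. The 64-bit hash values and the distance threshold
# below are invented for this example.

# Hypothetical perceptual hashes supplied by an outside organization.
KNOWN_HASHES = {
    0x9F3A6C51E2B40D87,
    0x13C0FFEE00BADD11,
}

def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_list(image_hash: int, max_distance: int = 2) -> bool:
    """True only for a hash identical (or nearly identical) to one already
    on the list; a generic hash of a topic or object matches nothing here."""
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in KNOWN_HASHES)

print(matches_known_list(0x9F3A6C51E2B40D87))  # exact match       -> True
print(matches_known_list(0x9F3A6C51E2B40D86))  # one bit different -> True
print(matches_known_list(0x0000000000000000))  # unrelated image   -> False
```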
-
Apple privacy head explains privacy protections of CSAM detection system
I am SO fùcking happy to see this many people opposed to this; not because they're opposed to preventing harm to children, but because they understand this is the only way the government will force Apple to begin the process of backdooring their platform. If Apple JUST said "all we do is scan iCloud for CSAM," I think most people would say "sucks... but fine." But the fact they're building in the capability to analyze messages prior to transmission is terribly frightening. All it's going to take is a court order and a gag order to get them to inject hashes of words to search for. And big fucking deal if changes require an iOS update, and big fucking deal if they can't (allegedly) target individual users. "Even better," says the government. "Give us every American user who used the following phrase..."
"oh..we have no idea how the hash of an AR-15 got into the CSAM database...we'll certainly investigate any abuse of this system..." -
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
ihatescreennames said: nizzard said: Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: No fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What’s it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I’m sure this happens... but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future!?
No.
And they’ll still get up on stage in September and boast about their “end to end” encryption and all their bullshit privacy advocacy. It’s all bullshit now. I don’t trust them any more than Google or Facebook.
Your assertion that nobody will upload CSAM to iCloud isn’t backed up by how things have worked to date. Apple has already reported instances of it. It was a relatively low number, in the mid-200s I think. Meanwhile, Facebook reported over 20 million instances in the same year. So, those numbers aren’t proving that the people sharing or storing CSAM are particularly clever. By the way, you say they won’t be saving their photos to iCloud, which backs up that this is an opt-in implementation. If you’re worried, turn it off.
They are building in the capability to scan the contents of messages prior to encryption and transmission. THAT’S what people are MOST upset about. Now that the capability to read/analyze messages prior to secure transmission exists, the fear is that ANY gov org can/will pressure Apple to give them access to that capability to surveil people. THAT’S the concern here.
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
Another reason to believe this has nothing to do with CSAM and everything to do with building in a back door: No fûcking pedo is going to save their photos to iCloud (and certainly not now). And no pedo is going to opt in to any monitoring. So how much benefit will this non-backdoor backdoor provide? What’s it going to stop? MAYBE kids sending each other inappropriate pictures? Perhaps some small-time predators sending or requesting pics to/from kids (I’m sure this happens... but at a scale worthy of backdooring a secure platform?)? Is that worth subverting global secure communications for CERTAIN exploitation in the future!?
No.
And they’ll still get up on stage in September and boast about their “end to end” encryption and all their bullshit privacy advocacy. It’s all bullshit now. I don’t trust them any more than Google or Facebook.
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
ArchStanton said: nizzard said: I just find it incredibly hard to believe the timing on this is not related to their antitrust case with Epic. It’s all Dept of Justice. “Build this in and you won’t get smoked in that case”