rcfa

About
- Banned
- Username: rcfa
- Joined
- Visits: 120
- Last Active
- Roles: member
- Points: 1,678
- Badges: 1
- Posts: 1,124

Reactions

-
Tech industry needs to rebuild user trust after privacy losses, says Tim Cook
-
Apple exec said iCloud was the 'greatest platform' for CSAM distribution
No matter how much “de-escalating and explaining” Apple does, it cannot change the fundamental fact that once the infrastructure is in place, there is no TECHNICAL limit on what it can be used for, while there may be LEGAL limits on Apple’s ability to tell people when the use, the reporting, or the databases change in response to various governments’ “lawful requests”.
It isn’t Apple’s job to know or prevent criminal behavior.
Sony doesn’t build cameras that try to detect what pictures you take, paper vendors aren’t concerned that you jot down notes for an assassination plot on the paper you bought, and the post office doesn’t care if you mail a USB stick full of snuff videos.
It’s not Apple’s job to know, care, prevent, or report crime, unless as a service at the request of the owner (“find my stolen phone!”)
So Apple’s very thinking is flawed, as they hold themselves responsible for something that’s fundamentally none of their business, literally and figuratively.
-
Civil rights groups worldwide ask Apple to drop CSAM plans
foregoneconclusion said:
Apple_Bar said:
foregoneconclusion said:
FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless.
The only time hash scanning could possibly work is if someone opens up an album for cloud sharing with friends, at which point it must be unencrypted or encrypted with a key to which Apple has access. But such scanning would then be stupid, because only the dumbest of the dumb would share questionable content directly in an open photo album rather than in an encrypted zip file via file sharing.
In other words, it’s rather clear: unless there’s on-device scanning, people’s data can be kept safe by E2E encryption, and if Apple won’t offer that, GrapheneOS will; so if Apple wants to remain competitive, they should do it too.
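The point above can be shown in a few lines. This is a toy sketch, not Apple's actual NeuralHash (which is perceptual, not cryptographic), and the XOR "cipher" is a stand-in for real encryption: any hash-matching scheme compares digests of the bytes it can see, so once content is encrypted with a key the scanner doesn't hold, the digest no longer matches any known-bad list.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical server-side blocklist of known-bad image digests.
photo = b"some flagged image bytes"
known_bad_hashes = {sha256(photo)}

# Toy stand-in for encryption: XOR with a repeating key.
# (Real E2E encryption would use an authenticated cipher, but the
# effect on the digest is the same: it changes completely.)
key = b"secret"
ciphertext = bytes(b ^ key[i % len(key)] for i, b in enumerate(photo))

print(sha256(photo) in known_bad_hashes)       # True: plaintext matches
print(sha256(ciphertext) in known_bad_hashes)  # False: ciphertext doesn't
```

Which is exactly why server-side scanning and end-to-end encryption are mutually exclusive, and why the pressure shifts to scanning on the device, before encryption happens.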
On-device scanning opens the door to all sorts of devils; it’s utterly irrelevant how old the basics of hash scanning are. Anyone who pretends otherwise either demonstrates a lack of understanding of security, or is a government/law-enforcement troll who wants to spread FUD about the whole thing, in an effort to save and later expand on Apple’s efforts.
-
Civil rights groups worldwide ask Apple to drop CSAM plans
The organizations that know how dangerous this is are putting pressure on Apple. Bravo!
Apple can get out of this: yank the feature and implement E2E encryption for all data, including iCloud backups.
Apple can say that they were emotionally swayed in their attempt to solve this problem, but were then convinced that it could and likely would be abused, and that the resulting discussion persuaded them to double down on user privacy.
They can. The question is: will they?
-
Civil rights groups worldwide ask Apple to drop CSAM plans
mike_galloway said:
cjlacz said:
Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.
Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.
The idea that a temporary technological weakness (analog transmissions) grants a permanent right of unfettered access to people’s data is ridiculous.
Does the constitution make any exceptions to its protections against search and seizure? No, it doesn’t. Police can’t just randomly enter your house and say, “Oh, we’re just looking for child porn.”
Does the constitution allow torture or “truth serums”? No, it doesn’t. Modern computing devices, especially phones, are in essence brain prosthetics. Access to these is like tapping someone’s brain.
Our ability to say, post, or store online what we want without consequences doesn’t go nearly far enough, as cancel culture, political correctness, executed atheists, incarcerated regime critics, and even the cases of Snowden and Assange amply prove.