rcfa

About

Status: Banned
Username: rcfa
Joined
Visits: 120
Last Active
Roles: member
Points: 1,678
Badges: 1
Posts: 1,124
  • Tech industry needs to rebuild user trust after privacy losses, says Tim Cook

    "And so it's in the hands of the inventor and the user as to whether it's used for good, or not used for good."

    NO, NO, and NO! WRONG!

    It’s none of the inventor’s business whether it’s used for good or not! It’s SOLELY the USER’s responsibility.
  • Apple exec said iCloud was the 'greatest platform' for CSAM distribution

    No matter how much “de-escalating and explaining” Apple does, it cannot change the fundamental fact: once the infrastructure is in place, there is no TECHNICAL limit on what it can be used for. There may, however, be LEGAL limits on Apple’s ability to tell people when the use, the reporting, or the underlying databases change in response to various governments’ “lawful requests”.

    It isn’t Apple’s job to know or prevent criminal behavior.

    Sony doesn’t build cameras that try to detect what pictures you take, paper vendors aren’t concerned that you jot down notes for an assassination plot on the paper you bought, and the post office doesn’t care if you mail a USB stick with snuff videos on it.

    It’s not Apple’s job to know, care about, prevent, or report crime, except as a service at the request of the owner (“find my stolen phone!”).

    So Apple’s very thinking is flawed, as they hold themselves responsible for something that’s fundamentally none of their business, literally and figuratively.
  • Civil rights groups worldwide ask Apple to drop CSAM plans

    Apple_Bar said:
    FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless. 
    So what you are saying is that since China, Russia, and Turkey have their populations under surveillance, countries under a democracy shouldn’t express their VALID privacy concerns about the implementation of this scanning mechanism. 
    No, I'm responding to EFF's claim that Apple is somehow harming global freedom and privacy rights with the CSAM hash scanning. Hash scanning isn't something Apple invented. Any government from any country could hire programmers to create hash scanning programs for phones or computers. Whether or not Apple includes the CSAM hash scanning functionality doesn't change anything in that regard. 
    Yes, it changes EVERYTHING whether Apple includes hash scanning functionality or not. For hash scanning to work, the data has to be unencrypted. If Apple’s on-device data is encrypted with keys to which only the user has access, and is e2e encrypted, including at rest as it’s stored in the cloud, there’s no hash scanning either.

    The only time hash scanning could possibly work is if someone opens up an album for cloud sharing with friends, at which point it must be unencrypted or encrypted with a key to which Apple has access. But such scanning would be pointless, because only the dumbest of the dumb would share questionable content directly in an open photo album rather than in an encrypted zip file via file sharing.

    In other words, it’s rather clear: unless there’s on-device scanning, people’s data can be kept safe by e2e encryption, something GrapheneOS will do even if Apple won’t; so if Apple wants to remain competitive, they should do it too.

    On-device scanning opens the door to all sorts of devils; how old the basics of hash scanning are is utterly irrelevant. Anyone who pretends otherwise either demonstrates a lack of understanding of security, or is a government/law-enforcement troll who wants to spread FUD about the whole thing, in an effort to save, and later expand on, Apple’s efforts.
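The point the post above hinges on can be shown in a few lines: generic hash-list scanning only works if the scanner can read the plaintext bytes. This is a hypothetical sketch of hash-blocklist matching in general (not Apple's actual NeuralHash pipeline, which uses perceptual rather than cryptographic hashes); the blocklist entries are made-up values.

```python
# Hypothetical sketch of generic hash-blocklist scanning: flag a file
# if its cryptographic hash appears on a list of known-bad hashes.
# Key point: this requires access to the *plaintext* file bytes.
import hashlib

BLOCKLIST = {
    # Hashes of known-bad files (hypothetical example values).
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_flagged(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 hash is on the blocklist."""
    return hashlib.sha256(file_bytes).hexdigest() in BLOCKLIST

print(is_flagged(b"known-bad-image-bytes"))   # True: plaintext matches
print(is_flagged(b"harmless-holiday-photo"))  # False: no match
```

Note that even a one-byte change to the file produces a completely different SHA-256 hash, which is why real CSAM systems use perceptual hashes; the plaintext-access requirement is the same either way.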
  • Civil rights groups worldwide ask Apple to drop CSAM plans

    The organizations that know how dangerous this is are putting pressure on Apple. Bravo!

    Apple can get out of this: yank the feature, and implement E2E encryption for all data, including iCloud backups.

    Apple can say that they were emotionally swayed in their attempt at solving this problem, that they came to see it could and likely would be abused, and that the resulting discussion convinced them to double down on user privacy.

    They can. The question is: will they?
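The E2E proposal above can be illustrated with a toy example (this is a deliberately simplified XOR stream cipher for illustration, not a real E2E scheme or Apple's design): once data is encrypted client-side with a key only the user holds, a server-side hash blocklist can no longer match it.

```python
# Toy illustration: client-side encryption defeats server-side hash
# scanning, because the server only ever sees ciphertext.
import hashlib
from itertools import count

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from the key (toy counter mode).
    out = b""
    for i in count():
        if len(out) >= n:
            break
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice decrypts again.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

photo = b"known-bad-image-bytes"
blocklist = {hashlib.sha256(photo).hexdigest()}  # hypothetical entry

ciphertext = encrypt(b"user-only-key", photo)

# The plaintext hash matches the blocklist; the ciphertext hash does not.
print(hashlib.sha256(photo).hexdigest() in blocklist)       # True
print(hashlib.sha256(ciphertext).hexdigest() in blocklist)  # False
```

Since only the user holds the key, the server cannot recover the plaintext hash at all, which is exactly why on-device scanning (before encryption) is the only remaining place such checks could run.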
  • Civil rights groups worldwide ask Apple to drop CSAM plans

    cjlacz said:
    Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

    Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

    Nothing has gone too far! It’s not going far enough. Rights are not defined by how they can be abused. It’s also not the tech companies’ job to “fix” something they didn’t even break. Law enforcement had a lucky break with analog telecommunications, which were trivial to intercept. Before that, people talked in person, sent trusted messengers, or sent messages that would then be burnt. There was never a right to intercept messages, which is why the laws against intercepting mail are very strict.

    The idea that a temporary technological weakness (analog transmissions) creates a permanent right to unfettered access to people’s data is ridiculous.

    Does the constitution make any exceptions about search and seizure? No, it doesn’t. Police can’t just randomly enter your house and say: “Oh, we’re just looking for child porn.”
    Does the constitution allow torture or “truth serums”? No, it doesn’t. Modern computing devices, especially phones, are in essence brain prosthetics. Access to these is like tapping someone’s brain.

    Our ability to say, post, or store online what we want without consequences doesn’t go nearly far enough, as cancel culture, political correctness, executed atheists, incarcerated regime critics, and even the cases of Snowden and Assange amply prove.