Rayz2016

About

Banned
Username: Rayz2016
Joined:
Visits: 457
Last Active:
Roles: member
Points: 18,421
Badges: 2
Posts: 6,957
  • Users lobby 1Password to abandon new Electron version

    The biggest complaint, however, is that they're dropping the standalone version completely: no more local vaults. Everything has to be stored on 1password.com.


  • Apple details user privacy, security features built into its CSAM scanning system

    As usual, Rene Ritchie nails the problem and comes up with a solution:

    https://youtu.be/4poeneCscxI

    Scan the photos on an intermediate server before they’re sent to iCloud. That would remove the on-device scanner that everyone is concerned about. 

    Why they won’t do this:

    That would require more servers. It’s much cheaper for Apple to use the resources on your device to do the scanning for them. 

    If they want to add other goodies, such as activity tracking, later on, then they can only really do this on-device. 
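
    As a sketch of how that intermediate-server approach could work (a minimal, hypothetical example: the hash database and the handling steps are assumptions, and SHA-256 stands in for a perceptual hash such as NeuralHash):

    ```python
    import hashlib

    # Hypothetical database of hashes of known CSAM images (illustrative only).
    # A real pipeline would use a perceptual hash (PhotoDNA, NeuralHash),
    # not a cryptographic hash like SHA-256.
    KNOWN_HASHES: set[str] = set()

    def photo_hash(photo: bytes) -> str:
        """Cryptographic hash standing in for a perceptual hash."""
        return hashlib.sha256(photo).hexdigest()

    def intermediate_server_upload(photo: bytes) -> str:
        """Scan server-side, before the photo ever reaches iCloud storage.

        Nothing runs on the user's device; the trade-off is that Apple
        pays for the compute instead of borrowing the phone's.
        """
        if photo_hash(photo) in KNOWN_HASHES:
            return "held for human review"      # hypothetical reporting step
        return "forwarded to iCloud storage"    # hypothetical storage step

    print(intermediate_server_upload(b"holiday-photo-bytes"))
    ```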
  • Apple details user privacy, security features built into its CSAM scanning system

    chadbag said:
    The problem is the hubris of Apple and Tim Cook and his EVP and other staff.   They are so convinced that all their "social" wokeness and initiatives are 100% correct and a mission from god (not God).  They are not listening.  They don't care.  They think they are right and just need to convince you of that.
    Yes, the hubris of Apple...and every other company that was already doing this. Google, Microsoft, Facebook, Dropbox, Snapchat, ... Oh look: "To date, over 1,400 companies are registered to make reports to NCMEC’s CyberTipline and, in addition to making reports, these companies also receive notices from NCMEC about suspected CSAM on their servers."

    https://www.missingkids.org/theissues/csam#bythenumbers
    Google, Facebook, Snapchat and Microsoft aren’t running spyware on your phone. This is the difference that detractors of this move bring up every time, and that supporters of it are desperate to ignore.

    Oh, and Apple already scans their servers for CSAM images, so why move it to the phone?

    And this is what this is all about, in my opinion: the spyware. Apple needs you to accept the argument that with spyware running on your phone, your privacy is safe. Wrapping it in the noble cause of child protection was a good move; they hoped that if anyone criticised them then their supporters would use cries of “think of the children!” to silence them. 

    So why are they doing it?

    So when they introduce a client-side logger to record the apps you run and the sites you visit, they can tell you your privacy is safe even though this representation of your activity is sold on to advertisers. And you will, of course, support them and agree: “No, look; it’s not an invasion of privacy! They’re only sending this bit of data. The photo of you is blurred and they’ve used AI to scrub out your face!” You will agree because you were fooled the first time round, but rather than admit it, you’ll carry on desperately ignoring the obvious fact that this is spyware Apple is running on your phone. You’ll ignore the fact that Apple is trying to redefine what privacy is so they can sell yours.

    Over the years, we’ve been throwing around the phrase, “With Google, you’re the product.”  Google monetises your data. 

    With Apple, it’s different: access to you is the product. They’ve spent years and billions of dollars cultivating a user base of affluent people who actually buy stuff … and if you want access to that user base, Apple thinks you should pay them. 

  • Apple details user privacy, security features built into its CSAM scanning system

    User data should only be used to improve that user's experience, not shared with other orgs. No matter what the scenario is.

    No user data is being shared with other orgs — until a very concerning threshold has been met, and Apple has full rights to audit and monitor their own servers for abuse. They would be irresponsible to allow anybody to use their servers for any purpose without some protections in place.

    This is not an invasion of privacy, no matter how people want to spin it.

    They’re running spyware on your phone that reports on you without informing you, and along the way they’ve failed to provide an answer to questions of overreach. When a government requests that this “feature” be extended by law, Apple’s best response is that they’ll resist, which hasn’t worked at all in China or Saudi Arabia.

    Replacing “scanning” with “analysing” doesn’t help. And having strangers look at your pictures when the system fails? That’s the most egregious example of privacy-breaking I’ve seen in years.
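
    For reference, the threshold gate that defence leans on can be sketched like this. It is a simplified model: Apple's published design uses threshold secret sharing and encrypted "safety vouchers", with a reported initial threshold of around 30 matches; the plain counter here only shows the gating logic:

    ```python
    from collections import defaultdict

    MATCH_THRESHOLD = 30  # Apple's reported initial threshold (assumption)

    # Per-account count of photos whose hash matched the known-image database.
    match_counts: defaultdict[str, int] = defaultdict(int)

    def record_match(account_id: str) -> bool:
        """Record one hash match; report only once the threshold is crossed.

        In Apple's published design, matches below the threshold are not
        even decryptable by Apple; this counter merely mimics that gate.
        """
        match_counts[account_id] += 1
        return match_counts[account_id] >= MATCH_THRESHOLD

    # Nothing becomes visible for human review until the threshold is reached.
    for _ in range(MATCH_THRESHOLD):
        crossed = record_match("account-123")
    print("review triggered:", crossed)  # True only at the 30th match
    ```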

  • Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

    But things must be getting a bit sticky if they're rolling out Hair Force One.

    It's odd, because with this much of a backlash, Google and Microsoft would've thrown in the towel and sat round the campfire for a rethink.

    Apple keeps on insisting that the problem is the dissenters: we just don't understand. We understand just fine, Craig; we just disagree with you.

    Apple is determined to drive this through, no matter what, and you have to wonder why. I mean, they already scan images on their servers, so why are they so determined to get spyware running on your phone?

    I think the reason is that, after a couple of false starts, Cupertino is ready to go all in on its next big product: advertising. But how do you do this while keeping up the 'privacy' mantra? How do you get into user tracking when you've spent the past three or four years crucifying all the other companies who make money doing it?

    Well, to start with, you release a client-side tracker, give it a noble purpose, and then try to convince people that their privacy is protected because it is not moving around a real image, just a hashed representation of it.
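
    That "hashed representation" rests on perceptual hashing. Here's a toy average-hash sketch; Apple's NeuralHash derives its fingerprint from a neural-network embedding instead, but the pitch is the same: only the short hash leaves the device, never the picture:

    ```python
    def average_hash(pixels: list[int]) -> int:
        """Toy perceptual hash: one bit per pixel, set if above the mean.

        Stand-in for Apple's NeuralHash, which derives a short fingerprint
        from a neural-network embedding of the image instead.
        """
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Two slightly different "images" (8 grayscale pixels each) hash close
    # together, which is the point: the hash survives recompression.
    photo = [10, 200, 30, 220, 15, 190, 25, 210]
    recompressed = [12, 198, 28, 221, 14, 192, 27, 208]
    print(hamming_distance(average_hash(photo), average_hash(recompressed)))  # 0
    ```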

    If you can get people to accept that, then it's a lot easier to get them to accept step 2: a client-side tracker that logs what you're doing on the phone, which developers and marketers can hook into and extract information from. But here's the clever part: the info they extract is a machine-learned representation of you that gets a unique number so it can be tracked across applications. But it doesn't contain any real details: not your name, address, health records, nothing; because as long as they know that 884398443894398 exercises three times a week, goes to a lot of cookery classes and has a subscription to PornHub, that's all they really care about. Why do they need to know your real name? They can serve relevant ads to that person without knowing who they are. Only Apple knows that, and they will not allow that information out. The APIs to access this pseudo-you might even incur a subscription charge.
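
    Purely speculative, but the pseudo-you described above might amount to little more than this: an opaque ID keyed to interest scores, with nothing directly identifying exposed to the ad buyer (all names and numbers here are made up):

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class PseudonymousProfile:
        """Speculative sketch of the 'avatar' described above.

        The advertiser sees only the opaque ID and derived interests;
        the mapping from ID back to a real person would stay with Apple.
        """
        avatar_id: int                      # e.g. 884398443894398
        interests: dict[str, float] = field(default_factory=dict)

        def relevant_ads(self, catalogue: dict[str, str]) -> list[str]:
            # Serve ads for any interest scoring above an arbitrary cutoff.
            return [ad for topic, ad in catalogue.items()
                    if self.interests.get(topic, 0.0) > 0.5]

    profile = PseudonymousProfile(
        avatar_id=884398443894398,
        interests={"fitness": 0.9, "cookery": 0.8},
    )
    print(profile.relevant_ads({"fitness": "gym-shoes-ad", "gardening": "trowel-ad"}))
    ```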

    But to make this work, they would need the user base to accept loggers running on their phones. And that's where we are now: Step 1. That's why the client-side tool cannot be dropped. Without it, the whole plan is screwed.

    Of course, this would work for apps; but once you get out onto the web, there's no API. For that to work, Apple would need some kind of private relay that could substitute your avatar for your real details when you make web requests.
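
    Again speculative, but such a relay would essentially rewrite each outgoing request: strip anything identifying and substitute the avatar ID, along these lines (the header names are invented for illustration):

    ```python
    def relay_rewrite(headers: dict[str, str], avatar_id: int) -> dict[str, str]:
        """Hypothetical private-relay step: swap your identity for the avatar.

        Identifying headers are dropped; only the opaque avatar ID goes out,
        so the site sees 'someone with these interests', not you.
        """
        scrubbed = {k: v for k, v in headers.items()
                    if k.lower() not in {"cookie", "authorization", "x-user-id"}}
        scrubbed["X-Avatar-ID"] = str(avatar_id)
        return scrubbed

    print(relay_rewrite(
        {"Cookie": "session=abc", "User-Agent": "Safari", "X-User-ID": "jane"},
        884398443894398,
    ))
    ```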


    The message Apple is trying to get across is that your privacy is not compromised, because they're just dealing with a representation of your data, not the data itself.

