macplusplus
About
- Username
- macplusplus
- Joined
- Visits
- 293
- Last Active
- Roles
- member
- Points
- 3,141
- Badges
- 1
- Posts
- 2,119
Reactions
-
Open letter asks Apple not to implement Child Safety measures
killroy said: macplusplus said: killroy said: macplusplus said: omasou said: macplusplus said: Apple should shut down iCloud instead of developing a mass surveillance method for the government.
If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something on user devices whose scope may extend to God knows where...
Because once it's on Apple's servers they can't see it, since it's encrypted. You have to see it before it's encrypted or it won't work.
Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.
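For readers who want the gist of that last point, here is a minimal sketch (not Apple's code; every name here is made up) of why any check has to run on the device: once the photo is sealed for upload, the server only ever sees ciphertext.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the ordering described above, not Apple's API.
func uploadPhoto(_ photo: Data, key: SymmetricKey, isFlagged: (Data) -> Bool) throws -> Data {
    // 1. The client-side check runs before encryption, while the bytes are still readable.
    let flagged = isFlagged(photo)

    // 2. Only then is the photo sealed; this blob is all the server ever stores.
    let sealed = try AES.GCM.seal(photo, using: key)

    // 3. In Apple's design the check result travels as encrypted metadata (a "safety
    //    voucher"); here it is just printed to keep the sketch self-contained.
    print("flagged for review:", flagged)
    return sealed.combined ?? Data()
}
```
-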
Epic Games CEO slams Apple 'government spyware'
anonymouse said: macplusplus said: crowley said: macplusplus said: crowley said: macplusplus said: crowley said: macplusplus said: crowley said: macplusplus said: 9secondkox2 said: crowley said: newisneverenough said: Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. Not anymore. Goodbye iCloud storage. Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
"The difference going forward is that Apple will be matching fingerprints against NCMEC’s database client-side, not server-side"
He then continues:
"This slippery-slope argument is a legitimate concern." and
"Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.We shall see. The stakes are incredibly high, and Apple knows it. Whatever you think of Apple’s decision to implement these features, they’re not doing so lightly."
That doesn't support your argument much.
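As a rough illustration of the client-side fingerprint matching the quoted article describes: the sketch below uses a plain SHA-256 of the file bytes as a stand-in for Apple's NeuralHash (a perceptual hash that is not public), and a placeholder hash set instead of NCMEC's database.

```swift
import Foundation
import CryptoKit

// Placeholder entries; the real database is provided by NCMEC and never exposed in clear form.
let knownFingerprints: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"
]

func fingerprint(of imageData: Data) -> String {
    // A real perceptual hash would survive resizing and recompression; SHA-256 does not.
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    // "Client-side" simply means this lookup runs on the phone, before any upload.
    knownFingerprints.contains(fingerprint(of: imageData))
}
```
-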
Epic Games CEO slams Apple 'government spyware'
crowley said: macplusplus said: crowley said: macplusplus said: crowley said: macplusplus said: crowley said: macplusplus said: 9secondkox2 said: crowley said: newisneverenough said: Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And should cancel this initiative. Not anymore. Goodbye iCloud storage. Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
-
Open letter asks Apple not to implement Child Safety measures
jungmark said: mangakatten said: What happens if someone sends child porn images to someone else's iPhone, will it be taken down? This could be a way to silence critics.
Many people use their phones by trial and error and don't have a coherent view of the operating system; they are unaware of what resides where.
And Apple's scheme doesn't warn you in such cases. Even if it detects a hash match, it still quietly uploads the photo to iCloud and just updates your "perversion" score. If that score reaches some threshold it reports you, to catch you in the act right where you are, damn pervert!
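To make the threshold behaviour being mocked here concrete, a minimal sketch follows. It assumes, per Apple's published description, that matches are only tallied and an account is surfaced for human review only once the tally crosses a threshold; the value 30 and all names are illustrative, not Apple's implementation.

```swift
import Foundation

// Hypothetical tally of per-photo match results on the device.
struct MatchTally {
    private(set) var matchCount = 0
    let reportThreshold = 30 // Apple cited roughly 30 as an initial threshold; treat this value as an assumption.

    // Record one photo's result; returns true only once the threshold has been reached.
    mutating func record(matched: Bool) -> Bool {
        if matched { matchCount += 1 }
        return matchCount >= reportThreshold
    }
}
```
-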
Open letter asks Apple not to implement Child Safety measures
killroy said: mangakatten said: What happens if someone sends child porn images to someone else's iPhone, will it be taken down? This could be a way to silence critics.
There's a difference between sent and downloaded. They can send you photos now and then call the cops, making it a swatting call.