macplusplus
About
- Username: macplusplus
- Joined:
- Visits: 293
- Last Active:
- Roles: member
- Points: 3,141
- Badges: 1
- Posts: 2,119
Reactions
Open letter asks Apple not to implement Child Safety measures
macplusplus said: Apple should shut down iCloud instead of developing a mass surveillance method for the government.
omasou said: If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
macplusplus said: The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...
killroy said: Because once it's on Apple's servers they can't see it, because it's encrypted. You have to see it before it's encrypted or it won't work.
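For readers unfamiliar with why this forces the check onto the device, here is a minimal sketch in Swift of the ordering killroy is describing. It is illustrative only: `knownHashes` is a hypothetical stand-in for a database shipped with the OS, and SHA-256 plus a plain Set stand in for Apple's actual NeuralHash and blinded-matching protocol. The only point is that the match must run over the plaintext, before encryption.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a known-hash database shipped with the OS.
let knownHashes: Set<Data> = []

func prepareForUpload(imageData: Data, key: SymmetricKey) throws -> (ciphertext: Data, matched: Bool) {
    // 1. Hash the plaintext image on-device, before any encryption.
    let digest = Data(SHA256.hash(data: imageData))
    // 2. Check it against the database while the content is still readable.
    let matched = knownHashes.contains(digest)
    // 3. Encrypt for upload; past this point the content is opaque server-side.
    let sealedBox = try AES.GCM.seal(imageData, using: key)
    return (sealedBox.combined!, matched)  // combined is non-nil with the default nonce size
}
```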
Epic Games CEO slams Apple 'government spyware'
crowley said:
macplusplus said:
9secondkox2 said:
crowley said:
newisneverenough said:
Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try and excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and should cancel this initiative.
Not anymore.
Goodbye iCloud storage.
Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”

But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.

Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps, and thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what has happened to him!
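The "safety vouchers will accumulate" scenario turns on a threshold mechanism. Below is a rough Swift sketch of that accumulation; `SafetyVoucher` and `AccountState` are illustrative names, not Apple API, and Apple's published design enforces the threshold cryptographically with threshold secret sharing, so the server cannot even count matches below it. Plain counting here only models the behavior the post worries about.

```swift
import Foundation

struct SafetyVoucher {
    let photoID: UUID
}

struct AccountState {
    private(set) var vouchers: [SafetyVoucher] = []
    let threshold = 30  // on the order of Apple's publicly discussed initial threshold

    // Each photo that matches the database contributes one voucher.
    mutating func recordMatch(for photoID: UUID) {
        vouchers.append(SafetyVoucher(photoID: photoID))
    }

    // Only past the threshold does decryption of the vouchers, and thus
    // human review, become possible at all in the published design.
    var exceedsThreshold: Bool { vouchers.count > threshold }
}
```

Apple's stated safeguard against exactly this scenario was human review of the matched images once the threshold is crossed, before any report; the post's objection is that the user never sees the count rising in the first place.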
Epic Games CEO slams Apple 'government spyware'
9secondkox2 said:
crowley said:
newisneverenough said:
Apple completely screwed this up. It's conceptually wrong from the start. I can't believe this even got initiated as a project. It's idiotic to try and excuse looking into private data by justifying the method of the technology. Apple's entire stance before now is something I have supported. In this ridiculous step they've utterly failed, and should cancel this initiative.
Not anymore.
Goodbye iCloud storage.
Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”

But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
Open letter asks Apple not to implement Child Safety measures
macplusplus said: Apple should shut down iCloud instead of developing a mass surveillance method for the government.
omasou said: If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
macplusplus said: The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
StrangeDays said:
macplusplus said:
dewme said:
bulk001 said:
dewme said:
thrang said:
My take is this.

Apple saw the writing on the wall regarding government-mandated back doors, doors that would in all likelihood be far more open, with unfettered access to much if not all of your information. Perhaps behind the scenes the pressure was growing immense.

Perhaps they decided to develop a concept and technology around very narrow and focused one-way beacons (to avoid a mandated back door), initially to identify illegal and abhorrent possession and behavior. Evidence of murders, rapes, extortion, or terrorism may be other beacons that are sent out in the future.

I know, there will be discussions over who makes the decisions, how it is vetted, errors, misuse, hacking, etc. But essentially, to me, Apple is seeking to control the process ahead of what they see as inevitable laws they would need to comply with, laws with much worse outcomes for users. One-way beacons, further vetted to ensure no false positives, are an effort to ameliorate law enforcement concerns while still protecting legitimate private information, and perhaps a very good approach.

While it feels icky upon first hearing about it, the more you think about it this way (and about what government-enforced alternatives might look like), the more it may begin to make sense.

In terms of what defines when these future beacons will be sent out, Apple will likely ask for pertinent laws to govern such beaconing, leaving it up to elected leaders to clarify, legislate, and vote on what kind of content is considered severely harmful and how it is to be legally obtained, and ultimately leaving it up to voters to support or oust those politicians in future elections. So in this case, there is a well-defined hash database for this particular category. Apple then implements an on-device methodology that is designed to keep the rest of your data protected and unavailable for unrelated sniffing about, while beaconing when there is a match (see the sketch after this post).

As to other categories of hash matching, governments will need to figure that out, which would be subject to immense scrutiny and public debate, I'm sure...

There are caveats of course, but in principle this is how I see what has happened.

I think it'll buy Apple some time, but not much. As more and more countries, including the United States, continue to radicalize not only against remote outsiders but against fellow citizens they now consider outsiders because they don't embrace reality-bending authoritarian despots, the requests for back doors will transform into demands for back doors that cannot be denied.

I'm very much in favor of what Apple is proposing, but I'm equally concerned that what they are proposing will not be enough to keep the bigger issue of massive government intrusion through mandated back doors at bay. At some point we'll all have to assume that privacy as we used to know it no longer exists. Nothing Apple is doing will change the eventual outcome if the embrace of authoritarianism and demonization of fellow citizens is allowed to grow.

I'm totally behind Apple on this because they are doing the right thing with respect to child abuse, and they are removing an impediment that's causing a stalemate on the larger issue of forced government surveillance. The inextricable linkage between the two is very evident in the posts. Doing nothing, as most posters are suggesting and as has been the standard mode of operation, is not a valid option in my opinion. Whether you agree with what Apple is doing or not, these issues need to be discussed openly, not left in an interminably entrenched ideological stalemate with no progress made on anything. Pragmatism has its time and place, and Apple is saying that time is now.
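thrang's "one-way beacon" framing, referenced above, corresponds very roughly to the following Swift sketch. All names (`Beacon`, `BeaconingScanner`, `emit`) are illustrative, not Apple API, and Apple's published protocol goes further than this: below the reporting threshold it hides even the match/no-match bit from the server via private set intersection.

```swift
import Foundation

struct Beacon {
    let itemID: UUID
}

final class BeaconingScanner {
    private let database: Set<Data>       // fixed, auditable hash list shipped with the OS
    private let emit: (Beacon) -> Void    // one-way channel to the server

    init(database: Set<Data>, emit: @escaping (Beacon) -> Void) {
        self.database = database
        self.emit = emit
    }

    func scan(itemID: UUID, itemHash: Data) {
        // Non-matching content produces no signal at all; the only possible
        // output is a beacon for a database hit, which is the privacy claim
        // being debated in this thread.
        guard database.contains(itemHash) else { return }
        emit(Beacon(itemID: itemID))
    }
}
```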