AlwaysWinter
About
- Username: AlwaysWinter
- Joined
- Visits: 5
- Last Active
- Roles: member
- Points: 153
- Badges: 1
- Posts: 6
Reactions
-
Apple details user privacy, security features built into its CSAM scanning system
I don’t think this is an issue of Apple’s “messaging” or of users’ understanding. I still have three serious concerns that don’t seem to have been addressed.
#1 Apple has acknowledged the privacy impact this technology would have if misapplied by totalitarian governments. The response has been, “we’ll say no.” In the past the answer hasn’t been no, with both China and Saudi Arabia, and those concessions came when Apple was already powerful and wealthy. If a government compelled Apple, or if Apple is one day no longer in a dominant position, it may not be able to say no even if it wants to.
#2 We’ve recently seen zero-day exploits used by multiple private companies to bypass the protections built into Apple’s platforms. Interfaces like this one increase the attack surface that malicious actors can exploit.
#3 Until now, users’ expectation has been that the data on their devices is private and that on-device processing exists to keep data from being uploaded to cloud services. The new system inverts that expectation: on-device processing now serves as a means of uploading to the cloud. Though narrowly tailored to illegal content for now, this changes the operating system’s role from the user’s perspective and places the device itself in a policing and policy-enforcement role. It breaks a trust computer users have held since the beginning of computing: that the device is “yours” in the same way your car or home is your property.
Ultimately, I don’t think technology is a true solution to a human-nature problem. I think Apple is burning hard-won reputation with this move. In my opinion, crime should be rectified through law enforcement and the judicial process, not through technology providers like Apple.
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
Their defense is that they would refuse authoritarian demands, but we’ve seen two instances so far where Apple couldn’t refuse: iCloud in China and FaceTime in Saudi Arabia. Setting the past aside, what’s to say the political or financial climate for Apple won’t change and make it harder to say no than it is now? There may come a time when they want to say no but can’t.
-
Apple's CSAM detection system may not be perfect, but it is inevitable
I question why any of this is Apple’s responsibility at all. In my opinion, Apple’s job isn’t to solve deep-rooted issues like this; its responsibility is to make computers. We already have a democratic system to make laws, law enforcement to detect violations of those laws, a justice system to render decisions on those violations, and a penal system to contain and punish those convicted of breaking the law. Those systems took thousands of years of civilization and thought to refine. Why are we now, as a society, putting the entire burden for this on technology companies? It reminds me of how, during the heat of the pandemic, we wanted tech to solve COVID-19, and the best it could offer was a contact-tracing system that didn’t make much of a dent in the disease. Tech can’t solve all problems, no matter how much we all wish it could.
-
Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'
I don’t think this is an issue of Apple’s “messaging” or of users’ understanding. I still have three serious concerns that don’t seem to have been addressed.
#1 Apple has acknowledged the privacy impact this technology would have if misapplied by totalitarian governments. The response has been, “we’ll say no.” In the past the answer hasn’t been no, with both China and Saudi Arabia, and those concessions came when Apple was already powerful and wealthy. If a government compelled Apple, or if Apple is one day no longer in a dominant position, it may not be able to say no even if it wants to.
#2 We’ve recently seen zero-day exploits used by multiple private companies to bypass the protections built into Apple’s platforms. Interfaces like this one increase the attack surface that malicious actors can exploit.
#3 Until now, users’ expectation has been that the data on their devices is private and that on-device processing exists to keep data from being uploaded to cloud services. The new system inverts that expectation: on-device processing now serves as a means of uploading to the cloud. Though narrowly tailored to illegal content for now, this changes the operating system’s role from the user’s perspective and places the device itself in a policing and policy-enforcement role. It breaks a trust computer users have held since the beginning of computing: that the device is “yours” in the same way your car or home is your property.
Ultimately, I don’t think technology is a true solution to a human-nature problem. I think Apple is burning hard-won reputation with this move. In my opinion, crime should be rectified through law enforcement and the judicial process, not through technology providers like Apple.