Edward Snowden calls Apple CSAM plans 'disaster-in-the-making'
Apple's CSAM detection feature is the topic of Edward Snowden's latest editorial on Substack, with the former intelligence contractor and whistleblower turned journalist calling the strategy a "tragedy."

In a Wednesday installment of his newsletter, Snowden dispenses with technical refutations of Apple's CSAM system and cuts to the chase, saying the solution will "permanently redefine what belongs to you, and what belongs to them."
The feature, which is slated to roll out with iOS 15 this fall, will hash and match user photos marked for upload to iCloud against a hashed database of known CSAM pulled from at least two different entities. Importantly, and unlike existing systems, Apple's variation conducts all processing on-device. This will, according to Snowden, "erase the boundary dividing which devices work for you, and which devices work for them."
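For readers who want a concrete picture of the mechanism, the sketch below is a minimal, hypothetical Swift rendering of on-device hash matching. It is not Apple's implementation: the real system uses a perceptual NeuralHash rather than a cryptographic hash, ships the database in blinded form, and wraps matching in a private set intersection protocol, none of which appears here. The names `illustrativeHash`, `knownHashes`, and `matchesKnownDatabase` are placeholders.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only; SHA-256 stands in for Apple's perceptual
// NeuralHash, and the database here is a plain Set rather than the
// blinded structure the real system distributes.

/// Hypothetical stand-in for a perceptual image hash.
func illustrativeHash(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Hashes of known CSAM shipped with the OS (empty placeholder here).
let knownHashes: Set<String> = []

/// Runs on-device, and only for photos queued for iCloud upload,
/// which is why disabling iCloud Photos sidesteps the check entirely.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    knownHashes.contains(illustrativeHash(of: photoData))
}
```

In the actual protocol a single match reveals nothing on its own; Apple has said it can inspect flagged content only after an account crosses a threshold number of matches.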
"Once the precedent has been set that it is fit and proper for even a 'pro-privacy' company like Apple to make products that betray their users and owners, Apple itself will lose all control over how that precedent is applied," Snowden writes.
He further argues that Apple's rollout of CSAM detection features has more to do with brand image than with protecting children or conforming to regulations, noting the feature can be avoided simply by disabling iCloud Photos uploads. The idea that Apple will introduce the measure in preparation for end-to-end encryption across iCloud is also pooh-poohed. Implementing such a system would not matter, Snowden says, because the iPhone would already have surveillance capability built in.
Both notions have been floated by experts and critics as part of a wider online debate.
Like others, Snowden fears governments will abuse the system by compelling Apple to expand the on-device CSAM feature or mandating that it be active for all users at all times. Arguments concerning mission creep have been central to criticism of Apple's plans since the effort was announced earlier this month.
"There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple's all-too-flexible company policy, something governments understand all too well," Snowden writes.
Apple, for its part, says it will not bend to government demands to expand the system beyond its original directive.
A number of high-profile figures in the information security sector, as well as some governments and civil rights groups, have urged Apple to reconsider its strategy, but the company appears undeterred.
The hubbub stems in part from Apple's very public commitment to user privacy. Over the past few years the company has positioned itself as a champion of privacy and security, investing in advanced hardware and software features to further those goals. Critics argue the CSAM features, particularly photo hash scanning, will not only tarnish that reputation, but pave the way for a new era of digital surveillance.
Comments
Snowden would know.
Clearly, none of you people have a clue how it works.
Snowden is a traitor who betrayed Americans by exposing our secrets for the internet to see, and you're worried about Apple, which is making sure no child porn is shared across the internet while preserving privacy?
Apple is among the last to implement it because it had to figure out a way to maintain its existing privacy policies. Those other companies may or may not have bothered with user privacy. ISTR that FB reported 20 million instances of CSAM matches last year? Anyway, the point is: if you're just learning about this, it's because those other sites just didn't tell you. If you're still willfully arguing against it, you appear to be pro-CSAM.
I hear absolutely no meaningful discussion about how social media providers and other photo services, all of which currently perform a more privacy-invasive version of CSAM scanning, are a threat to democracy and the like. Social media is a richer government target, as it's the platform used to spread the information. The other issue is that other service providers, such as Google, whose services have a *significantly* larger user base, seemingly get no attention on the CSAM topic. While Apple uses hashes from the intersection of two known CSAM databases, Google additionally uses AI to guess whether a photo is CSAM - so Google's human reviewers are fine, but Apple's are not? The double standard by commentators is clear.
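The two-database intersection the commenter mentions is simple to picture. The following hedged Swift sketch, with made-up provider sets, shows why a hash supplied by only one organization would never reach devices:

```swift
// Illustrative only: Apple has said the on-device database is built from
// the intersection of hash lists from at least two child-safety
// organizations. `providerA` and `providerB` are hypothetical placeholders.
let providerA: Set<String> = ["hashA", "hashB", "hashC"]
let providerB: Set<String> = ["hashB", "hashC", "hashD"]

// Only hashes vouched for by both sources are deployable; an entry
// unilaterally added by one provider ("hashA" or "hashD") is dropped.
let deployableHashes = providerA.intersection(providerB)
print(deployableHashes)  // contains only hashB and hashC
```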
In short: I don't buy the counter-arguments; they're filled with flaws big enough to drive a truck through. The real scandal is that Apple *didn't* have this feature for so long.
PS: wonder why there is no FaceTime in the Middle East ... pure legislation ...
Hopefully Apple changes its mind and puts this in iCloud.
That would get me updating to iOS 15 and buying an iPhone 13 Pro as I originally planned.
The fact that the whistleblower who revealed the US government's illegal surveillance state operations is in exile in Russia - while the people who ordered and implemented it walk free - should provide a pretty solid roadmap as to how this will end up being abused.