Apple 'poisoned the well' for client-side CSAM scanning, says former Facebook security chief

Posted in General Discussion, edited August 2021
Alex Stamos, former Facebook security chief, says Apple's approach to CSAM scanning and iMessage exploitation protections may have caused more harm than good for the cybersecurity community.

Apple's CSAM scanning tool has become a controversial topic


Once iOS 15 and Apple's other fall operating systems are released, Apple will introduce a set of features intended to prevent child exploitation on its platforms. These implementations have created a fiery online debate surrounding user privacy and the future of Apple's reliance on encryption.

Alex Stamos is currently a professor at Stanford but previously served as security chief at Facebook. During his tenure there, he encountered countless families damaged by abuse and sexual exploitation.

He wants to stress the importance of technologies such as Apple's in combating these problems. "A lot of security/privacy people are verbally rolling their eyes at the invocation of child safety as a reason for these changes," Stamos said in a tweet. "Don't do that."

The tweet thread covering his views on Apple's decisions is extensive, but it offers insight into the concerns raised by Apple and outside experts alike.

In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies.

Nuanced opinions are ok on this.

-- Alex Stamos (@alexstamos)


The nuance of the discussion has been lost on many experts and concerned internet citizens alike. Stamos says the EFF and NCMEC both reacted with little room for conversation, having used Apple's announcements as a stepping stone to advocate for their equities "to the extreme."

Information from Apple's side hasn't helped with the conversation either, says Stamos. For example, the leaked memo from NCMEC calling concerned experts "screeching voices of the minority" was, in his view, harmful and unfair.

Stanford hosts a series of conferences around privacy and end-to-end encryption products. Apple has been invited but has never participated, according to Stamos.

Instead, Apple "just busted into the balancing debate" with its announcement and "pushed everybody into the furthest corners" with no public consultation, said Stamos. The introduction of non-consensual scanning of local photos combined with client-side ML might have "poisoned the well against any use of client side classifiers."

The implementation of the technology itself has left Stamos puzzled. He notes that on-device CSAM scanning isn't necessary unless it is in preparation for end-to-end encryption of iCloud backups. Otherwise, Apple could easily perform the scanning server-side.
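
To make that architectural distinction concrete, here is a toy sketch, not Apple's actual implementation: it uses an exact SHA-256 match as a crude stand-in for Apple's perceptual NeuralHash and private-set-intersection protocol (which work very differently), and the hash value and function names are invented. The only point it illustrates is where the comparison runs.

```swift
import Foundation
import CryptoKit

// Toy stand-in for a database of known-image hashes (hypothetical value).
// Apple's real system uses a perceptual "NeuralHash" and private set
// intersection; an exact SHA-256 match is used here only to show where
// the comparison runs, not how it works.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Client-side model: the hash is computed on the device before upload,
// so the photo itself could later travel end-to-end encrypted.
func matchesOnDevice(photo: Data) -> Bool {
    knownHashes.contains(sha256Hex(photo))
}

// Server-side model: the service must be able to read the photo bytes
// after upload (unencrypted, or encrypted with a key the server holds)
// in order to compute the same hash.
func matchesOnServer(uploadedPhoto: Data) -> Bool {
    knownHashes.contains(sha256Hex(uploadedPhoto))
}
```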

The iMessage feature doesn't offer any user-facing reporting mechanisms either. So rather than alerting Apple to users abusing iMessage for sextortion or sending sexual content to minors, the system leaves the child with a decision that Stamos says they are not equipped to make.

As a result, their options for preventing abuse are limited.

What I would rather see:
1) Apple creates robust reporting in iMessage
2) Slowly roll out client ML to prompt the user to report something abusive
3) Staff a child safety team to investigate the worst reports

-- Alex Stamos (@alexstamos)


At the end of the Twitter thread, Stamos mentioned that Apple could be implementing these changes due to the regulatory environment. For example, the UK Online Safety Bill and EU Digital Services Act could both have influenced Apple's decisions here.

Alex Stamos isn't happy with the conversation surrounding Apple's announcement and hopes the company will be more open to attending workshops in the future.

The technology itself will be introduced in the United States first, then rolled out on a per-country basis. Apple says it will not allow governments or other entities to coerce it into changing the technology to scan for other items such as terrorism.


Comments

  • Reply 1 of 25
    mr lizard
    “The implementation of the technology itself has left Stamos puzzled. He notes that on-device CSAM scanning isn't necessary unless it is in preparation for end-to-end encryption of iCloud backups.”

    Bingo. 
  • Reply 2 of 25
    rcfa Posts: 1,124 member
    “Apple says it will not allow governments or other entities to coerce it into changing the technology to scan for other items such as terrorism.”

    Nice intentions…
    …not worth the bits they are written with.

    I just want to see Cook explain to shareholders the crash in Apple’s stock price as he announces leaving the Chinese market for refusing to scan for pictures of the Dalai Lama, Pooh, HK Protests, etc.

    You know, just like Apple left the Chinese market when China asked that VPN apps be removed from the AppStore, or when China and Russia mandated that all iCloud servers for their country’s users be within their jurisdiction, or when the US government wanted iCloud backups to remain unencrypted.

    Apple is always quick to point out that they comply with all the laws of the countries they operate in, so they will punt and point the finger to the authoritarian regimes’ laws as they obediently comply. And US authorities will use this as an excuse to not have national security disadvantages over other countries, etc.

    In the good old days before electronic communications, law enforcement couldn’t tap into anything, and they still managed to prosecute crime; they just had to put in more actual shoe leather for investigations. These days some guy thinks the only time he should get up from his office chair is to present evidence in court (unless that’s a video conference, too). The demands of law enforcement are simply an expression of laziness.
    edited August 2021
  • Reply 3 of 25
    rcfa Posts: 1,124 member
    mr lizard said:
    “The implementation of the technology itself has left Stamos puzzled. He notes that on-device CSAM scanning isn't necessary unless it is in preparation for end-to-end encryption of iCloud backups.”

    Bingo. 
    Not sure why he’s saying that. On-device scanning gives users a fuzzy feeling of privacy (“I have nothing to hide, but no data leaves my device”) and saves Apple from building server-side infrastructure to scan large volumes of data across a massive cloud. Instead they just have to deal with a (hopefully rather small) number of alerts.

    So even without E2E this makes sense.
  • Reply 4 of 25
    You know, in Brazil we have an expression: "Boasting to kill the jaguar, after it's already dead!" It doesn't really translate, but it really resonates with some of the reactions we've been seeing this past week. You know who doesn't give a damn? Regular Apple customers. You know, people with real jobs and life worries, who don't abuse children, and invest in Apple hardware because they think it is better than the competition's. They are also people who don't agonize about some very technical and narrow definition of privacy, particularly when it's been polluted by personal (ahem, monetary) interests, and when their own Congress is just about incapable of even following the most straightforward aspects of technology.

    Talk about an overblown "first world" problem.
  • Reply 5 of 25
    lkrupp Posts: 10,557 member
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms. It’s like trying to put a square peg into a round hole. It ain’t happening.
  • Reply 6 of 25
    Rayz2016 Posts: 6,957 member
    I think it’s worth reading what Stamos actually said to get some idea of context: 

    I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups. A reasonable target should be scanning shared iCloud albums, which could be implemented server-side.
    In any case, coming out of the gate with non-consensual scanning of local photos, and creating client-side ML that won't provide a lot of real harm prevention, means that Apple might have just poisoned the well against any use of client-side classifiers to protect users.

    I think he’s saying that there is a case for client-side scanning, but it should be for protecting users. No one complains when a virus checker scans files. That’s probably because you put it there yourself, and because it reports to you before it reports to the mothership. 


  • Reply 7 of 25
    radarthekat Posts: 3,842 moderator
    rcfa said:
    “Apple says it will not allow governments or other entities to coerce it into changing the technology to scan for other items such as terrorism.”

    Nice intentions…
    …not worth the bits they are written with.

    I just want to see Cook explain to shareholders the crash in Apple’s stock price as he announces leaving the Chinese market for refusing to scan for pictures of the Dalai Lama, Pooh, HK Protests, etc.

    You know, just like Apple left the Chinese market when China asked that VPN apps be removed from the AppStore, or when China and Russia mandated that all iCloud servers for their country’s users be within their jurisdiction, or when the US government wanted iCloud backups to remain unencrypted.

    Apple is always quick to point out that they comply with all the laws of the countries they operate in, so they will punt and point the finger to the authoritarian regimes’ laws as they obediently comply. And US authorities will use this as an excuse to not have national security disadvantages over other countries, etc.

    In the good old days before electronic communications, law enforcement couldn’t tap into anything, and they still managed to prosecute crime; they just had to put in more actual shoe leather for investigations. These days some guy thinks the only time he should get up from his office chair is to present evidence in court (unless that’s a video conference, too). The demands of law enforcement are simply an expression of laziness.
    Wait, let me understand…. You’re saying that China didn’t know about CSAM database scanning and couldn’t, without Apple introducing the hash on device, have told Apple and others that it wants them to implement on-device photo scanning for anything they want to spy on? Of course China could have. So there’s NOTHING inherent in Apple doing this that suddenly allows China to make such a demand. They could have done so last week or a year ago, requiring Apple and other vendors to implement code to scan photos, or next month even if Apple undoes its changes. So please tell me, what has changed?
    edited August 2021
  • Reply 8 of 25
    anonymouse Posts: 6,860 member
    I must have missed the part of the article where it explains or supports that he actually said Apple "poisoned the well". 
    edited August 2021
  • Reply 9 of 25
    auxio Posts: 2,727 member
    lkrupp said:
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms.
    I was just about to say the same thing.  If people are outraged about CSAM scanning, they surely know about Facebook scanning everything (and not just hashes).  A former Facebook employee weighing in on it?  That's a joke right?
  • Reply 10 of 25
    rcfa Posts: 1,124 member
    You know who doesn't give a damn? Regular Apple customers. You know, people with real jobs and life worries, who don't abuse children, and invest in Apple hardware because they think it is better than the competition's. They are also people who don't agonize about some very technical and narrow definition of privacy, […]

    Talk about an overblown "first world" problem.
    Talk about utter naïveté!

    This isn’t about “some very technical and narrow definition of privacy” this is about an infrastructure that can be used for arbitrary things. Just because, initially to sell it to the public, they limit it to child abuse, doesn’t mean there’s anything technical that limits the expansion to other domains.

    To adapt your Brazilian saying: “It’s stupid to think that just because a jaguar was slain, the jungle is now free of dangers.”

    “People with real jobs and life worries” are bred precisely so that they prevent the solution of problems. Poor people desperate for a job won’t demonstrate or block the deforestation of the Amazon, they didn’t prevent the rise of Hitler to power, they were easily recruited by the Stasi in exchange for a little privilege and a small pay raise. They are the henchmen of any evil system, and then act surprised when the powers show up at the door and drag them to a KZ or gulag.

    Tell the GeStaPo you’re not a criminal, when they knock at your door for having looked at the wrong web site, tell yourself “me being deported is just an overblown first world problem.”
  • Reply 11 of 25
    lkrupp said:
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms. It’s like trying to put a square peg into a round hole. It ain’t happening.
    Stamos is *highly* respected in the information security community. Speaking as someone whose job title includes the words chief, information, security, and officer, when he says something I almost always find it worth considering and never dismiss it outright based on where he's worked. For what it's worth, by all accounts his time at Facebook wasn't a harmonious one. 

    He's guest hosted the Risky Business security podcast numerous times. If you want to get a sense of the guy, I recommend listening to those episodes. 
  • Reply 12 of 25
    rcfa Posts: 1,124 member
    radarthekat said:
    Wait, let me understand…. You’re saying that China didn’t know about CSAM database scanning and couldn’t, without Apple introducing the hash on device, have told Apple and others that it wants them to implement on-device photo scanning for anything they want to spy on? Of course China could have. So there’s NOTHING inherent in Apple doing this that suddenly allows China to make such a demand. They could have done so last week or a year ago, requiring Apple and other vendors to implement code to scan photos, or next month even if Apple undoes its changes. So please tell me, what has changed?
    It’s difficult, if not impossible, to demand that an infrastructure working at the lowest levels of the OS be built.
    Governments have and do force the inclusion/exclusion of Apps and App-level functionality.

    An infrastructure matter is hard to force, because a company like Apple can say it doesn’t fit into their OS’ architecture, etc.

    None of these apply, once the infrastructure is actually in place. China, to stick with the example, may simply demand that they are in charge of the database and that they (rather than some NPO/NGO) be notified, citing “privacy laws” and “sovereign jurisdiction over criminal matters” as well as “national security concerns” as reasons, and on the surface, they are correct. After all, who guarantees that NPO isn’t an NSA front, and the hash database doesn’t contain items of concern to the Chinese government?

    Once China is in charge of the hash database used in China, and violating notifications don’t go to Apple or an NGO but to Chinese authorities, it’s game over. After all, what’s a human rights violation and what’s a legitimate national security concern, is just a matter of perspective. We get (rightly) outraged at how Russia treats Navalny or China treats supporters of the Dalai Lama, yet many are blind to the plight of Snowden and Assange.

    It’s one thing to architect a system that has no provisions for backdoors, it’s another to try to deny a government access to a back door that actually exists.
    edited August 2021
  • Reply 13 of 25
    auxio said:
    lkrupp said:
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms.
    I was just about to say the same thing.  If people are outraged about CSAM scanning, they surely know about Facebook scanning everything (and not just hashes).  A former Facebook employee weighing in on it?  That's a joke right?
    I'm not tracking the argument here. If anything, an ex-Facebook security chief would be more credible on this subject for the reasons you and others are stating, for the same reason the government uses embedded informants. They know what they're talking about. 
  • Reply 14 of 25
    rcfa Posts: 1,124 member
    auxio said:
    lkrupp said:
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms.
    I was just about to say the same thing.  If people are outraged about CSAM scanning, they surely know about Facebook scanning everything (and not just hashes).  A former Facebook employee weighing in on it?  That's a joke right?
    Not really, in this case. If you upload things to FB you do so fully aware that it’s for sharing with a more or less limited public. It’s like sending postcards.

    What you keep on your phone/computer, which in essence is a brain prosthetic, is in its very essence private and not shared.
    Imagine Apple scanning your diary or your thoughts! We’re literally just a technological gap distanced from “Minority Report”.

    What if machine learning derives personality profiles of rapists? What if social media profiles allow the creation of very detailed psychological personality profiles? (They do!) You want Apple to scan (on device of course, for the protection of your privacy) people’s social media interactions, and alert an NGO of people’s sex offender potential?

    What if someone installs a hidden folder with offensive files on someone’s device, they get flagged and they deny any knowledge? Whom do you think the courts will believe?

    The problem is not what Apple is doing now, but what it opens the door to.

    We can only hope that some organization with deep pockets sues Apple, gets a temporary restraining order, and then finally wins the case on a wide ranging constitutional foundation.
  • Reply 15 of 25
    seankill Posts: 566 member
    You know, in Brazil we have an expression: "Boasting to kill the jaguar, after it's already dead!" It doesn't really translate, but it really resonates with some of the reactions we've been seeing this past week. You know who doesn't give a damn? Regular Apple customers. You know, people with real jobs and life worries, who don't abuse children, and invest in Apple hardware because they think it is better than the competition's. They are also people who don't agonize about some very technical and narrow definition of privacy, particularly when it's been polluted by personal (ahem, monetary) interests, and when their own Congress is just about incapable of even following the most straightforward aspects of technology.

    Talk about an overblown "first world" problem.

    You need a blind gif.  
    I’m an average American, strong middle class, comfortable but not rich. I work a lot and I don’t abuse children (don’t have any of my own yet). But I have nieces and I’d never stand for anyone, including myself, abusing them. 

    That said, this is a different game here. It’s a dangerous slippery slope. I understand the idea for this feature but I think it’s going to slowly erode the progress and trust Apple has made.  For me, it’s the principle of the matter. 

    I’m debating cancelling my iCloud services in protest. Probably won’t affect Apple but hey. I can only vote with my dollar. If it gets out of hand, I’m just going to switch away from Apple iPhones. Just did with the laptops. So that’s about 3K they didn’t get from me. Again, it’s all I can do as an individual. The masses will decide from here. 
  • Reply 16 of 25
    crowley Posts: 10,453 member
    rcfa said:
    We’re literally just a technological gap distanced from “Minority Report”.
    We're "literally" just a technological gap away from running police investigations according to psychics who can predict the future?  Sure dude.
  • Reply 17 of 25
    This is only going to be in the new versions of Apple’s OS’s, right? My simple solution to this will be not to upgrade for the time being.
  • Reply 18 of 25
    mcdave Posts: 1,927 member
    I was originally concerned about CSAM scanning until I realised it was only looking for officially validated CSAM content to report.

    Personally I’d also be happy for a general CSAM ML model to flag/alert me about potential CSAM content hitting my daughter’s account but not reporting it further.

    As for server-side scanning, I don’t see how mandatory scanning varies whether the code is in the cloud or on device. Either way it’s out of my control.
    edited August 2021
  • Reply 19 of 25
    auxio Posts: 2,727 member
    auxio said:
    lkrupp said:
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms.
    I was just about to say the same thing.  If people are outraged about CSAM scanning, they surely know about Facebook scanning everything (and not just hashes).  A former Facebook employee weighing in on it?  That's a joke right?
    I'm not tracking the argument here. If anything, an ex-Facebook security chief would be more credible on this subject for the reasons you and others are stating, for the same reason the government uses embedded informants. They know what they're talking about. 
    The argument is that, if you had no problem working for a company which scans everything people upload to them to build an advertising profile, how can you now speak out against Apple doing a very limited version of that when you're uploading to them?  Simply because it's happening on-device rather than server side?  Uh yeah, that's because it's the only way to maintain end-to-end data encryption.  If they had to generate the image hashes server side, it would mean you'd have to upload your photos unencrypted (or encrypted in a way which allows Apple to decrypt them).
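
As a rough illustration of the flow auxio describes, here is a minimal, hypothetical sketch. The type and function names are invented, and Apple's real pipeline uses NeuralHash, blinded hashes, and safety vouchers rather than plain SHA-256 and AES-GCM; the sketch only shows the ordering: hash on the device, encrypt with a key the server never holds, upload both.

```swift
import Foundation
import CryptoKit

// Hypothetical upload record: only the ciphertext and the locally computed
// hash ever leave the device.
struct EncryptedUpload {
    let ciphertext: Data   // photo encrypted with a key the server never holds
    let photoHash: String  // hash computed on device, before encryption
}

func prepareUpload(photo: Data, deviceKey: SymmetricKey) throws -> EncryptedUpload {
    // 1. Hash on device (SHA-256 as a stand-in for a perceptual hash).
    let hash = SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()

    // 2. Encrypt with a device-held key, so the server can compare the hash
    //    without ever being able to decrypt the photo itself.
    let sealedBox = try AES.GCM.seal(photo, using: deviceKey)

    // combined is non-nil when the default 12-byte nonce is used.
    return EncryptedUpload(ciphertext: sealedBox.combined!, photoHash: hash)
}
```
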
  • Reply 20 of 25
    auxio Posts: 2,727 member
    rcfa said:
    auxio said:
    lkrupp said:
    Remember, people, this is the former 'Facebook security chief’. Facebook and security are mutually exclusionary terms.
    I was just about to say the same thing.  If people are outraged about CSAM scanning, they surely know about Facebook scanning everything (and not just hashes).  A former Facebook employee weighing in on it?  That's a joke right?
    Not really, in this case. If you upload things to FB you do so fully aware that it’s for sharing with a more or less limited public. It’s like sending postcards.
    Was Facebook crystal clear about their business model from the very beginning?  Even today, when awareness of privacy is much higher, I'd be willing to bet that 90% (or more) of my friends and family who use Facebook don't understand their business model.  People are not as "fully aware" as you may believe.  Granted, many of them simply don't care.  But if that's the case, they won't care about Apple doing it either.

    What you keep on your phone/computer, which in essence is a brain prosthetic, is in its very essence private and not shared.
    Imagine Apple scanning your diary or your thoughts! We’re literally just a technological gap distanced from “Minority Report”.
    OK, now we're going off the deep end with the argument.  While I have no doubt that Google, Amazon, and Facebook have personality profiling AI which would shock most people (à la the infamous Eric Schmidt "creepy line" quote from 2010), Apple comparing image hashes is far, far from that.  If you're seriously worried about that kind of future, you'd do far better to battle all the tracking and profiling being done on you by advertisers every time you use a web browser (or web-based app), open an email, etc.
    edited August 2021