Apple privacy head explains privacy protections of CSAM detection system

13 Comments

  • Reply 41 of 62
    Rayz2016 Posts: 6,957 member
    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM related images and nobody knew about it or complained about it.
    Actually Facebook made a pretty big fuss about it. We just weren’t listening. 

    muthuk_vanalingam, chemengin1, darkvader, elijahg
     1 Like 0 Dislikes 3 Informatives
  • Reply 42 of 62
    Rayz2016 Posts: 6,957 member
    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

    Seriously. I've read a lot about Facebook moderators who have to look at horrible posts all day long and have it take a mental toll on them. Imagine how much worse kiddy porn would be. And yet, Apple has a human review process. That means your job will be to look at that stuff frequently. I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process were a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Yes, this is a serious consideration. 


    But the other problem is that people who are exposed to this material, day in, day out, can become desensitised to it. For some, it may act as a trigger. This is why companies are spending millions to take humans out of the loop. Except Apple, of course, who're inserting humans back in. 


    edited August 2021
    muthuk_vanalingam, elijahg, jdw
     3 Likes 0 Dislikes 0 Informatives
  • Reply 43 of 62
    Rayz2016 Posts: 6,957 member
    Apple is burning the CSAM hashes into its operating systems (any first-year computer science student will tell you that this doesn't belong at the OS level any more than your address book does), so when the CSAM database gets revised, Apple will have to issue updates for all of its operating systems?
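    In hypothetical Swift (invented names, not Apple's code), the difference being pointed at looks roughly like this: a hash set baked into the OS image can only change with an OS update, while a signed data file could be revised independently.

```swift
import Foundation

// Hypothetical illustration only -- not Apple's implementation.
enum CSAMHashStore {
    // Baked into the OS build: revising these hashes means shipping a new OS update.
    static let builtInHashes: Set<String> = [
        "placeholder-hash-1",
        "placeholder-hash-2",
    ]

    // Alternative: a versioned, signed database shipped as data rather than code,
    // which could be refreshed without a full OS release.
    static func loadUpdatableHashes(from url: URL) throws -> Set<String> {
        let contents = try String(contentsOf: url, encoding: .utf8)
        return Set(contents.split(separator: "\n").map(String.init))
    }
}
```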

     0 Likes 0 Dislikes 0 Informatives
  • Reply 44 of 62
    Rayz2016 Posts: 6,957 member
    elijahg said:
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.

    Early in my career I worked in a Ma Bell public office. We always had people calling and demanding we stop their neighbor from tapping their phone. We had one woman who regularly called in absolutely positive the Baptists across her street were listening to her phone calls during Sunday morning services. We had dispatched repairmen several times to find the ‘bug’ she was sure was there. We finally put her on the ‘do not respond’ list.
    So what you're saying is that no matter what any company introduces (presumably that extends to government too), everyone should simply suck it up and not let anyone know how they feel. Got it.
    Why bother to vote, then?

    Here’s another possibility I’ve been mulling over. It might be a cost issue. The other big cloud providers run their services on their own infrastructure. Apple runs theirs through Amazon and Microsoft services. It’s possible that running a server side solution would be too expensive.  

    The likes of Google, Microsoft and Facebook don’t really have the option of running client-side scanners because they don’t control the entire stack, unlike Apple. 
     0 Likes 0 Dislikes 0 Informatives
  • Reply 45 of 62
    Rayz2016 Posts: 6,957 member
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.
    And this is exactly the problem. There is no other place to go. 


    Part of the problem is that Apple has held itself above Google and Android by being the champion of privacy. Because of their business model, Google would never be able to tell a convincing story that they cared about your rights to privacy (read GoogleGuy’s posts – lord knows they’ve tried). 

    I imagine Google’s management must be in their office/fake rainforest, listening to rainfall piped through their Google speakers sitting on their rainbow-coloured pouffes, watching this shitshow unfold in quiet disbelief. 

    “Master,” says one, “we should capitalise, go on the attack! Show them how we handle CSAM scanning. Show them that when it comes to privacy (terms and conditions apply), we can be the true champions of—”

    The master raises his wizened green claw. “No,” he says. “Wait, we shall.”

    “But why, master!”

    “The grave.  Deep it is.” The old master sighs and smiles. “And still digging, they are.”

    muthuk_vanalingam, entropys, elijahg, darkvader, jdw
     5 Likes 0 Dislikes 0 Informatives
  • Reply 46 of 62
    Rayz2016 said:
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.
    And this is exactly the problem. There is no other place to go. 


    Part of the problem is that Apple has held itself above Google and Android by being the champion of privacy. Because of their business model, Google would never be able to tell a convincing story that they cared about your rights to privacy (read GoogleGuy’s posts – lord knows they’ve tried). 

    I imagine Google’s management must be in their office/fake rainforest, listening to rainfall piped through their Google speakers sitting on their rainbow-coloured pouffes, watching this shitshow unfold in quiet disbelief. 

    “Master,” says one, “we should capitalise, go on the attack! Show them how we handle CSAM scanning. Show them that when it comes to privacy (terms and conditions apply), we can be the true champions of—”

    The master raises his wizened green claw. “No,” he says. “Wait, we shall.”

    “But why, master!”

    “The grave.  Deep it is.” The old master sighs and smiles. “And still digging, they are.”

    Lol. A good one, a very good one, an excellent one for humor and insight at the same time. Well done.
    elijahg, gatorguy
     2 Likes 0 Dislikes 0 Informatives
  • Reply 47 of 62
    JFC_PA said:
    “ Additionally, the iCloud Photos "scanner" isn't actually scanning or analyzing the images on a user's iPhone. Instead, it's comparing the mathematical hashes of known CSAM to images stored in iCloud. If a collection of known CSAM images is stored in iCloud, then the account is flagged and a report is sent to the National Center for Missing & Exploited Children (NCMEC).”

    Dislike this?

    Don’t. Enable. iCloud. Photos. 

    They’re scanning their own servers. 
    No they aren't.  The photos are being analyzed on the user's device.  Currently, that only happens when iCloud Photos is turned on.  In the future, that could be changed.

    I think hardly anyone would object if Apple were actually scanning their own servers for such content.
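    As a rough sketch of that distinction, in hypothetical Swift (invented names; Apple's actual NeuralHash and blinded-hash machinery is not a public API), the check runs on the device and is gated only on the iCloud Photos setting:

```swift
import Foundation

// Hypothetical sketch of the *shape* of on-device matching, gated on iCloud Photos.
// Not Apple's implementation; the real system uses NeuralHash and blinded hashes.
struct OnDevicePhotoMatcher {
    let knownHashes: Set<Data>       // assumed: an opaque hash database stored on the device
    let iCloudPhotosEnabled: Bool    // matching only runs for photos queued for upload

    // Stand-in for a perceptual hash; truncating bytes is NOT a real perceptual hash.
    func perceptualHash(of imageData: Data) -> Data {
        Data(imageData.prefix(16))
    }

    // Returns true if this photo would produce a positive safety voucher.
    func isKnownCSAM(_ imageData: Data) -> Bool {
        guard iCloudPhotosEnabled else { return false }   // the current opt-out
        return knownHashes.contains(perceptualHash(of: imageData))
    }
}
```

    The point of the sketch is that the gate is a flag on the device, not a property of Apple's servers, which is why "they're scanning their own servers" doesn't quite describe it.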
    kbee, muthuk_vanalingam, elijahg, darkvader, baconstang
     5 Likes 0 Dislikes 0 Informatives
  • Reply 48 of 62
    GeorgeBMac Posts: 11,421 member
    Everybody seems to agree that identifying and preventing child abuse is a good thing -- and is willing to sacrifice privacy for that goal.

    But, when I was in grade school, the same was generally agreed about Communists.   "You're a dirty, rotten Commie!"

    Later, when I was in high school, the same was generally agreed about homosexuals.  "You're a faggot!"

    I wonder where this one will end up?  Slippery slopes can lead to cures that are worse than the disease.
    edited August 2021
    muthuk_vanalingam, nizzard, elijahg, darkvader, baconstang
     5 Likes 0 Dislikes 0 Informatives
  • Reply 49 of 62
    Beats Posts: 3,073 member
    bvwj said:
    So if some political foe wants to destroy me, all they need to do is hack my phone, deposit a collection of CSAM, and the cops will do the rest.  How convenient.

    And the cops LOVE this sh**!! Even when they know you’ve been planted. Heck they do it themselves!
    muthuk_vanalingam, GeorgeBMac, elijahg, darkvader
     3 Likes 0 Dislikes 1 Informative
  • Reply 50 of 62
    Beats Posts: 3,073 member
    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM related images and nobody knew about it or complained about it.

    Apple and Facebook aren’t the police. It’s none of their fu**ing business who’s doing what.
    muthuk_vanalingam, darkvader, elijahg
     2 Likes 0 Dislikes 1 Informative
  • Reply 51 of 62
    GeorgeBMac Posts: 11,421 member
    Beats said:
    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM related images and nobody knew about it or complained about it.

    Apple and Facebook aren’t the police. It’s none of their fu**ing business who’s doing what.

    Interesting....  Would you rather the police did random scans of your photos?   Or is it nobody's business if you store child pornography (or any other prohibited content?)

    Either somebody does it -- or nobody does it.   There's not a lot of middle ground there.
     0 Likes 0 Dislikes 0 Informatives
  • Reply 52 of 62
    peteo said:
    Rayz2016 said:

    It would be helpful (rather than just trashing the fix offered by Apple) to offer an alternative solution - unless we wish to state that there is no problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    We already know what the solution is: only run this in the cloud, on iCloud Photos. Do not run it on the user's device. Of course, I believe they cannot do this since iCloud Photos are encrypted when in the cloud?
    iCloud Photos are encrypted in the cloud with Apple's keys, not device (user's) keys.
    ...aaannnddd now we have NO hope of that ever changing.
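    A hedged sketch of what that server-side alternative would amount to (invented names; the decrypt step stands in for decryption with Apple-held keys, which is the only reason this alternative is possible at all):

```swift
import Foundation

// Hypothetical sketch of matching on Apple's servers instead of on the device.
// Invented names; not a description of Apple's actual iCloud infrastructure.
struct ServerSidePhotoScanner {
    let knownHashes: Set<Data>
    let decrypt: (Data) -> Data        // placeholder for Apple-key decryption
    let hash: (Data) -> Data           // placeholder for a perceptual hash

    func isKnownCSAM(uploadedCiphertext: Data) -> Bool {
        let plaintext = decrypt(uploadedCiphertext)
        return knownHashes.contains(hash(plaintext))
    }
}
```

    The trade-off is that this approach only works for as long as iCloud Photos is not end-to-end encrypted, which is exactly the change the comment above was hoping for.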
     0 Likes 0 Dislikes 0 Informatives
  • Reply 52 of 62
    tylersdad said:
    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

    Seriously. I've read a lot about Facebook moderators who have to look at horrible posts all day long and have it take a mental toll on them. Imagine how much worse kiddy porn would be. And yet, Apple has a human review process. That means your job will be to look at that stuff frequently. I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process were a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Good lord. What a great point! I didn’t even consider this. 
    No one is forced to do this.  Who the fuck chooses this as a career?
     0 Likes 0 Dislikes 0 Informatives
  • Reply 54 of 62
    elijahg Posts: 2,882 member
    Beats said:
    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM related images and nobody knew about it or complained about it.

    Apple and Facebook aren’t the police. It’s none of their fu**ing business who’s doing what.

    Interesting....  Would you rather the police did random scans of your photos?   Or is it nobody's business if you store child pornography (or any other prohibited content?)

    Either somebody does it -- or nobody does it.   There's not a lot of middle ground there.
    I'm sure if the police came into your home and rifled through your belongings without you knowing, you'd be pretty pissed. And why shouldn't they? You might have something incriminating in your home.
    muthuk_vanalingam
     1 Like 0 Dislikes 0 Informatives
  • Reply 55 of 62
    elijahg Posts: 2,882 member
    nizzard said:
    tylersdad said:
    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

    Seriously. I've read a lot about Facebook moderators who have to look at horrible posts all day long and have it take a mental toll on them. Imagine how much worse kiddy porn would be. And yet, Apple has a human review process. That means your job will be to look at that stuff frequently. I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process were a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Good lord. What a great point! I didn’t even consider this. 
    No one is forced to do this.  Who the fuck chooses this as a career?
    Pedophiles? Perfect job in their eyes.
    nizzard
     1 Like 0 Dislikes 0 Informatives
  • Reply 56 of 62
    GeorgeBMac Posts: 11,421 member
    elijahg said:
    Beats said:
    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM related images and nobody knew about it or complained about it.

    Apple and Facebook aren’t the police. It’s none of their fu**ing business who’s doing what.

    Interesting....  Would you rather the police did random scans of your photos?   Or is it nobody's business if you store child pornography (or any other prohibited content?)

    Either somebody does it -- or nobody does it.   There's not a lot of middle ground there.
    I'm sure if the police came into your home and rifled through your belongings without you knowing, you'd be pretty pissed. And why shouldn't they? You might have something incriminating in your home.

    I wasn't suggesting that they do. I was responding to another post that said it was none of Apple's business and implied that it was a job for the government or police (without quite saying it explicitly).

    So I was asking: if it is none of Apple's business, then whose is it? Or is it nobody's and we just do nothing (which is, of course, an option -- the question is whether that's a good option)?
    edited August 2021
     0 Likes 0 Dislikes 0 Informatives
  • Reply 57 of 62
    nizzard said:
    tylersdad said:
    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

    Seriously. I've read a lot about Facebook moderators who have to look at horrible posts all day long and have it take a mental toll on them. Imagine how much worse kiddy porn would be. And yet, Apple has a human review process. That means your job will be to look at that stuff frequently. I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process were a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Good lord. What a great point! I didn’t even consider this. 
    No one is forced to do this.  Who the fuck chooses this as a career?
    Two types of people:
    • Those that truly care and want to put sexual abusers away where they can't hurt anyone except each other
    • Pedophiles who are gleeful about being able to legally feed their perversion
    Hopefully, Apple can weed out the second variety.
    tylersdad
     1 Like 0 Dislikes 0 Informatives
  • Reply 58 of 62
    crowley Posts: 10,453 member
    nizzard said:
    tylersdad said:
    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

    Seriously. I've read a lot about Facebook moderators who have to look at horrible posts all day long and have it take a mental toll on them. Imagine how much worse kiddy porn would be. And yet, Apple has a human review process. That means your job will be to look at that stuff frequently. I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process were a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Good lord. What a great point! I didn’t even consider this. 
    No one is forced to do this.  Who the fuck chooses this as a career?
    Two types of people:
    • Those that truly care and want to put sexual abusers away where they can't hurt anyone except each other
    • Pedophiles who are gleeful about being able to legally feed their perversion
    Hopefully, Apple can weed out the second variety.
    Why?  If it satisfies them and the job gets done then everyone wins.
     0 Likes 0 Dislikes 0 Informatives
  • Reply 59 of 62
    JFC_PA Posts: 957 member
    JFC_PA said:
    “ Additionally, the iCloud Photos "scanner" isn't actually scanning or analyzing the images on a user's iPhone. Instead, it's comparing the mathematical hashes of known CSAM to images stored in iCloud. If a collection of known CSAM images is stored in iCloud, then the account is flagged and a report is sent to the National Center for Missing & Exploited Children (NCMEC).”

    Dislike this?

    Don’t. Enable. iCloud. Photos. 

    They’re scanning their own servers. 
    No they aren't.  The photos are being analyzed on the user's device.  Currently, that only happens when iCloud Photos is turned on.  In the future, that could be changed.

    I think hardly anyone would object if Apple were actually scanning their own servers for such content.
    Oh, sure people would. Have you MET the internet? And I've yet to see a clear indication it's not their servers; reporting is sloppy enough on this that I'm still seeing it as their servers. 

    That said, I'd say my point remains for the current issue: people who'd prefer not to participate can just not opt in to iCloud Photos. Using external hard drives is a great backup in any case, for far more than just photos. Including the superior iPhone backup when encryption is chosen, imho, since I hate having to re-sign in to all my sites, which is necessary after restoring from an iCloud backup. I've suffered through some new iPhone setups… while a local encrypted backup is a breeze. And for me an annual rite of passage. 
     
     0 Likes 0 Dislikes 0 Informatives
  • Reply 60 of 62
    Rayz2016 said:

    It would be helpful (rather than just trashing the fix offered by Apple) to offer an alternative solution - unless we wish to state that there is no problem to solve.


    Criticism is easy but solutions are difficult - but let's try.

    I’ll have a think. 
    They must enable End-to-End Encryption on all iCloud user data at the same time they release that client-side CSAM matching. 

    What about the final human review then? Well, the safety voucher includes a preview image, as I understand. The original image will still be sent encrypted, but its preview will remain in the voucher. When the threshold is reached, the vouchers will unlock and the team will perform its review on the previews.

    Even with full E2E encryption enabled, that feature should not be implemented silently, even if it is stated in the user agreement. The actual announcement says nothing about user consent, prior notification, and the like. Apple's stance should be preventive, to protect users from knowingly or unknowingly committing a crime. A thorough alert must be presented when the user tries to activate iCloud Photos, such as: "We can only accept photos that pass a CSAM scan to iCloud. In order to download a CSAM database onto your device and initiate the CSAM scan on your photos, click Continue. Continue | Let Me Think | More Info..."

    And the result of the scan should be clearly communicated to the user: "13864 photos scanned, 104 photos will be sent with voucher. Show | Discard | Stop Upload"

    If it is implemented as spyware, that may cause Apple big headaches in the courts, especially across different jurisdictions.

    The user must be in control of that scan and of its results.
    This is well thought out. I couldn't agree more. If anything, Apple should move all CSAM matches into an album that is not uploaded to iCloud Photos. 

    I also think the photo scanning won't solve anything. What if someone packs CSAM material into a PDF file and uploads it to iCloud as a document? Will Apple consider that OK?
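    For what it's worth, the consent-plus-threshold flow proposed above can be sketched in hypothetical Swift (all names invented; the threshold of 30 is the figure Apple has cited and is used here only as an assumption):

```swift
import Foundation

// Hypothetical sketch of the flow described above: explicit user consent before
// scanning, a per-photo safety voucher carrying a preview, and a review queue
// that only unlocks past a threshold. Invented types; not Apple's published design.
struct SafetyVoucher {
    let photoID: UUID
    let encryptedPreview: Data   // per the comment, a preview stays in the voucher
}

enum ConsentChoice { case continueUpload, letMeThink, moreInfo }

struct ConsentedUploadPipeline {
    let matchThreshold: Int                    // assumption: Apple has cited 30
    private(set) var vouchers: [SafetyVoucher] = []
    private(set) var scannedCount = 0

    // The commenter's point: ask before the scan ever starts.
    static func userConsents(prompt: (String) -> ConsentChoice) -> Bool {
        let message = "We can only accept photos that pass a CSAM scan to iCloud. Continue?"
        return prompt(message) == .continueUpload
    }

    mutating func record(photoID: UUID, isMatch: Bool, preview: Data) {
        scannedCount += 1
        if isMatch {
            vouchers.append(SafetyVoucher(photoID: photoID, encryptedPreview: preview))
        }
    }

    // Vouchers become reviewable only once the threshold is crossed.
    var reviewQueue: [SafetyVoucher] {
        vouchers.count >= matchThreshold ? vouchers : []
    }

    // The scan summary the commenter wants surfaced to the user.
    var summary: String {
        "\(scannedCount) photos scanned, \(vouchers.count) will be sent with a voucher."
    }
}
```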
     0 Likes 0 Dislikes 0 Informatives