Apple privacy head explains privacy protections of CSAM detection system

24 Comments

  • Reply 21 of 62
    bvwj said:
    So if some political foe wants to destroy me, all they need to do is hack my phone, deposit a collection of CSAM, and the cops will do the rest.  How convenient.
Don't worry. Someone at Apple reviews all of your images before calling the cops. Does that make you feel any better?
  • Reply 22 of 62
baconstang Posts: 1,181 member
    bvwj said:
    So if some political foe wants to destroy me, all they need to do is hack my phone, deposit a collection of CSAM, and the cops will do the rest.  How convenient.
    Actually, anyone with physical access to your locked iPhone for a minute can add images to your Photos upload queue if it's enabled.
  • Reply 23 of 62
    OK Apple, you can scan all my photos with one stipulation:
    SHOW ME YOUR WARRANT!
  • Reply 24 of 62
entropys Posts: 4,431 member
This hash technology is only linked to kiddie fiddlers to make it more palatable to the gullible. The important thing is that it enables the population to be categorised into the good and the ungood, with the definition of ungood being up to those who control whatever gets listed in the hash database.
Who can object to kiddie fiddlers being exposed? No one. Next month it is Trump supporters. A few people get uncomfortable, but hey, Trump! The month after that, Winnie the Pooh. In other countries, it’s gays and political dissidents.

    And Apple is doubling down hard on the “won’t someone think of the children!” distraction squirrel. I suspect they must be under some kind of pressure from those who would be Big Brother.


  • Reply 25 of 62
blastdoor Posts: 3,760 member
Folks -- if you don't like this, your complaint is with governments, not Apple. In fact, you should thank Apple for letting us know that they're doing this. 

    The first inescapable truth is that Apple (and every other company) MUST either (1) follow the laws of countries in which they operate or (2) pull out of those countries. 
    The second inescapable truth is that there is no way Apple is pulling out of the US, and almost no way Apple is pulling out of China. 

    So, if you don't want Apple to do something a government is asking them to do, your best bet is to demand the government not ask them to do it in the first place. 

  • Reply 26 of 62
    blastdoor said:
Folks -- if you don't like this, your complaint is with governments, not Apple. In fact, you should thank Apple for letting us know that they're doing this. 

    The first inescapable truth is that Apple (and every other company) MUST either (1) follow the laws of countries in which they operate or (2) pull out of those countries. 
    The second inescapable truth is that there is no way Apple is pulling out of the US, and almost no way Apple is pulling out of China. 

    So, if you don't want Apple to do something a government is asking them to do, your best bet is to demand the government not ask them to do it in the first place. 

    I've seen no evidence to suggest the government is forcing them to implement this feature. Can you post a link for the source of this information? 
  • Reply 27 of 62
ikir Posts: 130 member
People are getting paranoid; it's a hash, not an image scan.

    explained -> https://youtu.be/Dq_hvJI76IY
  • Reply 28 of 62
mknelson Posts: 1,163 member
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.
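[Editor's note: the "mathematically so close as to be indistinguishable" matching described in the reply above can be sketched as follows. This is a toy illustration only; Apple's actual NeuralHash is derived from image features, and the hash values and distance threshold here are invented.]

```python
# Toy sketch of perceptual-hash matching against a database of known
# hashes. NOT Apple's NeuralHash -- values and threshold are made up.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two equal-width hashes."""
    return bin(a ^ b).count("1")

def matches_known_hash(image_hash: int, known_hashes: set,
                       max_distance: int = 0) -> bool:
    """A photo is flagged only if its hash equals (or nearly equals)
    one already in the database. A generic category like 'AR-15
    photos' has no single hash, so it cannot be matched this way."""
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in known_hashes)

known = {0b1011_0110_1100_0011}  # hypothetical database entry

assert matches_known_hash(0b1011_0110_1100_0011, known)          # exact copy
assert matches_known_hash(0b1011_0110_1100_0111, known, 1)       # near-duplicate
assert not matches_known_hash(0b0000_1111_0000_1111, known, 1)   # unrelated image
```

The design point the commenter makes is visible in the code: matching is against specific known images, not semantic categories.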
  • Reply 29 of 62
    mknelson said:
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.
I’m not sure I’m clear on this warrant thing. Are you suggesting Apple has the right to scan the content on the device that I paid for?
  • Reply 30 of 62
    mknelson said:
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.
No, it needs to be an exact picture *today*.  Tomorrow the technology can EASILY be expanded, of their own volition or from government pressure; once the mechanism exists, expanding it is trivial.  This was Apple's moral high ground a few years ago, when they REFUSED to write "the software equivalent of cancer," if you remember.  They just wrote it.

    edited August 2021
  • Reply 31 of 62
baconstang Posts: 1,181 member
    tylersdad said:
    mknelson said:
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.
I’m not sure I’m clear on this warrant thing. Are you suggesting Apple has the right to scan the content on the device that I paid for?
    You may have waived that privacy in the iCloud TOS.  It's a long read....  
  • Reply 32 of 62
Also important to note here: Apple doesn't claim this can't be expanded or abused by a government.  They say they will resist.  They won't have a f*cking choice once it's built.
  • Reply 33 of 62
jdw Posts: 1,472 member
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

Seriously.  I've read a lot about Facebook moderators who have to look at horrible posts all day long and how that takes a mental toll on them.  Imagine how much worse kiddy porn would be.  And yet, Apple has a human review process.  That means your job will be to look at that stuff frequently.  I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process was a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
  • Reply 34 of 62
    nizzard said:
    Also important to note here, Apple doesnt claim this cant be expanded or abused by gov.  They say they will resist.  They wont have a f*cking choice once it’s built.
    Exactly right. If the Chinese government gives Apple the choice of implementing this or getting kicked out of the Chinese market, does anybody really think they’ll give up the Chinese market? Hell no!
  • Reply 35 of 62
bluefire1 Posts: 1,316 member
    If the door is even slightly ajar, others will find a way to enter with or without Apple’s consent.
  • Reply 36 of 62
    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddo porn all day long when it comes time for Apple to check all those flagged photos?

Seriously.  I've read a lot about Facebook moderators who have to look at horrible posts all day long and how that takes a mental toll on them.  Imagine how much worse kiddy porn would be.  And yet, Apple has a human review process.  That means your job will be to look at that stuff frequently.  I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process was a child predator getting their jollies off the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Good lord. What a great point! I didn’t even consider this. 
  • Reply 37 of 62
JFC_PA Posts: 964 member
"Additionally, the iCloud Photos 'scanner' isn't actually scanning or analyzing the images on a user's iPhone. Instead, it's comparing the mathematical hashes of known CSAM to images stored in iCloud. If a collection of known CSAM images is stored in iCloud, then the account is flagged and a report is sent to the National Center for Missing & Exploited Children (NCMEC)."

    Dislike this?

    Don’t. Enable. iCloud. Photos. 

    They’re scanning their own servers. 
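[Editor's note: the "collection" flagging the quoted passage describes can be sketched as a simple threshold check. This is purely illustrative; Apple's real system uses cryptographic safety vouchers and threshold secret sharing rather than a plain counter, and the cutoff of 30 is the figure Apple publicly cited, used here only as an example.]

```python
# Rough sketch of threshold-based account flagging. Illustrative
# only -- Apple's system uses safety vouchers, not a plain counter.

MATCH_THRESHOLD = 30  # example value; Apple publicly cited ~30 matches

def should_flag_account(match_count: int,
                        threshold: int = MATCH_THRESHOLD) -> bool:
    """A single match (including a false positive) triggers nothing;
    only a collection of matches past the threshold leads to human
    review and a report."""
    return match_count >= threshold

assert not should_flag_account(1)    # one stray match: no report
assert not should_flag_account(29)   # still below the cutoff
assert should_flag_account(30)       # collection past threshold: flagged
```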
  • Reply 38 of 62
auxio Posts: 2,785 member
    bvwj said:
    So if some political foe wants to destroy me, all they need to do is hack my phone, deposit a collection of CSAM, and the cops will do the rest.  How convenient.
If they can hack your phone, then they could have uploaded a bunch of CSAM photos to Facebook (or other cloud services that scan for CSAM) and gotten the same result before this system existed.  Someone with access to your phone can do a lot of harm to you, with or without this technology.
  • Reply 39 of 62
auxio Posts: 2,785 member
    lkrupp said:
    Again, where will you all go if you can’t trust Apple anymore? I’ve asked this question many times but no responses. If you no longer trust Apple or think they are in bed with the government will you leave the platform? Where will you go? Or, as I strongly suspect, will you just bitch and whine about privacy and pray for the destruction of Apple, and then meekly accept the inevitable, and go about your lives?

    All this talk about Apple destroying its credibility, PR disasters, loss of customers, is just a wet dream of the paranoid, NSA behind every bush crowd.
    And this is exactly the problem. There is no other place to go. This is why I and many other people are angry. We have no other options. 
    There are options (like this), but it'll mean that you need to do a fair bit of research into how you want to share things with others.  You're not going to get the simplicity you do with Apple.
  • Reply 40 of 62
Rayz2016 Posts: 6,957 member
    Old privacy chief:

    "Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.

    As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    New privacy chief:

"The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos." (From the TechCrunch interview quoted in the article.)

    What do you think?

    I think she’s a little out of the loop. 