Apple employees express concern over new child safety tools

24 Comments

  • Reply 21 of 66
    entropys Posts: 4,152 member
    Why does it matter who owns the hash list?
    The point is that the process has been created. The algorithm can simply be pointed at any other hash list of photos (or data, really) that Apple is compelled to use, whether by a despot, a state actor, the courts, or whoever. Apple won't be able to say no once it has created the ability.

    Don't create the ability in the first place, and Apple won't be forced into trying to say no, and failing.
  • Reply 22 of 66
    elijahg Posts: 2,753 member
    entropys said:
    Why does it matter who owns the hash list?
    The point is that the process has been created. The algorithm can simply be pointed at any other hash list of photos (or data, really) that Apple is compelled to use, whether by a despot, a state actor, the courts, or whoever. Apple won't be able to say no once it has created the ability.

    Don't create the ability in the first place, and Apple won't be forced into trying to say no, and failing.
    Exactly. And now that the tech exists, is anyone really fooled by this stupid "we will not accede to any government's request to expand it"? The hash list is baked into iOS 15, apparently. And according to that article, the same list is distributed globally. So the tech is in every phone running iOS 15, just switched off. China no doubt knows this. Is Apple really going to refuse if the CCP demands it be activated in China?

    Cook has opened a massive can of worms with this, and it's going to be very difficult to put the lid back on.
    edited August 2021
  • Reply 23 of 66
    Naiyas Posts: 107 member
    I’ve refrained from commenting on this “reveal” for some time because I wanted to have a proper think about it. My view still isn’t fully developed but it runs down the following line…

    There are many bad things in the world that occur when you embrace freedom of the individual and we accept those risks… freedom of speech implies that someone (somewhere) will find what you have to say offensive, but to limit it actually makes the world a worse place to live because you push the perceived hate underground and it will therefore manifest itself in other ways.

    Privacy is another one. By executing this algorithm on your phone (ignoring the legal stuff around T&Cs etc.), you are effectively having an illegal search performed on your private information or property. The reason may be valid in this instance, CSAM, but it is nevertheless an illegal search of your personal property, one that takes the view that you are guilty (the act of conducting the matching search) until proven innocent (the lack of matches found).

    Whilst I can get behind the reason, I just can’t get behind the method of execution. It’s very much like “curation creep”.  We see on the App Store things like porn apps being banned. Do I want them? No. Is that a good enough reason to ban them entirely? Not sure, but it’s Apple’s store and they are free to make that choice as store owner. But forcing a privacy invading search of your photos onto your phone is a step too far in my opinion. Search my iCloud Photo library on your servers by all means to ensure compliance with the service’s T&C, but to operate it on “my” phone… not convinced.

    If this is how they want to go then perhaps they should treat all iPhone users like employees of Apple and provide iPhones to us for a small fee with the premise that we never ever own it ourselves. That way at least they have the clear legal right to search the phone just like your employer does when they give you a work device.
  • Reply 24 of 66
    Ofer said:
    …However, if they continue in this trajectory, I may have to start researching other options.
    Presumably that would be turning off iCloud Photo storage. Solved.
  • Reply 25 of 66
    MisterKit said:
    So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.
    Nobody at Apple establishes (or maintains) the library. It is done by the National Center for Missing and Exploited Children (NCMEC) - not somebody at Apple. 
  • Reply 26 of 66
    I'm so sure the government would never use this to mass-control the population... although I still don't understand why they want to extradite Snowden, or why McAfee committed suicide when he specifically tattooed himself saying he never would... and right when he was about to testify in court! Makes no sense.

    ...And I also don't understand what that threshold bit means... maybe that unless we have Epstein levels of material evidence, the system should be OK. Or is it the other way around? With so many vaccine tycoons and former presidents on literally dozens of flight logs and pictures from Epstein's Lolita Express, it is hard to tell.
  • Reply 27 of 66
    Leifur said:
    So I understand the hashes come from a database run by a government?
    Unfortunately, your understanding is incorrect. The database is not run by the government; rather, it is run by the National Center for Missing and Exploited Children (NCMEC).
  • Reply 28 of 66
    gatorguy Posts: 24,176 member
    chadbag said:
    tommikele said:
    I can't wait to read about some Grandmother who gets arrested when they find a picture of naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
    It doesn't work like that. Not unless that infant grandchild pic was put into the child abuse pic database and had a hash made and put into the DB.

    I am against this because of the way it is being done.  Apple putting plumbing in that can later be abused when circumstances change, despite their best intentions today. 

    However, relying on misinformation does not help the discussion against this in any way. 
    How does the blocking/reporting of potentially inappropriate photos sent or received by minors work? That's an additional piece of this. I don't understand how that would be based on a strict hash; perhaps it's something very fuzzy, like recognition of uncovered skin?
  • Reply 29 of 66
    dewme Posts: 5,328 member
    This is exciting. We now have 800 smart people (must be smart because they work at Apple) who’ve decided to actively ENGAGE in helping solve the global problem of missing and exploited children. By weighing in on this they have stepped forward and declared that they are now part of the SOLUTION. That’s just what adults do.

    Only infants and toddlers, to a certain age, can simply declare that they don’t like something and smoosh their face up in a contorted frown, knowing that mommy or daddy won’t force the issue. Once you’re grown up, simply complaining about something without offering an alternative or compromise doesn’t matter. That’s regressive and transforms you back into being a helpless little baby, in a full sized or maybe even super sized adult body. The world doesn’t need any more adult sized babies, we have way too many already as evidenced by current events.

    I look forward to hearing about what all these new volunteers come up with in terms of alternative proposals that will help us solve this heinous problem. And no, doing nothing or sacrificing the victims to buff up our ideological musings and utopian dreams does not count as a solution. These are real humans in life threatening situations that need help - now. Having 800+ smart Apple dudes and dudettes in the fray for doing good can only help.


  • Reply 30 of 66
    GG1 Posts: 483 member
    entropys said:
    Why does it matter who owns the hash list?
    The point is that the process has been created. The algorithm can simply be pointed at any other hash list of photos (or data, really) that Apple is compelled to use, whether by a despot, a state actor, the courts, or whoever. Apple won't be able to say no once it has created the ability.

    Don't create the ability in the first place, and Apple won't be forced into trying to say no, and failing.
    This post and an earlier post by you demonstrate the slippery slope, such as:

    Say I reside in Australia and have pics of Winnie the Pooh on my phone to entertain my kids. I visit China on business, where the CCP provides the hash database. When I leave China, I'm detained for having dissident information on my phone. Plausible scenario or "the sky is falling?"
  • Reply 31 of 66
    Happy to see that folks at Apple also think this kind of mass surveillance is completely unacceptable.

    To me it goes against the presumption of innocence.

    Personally (I've bought a new iPhone every year since 2007), I won't update to iOS 15 and will delay purchasing a new iPhone until this is sorted out!
    edited August 2021
  • Reply 32 of 66
    crowley Posts: 10,453 member
    GG1 said:
    entropys said:
    Why does it matter who owns the hash list?
    The point is that the process has been created. The algorithm can simply be pointed at any other hash list of photos (or data, really) that Apple is compelled to use, whether by a despot, a state actor, the courts, or whoever. Apple won't be able to say no once it has created the ability.

    Don't create the ability in the first place, and Apple won't be forced into trying to say no, and failing.
    This post and an earlier post by you demonstrate the slippery slope, such as:

    Say I reside in Australia and have pics of Winnie the Pooh on my phone to entertain my kids. I visit China on business, where the CCP provides the hash database. When I leave China, I'm detained for having dissident information on my phone. Plausible scenario or "the sky is falling?"
    The latter.

    Your Winnie the Pooh photos won't have uploaded to iCloud while you were in China, therefore won't have triggered the check.
  • Reply 33 of 66
    elijahg Posts: 2,753 member
    crowley said:
    GG1 said:
    entropys said:
    Why does it matter who owns the hash list?
    The point is that the process has been created. The algorithm can simply be pointed at any other hash list of photos (or data, really) that Apple is compelled to use, whether by a despot, a state actor, the courts, or whoever. Apple won't be able to say no once it has created the ability.

    Don't create the ability in the first place, and Apple won't be forced into trying to say no, and failing.
    This post and an earlier post by you demonstrate the slippery slope, such as:

    Say I reside in Australia and have pics of Winnie the Pooh on my phone to entertain my kids. I visit China on business, where the CCP provides the hash database. When I leave China, I'm detained for having dissident information on my phone. Plausible scenario or "the sky is falling?"
    The latter.

    Your Winnie the Pooh photos won't have uploaded to iCloud while you were in China, therefore won't have triggered the check.
    What if he’s been travelling and had no wifi until he arrives in China, then it uploads there?
  • Reply 34 of 66
    This is a slippery slope that should be avoided.  I'm all for catching bad guys (and gals), but we should have rights against unreasonable search and seizure.   The stuff on my phone should be protected as much as stuff on my physical desk.   Get a warrant and do things the right way.   Americans are innocent until proven guilty.
  • Reply 35 of 66
    techconc Posts: 275 member
    Leifur said:
    So I understand the hashes come from a database run by a government? So then that government can easily slip any other hash into the same database depending on what they are looking for.

    This is building in that back door that was requested some years ago.
    As others have mentioned, the NCMEC is not a government agency.  Further, there is a manual review performed before any report is sent out.

    Here's my take... At a high level, I understand the concern and I don't think this is a necessary step for Apple to take. Having said that, it seems to be implemented in a secure and private manner... exactly what we'd expect from Apple. This sort of scan is already being done by the likes of Facebook, Google, etc., so this really isn't anything new. Moreover, the majority of criticisms amount to nothing more than conspiracy theories about what "might happen" at some future date. I think the optics are bad for putting in this kind of "feature" in the first place. I get it. However, the fears and corresponding overreactions seem unfounded in my opinion.
  • Reply 36 of 66
    techconc Posts: 275 member
    Naiyas said:
    ...
    Whilst I can get behind the reason, I just can’t get behind the method of execution. It’s very much like “curation creep”.  We see on the App Store things like porn apps being banned. Do I want them? No. Is that a good enough reason to ban them entirely? Not sure, but it’s Apple’s store and they are free to make that choice as store owner. But forcing a privacy invading search of your photos onto your phone is a step too far in my opinion. Search my iCloud Photo library on your servers by all means to ensure compliance with the service’s T&C, but to operate it on “my” phone… not convinced.
    ...
    Overall, I liked your thoughtful response. However, I can't help but wonder if you understand how this is going to work. You don't seem to have a problem with your iCloud Photo library being scanned, yet you apparently have a problem with your photos being scanned on your phone... just before (and only when) they are uploaded to iCloud. Really, what's the difference? Photos on your devices that are never intended to go to iCloud will not be scanned. I fail to understand the distinction you are making here.

    To your point, I see this as more of a terms and conditions of service for using iCloud Photos. It's no different in concept from what Google, Facebook, etc. are already doing today.
  • Reply 37 of 66
    If you are pro-Apple, keep in mind that the thousands of Apple employees signing this petition ARE APPLE. This warrantless intrusion into private data goes against everything Apple claimed to be as a company. The executives who thought this was a great idea should be fired. Don't worry, they have plenty of money, and Apple desperately needs new blood. The executives could certainly stand to be a bit more diverse too.
  • Reply 38 of 66
    maestro64 Posts: 5,043 member
    As I said before on this topic, comparing a hash of a known image to a hash of another image only works if both images are exactly the same. If the image being compared is modified in the least bit, the hash is no longer the same. Does anyone believe the people involved in this stuff are going to be stupid enough to store these items on their phones exactly as they appear in the database? Especially now that this has been highlighted all over the place. If Apple wanted to do this and catch these guys, they should never have disclosed this information publicly.

    The police are only catching the stupid idiots who download this stuff to their home computers. The major traffickers of this stuff are not being caught; if they were, it would have been plastered all over the news.

    The problem we have is a bunch of people who think that if they are doing something for a very altruistic reason, the ends justify the means. They do not care if your rights and privacy are violated in the end. You hear these people say all the time that even catching one of these bad people justifies what they did. If people do not start pushing back on all of this invasion of privacy, you will have none in the near future. You have to stop saying that you have nothing to hide, or that only those doing something wrong should worry. It does not stop there; you are already seeing people move the goalposts on what is acceptable and what is not.
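    The exact-match point above is easy to demonstrate for cryptographic hashes: flipping a single bit of the input yields a completely different digest. (Worth noting, though, that Apple's published technical summary describes its NeuralHash as a perceptual hash, designed to match visually similar images even after resizing or re-compression, so the on-device matching is not a strict cryptographic exact match.) A minimal Python sketch of the cryptographic case:

    ```python
    import hashlib

    def sha256_hex(data: bytes) -> str:
        """Return the SHA-256 digest of `data` as a hex string."""
        return hashlib.sha256(data).hexdigest()

    original = b"exactly the same image bytes"
    modified = bytearray(original)
    modified[0] ^= 0x01  # flip a single bit of the first byte

    h1 = sha256_hex(original)
    h2 = sha256_hex(bytes(modified))

    print(h1 == sha256_hex(original))  # True: identical bytes give an identical hash
    print(h1 == h2)                    # False: one flipped bit gives a different hash
    ```

    A perceptual hash is designed to behave the opposite way for small visual changes, which is precisely why the trade-off between robustness and false matches is at the heart of this debate.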
  • Reply 39 of 66
    anonymouse Posts: 6,857 member
    gatorguy said:
    chadbag said:
    tommikele said:
    I can't wait to read about some Grandmother who gets arrested when they find a picture of naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
    It doesn't work like that. Not unless that infant grandchild pic was put into the child abuse pic database and had a hash made and put into the DB.

    I am against this because of the way it is being done.  Apple putting plumbing in that can later be abused when circumstances change, despite their best intentions today. 

    However, relying on misinformation does not help the discussion against this in any way. 
    How does the blocking/reporting of potentially inappropriate photos sent or received by minors work? That's an additional piece of this. I don't understand how that would be based on a strict hash; perhaps it's something very fuzzy, like recognition of uncovered skin?
    Seriously gatorguy? Have you bothered to read anything authoritative (such as from Apple) on this at all? Or are you just feigning an embarrassing level of ignorance?
    edited August 2021
  • Reply 40 of 66
    bbh Posts: 134 member
    I think the Apple tool is a great idea.

    The iPhone has top notch photo and video capabilities and is very portable.
    As such, it can easily be a portable tool to abuse children.

    Apple wants to stop that.  Go Apple go go go!
    How about if I'm the arbiter of whether or not that "cute" photo of your child taking a bath is "kiddy porn"? Suppose I say it is? Then what? Your opinion doesn't matter. It seems obvious to me that this "well intentioned" screening puts us in the position of proving our innocence for every photo we upload. Privacy? Not much!

    APPLE is trashing years and years of reputation with this. BAD IDEA.
    edited August 2021