Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'


  • Reply 41 of 90
    fastasleep Posts: 6,417 member
    mejsric said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch on fire and don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    I haven't heard any news about Google and Facebook scanning photos or files on my computer. 
    If you upload them to their cloud services, they sure do; the only difference is exactly when it happens. 


  • Reply 42 of 90
    opinion Posts: 103 member
    I believe Apple has good intentions, but it simply made a communication mistake. And beyond that, an even bigger mistake brand-wise: having focused so much on being the number one choice for privacy, even the slightest notion of being near any kind of surveillance goes against that perception.
  • Reply 43 of 90
    Pee Wee Herman, Gary Glitter and R. Kelly are all talking about switching to Android now. We need these celebrities on Apple!
  • Reply 44 of 90
    opinion Posts: 103 member
    Also, today's media aims to be first rather than right about the news, and that does not help in this situation either.
    edited August 2021
  • Reply 45 of 90
    jdw Posts: 1,336 member
    I really like Federighi and highly respect the man, but if that is indeed all he said, I'm afraid it's little more than doublespeak. It doesn't address the core issue at all.

    I also find it fascinating that Apple has long said it doesn't enter a particular market until "we can do it right."  They want to make sure they can do things in an insanely great way before they embark on it.  And yet, this case seems to be a huge exception to that rule.  They clearly didn't think this out well enough.  As such, the best thing for Apple to do would be to scuttle the idea until such a day that they can truly "figure it out."  You cannot say you've figured it out with this much public animosity toward the plan, regardless of whether you feel the public has "misunderstood" the current plan.

    The road to hell is paved with good intentions.
  • Reply 46 of 90
    This feature should be nixed. It is subject to that pandemic called "function creep," folks. It's been around for eons, and it won't go away once started. Despite the noble intent, it isn't worth it. Kill it at birth. 
  • Reply 47 of 90
    I see in the comments that, again, most people do not understand that this is implemented with far greater privacy in mind. If you use MS, GOOG, FB, or ADBE cloud services, all your pictures are scanned and fingerprinted, so all these companies know a lot about you, and if there is ever a data breach, a lot can be derived from those hashes, even if you have ZERO CSAM pictures.

    Apple's approach is pretty simple: if you use iCloud, the picture's hash is calculated on device, and no CSAM hit means no hash is sent to iCloud. We can assume that most of us have no CSAM pictures, so no one will have hashes in iCloud. What do you prefer? I know I prefer Apple's method by far. To be clear, you taking a bath with your 5-year-old will NOT result in a hit, and no, Apple will not even know you're sitting in your bath with your 5-year-old. It's not image recognition they are doing; it's a mathematical hash that is checked against a known CSAM library.

    I do see the misunderstanding, because the iMessage parental control is separate and is something that works within the family (available up to 13 years of age and needs to be approved by the child). If you claim Apple is running spyware, I guess you have other motives or just to st ....   
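    For anyone trying to picture the flow described above, here is a minimal sketch. It is emphatically not Apple's implementation -- the real system uses a proprietary perceptual hash ("NeuralHash") plus private set intersection and threshold secret sharing -- so the SHA-256 stand-in and every name below are assumptions for illustration only.

    ```swift
    import Foundation
    import CryptoKit

    // Minimal sketch of the flow described in the comment above -- NOT
    // Apple's implementation. Apple uses a proprietary perceptual hash
    // plus private set intersection; SHA-256 and all names here are
    // stand-ins so the example is runnable.

    /// Placeholder for the opaque fingerprint database shipped to devices.
    let knownFingerprints: Set<String> = []  // opaque hashes, not images

    /// Stand-in for the on-device hash. A real perceptual hash tolerates
    /// resizing and re-encoding; SHA-256 does not, and is used here only
    /// so the example compiles and runs.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Per the comment's description: the hash is computed on device, and
    /// only a match against the known library produces any signal at all.
    func isKnownCSAM(_ imageData: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }

    // A family bath photo is not in the known library, so it can never
    // match: the check is set membership, not image recognition.
    print(isKnownCSAM(Data("example photo bytes".utf8)))  // false
    ```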
  • Reply 48 of 90
    jdw Posts: 1,336 member
    temperor said:
    If you use MS, GOOG, FB, ADBE cloud services...
    I do not use those cloud services, but even if I did, they are largely irrelevant insofar as this is AppleInsider and most of us are Apple fans focused exclusively on Apple. We like Apple BECAUSE it isn't Google or Facebook or Adobe. Please consider that.

     You also did not address the most glaring problem -- the human review aspect. That's right, Apple will pay people to view kiddy porn and verify whether the flagging was false. We know the mental toll on Facebook contractors who do nothing all day but review evil posts. How much greater will the mental toll be on people hired by Apple to examine atrocities against innocents? Not a single mention of this is made by Apple or anyone defending Apple. Everyone defending Apple on this glories in the "greater good" aspect and treats flippantly the sacrifices that will have to be made for that "good" to occur.
    edited August 2021
  • Reply 49 of 90
    mattinoz Posts: 2,316 member
    I'm amazed that Apple is generally so good at messaging, yet every five years or so it still manages to fluff a message this badly. 
    I can't even remember what the last one was; I'm sure some people do, and they were never going to buy anything from Apple again.

    Rinse Repeat
  • Reply 50 of 90
    Federighi and Cook are beginning to look a bit foolish at this point. That would be fine with me if it weren't starting to make Apple itself look a bit foolish.

    Fix the problem instead of this mea culpa farce, people.
    Apple should abandon this feature altogether. 
  • Reply 51 of 90
    Peza Posts: 198 member
    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch on fire and don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images.  I can easily imagine a narcissistic leader ordering Apple to add to that list images which are critical of the government.   Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie.  Such a President would have a strong incentive to add those photos to the list.  Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it was for a good cause).

    Apple fights to keep out back doors because they know that once those tools exist, at some point they will be used inappropriately by the government.  While I applaud Apple's goal of fighting child abuse, I know that once this tool exists, at some point it will be used inappropriately by the government.

    One of the key beliefs in the US Constitution is a healthy distrust of government.  We have three branches of government, each to keep watch on the other two.  We have a free press and free speech so that the public can stay informed as to what government is doing.  We have free and fair elections so the people can act as the ultimate watchdog by voting out representatives that are not doing what they should be doing.

    I strongly urge Apple to protect the privacy of our images by killing this feature.   At best, it will merely push criminals to use other services for sharing their photos.  At worst, it is a tool that can be used by the government to suppress free speech.
    Not quite right there. Google does not actually sell your ‘high grade’ data; it uses it to target you with advertising. It makes its money from advertising, plus a cut of app sales on its store and charging ISPs for searches performed on its service. And Facebook literally tells you in its T&Cs that it owns everything you upload to its servers. You give up any rights you have to it by agreeing to those terms.
    Your entire comment also dismisses the fact that Apple’s servers are only going to scan photos of 13-year-olds and younger, and only if you turn the feature on.

    Anyway, all I see are the usual hypocrites complaining. Everyone is happy for Apple’s customer-service suppliers to put cameras on their employees and to demand access to their children’s electronic devices and services, but when it comes to Apple wanting to scan some photos, if you turn the option on, suddenly the world has stopped rotating and you all want to burn Apple to the ground.
    edited August 2021
  • Reply 52 of 90
    crowley Posts: 10,453 member
    jdw said:
    temperor said:
    If you use MS, GOOG, FB, ADBE cloud services...
    I do not use those cloud services, but even if I did, they are largely irrelevant insofar as this is AppleInsider and most of us are Apple fans focused exclusively on Apple. We like Apple BECAUSE it isn't Google or Facebook or Adobe. Please consider that.

     You also did not address the most glaring problem -- the human review aspect. That's right, Apple will pay people to view kiddy porn and verify whether the flagging was false. We know the mental toll on Facebook contractors who do nothing all day but review evil posts. How much greater will the mental toll be on people hired by Apple to examine atrocities against innocents? Not a single mention of this is made by Apple or anyone defending Apple. Everyone defending Apple on this glories in the "greater good" aspect and treats flippantly the sacrifices that will have to be made for that "good" to occur.
    I don’t envy those reviewers, and I hope Apple is properly supporting and compensating them for doing a taxing job, but the idea that we shouldn’t try to identify and stop child abuse perpetrators because the investigators might be too sensitive is absolutely bankrupt. Sweep it under the carpet and pretend it doesn’t exist because it’s too ugly to look at?  Absolutely not.

    In an ideal world there wouldn’t even be a need for this job. But we don’t live in that world, and to get closer to that world some people will have to look at some horrible things in order to stop them. 
  • Reply 53 of 90
    avon b7 Posts: 7,668 member
    crowley said:
    jdw said:
    temperor said:
    If you use MS, GOOG, FB, ADBE cloud services...
    I do not use those cloud services, but even if I did, they are largely irrelevant insofar as this is AppleInsider and most of us are Apple fans focused exclusively on Apple. We like Apple BECAUSE it isn't Google or Facebook or Adobe. Please consider that.

     You also did not address the most glaring problem -- the human review aspect. That's right, Apple will pay people to view kiddy porn and verify whether the flagging was false. We know the mental toll on Facebook contractors who do nothing all day but review evil posts. How much greater will the mental toll be on people hired by Apple to examine atrocities against innocents? Not a single mention of this is made by Apple or anyone defending Apple. Everyone defending Apple on this glories in the "greater good" aspect and treats flippantly the sacrifices that will have to be made for that "good" to occur.
    I don’t envy those reviewers, and I hope Apple is properly supporting and compensating them for doing a taxing job, but the idea that we shouldn’t try to identify and stop child abuse perpetrators because the investigators might be too sensitive is absolutely bankrupt. Sweep it under the carpet and pretend it doesn’t exist because it’s too ugly to look at?  Absolutely not.

    In an ideal world there wouldn’t even be a need for this job. But we don’t live in that world, and to get closer to that world some people will have to look at some horrible things in order to stop them. 
    Definitely.

    Having worked in potentially 'distressing' jobs, and having had three people supposedly commit suicide because of my actions in government, I have direct experience of some of these nasty jobs.

    Some people are not cut out for certain jobs, so there are PV (positive vetting) processes to find the right candidates.

    I've received death threats and done anti-mafia surveillance. It's not nice, but it was voluntary, and we received a ton of in-house training to prepare us for the different tasks.

    Obviously there is a type of desensitisation happening in the day-to-day side of things and you learn to switch off to the suffering of others while doing your job.

    That mostly translates to a 'healthy' worker who is able to live a normal life outside of work.

    We are human though and some (a minority) will burn out psychologically or have periods when things get too much. No one is truly immune to those moments.

    Mostly though these jobs are voluntary affairs and you can walk away from those tasks to other tasks within the organisation. 

    I once knew someone who worked in the cleanup team of suicide deaths on the London Underground. Good pay, good conditions, voluntary, etc but they obviously all went through a PV process before they were assigned. 

    I used to work in a government open plan office with a raised rail line right next to it. I was on the second floor parallel to the line and the nearest station was 200m away. 

    One day an express train came to an emergency stop because someone had jumped in front of it from the station. 

    We became witnesses to the whole operation to deal with the situation. From the arrival of the police, judicial teams, the removal of what was left of the body in a huge black bag etc.

    But then, a few hours later, another team appeared and 'swept' the area inch by inch, recovering bits of flesh and putting them into the transparent bags that each of them was holding. 

    What struck us was that, as they moved down the line, they were chatting and clearly joking as they went about their work. Professionally of course but the 'nature' of the job clearly wasn't a problem for any of them. 

    I like to think that these big companies are vetting candidates well for any nasty tasks they may have to work on. 

    There's always the view from the outside and the view from the inside. 

    The case of the worst Thames riverboat sinking comes to mind. The forensic scientists were overloaded with work, lacking resources and under pressure to get the job of identifying victims done as quickly as possible. 

    They took the reasonable decision (IMO, and given the extreme circumstances) to cut off the hands of victims and send them elsewhere for identification. Families of the victims were appalled, of course, but this just goes to show how the team leads were able to separate their emotions from their work. 

    The Memorial Hospital case in the US also comes to mind. 



  • Reply 54 of 90
    DAalseth Posts: 2,783 member

    I mean, they used to be able to give out even bad news and make it sound good. They made admissions that they had screwed up sound like brilliant moves. They made dropping products that lay on the showroom floor like a mouldy turd sound like a technological triumph. Those days are gone.
    edited August 2021
  • Reply 55 of 90
     “Federighi then stepped through the processes involved in both the iCloud Photos scanning of uploaded CSAM (Child Sexual Abuse Material) images, and what happens when a child is sent such an image over Messages.”

    Even AppleInsider is still mistaken. It just shows how confusing the rollout of these features has been. 

    The iMessage feature for minors has nothing to do with CSAM images at all. It’s just using ML to infer any type of nude image. It’s not hash matching against anything. 
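    A rough sketch of the distinction, with every name, signature, and threshold below invented for illustration:

    ```swift
    import Foundation

    // Hypothetical contrast between the two unrelated features discussed
    // above; nothing here is Apple's actual API.

    // Messages child-safety feature: an on-device ML model scores ANY
    // image for likely nudity. There is no database; nothing is matched.
    func messageImageIsSensitive(nudityScore: Double,
                                 threshold: Double = 0.9) -> Bool {
        nudityScore >= threshold  // inference on the image content itself
    }

    // iCloud Photos CSAM detection: a pure membership test against known
    // fingerprints. The content is never interpreted, only compared.
    func photoMatchesKnownCSAM(hash: String,
                               knownHashes: Set<String>) -> Bool {
        knownHashes.contains(hash)
    }
    ```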
  • Reply 56 of 90
    9secondkox2 Posts: 2,710 member
    While the heart behind this use case is good, there is nothing to stop other use cases from being justified in the future. 

    What people store on iCloud is still their data, and they expect it to be secure and private. Apple has made this more of an “it’s on our server so we can do what we want” issue. That’s not how people have been using it. 

    Unfortunately, Craig did not dissuade those with concerns. If anything he reinforced them. Just because it’s scanning your iCloud files doesn’t mean they are not your files. 

    I do hope this genuinely protects minors, not only from the predators out there but also from wrecking their own lives by sending and receiving such photos. It’s good to give parents more authority as well, since smartphones have been at the center of so much trouble. 

    I also hope this kind of tech is never ever ever used against people who differ from the prevailing cultural mindset politically, religiously, medically, or in lifestyle. 

    But that won’t be so. There is too much now in terms of surveillance of each of us, censorship over simple disagreements with the politically correct consensus, etc. This is how bad things in history got started. Only instead of coming to your house, they can do it from far away without you being aware. 

    I sincerely hope this is never applied to something else. I hope it’s never abused. I hope…
    edited August 2021
  • Reply 57 of 90
    GeorgeBMac Posts: 11,421 member
    darkvader said:
    The message is not the problem.

    The spyware is the problem.

    Yes, although almost everybody agrees that this was a good use of spyware, it's still spyware.

    My take is:   OK, you did a good job here.   But where does this go next?  What will be the next hot button? 
    The horse is out of the barn.   You can't put him back.

    While Apple is defending it, I suspect that at this point they regret doing it.
    But at this point they can't go back. If they do, they'll be accused of enabling child pornography.
    This is a no-win situation for Apple -- at least until we see on the crime blotter that they stopped a child molester.
    edited August 2021
  • Reply 58 of 90
    dewme Posts: 5,362 member
    crowley said:
    I don’t envy those reviewers, and I hope Apple is properly supporting and compensating them for doing a taxing job, but the idea that we shouldn’t try to identify and stop child abuse perpetrators because the investigators might be too sensitive is absolutely bankrupt. Sweep it under the carpet and pretend it doesn’t exist because it’s too ugly to look at?  Absolutely not.

    In an ideal world there wouldn’t even be a need for this job. But we don’t live in that world, and to get closer to that world some people will have to look at some horrible things in order to stop them. 
    Well said, very well said. This is all part of being a compassionate adult. There are times when you have to put aside your selfish inclinations, personal comfort, and tribal boundaries to help other people who cannot help themselves. This is always the case with adults who are implicitly and morally responsible for safeguarding and protecting children. As a compassionate human being, if you’re in a position to help protect a child, you help. This is non-negotiable. 

    There are too many who have planted their flag and are willing to die on their own little hill of ideological purity. We, especially Americans and the countries that have been saved by American intervention, are very fortunate that these little hills are few and far between. America could have stood firmly on its ideology of neutrality, isolationism, and nonintervention during both World War I and World War II.

    Fortunately, if somewhat belatedly, America put pragmatism, compassion, and a sense of global unity out front and intervened. No doubt that some folks were appalled by the sacrifice of ideological purity and the terrible price that was paid once America decided to act. But action was needed and the cost and consequences were and still are viewed in the free world as having been an acceptable sacrifice for the benefits achieved. 

    If our children are the lifeline to the future of humanity, why are we even arguing about the need to act? We have new technological tools that can make a difference closer to the source. The “law & order” chest-thumping, which mostly treats the symptoms and the post-damage aftermath, isn’t yielding sufficient results. 

    If someone has an issue with the techniques being used, please bring your alternative approaches forward - now. Doing nothing other than speculating about what-ifs and what-abouts is no longer morally acceptable. As adults we are responsible for protecting our children, all children in fact. There is no hidden agenda here, it’s simply Apple, like so many other companies, following through and acting compassionately and responsibly, with some additional prodding from the general public and our representatives. 
  • Reply 59 of 90
    radarthekat Posts: 3,842 moderator
    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch on fire and don't get a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world. 
    While Google and Facebook have been scanning user data for years, they are honest and open about it.  They make their money by selling information they glean about their customers.  Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images.  I can easily imagine a narcissistic leader ordering Apple to add to that list images which are critical of the government.   Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie.  Such a President would have a strong incentive to add those photos to the list.  Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it was for a good cause).

    Apple fights to keep out back doors because they know that once those tools exist, at some point they will be used inappropriately by the government.  While I applaud Apple's goal of fighting child abuse, I know that once this tool exists, at some point it will be used inappropriately by the government.

    One of the key beliefs in the US Constitution is a healthy distrust of government.  We have three branches of government, each to keep watch on the other two.  We have a free press and free speech so that the public can stay informed as to what government is doing.  We have free and fair elections so the people can act as the ultimate watchdog by voting out representatives that are not doing what they should be doing.

    I strongly urge Apple to protect the privacy of our images by killing this feature.   At best, it will merely push criminals to use other services for sharing their photos.  At worst, it is a tool that can be used by the government to suppress free speech.
    I fail to see how a foreign government had to wait for this hash comparison technique to be implemented by Apple before it could demand that Apple start scanning for pictures of its leader or any other subject.  The very nature of a dictatorship or totalitarian state implies that it could demand any kind of functionality it wishes in return for access to its market.  It didn't need to wait for Apple to create a tool that could be applied to its dastardly schemes; it could simply demand that Apple do whatever it wants done regarding spying on its citizenry and hold market access hostage until it gets it.  So your argument holds exactly zero water.  
  • Reply 60 of 90
    radarthekat Posts: 3,842 moderator
    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload it. If you use any of those services, you’re already subject to CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 
    Watch the interview! Federighi's word salad is distressing. He simply avoids the tough questions, especially the one on backdoors. Stern does an outstanding job but should have pushed him harder.

    It's not about CSAM.
    There really are no tough questions surrounding this issue, but there have been some foolish and ignorant ones.  