Apple expanding child safety features across iMessage, Siri, iCloud Photos

135 Comments

  • Reply 41 of 97
    Sharing your thoughts, both pro and con, here is fine, but those of you who are against what’s planned should also write Tim Cook an email yourself at [email protected] and let him know how you feel.
    edited August 2021
  • Reply 42 of 97
    chasm Posts: 3,274 member
    Seems like nearly all commenters didn’t read the article, or even the headline, before commenting.

    To be fair, most other media outlets offer way worse headlines, and not a one I’ve seen outside of AppleInsider has even attempted to calmly explain the policies.

    As Mike W pointed out … Apple is actually the LAST tech behemoth to implement these policies. Where were you people when Google, Facebook, Microsoft, and Twitter among others adopted these same controls and rules?

    What good are parental controls if they don’t do anything about child sexualisation and exploitation?
    watto_cobra
  • Reply 43 of 97
    doggone Posts: 377 member
    I understand that Apple are trying to do a good thing here, but this really goes against the grain as far as privacy is concerned.
    What also worries me is that this can be used by people to bribe or blackmail.
    There are plenty of ways that people inadvertently download files. What if a banned image is slipped into your photo library without your knowledge? The next thing you know, the police are knocking on your door. This is way worse than having your PC held for ransom. It’s potentially your life that could be ruined simply because you made the mistake of downloading a file you shouldn’t have.
    Just imagine getting hit by ransomware that threatens to place child pornography on your computer. There are plenty of people who will panic and do the wrong thing by paying up instead of going to the police. This type of blackmail only has to work a small fraction of the time to be highly profitable.

    There has to be another way to track the trafficking of this type of imagery as it is being distributed. I don’t think looking at people’s private content is the right way to do it. It is open to too much risk of hacking and could easily ruin many innocent people’s lives.

    Apple could face a very severe backlash on this.  
  • Reply 44 of 97
    Rayz2016 Posts: 6,957 member
    chasm said:
    Seems like nearly all commenters didn’t read the article, or even the headline, before commenting.

    To be fair, most other media outlets offer way worse headlines, and not a one I’ve seen outside of AppleInsider has even attempted to calmly explain the policies.

    As Mike W pointed out … Apple is actually the LAST tech behemoth to implement these policies. Where were you people when Google, Facebook, Microsoft, and Twitter among others adopted these same controls and rules?

    What good are parental controls if they don’t do anything about child sexualisation and exploitation?

     Just a few points:

    I didn't say anything when other companies implemented these changes because I didn't know. And I feel really bad about this because through my own ignorance, I lost the opportunity to rip Google a second arsehole. Which brings me to my next point.

    As far as I can tell, Google is supplying the tech that makes the tagging possible; I can't see where they are handing over matches of CSAM pictures found on Google Pics to law enforcement. They just supplied the tech that will have much wider benefit by tagging the pictures found floating around the web and tracing them back to their source. 

    Google: Let's clean up the internet. 
    Apple: Let's just worry about our servers.

    Google is doing something about sexualisation and exploitation by attacking the sources. Apple's solution will do very little because perverts are not storing their images on iCloud accounts where backups can be accessed with a legal request. 

    When I was a kid I used to get stop-searched a lot; and when I started driving, I was stopped again so the police could make sure I hadn’t stolen the vehicle. I hadn’t burgled any houses and I hadn’t stolen the cars. Now, since this has probably never happened to you, let me tell you something you may not be aware of: being accused of something you didn’t do is not a good feeling; it’s stressful and it’s shaming, even when you haven’t done anything. That’s before we end up in the inevitable situation of somebody losing their family and their livelihood over a false accusation, and Apple being told they have to search for pictures of dissidents in folks’ photo libraries too.


  • Reply 45 of 97
    Rayz2016 Posts: 6,957 member
    Hah! My Android-loving friends were gleefully all over this. They asked me if I'll now consider a high-end Android for my next phone.

    "All that's happened," I said, "is that one of the reasons for not choosing an Android phone – better privacy on Apple phones – has been squashed. That still leaves about nine other reasons."

    "Good one, dude."

    "You're forty-five years-old. Stop calling me dude."

    "Sure, and you'll stop banging on about Apple giving a shit about your privacy."

    "Agreed. But I'm still not buying a Samsung."

    "Yuh, whatever."

    "I repeat: you're forty-five years-old."
  • Reply 46 of 97
    crowley Posts: 10,453 member
    "Kids' photos are going to start getting flagged by this system the moment it goes live."
    Only if the kids' photos are in the child abuse database. Why would they be?
    "Is Apple ready for that?"
    You're suggesting that they aren't ready for their own feature?
    "Will they contact the parents of the kids or the authorities?"
    How would they contact the parents of the kids?
    "And yes there is a way that a kid's photos could be in CSAM and you should already know that."
    What?
    "How do people opt out of this program?"
    This program would be useless if there were an opt-out. You buy another phone from a different company. Note that Google does this too on Android.
    "Will they tell users when their photos have been flagged and viewed by humans?"
    Probably not; why would they?
    "Will we get to see the score our own photos got?"
    Score?
    "Can we see the source code to make sure it was implemented correctly?"
    You've heard of Apple, right?
    "Where is this database of known CSAM images? How would Apple possess such a thing legally?"
    There is no database of images; it's a database of image hashes, and the data comes from the National Center for Missing and Exploited Children (NCMEC). (A rough sketch of the general idea follows at the end of this reply.)
    "So many unanswered questions. I guess we have to wait for the S to hit the F."
    The answer to pretty much all of these questions was in the article. For the others, common sense would stand you in good stead.
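    To make the "hashes, not images" point concrete, here is a toy Python sketch of matching a photo library against a hash list with a reporting threshold. It is emphatically not Apple's actual protocol (Apple's published design uses NeuralHash perceptual hashes, private set intersection, and threshold secret sharing, so individual matches stay blinded from both the device and the server); the digests and the threshold below are made-up placeholders.

    ```python
    # Toy illustration only; not Apple's actual system.
    # The digests and the threshold are hypothetical placeholders.

    # A set of hashes (not images) of known CSAM, supplied by NCMEC in Apple's scheme.
    KNOWN_CSAM_HASHES = {
        "3f786850e387550fdab836ed7e6dc881de23001b",  # made-up digest
        "89e6c98d92887913cadf06b2adb97f26cde4849b",  # made-up digest
    }

    REVIEW_THRESHOLD = 30  # hypothetical number of matches required before human review


    def count_matches(library_hashes: list[str]) -> int:
        """How many of a photo library's hashes appear in the known-bad set."""
        return sum(1 for h in library_hashes if h in KNOWN_CSAM_HASHES)


    def should_escalate(library_hashes: list[str]) -> bool:
        """Flag an account for human review only past the threshold,
        so an isolated false match never surfaces on its own."""
        return count_matches(library_hashes) >= REVIEW_THRESHOLD
    ```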
  • Reply 47 of 97
    Rayz2016 Posts: 6,957 member
    doggone said:
    I understand that Apple are trying to do a good thing here, but this really goes against the grain as far as privacy is concerned.
    What also worries me is that this can be used by people to bribe or blackmail.
    There are plenty of ways that people inadvertently download files. What if a banned image is slipped into your photo library without your knowledge? The next thing you know, the police are knocking on your door. This is way worse than having your PC held for ransom. It’s potentially your life that could be ruined simply because you made the mistake of downloading a file you shouldn’t have.
    Just imagine getting hit by ransomware that threatens to place child pornography on your computer. There are plenty of people who will panic and do the wrong thing by paying up instead of going to the police. This type of blackmail only has to work a small fraction of the time to be highly profitable.

    There has to be another way to track the trafficking of this type of imagery as it is being distributed. I don’t think looking at people’s private content is the right way to do it. It is open to too much risk of hacking and could easily ruin many innocent people’s lives.

    Apple could face a very severe backlash on this.  

    Matthew Green, the researcher who put this out there, has some examples of how similar hash values can apply to images that are not actually the same when you look at them. 

    There are two attack points here:

    Disguised images sent to your phone, which is unlikely through AirDrop but easily done through email.
    Images placed in the CSAM database by the government or a hacker.

    Remember that Apple will be storing hashed data on your phone related to illegal porn that they will be trying to match using a deliberately imprecise algorithm that hasn't been reviewed by any expert outside of Apple.

    What could possibly go wrong?

  • Reply 48 of 97
    bulk001 Posts: 764 member
    aguyinatx said:
    It's terrifying that Apple is allowing the government to bypass legal restrictions that would have made this type of search unlawful. I am not defending criminals or pedos, but I strongly object to the government having unlimited insight into the photos on personal devices. A list of hashes is exactly that, and that list could be expanded to anything the government would like, as I strongly assume the government is providing the hashes to Apple initially.
    I must live a very boring life. There are no naked pictures of me out there, and even if there were, nobody would want to see them. Other than my banking details, which the government can access without my phone, getting into the hands of criminals, there is nothing I would be concerned about if it made its way onto the front page of the NYTimes. The only people this is going to affect are child pornographers and pedophiles. Good riddance. A small number of positives is not going to be flagged, so this is not going to affect innocent and legal users. For the paranoid, the safest and easiest solution is: don’t back up to iCloud; save everything to local computers, back those up locally, and then save a copy using an encrypted service such as Backblaze or something similar.
  • Reply 49 of 97
    For a very long time I’ve trusted Apple to do the right thing. 

    They’ve been prepared to make difficult decisions to maintain and enhance that trust. 

    They’ve now stepped over that line. 

    I’ve been paying them for decades for what amounts to surveillance systems I’ve placed in my home, carry with me everywhere and perform my secure and private work, correspondence etc on.

    I’ve justified paying a premium for apple kit because I trusted them that everything was private and remained private. 

    That is now not the case. 

    Apple are now enabling algorithms to judge me. Every photo I take will be judged for approval. Every Siri request I make will be judged; every internet search I make will be judged.

    It used to be that those convicted of crimes were subject to having their electronic activities scrutinised; now everyone everywhere will be scrutinised.

    I no longer trust Apple in my home or holding my family’s data.

    The biggest fear is when something goes wrong.

    People and systems make mistakes. People don’t care when mistakes happen to other people.

    No one is interested when mistakes happen to you; no smoke without fire, etc.
  • Reply 50 of 97
    asdasd Posts: 5,686 member
    Rayz2016 said:
    Worth taking a look at Matthew Green’s Twitter feed. Some fascinating stuff in there. 

    He points out that Apple is using a fuzzy matching algorithm, and that it's possible to have false matches that will flag due to the fuzziness of the match. He also points out that no one has seen Apple’s matching algorithm, a matching algorithm built by the same company that gave you iTunes Match and Siri. 

    But that’s okay, to make sure that there’s no chance of a false match, Apple will have some stranger look at your pictures to make sure. So if your photos start showing up in odd places around the internet, this may be why. 
    Are you sure about fuzzy matching? I can’t see how that would work for a hash. The whole point of hashing is that minor changes to the source produce a totally different hash. Take a picture and generate a hash. Now add a black dot to the picture. Regenerate the hash. It’s totally different. Every character. 

    How can fuzzy matching work in that case? 
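    asdasd's point about cryptographic hashes (the avalanche effect) is easy to demonstrate. A minimal Python sketch, using a random byte string as a stand-in for a photo file:

    ```python
    import hashlib
    import os

    # A stand-in for a photo file: any byte string behaves the same way.
    original = os.urandom(100_000)
    modified = bytearray(original)
    modified[0] ^= 0x01  # flip a single bit ("add a black dot")

    h1 = hashlib.sha256(original).hexdigest()
    h2 = hashlib.sha256(bytes(modified)).hexdigest()

    print(h1)
    print(h2)

    # Roughly half of the output bits change for a one-bit change in the input,
    # which is why an exact cryptographic hash can never match a resized or
    # re-compressed copy of the same picture.
    diff = sum(a != b for a, b in zip(h1, h2))
    print(f"{diff} of 64 hex characters differ")
    ```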
  • Reply 51 of 97
    Mike Wuerthele Posts: 6,858 administrator
    Rayz2016 said:
    georgie01 said:

    For the first time in 28 years I’m considering leaving Apple products. Child pornography is disgusting and an outrage, but overreach of power is currently running rampant in the US and that will eventually ruin an entire country.
    I’m wondering if this is some sort of deal that Apple has struck to prevent an even more intrusive solution from US law enforcement. If that’s the case then I imagine Google will have to implement something very similar in the near future. 

    But regardless, what I realise now is that I have been somewhat naive about Apple's stance on privacy. All this clever stuff they bang on about: “We can’t see this, we won’t look at that” – it really is smoke and mirrors. What Apple is doing is walking through the girls’ locker room while promising to keep its eyes shut. Yeah, that’s nice of them, but guess what? They’re still in the locker room. 

    So the next time GoogleGuy breezes through trying to prove that Google cares just as much about privacy as Apple does, I’m afraid he won’t be wrong: they both kinda suck, but at least Google is honest about it.

    What extraordinary times we live in.
    Google's had this system since about 2013 as you've noted in a follow-up comment, server-side. Like Apple will, they scan users' Google Photos libraries. Facebook since 2010, Microsoft since 2008. Twitter, as you've said.

    And yes, they scan pro-actively, and report to the authorities. They all use the same approach as it pertains to law enforcement that Apple will start using shortly.

    So, for the folks saying "this is my last Apple product": by all means, go if you're uncomfortable. Have a good time finding a vendor that doesn't do this. In short, any image that a server can see, anywhere, on anybody's servers or even passing through them, is very likely going to be looked at for this material whether you want it to or not.

    And, like I said on page one of this comment thread: if you don't want Apple to do it, and you think this is some kind of massive privacy breach despite Apple neither knowing nor giving one single care that you've got 45,000 pictures of your dog frolicking in your back yard, however you may think the system works, then turn off iCloud Photos. That simple.

    And the other system, the explicit-image Messages notification to parents, which is getting conflated with the CSAM hash-table identification by folks who didn't read the article: it's opt-in, and only for child accounts in a Family Sharing setup. That's all.
    edited August 2021
  • Reply 52 of 97
    A link to the actual announcement would have been helpful 

    https://www.apple.com/child-safety/
  • Reply 53 of 97
    asdasd Posts: 5,686 member
    Can somebody link to this Matthew Green Twitter thread? I don’t buy the deliberately imprecise fuzzy matching for hashes at all. 
  • Reply 54 of 97
    Mike Wuerthele Posts: 6,858 administrator
    asdasd said:
    Can somebody link to this Matthew Green Twitter thread? I don’t buy the deliberately imprecise fuzzy matching for hashes at all. 
    It's in the article text and it's been there since publication.
    edited August 2021
  • Reply 55 of 97
    asdasd Posts: 5,686 member
    who reads the articles? 

    Just kidding. I didn’t realise the image was clickable. 

    I went to the thread and he’s deliberately scaremongering about collisions. 


    He gives an example of a simple 6-digit numeric hash there. Just as a 6-digit password is more likely to collide than a 256-character alphanumeric password, the same is true of a hash. From Stack Overflow:

    If we have a "perfect" hash function with output size n, and we have p messages to hash (individual message length is not important), then the probability of a collision is about p^2 / 2^(n+1) (this is an approximation which is valid for "small" p, i.e. substantially smaller than 2^(n/2)). For instance, with SHA-256 (n = 256) and one billion messages (p = 10^9), the probability is about 4.3×10^-60.

    A mass-murdering space rock happens about once every 30 million years on average. This puts the probability of such an event occurring in the next second at about 10^-15. That's 45 orders of magnitude more probable than the SHA-256 collision. Briefly stated, if you find SHA-256 collisions scary, then your priorities are wrong.
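    For anyone who wants to check that arithmetic, the quoted birthday-bound approximation p^2 / 2^(n+1) can be evaluated directly; a quick Python check (exact integer fractions avoid floating-point underflow along the way):

    ```python
    from fractions import Fraction

    def collision_probability(p: int, n: int) -> Fraction:
        """Approximate probability of any collision among p random messages
        hashed to n-bit outputs: p^2 / 2^(n+1). The approximation holds while
        p is much smaller than 2^(n/2)."""
        return Fraction(p * p, 2 ** (n + 1))

    # One billion messages through a 256-bit hash such as SHA-256.
    prob = collision_probability(p=10**9, n=256)
    print(float(prob))          # ~4.3e-60, matching the figure quoted above

    # For scale: the "space rock in the next second" probability of ~1e-15 is
    # roughly 45 orders of magnitude larger than this.
    print(1e-15 / float(prob))  # ~2e44
    ```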

    edited August 2021
  • Reply 56 of 97
    Mike Wuerthele Posts: 6,858 administrator
    asdasd said:
    who reads the articles? 

    Just kidding. I didn’t realise the image was clickable. 

    I went to the thread and he’s deliberately scaremongering about collisions. 


    He gives an example of a simple 6-digit numeric hash there. Just as a 6-digit password is more likely to collide than a 256-character alphanumeric password, the same is true of a hash. From Stack Overflow:

    If we have a "perfect" hash function with output size n, and we have p messages to hash (individual message length is not important), then the probability of a collision is about p^2 / 2^(n+1) (this is an approximation which is valid for "small" p, i.e. substantially smaller than 2^(n/2)). For instance, with SHA-256 (n = 256) and one billion messages (p = 10^9), the probability is about 4.3×10^-60.

    A mass-murdering space rock happens about once every 30 million years on average. This puts the probability of such an event occurring in the next second at about 10^-15. That's 45 orders of magnitude more probable than the SHA-256 collision. Briefly stated, if you find SHA-256 collisions scary, then your priorities are wrong.

    Yeah, I know you read the articles; history demonstrates this.

    A large number of commenters in this post did not, however.
    edited August 2021
  • Reply 57 of 97
    Rayz2016 Posts: 6,957 member
    One thing I’m wondering though. 

    For Apple to throw its whole privacy schtick out the window, what were they promised in return?

    A favourable outcome in the Epic case?
    Governments backing off on allowing third party app stores?
    Exclusion from right-to-repair legislation?

    ”If we just inject this picture/file/document in this database, and you tell us which users have it on their phone …”

    Could be the best deal they ever made. 
  • Reply 58 of 97
    Rayz2016 Posts: 6,957 member
    I went to the thread and he’s deliberately scaremongering about collisions. 

    Yup, s’funny, I said he was scaremongering when he said Apple would be checking your pictures and sending their interpretation to law enforcement. 

    The match has to be fuzzy because it’s rare that a CSAM picture will be sent around the internet without being enlarged or compressed.  

    The other thing is that no one has seen Apple’s matching algorithm, but as long as it wasn’t written by the same people who wrote iTunes Match …

    Thirdly, Apple’s record of review processes is pretty shit, has to be said. 


  • Reply 59 of 97
    Rayz2016 Posts: 6,957 member
    chris-net said:
    For a very long time I’ve trusted Apple to do the right thing. 

    They’ve been prepared to make difficult decisions to maintain and enhance that trust. 

    They’ve now stepped over that line. 

    I’ve been paying them for decades for what amounts to surveillance systems I’ve placed in my home, carry with me everywhere and perform my secure and private work, correspondence etc on.

    I’ve justified paying a premium for apple kit because I trusted them that everything was private and remained private. 

    That is now not the case. 

    Apple are now enabling algorithms to judge me. Every photo I take will be judged for approval. Every Siri request I make will be judged; every internet search I make will be judged.

    It used to be that those convicted of crimes were subject to having their electronic activities scrutinised; now everyone everywhere will be scrutinised.

    I no longer trust Apple in my home or holding my family’s data.

    The biggest fear is when something goes wrong.

    People and systems make mistakes. People don’t care when mistakes happen to other people.

    No one is interested when mistakes happen to you; no smoke without fire, etc.

    Yeah, that's the problem though, isn't it? Lots of folks chucking out maths and stats and saying it can't happen … great, until you're the one it happens to. 

    But the thing to remember is that this is something that all companies are going to have to agree to, so unless you get rid of everything that plugs in, it's going to be pretty hard to escape. 

    Look on the bright side: you no longer have to pay a premium for Apple kit.  You may have put up with Siri's shortcomings because you believed those shortcomings were a trade-off for privacy.  Well, this announcement means the privacy field is pretty much level, so buy what works best for you.
  • Reply 60 of 97
    Mike Wuerthele Posts: 6,858 administrator
    Rayz2016 said:

    Look on the bright side: you no longer have to pay a premium for Apple kit.  You may have put up with Siri's shortcomings because you believed those shortcomings were a trade-off for privacy.  Well, this announcement means the privacy field is pretty much level, so buy what works best for you.
    I have literally no idea how you're coming up with this conclusion. It isn't based on facts.

    Nothing that Apple announced yesterday has anything to do with privacy violations of the kind seen elsewhere, including but not limited to email metadata harvesting for ad serving, Alexa data collection for suggested product sales, variable Amazon pricing based on purchase history, Facebook data scraping, Android privacy issues, photo metatagging for ads, or anything else.

    Apple still doesn't know what's in your photo library because of anything announced yesterday.
    edited August 2021