Apple employees express concern over new child safety tools

Comments

  • Reply 41 of 66
omasou Posts: 573 member
    Ofer said:
    Sadly, even with this new tool, Apple is still the best game in town as far as user privacy is concerned. I’m a big fan of the company and have been using their products since the very first computer I’ve owned (anyone remember the Apple IIgs?). However, if they continue in this trajectory, I may have to start researching other options.
    Yes, I remember the IIgs, still have one...  as well as my first Macintosh (a platinum Plus I purchased new back in '87).

Sadly, due to this CSAM scanning, I will almost certainly be moving away from Apple. I have already disabled auto-upgrade on our dozen or so iOS-based devices. When iOS 15 comes out, that will be the end of updates for us. Since we don't use iCloud and rarely text, this "feature" would have little impact on us. However, the very notion that a company (any company) believes it has the right to place what amounts to spyware on devices I own is completely unacceptable! If this means eventually going back to a "dumb" phone, then so be it. Even if Apple were to state that it will not include this "feature", I would probably not trust them enough to believe it... perhaps they would just do it anyway without telling anybody.

    In fairness, I grew up consuming more than my fair share of dystopian novels and movies which might be influencing my position a bit.  :-)

    May as well plan to move off the grid as these technologies will become more and more ubiquitous.
  • Reply 42 of 66
omasou Posts: 573 member
It is amazing, the amount of ignorance about how this works and people's refusal to read about how it works before posting.

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
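For anyone who wants the gist without reading the whole PDF, here is a heavily simplified sketch (in Python) of the threshold idea it describes. This is not the actual protocol: the real system uses a perceptual hash (NeuralHash) plus private set intersection and threshold secret sharing, so neither the device nor Apple learns anything about individual images until the match count crosses the threshold. The names and the threshold value below are hypothetical.

```python
import hashlib

MATCH_THRESHOLD = 30  # hypothetical value, for illustration only

def image_hash(image_bytes: bytes) -> str:
    # Stand-in only: a cryptographic hash changes completely if a single
    # pixel changes, whereas the real system's perceptual hash (NeuralHash)
    # is designed to survive resizing, recompression, and similar edits.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_account(upload_queue: list[bytes],
                        known_hashes: set[str]) -> bool:
    """Flag an account only once the number of matched images crosses the
    threshold, so a handful of false positives never surfaces anyone."""
    matches = sum(1 for img in upload_queue
                  if image_hash(img) in known_hashes)
    return matches >= MATCH_THRESHOLD
```

The threshold is the part so many posters miss: no single match, true or false, ever surfaces an account on its own.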
  • Reply 43 of 66
techconc Posts: 275 member
    maestro64 said:
As I said before on this topic, comparing a hash from a known image to a hash of another image only works if both images are exactly the same. If the image being compared is modified in the slightest, the hash is no longer the same.
Nope. The algorithms used are designed to recognize the same picture regardless of whether it has been resized, rotated, or even slightly cropped. See the sketch below.
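As a rough illustration, here is a classic "difference hash" (dHash) in Python. It is not Apple's NeuralHash, which is a learned model, but the principle is the same: nearby images produce nearby hashes, so matching tolerates edits. The file name is hypothetical, and the example assumes Pillow is installed.

```python
from PIL import Image

def dhash(img: Image.Image, size: int = 8) -> int:
    """64-bit hash built from brightness gradients of a 9x8 grayscale thumbnail."""
    small = img.convert("L").resize((size + 1, size))
    px = list(small.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

original = Image.open("photo.jpg")  # hypothetical file
resized = original.resize((original.width // 2, original.height // 2))

# A cryptographic hash of the two files would differ completely; the
# perceptual hashes usually differ by only a few bits out of 64:
print(hamming(dhash(original), dhash(resized)))
```

A resized or recompressed copy lands within a small Hamming distance of the original, which is exactly what a byte-level hash like SHA-256 cannot do.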

    bbh said:
    How about if I'm the arbiter of whether or not that "cute" photo of your child taking a bath is "kiddy porn" ?
That's not how it works. What counts as "kiddy porn" is defined by a collection of existing photos currently in circulation and being distributed as such. The photo you take of your child taking a bath is not part of that collection.
    bbh said:
    APPLE is trashing years and years of reputation with this. BAD IDEA.
Perhaps. Mostly because of the ignorance, misinformation, and false assumptions associated with it. I would agree that it's not strictly necessary; I would also agree that it's probably the responsible thing to do. Again, other services are already doing this today. This is really nothing new.
  • Reply 44 of 66
bluefire1 Posts: 1,302 member
Apple prides itself on safeguarding user privacy. Once you open the door to any exception, no matter how laudable, you start down a slippery slope that will make the word "privacy" hollow.
    This is a vacuous decision by Apple that needs to be reconsidered.

  • Reply 45 of 66
techconc Posts: 275 member
    omasou said:
    May as well plan to move off the grid as these technologies will become more and more ubiquitous.
    They are already more ubiquitous than people realize.  Examples...

    https://www.theverge.com/2014/8/5/5970141/how-google-scans-your-gmail-for-child-porn
    https://blog.chron.com/techblog/2014/08/google-microsoft-others-scan-your-data-looking-for-child-porn/
    https://miami.cbslocal.com/2018/12/07/lauderdale-man-child-porn-google-photos/
    https://www.forbes.com/sites/kashmirhill/2012/07/13/yes-facebook-scans-peoples-private-conversations-looking-for-sexual-predators-and-child-porn/?sh=6a103211595a

Everyone else is already doing this. Apple is just now getting on board, for obvious reasons.

    omasou said:
It is amazing, the amount of ignorance about how this works and people's refusal to read about how it works before posting.

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
    Exactly!
  • Reply 46 of 66
gatorguy Posts: 24,213 member
    gatorguy said:
    chadbag said:
    tommikele said:
I can't wait to read about some grandmother who gets arrested when they find a picture of a naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
It doesn’t work like that. Unless that infant grandchild pic was put into the child abuse pic database and had a hash made and put into the DB.

    I am against this because of the way it is being done.  Apple putting plumbing in that can later be abused when circumstances change, despite their best intentions today. 

    However, relying on misinformation does not help the discussion against this in any way. 
How does the blocking/reporting of potentially inappropriate photos sent or received by minors work? That's an additional piece of this. I don't understand how that could be based on a strict hash; perhaps it's something very fuzzy, like recognition of exposed skin?
Seriously, gatorguy? Have you bothered to read anything authoritative (such as from Apple) on this at all? Or are you just feigning an embarrassing level of ignorance?
    I missed you when you were gone, you charmer. :/

Anyway, since asking the question I've read an explanation in an Apple article reported earlier today. Craig Federighi admits it was all poorly explained and, all in all, confusing. After reading his explanation (it's more complete on another site), my guess was correct: apparently it triggers if certain skin areas are exposed, using a fuzzy algorithm to identify them, and if the recipient or sender is identified (by Apple?) as a minor. A sketch of that flow is below.
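To make the distinction concrete, here is a hedged sketch of that flow in Python. Every name and value is hypothetical; Apple has published no API for this. The point is that it is an opt-in, on-device classifier gate for child accounts, not a hash lookup against a database.

```python
from dataclasses import dataclass

SENSITIVITY_THRESHOLD = 0.9  # hypothetical cutoff

@dataclass
class Account:
    is_child: bool               # set up as a child in Family Sharing
    parent_enabled_safety: bool  # the feature is opt-in, per child

def nudity_classifier(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model returning a 0-1 'sensitive' score."""
    return 0.0  # placeholder; a real model would run inference here

def handle_incoming_image(image_bytes: bytes, account: Account) -> str:
    # Nothing happens for adult accounts, or if the parent never
    # turned the feature on.
    if not (account.is_child and account.parent_enabled_safety):
        return "deliver"
    # Unlike the CSAM feature, this is a fuzzy classifier decision,
    # not a match against a list of known images.
    if nudity_classifier(image_bytes) > SENSITIVITY_THRESHOLD:
        return "blur_and_warn"  # child sees a warning before viewing
    return "deliver"
```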

    You should jump over to that thread and insult those posters too since you thought I was the only one not getting it. 
  • Reply 47 of 66
anonymouse Posts: 6,860 member
    gatorguy said:
    gatorguy said:
    chadbag said:
    tommikele said:
I can't wait to read about some grandmother who gets arrested when they find a picture of a naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
It doesn’t work like that. Unless that infant grandchild pic was put into the child abuse pic database and had a hash made and put into the DB.

    I am against this because of the way it is being done.  Apple putting plumbing in that can later be abused when circumstances change, despite their best intentions today. 

    However, relying on misinformation does not help the discussion against this in any way. 
How does the blocking/reporting of potentially inappropriate photos sent or received by minors work? That's an additional piece of this. I don't understand how that could be based on a strict hash; perhaps it's something very fuzzy, like recognition of exposed skin?
Seriously, gatorguy? Have you bothered to read anything authoritative (such as from Apple) on this at all? Or are you just feigning an embarrassing level of ignorance?

    You should jump over to that thread and insult those posters too since you thought I was the only one not getting it. 
You're nearly always wrong about everything, but I do expect you not to be totally clueless. The rest of them, I can't really generalize.
  • Reply 48 of 66
jungmark Posts: 6,926 member
The people who are complaining are the ones who don't know how it'll work. They read something on the internet, so it must be true.
  • Reply 49 of 66
maestro64 Posts: 5,043 member
    techconc said:
    maestro64 said:
As I said before on this topic, comparing a hash from a known image to a hash of another image only works if both images are exactly the same. If the image being compared is modified in the slightest, the hash is no longer the same.
Nope. The algorithms used are designed to recognize the same picture regardless of whether it has been resized, rotated, or even slightly cropped.
Everything I read said they are comparing the hash, and the hash is specific to all the bits that make up the image. Running the picture through a Photoshop filter and stripping the metadata will alter the picture enough that the hash is no longer close enough to compare. They said it can compare images even if they're encrypted, and it does not require direct access to the image, since they are looking at the hash. Without opening the image, there is no way to know it is the same as the one in the database. The only way to do what you are claiming is to open the image and do broad image recognition, which Apple said they are not doing. Even Facebook uses people in India to look at suspect images showing nudity that its algorithm flagged; they cannot rely solely on the algorithms.

Again, Apple is going to spy on everyone in the hope of catching someone who is stupid enough to store these images on their phone.
  • Reply 50 of 66
fastasleep Posts: 6,418 member
    maestro64 said:
As I said before on this topic, comparing a hash from a known image to a hash of another image only works if both images are exactly the same. If the image being compared is modified in the slightest, the hash is no longer the same. Does anyone believe the people who are involved in this stuff are going to be stupid enough to store these items on their phone exactly as they are in the database? Especially now that this has been highlighted all over the place. If Apple wanted to do this and catch these guys, they should never have disclosed this information publicly.
Yes, they are that stupid. But even if not, it's able to detect modified/cropped/etc. images. You don't understand how this works.
The police are only catching the stupid idiots who download this stuff to their home computers. The major traffickers of this stuff are not being caught; if they were, it would have been plastered all over the news.
Wrong. Tons of shit is getting caught online. Facebook alone reported over 20 million last year. Google, 546K. Microsoft, 97K. Snapchat, 144K. Dropbox, 20K. Etc., etc. Have a look, the data is right here; look at the list of online service providers on this page and take a guess as to whether they are using methods more invasive than Apple's to find these:
    https://www.missingkids.org/gethelpnow/cybertipline#bythenumbers

And there are plenty of reports of major traffickers/servers being taken down. You can easily find articles about these things if you search for them.
    https://www.businessinsider.com/us-government-shuts-down-largest-child-porn-marketplace-dark-web-2019-10
    They do not care if your rights and privacy are violated in the end.
    Again, you apparently don't understand how this works.
  • Reply 51 of 66
fastasleep Posts: 6,418 member
    gatorguy said:
    chadbag said:
    tommikele said:
I can't wait to read about some grandmother who gets arrested when they find a picture of a naked infant grandchild in the bath in her iCloud Photo Library. I guarantee you that or something equally insane will result from this.
It doesn’t work like that. Unless that infant grandchild pic was put into the child abuse pic database and had a hash made and put into the DB.

    I am against this because of the way it is being done.  Apple putting plumbing in that can later be abused when circumstances change, despite their best intentions today. 

    However, relying on misinformation does not help the discussion against this in any way. 
How does the blocking/reporting of potentially inappropriate photos sent or received by minors work? That's an additional piece of this. I don't understand how that could be based on a strict hash; perhaps it's something very fuzzy, like recognition of exposed skin?
Do you also say, gee, how does my camera know how to identify a face, or how does Photos know when it sees a picture of my cat?
  • Reply 52 of 66
fastasleep Posts: 6,418 member
    bbh said:
    I think the Apple tool is a great idea.

    The iPhone has top notch photo and video capabilities and is very portable.
    As such, it can easily be a portable tool to abuse children.

    Apple wants to stop that.  Go Apple go go go!
How about if I'm the arbiter of whether or not that "cute" photo of your child taking a bath is "kiddy porn"? Suppose I say it is? Then what? Your opinion doesn't matter. With this "well intentioned" screening, it seems obvious to me that we are put in the position of proving our innocence vis-à-vis every photo we upload. Privacy? Not much!!

    APPLE is trashing years and years of reputation with this. BAD IDEA.
    You don't understand how any of this works. 
  • Reply 53 of 66
fastasleep Posts: 6,418 member
    Sadly, due to this CSAM scanning, I will almost certainly be moving away from Apple.  
LOL, to what, Google? They already use both hash matching AND machine-learning models to scan all their platforms, the latter being far more aggressive than Apple's approach:
    https://blog.google/technology/safety-security/our-efforts-fight-child-sexual-abuse-online

    To Microsoft? They helped develop this hash-matching technology:
    https://www.microsoft.com/en-us/corporate-responsibility/digital-safety-content-report?activetab=pivot_1:primaryr3
    as well as this:
    https://blogs.microsoft.com/on-the-issues/2020/01/09/artemis-online-grooming-detection/

Fuck, there are a lot of stupid comments here.
  • Reply 54 of 66
fastasleep Posts: 6,418 member
    gatorguy said:

and if the recipient or sender is identified (by Apple?) as a minor.
Uh, identified as a minor by the parent who enables this control in their Family Sharing settings; and it only applies to ages 12 and under, as clearly explained by Apple in the initial documentation, which you clearly didn't read.
  • Reply 55 of 66
fastasleep Posts: 6,418 member
    mike54 said:
It's not about CSAM; it's not even about iCloud images. CSAM is the excuse to install surveillance software on every iPhone, iPad, macOS device, and Apple Watch. This framework, now part of the OS, will certainly be used to search for other content on your Apple devices in the future. This is why the software is on your device in the first place, not CSAM. CSAM is the gateway. The door is opened, the mechanism is in place. And Federighi says "multiple levels of auditability", which simply means 'believe us'.

How long has Apple been working on implementing this spy software? Maybe 1.5 to 2 years, since it was first discussed. Tim Cook would have had to sign off on it. Therefore, he had already signed off on this while he was going on and on about privacy, with his multi-million-dollar advertising campaign.

Tim Cook has deceived all Apple users and has lost all integrity. He is a pro snake-oil salesman. He has used 'privacy' as a sales tool, and people believed him, but it turned out to be a huge fat lie. He has made Apple untrustworthy on its most significant marketing differentiator from Google. The board should immediately terminate Tim Cook in disgrace and roll back this appalling decision.

Hey, do you know what Spotlight does? What's prevented Apple from uploading your Spotlight database from your devices? Or your Photos library object/person-detection database? How do you think those models are trained? Or any other repositories of easily exfiltrated metadata you store on your device?

    Explain how THIS feature is somehow the thing that will be abused by Apple, but none of those things were.
  • Reply 56 of 66
crowley Posts: 10,453 member
    elijahg said:
    crowley said:
    GG1 said:
    entropys said:
    Why does it matter who owns the hash list?
The point is that the process has been created. The algorithm can simply be pointed at any other hash list of photos (or any data, really) that Apple is made to use, by a despot, a state actor, the courts, or whoever. Apple won't be able to say no once it has created the ability.

Do not create the ability in the first place, and Apple won't be forced into trying to say no, and failing.
    This post and an earlier post by you demonstrate the slippery slope, such as:

Say I reside in Australia and have pics of Winnie the Pooh on my phone to entertain my kids. I visit China on business, where the CCP provides the hash database. When I leave China, I'm detained for having dissident information on my phone. Plausible scenario, or "the sky is falling"?
    The latter.

    Your Winnie the Pooh photos won't have uploaded to iCloud while you were in China, therefore won't have triggered the check.
What if he’s been travelling and had no Wi-Fi until he arrives in China, and then it uploads there?
    In this highly unlikely scenario, still nothing would happen because Apple aren’t going to report Winnie the Pooh photos.

    Come on.
  • Reply 57 of 66
crowley Posts: 10,453 member
If you are pro-Apple, keep in mind that the thousands of Apple employees signing this petition ARE APPLE. This warrantless intrusion into private data goes against everything Apple claimed to be as a company. The executives who thought this was a great idea should be fired. Don't worry, they have plenty of money, and Apple desperately needs new blood. The executives could certainly stand to be a bit more diverse too.
    Did I miss something?  This thread is about a Slack thread with 800 posts. That’s a big difference from a petition with thousands of signatures.
  • Reply 58 of 66
anonymouse Posts: 6,860 member
    crowley said:
If you are pro-Apple, keep in mind that the thousands of Apple employees signing this petition ARE APPLE. This warrantless intrusion into private data goes against everything Apple claimed to be as a company. The executives who thought this was a great idea should be fired. Don't worry, they have plenty of money, and Apple desperately needs new blood. The executives could certainly stand to be a bit more diverse too.
    Did I miss something?  This thread is about a Slack thread with 800 posts. That’s a big difference from a petition with thousands of signatures.
    Plus, if AppleInsider threads are anything to go by, 800 posts could represent 20-25 people, tops, half of them pro, half con. 
  • Reply 59 of 66
    elijahg said:
    "Apple Employees"? Multiple tens of thousands voiced concern or 20 did? I don't like these misleading headlines. It allows a few people to control the public message. 

    Apple, as another poster said, is still by far the best game in town. 

CSAM is probably not the hill on which to fight the privacy battle. Be sure that government will come to Apple and push for more. That's the better hill on which to fight.
800 posts, so unless there are a couple of people spending a lot of time posting on Slack, it's a fair number of people.

The problem is that at the top of that hill sits not only CSAM; behind a chain-link fence is all the legal data that every iOS user owns. Whilst the government may not have the means to climb the hill, they do hold a pair of wire cutters.
    Number of Apple employees is, I believe, around 150,000. 
800 posts -- yeah, I'm sure there wouldn't be posters repeatedly posting, that never happens. But let's say 2 posts per employee. That's 400 employees out of 150,000, or about a quarter of one percent (and chances are posters averaged more than 2, making it notably lower). Yeah, that is very worthy of big headlines of "Apple Employees Complain" and not misleading at all. So you're right, it is very substantial and a few are not controlling the message (sarcasm intended).

Top of the hill is CSAM? Seriously? It's not even a government thing; you can disable it by disabling iCloud Photos. Maybe they'll start gathering who you know and where you go and what you commonly type and what web pages you visit and what you buy... (among others) -- sometimes sharing that with government. Wouldn't that be the top of the hill if they did that?? (Let me guess, those items are A-OK, right?)
  • Reply 60 of 66
mubaili Posts: 453 member
Apple made a good compromise in implementing this feature. Apple has figured out a very creative way to protect user privacy yet still be able to spot a potential crime. This is a good demonstration of how end-to-end encryption can be used while potential crime can still be spotted. Now what I do wonder is whether governments will latch onto this innovative solution and ask Apple to save a digest of each conversation/iMessage etc., and come up with similar solutions to have Apple save evidence. The good thing is that Apple is transparent about its approach, and we can all decide whether it is a good solution or not.