Internal Apple memo addresses public concern over new child protection features

Posted in iPhone, August 2021
An internal memo from Apple reportedly addresses concerns around the new CSAM detection and photo-scanning features, reiterating the company's commitment to user privacy while also protecting children.

On Thursday, Apple announced that it would expand child safety features in iOS 15, iPadOS 15, macOS Monterey, and watchOS 8. The new tools include a system that leverages cryptographic techniques to detect collections of known CSAM stored in iCloud Photos and to provide information to law enforcement.

The announcement was met with a fair amount of pushback from customers and security experts alike. The most prevalent worry is that the implementation could, at some point in the future, lead to broader surveillance of data sent to and received from the phone.

An internal memo, obtained by 9to5Mac, addresses these concerns to Apple staff. The memo was reportedly penned by Sebastien Marineau-Mes, a software VP at Apple, and reads as follows:

Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.

Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple's deep commitment to user privacy.

We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built. And while a lot of hard work lays ahead to deliver the features in the next few months, I wanted to share this note that we received today from NCMEC. I found it incredibly motivating, and hope that you will as well.

The memo also included a message from Marita Rodrigues, the executive director of strategic partnerships at the National Center for Missing and Exploited Children.

Team Apple,

I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.

It's been invigorating for our entire team to see (and play a small role in) what you unveiled today.

I know it's been a long day and that many of you probably haven't slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

Our voices will be louder.

Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.

During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.

Thank you for finding a path forward for child protection while preserving privacy.

Apple has yet to make a public-facing statement addressing the backlash.

Apple has long been viewed as a champion of user privacy. At WWDC 2021, the company unveiled plans to radically expand privacy features across the Apple ecosystem.


Comments

  • Reply 1 of 60
    Beats Posts: 3,073 member
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
  • Reply 2 of 60
    tylersdad Posts: 310 member
    This is monumentally bad for privacy. It's making me reconsider my investments in Apple products. My entire family belongs to the Apple ecosystem. We all have some version of the iPhone 12, iPads and Apple Watches. It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end? 


  • Reply 3 of 60
    mknelson Posts: 1,125 member
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
  • Reply 4 of 60
    *Surveillance By Apple* is wrong. I hope this is somehow stopped. Who would have thought privacy-focused Apple could be so exceptionally stupid.
  • Reply 5 of 60
    Wesley Hilliard Posts: 190 member, administrator, moderator, editor
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out; that number is compared to a database of numbers representing known images for an exact match. False positives are one in a trillion. This is oversimplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
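
For readers who want a concrete sense of what "a unique number based on how the pixels are laid out" could look like, here is a minimal Python sketch of a perceptual "average hash." This is not Apple's NeuralHash (which uses a neural network plus private set intersection); the function, the pixel grid, and the database values are all invented for illustration.

```python
# Toy illustration of hash-based matching: reduce an image to one number
# (a "fingerprint") and compare that number against a database of known
# hashes. This is NOT Apple's NeuralHash, just the general idea.

def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grayscale pixel grid.

    Each bit records whether a pixel is brighter than the image's mean,
    so the number describes how the pixels are laid out.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of fingerprints of known images.
known_hashes = {0x8F3A61B2C4D5E6F7, 0x0123456789ABCDEF}

def matches_known_image(pixels):
    # Only the fingerprint is compared; the photo itself is never inspected.
    return average_hash(pixels) in known_hashes

photo = [[(x * y) % 256 for x in range(8)] for y in range(8)]  # synthetic image
print(matches_known_image(photo))  # False, no match in the database
```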
  • Reply 6 of 60
    tylersdad Posts: 310 member
    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    How? By looking at private information on a privately owned phone without the consent of the owner or a warrant to search such information. 

    This is essentially a warrantless search of a person's personal property. 

    We've already seen that the government is colluding with social media to censor posts which they deem unfavorable to them. We've already seen the government asking people to rat out their family members and friends for what they perceive as "extremist" behavior. Is it really that much of a stretch to think the government would ask tech companies to monitor emails and messages for extremist content?
  • Reply 7 of 60
    tylersdad Posts: 310 member
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out; that number is compared to a database of numbers representing known images for an exact match. False positives are one in a trillion. This is oversimplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
  • Reply 8 of 60
    larryjw Posts: 1,031 member
    No computer system will ever have the capability of interpreting photos. We see that in police reliance on facial recognition. We see hospitals using software to decide levels of care, which results in white people getting extraordinary care while POC get lesser care. We see courts using software to predict recidivism.

    Sure, there's money in this kind of research and commercial software. So developers and companies will be more than happy to let someone else fix the major screwups the new technology causes. It's not like the people in this country have any clue about how to make good decisions to begin with, so let the computer do it -- it ain't my problem.
  • Reply 9 of 60
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures.

    To explain this quickly, this “scanning” applies only (for now) to photos before they are uploaded to iCloud Photos. If you don't use iCloud Photos, then this will never happen (for now).

    Your device scans your photos and calculates a photo “fingerprint” ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. 

    If by some chance you had a match, it would put a kind of safety token with the media while uploading it to iCloud Photos. If you reach some number of safety tokens someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).

    The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.
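
To make the flow described above concrete, here is a rough Python sketch of the on-device matching and "safety token" threshold as the commenter describes it. Every name and number here is an assumption for illustration: SHA-256 stands in for a perceptual fingerprint, and the threshold value is made up. Apple's published design reportedly uses threshold secret sharing, so even the match count is cryptographically hidden from the server until the threshold is crossed; this sketch shows only the counting logic.

```python
# Rough sketch of the described flow, not Apple's implementation: the device
# fingerprints each photo before upload, attaches a "safety token" on a
# match, and only past a threshold is anything flagged for human review.
import hashlib

THRESHOLD = 30  # hypothetical; the real threshold was not public

# Fingerprints of known bad material, downloaded to the device (made-up values).
known_fingerprints = {0xDEADBEEF0BADF00D, 0xCAFEF00D12345678}

def fingerprint(photo_bytes: bytes) -> int:
    # Stand-in for a perceptual hash; SHA-256 is used here only for brevity.
    return int.from_bytes(hashlib.sha256(photo_bytes).digest()[:8], "big")

def upload_to_icloud_photos(photo_bytes: bytes, account: dict) -> None:
    account.setdefault("tokens", 0)
    if fingerprint(photo_bytes) in known_fingerprints:  # checked ON THE DEVICE
        account["tokens"] += 1  # a safety token travels with the upload
    if account["tokens"] >= THRESHOLD:
        account["flagged_for_review"] = True  # logged, audited human check

account = {}
upload_to_icloud_photos(b"raw bytes of a vacation photo", account)
print(account)  # {'tokens': 0}, no match, so nothing is flagged
```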
  • Reply 10 of 60

    And who defends children from the mental and cultural castration produced by the gods' lunacy?

  • Reply 11 of 60
    M68000 Posts: 725 member
    Does anybody think that an internal memo being posted here, and maybe other places on the internet, is normal or appropriate? Especially knowing how secretive Apple is? I don't think most companies, Apple or otherwise, want their internal communications out in the wild!
  • Reply 12 of 60
    lkrupp Posts: 10,557 member
    tylersdad said:
    This is monumentally bad for privacy. It's making me reconsider my investments in Apple products. My entire family belongs to the Apple ecosystem. We all have some version of the iPhone 12, iPads and Apple Watches. It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end? 


    If it makes the perverts switch to Android, so be it and good riddance. And I doubt you will leave the platform when you realize you’re still better off with iOS than Android. Google will be doing exactly the same thing shortly as it usually follows Apple in these matters. Who will you go to? Your statement is just a fart in a wind storm. Good luck with it making any difference.
  • Reply 13 of 60
    Beats Posts: 3,073 member
    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    Well in that case we can stretch it to collaborating with me (government) because I’m allowing them to snoop through my photos.

    From the article:
    “In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…”
    -Tim Cook

    Maybe I’m reading it wrong but “legal” is a government entity. I would bet some looneys from Congress or the FBI twisted Apple’s arm to provide this.
  • Reply 14 of 60
    Apple has finally fallen off the same privacy slippery slope that Google did years ago. Well done, Tim. You finally caught up to them.
  • Reply 15 of 60
    Rayz2016 Posts: 6,957 member

    To explain this quickly, this “scanning” applies only (for now) to photos before they are uploaded to iCloud Photos. If you don't use iCloud Photos, then this will never happen (for now).

    Your device scans your photos and calculates a photo “fingerprint” ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. 

    If by some chance you had a match, it would put a kind of safety token with the media while uploading it to iCloud Photos. If you reach some number of safety tokens someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).

    The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.
    Well, that's just it. You're assuming that your media will be checked in a very secure way, but you don't know who is doing the checking or what happens to the image even if it's a false alarm.

    But the lack of transparency is the biggest concern. In the future, Apple will be given other databases to match against, and new laws will be drafted to prevent Apple from telling the citizens of a country that it is now searching for pictures of dissidents on their phones.

    So instead of telling us how it works (because we do get it), perhaps someone could tell us what safeguards are built in to protect protesters in foreign countries in the future?

    Anyone?

    Thought not. 
  • Reply 16 of 60
    tylersdad Posts: 310 member
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures.

    To explain this quickly, this “scanning” applies only (for now) to photos before they are uploaded to iCloud Photos. If you don't use iCloud Photos, then this will never happen (for now).

    Your device scans your photos and calculates a photo “fingerprint” ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. 

    If by some chance you had a match, it would put a kind of safety token with the media while uploading it to iCloud Photos. If you reach some number of safety tokens someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).

    The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.
    Just because it's happening ON MY DEVICE doesn't mean the image isn't being examined. You can't compare the image to another reference image without opening the image. You need to look at the 1s and 0s, and to do that, you need to open the file.
  • Reply 17 of 60
    tylersdad Posts: 310 member
    Beats said:
    mknelson said:
    Beats said:
    Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.
    Collaborating with the government how? (you are the government in a democracy btw).
    Well in that case we can stretch it to collaborating with me (government) because I’m allowing them to snoop through my photos.

    From the article:
    “In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…”
    -Tim Cook

    Maybe I’m reading it wrong but “legal” is a government entity. I would bet some looneys from Congress or the FBI twisted Apple’s arm to provide this.
    "legal" likely means Apple legal council, not some government entity. Most companies (especially large ones) have their own legal department. 
  • Reply 18 of 60
    Rayz2016 Posts: 6,957 member
    M68000 said:
    Does anybody think that an internal memo being posted here, and maybe other places on the internet, is normal or appropriate? Especially knowing how secretive Apple is? I don't think most companies, Apple or otherwise, want their internal communications out in the wild!
    It happens quite a lot actually. 

    Especially when they want it leaked. 
  • Reply 19 of 60
    dewme Posts: 5,362 member
    These services are likely a proactive move by Apple to finally quell the barrage of requests from government agencies for Apple to open a backdoor for authorities to go around Apple's privacy and security protection features. It's not at all unusual for those who seek the backdoor solution to bring up child protection as a reason why backdoors are needed. If Apple takes the child protection argument off the table it gives them further justification for not adding the backdoor that authorities so desperately covet. In reality, it just buys Apple (and us) more time because those who seek the backdoor approach are never going to be satisfied with anything less than a master key that allows them unfettered access to whatever they want, whether or not they truly have a legitimate need or even a legal right to it. The standard operating principle in nearly all of these cases of authoritarian overreach is to ask for forgiveness, not to ask for permission.

    I haven't delved into all of the details of Apple's safeguards but from what I've read so far it sounds like classic signature matching, much like the technology behind Shazam. Everyone everywhere should always assume that everything sent over a communication link unencrypted (or easily decrypted) is being scanned and analyzed to extract information of interest (for whatever reasons, bizarre or legitimate) from the raw data. Everyone everywhere should also assume that all data and information that is collected from every source of acquisition in multiple formats is being fused together for additional processing.

    I'm not advocating that anyone live in a constant state of paranoia. I'm simply saying that we no longer live in a world where individuals can rationally sustain a universal expectation of privacy. Once you step outside of your own personal space, physically or virtually, you are sharing some bit of data about yourself with someone or something.  If you're walking or driving in an area with other humans, there are public and private cameras that see you even if you're not carrying a connected device. Last time I went into a popular gas station I counted no fewer than 14 cameras inside the store while I was waiting for my sandwich order to be completed. If you're carrying a connected device you're divulging a heck of a lot more data to be fused with captured images. Traversing the internet is no less private, no matter what you do to limit your exposure. VPNs and companies like Apple that value privacy and security in their products/services are helpful, but I still see their "protection" as temporary and quite fragile, as we've seen with recurring privacy and security breaches.

    All I'm saying is: assume you're always being watched, and if there's something you're planning on doing, physically or electronically, that you wouldn't want anyone else observing, maybe think twice about doing it. This is simply where we are in today's society, whether we like it or not.


  • Reply 20 of 60
    StrangeDays Posts: 12,877 member
    tylersdad said:
    tylersdad said:
    It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
    Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out; that number is compared to a database of numbers representing known images for an exact match. False positives are one in a trillion. This is oversimplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.

    Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
    Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?

    Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology. 
    Sorry, but you're ignorant. A hash is a representation of the image but is not the image, much like Touch ID stores a numeric string that represents your fingerprint but is *not* your fingerprint. Face ID does this too.
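
A quick way to see the distinction this commenter is drawing: the Python sketch below uses SHA-256, a cryptographic hash (Apple's NeuralHash is perceptual rather than cryptographic, but the point is the same). The digest is derived from the image bytes yet cannot be turned back into them; the byte string here is a placeholder, not real photo data.

```python
# A hash is derived from the data but is not the data.
import hashlib

image_bytes = b"pretend these are the raw bytes of a multi-megabyte photo"
digest = hashlib.sha256(image_bytes).hexdigest()

print(digest)  # a fixed 64-character hex string, regardless of input size
# There is no function that recovers image_bytes from digest; comparing two
# digests reveals only whether the inputs were byte-for-byte identical.
```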