WhatsApp latest to pile on Apple over Child Safety tools

Posted in General Discussion, edited August 2021
WhatsApp chief Will Cathcart on Friday dinged Apple's plan to roll out new Child Safety features that scan user photos to find offending images, the latest in a string of criticisms from tech experts who suggest the system flouts user privacy.

Cathcart outlined his position in a series of tweets, saying Apple's plan to combat child sexual abuse material (CSAM) is a step in the wrong direction and represents a "setback for people's privacy all over the world."

"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world," Cathcart said. "Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy."

Announced on Thursday, Apple's tools will help identify and report CSAM by comparing hashes of images uploaded to iCloud against a database of known CSAM hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.

Before a photo is uploaded to iCloud Photos, a hash is generated on-device and matched against the known CSAM database, which Apple transforms into an unreadable set of hashes for storage on a user's device. The process produces a cryptographic safety voucher that is uploaded to iCloud along with the photo. Using a technique called threshold secret sharing, the system ensures that voucher contents cannot be viewed by Apple unless an iCloud account surpasses a predefined threshold of matches. Further, Apple can only interpret vouchers related to matching CSAM images.

Apple says the technique enables CSAM detection while affording end users a high level of privacy.
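As a rough illustration only, the client-side flow Apple describes resembles the sketch below. The function names, placeholder hash list, and threshold value are hypothetical, and the real system relies on the NeuralHash perceptual hash, private set intersection, and threshold secret sharing rather than the plain lookups shown here.

```python
# Hypothetical, simplified sketch of the on-device matching flow described above.
# SHA-256 stands in for the perceptual hash, and a plain set lookup stands in for
# the blinded, on-device representation of the NCMEC-derived hash database.
import hashlib

KNOWN_CSAM_HASHES = {"<hash-1>", "<hash-2>"}  # placeholder for the NCMEC-derived set
REPORT_THRESHOLD = 30  # illustrative value; the article does not state Apple's actual threshold

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()

def make_safety_voucher(image_bytes: bytes) -> dict:
    """Produced on-device before upload; in the real protocol the match result
    is encrypted and unreadable to the server until the threshold is crossed."""
    digest = image_hash(image_bytes)
    return {"hash": digest, "matches_known_csam": digest in KNOWN_CSAM_HASHES}

def account_exceeds_threshold(vouchers: list) -> bool:
    """Server-side check: only after this returns True could Apple decrypt
    and review the matching vouchers."""
    return sum(v["matches_known_csam"] for v in vouchers) >= REPORT_THRESHOLD
```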

Cathcart also takes issue with another new feature that automatically blurs sexually explicit images in Messages if a user is under 17 years old. Additionally, parents can opt to be notified when a child under 13 years old sends or receives a message containing such material. Images are analyzed on-device with the help of machine learning.

"I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world," Cathcart said. "People have asked if we'll adopt this system for WhatsApp. The answer is no."

Cathcart said WhatsApp has worked hard to fight CSAM and last year reported more than 400,000 cases to NCMEC without breaking its encryption protocols.

It is perhaps unsurprising that WhatsApp, an arm of Facebook, is quick to bemoan Apple's initiative. Facebook is under threat from privacy-minded changes Apple delivered with iOS 14.5. Called App Tracking Transparency, the feature requires developers to ask users for permission before using Identifier for Advertisers (IDFA) tags to track their activity across apps and the web. Facebook believes a majority of users will opt out of ad tracking, severely disrupting its main source of revenue.

Along with Cathcart, other tech industry insiders have spoken out against Apple's child protection changes. Edward Snowden, Epic CEO Tim Sweeney, the Electronic Frontier Foundation and others have posited that the system, while well intentioned, could lead to abuse by governments or nefarious parties.

The uproar in large part stems from Apple's very public commitment to user privacy. Over the past few years the company has positioned itself as a champion of privacy and security, investing in advanced hardware and software features to forward those goals. Critics argue Apple's new Child Safety tools will tarnish that reputation.

Read on AppleInsider

Comments

  • Reply 1 of 25
    elijahg Posts: 2,759 member
    When even the likes of WhatsApp are piling on with regards to privacy, you know you’ve made a mistake.
  • Reply 2 of 25
    gerard Posts: 83 member
    I think Apple should take the time to make sure they implement something like this only after all the pros and cons are sorted out. As much as I admire their philosophy, it doesn't mean they always have the correct solution. We all want to protect innocent children from unnecessary harm, but if parents have to deal with claims that end up being deemed harmless, children would still be affected negatively. Children could possibly be removed from homes by social services while these things play out.
  • Reply 3 of 25
    There are some concerns to work out, but I applaud Apple for doing something to combat child sexual abuse materials. It's only going to roll out in the US at first, it uses hashes from reputable sources, and it requires multiple hash matches before anything is reported. The people and groups attacking Apple haven't done anything themselves to protect children or offered alternative measures. If you aren't part of the solution, you are part of the problem, WhatsApp, EFF, and Epic.
  • Reply 4 of 25
    Poster 'I Got A Bowl Cut': sarcasm for the win.
    Facebook (WhatsApp) is one of the worst, if not the worst, purveyors of private data exploitation. That's not up for debate, though there will be those who argue it (they hate the truth on it). But how ironic that Apple, the 'your private data is private' company, has ceded that high ground to probably the biggest purveyor of private data exploitation. That's a mistake one would think had to be a comedy of errors.
    Apple claims they supposedly have a privacy-preserving way to check for disturbing kiddie porn images without changing the underlying privacy standards? Great, so why not lead with that information and an easy-to-understand explanation of why that is, versus letting it all out at once as 'scanning your pics' (then fill in the creepy kiddie porn thing)? Apple, use your friggin heads and explain the data privacy first! Tomorrow or next week, roll out "We are trying to help exploited children." Come on, this isn't rocket science: you are a privacy-first company, so explain the data privacy angle first. Instead they sound like just another surveillance capitalism company -- where Facebook (WhatsApp) is the high ground for data privacy? Surreal....
  • Reply 5 of 25
    badmonk Posts: 1,293 member
    I have to say, as an iOS user (and Apple booster), I have a lot of conflicted feelings about this.  I think the premise of NCMEC matching is flawed…scanning iCloud for a collection of known child abuse photos (I am guessing they are 5-30 years old) will do little to prevent ongoing child abuse. I realize there may still be utility in removing these images from circulation, helping survivors cope, etc., but I doubt that active child abuse will be thwarted by this program since, by design, it is backward looking.

    And we have to deal with governments and bad actors planting these known photos on devices to implicate the innocent, the slippery slope, etc.

    It seems to me that basic community policing will do more to prevent child abuse.  In my community most of the notorious cases of child abuse were discovered by neighbors and the community.  The cases were being perpetrated by the marginalized (strange religious beliefs that their children were devil-possessed, etc.), and I suspect community policing will do more to protect the children.
  • Reply 6 of 25
    baconstang Posts: 1,105 member
    Looks like Apple just decimated their phone market amongst the clergy...
  • Reply 7 of 25
    Jeffrey Epstein would have moved to Android already.
  • Reply 8 of 25
    Rayz2016 Posts: 6,957 member
    Facebook schooling Apple on privacy. 

    It’s like one of those What if … movies. 
  • Reply 9 of 25
    crowley Posts: 10,453 member

    "Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy."
    That's a misrepresentation.  While the software probably could scan all the private photos on your phone, that's not how it is being deployed.  It only "scans" (in reality, computes a cryptographic hash and compares it with a database of cryptographic hashes) the photos as they are prepped to be uploaded to iCloud.
  • Reply 10 of 25
    crowley Posts: 10,453 member

    Cathcart said WhatsApp has worked hard to fight CSAM and last year reported more than 400,000 cases to NCMEC without breaking its encryption protocols.
    Would be interesting to know how they accomplished this.  If it's just users reporting images they receive, then that's not WhatsApp working hard at all.
  • Reply 11 of 25
    darkvader Posts: 1,146 member
    They're right.  This utterly destroys any trust anyone should have in Apple.

    This WILL be used for nefarious purposes. 
  • Reply 12 of 25
    bulk001 Posts: 764 member
    What nefarious purposes? The government is tracking you to Target? You have a subscription to a legal porn site? You are having an affair? Nobody cares! They have access to all your financial information already. And to your phone meta data. The ones who will use this against you can access it by just arresting you and forcing you to give over the data or hack into it like Saudi Arabia is doing to murder dissidents. 
    edited August 2021
  • Reply 13 of 25
    cpsro Posts: 3,198 member
    nytesky said:
    There are some concerns to work out, but I applaud Apple for doing something to combat child sexual abuse materials. It's only going to roll out in the US at first, it uses hashes from reputable sources, and it requires multiple hash matches before anything is reported. The people and groups attacking Apple haven't done anything themselves to protect children or offered alternative measures. If you aren't part of the solution, you are part of the problem, WhatsApp, EFF, and Epic.
    The real reason Apple is doing this is to provide the back door some governments demand in exchange for doing business there. E.g., you can bet Tank Man will be in the China version of the database, the thresholds for reporting will be lower, and it won’t be Apple employees verifying the hash hits.
    edited August 2021
  • Reply 14 of 25
    cocho Posts: 8 member
    Apple always boasted about user privacy - just marketing BS!
  • Reply 15 of 25
    macplusplus Posts: 2,112 member
    I think he wants to point to another, more serious issue between the lines regarding Apple's scheme:

    What about WhatsApp backups, which by design reside in iCloud? Will Apple's scheme monitor WhatsApp backups too? Apparently not, since Apple mentions only iCloud Photos.

    Why do we suppose that child abusers won't use WhatsApp or other apps, and will always commit their crimes via iCloud Photos?
  • Reply 16 of 25
    crowley Posts: 10,453 member
    macplusplus said:
    I think he wants to point to another, more serious issue between the lines regarding Apple's scheme:

    What about WhatsApp backups, which by design reside in iCloud? Will Apple's scheme monitor WhatsApp backups too? Apparently not, since Apple mentions only iCloud Photos.

    Why do we suppose that child abusers won't use WhatsApp or other apps, and will always commit their crimes via iCloud Photos?
    Why would anyone suppose that?
  • Reply 17 of 25
    Apple’s commitment to privacy is as reliable as Ming’s promise not to blast Dale Arden into space.
  • Reply 18 of 25
    macplusplus Posts: 2,112 member
    crowley said:
    I think he wants to point to another, more serious issue between the lines regarding Apple's scheme:

    What about WhatsApp backups, which by design reside in iCloud? Will Apple's scheme monitor WhatsApp backups too? Apparently not, since Apple mentions only iCloud Photos.

    Why do we suppose that child abusers won't use WhatsApp or other apps, and will always commit their crimes via iCloud Photos?
    Why would anyone suppose that?
    Apparently Apple does. Since their scheme is not system-wide, it operates only on iCloud Photos. For a WhatsApp photo to become an iCloud photo it has to be automatically saved to the Photos library, but that can be turned off in WhatsApp settings. If turned off, the child abuse photo will never make it to iCloud; it will reside in WhatsApp history, or will reach iCloud only if WhatsApp backup is active, about which Apple is totally silent.

    Or maybe Apple already knows that WhatsApp has an efficient scheme and they don’t want to break it?

    If Apple is under such a legal liability as to report child abuse cases, that liability should encompass the totality of iCloud, including other apps’ storage. Apple cannot say “this is not my realm, go after WhatsApp”.
    edited August 2021
  • Reply 19 of 25
    crowley Posts: 10,453 member
    crowley said:
    I think he wants to point to another, more serious issue between the lines regarding Apple's scheme:

    What about WhatsApp backups, which by design reside in iCloud? Will Apple's scheme monitor WhatsApp backups too? Apparently not, since Apple mentions only iCloud Photos.

    Why do we suppose that child abusers won't use WhatsApp or other apps, and will always commit their crimes via iCloud Photos?
    Why would anyone suppose that?
    Apparently Apple does. Since their scheme is not system-wide, it operates only on iCloud Photos. For a WhatsApp photo to become an iCloud photo it has to be automatically saved to the Photos library, but that can be turned off in WhatsApp settings. If turned off, the child abuse photo will never make it to iCloud; it will reside in WhatsApp history, or will reach iCloud only if WhatsApp backup is active, about which Apple is totally silent.

    Or maybe Apple already knows that WhatsApp has an efficient scheme and they don’t want to break it?

    If Apple is under such a legal liability as to report child abuse cases, that liability should encompass the totality of iCloud, including other apps’ storage. Apple cannot say “this is not my realm, go after WhatsApp”.
    Sure they can.
  • Reply 20 of 25
    zimmie Posts: 651 member
    crowley said:

    "Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy."
    That's a misrepresentation.  While the software probably could scan all the private photos on your phone, that's not how it is being deployed.  It only "scans" (in reality, computes a cryptographic hash and compares it with a database of cryptographic hashes) the photos as they are prepped to be uploaded to iCloud.
    Honestly, it wouldn't make sense to implement this anywhere but the iCloud photo sync process. If that's where it is (and we will see soon enough), then no, it couldn't scan all the photos on your phone; only the ones sent to it directly. Of course, the hazard with a closed OS is that random users can't confirm this themselves; they have to trust specialist analysis.

    I'm still digesting the technical analysis papers on the CSAM detection. This application of Threshold Secret Sharing is fascinating. I've only ever used it for "You need at least four managers to agree to get access to the certificate authority" kind of stuff.
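For illustration, a minimal, generic k-of-n threshold secret sharing sketch (Shamir-style, over a prime field) looks roughly like the following. It is not Apple's code and says nothing about the safety-voucher protocol itself; it only shows why any k of n shares suffice to reconstruct a secret while fewer reveal nothing.

```python
# Generic Shamir-style threshold secret sharing: n shares, any k reconstruct.
# Purely illustrative; unrelated to Apple's actual safety-voucher implementation.
import random

PRIME = 2**127 - 1  # field modulus; must exceed the secret

def split(secret: int, n: int, k: int):
    """Split `secret` into n points on a random degree-(k-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from >= k shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(secret=123456789, n=6, k=4)
assert reconstruct(shares[:4]) == 123456789  # any four "managers" suffice
```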