WhatsApp latest to pile on Apple over Child Safety tools

Comments

  • Reply 21 of 25
    Beats Posts: 3,073 member
    “Child Safety”

    What a fu**ing joke of a name. This feature does NOTHING to keep children safe. I can’t believe people fall for this crap.
  • Reply 22 of 25
    crowley Posts: 10,453 member
    Beats said:
    “Child Safety”

    What a fu**ing joke of a name. This feature does NOTHING to keep children safe. I can’t believe people fall for this crap.
    There are three features that Apple are bundling under the banner of child safety.

    And if CSAM scanning puts a single child abuser behind bars then it's done a great thing.
  • Reply 23 of 25
    crowley said:

    "Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy."
    That's a misrepresentation.  While the software probably could scan all the private photos on your phone, that's not how it is being deployed.  It only "scans" (in reality, it computes a cryptographic hash of each image and compares it against a database of known hashes) the photos as they are prepped to be uploaded to iCloud.
    That's not how it's being deployed...for now.  And only Apple employees will be reviewing flagged images...for now.  

    There will be pressure from governments to use this technology for other purposes, and at some point Apple will accede to that pressure, just as they have in the past for other functionality, because they put "complying with local laws" above their principles when it comes to operating on a worldwide stage.  Fascist and other authoritarian regimes will take this as an opportunity to increase their surveillance of "dangerous" people no matter how Apple tries to lock it down.  Governments' current use of hacking hardware and software is testament to that.  Even allegedly "democratic" governments want to know more than they need to, or should, know these days.

    The biggest problem I see with this is that it just won't work over the long term to stop the real problem, i.e. the traffickers who specialize in this type of material and its sale and distribution all over the world.  It will almost certainly catch small players, like the guy taking pictures of his neighbors' kids, but the big players have already figured out ways around this, and in fact probably already have systems in place to thwart this kind of detection.  Hell, I can think of one way that's not even particularly difficult.

    Another problem I can see happening is that this has the potential to flag innocuous material.  I realize the current tech will be comparing to a database of existing images, but as another poster alluded to, if this algorithm can flag pics that appear in different formats, sizes, bit depths, etc., using "fuzzy" hashing, the potential to flag innocuous material, such as kids swimming or bathing, exists.  Many parents I know take and keep such pictures with no intent towards inappropriate use.
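
    To illustrate what I mean by "fuzzy" hashing, here's a toy sketch in Python.  It uses a simple average hash, which is not what Apple actually uses (NeuralHash is a neural-network-based perceptual hash); the filenames and match threshold are invented for the example:

    ```python
    # Toy "fuzzy" image hash (average hash).  Resizing or re-encoding an
    # image barely changes the hash -- and that same robustness is what
    # creates room for false matches on innocent photos.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """64-bit hash: one bit per pixel, set where brighter than the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of bits on which two hashes differ."""
        return bin(a ^ b).count("1")

    # Two files count as "the same image" if their hashes are close enough.
    MATCH_THRESHOLD = 5  # bits; an invented number for this toy example

    h1 = average_hash("original.jpg")       # placeholder filenames
    h2 = average_hash("recompressed.jpg")
    if hamming(h1, h2) <= MATCH_THRESHOLD:
        print("flagged as a match")
    ```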

    And last but not least, it's remotely possible that some bright boy will get hold of the database of image hashes and fabricate images that match one or more of them without containing any objectionable material at all (time-consuming, and unlikely depending on the size of the hash, but not impossible).  And then upload them to iCloud repeatedly.  Hilarity, for a sufficiently vague definition of such, will no doubt ensue.
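
    To put rough numbers on "depending on the size of the hash", here's a toy experiment in Python against a deliberately tiny 16-bit hash (truncated SHA-256 standing in for a real perceptual hash, with a made-up database of random entries).  Against about 1,000 entries it should find a collision in roughly 2**16 / 1,000 ≈ 65 tries; every extra bit doubles the work, which is why real systems use much longer hashes:

    ```python
    # Toy collision hunt against a deliberately tiny 16-bit hash.  This
    # says nothing about attacking Apple's actual NeuralHash; it only
    # shows how hash length scales the brute-force cost.
    import hashlib
    import secrets

    def tiny_hash(data: bytes, bits: int = 16) -> int:
        """Truncate SHA-256 to `bits` bits to simulate a short hash."""
        digest = hashlib.sha256(data).digest()
        return int.from_bytes(digest, "big") >> (256 - bits)

    # Pretend this is the leaked database of flagged-image hashes.
    database = {tiny_hash(secrets.token_bytes(32)) for _ in range(1000)}

    attempts = 0
    while True:
        attempts += 1
        candidate = secrets.token_bytes(32)  # stand-in for a harmless image
        if tiny_hash(candidate) in database:
            print(f"matching 'innocent' blob after {attempts} tries")
            break
    ```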
  • Reply 24 of 25
    Apple’s commitment to privacy is as reliable as Ming’s promise not to blast Dale Arden into space.
    I'm not sure you understand the reasons why he was called "The Merciless".