elijahg

About

Username: elijahg
Joined:
Visits: 397
Last Active:
Roles: member
Points: 6,576
Badges: 2
Posts: 2,904
  • Apple sued over 2022 dropping of CSAM detection features

    I'm late to this thread, but I don't get it.

    Say I'm a pedophile and take a picture of a naked kid. Obviously that pic is not in any database of child porn yet. Then I try to upload it to a CSAM-enabled iCloud. The AI flags it as possible child porn, so an Apple employee takes a look to be sure? How does the reviewer know the age of my subject? How does the reviewer know that I'm a pedophile rather than a horny teen taking pictures of my consenting 17-year-old boyfriend/girlfriend? Hell, maybe it's a picture of me.

    Also, regarding this lawsuit, the plaintiff isn't even a user of Apple products (or at least not the ones in question). Some third party shared their pic using iCloud. They could have done the same thing by spinning up their own website. Impossible for me to see how Apple bears any responsibility here.
    The AI doesn't use contextual awareness to work out what's in the photo; it just hashes it and compares with a database. Presumably the database is updated every time a device with CSAM on it is found by law enforcement, so even if that photo *is* shared, the creator won't get caught until someone they've shared it with (or the creator themselves) is caught and has their photos added to the database. And the creator won't get caught if they delete the photo, either. Messages, OTOH, asks before a child receives (and maybe sends, not sure) a picture containing nudity, as judged by on-device AI.
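    To make the "hashes and compares with a database" idea concrete, here's a minimal Swift sketch of the exact-match version. Everything here is invented for illustration: the hash list stands in for a database like NCMEC's (which obviously doesn't publish its entries), and a real pipeline is far more involved.

    ```swift
    import CryptoKit
    import Foundation

    // Hypothetical stand-in for a database of known-image hashes (entries invented).
    let knownImageHashes: Set<String> = [
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
    ]

    // SHA-256 of the raw image bytes, hex-encoded.
    func sha256Hex(_ data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // Exact matching only: a brand-new photo, or one deleted everywhere before
    // anyone was caught, can never appear in the database, per the post above.
    func isKnownImage(_ imageData: Data) -> Bool {
        knownImageHashes.contains(sha256Hex(imageData))
    }

    print(isKnownImage(Data("some image bytes".utf8))) // false: not in the database
    ```

    A plain cryptographic hash like this only matches byte-identical copies, which is exactly the limitation the fuzzy-matching point later in the thread is about.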
  • Apple sued over 2022 dropping of CSAM detection features


    williamh said:
    chasm said:
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 


    That question is way above my pay grade, but my GUESS is that, as Apple mentioned in the original announcement, even an encrypted photo can be algorithmically searched for a certain pattern, in this case human nudity.

    Now lots of perfectly innocent pictures contain some level of nudity, so — again, guessing here — that would be combined with information about where the image was sent and/or who uploaded it. If any of the recipients or senders is a child account, that plus the nudity might flag it — and then law enforcement could be contacted to obtain a court order to decrypt.
    The basic concept of CSAM scanning doesn't involve searching for patterns in your images, not nudity or anything else. The way it worked was to compare hashes of your images to a database of hashes of known CSAM images. The database came from the National Center for Missing and Exploited Children, which maintains hashes of images that were proven in criminal cases to be of minors.

    The concerns about having CSAM on our devices as part of the detection system were unwarranted and based on a misunderstanding of how the system works. A potentially valid concern is the probability of hash collisions. I recall Apple's response to that was that they weren't going to alert on single matches.
    That's not entirely true. Whilst it did compare hashes, Apple's CSAM scanner used fuzzy matching, so false positives were possible, which is why they had a human reviewer.

    "The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value." https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
  • New in iOS 18.2 developer beta 3: Changes to Apple Intelligence, video playback, and more

    Moving the video controls from the bottom of the screen to the middle, over the content, was completely stupid. Not only does it make them harder to reach with one hand, they're in front of the content, so you have to tap somewhere else to dismiss them, or wait. Hopefully this change moves them back.
  • iPhone 16 Pro users face random freezes and repeated restarts

    quazze said:
    I have been experiencing issues with Apple CarPlay with my iPhone 16 Pro. If the interface is not on, I get this odd lower-volume sound on whatever I’m listening to … and sometimes it doesn’t even want to play through my car speakers at all, but after a few minutes, it’ll turn on through the speakers.
    I've had it not play through the car (or any) speakers when connected to CarPlay too. Thought it was a car bug rather than iOS, but maybe it is iOS.
  • Roadside Assistance via Satellite expands into the UK

    As a UK-based user of a two-year-old iPhone 14, what happens next?
    Just tried it, and it still seems to work, with an option like the one in the screenshot, from the UK partner, Green Flag assistance.
    I wonder how much they charge?
    I wonder this too. I already have coverage with someone other than Green Flag, so I guess I can't use it unless I want to pay £££ for recovery.