jSnively

About

Username
jSnively
Joined
Visits
306
Last Active
Roles
administrator
Points
1,143
Badges
2
Posts
463
  • Apple sued over 2022 dropping of CSAM detection features

    williamh said:

    In a truly E2E (iCloud Photos is not unless you turn it on) system, hash scanning doesn't work, as Apple never sees the original content on their servers. The only way to do it is to scan on device before anything goes anywhere. Apple's original announcement around this effort was somewhat convoluted (but actually pretty smart) and didn't trigger on single matches (for the false-positive reasons you mention), but it did have to do with scanning actual content, not just hashes, which is why there's probably confusion.

    As I understood it, your device would have a CSAM hash database and would hash your images, so the detection would happen on your device using data on your device. In any case, the technique involved hashes, not trying to determine the image content.

    I'm trying to think back to the last time this came up and we looked into it for editorial purposes. IIRC it's either actually illegal (there are a lot of really unintuitive laws around this stuff) or at the very least a bad idea to distribute that database to the phones in the first place, as it can be extracted and used to circumvent detection. There were also concerns about 'poisoning' the database with other content, especially in countries with fewer (or no) freedom of speech protections.

    There was a fuzzy ML implementation approach akin to Microsoft's PhotoDNA included in the initial pitch as well. It wasn't just a dumb hash-based solution; it (like Microsoft's in-use PhotoDNA) was designed to detect changes (cropping, mirroring, color shifting, etc.) in the content while still identifying it. I do not recall off the top of my head if they ever released a full whitepaper on the approach. Someone else can dig it up if they care (edit: elijahg did link the pdf above.)
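    To make the fuzzy-matching idea concrete, here's a minimal difference-hash (dHash) sketch in Python. To be clear, this is a toy stand-in and not Apple's NeuralHash or Microsoft's PhotoDNA; it just shows how a perceptual hash can tolerate edits that would completely change a cryptographic hash, with matching done by Hamming-distance threshold rather than equality:

    ```python
    # Toy perceptual hash (dHash): hashes the *pattern* of an image rather than
    # its exact bytes, so small edits yield nearby hashes instead of unrelated ones.
    import numpy as np

    def dhash(gray: np.ndarray, size: int = 8) -> int:
        """Hash the sign pattern of adjacent-pixel brightness differences."""
        # Crude downsample to a (size x size+1) grid by picking spaced samples.
        rows = np.linspace(0, gray.shape[0] - 1, size).astype(int)
        cols = np.linspace(0, gray.shape[1] - 1, size + 1).astype(int)
        small = gray[np.ix_(rows, cols)]
        bits = (small[:, 1:] > small[:, :-1]).flatten()  # size*size bits
        return sum(int(b) << i for i, b in enumerate(bits))

    def hamming(a: int, b: int) -> int:
        """Bits that differ; a small distance means 'probably the same image'."""
        return bin(a ^ b).count("1")

    rng = np.random.default_rng(0)
    img = rng.uniform(0, 255, (64, 64))
    brighter = np.clip(img + 30, 0, 255)  # a brightness shift, not a new image
    print(hamming(dhash(img), dhash(brighter)))  # small: still within a match threshold
    ```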

    I believe it took about half a week after researchers got hold of Apple's implementation for testing to make it reliably generate false positives, which meant the whole thing was ultimately useless. Some of these projects are still up on GitHub, like https://github.com/anishathalye/neural-hash-collider . That's why Apple really decided not to do it in the end. It was a clever approach from a privacy-respecting perspective; it just didn't work well enough and had some external, world-facing issues to boot.
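    A toy illustration of why forced collisions are fatal: in the dHash sketch above, only the ordering of neighboring pixels survives into the hash, so wildly different images can hash identically. (The real attacks on NeuralHash, like the collider linked above, have to craft collisions with gradient descent; nothing this trivial.)

    ```python
    import numpy as np

    def dhash(gray: np.ndarray) -> int:
        # Same toy scheme as above, applied to an already-8x9 "image":
        # only the sign of adjacent-pixel differences is kept.
        bits = (gray[:, 1:] > gray[:, :-1]).flatten()
        return sum(int(b) << i for i, b in enumerate(bits))

    a = np.tile(np.arange(9.0), (8, 1))           # gentle left-to-right ramp
    b = np.tile(np.arange(9.0) * 25 + 7, (8, 1))  # steep, offset ramp: very different pixels
    assert not np.array_equal(a, b)
    print(dhash(a) == dhash(b))  # True -- different images, identical hash
    ```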

    The lawsuit itself is fraught with problems. Keep reading AppleInsider and we'll keep you updated on its progress 😁
  • Deleting messages from AI inbox?

    You know, I don't think there is. That's kinda ... dumb. I'm not sure how your read status would get reset, as I'm pretty sure that's handled server-side. Did you do anything out of the ordinary lately?

    That said, forums are finally on the roadmap. We wanted to get to them in 2024, but it just didn't happen. Things will improve next year 👍
  • Apple sued over 2022 dropping of CSAM detection features

    williamh said:
    chasm said:
    netrox said:
    If it's encrypted end to end, how is it possible to detect CSAM? 
    That question is way above my pay grade to know, but my GUESS is that, as Apple mentioned in the original announcement, even an encrypted photo can be algorithmically searched for a certain pattern, in this case human nudity.

    Now lots of perfectly innocent pictures contain some level of nudity, so (again, guessing here) that is combined with information about who the image was sent to or uploaded by. If any of the recipients or senders is a child account, that plus the nudity might flag it, and then law enforcement could be contacted to obtain a court order to decrypt.
    The basic concept of CSAM scanning doesn't involve searching for patterns in your images, nudity or anything else. The way it worked was to compare hashes of your images to a database of hashes of known CSAM images. The database came from the National Center for Missing and Exploited Children, which maintains hashes of images that were proven in criminal cases to be of minors.

    The concerns about having CSAM on our devices as part of the detection were unwarranted and based on a misunderstanding of how the system works. A potentially valid concern is the probability of hash collisions. I recall Apple's response on that was that they weren't going to alert on single matches.

    In a truly E2E (iCloud Photos is not unless you turn it on) system, hash scanning doesn't work, as Apple never sees the original content on their servers. The only way to do it is to scan on device before anything goes anywhere. Apple's original announcement around this effort was somewhat convoluted (but actually pretty smart) and didn't trigger on single matches (for the false-positive reasons you mention), but it did have to do with scanning actual content, not just hashes, which is why there's probably confusion.
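    For contrast, the plain exact-match scheme described above boils down to a set lookup. A deliberately simplified, hypothetical sketch (real NCMEC-derived lists are not distributed as bare SHA-256 digests, and the names here are illustrative):

    ```python
    import hashlib
    from pathlib import Path

    # Hypothetical stand-in for a database of known-image hashes.
    KNOWN_HASHES: set[str] = set()  # entries would come from the maintained list

    def sha256_of(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def matches_known(path: Path) -> bool:
        # Exact hashing: flipping one byte of the file yields a completely
        # unrelated digest, which is why deployed systems use perceptual
        # hashing rather than plain cryptographic hashes.
        return sha256_of(path) in KNOWN_HASHES
    ```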

  • Apple stuck the Mac mini power button on the bottom

    Penzi said:
    Just the oddball in me, but I wish they’d put the power button in the Apple logo on the top…

    They should do this, and they should also make the Apple logo on the phones the flashlight. Change my mind.
  • Twitter for Mac leaves the Mac App Store, but iPad X is still usable

    auxio said:
    blastdoor said:
    Twitter has proven amazingly resilient in the face of its malignant owner. I almost never go there anymore but clearly plenty of people do. My hope is that everyone who isn't a Musk fan will leave, but I'm afraid that hope might never be realized.
    The vast majority of Twitter accounts these days are for business, which includes influencers and other self promoters. Between those and bot accounts to pad numbers/harvest data, there isn't much genuine human interest in the platform.
    Bless your heart. https://www.msn.com/en-us/money/technology/maye-musk-is-a-proud-mom-as-sons-x-platform-scores-a-win-over-mark-zuckerbergs-instagram-and-facebook-elon-musk-says-there-is-still-limited-understanding/ar-BB1qZYOu

    Press X to doubt.

    Looking at the data on Similarweb, which he is citing here, it's obvious that there's a data irregularity there. Just look at the traffic numbers for April/May/June -- https://www.similarweb.com/website/x.com -- it does seem like they did some kind of Twitter/X merge in May for overall ranking, which makes sense, but I wouldn't be surprised if there's just a ton of duplicate data in there, which is what led to the drastic increase.