hijinx

About

Username
hijinx
Joined
Visits
0
Last Active
Roles
member
Points
3
Badges
0
Posts
1
  • Outdated Apple CSAM detection algorithm harvested from iOS 14.3

    >As mentioned, users have to save the photo. Not just one photo, but 30 photos. You think you’re going to convince someone to save 30 photos they received from a random person?

    LOL, this is radically FALSE. You don't have to convince anyone. For example, WhatsApp has an option to automatically save received photos to the device (I've had it enabled since day one). So if someone sends you 30 photos, not necessarily sex-related but simply CSAM collisions (there's code available to generate those), they get automatically synced to iCloud and scanned, and the result triggers a review by the Apple employee who plays police. If some of those photos, viewed at low resolution, resemble sex-related material, you'll have police knocking at your door and calling you a pedophile. This is not a joke.
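    The attack described above hinges on the match-counting threshold: the real system uses NeuralHash with private set intersection and threshold secret sharing, but the effect can be sketched with a plain set lookup. Everything below is illustrative; the function names and the hash values are made up, and only the 30-image review threshold comes from Apple's published description.

```python
# Hypothetical sketch of the threshold-matching step, NOT Apple's actual
# implementation (which uses NeuralHash + private set intersection).

REVIEW_THRESHOLD = 30  # number of matches Apple said would trigger human review

def count_matches(photo_hashes, blocklist):
    """Count how many uploaded photo hashes collide with the blocklist."""
    return sum(1 for h in photo_hashes if h in blocklist)

def needs_review(photo_hashes, blocklist):
    """True once the account crosses the review threshold."""
    return count_matches(photo_hashes, blocklist) >= REVIEW_THRESHOLD

# An attacker who can craft 30 colliding images (collision generators for
# NeuralHash are public) and get them auto-saved to a victim's camera roll
# crosses the threshold without any deliberate action by the victim.
blocklist = {f"hash{i}" for i in range(100)}       # stand-in for the CSAM hash DB
attacker_payload = [f"hash{i}" for i in range(30)] # 30 crafted collisions
print(needs_review(attacker_payload, blocklist))   # True
```

    The point of the sketch: the victim's only "action" is having auto-save enabled, so the threshold alone does not prevent a third party from pushing an account over it.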
