exceptionhandler
About
- Username: exceptionhandler
- Joined
- Visits: 967
- Last Active
- Roles: member
- Points: 344
- Badges: 0
- Posts: 381
Reactions
Outdated Apple CSAM detection algorithm harvested from iOS 14.3 [u]
This is the kind of stuff I alluded to in a previous comment:
https://forums.appleinsider.com/discussion/comment/3327620/#Comment_3327620
People will try to exploit this. It would be better if it wasn’t there at all. The hashes SHOULD be unique, but by the nature of hashing (a many-to-one mapping), collisions can happen. You may sooner win the lottery than encounter one by chance… but what if it wasn’t by chance? It was only a matter of time before someone crafted an image or images that match hashes in the CSAM database and started propagating them. Now Apple may have a human review step after several flags, but what happens if they get flooded with requests because a series of images gets propagated? Humans can only work so fast. And this human review implies images may be sent to people at Apple, which would necessarily circumvent the encryption that prevents anyone but the intended parties from seeing the images.
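To make the collision point concrete, here is a toy sketch of an 8x8 average hash in Swift. This is NOT Apple’s NeuralHash and the names are mine; it just illustrates why any scheme that boils an image down to a small fingerprint must have collisions:

```swift
// Toy 8x8 average hash; NOT Apple's NeuralHash, purely an illustration.
// There are vastly more possible images than 64-bit fingerprints,
// so collisions must exist; the only question is how hard they are to craft.

func averageHash(_ pixels: [[Int]]) -> UInt64 {
    let flat = pixels.flatMap { $0 }                       // 64 grayscale values
    let avg = Double(flat.reduce(0, +)) / Double(flat.count)
    var hash: UInt64 = 0
    for (i, p) in flat.enumerated() where Double(p) > avg {
        hash |= 1 << UInt64(i)                             // one bit per pixel
    }
    return hash
}

func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Two visibly different images: same layout, different contrast.
let img1 = Array(repeating: Array(repeating: 200, count: 8), count: 4)
         + Array(repeating: Array(repeating: 10, count: 8), count: 4)
let img2 = Array(repeating: Array(repeating: 255, count: 8), count: 4)
         + Array(repeating: Array(repeating: 0, count: 8), count: 4)

print(averageHash(img1) == averageHash(img2))  // true: a collision
// Perceptual matchers typically also accept near misses (small Hamming
// distance), so an attacker only has to get close, not bit-for-bit equal.
```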
I also find it unfortunate that this has been in since iOS 14.3, but I still hope Apple doesn’t enable it. I have iCloud Photos turned off and I will not be upgrading to 15. The “smart” criminals will do the same thing. All this does is root out the dumb ones. The “smarter” ones will turn it off to avoid it, but the smartest ones will find or use other channels now. -
Fed expansion of Apple's CSAM system barred by 4th Amendment, Corellium exec says
wwinter86 said: Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles
"Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety." - Benjamin Franklin (1706-1790)
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
StrangeDays said:
Rayz2016 said:
gatorguy said:
beowulfschmidt said:
AppleInsider said: "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily. And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.
https://protectingchildren.google/intl/en/
https://www.microsoft.com/en-us/PhotoDNA/CloudService
...so what's different here? Here Apple does the hash-compare on-device, prior to uploading the image to its commercial cloud server. This allows them to 1) not host the CSAM at all, and 2) actually offer more privacy, by not being aware you have any CSAM on-device until a certain threshold number of matches has been met.
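To sketch just that threshold gate (a rough sketch with hypothetical names; Apple's actual design wraps this in private set intersection and threshold secret sharing, so the server can't even count matches below the threshold):

```swift
// Rough sketch of the threshold gate only; names are hypothetical.
// In Apple's published design the server learns nothing at all until
// the threshold is crossed; the figure of 30 is the reported ballpark.

let matchThreshold = 30  // assumed figure, for illustration

func shouldEscalateToHumanReview(uploadHashes: [UInt64],
                                 knownHashes: Set<UInt64>,
                                 threshold: Int = matchThreshold) -> Bool {
    let matches = uploadHashes.filter { knownHashes.contains($0) }.count
    return matches >= threshold  // below this, nothing is surfaced to anyone
}
```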
How come you guys weren't having fits about Google, Microsoft, and Dropbox scanning images? How come the risk of authoritarians re-tooling it wasn't a concern then? -
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
entropys said: A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for.
"Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
Riiiight.
Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!
-
App Tracking Transparency having big impact on ad effectiveness
My question is this: how do they know this if we are not being tracked? How do they know how many have not opted in? Either they’re pulling numbers out of their butts or some limited degree of tracking is still being done.
I guess they could compare before and after datasets (everybody vs those opted in).
¯\_(ツ)_/¯
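To partially answer my own question: the opt-in status itself is readable by any app through the AppTrackingTransparency framework, so an ad SDK can aggregate opt-in rates across its installed base without tracking anyone individually. Something like:

```swift
import AppTrackingTransparency

// Any app embedding an ad SDK can read this without tracking anyone;
// aggregated across millions of installs, it yields the opt-in rates
// the ad industry has been reporting.
switch ATTrackingManager.trackingAuthorizationStatus {
case .authorized:    print("user opted in to tracking")
case .denied:        print("user opted out")
case .notDetermined: print("prompt hasn't been shown yet")
case .restricted:    print("tracking restricted, e.g. by parental controls")
@unknown default:    break
}
```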