exceptionhandler

About

Username: exceptionhandler
Joined:
Visits: 967
Last Active:
Roles: member
Points: 344
Badges: 0
Posts: 381
  • Outdated Apple CSAM detection algorithm harvested from iOS 14.3 [u]

    This is the kind of stuff I alluded to in a previous comment: 

    https://forums.appleinsider.com/discussion/comment/3327620/#Comment_3327620

    People will try to exploit this. It would be better if it wasn’t there at all. The hashes SHOULD be unique, but due to the nature of hashing (it’s an algorithm), collisions are possible. You may sooner win the lottery than encounter one by chance… but what if it wasn’t by chance? It was only a matter of time before someone created an image or images that matched hashes in the CSAM database and started propagating them. Apple may have a human review step after several flags, but what happens if reviewers get flooded because a series of such images is propagated? Humans can only work so fast. And that human review implies images may be sent to people at Apple, which would necessarily circumvent the encryption that prevents anyone but the intended parties from seeing the images.

    I also find it unfortunate that this has been present since iOS 14.3, but I hope Apple doesn’t enable it. I have iCloud Photos turned off and I will not be upgrading to 15. The “smart” criminals will do the same thing. All this does is root out the dumb ones. The “smarter” ones will turn it off to avoid it, but the smartest ones will find or use other channels now.
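
    To make the collision-plus-threshold concern above concrete, here is a minimal, purely illustrative Python sketch. It is not NeuralHash or Apple’s actual protocol; the hash function, the hash database, and the threshold value are all made-up stand-ins.

    ```python
    # Toy illustration only -- not NeuralHash, not Apple's protocol.
    import hashlib

    KNOWN_HASHES = {"3a7bd3e2360a3d29"}  # hypothetical database of known-CSAM hashes
    MATCH_THRESHOLD = 30                 # hypothetical number of matches before human review

    def toy_hash(image_bytes: bytes) -> str:
        # A real perceptual hash maps visually similar images to the same value,
        # which is exactly what makes deliberately crafted collisions feasible.
        return hashlib.sha256(image_bytes).hexdigest()[:16]

    def needs_human_review(library: list[bytes]) -> bool:
        matches = sum(1 for img in library if toy_hash(img) in KNOWN_HASHES)
        # Nothing is flagged until the threshold is crossed; a widely propagated
        # set of collision images could push many accounts over it at once,
        # flooding whatever human review capacity exists.
        return matches >= MATCH_THRESHOLD
    ```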
  • Fed expansion of Apple's CSAM system barred by 4th Amendment, Corellium exec says

    wwinter86 said:
    Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles  :#
    “Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety.”

    Benjamin Franklin (1706-1790)

  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    Rayz2016 said:
    gatorguy said:
    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
    Until the government in question passes a law that requires Apple to do so, because as they've said many times, they'll comply with any local laws, even to the detriment of their principles concerning privacy.

    "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

    And having "steadfastly refused" those demands in the past, you've now done what they want voluntarily.  And as soon as a government passes a law requiring the addition of something else, you'll comply, just as you have all along.


    I would not expect Apple to necessarily reveal any expansion of it if some country, and in this case I'm thinking of China, were to order them to. They've long soft-pedaled the "iCloud operated by GCBD" handover. Heck, it's not even an Apple-run program there. Apple is simply contractually required to cooperate with the government-controlled cloud provider in whatever way is needed for handling the demands on services and access. It is no longer Apple's to run, and they aren't making the rules.
    You seem to have a finger deep inside Google; do they have something like this, or do they just do the server-side scan? I haven't been able to find any reference to a similar setup at any other tech behemoth.
    Serious? Did you check? Yes, Google and Microsoft both do something similar. PhotoDNA is Microsoft's tool. They, like Dropbox and Twitter and Tumblr, all scan for CSAM images using hash checks and notify the police. Just like this.

    https://protectingchildren.google/intl/en/

    https://www.microsoft.com/en-us/PhotoDNA/CloudService



    ...so what's different here? Here Apple does the hash compare on-device, prior to uploading the image to its commercial cloud server. This allows them to 1) not host the CSAM, and 2) actually offer more privacy by not being aware of whether you have any CSAM on-device until a certain threshold number of matches has been met.

    How come you guys weren't having fits about Google, Microsoft, and Dropbox scanning images? How come re-tooling it by authoritarians wasn't a concern then?
    I don’t throw a fit because I don’t rely on/use those services and/or software.
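
    As an aside, the distinction gatorguy draws above (a server-side scan after upload versus an on-device check before upload) can be sketched roughly as follows. This is a hypothetical Python illustration, not real PhotoDNA or NeuralHash, and the “voucher” below stands in for a much more involved cryptographic construction.

    ```python
    # Hypothetical contrast between the two approaches; toy code only.
    import hashlib

    def toy_hash(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash such as PhotoDNA or NeuralHash.
        return hashlib.sha256(image_bytes).hexdigest()[:16]

    def server_side_scan(uploaded_image: bytes, known_hashes: set[str]) -> bool:
        # Google/Microsoft/Dropbox model: the provider already hosts the image
        # and performs the hash comparison on its own servers.
        return toy_hash(uploaded_image) in known_hashes

    def on_device_upload(image: bytes, known_hashes: set[str]) -> dict:
        # Apple's described model, heavily simplified: the comparison happens on
        # the phone before upload and the result travels as an opaque voucher;
        # the server is not supposed to learn anything until a threshold of
        # matching vouchers accumulates for the account.
        return {"photo": image, "voucher": {"match": toy_hash(image) in known_hashes}}
    ```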
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    entropys said:
    A concern from privacy and security experts has been that this scanning of images on device could easily be extended to the benefit of authoritarian governments that demand Apple expand what it searches for. 

    "Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


    Riiiight.

    Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

    The skeptic in me agrees with this. I’ve long said there’s a huge difference between not being capable of doing something (intentionally or not) and promising not to do something when the capability is there. While at this point in time they may very well not acquiesce to government requests, what about several years down the road? What about China, where they have already seemingly bent over backwards to maintain a presence? Being a software engineer myself, I’m sure this went through rigorous review and testing, but any new code added may potentially introduce another attack vector to be exploited.
  • App Tracking Transparency having big impact on ad effectiveness

    My question is this: how do they know this if we are not being tracked?  How do they know how many have not opted in?  Either they’re pulling numbers out of their butts or some limited degree of tracking is still being done.

    I guess they could compare before and after datasets (everybody vs those opted in).

    ¯\_(ツ)_/¯ 
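
    A back-of-the-envelope illustration of that “compare the datasets” idea, with entirely made-up numbers: ad networks can see aggregate counts, such as how many ad requests still carry an IDFA, without tracking any individual user.

    ```python
    # Made-up aggregate numbers; illustrates estimating the opt-in share
    # from totals rather than from tracking individual users.
    requests_with_idfa = 1_200_000  # ad requests that still include the identifier
    requests_total = 8_000_000      # all ad requests observed after the ATT prompt

    opt_in_rate = requests_with_idfa / requests_total
    print(f"Estimated opt-in rate: {opt_in_rate:.0%}")  # Estimated opt-in rate: 15%
    ```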