zimmie

About

Username: zimmie
Joined
Visits: 172
Last Active
Roles: member
Points: 2,737
Badges: 1
Posts: 651
  • UK government lauds Apple's CSAM plans as it seeks to skirt end-to-end encryption

    mjtomlin said:
    Talk about ridiculous worry-mongering. Let me get this straight: you are all worried that once this system is put into place, Apple will be compelled by governments and other evil entities to expand the “scanning” and reporting into other areas? Is that the gist of it?

    Can one of you please explain to me how that is not currently possible? Because I’m missing something. If desired, China could require Apple to surveil EVERYTHING on your device. There is absolutely no need for this CSAM tool to do that.

    For years, Apple and every other online service have already been compelled to search through all user data stored on their servers and report illegal child pornography. Why haven’t they expanded that search into other areas? Can one of you explain that?

    The current “laws” do not allow e2e encryption of stored data for that one and only reason: child pornography. This gives law enforcement [warranted] access to all your iCloud data, not just your illegal photos.

    All Apple’s CSAM tools allow is that Apple can remain compliant with current laws (reporting illegal child pornography) while also offering users the ability to store the rest of their data (and photos) encrypted and out of prying eyes.
    Couple of points:
    1. It may be extremely hard for you to grasp this basic point, but many people do understand this - Apple's iCloud servers are Apple's property and anything stored in iCloud by end-users is public data for all practical purposes (even though the end-users own the data). And Apple can scan data on their property for illegal content and report it to law enforcement agencies. BUT they have no business whatsoever looking into the data stored in a device which is "owned" by end-users. "Ownership of the property" is the key operative phrase here. Apple owns iCloud and they can do whatever the hell they want with it, as long as they "inform" end-users about it. End-users own the phones and Apple/Google/<anyone else> (at least the ones who "claim" to uphold the "privacy" of the end-users) has no business peeking into it.

    2. The most important one - you mentioned "All Apple’s CSAM tools allow is that Apple can remain compliant with current laws (reporting illegal child pornography) while also offering users the ability to store the rest of their data (and photos) encrypted and out of prying eyes". This is pure SPECULATION on your part, so please do NOT keep spreading this RUMOR without any basis. Apple has NEVER mentioned that they WILL implement end-to-end encryption as soon as on-device CSAM scanning is enabled. NEVER. It is pure speculation by some of the AI forum members that Apple would do it. If you are so sure about it, can you please share Apple's official statement on this?
    You should read some of the evaluations of Apple's proposed CSAM detection tool, as apparently neither you nor the sources you get information from understand what is being proposed.

    1. The proposed tool does not "[look] into the data stored" in your device. It looks at the data you have told it to send to Apple's photo sync/sharing service. If you don't use iCloud Photo Library, the CSAM detection is never run.

    On the other hand, Spotlight looks at all the data stored on your device, and it would be trivial to extend it to tell Apple or whatever government agency if you have mentioned the words "bomb" and "president" together in any thread in Messages. How is this CSAM detection tool a threat to privacy but Spotlight isn't?

    2. The whole system intrinsically involves end-to-end encryption for all images sent to iCloud Photo Library. If implemented, all photos sent to Apple would be encrypted. That's literally the whole point of the tool. If a CSAM match is detected, the tool emits a "voucher" which contains a cryptographic share of the key under a threshold secret-sharing scheme. When enough shares are accumulated (30), they can be used to find the key the device used to encrypt the photos, which can then be used to decrypt them and verify whether they actually are CSAM.

    To protect against a single such voucher being used as proof that a particular person has uploaded CSAM, the tool also emits "synthetic vouchers" which contain garbage shares that don't contribute to the ability to find the image encryption key. Apple can only tell which vouchers from an account are real when they get 30 real vouchers from that account.
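
    If you want to see the shape of that mechanism, here is a rough sketch in Python. To be clear, this is a generic Shamir-style threshold scheme with made-up parameters (the prime, the share format, and the function names are all my own illustration), not Apple's actual implementation.

    ```python
    # Illustration only: a 30-of-n threshold secret-sharing scheme with
    # synthetic "vouchers" mixed in. Parameters and names are hypothetical.
    import random

    PRIME = 2**127 - 1   # demo prime field, not Apple's real parameters
    THRESHOLD = 30       # real shares needed before the key can be recovered

    def make_real_shares(key, count):
        # Each real voucher carries one point on a random degree-29 polynomial
        # whose constant term is the device's image-encryption key.
        coeffs = [key] + [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]
        def poly(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, poly(x)) for x in range(1, count + 1)]

    def make_synthetic_share():
        # A synthetic voucher carries a garbage point unrelated to the polynomial,
        # so it contributes nothing toward recovering the key. (The x values are
        # drawn from a high range here only to avoid colliding with real shares.)
        return (random.randrange(10**6, PRIME), random.randrange(PRIME))

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 over the prime field.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = random.randrange(PRIME)
    real = make_real_shares(key, 40)
    fake = [make_synthetic_share() for _ in range(10)]

    print(reconstruct(real[:THRESHOLD]) == key)      # True: 30 real shares recover the key
    print(reconstruct(real[:29] + fake[:1]) == key)  # False: 29 real + 1 garbage reveal nothing
    ```

    The last two lines are the whole point: 30 genuine shares pin down the polynomial and therefore the key, while anything short of that, even padded out with synthetic shares, reveals nothing about it.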
  • Facebook launches $299 Ray-Ban Stories smart glasses

    How do I opt out of people recording me with these?
  • Fewer Android users switching to 'iPhone 13' because of CSAM scan, no Touch ID

    mrstep said:
    > The inclusion of the option in the list may be in response to repeated misguided claims that Apple's CSAM tools erode privacy and may enable surveillance for governments down the road, likely caused by misinformed public outcry overestimating the system's capabilities. Furthermore, Google also performs the scanning — albeit not on-device.

    The claims - from essentially every privacy group - that it's likely to be abused in the future because it enables client-side scanning aren't "misguided"; abuse is almost inevitable once the ability is added. It's just a question of what it will "protect" us from next.

    "Google also performs the scanning" - because what, we look to Google as a shining light of privacy protection?  "albeit not on-device."  Right, usually when some third-party software is scanning your files, it's malware or a virus, not the company selling you the phone.  It's arguable that you shouldn't have private companies scanning your files as part of storage solutions, but with the secret letters that agencies use to access content, that ship has sailed.

    It's unclear why AI is cheerleading something that will 99.999% likely end up eroding privacy protections - and in the best case doesn't improve your privacy. They could scan iCloud content like everyone else scans their respective cloud files and at least not end up being worse. (Hell, they could use the hash, the number of hits, and the review method all server-side.)
    It's a misguided claim because if it were inevitable, we would have seen Spotlight abused in such ways already. It's explicitly designed to search through all the stuff on your phone, after all. It could be trivially modified to alert Apple if you wrote a bomb threat in Notes or Messages.

    The CSAM scanning is less invasive than Steam, the DRM platform and game store. The Steam client reports on software you have installed, as well as whether the copy of a game you're trying to run was purchased under your account or not (i.e., whether you may have committed a crime). Don't want to be subject to such scanning? Don't buy stuff on Steam.

    Don't want to be subject to the CSAM scanning? Don't sync photos to iCloud.

    That said, other applications like OneDrive can recognize new photos and automatically upload them to the associated service if you want. I think it would probably be best from a messaging perspective if iCloud photo sync were spun out into a separate application like that. Then, if you don't want the CSAM scanning code to even be present on your phone, you just remove (or don't install) the iCloud photo sync application.
  • San Francisco doctor charged with possessing child pornography in iCloud

    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    They can ... because images in iCloud aren't encrypted today. Apple's servers have the ability to see them.

    With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a partial direction on how to find the key to decrypt the images. This is known as threshold secret sharing. They aren't exactly parts of the key, but once Apple has enough (apparently 30 in this case), they can use the directions to generate their own copy of the key. (Edited to clarify that last sentence.)
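
    As a rough illustration of what "using the directions" can look like (a generic Shamir-style sketch with invented parameters, not Apple's actual code), each direction is a point on a hidden polynomial, and interpolating 30 of them at zero regenerates the key:

    ```python
    # Illustration only: regenerating a key from 30 threshold "directions"
    # via Lagrange interpolation at x = 0 over a prime field.
    PRIME = 2**127 - 1   # demo field, not Apple's real parameters
    THRESHOLD = 30

    def recover_key(directions):
        # directions: list of (x, y) points, one per matched upload
        if len(directions) < THRESHOLD:
            raise ValueError("below the threshold, the key cannot be derived")
        points = directions[:THRESHOLD]
        key = 0
        for i, (xi, yi) in enumerate(points):
            num = den = 1
            for j, (xj, _) in enumerate(points):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            key = (key + yi * num * pow(den, -1, PRIME)) % PRIME
        return key
    ```

    With fewer than 30 points, every candidate key is equally consistent with what the server holds, which is why the threshold matters.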

    Today, Apple employees can poke through your photos and share ones they find interesting (presumably policy does not allow this, but they have the capability). With the announced system in place, they would no longer be able to do so at a technical level.
  • Civil rights groups worldwide ask Apple to drop CSAM plans

    Spotlight already indexes everything on your phone. If these totalitarian nightmares were going to happen, why wouldn't they have started a decade ago with the technology which has always done what people are afraid this new tech might be extended to do?