Researchers who built rudimentary CSAM system say Apple's is a danger


Comments

  • Reply 41 of 46
    crowley Posts: 10,453 member
    Beats said:

    I know a detective that works for the state who implants evidence. He HATES men. So this is gonna be a lovely tool for him!!
    How do you propose your criminal friend would use this lovely tool against all men?
    Beats said:

    Heck, if your wife or someone knows your iCloud password and wants revenge they can do this. The possibilities are endless!!
    If your wife or someone else knows your iCloud password, wants revenge, and has no qualms about doing illegal things, then the gloves are already off: they can email anyone they like in your name. There is absolutely no need for this CSAM hash checking to cause damage.

    Don't share your iCloud password, especially not with criminals.
  • Reply 42 of 46
    The author of this story is clearly not a security expert. He is relying on what Apple has promised to do in its press statements. To the best of my knowledge, Apple has not said it would provide the source code to its system so real experts can review it for safety. Apple has not even said whether it would provide key technical details; its statements thus far have been vague.

    The key issue is that Apple as a company does not have the core competency to pull off a system like this without very serious unexpected consequences, many of which have been pointed out by other writers. One example: what happens when estranged, abusive parents are contacted because of content in their child's account that triggers Apple's system? What safeguards does Apple have against that likely outcome? Based on Apple's past mistakes, I would guess none at all. And that is just one of an unknown number of unexpected consequences from a system like this. One thing that is certain is that a nation like China will use this system to coerce Apple into deploying it against that nation's own repressed populations.
    edited August 2021
  • Reply 43 of 46
    mcdave said:
    & never login into your iCloud account on someone else’s device as it starts uploading their camera roll to your iCloud Photo Library.
    You are wrong.  

    Photo syncing is opt-in when you sign into iCloud. It has never been turned on by default.
  • Reply 44 of 46
    gatorguy Posts: 24,261 member
    robaba said:
    rcfa said:
    The silly exculpatory listing of differences in the systems is useless.

    1. Did Apple leave the Russian market when Russia demanded the installation of Russian government approved apps?
    2. Did Apple leave the Russian and Chinese markets when Russia and China demanded that iCloud servers be located in their countries, where government has physical access?
    3. Did Apple leave the Chinese market when VPN apps were requested to be removed from the Chinese AppStore?
    4. Did Apple comply when Russia demanded that Telegram be removed from the Russian AppStore?
    5. Did Apple leave the UAE when VoIP apps were outlawed there?

    NO, NO, NO, NO, NO, and NO!

    And NO will be the answer if these countries require additional databases, direct notification (instead of Apple reviewing the cases), etc.

    Once this is baked into the OS, Apple has no leg to stand on when the "lawful" requests from governments start coming.
    1-Apple did not end up preloading the software that Russia demanded; it only prompts users at setup to selectively install those programs if they choose to.
    2-Apple is quickly moving to end-to-end encryption with an independent third-party go-between, which would completely eliminate the threat of Chinese (or Russian, or UAE) access to encrypted files on its servers.
    3-The new security system will be a built-in VPN on steroids: end-to-end encryption plus an intermediate, independent 3rd-party server that shields your identity from web hosts and sniffers while preventing ISPs from knowing which sites you visit (a rough sketch of this two-hop idea follows the list).
    4-don’t know
    5-see 3
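    For what it's worth, here is a minimal Swift sketch of the two-hop layering idea described in points 2 and 3. It is purely illustrative and not Apple's actual Private Relay or E2EE design; the relay names, the "forward-to" header format, and the locally generated keys are all assumptions made up for the example. The point is simply that the first hop can see who you are but not where you are going, while the second hop can see where you are going but not who you are.

```swift
import Foundation
import CryptoKit

// Toy illustration only: NOT Apple's actual Private Relay protocol.
// Relay names, keys, and the "forward-to:" header are invented for this sketch;
// real keys would be negotiated with each relay rather than generated locally.
struct RelayHop {
    let name: String
    let key: SymmetricKey
}

// The client wraps its request in two encryption layers:
//   inner layer -> destination + request, readable only by the egress relay
//   outer layer -> routing header + opaque inner blob, readable only by the ingress relay
func buildLayeredRequest(destination: String, body: String,
                         ingress: RelayHop, egress: RelayHop) throws -> Data {
    let inner = try AES.GCM.seal(Data("\(destination)\n\(body)".utf8),
                                 using: egress.key).combined!
    let outer = Data("forward-to:\(egress.name)\n".utf8) + inner
    return try AES.GCM.seal(outer, using: ingress.key).combined!
}

// The ingress relay learns who the client is and where to forward the blob,
// but cannot read the destination inside the inner layer.
func ingressUnwrap(_ wire: Data, key: SymmetricKey) throws -> (header: String, innerBlob: Data) {
    let plain = try AES.GCM.open(AES.GCM.SealedBox(combined: wire), using: key)
    let newline = plain.firstIndex(of: UInt8(ascii: "\n"))!
    return (String(decoding: plain[..<newline], as: UTF8.self),
            Data(plain[plain.index(after: newline)...]))
}

// The egress relay learns the destination and request, but never sees the client's address.
func egressUnwrap(_ innerBlob: Data, key: SymmetricKey) throws -> String {
    return String(decoding: try AES.GCM.open(AES.GCM.SealedBox(combined: innerBlob),
                                             using: key),
                  as: UTF8.self)
}

do {
    let ingress = RelayHop(name: "ingress.example", key: SymmetricKey(size: .bits256))
    let egress  = RelayHop(name: "egress.example",  key: SymmetricKey(size: .bits256))

    let wire = try buildLayeredRequest(destination: "https://example.com", body: "GET /",
                                       ingress: ingress, egress: egress)

    let seenByIngress = try ingressUnwrap(wire, key: ingress.key)
    print(seenByIngress.header)                                       // forward-to:egress.example
    print(try egressUnwrap(seenByIngress.innerBlob, key: egress.key)) // destination + request
} catch {
    print("sketch failed:", error)
}
```

    In the real service, as I understand it, Apple operates the first hop and independent partners operate the second, using standard proxying protocols, but the privacy argument is the same: no single party sees both your identity and your destination.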

    THIS IS WHY THEY ARE TAKING THE STEP TO SINGLE OUT CSAM NOW—SO THEY CAN STAMP IT OUT, WITHOUT PROVIDING A GATEWAY TO BAD ACTORS, STATE OR PRIVATE ENTERPRISE, WHILE ALLOWING AN UNPRECEDENTED LEVEL OF SECURITY / PRIVACY.
    @robaba
     Just one niggling problem....
    China will not permit Apple to fully encrypt cloud data in any way that would leave the Chinese government without decrypted access as needed. They would be breaking recent Chinese security laws if they did so. Apple follows the law in every country they do business in, as we all understand, and Chinese revenues have been treated as too essential for Apple to say no.

    That of course even ignores the fact that Apple has relinquished their cloud service to a Chinese state-run company. It is no longer iCloud by Apple. It is now iCloud by GCBD.
    I suspect Russia is heading down that same path.

    I think Apple has screwed the pooch on E2EE at least as far as China is concerned. They were doomed when they gave in to Chinese control of Apple user data.
    edited August 2021
  • Reply 45 of 46
    fastasleep Posts: 6,425 member
    mcdave said:
    & never login into your iCloud account on someone else’s device as it starts uploading their camera roll to your iCloud Photo Library.
    Why would anyone log into their iCloud account on someone else’s device? What a bizarre idea. 
  • Reply 46 of 46
    fastasleep Posts: 6,425 member
    One example: what happens when estranged, abusive parents are contacted because of content in their child's account that triggers Apple's system? What safeguards does Apple have against that likely outcome? Based on Apple's past mistakes, I would guess none at all. And that is just one of an unknown number of unexpected consequences from a system like this.
    It warns the child first about the detected content and allows them to opt out of continuing with the action (sending/receiving), thereby providing a safeguard against your “likely outcome”. This has been widely documented since the first announcement. Perhaps you should re-read some things. 
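    A minimal Swift sketch of that kind of warn-first gate, purely hypothetical and not Apple's actual Messages implementation; the classifier closure, type names, and confirmation callback are all invented for illustration. The point is that flagged content goes nowhere without the child's explicit choice.

```swift
import Foundation

// Hypothetical sketch, not Apple's real API: an on-device gate that warns first
// and lets the child decline before anything is sent or displayed.
enum SafetyDecision {
    case allowed            // nothing was flagged, proceed normally
    case declinedByChild    // child was warned and chose not to continue
    case proceededAnyway    // child was warned and explicitly chose to continue
}

struct FlaggedContentGate {
    // Stand-in for an on-device classifier; assumed for the example.
    let isSensitive: (Data) -> Bool

    // Evaluate an image before sending or displaying it.
    func evaluate(_ image: Data, childConfirms: () -> Bool) -> SafetyDecision {
        guard isSensitive(image) else { return .allowed }
        // Warn first; only an explicit confirmation lets the action continue.
        return childConfirms() ? .proceededAnyway : .declinedByChild
    }
}

// Example: a classifier that flags everything, and a child who declines.
let gate = FlaggedContentGate(isSensitive: { _ in true })
let decision = gate.evaluate(Data(), childConfirms: { false })
print(decision)   // declinedByChild: nothing is sent, nothing is shown
```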