Apple backs down on CSAM features, postpones launch


Comments

  • Reply 141 of 158
    davidw Posts: 2,049 member
    crowley said:
    davidw said:
    crowley said:
    bb-15 said:

    A user lends their phone to a friend, that friend uploads the wrong kind of image, that is reported to NCMEC and then law enforcement, and then the phone owner has the police barging into their house. 
    You would need to do that something like 30 times for the police to come barging on your door.  Do you even lend your phone to a friend for anything other than a quick call?  I don't.  These gotchas that people are offering are always such a preposterous stretch.  
    No, just once would be enough if the "friend" uploads 30 images with matching hashes. Or are you saying that if a "friend" uploaded 100 images with matching hashes, Apple will ignore it and only count that as 1 instance? So a "friend" can upload 100 images with matching hashes, 29 times, and you will not be flagged because 30 times is the limit? That's not right.  
    Who are your "friends" that this is something that might happen?!
    I'm not questioning the "friend" part of your comment. I more or less agree with you on that. I questioned the part where you implied it is preposterous that this could ever happen, as it would have to happen 30 times before you get in some kind of trouble. When obviously, under the right conditions, it only needs to happen once. Which is now not as preposterous as you make it seem. 

    And don't forget iPads. I have been to a few friends' (or friends of a friend's) homes where they leave an iPad lying around because they have kids that mainly use it to play games or watch Netflix or YouTube. And there is no passcode on it because the iPad rarely leaves the home. But they do have cameras, internet, Photos and iCloud storage when on their home WiFi. But they are more or less just a "toy" for their kids and not something that has a lot of personal info on it.  

    Just because you can't ever imagine it happening with a "friend" doesn't mean that it can't. There are over 1B iPhones out there, 120M in the US alone, not including iPads. Just like I can't ever imagine anyone falling for the phishing scam that involves the "IRS" requesting payment in iTunes gift cards. Yet there are people that should know better who fall for it all the time. Even if the people that do fall for it are a very, very small fraction of the people that are targeted, it's actually preposterous to think that no one would ever fall for it.  
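The arithmetic under dispute here is easy to make concrete. Below is a toy Python sketch; the 30-image threshold and the per-image counting are assumptions drawn from this thread, not Apple's published numbers. It shows why one borrowing incident can be enough if the threshold counts matching images rather than incidents:

```python
# Toy model of the dispute above: does the reporting threshold count
# *lending incidents* or *matching images*? If it counts images (as argued
# here), a single incident that uploads 30 matching photos crosses it.
# The 30-match figure is the number cited in this thread, not confirmed spec.
THRESHOLD = 30

def account_flagged(uploaded_hashes, known_csam_hashes, threshold=THRESHOLD):
    """True once the number of hash matches reaches the threshold."""
    matches = sum(1 for h in uploaded_hashes if h in known_csam_hashes)
    return matches >= threshold

known = {f"known-{i}" for i in range(100)}  # stand-in hash database

# One loan of the phone, 30 matching images: flagged in a single incident.
assert account_flagged([f"known-{i}" for i in range(30)], known) is True
# 29 matching images: still under the threshold, however many loans occurred.
assert account_flagged([f"known-{i}" for i in range(29)], known) is False
```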
      
    darkvader
  • Reply 142 of 158
    Illusive said:
    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    Just the opposite.  I'm guessing this is a necessary prerequisite for Apple to put in place before it could ever go with end-to-end encryption and still remain compliant with authorities by not holding CSAM material on their cloud. 


    My point exactly. Without CSAM, iCloud Photos will remain unencrypted - all courtesy of noobs talking nonsense on the Internet. I wish the latter were only accessible with a valid sanity certificate. 
    Apple easily let go of end-to-end encryption that they already planned when the CIA complained, which proves further that their 'customer privacy comes first' vision can be bought for money. 



    darkvader
  • Reply 143 of 158
    mr. h said:
    gatorguy said:

    I get that you really REALLY want to paint a glowing picture of "gosh Apple is doing this for us", but is there any even circumstantial evidence Apple was ready to make everything end-to-end encrypted in a way they could not access any of your data even if they were ordered to? Not as far as I know. It's more of a hope and prayer since otherwise it's not for the betterment of us users. 
    All I can say about that is that the whole scheme would be totally pointless if they weren't going to encrypt the photos. Why go to all the effort of designing this enormously complicated system, calculating hashes on-device, doing the CSAM hash-matching in a "blind" way so even the device itself doesn't know if there's been a match, and then going to all the convoluted effort of generating doubly-encrypted "vouchers" and associated "image information", if the photo itself was uploaded to iCloud unencrypted?

    Certainly, this system would enable the photos to be uploaded to iCloud encrypted, but I concede that as far as I know, Apple hasn't said that they would do that. It's just that, as I said, the whole scheme seems totally pointless if the photos are uploaded to the server in the clear anyway.
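For what it's worth, the threshold idea behind the "doubly-encrypted vouchers" can be illustrated with plain Shamir secret sharing: imagine the account's decryption secret split into shares, one share riding along with each matching photo, so the server can reconstruct the secret only once it holds at least the threshold number of shares. This is a generic textbook sketch, not Apple's actual protocol (which additionally uses private set intersection so that, below the threshold, neither the device nor the server learns anything about individual matches):

```python
# Simplified Shamir secret sharing over a prime field: the secret is f(0)
# of a random polynomial of degree threshold-1; each share is one point.
import random

PRIME = 2**127 - 1  # field large enough for a toy secret

def split_secret(secret, threshold, n_shares):
    # Random polynomial with f(0) = secret; one (x, f(x)) point per share.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def combine(shares):
    # Lagrange interpolation at x = 0 recovers f(0), but only when the
    # shares determine the polynomial, i.e. with at least `threshold` points.
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

secret = 123456789
shares = split_secret(secret, threshold=30, n_shares=40)
assert combine(shares[:30]) == secret  # 30 shares: reconstruction succeeds
assert combine(shares[:29]) != secret  # 29 shares: server recovers only noise
```

With fewer than 30 shares the interpolation yields an essentially random field element, which is the sense in which the server "learns nothing" below the threshold.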

    How about Apple just offers a toggle in iCloud photos settings? The two options would be:

    1. Photos are CSAM-scanned and encrypted before being uploaded to iCloud.
    2. Photos are not CSAM-scanned, but are uploaded to iCloud in the clear. The server then does the CSAM scan.

    Would this solution make everyone happier?
    No, because:

    • option 1. is still a violation; technology is spying on your unencrypted media files and sending the output to Apple. Also, when Apple sees CSAM triggered for device X and photo Y, they wouldn't be able to enter step 2 of their process, manual review, because the images would be end-to-end encrypted - they can't take a peek at the photo. Also, since Apple's original approach compares hashes anyway and not the image itself, your solution does not provide any practical added benefit.
    • option 2. requires the lack of true end-to-end encryption (which Apple still doesn't provide after the CIA complained). The server needs to be trusted by the consumer. If a government forces access to the server, you'll be exposed. If there's a leak, all users' unencrypted photos can be exposed.
    darkvader
  • Reply 144 of 158
    crowley Posts: 10,453 member
    davidw said:
    crowley said:
    davidw said:
    crowley said:
    bb-15 said:

    A user lends their phone to a friend, that friend uploads the wrong kind of image, that is reported to NCMEC and then law enforcement, and then the phone owner has the police barging into their house. 
    You would need to do that something like 30 times for the police to come barging on your door.  Do you even lend your phone to a friend for anything other than a quick call?  I don't.  These gotchas that people are offering are always such a preposterous stretch.  
    No, just once would be enough if the "friend" uploads 30 images with matching hashes. Or are you saying that if a "friend" uploaded 100 images with matching hashes, Apple will ignore it and only count that as 1 instance? So a "friend" can upload 100 images with matching hashes, 29 times, and you will not be flagged because 30 times is the limit? That's not right.  
    Who are your "friends" that this is something that might happen?!
    I'm not questioning the "friend" part of your comment. I more or less agree with you on that. I questioned the part where you implied it is preposterous that this could ever happen, as it would have to happen 30 times before you get in some kind of trouble. When obviously, under the right conditions, it only needs to happen once. Which is now not as preposterous as you make it seem. 

    And don't forget iPads. I have been to a few friends' (or friends of a friend's) homes where they leave an iPad lying around because they have kids that mainly use it to play games or watch Netflix or YouTube. And there is no passcode on it because the iPad rarely leaves the home. But they do have cameras, internet, Photos and iCloud storage when on their home WiFi. But they are more or less just a "toy" for their kids and not something that has a lot of personal info on it.  

    Just because you can't ever imagine it happening with a "friend" doesn't mean that it can't. There are over 1B iPhones out there, 120M in the US alone, not including iPads. Just like I can't ever imagine anyone falling for the phishing scam that involves the "IRS" requesting payment in iTunes gift cards. Yet there are people that should know better who fall for it all the time. Even if the people that do fall for it are a very, very small fraction of the people that are targeted, it's actually preposterous to think that no one would ever fall for it.  
    So now your friend's kids are going to download child porn on a family iPad?

    I say preposterous because so many things have to align for the things you say to be a problem.

    1. Someone would have to have access to an unlocked device that isn't theirs.
    2. They would have to be a someone who knows where to get child abuse imagery.
    3. They would have to save 30 instances of child abuse imagery that is on the NCMEC list to the Photos app.
    4. The iPhone would have to have iCloud Photos turned on.
    5. The device would have to be returned to the original owner, so that they don't have the lost/stolen defence.
    6. They would have to do it in such a way that the original owner doesn't know who had access to the device.
    7. They would have to do it in such a way that the original owner doesn't know that it has happened for a time, so that they can't use Find My iPhone or report the incident and current location to authorities.

    And the question remains why would this someone do this?  Accident seems beyond plausibility for such a list, so malice is surely the only reason.  That's an absurd risk for the malicious actor to take purely out of malice.  There is such a high chance of it backfiring at any point and landing them in jail for possessing child abuse imagery and probably some count of fraud or attempt to implicate another, or whatever the relevant crime would be.

    It is quite simply preposterous that someone would get away with this.  It's preposterous enough that anyone would even attempt such a high risk, low reward crime.

    And if you're genuinely worried about it then I have news; a malicious actor can cause you a whole lot of damage by having access to your unlocked device even without this CSAM hash checking software.  They can access your phonebook, email, your social media, possibly your bank and financial apps if they know your passwords.  They can mess you up.  And they can find plenty of uses for child abuse imagery that will potentially implicate you and get you in trouble without needing to go near iCloud.

    This is a totally made up problem.
    fastasleep
  • Reply 145 of 158
    davidw said:
    nicholfd said:
    Rogue01 said:
    lkrupp said:
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report it to authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING that is preventing Apple from doing it and NO ONE is against this. The opposition is only for doing the scan ON the device, NOT in iCloud.
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
    Apple has openly and clearly stated that the CSAM hashes are stored on your device and every photo on your device is scanned BEFORE they are uploaded to their servers when iCloud Photos is turned on, and Apple turns it on by default.  Apple will scan every photo on your device.  Why do you choose not to believe that?
    Stop the fucking nonsense.  

    By definition, when iCloud Photos is turned on, all photos stored in the Photos app (nowhere else) are uploaded to iCloud; hence all photos stored in the Photos app are scanned when they are uploaded to iCloud!

    Apple DOES NOT EVER scan every photo on your device.  Apple only scans photos stored in the Photos app, and only if iCloud Photos is turned on, do they scan them.

    iCloud Photos is NOT turned on by default.  Period.  You are new here, so I'll assume you have NEVER used an Apple device, right?

    Save space on your iPhone

    iCloud Photos can help you make the most of the storage space on your iPhone. When Optimize iPhone Storage is turned on, all your full‑resolution photos and videos are stored in iCloud in their original formats, with storage-saving versions kept on your iPhone as space is needed.

    Optimize iPhone Storage is turned on by default. To turn it off, go to Settings  > [your name] > iCloud > Photos, then tap Optimize iPhone Storage.


    https://support.apple.com/guide/iphone/use-icloud-photos-iph961b96c4d/ios



    If you are not new here and have used an Apple device, what's your excuse for not knowing?

    What the fuck does that have to do with iCloud Photos being optional, or ANYTHING else I said?  Optimize iPhone Storage is turned on by default ONLY IF YOU ENABLE iCloud Photos.  

    The article you linked tells you how to use iCloud Photos. It does not say anything about iCloud Photos being turned on by default. Only if you turn on iCloud Photos is "Optimize iPhone Storage" turned on by default.

    And hell - you don't even have to use iCloud at all.  Don't sign in to iCloud and there's not even an option to use iCloud Photos.

    As I said, STOP the nonsense.
    [Deleted User]jony0fastasleep
  • Reply 146 of 158
    Illusive said:
    You peeps must be real bored to keep discussing this :D
    The saddest, the most self-unaware comment of this thread. 

    Or any other. 
    I could say the same about yours. Being paranoid is harder than ever nowadays. Constant vigilance, forum wars, meticulously micromanaging your privacy settings… Tin foils won’t cut it anymore 😀
    jony0
  • Reply 147 of 158
    Illusive said:
    Illusive said:
    Does anyone here realize THIS means iCloud Photos stay virtually unencrypted, just as they have been since at least 2020? That CSAM thingy was supposed to scan the pics on-device so that they could be uploaded securely to iCloud if they don't violate the policy. 
    Just the opposite.  I'm guessing this is a necessary prerequisite for Apple to put in place before it could ever go with end-to-end encryption and still remain compliant with authorities by not holding CSAM material on their cloud. 


    My point exactly. Without CSAM, iCloud Photos will remain unencrypted - all courtesy of noobs talking nonsense on the Internet. I wish the latter were only accessible with a valid sanity certificate. 
    Apple easily let go of end-to-end encryption that they already planned when the CIA complained, which proves further that their 'customer privacy comes first' vision can be bought for money. 


    Apple is an American corporation. Corporations are legally required to comply with all manner of governmental regulations. Or do you think you’d act any different if those guys showed up at your doorstep? I ask you, talk is cheap. It’s one thing to bad-mouth them, but when they are on to you for real, it’s a whole different matter. 
    And yes, governments can make any ad campaign go down the toilet, whether multi-million or billion or even trillion. As I said already, Apple is a business. Businesses are here to make money, not make everyone happy. And to do so, they need licenses and permits and all sorts of approvals from those big boys in high places. 

    Anyway, feel free to switch to a different platform (or several). Google, Microsoft (both American, btw), and all sorts of Linux crowds are waiting for you eagerly :smiley: 


    edited September 2021 jony0
  • Reply 148 of 158
    davidw Posts: 2,049 member
    nicholfd said:
    davidw said:
    nicholfd said:
    Rogue01 said:
    lkrupp said:
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report it to authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING that is preventing Apple from doing it and NO ONE is against this. The opposition is only for doing the scan ON the device, NOT in iCloud.
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
    Apple has openly and clearly stated that the CSAM hashes are stored on your device and every photo on your device is scanned BEFORE they are uploaded to their servers when iCloud Photos is turned on, and Apple turns it on by default.  Apple will scan every photo on your device.  Why do you choose not to believe that?
    Stop the fucking nonsense.  

    By definition, when iCloud Photos is turned on, all photos stored in the Photos app (nowhere else) are uploaded to iCloud; hence all photos stored in the Photos app are scanned when they are uploaded to iCloud!

    Apple DOES NOT EVER scan every photo on your device.  Apple only scans photos stored in the Photos app, and only if iCloud Photos is turned on, do they scan them.

    iCloud Photos is NOT turned on by default.  Period.  You are new here, so I'll assume you have NEVER used an Apple device, right?

    Save space on your iPhone

    iCloud Photos can help you make the most of the storage space on your iPhone. When Optimize iPhone Storage is turned on, all your full‑resolution photos and videos are stored in iCloud in their original formats, with storage-saving versions kept on your iPhone as space is needed.

    Optimize iPhone Storage is turned on by default. To turn it off, go to Settings  > [your name] > iCloud > Photos, then tap Optimize iPhone Storage.


    https://support.apple.com/guide/iphone/use-icloud-photos-iph961b96c4d/ios



    If you are not new here and have used an Apple device, what's your excuse for not knowing?

    What the fuck does that have to do with iCloud Photos being optional, or ANYTHING else I said?  Optimize iPhone Storage is turned on by default ONLY IF YOU ENABLE iCloud Photos.  

    The article you linked tells you how to use iCloud Photos. It does not say anything about iCloud Photos being turned on by default. Only if you turn on iCloud Photos is "Optimize iPhone Storage" turned on by default.

    And hell - you don't even have to use iCloud at all.  Don't sign in to iCloud and there's not even an option to use iCloud Photos.

    As I said, STOP the nonsense.
    My mistake. It's not "Optimize Photo" I was thinking of that is enabled by default; it's "My Photo Stream".

    https://www.compsmag.com/how-to/turn-off-my-photo-stream-to-free-up-1gb-of-space-on-ios/

    https://www.igeeksblog.com/disable-turn-off-photo-stream-on-iphone-and-ipad/

    https://support.apple.com/guide/iphone/use-my-photo-stream-iphbfeb468fc/14.0/ios/14.0

    Not sure if it's still enabled by default, now that iCloud Photos can handle what My Photo Stream does, and with more features. So those that want the features of My Photo Stream can now choose which to use. But before iCloud Photos, My Photo Stream was enabled by default. And if you had it enabled on a device, using a backup from it for a new device would also keep it enabled on the new device. I had several friends ask me how come they had photos in iCloud when they made it a point to never turn on iCloud Photos, mainly because they only had the free 5GB of space and didn't want to fill it up with photos. They backed up to a computer with iTunes anyway. What I found out was that they had My Photo Stream enabled, and they swear they didn't enable it or even know what it was, as they only had one Apple device.

    When enabled, all the photos taken will automatically end up in the iCloud and not take up any of the free 5GB. But the photos are only in the iCloud for 30 days and limited to 1000 of your most recent photos (no videos). It's not a back up. Just a convenient way to share the photos on one device, with another one that is also logged into the same iCloud account.  

    So, will Apple also scan and search these photos on the device? They are not in an iCloud Photos Library, but they will be on the iCloud servers. And what about backups? From what I understand, if you don't have iCloud Photos enabled, a backup will save all the photos on the iPhone, iPad or iPod touch, and that backup can be saved in iTunes or automatically in iCloud. Will Apple eventually also scan and search the photos in the backup on the device, before it's saved in an iCloud account? Those photos will also be stored on the iCloud servers.

    It's not the rules that Apple has set for users of iCloud that are upsetting a lot of iDevice users. Some people want and need to use iCloud. It's that Apple is planning to do their scan and search of the photos that are going to end up on their iCloud servers while the photos are still on their customers' devices, before they are saved in iCloud.

    The SCOTUS has ruled that a smartphone can be an extension of one's private property and is protected from unreasonable searches and no warrant can be issued without probable cause. Much like a person's home or car. Thus law enforcement must obtain a warrant for a search and only with probable cause.

    https://www.wtsp.com/article/news/local/supreme-court-rules-cell-phones-are-private-property/67-300323516

    The data residing on one's smartphone is not considered in the hands of a third party. Not even a third party should have access to its contents without permission. Much like landlords cannot enter and search their tenants' units without permission (unless there's an emergency). A landlord of an apartment building has the right to kick out a tenant he finds out is a pedophile, because there are a lot of kids living in the other units. But that landlord cannot just enter that tenant's apartment and search his computer to find this out.

    Complaining that Apple is scanning and searching through your photos for child abuse, when they are stored in their iCloud servers, would be nonsense.

    Complaining that Apple is performing the scan and search while the photos are still on your device is not nonsense. Your device and its contents are not considered in the hands of a third party, and thus you do not lose any rights of privacy and protection from unreasonable and warrantless search without probable cause.  You lose those rights once the contents of your device are in iCloud, and not before.

    The option should be whether you give Apple permission or not to do their search while the photos are still on your device. Apple does not need your permission to do their search when your photos are on their servers. The option should not be whether to use iCloud or not. That is nonsense. 
  • Reply 149 of 158
    davidw said:
    nicholfd said:
    davidw said:
    nicholfd said:
    Rogue01 said:
    lkrupp said:
    lkrupp said:
    Dead_Pool said:
    Subway’s Jared tweets his appreciation!
    Think of the children who will suffer abuse because a few privacy wackos don’t want Apple to scan their photos. Fuck those kids, right?
    Nope. Apple can very well scan the photos in iCloud and report it to authorities. They have the keys to decrypt the files stored in iCloud, so there is NOTHING that is preventing Apple from doing it and NO ONE is against this. The opposition is only for doing the scan ON the device, NOT in iCloud.
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
    Apple has openly and clearly stated that the CSAM hashes are stored on your device and every photo on your device is scanned BEFORE they are uploaded to their servers when iCloud Photos is turned on, and Apple turns it on by default.  Apple will scan every photo on your device.  Why do you choose not to believe that?
    Stop the fucking nonsense.  

    By definition, when iCloud Photos is turned on, all photos stored in the Photos app (nowhere else) are uploaded to iCloud; hence all photos stored in the Photos app are scanned when they are uploaded to iCloud!

    Apple DOES NOT EVER scan every photo on your device.  Apple only scans photos stored in the Photos app, and only if iCloud Photos is turned on, do they scan them.

    iCloud Photos is NOT turned on by default.  Period.  You are new here, so I'll assume you have NEVER used an Apple device, right?

    Save space on your iPhone

    iCloud Photos can help you make the most of the storage space on your iPhone. When Optimize iPhone Storage is turned on, all your full‑resolution photos and videos are stored in iCloud in their original formats, with storage-saving versions kept on your iPhone as space is needed.

    Optimize iPhone Storage is turned on by default. To turn it off, go to Settings  > [your name] > iCloud > Photos, then tap Optimize iPhone Storage.


    https://support.apple.com/guide/iphone/use-icloud-photos-iph961b96c4d/ios



    If you are not new here and have used an Apple device, what's your excuse for not knowing?

    What the fuck does that have to do with iCloud Photos being optional, or ANYTHING else I said?  Optimize iPhone Storage is turned on by default ONLY IF YOU ENABLE iCloud Photos.  

    The article you linked tells you how to use iCloud Photos. It does not say anything about iCloud Photos being turned on by default. Only if you turn on iCloud Photos is "Optimize iPhone Storage" turned on by default.

    And hell - you don't even have to use iCloud at all.  Don't sign in to iCloud and there's not even an option to use iCloud Photos.

    As I said, STOP the nonsense.
    My mistake. It's not "Optimize Photo" I was thinking of that is enabled by default; it's "My Photo Stream".

    https://www.compsmag.com/how-to/turn-off-my-photo-stream-to-free-up-1gb-of-space-on-ios/

    https://www.igeeksblog.com/disable-turn-off-photo-stream-on-iphone-and-ipad/

    https://support.apple.com/guide/iphone/use-my-photo-stream-iphbfeb468fc/14.0/ios/14.0

    Not sure if it's still enabled by default, now that iCloud Photos can handle what My Photo Stream does, and with more features. So those that want the features of My Photo Stream can now choose which to use. But before iCloud Photos, My Photo Stream was enabled by default. And if you had it enabled on a device, using a backup from it for a new device would also keep it enabled on the new device. I had several friends ask me how come they had photos in iCloud when they made it a point to never turn on iCloud Photos, mainly because they only had the free 5GB of space and didn't want to fill it up with photos. They backed up to a computer with iTunes anyway. What I found out was that they had My Photo Stream enabled, and they swear they didn't enable it or even know what it was, as they only had one Apple device.

    When enabled, all the photos taken will automatically end up in the iCloud and not take up any of the free 5GB. But the photos are only in the iCloud for 30 days and limited to 1000 of your most recent photos (no videos). It's not a back up. Just a convenient way to share the photos on one device, with another one that is also logged into the same iCloud account.  

    So, will Apple also scan and search these photos on the device? They are not in an iCloud Photos Library, but they will be on the iCloud servers. And what about backups? From what I understand, if you don't have iCloud Photos enabled, a backup will save all the photos on the iPhone, iPad or iPod touch, and that backup can be saved in iTunes or automatically in iCloud. Will Apple eventually also scan and search the photos in the backup on the device, before it's saved in an iCloud account? Those photos will also be stored on the iCloud servers.

    It's not the rules that Apple has set for users of iCloud that are upsetting a lot of iDevice users. Some people want and need to use iCloud. It's that Apple is planning to do their scan and search of the photos that are going to end up on their iCloud servers while the photos are still on their customers' devices, before they are saved in iCloud.

    The SCOTUS has ruled that a smartphone can be an extension of one's private property and is protected from unreasonable searches and no warrant can be issued without probable cause. Much like a person's home or car. Thus law enforcement must obtain a warrant for a search and only with probable cause.

    https://www.wtsp.com/article/news/local/supreme-court-rules-cell-phones-are-private-property/67-300323516

    The data residing on one's smartphone is not considered in the hands of a third party. Not even a third party should have access to its contents without permission. Much like landlords cannot enter and search their tenants' units without permission (unless there's an emergency). A landlord of an apartment building has the right to kick out a tenant he finds out is a pedophile, because there are a lot of kids living in the other units. But that landlord cannot just enter that tenant's apartment and search his computer to find this out.

    Complaining that Apple is scanning and searching through your photos for child abuse, when they are stored in their iCloud servers, would be nonsense.

    Complaining that Apple is performing the scan and search while the photos are still on your device is not nonsense. Your device and its contents are not considered in the hands of a third party, and thus you do not lose any rights of privacy and protection from unreasonable and warrantless search without probable cause.  You lose those rights once the contents of your device are in iCloud, and not before.

    The option should be whether you give Apple permission or not to do their search while the photos are still on your device. Apple does not need your permission to do their search when your photos are on their servers. The option should not be whether to use iCloud or not. That is nonsense. 
    Apple made it very clear - photos that go to Apple's servers (iCloud) will be scanned.  You should assume the Photo Stream photos will be scanned.  And I do not believe the Photo Stream is on by default (I've never seen it on by default), but I'm not going to waste my time setting up one of my test devices to prove it.

    If you look at everything I wrote, I never said anything about the merits of what Apple is doing - good or bad.  

    What I'm calling out as nonsense is all the misinformation that people are spreading - "It's on by default" (WRONG!), "They're scanning everything on my iPhone!" (WRONG!), etc.

    Apple has also made it very clear that the photos going to iCloud are the only thing they are scanning, and the scanning occurs in the pipeline of the actual upload of the photo to iCloud.  One of their discussions was very clear on that (they even used the word pipeline).  So if the photos never get uploaded (you don't have WiFi on - it only uploads on WiFi), then the photos are never scanned.  It's that simple - as they're uploaded, they're scanned on your device, and you the user control if/when they are uploaded.
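The "scanning happens in the upload pipeline" claim amounts to simple control flow. Here's a toy Python sketch of that reading, with made-up function names (nothing here is an actual Apple API): a photo that never enters the upload path is never scanned.

```python
# Toy control-flow model of the claim above: the scan lives inside the
# upload pipeline, so a photo that is never uploaded is never scanned.
# All names are illustrative stand-ins, not Apple's implementation.

def make_safety_voucher(photo):
    # Stand-in for the on-device hash match + voucher generation step.
    return {"photo": photo, "voucher": f"voucher-for-{photo}"}

def upload_pipeline(photos, icloud_photos_on, wifi_available, server):
    if not (icloud_photos_on and wifi_available):
        return 0  # upload never starts, so nothing is scanned
    for p in photos:
        server.append(make_safety_voucher(p))  # scan happens here, in-line
    return len(photos)

server = []
# iCloud Photos off: no scan, no upload.
assert upload_pipeline(["a.jpg"], False, True, server) == 0 and server == []
# iCloud Photos on but no WiFi: still nothing.
assert upload_pipeline(["a.jpg"], True, False, server) == 0 and server == []
# Both conditions met: photos are scanned as part of the upload itself.
assert upload_pipeline(["a.jpg", "b.jpg"], True, True, server) == 2
```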
    edited September 2021 jony0
  • Reply 150 of 158
    crowley Posts: 10,453 member
    davidw said:

    Complaining that Apple is performing the scan and search while the photos are still on your device is not nonsense. Your device and its contents are not considered in the hands of a third party, and thus you do not lose any rights of privacy and protection from unreasonable and warrantless search without probable cause.  You lose those rights once the contents of your device are in iCloud, and not before.
    No, you lose those rights once you agree to waive them, which will be in the iCloud terms and conditions.  Agree to the terms and conditions and you agree to Apple scanning your photos on device.
  • Reply 151 of 158
    davidw Posts: 2,049 member
    crowley said:
    davidw said:

    Complaining that Apple performs the scan and search while the photos are still on your device is not nonsense. Your device and its contents are not considered to be in the hands of a third party, so you retain your rights to privacy and to protection from unreasonable, warrantless searches without probable cause. You lose those rights only once the contents of your device are in iCloud, and not before.
    No, you lose those rights once you agree to waive them, which will be in the iCloud terms and conditions.  Agree to the terms and conditions and you agree to Apple scanning your photos on device.
    Just because you agreed to it doesn't mean that what Apple is doing is legal. And if what Apple is doing is illegal, your agreeing to it doesn't make it legal on Apple's part.

    The classic example is that an employee cannot agree to be paid less than the minimum wage. If an employer pays an employee less than the minimum wage, the employer is breaking the law even if the employee agrees to it. An employee cannot sign away the right to be paid the minimum wage. 

    Now, is what Apple is doing with their search on their customers' devices, and how they are informing them about it, legal? The courts might rule that at least some of what Apple is doing is illegal. And it wouldn't matter whether you agreed to allow Apple to do the search or not.  

    https://securityboulevard.com/2021/08/is-apples-client-side-child-porn-scanning-legal/
  • Reply 152 of 158
    crowley Posts: 10,453 member
    davidw said:
    crowley said:
    davidw said:

    Complaining that Apple performs the scan and search while the photos are still on your device is not nonsense. Your device and its contents are not considered to be in the hands of a third party, so you retain your rights to privacy and to protection from unreasonable, warrantless searches without probable cause. You lose those rights only once the contents of your device are in iCloud, and not before.
    No, you lose those rights once you agree to waive them, which will be in the iCloud terms and conditions.  Agree to the terms and conditions and you agree to Apple scanning your photos on device.
    Just because you agreed to it doesn't mean that what Apple is doing is legal. And if what Apple is doing is illegal, your agreeing to it doesn't make it legal on Apple's part.

    The classic example is that an employee cannot agree to be paid less than the minimum wage. If an employer pays an employee less than the minimum wage, the employer is breaking the law even if the employee agrees to it. An employee cannot sign away the right to be paid the minimum wage. 

    Now, is what Apple is doing with their search on their customers' devices, and how they are informing them about it, legal? The courts might rule that at least some of what Apple is doing is illegal. And it wouldn't matter whether you agreed to allow Apple to do the search or not.  

    https://securityboulevard.com/2021/08/is-apples-client-side-child-porn-scanning-legal/
    Yeah, the courts aren't going to do that though.  The article you posted has a buttload of words in it, but there's nothing there indicating any chance of Apple being found to have done anything illegal.   And there are outright mistakes, such as:
    Obviously, Apple’s teams of lawyers thought all of this out before they considered allowing the company to break into the file cabinets of every customer, rummage through every document, picture or file on those customers’ devices and report the findings back to the local gendarmerie.
    Apple's solution does not do that, so I'm going to outright dismiss the article as poorly informed and badly analysed tosh.  Apple passing data through a CSAM scan before uploading it to Apple's own servers is obviously not a privacy violation, because the data is being surrendered to Apple's custody in order to store it in iCloud.  Practical sense will rule here, not the desperate thrashings of privacy zealots trying to latch onto any possibility they can.
  • Reply 153 of 158
    davidw said:
    It's not the "scan" that people are worked up about, it's the "search". Apple is searching for certain images on your device that they don't want on their servers, not just scanning them for your benefit.
    What are you talking about?  Apple isn't going around searching all of the files on your device.  Only when you attempt to upload a photo to iCloud Photos is that particular photo scanned.  And yes, that photo is checked against known CSAM material.  Frankly, I find the distinction you are trying to make here rather bizarre. 
    This would be like UPS bringing a drug-sniffing dog into your home to sniff the parcels you are about to ship by UPS. Obviously UPS has the right not to ship illegal drugs and to search any parcel in their system suspected of containing them. But they don't have the right to do that search while the parcels are still in the shipper's home, when there is no reason to believe the shipper is shipping illegal drugs. Even if the parcels already have UPS shipping labels on them, UPS would have to wait until the parcels are in their truck or warehouse to do the search, whether or not they suspect the parcels contain illegal drugs.  
    In your analogy, UPS isn't bringing a dog into your home to randomly sniff for drugs.  Rather, when you decide you want to ship a package, you put your box in a device that scans it and pre-certifies it, so it doesn't need to be checked anywhere later in the process (unless the pre-certification found drugs, of course).  If people actually thought about this for a minute, most would prefer the check to happen in the privacy of their home by a trusted process rather than having strangers do whatever they want with their package after it ships. 

    Plus, when Spotlight or Photos "scans" your images, it's for your benefit, not for Apple's or anyone else's.  

    And yet there are those who can't, or won't, see the difference between Apple "scanning" the images on your device for your benefit and Apple "searching" for images on your device for theirs. When Spotlight or Photos scans your images for "dogs", it isn't searching for signs of animal abuse and reporting you to the SPCA if it finds photos that might be evidence of abuse. 
    Right, Apple has been scanning your images for years, for your benefit.  You don't have to load your photos into the cloud to have that service either.  That's the definition of privacy.  However, IF YOU CHOOSE to upload your pictures to the cloud, Apple (like EVERY OTHER CLOUD PROVIDER) has the right to make sure you're not sending illegal CSAM images ONTO THEIR CLOUD.  You can either choose to enforce that in a private way, by scanning photos locally on your device as you try to upload them, or in an invasive way, where the scanning of your photos happens on their cloud.  Either way, the scanning is happening.  When it happens in the cloud, you have no idea what else they are doing with it.  Apple provides multiple levels of auditability for their process.  Do other cloud providers provide the same?
  • Reply 154 of 158
    gatorguy Posts: 24,213 member
    techconc said:
    When it happens in the cloud, you have no idea what else they are doing with it.  Apple provides multiple levels of auditability for their process.  Do other cloud providers provide the same?
    How can the process of scanning data on either an iPhone or iCloud be independently audited and verified when the software is closed source? Serious question. I can't find reference to it.

    Since you asked: Google Cloud is independently audited by third parties, and the process and security can be verified. I would be surprised if Amazon's cloud were not also independently audited.
  • Reply 155 of 158
    gatorguy said:
    How can the process of scanning data on either an iPhone or iCloud be independently audited and verified when the software is closed source? Serious question. I can't find reference to it.
    Apple has provided detail on this. 

    https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf

    "This approach enables third-party technical audits: an auditor can confirm that for any given root hash of the encrypted CSAM database in the Knowledge Base article or on a device, the database was generated only from an intersection of hashes from participating child safety organizations, with no additions, removals, or changes. Facilitating the audit does not require the child safety organization to provide any sensitive information like raw hashes or the source images used to generate the hashes – they must provide only a non-sensitive attestation of the full database that they sent to Apple. Then, in a secure on-campus environment, Apple can provide technical proof to the auditor that the intersection and blinding were performed correctly. A participating child safety organization can decide to perform the audit as well."
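    The "intersection with no additions, removals, or changes" property Apple describes can be illustrated with a toy root-hash check. This is only a sketch: a SHA-256 over a sorted list stands in for Apple's actual (unpublished) construction, and the org hash sets here are made up.

    ```python
    import hashlib

    def root_hash(hashes):
        """Deterministic digest over a sorted hash list: any addition,
        removal, or change to the set yields a different root."""
        h = hashlib.sha256()
        for entry in sorted(hashes):
            h.update(entry.encode())
        return h.hexdigest()

    # Hypothetical attestations from two participating child safety orgs.
    org_a = {"aaa", "bbb", "ccc"}
    org_b = {"bbb", "ccc", "ddd"}

    # Only hashes vouched for by BOTH organizations ship on devices.
    shipped = org_a & org_b
    published_root = root_hash(shipped)  # the root published for devices

    # An auditor recomputes the intersection from the orgs' attestations
    # and checks it against the published root, without needing raw images.
    auditor_root = root_hash(org_a & org_b)
    ```

    If Apple slipped any extra hash into the shipped database, the recomputed root would no longer match the published one, which is what makes the database auditable without revealing its contents.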
  • Reply 156 of 158
    gatorguy Posts: 24,213 member
    techconc said:
    gatorguy said:
    How can the process of scanning data on either an iPhone or iCloud be independently audited and verified when the software is closed source? Serious question. I can't find reference to it.
    Apple has provided detail on this. 

    https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf
    Thanks!
  • Reply 157 of 158
    techconc said:
    lkrupp said:
    And as AppleInsider and Apple have stated, images are not scanned on the device. But you choose to believe it’s a lie because...?
    No, images are scanned on device.  What's funny is that people are only getting worked up about this now.  Apple has been scanning images on our devices for a long time.  This isn't the CSAM hash-type scanning; it's the machine learning scanning I'm talking about.  That's how we can search for generic things like "dog" or "beach" and get a bunch of relevant pictures from our library.  Where are all of the "slippery slope" discussions around that?  Seriously, the level of stupid being raised about this topic is mind-numbing. 
    It’s not so much the scanning aspect (the case you speak of uses on-device ML), but rather the scanning and reporting.  Of course, images, text, etc. will necessarily need to be accessible to the OS and/or specified applications (it has to read and write them, after all); it’s the unauthorized transmission of that data off the device that is the concern.  I’m OK if Apple wants to scan and report CSAM in the cloud: their service, their prerogative.  But we should then also recognize that iCloud Photos is not completely private and act accordingly, because those photos will be scanned and potentially shown to individuals you did not intend (accidentally or intentionally).

    To put this scanning and potential reporting on the phone, rather than in the service, takes that private space and makes it more public, even if it’s currently only done when you upload to iCloud.