Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was a part of the user agreement long before 2021. You have to agree to Apple's terms to use the service...so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just like it isn't true for all the other mainstream cloud services.
However, unlike the other major cloud services, Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.
The problem of any privacy violation isn't that Apple is already scanning for CSAM on their iCloud servers (and most likely has been for a while) but that they were planning to scan for CSAM on the owner's device. It doesn't matter that the scans were going to take place right before the images are transferred from the owner's device to iCloud. The software for the scanning process is on the owner's device, the scanning process uses the owner's device resources, and the scanning takes place before the images are on the iCloud server. That is not the same as Apple scanning for CSAM on their servers, the way the other cloud services do it.
On the contrary, it's EXACTLY the same thing. The user agrees to Apple's terms/conditions to use iCloud and any apps that the user has personally selected to be used with iCloud backups will have files scanned per those terms/conditions. Nothing has changed from a privacy perspective. The exact same files that would be scanned on the server side would be scanned on-device.
Sure, you can come up with conjectural scenarios where Apple runs wild and starts scanning everything on the phone regardless of whether it's been selected for iCloud backup...but that could obviously occur without CSAM or iCloud too. Any company that creates an OS could theoretically create privacy-invading functions within that OS.
If the illegal images have been encrypted, Apple won't be able to determine that they're illegal. That's why some folks want them to scan an individual's phone before the images are encrypted. I'll go with the 4th Amendment and say show me the search warrant. Otherwise, bug off.
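The encryption point can be sketched concretely. Below is a toy Python illustration of my own (not Apple's actual pipeline; real systems match perceptual hashes rather than exact SHA-256 fingerprints, and the cipher here is a stand-in, but the mismatch works the same way): a server that only ever sees ciphertext cannot match a file against a list of known-file fingerprints.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in cipher for the sketch; any real cipher has the same effect
    # of producing bytes unrelated to the plaintext.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The server's list of fingerprints of known files (toy data).
known_fingerprints = {fingerprint(b"known illegal file")}

plaintext = b"known illegal file"
ciphertext = xor_encrypt(plaintext, b"secret")

print(fingerprint(plaintext) in known_fingerprints)   # True: matchable before encryption
print(fingerprint(ciphertext) in known_fingerprints)  # False: all the server sees
```

This is why scanning, if it happens at all, has to run on data the scanner can read in the clear.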
How can you claim the 4th Amendment when it isn't the government scanning the files, and giving Apple permission to scan them is part of the iCloud user agreement? Nothing is scanned unless you personally choose to use iCloud backup.
Good luck with that. If Apple’s own engineers tried and failed, it’s very unlikely anyone else could, and certainly not this fringe outfit.
Apple never failed. Their CSAM system was well-designed and did protect privacy better than any existing system out there. People are just afraid of what they don't understand, or refuse to understand.
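For what it's worth, the design did include a genuine privacy mechanism: Apple's published description had the server able to decrypt match "vouchers" only after a threshold of roughly 30 matches accumulated. The underlying idea is threshold secret sharing. Here is a toy sketch of my own using Shamir's scheme (invented parameters, not Apple's actual construction): a key split so that any 3 of 10 shares recover it, while fewer reveal nothing.

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is in the field GF(P)

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with f(0) = secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0), i.e. the secret."""
    total = 0
    for xj, yj in shares:
        num = den = 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P
    return total

random.seed(7)            # deterministic for the demo
key = 123456789           # stand-in for a voucher decryption key
shares = make_shares(key, threshold=3, n=10)
print(reconstruct(shares[:3]) == key)  # True: 3 shares suffice
```

Below the threshold, the shares are information-theoretically useless: any two points are consistent with every possible secret, which is what let the server learn nothing about accounts with only a handful of matches.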
You've misunderstood how the CSAM system worked. It never "scanned your phone". The system was only invoked at the moment that a photo was to be uploaded to iCloud. It wasn't scanning all of the data across your entire phone. This was about preventing iCloud from hosting and propagating the questionable material, not about busting you for what you may have on your phone in other apps.
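For readers curious what "only invoked at upload" means mechanically: the published design computed a perceptual hash of just the photo being queued for iCloud and compared it against fingerprints of already-known material. Here is a toy Python sketch of that matching step (my own illustration using a simple average hash, not Apple's NeuralHash; the function names and distance threshold are invented):

```python
def average_hash(pixels):
    """64-bit 'perceptual' hash of an 8x8 grayscale image (toy aHash):
    each bit says whether that pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two hash strings."""
    return sum(x != y for x, y in zip(a, b))

def check_before_upload(photo_pixels, known_hashes, max_distance=4):
    """Invoked only for a photo queued for iCloud upload; compares its
    hash against the fixed list of known-image fingerprints."""
    h = average_hash(photo_pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# Fingerprint database built from one "known" image (a toy gradient).
known = [average_hash([[i * 8 + j for j in range(8)] for i in range(8)])]

same  = [[i * 8 + j for j in range(8)] for i in range(8)]   # the known image
other = [[(i * j) % 17 for j in range(8)] for i in range(8)]  # unrelated image

print(check_before_upload(same, known))   # True: matches a known fingerprint
print(check_before_upload(other, known))  # False: no match, nothing reported
```

Note what the sketch does not do: it never walks the filesystem or other apps' data; it only fingerprints the one image being handed to the upload path.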
No, it is not EXACTLY the same. Nowhere in the EULA did Apple state or even imply that they would install software on YOUR device in order to scan the images that you are going to upload to iCloud. Yes, the result is exactly the same in regards to the privacy (or lack thereof) of the images you are going to store in your iCloud account. But the rights to private ownership that you have in regard to your iPhone would have been violated. It doesn't matter that Apple is not the government or acting on the government's behalf. Just because Apple is not the government doesn't mean that you lose all rights pertaining to privacy with your iPhone. If you work for a private company, your employer cannot force you to install company software on your smartphone without your permission, nor install it without your knowledge. Nor can they search your phone just because it's sitting on your workplace desk on company property, being charged with company-paid electricity.
Apple got in big trouble in nearly all the countries where they sold iPhones because they installed software that throttled the CPU if it detected a battery that was bad or about to go bad. This was to prevent the iPhone from crashing without notice due to the bad battery. Cynics claimed that Apple did this to purposely slow down iPhones and force customers to buy new, faster ones, but Apple was not found guilty of that. What Apple was actually found guilty of was installing software on their customers' iPhones without permission, with no way for customers to disable it if they didn't want it. Some customers claimed the software throttled their iPhone even though they had a good battery. Even if that was just a bug, customers should have had the choice to disable the software or not have it installed at all. All Apple had to do was give their customers the choice to install, or the ability to disable, such software on iPhones that are their private property, and that would have saved Apple hundreds of millions of dollars (if not over $1B by now).
Apple also caught a lot of flak for downloading a free U2 album into all their customers' iTunes accounts without their consent (or even knowledge). And back then, the iTunes library lived on their devices. The album could not be deleted until Apple created a special tool to do just that (it would simply download again if deleted the usual way). In the meantime, it sat there taking up drive space for the tens of millions of iTunes customers who didn't want it. Being able to (eventually) delete the album is not the same as Apple giving them the choice not to download it into their library in the first place, even if in the end, either way, the album is not in their iTunes library.
Now, you might not have a right to keep your images private from CSAM scanning once they're in iCloud, but surely you have the right to not allow Apple to install software on your iPhone to do that scanning before the images reach the iCloud servers. Scanning for CSAM is strictly voluntary on Apple's part. Apple should not force their customers to host the scanning software on their customers' privately owned devices. Apple should have to ask for permission first, just as they should have done with the battery-throttling software and the free U2 album.
Obviously Apple has to be scanning files in order to remove content. Just click on the link I provided and go to Section V and you will find highly detailed information.
Actually, Section V says that “Apple does not control the Content posted via the Service, nor does it guarantee the accuracy, integrity or quality of such Content” which is pretty much the opposite of what you are asserting.
Section V also states that Apple reserves “the right” to do such scanning and removing to enforce the agreement, comply with laws, protect others, etc but does not state that they are actively doing it.