Some of this sounds like PR BS. I don’t see how this helps children like Tim claims. And Apple collaborating with the government is embarrassing.
Collaborating with the government how? (you are the government in a democracy btw).
Well in that case we can stretch it to collaborating with me (government) because I’m allowing them to snoop through my photos.
From the article: "In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…" -Tim Cook
Maybe I’m reading it wrong but “legal” is a government entity. I would bet some looneys from Congress or the FBI twisted Apple’s arm to provide this.
NO NO NO NO NO
"legal" is shorthand for the the company's legal dept. In this case, Apple's legal department. The lawyers that work for Apple on staff (usually) that vet everything Apple does to make sure it is legal (in their opinion) and what legal consequences of stuff could be. In addition to any other day to day legal type work the company needs.
It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
Apple isn't examining personal pictures. No one is examining anything. Your photo has a unique number based on how the pixels are laid out, and that number is compared to a database of numbers representing images for an exact match. False positives are one in a trillion. This is oversimplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.
Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures. Are you purposely trying to be obtuse?
Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology.
Sorry, but you're ignorant. A hash is a representation of the image but is not the image. Much like TouchID stores a numeric string that represents your fingerprint, but is *not* your fingerprint. FaceID does this too.
I'm not ignorant. This is what I do for a living and have for the last 29 years. Tell me, how do you generate a hash without having some input as the source?
The conspiracy theorist thinking is maddening, isn’t it? I just about blew up reading that one’s comment. I’m glad two other people responded to it because I have no patience any more.
This is monumentally bad for privacy. It's making me reconsider my investments in Apple products. My entire family belongs to the Apple ecosystem. We all have some version of the iPhone 12, iPads and Apple Watches. It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
If it makes the perverts switch to Android, so be it and good riddance. And I doubt you will leave the platform when you realize you’re still better off with iOS than Android. Google will be doing exactly the same thing shortly as it usually follows Apple in these matters. Who will you go to? Your statement is just a fart in a wind storm. Good luck with it making any difference.
Google's already doing it, as is Microsoft. See Charles Arthur's comments here: https://theoverspill.blog/2021/08/06/apple-scanning-child-abuse-imagery-start-up-1609/
If you're going to use a possibly uncommon acronym or abbreviation in your article, you should ALSO define it at least once in said article. I had to look up CSAM (Child Sexual Abuse Material) elsewhere. It isn't universal or exclusive to this topic. The lack of defining it in the article felt... conspicuous.
Wes is not the one who misunderstands. There is no pixel-to-pixel comparison.
The iCloud pictures are mathematically hashed, and the hashes are compared against a database of hashes provided to Apple by the NCMEC. Apple does not have the source pictures; it has only the hash database.
How exactly do you mathematically hash an image--or any file for that matter--without looking at the 1's and 0's in the file?
Because you're reading it one 0 and one 1 at a time. A single pixel is many 0s and 1s, depending on the bit depth and compression algorithm. There is no contextual evaluation or assessment of the file, or even of a single pixel, beyond the generation of the hash.
There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
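That content-agnostic point can be shown directly. A hasher consumes raw bytes and emits a fixed-size digest, whether those bytes came from a JPEG, a Word document, or anything else. (Sketch with SHA-256 as a stand-in; the byte prefixes below are just the real magic numbers of those file types, padded with filler.)

```python
import hashlib

jpeg_bytes = b"\xff\xd8\xff\xe0" + b"filler"  # starts like a JPEG file
docx_bytes = b"PK\x03\x04" + b"filler"        # starts like a .docx (zip) file

for data in (jpeg_bytes, docx_bytes):
    digest = hashlib.sha256(data).hexdigest()
    print(len(digest))  # always 64 hex characters, regardless of input type
```

The hasher never decodes the file as an image or a document; it only folds bytes into a number.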
Which means they have to open the file, read the contents, generate the hash, transmit the hash.
They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash.
You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash.
Read the whole comment that I wrote. Reading the stream of 0s and 1s gives no clue to anyone as to what's in the file.
I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode them isn't the same as knowing what the file is.
Read my whole quote. My concern is not that someone at Apple is looking at the visual representation of a file on my phone. My concern is that they are opening a file without my permission and examining the contents: even if it's just the 1's and 0's.
Today they're only looking at 1's and 0's. Tomorrow? Who knows...
The bottom line is that the code MUST open the file. The code MUST read the contents of the file.
Ah! But can I feed it corned beef? If so, would it still generate a hash?
I guess all I can say here is Apple knowing that a file has 0s and 1s in it is not the same as knowing those 0s and 1s are a picture of a palm tree, and then a picture of a dog, and then a living room shoot.
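The "digest tells you nothing" claim can be demonstrated with the avalanche effect of a cryptographic hash: change one character of the input and the digest is completely different, so the digest carries no usable picture of the content. (Sketch only; note that Apple's perceptual hash is deliberately *tolerant* of small changes, unlike the SHA-256 used here.)

```python
import hashlib

a = hashlib.sha256(b"palm tree photo bytes").hexdigest()
b = hashlib.sha256(b"palm tree photo byteS").hexdigest()  # one character differs

print(a == b)  # False: the digests are unrelated despite near-identical inputs
matching = sum(x == y for x, y in zip(a, b))
print(matching)  # only the handful of the 64 hex positions that match by chance
```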
Let me try again. Apple is looking at your phone and the files on your phone to determine if you broke the law. They are doing this without your consent.
Would you consent to law enforcement searching your phone without a warrant? Some random cop stops you on the street and wants to look at your pictures to see if you've broken the law. Would you consent? Probably not. I know I wouldn't.
What Apple is doing isn't much different except that they aren't looking at the visual representation of the file.
A rose by any other name is still a rose. This is a massive violation of privacy that has the potential to go very wrong in the wrong hands.
Note that image hashing is not reversible, so an image hash cannot be used to create another image that matches, or to modify an existing image so that it matches the original.
Also, nobody is forced into using iCloud. I for one assumed this kind of system had already been in place for years!
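One intuition for why the hash can't be "reversed" into the image: a multi-megabyte photo is reduced to a tiny fixed-size digest, so nearly all of the original information is discarded in the process. (Sketch with SHA-256 and random bytes standing in for a photo.)

```python
import hashlib
import os

photo = os.urandom(3_000_000)         # stand-in for a ~3 MB photo
digest = hashlib.sha256(photo).digest()
print(len(photo), "->", len(digest))  # 3000000 -> 32
```

Three million bytes cannot be reconstructed from thirty-two, any more than a person can be reconstructed from a fingerprint.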
It seems like the non-techies (as you put it) are the ones who have the fewest problems with this. Probably because they don't understand how hashes are generated.
Yes, nobody is forcing people to use iCloud, but this means I have to choose between having my privacy violated or not use the features of iCloud that I'm paying for.
They can look at and examine all my photos in iCloud or on my phone. I've got nothing to hide… Haha