Photos in Sierra auto-tags all photos - no opt-out, or opt-in, and as I recall the Photos web page even had an arguably misleading statement suggesting the tagging was optional rather than compulsory data mining... One can call it various shades of private, but I'm guessing all it takes is an executive order and it's fair game...
All photo analysis happens on the iPhone itself; no data is sent to the servers, so Apple has no idea what your iPhone found in the photos on your iPhone.
Absolutely true. This is why your phone and your iPad each tag your photos separately. None of this tagging info is stored outside the device.
Photos are actually a good example of the limits of Apple's approach.
I use Google Photos because it allows me to group faces: the AI automatically detects and groups faces and you can provide labels (Google doesn't try to identify who the person is - the AI is just able to group all the faces of my son or daughter, say).
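The grouping described above is typically done by turning each detected face into a numeric embedding and clustering embeddings that lie close together. A toy sketch of that idea (plain Python, not Google's or Apple's actual pipeline; `group_faces` and the threshold are illustrative assumptions):

```python
import math

def group_faces(embeddings, threshold=0.6):
    """Greedily assign each face embedding to the first group whose
    centroid lies within `threshold` (Euclidean distance); otherwise
    start a new group. Returns a list of groups of face indices."""
    groups = []  # each entry: [centroid, [member indices]]
    for i, v in enumerate(embeddings):
        for g in groups:
            centroid, members = g
            if math.dist(centroid, v) < threshold:
                members.append(i)
                n = len(members)
                # update the running centroid with the new member
                g[0] = [(c * (n - 1) + x) / n for c, x in zip(centroid, v)]
                break
        else:
            groups.append([list(v), [i]])
    return [members for _, members in groups]

# Toy 2-D "embeddings": two tight clusters of faces
faces = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0), (0.0, 0.1)]
print(group_faces(faces))  # → [[0, 1, 4], [2, 3]]
```

Real systems use high-dimensional embeddings from a neural network and more robust clustering, but this is the shape of it - which is also why the "son at age 1 vs. age 4" confusion happens: the embeddings drift apart and land in separate clusters.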
If this grouping were not synced across mobile and desktop, it would be useless for me. I don't want to maintain two sets of face groupings (the AI sometimes has trouble differentiating between my son at age 1 and age 4, so I manually combine those instances).
I figure since I back up my photos on Google Photos or iCloud, these companies already have all my photo information - if I trust them with that why wouldn't I trust them with calculating summary statistics on them?
It's the same with email: if you use email from Apple or Google, all the data is already on their servers. Hence, features like Priority Inbox don't collect any further data - they just make use of the data the provider already has.
I can better understand why people are concerned about tracking cookies - there, new data is created that otherwise wouldn't exist. Similarly, if Siri records voice transcriptions in order to run AI on them, that also collects new data.
But why would I be concerned about Google or Apple doing AI on my email or photos since they already have access to every single email and photo?
My question is this: couldn't Apple collect as much data as it needs to teach its AI the desired skill, then aggressively delete that data after the machine has learned from it? That way Apple wouldn't be collecting and keeping large amounts of info on anyone, which in my opinion is what creeps people out - the sheer comprehensiveness of the information about one's life in a corporation's possession is just scary. But if Apple just collects, learns, then discards, keeping only the AI skill, would that be acceptable? Any thoughts?
I would wager that differential privacy will be the default for all the upcoming machine-learning systems, no matter who the provider is. It's already been shown to be workable.
I disagree that differential privacy will be the default for all the upcoming machine learning systems due to providers' desire to collect as much data as possible. On another note, differential privacy was not championed until Apple raised the flag. Google, Microsoft and others studied differential privacy and all decided not to use it. Now that Apple appears to be making progress with keeping data private by using differential privacy others may consider it, but with Amazon, Facebook, Google and others collecting as much data as possible online and offline for profiling, I sincerely doubt they will be strong advocates of the use of differential privacy.
Also, as others noted, Google has been experimenting with differential privacy for a while (see RAPPOR), well before Apple began its own limited use of it. But kudos to Apple for bringing the privacy feature to the public's attention. Google doesn't do PR very well - kinda surprising for a company that makes most of its living promoting other companies.
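RAPPOR builds on the decades-old randomized-response technique: each user's report is deliberately noisy, so no individual answer can be pinned on them, yet the aggregate is still estimable. A toy sketch of the core idea (plain Python, not Google's or Apple's actual mechanism; function names are illustrative):

```python
import random

def randomized_response(truth, p=0.5):
    """With probability p report the true bit; otherwise report a
    uniformly random bit. Any single report is plausibly deniable."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p=0.5):
    """Invert the noise: E[report] = p*q + (1-p)*0.5 where q is the
    true rate, so q = (mean - (1-p)/2) / p."""
    mean = sum(reports) / len(reports)
    return (mean - (1 - p) / 2) / p

random.seed(0)
true_bits = [i < 300 for i in range(1000)]   # true rate is 30%
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_true_rate(reports), 2))  # should land near 0.30
```

The provider learns population statistics ("roughly 30% of users do X") without being able to trust any individual's answer - which is exactly the trade the thread is debating.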
Experimenting is one thing, using it for the benefit of your customers is another.
Oh, and illegally using your customers' data when they specifically asked you not to is another thing again.
It's a good thing Google doesn't illegally use it when you tell them not to, huh? I assume you're referring to the Google/Safari dust-up from a couple years ago where Google got fined about $22M?
I'll do you a favor and make sure you are aware of this: you and many others believe that "Do Not Track" carries some legal weight, and that turning it on in Safari or any other browser means you can not be legally tracked. It does not mean that at all, and it was not the reason for the fine. The fine came because Google improperly advised Safari users on opt-out options. Do Not Track is probably ignored as often as it is honored on the web. If you don't want cookies and tags following you around the internet, you have to take much more active steps to avoid it. A Safari setting is pretty useless - more a feel-good thing than effective. https://nakedsecurity.sophos.com/2014/08/26/do-not-track-the-privacy-standard-thats-melting-away/
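For concreteness: Do Not Track is just an advisory HTTP request header (`DNT: 1`). Nothing in HTTP compels a server to act on it; honoring it is entirely a server-side choice. A toy sketch (the `honors_dnt` helper is hypothetical, not any real framework's API):

```python
def honors_dnt(headers):
    """A server that CHOOSES to honor Do Not Track might check the
    advisory DNT request header like this. Many servers simply never
    look at it, which is perfectly legal."""
    return headers.get("DNT") == "1"

request_headers = {"User-Agent": "Safari", "DNT": "1"}
if honors_dnt(request_headers):
    pass  # voluntarily skip setting tracking cookies
```

That the check is one optional line of server code - and skipping it has no legal consequence - is why the setting is "more a feel-good thing than effective."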
OK, so Google is only unethical by circumventing user settings, not criminal. Got it. Sorry, but that's still bunk - it's an obvious sign that Google doesn't respect its users, because the users are not Google's customers; they're the product. Crappy company; their interests are not aligned with mine as the user.
I can only imagine the eternal outcry if Apple had been caught cheating its users in this fashion.
iOS automatically detects and groups faces and you can provide labels. It doesn't currently sync them across devices, but last I heard this is coming.
You obviously didn't read the Sophos link, or if you did you didn't understand it. Yahoo ignores Do Not Track. Facebook ignores Do Not Track. NetFlix ignores Do Not Track. Pandora ignores Do Not Track. General Mills, Toyota, Subway, Kraft Foods, IBM, Bank of America, American Express and Intel have all put their opposition to Do Not Track in writing.
And that brings up another point of contention: services where you must actively opt out rather than opt in - and no, that's not specific to Google. Even Apple has required users to actively opt out of some tracking features, with the default being on. You should always have to opt in, IMHO. Even when the FCC very recently fined Verizon for the way it used supercookies (some articles implied they are illegal; they aren't, but should be, IMHO), it still allowed Verizon to go on using them with the default set to on, meaning customers have to actively opt out to avoid it. http://www.phonearena.com/news/Verizon-fined-1.35-million-for-illegal-use-of-mobile-traffic-supercookies_id79071
Way too few people have any idea what they've actually agreed to when using their mobile devices and computers.
To begin with, "everyone else is doing it" is a bizarrely immature line of reasoning.
Secondly, in your haste to jump to Google's defence, you missed the word "illegal" in my original post. Although it is unacceptable to ignore requests for privacy, it doesn't appear to be unlawful, and it is not why Google was fined in the Apple case.
Google was punished in the Apple case for breaking the agreement it made not to run roughshod over user privacy after it was convicted of an even worse crime the year before.
After launching Buzz, Google copied its Gmail users and details of the people they interacted with over to Buzz, without consent.
In a display of pure arrogance, having promised not to play fast and loose with folks' right to privacy again, Google was soon at it once more.
So it wasn't the Safari hack alone that was the problem. The problem was that Google thought it was above the law, so it got a much-deserved slap when it pulled the same anti-privacy nonsense with Safari.
Now I believe this is the part where you feverishly trawl the Internet looking for random Apple crimes to throw at the wall.
In the long run, all companies that depend on revenue from the collection and sale of personal data are doomed. We're simply in the earliest phase of the business model. The majority of people have no idea of the exposure they have. They have little or no direct experience with the consequences of that exposure. Accordingly, they are still ripe for tremendous exploitation. Consumer exploitation will continue until a threshold of direct experience with negative consequences is met, then the paradigm will shift dramatically toward security and privacy. Where/when is this threshold? I don't know, but once consumers start to feel violated and manipulated, it won't take long to eschew the companies and tech that screwed them over.
Apple is paying the short term price for building in privacy and security. Siri's learning curve. HomeKit's agonizingly slow progress. Ridicule from government and citizens that consider disallowing a backdoor to be unpatriotic or even treasonous. Watching nice niche markets get chewed up by competitors that are falling over themselves to exploit customer data.
Ultimately, the puck will be at customer security and privacy. Apple may be the only player there. Apple is doing both what is right and what will be in greatest demand.
In Apple's view, a photo is just a photo. But once you start tacking names onto it, it becomes an identifiable piece of information that they have no business storing. Fair enough, but I really don't understand it myself. Why is this a problem when a billion address books are stored in iCloud?
No I did not miss your use of illegal. You apparently thought ignoring a Safari setting was illegal. That was initially the reason I replied to you and the focus of the first part of my post that you seemingly find offensive.
I would have expected you to comment on what I had to say - not expecting a thank-you, of course, for making you aware of something you weren't understanding, and that does actually affect you if you wrongly assume a setting is protecting you - but at the least something from you regarding best privacy practices and the failure of Do Not Track. Instead it's back to but... but... GOOGLE. Well, OK then; I've nothing more to add for you. Ya got it covered.
...the incremental privacy creep of macOS has me asking why so few seem concerned... Windows 10 seems even worse, turning on access with the latest 1607 'feature update' without notice... I now install from an external key, disconnected from the web, and check all privacy settings afterward, for whatever that may be worth(less) given what lies below any GUI. I'm keeping a couple of Macs that run Snow Leopard, just in case.
AI may be the next big push to harvest what remains of work and IP, and concentrate the wealth to the half percent - will we be dealing with more than fake news or alternative facts very soon?
Interesting that Photos indexes and tags without an opt-out choice. Wasn't aware.
I get nervous each time someone writes "sale of personal data". Newspapers and other old-fashioned subscription services used to sell addresses. Google doesn't sell any addresses. They sell access to "consumers who are tech enthusiasts" or "consumers who have young kids". That's a big difference.
I remember a time without Google, when there was no effective search, there was no unlimited email and you had to constantly back up old messages, and there was nothing like Google Maps. These services have created tremendous consumer surplus, and they are financed by targeted ads. Google has a dashboard that shows exactly the data it collects, and you can switch off any parts of that data collection that you dislike, if you are so inclined.
Moreover, I find it very funny that you believe that Apple is doing "what is right" and that it cares about customer security and privacy:
Apple is fully invested in China which is its second biggest market. It engages in censorship since it removes apps that the Communist government doesn't like as well as books and publications (such as NYT app in China).
The much-maligned Google, on the other hand, gave up the Chinese market in 2010 when it retreated to Hong Kong - giving up a 30% search market share in that country and tremendous growth potential. It did that after the Chinese government tried to hack the accounts of political activists. It also gave up news censorship after moving out of China (censorship had been a condition for operating Google News in that country before 2010).
Ultimately, most of the AI by Facebook, Google, Apple etc. is applied to data that is already stored on these companies' servers - uploaded photos, emails, messages and so on. Just because you don't allow Apple to tag your photos doesn't change the fact that they could face-match everything on their servers if they chose to do so.
Hence, it all comes down to trust. How much do I trust Apple, Google, Facebook etc?
So far I haven't seen Apple do anything as self-damaging as Google's pullout from China. Why should I trust Apple more than Google?
Yes, I agree with your point. I can't pretend to know Apple's true motives and priorities when it comes to China. Certainly, making money is a primary motivation, but I'm tempted by the following logic:
If Apple seeks to establish a beachhead in China for its stated (and practiced - in other countries) ethos regarding privacy and security, it cannot do so by retreating like Google. In the short term, this makes Apple look hypocritical. In the long run, Chinese customers will increasingly aspire to Apple's premium experience and privacy potential. For the Chinese customer, privacy and security must be a most tempting forbidden fruit.
Human history repeats stupidity ad nauseam, but there are no examples of governments being able to censor and suppress large populations indefinitely. Suppression is especially difficult when a culture's education and affluence are in a phase of ascendancy.
Apple is there, selling bottles and making money... Chinese customers know there is a Genie (privacy) in their bottle. Sooner or later, the Genie will get out. When it does, Apple will enjoy both the profits and a win for human rights. That sounds like Apple to me.
In the long run, all companies that depend on revenue from the collection and sale of personal data are doomed. We're simply in the earliest phase of the business model. The majority of people have no idea of the exposure they have. They have little or no direct experience with the consequences of that exposure. Accordingly, they are still ripe for tremendous exploitation. Consumer exploitation will continue until a threshold of direct experience with negative consequences is met, then the paradigm will shift dramatically toward security and privacy. Where/when is this threshold? I don't know, but once consumers start to feel violated and manipulated, it won't take long to eschew the companies and tech that screwed them over.
Apple is paying the short term price for building in privacy and security. Siri's learning curve. HomeKit's agonizingly slow progress. Ridicule from government and citizens that consider disallowing a backdoor to be unpatriotic or even treasonous. Watching nice niche markets get chewed up by competitors that are falling over themselves to exploit customer data.
Ultimately, the puck will be at customer security and privacy. Apple may be the only player there. Apple is doing both what is right and what will be in greatest demand.
I get nervous each time someone writes "sale of personal data". Newspapers and other old-fashioned subscription services used to sell addresses. Google doesn't sell any addresses. They sell access to "consumers who are tech enthusiasts" or "consumers who have young kids". That's a big difference.
I remember a time without Google where there was no effective search, there was no unlimited email and you had to constantly backup old messages, there was nothing like Google Maps etc. These services have created tremendous consumer surplus and they are financed by targeted ads. Google has a dashboard which shows exactly the data they collect and you can switch off any parts of that data collection that you dislike if you are so inclined.
Moreover, I find it very funny that you believe that Apple is doing "what is right" and that it cares about customer security and privacy:
Apple is fully invested in China, which is its second-biggest market. It engages in censorship, removing apps the Communist government doesn't like (such as the NYT app in China) as well as books and publications.
The much-maligned Google, on the other hand, gave up the Chinese market in 2010 when it retreated to Hong Kong, giving up a 30% search market share in that country and tremendous growth potential. They did that after the Chinese government tried to hack the accounts of political activists. They also stopped censoring news after moving out of China (censorship had been a condition for operating Google News in that country before 2010).
Ultimately, most of the AI by Facebook, Google, Apple etc. is applied to data that is already stored on these companies' servers, such as uploaded photos, emails, messages etc. Just because you don't allow Apple to tag your photos doesn't change the fact that they could face-match everything on their servers if they chose to do so.
Hence, it all comes down to trust. How much do I trust Apple, Google, Facebook etc?
So far I haven't seen Apple do anything as self-damaging as Google's pullout of China. Why should I trust Apple more than Google?
Yes, I agree with your point. I can't pretend to know Apple's true motives and priorities when it comes to China. Certainly, making money is a primary motivation, but I'm tempted by the following logic:
If Apple seeks to establish a beachhead in China for its stated (and practiced - in other countries) ethos regarding privacy and security, it cannot do so by retreating like Google. In the short term, this makes Apple look hypocritical. In the long run, Chinese customers will increasingly aspire to Apple's premium experience and privacy potential. For the Chinese customer, privacy and security must be a most tempting forbidden fruit.
Human history repeats stupidity ad nauseam, but there are no examples of governments being able to censor and suppress large populations indefinitely. Suppression is especially difficult when a culture's education and affluence are in a phase of ascendancy.
Apple is there, selling bottles and making money... Chinese customers know there is a Genie (privacy) in their bottle. Sooner or later, the Genie will get out. When it does, Apple will enjoy both the profits and a win for human rights. That sounds like Apple to me.
I hope you are right. I sometimes think that Apple only talks boldly where it is inexpensive to do so.
I think you only have to look as far as the past week's news about Cook meeting with Trump and others in Washington. He was very clear in his reasoning: [i]You don't effect changes by running away, you effect changes by having a seat at the table. [/i]
Unless somehow you think he's lying about this, it makes sense, and is consistent with much of what Apple has done over many years, even if it might seem imperfect at a glance. Google did what they felt they had to do in China, and it's one of the few large decisions where I mostly respect what they did, but I also think it was a bit of a knee-jerk response. It will take many years to see how it all plays out.
I don't think it was as "knee-jerk" as you might remember it to be. For several years Google thought they were effecting change in China, and that agreeing to censor news, as China demanded, was only a temporary cost on the way to real change. They resisted calls from both human rights groups and some of their own shareholders to stand up to Chinese censorship, thinking that by staying they were making a difference. http://www.pcworld.com/article/145668/article.html But Chinese actions in 2010/11 required taking a stand; not taking one would have been seen as passive acceptance of Chinese government moves, then and going forward. Read Sergey Brin's comments immediately following Google's decision to move operations outside of China. http://www.spiegel.de/international/business/google-co-founder-on-pulling-out-of-china-it-was-a-real-step-backward-a-686269.html
Google has absolutely made some mistakes, a couple of them doozies. But something they have been very consistent on is their views about free and open access to information, willing to pay both political and monetary penalties to do what they feel is right (i.e., the recent Right to Be Forgotten issue). Free movement of knowledge and even people is a strong personally held view of Google's founders. Latest example: Last evening Mr Brin walked with protesters in San Francisco to show opposition to the recent immigration issue. Respect.
Knee-jerk wasn't the best choice of words, and after resurrecting my account after a move/hiatus my post was stuck awaiting moderation, uneditable. What I meant was closer to "reactionary". And as a response to the hacking attacks it could easily be construed as petulant, but I know there was more at play than just that. Again, this was a decision that I could mostly respect, even if I don't respect what the company does overall.
But in re-reading the Spiegel article, I found this telling:
Brin: The hacking attacks were the straw that broke the camel's back. There were several aspects there: the attack directly on Google, which we believe was an attempt to gain access to Gmail accounts of Chinese human rights activists. But there is also a broader pattern we then discovered of simply the surveillance of human rights activists.
It comes across like Brin is trying to say that previously they didn't really understand the surveillance that China was performing on their citizens. That's complete bullshit. If any tech leader understands the nature of government monitoring, it's Brin. So what could he have really meant?
Let's think logically about how this relates to the differences between Apple's and Google's decisions with regard to China. I think Brin realized that, because the very nature of Google's business is essentially surveillance and profiling, it was simply too dangerous for them to continue operating in China without terrible repercussions for a lot of people. The nature and very soul of Google is personal profiles. That's where and why they make the bulk of their money; I think we can all agree on that much. Perhaps the writing was on the wall at that point that continuing in China would require concessions (or perhaps they were already being requested), such as today's "servers must be in-country" rule. That, along with the hacking, probably made them realize there was no way they'd be able to protect that data over time, so they bailed. Not necessarily a bad decision; as I said earlier, it will take years to understand how it all plays out.
While Apple does have visibility into many of its customers, the same kind of deep profiling is clearly not the mainstay of its business, and it's possible to use Apple's hardware without giving an ounce of personal data back to Apple at all. I am proof of that. Even if the decision matrix both companies used was based on the same factors, the coefficients would be very different.
Then again, for all we know Cook/Jobs/Brin/Page and the whole gang all got together over lunch and played rock, paper, scissors...
I'd like to see Siri work offline; the only time I would use it is in the car, where coverage is patchy at best. The iPhone is surely powerful enough to manage and compute some basic offline commands. Conversely, I'd like the ability to sync my iCloud photo tagging: I spent weeks sorting People tags only to find I had to do it all over again on my iPad.
I can only imagine the eternal outcry if Apple had been caught cheating its users in this fashion.
Yahoo ignores Do Not Track. Facebook ignores Do Not Track. Netflix ignores Do Not Track. Pandora ignores Do Not Track. General Mills, Toyota, Subway, Kraft Foods, IBM, Bank of America, American Express and Intel have all put their opposition to Do Not Track in writing.
FWIW Google Chrome includes Do Not Track settings.
https://www.cnet.com/how-to/how-to-enable-chromes-do-not-track-option/
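For context, Do Not Track is purely advisory at the protocol level: a browser with the setting enabled simply adds a `DNT: 1` request header, and each server decides for itself whether to honor it, which is exactly what the companies listed above decline to do. A minimal sketch of what honoring it would look like server-side (the function name and header dictionary are mine, for illustration):

```python
# Do Not Track is just an HTTP request header: "DNT: 1" means the user
# has opted out of tracking. Nothing forces a server to respect it.

def honors_dnt(headers):
    """Return True if the client sent a Do Not Track opt-out."""
    return headers.get("DNT") == "1"

# Request from a browser with Do Not Track enabled:
print(honors_dnt({"DNT": "1", "User-Agent": "Example/1.0"}))  # True
# Request with no preference expressed:
print(honors_dnt({"User-Agent": "Example/1.0"}))              # False
```

The entire mechanism is that one header check, which is why "ignoring Do Not Track" costs a site nothing technically: it just never calls anything like this.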
To begin with, "everyone else is doing it" is a bizarrely immature line of reasoning.
Secondly, in your haste to jump to Google's defence, you missed the word "illegal" in my original post. Although it is unacceptable to ignore requests for privacy, it doesn't appear to be unlawful, and it is not why Google was fined in the Apple case.
Google was punished in the Apple case for breaking the agreement it made not to run roughshod over user privacy after it was sanctioned for an even worse privacy violation the year before.
When it launched Buzz, Google copied its Gmail users, and details of the people they interacted with, over to Buzz without consent.
http://www.pcworld.com/article/223778/article.html
In a display of pure arrogance, and having promised not to play fast and loose with people's right to privacy again, Google was at it again.
So it wasn't the Safari hack that was the problem. The problem was that Google thought it was above the law, so they got a much-deserved slap when they pulled the same anti-privacy nonsense with Safari.
Now I believe this is the part where you feverishly trawl the Internet looking for random Apple crimes to throw at the wall.
I would have expected you to comment on what I had to say. Not a thank you, of course, for making you aware of something you hadn't understood, something that actually affects you if you wrongly assumed a setting was protecting you, but at least something about best privacy practices and the failure of Do Not Track. Instead it's back to but... But... GOOGLE. Well, OK then, I've nothing I can add for you. Ya got it covered.