Apple's differential privacy analyzes the group, protects the individual
Apple's incorporation of differential privacy into its data-collection efforts marks the first wide-scale use of the technique, according to Aaron Roth, the University of Pennsylvania computer scientist who "literally wrote the book" on it, as Craig Federighi, Apple's senior vice president of Software Engineering, put it at WWDC on June 13.

Aaron Roth, associate professor of Computer and Information Science at the University of Pennsylvania.
In an exclusive interview with AppleInsider, Roth noted that while Apple hadn't achieved a breakthrough in the technique, it has become the industry leader in incorporating differential privacy across its services.
"The thing that I think is really exciting about this is not that they have a new scientific or technological breakthrough," said Roth. "Differential privacy is something that we've been studying for more than a decade now. It's that they've decided to make this a central part of their product."
Roth, an associate professor in the University of Pennsylvania's Department of Computer and Information Science, co-wrote the 2014 book "The Algorithmic Foundations of Differential Privacy" with Cynthia Dwork.
The concept behind differential privacy is to obscure big-data results by introducing statistical "noise" that masks individual inputs while still yielding useful information about larger trends. Roth uses the example of surveying people about whether they might want to use recreational marijuana: the data collector is interested in large-scale results but would not want to reveal any individual's answer. The best approach, then, is to obscure each individual answer in a way that still produces a representative sample.
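The survey example Roth describes corresponds to the classic "randomized response" mechanism from the differential-privacy literature. The sketch below is an illustration of that general idea, not Apple's actual implementation (which has not been published); the parameter names and the 50/50 coin-flip design are illustrative choices.

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.5) -> bool:
    """With probability p_truth, report the true answer; otherwise report a
    uniformly random coin flip. Any single report is plausibly deniable."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.5) -> float:
    """Invert the noise. Since E[reported yes] = p_truth * true_rate
    + (1 - p_truth) * 0.5, solve for true_rate from the observed fraction."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate a survey where 30% of respondents would truthfully answer "yes".
random.seed(1)
reports = [randomized_response(random.random() < 0.30) for _ in range(100_000)]
estimate = estimate_true_rate(reports)  # lands near 0.30 despite the noise
```

The key property is exactly the one described above: no individual report reveals that person's answer, yet across many respondents the aggregate trend is recoverable with high accuracy.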
Apple uses these large data patterns to improve products and develop new ones. The first academic paper on differential privacy was published in 2006, according to Roth. Where Apple is breaking new ground is in incorporating the technique on a widespread basis.
"I think what's exciting to me is that Apple is making it a central focus of its data collection efforts," said Roth.
While Apple's incorporation of differential privacy -- announced at WWDC -- was touted as "groundbreaking" and "visionary," Apple is not the first company to use the process to protect users' data. Google has been using the technique in its Chrome browser through its RAPPOR project.
Roth was approached to complete a peer review of Apple's research and is a consultant to the company on privacy matters.
Since WWDC, Roth says he's been receiving more phone calls than usual. "It's definitely interesting to have my 15 minutes of fame," he said. "Hopefully, once the hype dies down, this encourages people to read more about the science of differential privacy."
Apple's use of the technique has also caught the attention of other privacy advocates.
"As with all technology, the devil is in the details," Jeremy Gillula, a staff researcher at the Electronic Frontier Foundation, told AppleInsider. "It is certainly a promising thing, and the fact that Apple is trying to incorporate it is at least a step in the right direction. From our perspective, it would be important for Apple, as it puts these techniques in their systems, to release details or even release code so that independent third-party analysts and researchers can go through it."
Comments
It's really very simple.
There are legions of haters/losers/idiots who always try to break down Apple discussions using a binary technique. They then apply this simple Yes/No to whatever topic is at hand so they can try to minimize what Apple is doing or make it appear others have already done it. Here are three examples (two well known and the third in this article):
- Fingerprint Scanner. Apple introduced it in the 5S. Immediately all the idiots start proclaiming that the Motorola Atrix had a fingerprint sensor before the iPhone 5S. And they would be correct, as they are just "stating the facts" as thewhitefalcon said above. But they are being deceitful. The facts are that the Atrix had a horrid fingerprint sensor that was not only unreliable, but had an astonishingly bad failure rate (if you owned the phone for more than 6 months there was a good chance the sensor had already died). Other phones (like Samsung) added fingerprint sensors, but their first generation versions were also terrible. If you looked at these from a Yes/No perspective, then you could say that the iPhone 5S was no different than other phones which also had fingerprint sensors. If you looked at the real "facts" you'd see the implementation as done by Apple makes their version superior.
- Data Collection. Google collects your data. Apple collects your data. Therefore Google and Apple are equal. This is one the Google fans like to trot out whenever someone pokes fun at all the data Google collects about you or talks about user privacy. It's sheer idiocy to think that the data collection done by Apple is anywhere near the scale of Google. Or that Apple monetizes your data to the same scale as Google (which makes some $50+ billion a year off targeted ads based on this data). Again, reducing something down to a simple Yes/No to try and make Apple appear the same as Google when it comes to collecting your data.
- Differential Privacy. If you look at articles on all the tech blogs, idiots everywhere were coming out of the woodwork with their usual "Apple wasn't the first to use this" or "Google already uses this". I especially like the last one, as it implies that Google uses this everywhere (instead of just in RAPPOR). Let's be clear here: Google DOES NOT use differential privacy in its core business, the one responsible for almost 90% of its revenues - targeted ads. There are several papers on the topic of using differential privacy with targeted ads, but at this time it doesn't seem possible to achieve the same level of granularity when targeting ads while also using differential privacy. And since almost 90% of Google's revenue comes from targeted ads, you can bet they won't be using any new technology that could affect their ability to make money off those ads. Which is why I predict it will be some time (if ever) before we see Google talk about differential privacy in the context of targeted ads.
So it's important to explain in the article what Google is doing with differential privacy not as a means to slight Apple or promote Google, but to take the wind out of the sails of the idiots who will, again, try to minimize what Apple has done here. That is, diving into the deep end and using differential privacy across all their services, instead of using it like a research study on some minor aspect of your business (like everyone else).
I think it is necessary to point out the reason why he has to make the argument, perhaps not with the insults, although some people really are just ignorant. They might not really be 'idiots', 'haters' or 'losers', but if you read some of their comments on tech sites, you can't help but believe they are.
Not going to edit anything. The vast majority of people I see posting these comments are clearly trolls who are fully aware of what they are saying and are being intentionally obtuse with the intention of misleading readers. This, IMO, makes them worse than the usual trolls who spout useless stuff like "Apple sucks". They are carefully wording what they say with the express intent to deceive, and I'm not going to mince words when describing them.
To the very small minority who make such comments without any intention to deceive, or because they lack the knowledge and are just interested in furthering their understanding, I apologize. To the rest I say "fuck you".
My advice is to adopt better reading skills and not let your emotions control what you post, because you clearly misunderstood my example, which showed that a cross-reference to a technology would also be perfectly fine on a website detailing an Android feature that referenced when Apple had added the same feature years earlier.
You nearly always make good points, and generally supported with facts. You really want visiting readers to see Apple fans as just a different side of the same hater-troll-idiot coin? It's easy to avoid, just don't tag everyone who might not agree with you as one of them. That's all I'll say about it.