Inside iOS 10: Apple doubles down on security with cutting-edge differential privacy
After a bruising battle with the Federal Bureau of Investigation and a contentious debate over encryption in the wake of the San Bernardino terrorist shooting, Apple is doubling down on privacy protection in iOS 10, researching cutting-edge techniques that enable advanced new features while protecting user data.
"All of this great work in iOS 10 would be meaningless to us if it came at the expense of your privacy. And so in every feature that we do we carefully consider how to protect your privacy," said Craig Federighi, Apple's senior vice president of Software Engineering, at the WWDC16 event on June 13.
In pursuit of advanced protection, Apple is investing in "differential privacy," a technique for maximizing the accuracy of queries against statistical databases while minimizing the chance of identifying the specific individuals whose data is in them. Differential privacy accomplishes this with a combination of methods such as hashing, subsampling and noise injection.
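To make the noise-injection idea concrete, here is a minimal sketch of "randomized response," a classic local differential privacy mechanism. It is illustrative only, not Apple's actual implementation; the type names, epsilon value and simulated data are assumptions for the example.

import Foundation

// Randomized response: each device perturbs its own boolean answer before
// it ever leaves the device, so no individual report can be taken as true,
// yet the collector can still estimate aggregate frequencies.
// Hypothetical sketch; names and the epsilon value are illustrative.
struct RandomizedResponse {
    /// Probability of reporting the true value; a higher epsilon means less noise.
    let truthProbability: Double

    init(epsilon: Double) {
        // Two-coin randomized response: p = e^epsilon / (e^epsilon + 1)
        let e = exp(epsilon)
        truthProbability = e / (e + 1)
    }

    /// Perturb a single boolean answer (e.g. "did you type this new word?").
    func privatize(_ trueAnswer: Bool) -> Bool {
        if Double.random(in: 0..<1) < truthProbability {
            return trueAnswer       // report honestly
        } else {
            return !trueAnswer      // flip the answer to inject noise
        }
    }

    /// Invert the noise on the collector's side to estimate the true "yes" rate.
    func estimateTrueRate(fromNoisyYesRate observed: Double) -> Double {
        let p = truthProbability
        // E[observed] = p * t + (1 - p) * (1 - t), solved for t
        return (observed - (1 - p)) / (2 * p - 1)
    }
}

// Simulate 10,000 devices where 30% truly use a hypothetical new word.
let mechanism = RandomizedResponse(epsilon: 1.0)
let reports = (0..<10_000).map { _ in mechanism.privatize(Double.random(in: 0..<1) < 0.3) }
let noisyYesRate = Double(reports.filter { $0 }.count) / Double(reports.count)
print("Estimated true rate:", mechanism.estimateTrueRate(fromNoisyYesRate: noisyYesRate))

Hashing and subsampling, the other techniques mentioned, broadly serve the same goal: hashing keeps raw values from leaving the device in readable form, and subsampling limits how much any single user contributes to the aggregate.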
When Aaron Roth, a world-class privacy researcher from the University of Pennsylvania, saw the company's efforts, he called them "groundbreaking" and said that scaling it up and incorporating it broadly into the technology "is visionary and positions Apple as the clear privacy leader among technology companies today."
The technique is important because Apple, like many companies, analyzes device usage to spot mass trends that inform software improvements, such as discovering new words to include in QuickType.
In addition, Apple is continuing its practice of providing end-to-end encryption by default in apps and services like FaceTime, Messages and HomeKit to protect communications. And when user data is analyzed with advanced techniques such as deep learning and other forms of artificial intelligence, that processing is done on the device itself, keeping personal data under user control.
Federighi also pledged that Apple builds no user profiles based on Internet searches.
"We believe you should have great features and great privacy," he said. "You demand it and we're dedicated to providing it."
"All of this great work in iOS 10 would be meaningless to us if it came at the expense of your privacy. And so in every feature that we do we carefully consider how to protect your privacy," said Craig Federighi, Apple's senior vice president of Software Engineering, at the WWDC16 event on June 13.
In pursuit of advanced protection, Apple is investing in "differential privacy," a means of maximizing the accuracy of queries from statistical databases while at the same time minimizing the chances of identifying specific individuals. Differential privacy uses a variety of techniques to do this, like hashing, subsampling and noise injection.
Apple's work in differential privacy has earned the praise of Aaron Roth, a world-class expert on the subject, who called the technology in iOS 10 "groundbreaking."
When Aaron Roth, a world-class privacy researcher from the University of Pennsylvania, saw the company's efforts, he called them "groundbreaking" and said that scaling it up and incorporating it broadly into the technology "is visionary and positions Apple as the clear privacy leader among technology companies today."
The technique is important since Apple, like many companies, attempts to analyze device use to spot mass trends so that software improvements can be made - for example, discovering new words to include in QuickType.
In addition, Apple is continuing its practice of providing end-to-end encryption by default in apps like Facetime, Messages and Homekit to protect communications. When it comes to advanced, deep learning, artificial intelligence analysis of user data, it is being done on the device itself, keeping personal data under user control.
Federighi also pledged that Apple builds no user profiles based on Internet searches.
"We believe you should have great features and great privacy," he said. "You demand it and we're dedicated to providing it."
Comments
Haha, just kidding.
I notice governments want citizens to have weaker security/privacy, while citizens are not fed up with governments losing data for not being secure enough.
Even if the data were de-noised, it still wouldn't personally identify individuals. That's a big difference in how the data is collected. The noise just makes it even less useful to those trying to capitalize on "people as the product".
I wonder what impact this will have on Apple's rumoured issues with recruiting experts in the field of AI. If those experts were concerned about not having access to the same data they would have at companies like Facebook and Google, which have no issue spying on their users, will this be enough of a consolation to offset those concerns?
I like that Apple is working on ways to get the data required while also finding new ways to ensure user privacy is maintained. It's not the easy way, but it's certainly the way that will keep me as a loyal customer.
I am not a part of Facebook. I stopped using Google search over 2 years ago and all of my devices are Google-free zones.
Google sends their $&#% spy cams on the road, takes photos of my house and posts them on the Internet. I have taken to planting trees all around my home to give their spy cams as little view of my home as possible.
I do make a substantial number of purchases on Amazon and I know they have a pretty good profile on me. However, I trust Amazon far more than Google.
I have searched for my name and profile using all of the search engines, Google included, along with the pay sites. Half of the information is wrong and much of it is old. However, pictures of my home and vehicle are on the Internet courtesy of Google. It is quite unsettling, and the company gets none of my business as a result.
If the millennials don't care, that's their problem. I do care and I refuse to give my business to any company that profiles me so that they can sell it to the highest bidder. No thanks.
I'll stay with Apple on iOS, using Anonymizer when browsing the web. Google is more than welcome to profile those too lazy, too uncaring, or too trusting to protect their profiles.
https://www.theguardian.com/technology/2015/apr/10/facebook-admits-it-tracks-non-users-but-denies-claims-it-breaches-eu-privacy-law
...but this year they fessed up and admitted they were doing so, while maintaining that it still wasn't breaking EU law.
http://www.theverge.com/2016/5/27/11795248/facebook-ad-network-non-users-cookies-plug-ins
You only THINK you're not part of Facebook. But both they and Acxiom, along with thousands of other data brokers buying and selling your health, financial, and family information, would thank you for keeping your focus on just Google.
For most companies making apps, especially e-commerce apps:
A/B testing and analytics frameworks? Check.
Logging all your requests? Check.
Requiring an account, even though most of your data could use a temporary ID? Check.