Privacy advocates express concern over Apple allowing developers to use iPhone X Face ID camera data
Several privacy advocates are taking issue with Apple allowing developers to use the TrueDepth camera central to the Face ID system under limited circumstances -- and with serious restrictions imposed by Apple on what coders can use the data for.

A report on Thursday morning by Reuters notes that Apple's terms for developers allow app creators to utilize "certain facial data" for purposes such as attaching an augmented reality mask to a user's face. Additionally, some data can be collected by developers -- assuming they get "clear and conspicuous consent" from the user.
Apple disallows any access to the stored identification data itself. Under debate is any developer use of the TrueDepth system associated with Face ID.
"The privacy issues around the use of very sophisticated facial recognition technology for unlocking the phone have been overblown," said Jay Stanley, a senior policy analyst with the American Civil Liberties Union. "The real privacy issues have to do with the access by third-party developers."
Apple's terms specifically prohibit developers from using the face data for advertising or marketing, or generating user profiles to identify anonymous users. Also disallowed is selling the data to others who may use the information.
Clare Garvie, an associate with the Center on Privacy & Technology at Georgetown University Law Center, said that Apple's terms for developers are clear that the technology is a "user experience addition" and not one for advertising.
An Apple corporate employee not authorized to speak on behalf of the company told AppleInsider that what developers are allowed to use is "profoundly, seriously limited" and isn't precise enough to be utilized to build a third-party facial recognition database, even if users allow their data to be collated.
The issue isn't developers that adhere to Apple's policies. Advocates are concerned about rogue developers who may use the Face ID system unscrupulously and surreptitiously.
Apple already requires discrete permission to be granted per app to use a camera in the first place. At this time, it is unclear whether there is a required dialog box for users to agree to before developers can collect Face ID data.
"Apple does have a pretty good historical track record of holding developers accountable who violate their agreements, but they have to catch them first - and sometimes that's the hard part," Stanley said. "It means household names probably won't exploit this, but there's still a lot of room for bottom feeders."
The Face ID system and the TrueDepth camera are debuting with the iPhone X -- which will be in customers' hands on Friday. The technology is expected to migrate to Apple's entire fall 2018 iPhone line.

Comments
Also take a moment to appreciate what Facebook and Google Photos are already doing with access to your images.
"Apple allows developers to take certain facial data off the phone as long as they agree to seek customer permission and not sell the data to third parties, among other terms in a contract seen by Reuters.
App makers who want to use the new camera on the iPhone X can capture a rough map of a user’s face and a stream of more than 50 kinds of facial expressions. This data, which can be removed from the phone and stored on a developer’s own servers, can help monitor how often users blink, smile or even raise an eyebrow."
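To make concrete what a stream of expression coefficients enables, here is a minimal Python sketch of the kind of monitoring the Reuters passage describes -- counting blinks from per-frame expression values. The frame format (a dict of named coefficients in 0.0-1.0, loosely modeled on the blend-shape data Apple exposes) and the threshold are assumptions for illustration, not Apple's actual API.

```python
# Hypothetical sketch: counting blinks from a stream of per-frame
# expression coefficients (0.0-1.0). The frame layout and the
# threshold below are assumptions, not Apple's API.

BLINK_THRESHOLD = 0.5  # assumed cutoff for "eye is closed"

def count_blinks(frames, key="eyeBlinkLeft"):
    """Count rising edges where the blink coefficient crosses the threshold."""
    blinks = 0
    was_closed = False
    for frame in frames:
        closed = frame.get(key, 0.0) >= BLINK_THRESHOLD
        if closed and not was_closed:
            blinks += 1
        was_closed = closed
    return blinks

# Simulated 8-frame stream containing two distinct blinks.
stream = [
    {"eyeBlinkLeft": 0.1}, {"eyeBlinkLeft": 0.8}, {"eyeBlinkLeft": 0.9},
    {"eyeBlinkLeft": 0.2}, {"eyeBlinkLeft": 0.1}, {"eyeBlinkLeft": 0.7},
    {"eyeBlinkLeft": 0.3}, {"eyeBlinkLeft": 0.0},
]
print(count_blinks(stream))  # 2
```

The point is that even coarse, anonymized-looking coefficients become behavioral data once aggregated over time on a developer's server.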
I'd invite you to comment again.
Now Facebook is a different matter, tho I'm not quite as up-to-date with their use of uploaded images. I do know they do some psychological experimentation on their members and perhaps photos are a part of it? My wife finding herself "tagged" with her name in another person's photos was a bit unsettling to her. She did not realize Facebook could ID her. I suppose using a little caution if you're someplace you'd prefer not to be ID'd in might be wise as it seems almost everyone has a Facebook page anymore and most have dozens/hundreds of pics on their page. Even if you don't have a Facebook account yourself you're not safe from being "tagged" and identified in some picture someone posts there.
Or, would Apple release any data that would allow developers to do dirty work "on their servers inaccessible to Apple"? Of course novice bright young urban entrepreneurs will always come forward to brag about their clever "hacking" of Face ID by putting moustaches on that girl's face.
If an app has been given access to the camera and photos, then there's a chance that a dishonest developer can take your photos, collect data about you, and use it to monetize that data or whatever else they can think of (many scary possibilities).
What I'd like to see happen is that when an app needs access to the camera, the request should be very specific; for example: "App needs access to the camera to take photos/videos to be stored in your photo album," or something to that extent. The current request is just too generic for an app to use the camera.
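For context, iOS camera prompts already include a developer-supplied purpose string, set via the `NSCameraUsageDescription` key in the app's Info.plist -- so a developer could make the wording as specific as the commenter suggests, though the dialog itself is system-controlled and the string is not enforced to be accurate. A minimal fragment (the string below is an example, not Apple-mandated text):

```
<key>NSCameraUsageDescription</key>
<string>This app uses the camera only to take photos and videos that are saved to your photo album.</string>
```

Whether a comparable, separately worded prompt will gate TrueDepth face data specifically is exactly the open question the article raises.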
When Google decided to sneak past a bug in Safari and track users who had specifically asked not to be tracked, it demonstrated that the unscrupulous cannot be trusted to toe the requested line.
Not sure this is such a great idea.
https://images.apple.com/business/docs/FaceID_Security_Guide.pdf
Look in "Other uses for FaceID".
This must be a new use of the data that wasn't covered in the Apple White Paper.
My father is concerned all the time, but that goes with the territory; at 98, his grasp of the complexity of the world is reduced.
That's basically the same thing, but for those "privacy advocates." The things you don't understand frighten you.
If they're concerned now, wait till those devices are in private hands and scan anything that moves, on their owners' property and in front of it, without any regard -- and then that gets aggregated by some broker who will pay money for it.
Internet-of-things devices with those capabilities will arrive in 3-5 years and will be everywhere -- many of them Android with almost no security -- and will be a much bigger concern.
https://developer.apple.com/videos/play/fall2017/601/
The sensor projects 30,000 dots, so the mesh can't have more detail than that, and they likely average over a few adjacent dots for a smooth result. The density increases where the surface changes more. The lower count helps with animation performance.
AAA games use higher resolution meshes:
https://wccftech.com/ryse-polygon-count-comparision-aaa-titles-crysis-star-citizen/
It would have to be combined with photo data for ID. Machine learning algorithms can extract pretty accurate meshes from images alone:
http://cvl-demos.cs.nott.ac.uk/vrn/
It can even do a better job than that if it starts with a detailed generic face model and then adjusts it to fit the shape of the person as it would have accurate shapes for the eyeballs, eyelids, nostrils, mouth, teeth, ears etc.
Even with high resolution meshes and image data producing a 3rd party database, there's not much that it can be used for that would affect people. It might be useful to the medical industry, plastic surgeons, anthropologists, fashion industry to find potential models.
It's not as if Apple has exclusive depth sensors either. Stores can put a Kinect-like sensor at the checkout when you pay for something.
It would actually be quite useful to have developers able to access the high-res data from the sensor as they can make 3D scanners. Hold the device close to an object and move the camera around it to capture a model.
Apple can prevent apps from uploading data: they can put restrictions on apps that do face scanning and ask the user whether to allow the app to send or share data. If not, network access and share access can be locked down while the scanner is active, and any attempt to send new data, whether from memory or the filesystem, has to be approved by the user.