Google engineer proves any iPhone app with permission to access the camera is capable of spying on users
A Google engineer has demonstrated it is possible for a malicious iOS app to spy on a user, with a proof of concept app capable of photographing or recording from both iPhone cameras without the user's knowledge, all by exploiting the permissions granted by the user allowing access to the cameras.

Researcher Felix Krause, founder of Fastlane.Tools, created the watch.user concept app to show how far camera permissions could be pushed, reports The Next Web. Once permission is granted, Krause advises, an app can photograph and record from the cameras any time it is in the foreground, without any flash or other indicator informing the user that images and video are being captured.
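The mechanism behind this is ordinary AVFoundation capture: nothing in iOS (as of iOS 11) requires an app to attach a visible preview to a running capture session. The sketch below is illustrative, not Krause's actual watch.user code; the class name `SilentFrameGrabber` and the queue label are invented for the example.

```swift
import AVFoundation

// Minimal sketch: once the user has granted camera permission, a foreground
// app can receive camera frames with no on-screen preview or indicator.
final class SilentFrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() {
        // Permission was granted at some earlier point, e.g. for a legitimate feature.
        guard AVCaptureDevice.authorizationStatus(for: .video) == .authorized,
              let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
        session.startRunning() // no preview layer attached — nothing visible to the user
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame arrives here and could be saved or uploaded without any UI.
    }
}
```

Note that the session runs entirely headless; attaching an `AVCaptureVideoPreviewLayer` is optional, which is exactly the gap Krause's demo exploits.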
Krause also claims it can then upload the images and video to an app's servers, including broadcasting a live feed from the iPhone itself. It is suggested that it is possible for a malicious developer to determine the user's location based on the image data, and to run facial recognition on still frames to find other photos of the user or to discover their identity.
A video demonstrating the test app's capabilities shows it can also track the movements of the user's mouth, nose, eyes, and entire face, and can even determine the user's mood based on their facial expressions. Krause advises this part uses the Vision framework introduced in iOS 11, designed to allow developers to track a user's facial movements.
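The Vision framework exposes this directly: a `VNDetectFaceLandmarksRequest` run on each camera frame returns per-face contours for the eyes, nose, and mouth. The following is a hedged sketch of how such tracking could be wired up, assuming frames arrive as `CVPixelBuffer`s from a capture session; it is not taken from Krause's demo.

```swift
import Vision

// Illustrative use of the iOS 11 Vision framework: detect facial landmarks
// (eyes, nose, mouth) in a single camera frame.
func detectLandmarks(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            guard let landmarks = face.landmarks else { continue }
            // Normalized (0–1) contour points within the face bounding box.
            _ = landmarks.leftEye?.normalizedPoints
            _ = landmarks.nose?.normalizedPoints
            _ = landmarks.outerLips?.normalizedPoints
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

Calling this from a video-output delegate gives frame-by-frame landmark positions, which is enough to infer expressions; mood classification on top of those points would require an additional model not shown here.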
Notably, the issue is only a problem if the app is in the foreground, but Krause highlights that this could still cause privacy problems. For example, if a user decides to browse a social app while in the bathroom, and the app includes such code, it would be theoretically possible for it to record the user in a somewhat compromising position.
To answer criticism that people would never grant camera permissions, Krause warns many users will have already provided access to their image libraries and cameras to social networks and messaging apps, which could be updated with the malicious code.
Krause suggests worried users could protect themselves by using camera covers that block the camera's view entirely, claimed by the researcher to be "the only real safe way." It is also suggested that users revoke camera access for all apps, always use the built-in camera app, and use each app's image picker to select a photograph to publish, or use Copy and Paste to move the image into the application.
The researcher has disclosed the issue to Apple, at the same time making some suggestions to prevent this from becoming a long-term issue. These include granting apps only temporary access to the camera, putting an icon in the status bar showing the camera is active, and forcing the status bar to be visible whenever an app accesses the camera.
On the hardware side, Krause suggests adding an LED to the camera modules on both sides that cannot be kept off by sandboxed apps. This would be similar to the LED used by the MacBook, which lights up whenever the camera is in use, but it is doubtful Apple would make such a change to the iPhone's design, given the evolution of the top bezel to a notch in the iPhone X.
This is not the only potential security hole in iOS that Krause has discovered. Earlier this month, Krause disclosed another proof of concept app that displayed a popup similar to one used to enter an iTunes or Apple ID password, which could theoretically be used to steal a user's credentials.

Comments
Is this a serious news item?
So why would a social app for instance be interested in doing this? Since the ability to take photos/video without the user's knowledge or explicit consent is now granted every time the app is in the foreground "... facial recognition could be used to identify you, and even use facial expression analysis to measure your emotional response to things like ads displayed in the feed."
If you watch the posted video you will better understand the possible concern.
The camera is NOT hardwired to any display device and this allows the application to do all sorts of real time effects BEFORE displaying them or it can simply discard the data.
THIS IS THE POINT OF THE CAMERA!!! From Android to Windows Mobile to iOS. THIS IS HOW THE CAMERA WORKS!!!
Then uploading the data? This is what a smartphone DOES. It INTERFACES to the internet. Have you heard of apps like Instagram? Snapchat? Facebook? Can an app be evil (like all the crap showing up in the Play Store shows)? Yes. Could an app be legitimately designed to record video without any visual indication? Absolutely. I am thinking of a wildlife trip camera where a visual notification or the display turning on would scare the wildlife.
That said, a very small LED on the front and back would be... interesting. Distracting and something I would personally want to shut off, but interesting.
But this is a 100% WTF is this guy smoking? He should not go into work high or think the epiphanies he gets while high on the most recent weed is really a true epiphany.
You simply gave another off-the-wall hypothetical like: I discovered a new use for baseball bats. To break windows!!!! We should design baseball bats to know when they are being used to break a window and turn to water.
"Intended use"
There is nothing new here. Access to the camera was granted by the user and dialog explicitly says "To take pictures and detect your face". If that developer abuses that access and uses it beyond what they've told the user, then it's up to Apple to catch that and shut it down. Apple has guidelines about recording ... there has to be a clear indicator that the camera is "recording".
Allowing an app access to specific hardware and data has ALWAYS been a one-time-only authorization, with an option to disable it later should the user choose to do so. Settings -> Privacy
The user has always only been able to assume that the app was only doing what it says it's doing with that granted permission.
Apple's entire vetting process was designed to make sure these apps were/are behaving appropriately. I'm pretty sure that whenever an app asks for access to something, it signals a flag, and Apple jumps all over every instance that app accesses those APIs and makes sure it isn't breaking any rules.
---------
But yes, I could write an app, install it on my iPhone, and I could have it record video constantly. In fact, I don't have to follow ANY of Apple's guidelines and rules when I install my own apps. The problem is, my app would never make it on to the App Store because of that. In fact, I wrote my own workout app that constantly listens for audio, so I can give it specific commands. But that's an app I wrote for my own use.
There's nothing new here. It's another attempt to try and cast Apple in a bad light.