Google engineer proves any iPhone app with permission to access the camera is capable of s...


Comments

  • Reply 81 of 103
steven n. Posts: 1,229 member
    steven n. said:

    You didn't answer the question. 'if the app is in foreground then it is actively used. How would you distinguish between "intended" and "unintended" use?'


The way I see this: FarceBook (or Googley or Instagrammy or some other popular app) has this malicious code inserted. Instead of the camera activating ONLY when you make a face-to-face 'call' to a person in your contact list, the 'Facetime' camera ALSO activates while you are simply viewing your feed, reading news, or doing other 'stuff' that normally doesn't use the camera. Because the app is active, the camera is active, and the camera is taking pictures/videos of you and uploading them to a secret server. Or maybe you are using a texting app that also allows you to snap a selfie and instantly send it to your significant other, but it winds up on said 'secret server' to be leaked later for all the world to see. So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.
    The question is how? What magic crystal ball does anyone use to divine this unknown malicious code? You don't think people take selfies? Per your list, you think they don't.

I don't care that you just realized you could use a match to burn down your house instead of starting a campfire.
  • Reply 82 of 103
    steven n. said:
    steven n. said:

    You didn't answer the question. 'if the app is in foreground then it is actively used. How would you distinguish between "intended" and "unintended" use?'


The way I see this: FarceBook (or Googley or Instagrammy or some other popular app) has this malicious code inserted. Instead of the camera activating ONLY when you make a face-to-face 'call' to a person in your contact list, the 'Facetime' camera ALSO activates while you are simply viewing your feed, reading news, or doing other 'stuff' that normally doesn't use the camera. Because the app is active, the camera is active, and the camera is taking pictures/videos of you and uploading them to a secret server. Or maybe you are using a texting app that also allows you to snap a selfie and instantly send it to your significant other, but it winds up on said 'secret server' to be leaked later for all the world to see. So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.
    The question is how? What magic crystal ball does anyone use to divine this unknown malicious code? You don't think people take selfies? Per your list, you think they don't.

I don't care that you just realized you could use a match to burn down your house instead of starting a campfire.
    This one seems so obvious I'm surprised to see it come up. If the app activates the camera without the user pressing a "camera go ON now" button, it's bad. If the user pushes the button, the camera goes on. No button, no camera. Seems pretty simple to me.
  • Reply 83 of 103
tzeshan Posts: 2,351 member
Now I know how retarded Google Dreamers are.
  • Reply 84 of 103
    clemynx said:
    A running app to which I gave permission can use the camera. SHOCKER !!
Well, be cynical all you want, but he brought up a good point as well as great suggestions. This is not an Apple vs. Google discussion about how awesome Apple is, but about a person with valid concerns and good suggestions to improve the privacy of any user.

    Apple devs should do the same for Android devs! It’s free advice, not a platform attack. 
edited October 2017
  • Reply 85 of 103
mjtomlin Posts: 2,673 member
So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.

    This is the main problem I have with this engineer's findings - he failed to mention (or maybe doesn't know) that Apple does look for this misuse, in fact, it's in the damned developer license agreement...

    3.3.8 If Your Application makes recordings (including but not limited to an image, picture or voice capture or recording) (collectively “Recordings”), a reasonably conspicuous audio, visual or other indicator must be displayed to the user as part of the Application to indicate that a Recording is taking place.
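The 3.3.8 requirement amounts to a simple invariant: no Recording without a visible indicator. A minimal Swift sketch of that invariant, with names invented for illustration (this is a toy model, not Apple API; a real iOS app would toggle an on-screen view around AVCaptureSession start/stop):

```swift
// Toy model of the 3.3.8 invariant: whenever a Recording is in progress,
// a conspicuous indicator must be visible. Illustrative names only.
final class RecordingIndicator {
    private(set) var isRecording = false
    private(set) var indicatorVisible = false

    func startRecording() {
        indicatorVisible = true   // show the indicator before capture begins
        isRecording = true
    }

    func stopRecording() {
        isRecording = false
        indicatorVisible = false
    }

    // The property a reviewer would want to hold at every moment.
    var satisfiesRule338: Bool { !isRecording || indicatorVisible }
}
```

The ordering inside `startRecording` matters: the indicator goes up before capture starts, so there is no instant where recording is active but hidden.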

  • Reply 86 of 103
charlituna Posts: 7,217 member
    Rayz2016 said:
    airnerd said:
    problem identified, apple will solve it. 


    Not sure there’s a problem to be solved to be honest. 

I agree with you, for different reasons. Camera and mic access are big privacy issues, and Apple touts itself as being hip to the concerns. I suspect that they might already be scanning submitted apps for APIs for such items and verifying that the usage is within the rules. So an app that activates the camera without signaling that the camera is on would probably not be accepted.
  • Reply 87 of 103
    mjtomlin said:
So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.

    This is the main problem I have with this engineer's findings - he failed to mention (or maybe doesn't know) that Apple does look for this misuse, in fact, it's in the damned developer license agreement...

    3.3.8 If Your Application makes recordings (including but not limited to an image, picture or voice capture or recording) (collectively “Recordings”), a reasonably conspicuous audio, visual or other indicator must be displayed to the user as part of the Application to indicate that a Recording is taking place.

I think the more interesting question is how Apple (or any other app store) can enforce this policy. Unless they scan the source code, at great cost, this kind of abuse cannot be reliably discovered.
  • Reply 88 of 103
    prof said:
    gatorguy said:

    It can continue to use the camera even after the intended use is done and over. For instance from the Facebook app you take a pic to post. But if Facebook wanted to be evil that allows the app to continue recording images that you would not have explicitly authorized and continue doing so minute by minute with no way for a user to know it was happening. That's what he brought to Apple's attention. The camera permission does not restrict the camera use to only what the user would intend to grant it. 
    Err, if the app is in foreground then it is actively used. How would you distinguish between "intended" and "unintended" use?


    By whether or not the "shutter" button has been pressed by the user, for snap shots, and by whether or not the "record" button has been pressed for video.

    If an app can use the camera without the user pressing the "shutter" or "record" buttons, that is a problem.  Even if the app has to be in the foreground, if both the front and back cameras can be used this way, that means that the app could be taking pictures and video I don't know about, because I haven't explicitly told the camera to operate.

    That seems bad to me.

    Now, if the code to do such would never make it through the app store, that's probably sufficient.  But think about this.  This guy published his findings.   How do you know, without a specific statement from Apple, that there isn't an app on the App Store doing this now?  How do you know someone didn't discover this last year, and decided to exploit it, rather than reveal it?  You don't.

The fact that it violates the developer license agreement doesn't necessarily mean that someone hasn't figured out how to do it.  I would hope that Apple can detect this sort of thing, but I also know that malicious code isn't flying a flag that says "hey look at me, I'm doing something bad!"  Those malware writers take great pains to hide their code, and they get better at it right alongside the guys trying to detect it.  It's all very Darwinian at times.
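The "intended vs. unintended" distinction being argued in this thread boils down to a tiny policy check. Here is a hedged Swift sketch of the stricter rule posters are asking for (a toy model, not how iOS actually works; today iOS enforces only the permission half, and the trigger check is hypothetical):

```swift
// Toy model: holding the camera permission is not the same as the user
// intending a capture right now. The trigger check below is the stricter,
// hypothetical policy being proposed, not a real iOS API.
enum CaptureTrigger {
    case shutterButton   // user tapped the shutter
    case recordButton    // user tapped record
    case programmatic    // app code started capture on its own
}

func isIntendedCapture(trigger: CaptureTrigger, hasPermission: Bool) -> Bool {
    guard hasPermission else { return false }
    switch trigger {
    case .shutterButton, .recordButton:
        return true
    case .programmatic:
        return false
    }
}
```

Under this model, a Facebook-style app with camera permission could still take a user-initiated photo, but a capture started by app code alone would be refused.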

edited October 2017
  • Reply 89 of 103
    mjtomlin said:

    In fact, I wrote my own workout app that constantly listens for audio, so I can give it specific commands. But that's an app I wrote for my own use.


    OT, properly configured, I'd probably buy that app.  Just sayin'. :)
  • Reply 90 of 103
You’re dense. If I’m in the Facebook app and using the built-in camera to take a picture to post, that is an intended use of the camera and microphone in the app. Once I’m done taking the picture and exit the camera app to resume reading other posts, I DO NOT want the Facebook app to access the camera or microphone anymore. Any use of the camera or microphone outside of using the built-in camera app, for me at least, is unintended access. Maybe you don’t care about this, but other people do, because I want to know when the camera and microphone are being used in the background.
    The camera and microphone cannot be used in the background in iOS.

I suspect that "background" in this context is intended to mean in the background of the Facebook app's process, not the iOS background to which apps go when they're not in your face.  If Facebook can be the active app and can be using the camera behind the scenes, without the user pressing the shutter or record buttons, that seems like a problem to me, unless Facebook is up front about using the camera all the time.

    Now, if the code to do such would never make it through the app store, that's probably sufficient.  But think about this.  This guy published his findings.   How do you know, without a specific statement from Apple, that there isn't an app on the App Store doing this now?  How do you know someone didn't discover this last year, and decided to exploit it, rather than reveal it?  You don't.

    I honestly don't get why people don't see this as a potential problem.

  • Reply 91 of 103
    steven n. said:
    airnerd said:
You're missing what I'm saying. I grant access to my camera so I can use it for photos.  But that doesn't mean I give it permission to capture any time I have Facebook open.
    I don't think you understand how computers and smartphones work. If you grant Facebook access to use your camera that means you have granted Facebook access to use your camera. The application/iOS does not have the ability to read your mind and divine, using some type of magic, if you think Facebook should have access to your camera at some random point in time.

    If you do not want Facebook to have access to your camera, DENY IT ACCESS!!!

Well, I for one understand how computers work just fine.  You are missing what he's saying.  Granting Facebook permission to use the camera doesn't imply "use the camera anytime it wants."  It implies "use the camera to take pictures and videos," my intention to do which is signified by pressing the shutter or record button.  If I haven't pressed the shutter or record button, I don't expect the camera to be taking pictures or video.

    Now, if the code to do such would never make it through the app store, that's probably sufficient.  But think about this.  This guy published his findings.   How do you know, without a specific statement from Apple, that there isn't an app on the App Store doing this now?  How do you know someone didn't discover this last year, and decided to exploit it, rather than reveal it?  You don't.

  • Reply 92 of 103
steven n. Posts: 1,229 member
    steven n. said:
    steven n. said:

    You didn't answer the question. 'if the app is in foreground then it is actively used. How would you distinguish between "intended" and "unintended" use?'


The way I see this: FarceBook (or Googley or Instagrammy or some other popular app) has this malicious code inserted. Instead of the camera activating ONLY when you make a face-to-face 'call' to a person in your contact list, the 'Facetime' camera ALSO activates while you are simply viewing your feed, reading news, or doing other 'stuff' that normally doesn't use the camera. Because the app is active, the camera is active, and the camera is taking pictures/videos of you and uploading them to a secret server. Or maybe you are using a texting app that also allows you to snap a selfie and instantly send it to your significant other, but it winds up on said 'secret server' to be leaked later for all the world to see. So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.
    The question is how? What magic crystal ball does anyone use to divine this unknown malicious code? You don't think people take selfies? Per your list, you think they don't.

I don't care that you just realized you could use a match to burn down your house instead of starting a campfire.
    This one seems so obvious I'm surprised to see it come up. If the app activates the camera without the user pressing a "camera go ON now" button, it's bad. If the user pushes the button, the camera goes on. No button, no camera. Seems pretty simple to me.
    Sigh.... ANY button or gesture could be considered a camera on button. So you want apps with no possible way to interact with them. Great. I hope you don’t make policy. 
  • Reply 93 of 103
Lookseek.com is a no-tracking search engine; give it a try.
  • Reply 94 of 103
steven n. Posts: 1,229 member
    steven n. said:
    airnerd said:
You're missing what I'm saying. I grant access to my camera so I can use it for photos.  But that doesn't mean I give it permission to capture any time I have Facebook open.
    I don't think you understand how computers and smartphones work. If you grant Facebook access to use your camera that means you have granted Facebook access to use your camera. The application/iOS does not have the ability to read your mind and divine, using some type of magic, if you think Facebook should have access to your camera at some random point in time.

    If you do not want Facebook to have access to your camera, DENY IT ACCESS!!!

Well, I for one understand how computers work just fine.  You are missing what he's saying.  Granting Facebook permission to use the camera doesn't imply "use the camera anytime it wants."  It implies "use the camera to take pictures and videos," my intention to do which is signified by pressing the shutter or record button.  If I haven't pressed the shutter or record button, I don't expect the camera to be taking pictures or video.

    Now, if the code to do such would never make it through the app store, that's probably sufficient.  But think about this.  This guy published his findings.   How do you know, without a specific statement from Apple, that there isn't an app on the App Store doing this now?  How do you know someone didn't discover this last year, and decided to exploit it, rather than reveal it?  You don't.

I don’t think you do understand software, computers, smartphones, or programming.

    For example: 
    “if I haven’t pressed the camera or record button”

    So, in your mind, SnapChat is banned because it opens the camera data stream simply by opening the app. Same with Instagram. Once the stream is open, pressing the “record” button simply saves a snapshot of the data the app has already had access to.
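That stream-already-open behavior can be sketched as a toy model (illustrative names, not AVFoundation; a real app would receive frames via a capture output once the preview starts):

```swift
// Toy model: a Snapchat-style app receives camera frames continuously
// from the moment the preview opens; the shutter button merely saves a
// frame the app was already receiving. Illustrative names only.
final class PreviewCamera {
    private(set) var framesReceived = 0
    private(set) var savedFrames: [Int] = []

    // Called for every frame once the preview starts; no user gesture involved.
    func deliverFrame() { framesReceived += 1 }

    // The only step the user perceives as "taking a picture."
    func pressShutter() { savedFrames.append(framesReceived) }
}
```

The point of the model: the app can observe every frame delivered, while the user only ever sees the one frame they chose to save.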

What if you “swipe up” to bring up the camera?

Basically, you are saying you want apps with no ability to interact with them, because any interaction could be the interaction to trigger the camera.

Felix in no way had an epiphany here, except of the type you get when you are massively stoned. The problem is, his head never cleared up when he looked at his scrawling to realize his “epiphany” was a drug-induced delusion.

  • Reply 95 of 103
    dysamoria said:
    Hasn't it also already been demonstrated that malware can activate the camera on laptops without turning on their camera LED? How would an LED or status bar indicator solve the uncertainty of camera activation?
    If that's true for an Apple laptop or desktop, I'd be interested to read about it.  As far as I know, the cameras and the indicator light on Macs are impossible to control separately via software.
  • Reply 96 of 103
    mjtomlin said:
So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.

    This is the main problem I have with this engineer's findings - he failed to mention (or maybe doesn't know) that Apple does look for this misuse, in fact, it's in the damned developer license agreement...

    3.3.8 If Your Application makes recordings (including but not limited to an image, picture or voice capture or recording) (collectively “Recordings”), a reasonably conspicuous audio, visual or other indicator must be displayed to the user as part of the Application to indicate that a Recording is taking place.

Thanks for finding and posting the specific prohibition.  It says something about us that it wasn't until the 86th message that this was done.  Frankly, it should have been in the original dude's "exposé" and the AI story.

To Bigmushroom's assertion that it would be very difficult for Apple to discover this "at great cost" during the review process, I expect that that's completely wrong.  I would hope that when Apple reviews apps, they not only do code analysis to look for red flags, but also monitor all the "interesting" subsystems while physically/manually testing the app.  This would include "watching" (monitoring/logging/auditing) when the camera or mic is activated and watching what data is transmitted or recorded.

So if, for example, an app that asks for permission to use the camera to scan a QR code appears to be using the camera all the time, it would be rejected.  As others have said, the Google guy's story is 100% pointless without any evidence that Apple wouldn't reject his malware app.
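The kind of dynamic monitoring described above could, in principle, be as simple as flagging any camera activation that no user gesture immediately preceded in the event trace. A toy Swift sketch of that heuristic (event names invented for illustration; this is a sketch of the idea, not Apple's actual review tooling):

```swift
// Toy review heuristic: walk a logged event trace and count camera
// activations that were not immediately preceded by a user gesture.
enum ReviewEvent { case userGesture, cameraActivated, other }

func suspiciousActivations(in trace: [ReviewEvent]) -> Int {
    var flagged = 0
    var lastWasGesture = false
    for event in trace {
        switch event {
        case .userGesture:
            lastWasGesture = true
        case .cameraActivated:
            if !lastWasGesture { flagged += 1 }  // camera on with no gesture: flag it
            lastWasGesture = false
        case .other:
            lastWasGesture = false
        }
    }
    return flagged
}
```

A real auditor would be far fuzzier about what counts as a triggering gesture (as the swipe-up discussion elsewhere in this thread shows), but the shape of the check is this simple.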

Maybe I'll write an exposé that points out that there is nothing stopping someone from creating a MyDiaryOfDirtySecrets app that actually sends all the text directly to Twitter.  I could create a proof-of-concept app that does exactly that this afternoon.  And I have no doubt that this app would never make it into the App Store.
  • Reply 97 of 103
gatorguy Posts: 24,213 member
    mjtomlin said:
So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.

    This is the main problem I have with this engineer's findings - he failed to mention (or maybe doesn't know) that Apple does look for this misuse, in fact, it's in the damned developer license agreement...

    3.3.8 If Your Application makes recordings (including but not limited to an image, picture or voice capture or recording) (collectively “Recordings”), a reasonably conspicuous audio, visual or other indicator must be displayed to the user as part of the Application to indicate that a Recording is taking place.

Thanks for finding and posting the specific prohibition.  It says something about us that it wasn't until the 86th message that this was done.  Frankly, it should have been in the original dude's "exposé" and the AI story.

To Bigmushroom's assertion that it would be very difficult for Apple to discover this "at great cost" during the review process, I expect that that's completely wrong.  I would hope that when Apple reviews apps, they not only do code analysis to look for red flags, but also monitor all the "interesting" subsystems while physically/manually testing the app.  This would include "watching" (monitoring/logging/auditing) when the camera or mic is activated and watching what data is transmitted or recorded.
IMO if Apple were consistently checking apps that thoroughly, they would not have had a few hundred apps discovered to have built-in malware last year, nor would they have to be culling thousands of approved apps this year that have no useful purpose, or claim to have a function they don't, or that are plainly near-identical rip-offs of well-known, popular, and original apps. Malware scanner apps would not be approved that include a permission to read the user's device password, nor would apps that supposedly scan for viruses be approved to begin with. Yet all of these have been approved, only later to be discovered as either malware or PUPs, and typically found by someone other than Apple.

    Apple plainly does a far better job of vetting apps than Google currently does. We all know that. But Apple also does not appear to be checking apps as deeply as has been presumed by some folks, or at least not on a consistent basis. 
edited October 2017
  • Reply 98 of 103
    gatorguy said:
    mjtomlin said:
So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.

    This is the main problem I have with this engineer's findings - he failed to mention (or maybe doesn't know) that Apple does look for this misuse, in fact, it's in the damned developer license agreement...

    3.3.8 If Your Application makes recordings (including but not limited to an image, picture or voice capture or recording) (collectively “Recordings”), a reasonably conspicuous audio, visual or other indicator must be displayed to the user as part of the Application to indicate that a Recording is taking place.

Thanks for finding and posting the specific prohibition.  It says something about us that it wasn't until the 86th message that this was done.  Frankly, it should have been in the original dude's "exposé" and the AI story.

To Bigmushroom's assertion that it would be very difficult for Apple to discover this "at great cost" during the review process, I expect that that's completely wrong.  I would hope that when Apple reviews apps, they not only do code analysis to look for red flags, but also monitor all the "interesting" subsystems while physically/manually testing the app.  This would include "watching" (monitoring/logging/auditing) when the camera or mic is activated and watching what data is transmitted or recorded.
IMO if Apple were consistently checking apps that thoroughly, they would not have had a few hundred apps discovered to have built-in malware last year, nor would they have to be culling thousands of approved apps this year that have no useful purpose, or claim to have a function they don't, or that are plainly near-identical rip-offs of well-known, popular, and original apps. Malware scanner apps would not be approved that include a permission to read the user's device password, nor would apps that supposedly scan for viruses be approved to begin with. Yet all of these have been approved, only later to be discovered as either malware or PUPs, and typically found by someone other than Apple.

    Apple plainly does a far better job of vetting apps than Google currently does. We all know that. But Apple also does not appear to be checking apps as deeply as has been presumed by some folks, or at least not on a consistent basis. 
I would love to get the detailed scoop on the app review process.  It's a fascinating business challenge: how to evaluate millions of apps (and app updates) quickly and cost-effectively, with low false-positive and false-negative errors (rejecting apps that would have been "good" additions to the App Store, and approving problematic apps), and a reasonable appeals process.  I'm sure it's a complex mix of automation, human judgment, and some element of risk-based auditing (giving some apps more scrutiny than others).  And a process that has no doubt evolved over time.  Given the number of apps that have been processed and the relatively few complaints and problems, it's a very impressive achievement by Apple.

But, right, we shouldn't expect it to be a perfect, foolproof process.
  • Reply 99 of 103
    steven n. said:
    So, in your mind, SnapChat is banned because it opens the camera data stream simply by opening the app. Same with Instagram. Once the stream is open, pressing the “record” button simply saves a snapshot of the data the app has already had access to.

What if you “swipe up” to bring up the camera?

Basically, you are saying you want apps with no ability to interact with them, because any interaction could be the interaction to trigger the camera.
This is an interesting point that I hadn't considered. If you climb inside my head (which I don't recommend, because it's a mess in there and you could get hurt) you'll see a neural pathway that connects "Make Camera Go On" with "Big Red Button that says 'Make Camera Go On.'" Since I don't use any third-party apps that access the camera, I wasn't aware of (or at least didn't think of) apps triggering the camera by means other than the Big Red Button.

    You're right. That throws a wrench in my otherwise flawless and foolproof plan for protecting the world from evil super villains.
  • Reply 100 of 103
mjtomlin Posts: 2,673 member
    gatorguy said:
    mjtomlin said:
So, yes, this could cause embarrassment, and yes, Apple should scan for this misuse of code during the app approval/update process.

    This is the main problem I have with this engineer's findings - he failed to mention (or maybe doesn't know) that Apple does look for this misuse, in fact, it's in the damned developer license agreement...

    3.3.8 If Your Application makes recordings (including but not limited to an image, picture or voice capture or recording) (collectively “Recordings”), a reasonably conspicuous audio, visual or other indicator must be displayed to the user as part of the Application to indicate that a Recording is taking place.

Thanks for finding and posting the specific prohibition.  It says something about us that it wasn't until the 86th message that this was done.  Frankly, it should have been in the original dude's "exposé" and the AI story.

To Bigmushroom's assertion that it would be very difficult for Apple to discover this "at great cost" during the review process, I expect that that's completely wrong.  I would hope that when Apple reviews apps, they not only do code analysis to look for red flags, but also monitor all the "interesting" subsystems while physically/manually testing the app.  This would include "watching" (monitoring/logging/auditing) when the camera or mic is activated and watching what data is transmitted or recorded.
IMO if Apple were consistently checking apps that thoroughly, they would not have had a few hundred apps discovered to have built-in malware last year, nor would they have to be culling thousands of approved apps this year that have no useful purpose, or claim to have a function they don't, or that are plainly near-identical rip-offs of well-known, popular, and original apps. Malware scanner apps would not be approved that include a permission to read the user's device password, nor would apps that supposedly scan for viruses be approved to begin with. Yet all of these have been approved, only later to be discovered as either malware or PUPs, and typically found by someone other than Apple.

    Apple plainly does a far better job of vetting apps than Google currently does. We all know that. But Apple also does not appear to be checking apps as deeply as has been presumed by some folks, or at least not on a consistent basis. 

You of course mean all those apps that have abused the camera? Because for the past 10 years this is how it has always been.

Why don't you take your anti-Apple shit and shove it? Why is this douche still allowed on this site? Everything he says has absolutely no merit or basis in reality.

Sorry... as far as I can tell he's just looking for an argument that doesn't exist. Except maybe this stupid site wants it to happen for hits?
    edited October 2017