AppleInsider › Forums › Mobile › iPhone › Purported iOS 'flaw' lets nefarious apps secretly log keystrokes in background

Purported iOS 'flaw' lets nefarious apps secretly log keystrokes in background

post #1 of 24
Thread Starter 
In a blog post late Monday, network security firm FireEye claims to have discovered a new iOS "flaw" that allows a nefarious app to log touch events and button presses in the background, then send the data off to a remote server.

FireEye's background monitoring proof-of-concept. | Source: FireEye


First spotted by ArsTechnica, the post describes a proof-of-concept that FireEye researchers say can collect and transmit potentially sensitive information while running in the background.

From what can be gleaned from FireEye's blog, the supposed "flaw" takes advantage of iOS' built-in multitasking components, suggesting the attacking app must first be vetted and installed on an affected device to access legitimate APIs. Barring the side-loading of an app with private APIs, such as those certified for internal distribution through Apple's remote management solution, the app would have to successfully sneak by the App Store review process in order to work.

To this end, FireEye claims to have developed "approaches to bypass" Apple's app review process, but does not detail the workarounds.

Note that the demo exploits the latest 7.0.4 version of iOS system on a non-jailbroken iPhone 5s device successfully. We have verified that the same vulnerability also exists in iOS versions 7.0.5, 7.0.6 and 6.1.x. Based on the findings, potential attackers can either use phishing to mislead the victim to install a malicious/vulnerable app or exploit another remote vulnerability of some app, and then conduct background monitoring.


The monitoring app can reportedly record all input events in the background, including on-screen touches and physical button actuation like the home button, Touch ID and volume controls.

Further, FireEye notes that disabling iOS 7's background app refresh feature will not block said monitoring app from collecting and disseminating data. The firm offers the example of music apps that were granted access to background processes in earlier versions of iOS.
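FireEye has not published its code, but the data flow it describes (a background task buffers input events, then ships them to a server) can be illustrated with a small, platform-neutral sketch. Every name below is hypothetical; no real iOS API is used or implied.

```python
# Platform-neutral simulation of the data flow FireEye describes: a
# background task buffers every input event, then periodically "uploads"
# the batch. All names are hypothetical; no real iOS API is used or implied.

class BackgroundMonitor:
    def __init__(self, upload):
        self.buffer = []
        self.upload = upload            # stands in for a network send

    def on_event(self, kind, detail):
        # kind: "touch", "home", "volume", "touch_id", ...
        self.buffer.append((kind, detail))

    def flush(self):
        batch, self.buffer = self.buffer, []
        self.upload(batch)

sent = []                               # stands in for the remote server
mon = BackgroundMonitor(upload=sent.append)
mon.on_event("touch", (112, 530))       # an on-screen touch at (x, y)
mon.on_event("home", "press")           # a physical button actuation
mon.flush()                             # one batch leaves the device
```

The point of the sketch is that nothing in the exfiltration step depends on foreground access: once the event callbacks fire in the background, batching and uploading is trivial.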

According to ArsTechnica, a now-removed blog post from FireEye claimed the firm had "successfully delivered a proof-of-concept monitoring app through the App Store that records user activity and sends it to a remote server. We have been collaborating with Apple on this issue."

FireEye's discovery, if it can be deemed as much, comes as Apple is being scrutinized over an SSL security flaw found recently in both iOS and OS X. The so-called "goto fail" error potentially opens the door for hackers to surreptitiously intercept data meant to be encrypted.

Apple issued a patch for iOS last week and is working on a fix for OS X that should see release soon.
post #2 of 24
Just great...

C'mon Apple. I thought you were better than this!
post #3 of 24
So do I have this right? Getting an app that leverages this "flaw" would require jailbreaking (or else self-installing the app as a developer/enterprise deployment on your own device)? How is that news? You can do all kinds of questionable things by jailbreaking/side-loading.

So what is seemingly newsworthy is not the background thing at all, but rather a second, unrelated flaw, in Apple's review process. If such a flaw exists (sounds plausible, but they are very vague) then--and only then--could such an app actually get onto normal users' phones?

And they've backtracked, removing their formerly-posted claim to have gotten an instance of this onto the App Store?

C'mon journalists and fact-checkers, I thought you were better than this! Until I see real evidence, this sounds like FireEye putting out misleading info to drum up publicity.
post #4 of 24
Quote:
Originally Posted by nagromme View Post

So do I have this right? Getting an app that leverages this "flaw" would require jailbreaking (or else you self-installing the app as a developer/enterprise deployment on your own device)? How is that news? You can do all kinds of questionable things by jailbreaking/side loading.

So what is seemingly newsworthy is not the background thing at all, but rather a second, unrelated flaw, in Apple's review process. If such a flaw exists (sounds plausible, but they are very vague) then--and only then--could such an app actually get onto normal users' phones?

And they've backtracked, removing their formerly-posted claim to have gotten an instance of this onto the App Store?

C'mon journalists and fact-checkers, I thought you were better than this! Until I see real evidence, this sounds like FireEye putting out misleading info to drum up publicity.
 
I agree with what nagromme says; there are too many unknowns here for us to be sure we are legitimately at risk.  Of course, there is an enterprise program that allows businesses to create their own apps and sideload them (legally) onto their own iOS devices to issue to employees, and in that case it is possible for companies to deploy apps that are not vetted through the App Store.  Still, I think it would be somewhat unlikely for such a compromised app to make its way into the wild.  We have to be wary of these kinds of reports; they may, as nagromme said, be trying to drum up some publicity for FireEye.  In a way, though, attacks on iOS and Mac apps are a bit flattering: they suggest that Apple's products are mainstream now, with a large enough installed base that crackers consider them a platform worth writing malware for.  Of course, I am sure that Apple is increasingly aware of this, and will plug security holes in short order after they are discovered.
post #5 of 24
Quote:
Originally Posted by nagromme View Post

So what is seemingly newsworthy is not the background thing at all, but rather a second, unrelated flaw, in Apple's review process. If such a flaw exists (sounds plausible, but they are very vague) then--and only then--could such an app actually get onto normal users' phones?
 

 

Apple's app review process isn't what you think. Apple only sees the binary, it's very easy to sneak code past. Just look how many tethering apps got through.

http://appleinsider.com/articles/13/08/16/apples-approval-of-jekyll-malware-app-reveal-flaws-in-app-store-review-process

post #6 of 24
Unlike the SSL problem, which is so major that conspiracy theorists think it was deliberate, this is nonsense. An app needs to piggyback on another app, and then it can catch where you touched the screen, which makes it not a keystroke catcher but an on-screen-position catcher.
post #7 of 24
Quote:
Originally Posted by asdasd View Post

 then it can catch where you touched the screen which isn't a keystroke catcher but a position on screen catcher.

 

The keys are in the exact same place on the iPhone. And even if they weren't, you could easily do analysis to find where they were. For example, on the iPad the e key is going to be in one of two columns (split and non-split keyboards), and you could easily tell which from the spacing of the keys or the lack of center key presses. It would be easy to deduce the horizontal rows from the range of key presses.
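That deduction is easy to make concrete: if a background app can log raw (x, y) touch coordinates, recovering keystrokes is just a nearest-key lookup. The layout numbers below are invented for illustration; they are not Apple's actual key geometry.

```python
# Hypothetical portrait QWERTY grid: key cell sizes, row top edges, and
# row insets are all made-up numbers, not Apple's real layout.
KEY_W, KEY_H = 32, 54               # hypothetical key cell size in points
ROWS  = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
ROW_Y = [480, 534, 588]             # hypothetical top edge of each key row
ROW_X = [0, 16, 48]                 # hypothetical left inset of each row

def key_at(x, y):
    """Map a logged (x, y) touch to the key under it, or None."""
    for row, y0, x0 in zip(ROWS, ROW_Y, ROW_X):
        if y0 <= y < y0 + KEY_H:
            col = int((x - x0) // KEY_W)
            if 0 <= col < len(row):
                return row[col]
    return None

def reconstruct(touches):
    """Turn a logged touch stream back into the text it typed."""
    return "".join(k for k in (key_at(x, y) for x, y in touches) if k)

# Three logged touches come back out as the letters they covered:
print(reconstruct([(300, 500), (230, 500), (215, 600)]))   # -> "pin"
```

An attacker who doesn't know the exact layout can still fit one statistically, which is why "position catcher" and "keystroke catcher" end up being the same thing.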

 

It is definitely a flaw if an app can capture all events when it isn't in the foreground, though the severity is mitigated by the fact that it does require a malicious app to be loaded.


Edited by konqerror - 2/25/14 at 1:28am
post #8 of 24
Quote:
Originally Posted by AppleInsider View Post

From what can be gleaned from FireEye's blog, the supposed "flaw" takes advantage of iOS' built-in multitasking components…

 

This is not just a simple flaw in the iOS code.

 

Quote:
Originally Posted by s.metcalf View Post

Just great...

C'mon Apple. I thought you were better than this!

 

Hogwash. While I'm no programmer, I can appreciate that an operating system like iOS involves millions of lines of code managing a highly complex system of components. Given the complexity of the hardware and software, there is no way any smartphone system can be 100% bug-free, regardless of the quantity or quality of coders involved.

 

Personally, I am happy any time a flaw is discovered before it can actually be put to use. This gives Apple the chance to plug the hole before harm is done.

"You can't fall off the floor"   From 128k Mac to 8GB MBP

Reply

"You can't fall off the floor"   From 128k Mac to 8GB MBP

Reply
post #9 of 24
Quote:
Originally Posted by konqerror View Post
 

... the severity is mitigated by the fact that it does require a malicious app be loaded.


Key part.  I'm more than confident that Apple is always trying to keep the filth out of the app store.  If I want my data and identity compromised, I'll use an Android phone.

This is more FUD than anything else.

post #10 of 24
Quote:
Originally Posted by sflocal View Post
 

Key part.  I'm more than confident that Apple is always trying to keep the filth out of the app store.  If I want my data and identity compromised, I'll use an Android phone.

 

Repeat after me: when you submit an app to the App Store, you only send in a binary. Apple cannot determine the logic of your program unless they reverse engineer your binary, which is very difficult to do. This is like me giving you the iTunes binary and asking if you can find any hidden code in it.

 

You have to understand that the main way iOS ensures security is through limited app permissions, which is what has been breached here. The main purpose of app review is to check for things like porn and in-app purchases.

post #11 of 24
Quote:
Originally Posted by s.metcalf View Post

Just great...

C'mon Apple. I thought you were better than this!

News articles will word it like it's a keylogger, and so people assume it exploits bugs, but it's more about functionality choices Apple made. I don't know why they allow background apps to monitor screen touches, but iOS 7 has a way to turn it off as long as the app isn't playing music. Maybe they were allowing for custom gestures, or just gave third parties the same level of functionality their own background apps have. There was a malware example for iOS and Android that used this technique:

http://www.forbes.com/sites/tamlinmagee/2014/01/27/trustwave-demonstrates-malware-that-logs-touchscreen-swipes-to-record-your-pin/

To exploit this, someone would have to download a malicious app and leave it open in the background while doing something interesting in another app. Apple can isolate touches to the current app and just pass gestures to their own OS background apps. These guys say they're working with Apple on a fix.

It's possible that apps like these exist in app stores right now.
post #12 of 24
Quote:
Originally Posted by nagromme View Post

So do I have this right? Getting an app that leverages this "flaw" would require jailbreaking (or else you self-installing the app as a developer/enterprise deployment on your own device)? How is that news? You can do all kinds of questionable things by jailbreaking/side loading.

So what is seemingly newsworthy is not the background thing at all, but rather a second, unrelated flaw, in Apple's review process. If such a flaw exists (sounds plausible, but they are very vague) then--and only then--could such an app actually get onto normal users' phones?

And they've backtracked, removing their formerly-posted claim to have gotten an instance of this onto the App Store?

C'mon journalists and fact-checkers, I thought you were better than this! Until I see real evidence, this sounds like FireEye putting out misleading info to drum up publicity.

I agree.

IF they had discovered the flaw BEFORE Apple and then found a way to get an app approved on the App Store, they most likely would have created a video of it. Removing the proof negated their argument for me.

With so much scrutiny going on about this flaw, I guess iOS is much more important than Wall Street is willing to admit.

EDIT... See newer post below before responding to this one. Thanks!
Edited by leavingthebigG - 2/25/14 at 6:07am
post #13 of 24
Quote:
Originally Posted by leavingthebigG View Post

Quote:
Originally Posted by nagromme View Post

So do I have this right? Getting an app that leverages this "flaw" would require jailbreaking (or else you self-installing the app as a developer/enterprise deployment on your own device)? How is that news? You can do all kinds of questionable things by jailbreaking/side loading.

So what is seemingly newsworthy is not the background thing at all, but rather a second, unrelated flaw, in Apple's review process. If such a flaw exists (sounds plausible, but they are very vague) then--and only then--could such an app actually get onto normal users' phones?

And they've backtracked, removing their formerly-posted claim to have gotten an instance of this onto the App Store?

C'mon journalists and fact-checkers, I thought you were better than this! Until I see real evidence, this sounds like FireEye putting out misleading info to drum up publicity.

I agree.

IF they had discovered the flaw BEFORE Apple had then found a way to get an app approved on the App Store they most likely would have created a video of it. Removing the proof negated their argument for me.

With so much scrutiny going on about this flaw, I guess iOS is much more important than Wall Street is willing to admit.
So you would prefer they had put out a step-by-step how-to guide plus video for everyone to see and use, rather than work with Apple to fix the issue, because then you'd be happy? I'm actually happier that they went to Apple to get it resolved and posted only a titbit. I'd have been happier if it had not come to light at all, because they have just flagged to all and sundry that it could be done, thus encouraging others to try their hand at it.
post #14 of 24
Hmm... so just like the SSL bug, this is present in iOS 6 too. It's almost as if Forstall knew he was going to be canned and these were his parting gifts to the company. ;) j/k
post #15 of 24
Putting "flaw" in quotation marks is certainly the correct way to express it. Of course, testing and debugging require that such a feature be designed into iOS, as into every OS. The tricky part will always be how to prevent general access to it.
post #16 of 24
Quote:
Originally Posted by s.metcalf View Post

Just great...

C'mon Apple. I thought you were better than this!

 

I am sitting here laughing at comments like this. I thought we just got finished lamenting that NOTHING is secure, that the NSA is spying on all of us, and that there is no such thing as safety on the Internet. Now we castigate Apple for a security flaw? What, are we to hide under a rock, knowing there's a hacker hiding in every bush waiting to pounce on our privacy? This nerd paranoia is ridiculous.

 

So to all of you hand wringing paranoids, please remind us once more that even if Apple fixes EVERY SINGLE security flaw that exists we are still only one click away from datageddon and the destruction of our lives. No wonder we normal users tend to ignore your hands waving in the air and your hair on fire when you go on about these issues.

post #17 of 24
Quote:
Originally Posted by leavingthebigG View Post


I agree.

IF they had discovered the flaw BEFORE Apple had then found a way to get an app approved on the App Store they most likely would have created a video of it. Removing the proof negated their argument for me.

What if they thought that going public might have gotten their dev memberships revoked like Charlie Miller lost his?

post #18 of 24
Quote:
Originally Posted by singularity View Post

So you would prefer they had put out a step-by-step how-to guide plus video for everyone to see and use, rather than work with Apple to fix the issue, because then you'd be happy? I'm actually happier that they went to Apple to get it resolved and posted only a titbit. I'd have been happier if it had not come to light at all, because they have just flagged to all and sundry that it could be done, thus encouraging others to try their hand at it.

Thanks! Your response made me pause, and then for some reason I read the article again. I chuckled at myself when I read the following sentence... According to ArsTechnica, a now-removed blog post from FireEye claimed the firm had "successfully delivered a proof-of-concept monitoring app through the App Store that records user activity and sends it to a remote server. We have been collaborating with Apple on this issue."

It became obvious to me that I had ignored the sentence! I took a deep breath and decided to admit my mistake in comprehending what was written.

Thanks again for calling me out on this!!
post #19 of 24
Quote:
Originally Posted by leavingthebigG View Post

Quote:
Originally Posted by singularity View Post

So you would prefer they had put out a step-by-step how-to guide plus video for everyone to see and use, rather than work with Apple to fix the issue, because then you'd be happy? I'm actually happier that they went to Apple to get it resolved and posted only a titbit. I'd have been happier if it had not come to light at all, because they have just flagged to all and sundry that it could be done, thus encouraging others to try their hand at it.

Thanks! Your response made me pause then for some reason I read the article again. I chuckled at myself when I read the following bolded sentence... According to ArsTechnica, a now-removed blog post from FireEye claimed the firm had "successfully delivered a proof-of-concept monitoring app through the App Store that records user activity and sends it to a remote server. We have been collaborating with Apple on this issue."

It became obvious to me that I ignored the sentence! I took a deep breath then decided to admit my mistake in comprehending what was written..

Thanks again for calling me out on this!!
That's OK. It's far too easy to rush in with a comment; I do it all the time. ;)
The bit you originally missed is, I think, the most important part: they found a flaw and are collaborating to get it fixed.
post #20 of 24
This article does no one any good.

It helps create problems rather than solve any.
post #21 of 24

This is just a proof of concept.  No app with this flaw would ever get sold on the App Store.  Apple would detect it and reject it in a heartbeat.  Walled garden my ass!

post #22 of 24
Quote:
Originally Posted by konqerror View Post

Repeat after me. When you submit an app to the app store, you only send in a binary. Apple cannot determine the logic of your program unless they reverse engineer your binary which is very difficult to do. This is like me giving you the iTunes binary and asking if you can find any hidden code in it.

You have to understand that the main way iOS ensures security is through limited app permissions, which has been breached here. The main purpose of app review is to check for things like porn and in app purchases.

I agree that the flaw likely exists in theory (something to fix?), and that it could reach people in reality if the App Store process allows it, which, as I said, is plausible (something else to fix?); I just don't see the evidence yet. So from what we know now, this sounds like one of the many past claims that got exaggerated, gave someone five seconds of fame, and then turned out to be very different from what articles had stated/implied (and fed the echo chamber).

But the App Store approval process does more (both human and automated) than you're saying, and does scan the code. It's not like handing you the binary and having you eyeball it. The code is scanned automatically, and hidden transgressions can be and are caught all the time. (I use the game engine Unity, for instance, and sometimes developers report that the Unity engine is triggering some automated rejection for doing something wrong; Unity then makes the needed tweak and the apps make it through.)

Otherwise:

A) Why would they need "approaches to bypass" the process at all? They'd just submit, and there would be no story about that end of the problem.

B) How would Apple catch apps doing invisible disallowed things (like using certain private APIs or accessing the UDID), which they catch all the time?

It's possible (we don't yet know) that Apple's process could detect this in the same way, even from a binary.

Here are some links to developers explaining how Apple scans app binaries for various things prior to approval:

http://stackoverflow.com/questions/2842357/how-does-apple-know-you-are-using-private-api
http://stackoverflow.com/questions/9934143/how-does-apple-detect-udid-access-during-app-review
http://stackoverflow.com/questions/3186648/does-apple-view-the-actual-source-code-when-approving-apps
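For the curious, a toy version of the kind of check those links describe: look through a compiled binary's bytes for tell-tale literal strings such as private selector names or the UDID accessor. The marker list here is a stand-in (one invented name alongside the real deprecated `uniqueIdentifier` accessor); Apple's actual checks and symbol lists are not public.

```python
# Toy static scan: search a binary image for literal marker strings.
# The marker list is illustrative; Apple's real lists are not public.
PRIVATE_MARKERS = [b"uniqueIdentifier", b"_privateDoSomething"]

def scan_binary(blob: bytes):
    """Return the markers literally present in the binary image."""
    return [m.decode() for m in PRIVATE_MARKERS if m in blob]

clean = b"\x00init\x00viewDidLoad\x00"
shady = b"\x00init\x00uniqueIdentifier\x00"   # UDID access left in plain sight

print(scan_binary(clean))   # -> []
print(scan_binary(shady))   # -> ['uniqueIdentifier']
```

Objective-C makes this kind of scan unusually effective, since selector names normally survive compilation as plain strings in the binary.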
Edited by nagromme - 2/25/14 at 7:50am
post #23 of 24
Quote:
Originally Posted by nagromme View Post


I agree that the flaw likely exists in theory (something to fix?), and that it could reach people in reality if the App Store process allows it, which, as I said, is plausible (something else to fix?); I just don't see the evidence yet. So from what we know now, this sounds like one of the many past claims that got exaggerated, gave someone five seconds of fame, and then turned out to be very different from what articles had stated/implied (and fed the echo chamber).

But the App Store approval process does more (both human and automated) than you're saying, and does scan the code. It's not like handing you the binary and having you eyeball it. The code is scanned automatically, and hidden transgressions can be and are caught all the time. (I use the game engine Unity, for instance, and sometimes developers report that the Unity engine is triggering some automated rejection for doing something wrong; Unity then makes the needed tweak and the apps make it through.)

Otherwise:

A) Why would they need "approaches to bypass" the process at all? They'd just submit, and there would be no story about that end of the problem.

B) How would Apple catch apps doing invisible disallowed things (like using certain private APIs or accessing the UDID), which they catch all the time?

It's possible (we don't yet know) that Apple's process could detect this in the same way, even from a binary.

Here are some links to developers explaining how Apple scans app binaries for various things prior to approval:

http://stackoverflow.com/questions/2842357/how-does-apple-know-you-are-using-private-api
http://stackoverflow.com/questions/9934143/how-does-apple-detect-udid-access-during-app-review
http://stackoverflow.com/questions/3186648/does-apple-view-the-actual-source-code-when-approving-apps

I think you're overestimating how much can be found by a binary scanner.  It may find clear violations like using undocumented APIs, but code that is written to do things in a roundabout way is nearly impossible to decipher without detailed analysis by a programmer.  It's very easy to confuse even the Clang/LLVM static analyzer, which has full access to the source code.

 

The only solution is a rock-solid OS that prevents apps from performing actions outside the sandbox at runtime.  I don't blame the binary scanner here; this is a sandbox problem.

post #24 of 24
Apple is doomed! But seriously, it's good to have these firms pointing out flaws for Apple to close, just in case. iOS is still quite secure, and the malware available is almost non-existent compared to Android.