Apple's approval of 'Jekyll' malware app reveals flaws in App Store review process


Comments

  • Reply 21 of 42
wovel Posts: 956 (member)
    _rick_v_ wrote: »
    Is it really any surprise that someone is able to make a legitimate-looking app, and bury some code in there that only activates after some time or criteria has passed?  It's called a trojan for a reason.  Famed security researcher Charlie Miller proved the exact same thing (and subsequently got himself banned from the App Store) by writing a fake finance app.

    The part that IS disturbing to me is that I thought that the Contacts and such were supposed to be more or less firewalled until you explicitly give that app permission.  What's up with that?  It's a bit disturbing to think that a rogue app could start sending emails and such.  What is this, Android?!
    Nothing in the article indicates it was done without permissions. All they did was execute code that was not in the original app.

The story is a lot more hype than substance. I imagine the user did give permission to access contacts, post tweets, etc. The problem is that they did it with unreviewed code, not that they did it without permission.
  • Reply 22 of 42

    Quote:

    Originally Posted by jragosta View Post

    Sure. So there's one security issue that affects iOS - but also would affect Android. And there are 100,000 security issues that affect Android and not iOS.

    Which is better?

    As for dynamic testing, that's nice in theory. In practice, it would have an enormous impact on system operation: the OS would be far larger and everything would be dog slow. I don't think that's worth the tradeoff. Apple will presumably settle for 100,000 times more secure rather than 101,000 times more secure.

    I think the "dynamic testing" Marvin refers to would happen on the reviewers' workstations, not the end-user's device. Also, Google Bouncer does do some dynamic analysis, although who knows whether its tests would catch an app like Jekyll.

  • Reply 23 of 42
So... here we are again... Apple is obviously inadequate. They have a more in-depth review process, stricter controls on what data and APIs apps can access, and more stringent built-in OS security than the competition; but because a highly focused security research team was able to sneak past their approval process with INACTIVE code, this is a story. If the competition were held to the same standard as Apple, this kind of news would be tiresome and old hat. Why is there a double standard? Why do Android and Windows Phone get a free pass while people expect Apple to be perfect? Regardless, you'll find considerably less malware in the Apple ecosystem. Anyone who has a clue about security knows that NO SYSTEM IS INFALLIBLE. There will always be something that gets through. Let's just try to keep it in context. Just take a look at how many pieces of Android malware are in the wild vs. iOS malware, and then tell me which is the more secure mobile platform.
  • Reply 24 of 42
Marvin Posts: 14,612 (moderator)
    d4njvrzf wrote: »
    I think the "dynamic testing" Marvin refers to would happen on the reviewers' workstations, not the end-user's device. Also, Google Bouncer does do some dynamic analysis, although who knows whether its tests would catch an app like Jekyll.

Yeah, I just meant running the apps at review, but it seems they already do this and wouldn't easily find this vulnerability. I can't believe they approve all those apps with crazy in-app purchases after running them manually.
    wovel wrote:
    Nothing in the article indicates it was done without permissions. All they did was execute code that was not in the original app.

    The story is a lot more hype than substance.

    This site has a very detailed run-down of the attack:

    http://www.imore.com/jekyll-apps-how-they-attack-ios-security-and-what-you-need-know-about-them

The executable code was in the original app. It was an app that collected data, e.g. articles from a news server. The developer planted a deliberate vulnerability in their app, and the app itself contained blocks of malicious code that were just separated out; this attack calls them gadgets:

    http://en.wikipedia.org/wiki/Return-oriented_programming

They created a data download on their server that exploited the buffer overflow vulnerability they had put in their app, which then triggered the dormant code; the downloaded data itself was never executed. They knew where the malicious code was in memory because the address space layout randomisation Apple uses is limited.

Improving ASLR is one thing they can do, but so is protecting better against buffer overflow vulnerabilities, as jmncl mentioned earlier.

Apple uses an exploit mitigation called Data Execution Prevention, which meant the data they downloaded from the server could be written into memory but could not be executed. That data, however, changed the execution order of the application so the malicious code that was already in the app became active.

Perhaps they need an API that forces data that isn't bundled with the app to be loaded into some quarantined memory location so that it can't overwrite parts of the application binary and reorder the code execution. That way it wouldn't matter if anyone put a buffer overflow vulnerability in their app, because it wouldn't be that code reading the file directly; they'd have to call Apple's API to read the downloaded file, which could also check for suspicious payloads at runtime.
    _rick_v_ wrote:
    The part that IS disturbing to me is that I thought that the Contacts and such were supposed to be more or less firewalled until you explicitly give that app permission.

They used private APIs to call the processes directly. This is likely why Apple bans private APIs, but they can still be bundled in a malicious app. Still, you'd think the Camera, Contacts and other processes could protect against this somehow, with, say, verification keys showing the commands came from an authorised process. There could be an internal OS list of processes approved for certain activities and the storage locations of those processes. They could use hash verification for the binaries, but that might mean having to update the approval every time the app gets updated.

I'd say this kind of attack is fairly complex to pull off: you'd have to know how to write a buffer overflow vulnerability into an app, how to split malicious code into gadgets, how to get the memory addresses, and how to write a payload that rewrites the execution order, and then actually use it to do something worthwhile. It can be trivial if people reuse working code, but these researchers don't sound like the kind of people who would share exploit code irresponsibly. Hopefully Apple will implement at least some of the extra security measures where possible, but it is quite far from Android's ability to source malware:

    http://www.computerworld.com/s/article/9241596/New_Android_malware_is_being_distributed_through_mobile_ad_networks

    Malware and advertising together, how will Google deal with that one?
  • Reply 25 of 42
mstone Posts: 11,510 (member)

    Quote:

    Originally Posted by jmncl View Post

    They can't. But they can make sure that any downloaded code can't execute and, more important, can't cross the app sandbox.

    Normally that would be the case for any app. The problem here is that the hack exploited a bunch of iOS bugs to get around that. Apple just needs to patch those bugs.

    In fact iMore is reporting that this hack already doesn't work on iOS 7.

That is the real news. The sandbox got exploited. Apple has always known that malicious code can lurk in the background, undetected in the review process. One technique is date-conditional code that doesn't reveal itself until after the review. The only way Apple could be 100% on the review is if developers had to submit the app source code and Apple compiled it for them after the review, which would be really expensive and time consuming.


     


It just goes to show that you should only download apps from reputable, big-name brands that have a reputation at stake. An app from a small, unknown programmer that is only marginally useful should be avoided, especially when it is brand new and has no recommendations.

  • Reply 26 of 42
    So Apple has shut this door. Has Android? Now that the word is out, Android represents a very juicy target for this kind of exploit.

    Yet another reason to move over to iOS ASAP.
  • Reply 27 of 42
    @Chazwatson

Love your avatar, man! Brings back memories.
  • Reply 28 of 42
gatorguy Posts: 23,300 (member)
    Marvin wrote: »
    Malware and advertising together, how will Google deal with that one?
    I don't know that either Android or iOS has a way yet to keep Black Hole from potentially exposing web page visitors to malware.
  • Reply 29 of 42


    Where's TEKSTUD when you need him? This thread should have 200+ posts by now.

  • Reply 30 of 42
    mstone wrote: »
    The only way Apple could be 100% on the review is if developers had to submit the app source code and Apple compiled it for them after the review which would be really expensive and time consuming.

One does submit their app's source code. The issue here is that not all code behavior can be easily determined. Code can be very complex, even excessively so if the writer chooses. That's why apps with bugs can make it into the App Store in the first place: not all bugs are found, even the intended ones. This one was designed to look like it wanted to do something innocent.
  • Reply 31 of 42
mstone Posts: 11,510 (member)

    Quote:

    Originally Posted by chazwatson View Post

    Quote:

    Originally Posted by mstone View Post

    The only way Apple could be 100% on the review is if developers had to submit the app source code and Apple compiled it for them after the review, which would be really expensive and time consuming.

    One does submit their app's source code.

You submit an .ipa binary file, not your Xcode project. Apple really doesn't want your code, and you don't want anyone else to have it either, especially if it contains proprietary program code and intellectual property.

  • Reply 32 of 42

    Quote:

    Originally Posted by mstone View Post

    You submit an .ipa binary file, not your Xcode project. Apple really doesn't want your code, and you don't want anyone else to have it either, especially if it contains proprietary program code and intellectual property.

It is promptly decompiled for examination. Sorry, I should have been more specific.

  • Reply 33 of 42
mstone Posts: 11,510 (member)

    Quote:

    Originally Posted by chazwatson View Post

    Quote:

    Originally Posted by mstone View Post

    You submit an .ipa binary file, not your Xcode project. Apple really doesn't want your code, and you don't want anyone else to have it either, especially if it contains proprietary program code and intellectual property.

    It is promptly decompiled for examination. Sorry, I should have been more specific.

I would argue against that being the case. Do you have any references? They use computers to scan your app to see if you are using any private APIs, but even if they did decompile it, it would be so difficult to read that they couldn't make any sense of it without days of work by a skilled programmer. The reviewers probably aren't even programmers. Furthermore, the code is your private property, and attempting to read your private code would be illegal.

  • Reply 34 of 42
quadra 610 Posts: 6,756 (member)


    My reaction: So what?

    ONE got past Apple. Hundreds of thousands get into whatever they call the Android app store.

    Since Jobs' return to Apple, malware has NEVER been a problem in any meaningful way, whether for OS X or any other Apple OS on an Apple device. Nothing has really ever materialized for anyone to be concerned about. And Apple reached saturation (and is pushing beyond it) years ago.

    This is simply not newsworthy. Apple's been notified, and that's it.

  • Reply 35 of 42
    Of course they got Apple's blessing. To post malware, that is.

    It looks like a near facsimile of Charlie's hack, so nothing has changed.
    Most malware apps will be Mr. Hyde.
  • Reply 36 of 42
bulk001 Posts: 661 (member)
    jmncl wrote: »
    Oh please, Google Play's "security" gets circumvented almost every week. Did you hear about their "Bouncer" that was going to stop everything? It's a total sieve.

    So yes Apple's walled system is more secure. This has been demonstrated year after year.

The thing is, no company in the world offers 100% foolproof security against determined hackers; it's a continuous process.

    You also have to understand that in this case these people "who know what they are doing" have PhDs in security.
Phew. I feel reassured now. It was probably a team of PhDs who released all the jailbreaks for the iPhone. Silly me.
  • Reply 37 of 42
Some of the nefarious actions listed here require explicit user approval via tapping yes on a confirmation dialog. The original article, however, does say the app works stealthily. Thus, I don't believe the app has access to the address book, for instance (edit: I was right, the Jekyll app explicitly asks for user permission to access the address book, see the link a few posts above). It can't access your contacts, your camera roll, your location, etc. without using some previously undisclosed exploit (which I don't think they have, because finding such an exploit is worth far more in security research than creating trojan apps). Thus, the scope of maliciousness of this app is more limited than it may seem.

Apple steps up user privacy with every iOS version, and iOS 7 will dump the unique device identifier in the bin while adding more fine-grained access controls over personal information. Also, you have to remember that to be successful a malware app has to be installed thousands of times. Good luck achieving that on the App Store with your knock-off functionality! But this isn't the point.

The original article and AI's article falsely lead you to believe that because Apple's app tests are not good enough to stop this kind of malware, Apple can fight it by introducing more lengthy and more thorough testing. This is plain BS. Objective-C, the language of iOS, is a highly dynamic language with first-class functions. That means chunks of code can be created on the fly. There's no way to catch malicious activity in an app if it isn't there yet, and no way of telling whether it will be there tomorrow. And it's not only Obj-C: the same can be achieved in C with a few tricks. All you need in a programming language is direct memory access.

Thus, the only way for Apple to prevent malware on iOS is to give users more power over what apps can and cannot do, meaning iOS 7 is a step in the right direction.
  • Reply 38 of 42
Clear evidence of selective abstraction, denial and delusional thinking.
  • Reply 39 of 42
umumum Posts: 76 (member)

    Quote:

    Originally Posted by Quadra 610 View Post

    My reaction: So what?

    ONE got past Apple. Hundreds of thousands get into whatever they call the Android app store.

    Since Jobs' return to Apple, malware has NEVER been a problem in any meaningful way, whether for OS X or any other Apple OS on an Apple device. Nothing has really ever materialized for anyone to be concerned about. And Apple reached saturation (and is pushing beyond it) years ago.

    This is simply not newsworthy. Apple's been notified, and that's it.


    Prove that only "ONE got past Apple." For all we know, other apps have been exploiting the vulnerability for ages; security services and criminals don't shout about it.

    Prove that "Hundreds of thousands get into whatever they call the Android app store," although what relevance that has to Apple I'm not sure.

    But of course you can't prove it. Here's some free advice: spend less time posting lies on the internet, and one day you may own as many Apple shares as I do.

    As a shareholder and long-term Apple user I'm angry. Apple has been caught napping. It needs to beef up its testing; otherwise its only real justification for restricting general distribution of iOS applications is dead.

  • Reply 40 of 42
As always, it is 'one makes it into the news'.


Their report was carefully crafted to upset Apple as little as possible, so if they survive as developers, it bodes well for at least a little security monitoring.