Newly found code signing flaw allows for iOS malware
A newly discovered iOS security flaw could allow disguised malware capable of stealing user data to bypass Apple's stringent security measures and enter the App Store.
Mac hacker and researcher Charlie Miller has reportedly found a way to sneak malware into the App Store and subsequently onto any iOS device by exploiting a flaw in Apple's restrictions on code signing, allowing the malware to steal user data and take control of certain iOS functions, according to Forbes.
Miller explains that code signing restrictions allow only Apple's approved commands to run in an iOS device's memory, and submitted apps that violate these rules are not allowed on the App Store. However, he has found a method to bypass Apple's security by exploiting a bug in iOS code signing that allows an app to download new unapproved commands from a remote computer.
"Now you could have a program in the App Store like Angry Birds that can run new code on your phone that Apple never had a chance to check," Miller said. "With this bug, you can't be assured of anything you download from the App Store behaving nicely."
The flaw was introduced when Apple released iOS 4.3, which increased browser speed by allowing JavaScript code from the internet to run at a much deeper level in a device's memory than in previous iterations of the OS. Miller realized that, in exchange for speed, Apple had created a new exception allowing the web browser to run unapproved code. The researcher soon found a bug that let him extend that exception beyond the browser and into apps downloaded from the App Store.
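As a rough illustration (not Apple's actual mechanism), the "deeper level" at issue is memory that is both writable and executable, which a JIT compiler needs in order to turn JavaScript into machine code on the fly. The sketch below requests such a region through the ordinary POSIX mmap interface; under iOS code-signing enforcement this request is normally denied to third-party apps, while on a desktop Linux or macOS system it typically succeeds.

```python
# Illustration only: what "writable and executable memory" means at the
# POSIX level. This is not Miller's exploit, just the kind of allocation
# that the iOS 4.3 JIT exception permitted for the browser.
import mmap

def alloc_rwx_page(size: int = 4096) -> mmap.mmap:
    """Request an anonymous page that is readable, writable, AND executable.

    Under iOS code-signing enforcement this request is normally refused
    for third-party apps; the iOS 4.3 change carved out an exception so
    the browser could compile JavaScript into such a region.
    """
    return mmap.mmap(-1, size,
                     prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)

region = alloc_rwx_page()
# Writing unreviewed bytes into executable memory is exactly the step a
# signed-code-only policy is meant to forbid: nobody signed these bytes.
region.write(b"\x90" * 16)  # sixteen x86 NOP instructions as stand-in "new code"
region.seek(0)
print(region.read(16) == b"\x90" * 16)  # True
region.close()
```

The point of the sketch is the policy, not the bytes: once any app can obtain a region like this, the code-signing guarantee that only Apple-reviewed instructions run on the device no longer holds.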
Miller created a proof-of-concept app called "Instastock" to showcase the vulnerability; the app was submitted to and approved by Apple for distribution via the App Store. The simple program appears to be an innocuous stock ticker, but it can leverage the code signing bug to communicate with Miller's server and pull unauthorized commands onto the affected device. From there the program has the ability to send back user data including address book contacts, photos and other files, as well as initiate certain iOS functions like vibrating alerts.
The app has since been pulled, and according to his Twitter account, Miller has been banned from the App Store and removed from the iOS Developer Program.
Miller, a former NSA analyst who now works for computer security firm Accuvant, is a prominent Apple researcher who previously exposed the MacBook battery vulnerability and a security hole in the mobile version of Safari.
The researcher has refused to publicly reveal the exploit, reportedly giving Apple time to come up with a fix, though he will announce the specifics at the SysCan conference in Taiwan next week.
Comments
That's a pretty nasty exploit.
That being said, although Apple was well in their rights to kick him out, I think they are probably more than a bit embarrassed by this whole incident as well.
C. Miller is a strong proponent of iOS and is a great cracker, having found many security holes in both OS X and iOS. Apple needs to set up an agreement with people like him and offer bounties for this type of thing, much like airport security testers.
Kicking him out was stupid and churlish on Apple's part.
I disagree. Whether it's a proof of concept that he won't release to the public or something intended to harm or steal from users is irrelevant: Apple has to protect their user base, and someone who wrote an app that breaks guidelines and allows developers backdoor access into a user's device should not be allowed to remain.
- 2. Functionality
- 22. Legal requirements
I didn't look through all the sections, only 2 and 22, because they appeared to cover many of the offenses committed by Miller with this app, so I don't know if there are others that would fit the bill, nor do I know if all the ones I listed fit the bill. Either way, I think it's clear Miller broke an excessive number of App Store rules, which should not be tolerated.
2.4 Apps that include undocumented or hidden features inconsistent with the description of the app will be rejected
2.5 Apps that use non-public APIs will be rejected
2.6 Apps that read or write data outside its designated container area will be rejected
2.7 Apps that download code in any way or form will be rejected
2.8 Apps that install or launch other executable code will be rejected
2.13 Apps that are primarily marketing materials or advertisements will be rejected
2.14 Apps that are intended to provide trick or fake functionality that are not clearly marked as such will be rejected
2.17 Apps that browse the web must use the iOS WebKit framework and WebKit Javascript
22.1 Apps must comply with all legal requirements in any location where they are made available to users. It is the developer's obligation to understand and conform to all local laws
22.2 Apps that contain false, fraudulent or misleading representations will be rejected
22.3 Apps that solicit, promote, or encourage criminal or clearly reckless behavior will be rejected
22.4 Apps that enable illegal file sharing will be rejected
22.7 Developers who create apps that surreptitiously attempt to discover user passwords or other private user data will be removed from the iOS Developer Program
PS: As Steve N. states, "Apple needs to work closer with Miller." But that doesn't mean Miller should be allowed to violate Apple's Store policies.
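Rules 2.7 and 2.8 above describe the exact capability Instastock demonstrated. Stripped to its essence, "downloading code" is simply fetching text from a server and executing it. The hypothetical Python sketch below shows that bare pattern — the function name is invented for illustration, and a `data:` URL stands in for a remote server; Miller's actual exploit worked at the native-code level, not in an interpreter.

```python
# Hypothetical sketch of the pattern rules 2.7/2.8 prohibit: fetch source
# text from a remote location and execute it as code. Illustration of the
# concept only, not the actual Instastock exploit.
import urllib.request

def run_remote_code(url: str) -> dict:
    """Fetch source from `url`, execute it, and return the resulting names."""
    source = urllib.request.urlopen(url).read().decode()
    namespace: dict = {}
    # The reviewer saw the app; nobody reviewed whatever `source` contains.
    exec(compile(source, url, "exec"), namespace)
    return namespace

# A data: URL stands in for a remote server. The "downloaded" program
# computes a value the app's reviewer never saw: "x = 1 + 1".
result = run_remote_code("data:,x%20%3D%201%20%2B%201")
print(result["x"])  # 2
```

The review process can only vet what is in the submitted binary; once an app can execute content fetched after approval, as in this sketch, the review guarantee is gone — which is why the guideline bans downloading code "in any way or form" rather than trying to enumerate mechanisms.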
I'm not saying that what he did is right, but it says that he discovered the exploit in 4.3. We are now at 5.0, going on 5.0.1.
Maybe he contacted Apple a while ago and they didn't respond. So he decided to make it public and make a mockery of them.
....Apple has to protect their customer base and riding someone who wrote an app that breaks guidelines and allows developers backdoor access into a user's device should be allowed.
(I assume you meant to say "not be allowed").
It's these types of one-size-fits-all rules that I am calling "stupid and churlish."
Context and motives are important.
Kicking him out was stupid and churlish on Apple's part.
For all we know, Apple might have kicked him out only after he said the details would be publicly disclosed.
I'm not saying that what he did is right, but it says that he discovered the exploit in 4.3. We are now at 5.0, going on 5.0.1.
Maybe he contacted Apple a while ago and they didn't respond. So he decided to make it public and make a mockery of them.
And that's fine. I support his right to do that, but I also support Apple's right to close out any such developers. While Miller might be doing it for all the right reasons, that may not always be the case. What if others who claim to be white hat hackers turn out to be black hats, get taken advantage of by a black hat hacker, or through some other unintended accident compromise users' data? Either way, Apple is responsible for their users, and we can't have hundreds of thousands of potential devs putting backdoor code into their apps because it's OK for Charlie Miller. This is a line that needs to be drawn deep and drawn fast.
On the flipside, Apple needs to take care of these JS holes or disallow mobileSafari/WebKit in apps.
If more of these JS-based bugs start to come out Apple might have to pull mobileSafari access from within apps. It's not like it's as needed as it was back before App Store apps were allowed to run in the background.
From what I gather, the hacker is not using Safari itself but has found a way of tricking the sandbox into allowing him to download external code, in a way similar to what Safari's JavaScript does. Am I mistaken?
If so, Apple needs to put even more people on Safari security, as it seems to be a recurring hole in the security process.
There isn't that much Apple can do to prevent you from hiding malware inside an app that is on the App Store. Apple just has the ability to pull the app, blacklist it, or sue the creator who hid the malware.
Code signing and the sandbox are not the same thing. Just because the application's own binary is compromised doesn't mean the sandbox is compromised. There may be some hack that could be used to break the sandbox with a modified binary... but you're treading on dangerous legal ground (you need to provide your information to put an app on the store), and the app will be quickly pulled and blacklisted by Apple.
Maybe he contacted Apple a while ago and they didn't respond. So he decided to make it public and make a mockery of them.
Maybe so. 4.3 was out months ago—BUT that doesn’t mean Charlie found the bug months ago. If he did, and revealed the bug to Apple, and is now a brave and noble white knight going public because there’s no other way to make Apple fix the issue, then that’s one thing—but there’s no evidence for that, and if he likes attention as much as he seems to, you’d think he’d have made that point publicly: “I revealed this bug in July, but Apple will never fix it without being forced to, and now I must sacrifice myself for the greater good, oh beloved world!” or something.
It’s equally possible he found the bug months ago and sat on it until he felt it would be worth the most attention for himself. Or, that he discovered it only recently. Either way, it’s very possible that he gave Apple very little time. We don’t know yet. (We do know he violated the App Store terms.)
Changing your OS, and your developer tools, affecting millions of people, isn’t always a small, quick thing. Doing so without other side effects isn’t always simple either. It involves much testing, many parties, and impacts (and is impacted by) other development efforts going on simultaneously.
Has Apple has “enough” time? We have no idea, because we don’t know when they were told, nor the magnitude of the challenge. You don’t need to “mock” a company to make them fix something faster. Maybe you need to if they tell you they plan to NEVER fix it, but that is highly unlikely to be Apple’s plan! They’ve taken iOS security very seriously—much more so than, say, Google/Android.
We DO know Charlie has given a big helping hand to malware writers in the mean time. That’s clearly harmful behavior, as enjoyable as the spotlight may be for Charlie.
People who catch bugs deserve our thanks. Unless possibly they use those bugs to gain fame/attention at our expense!
Maybe Apple should have made a special exception and let Charlie break the rules. But he didn’t NEED to break them just to explain the bug to Apple. He needed to break the rules as a publicity stunt. He could have pointed out the flaw to Apple without that. What about the next “good guy” who finds a flaw? Doesn’t Apple want them to report it and NOT actually send their app out into users’ hands? And if they do, intentionally and unnecessarily send their malicious app out there, shouldn’t they too be banned? After all, they signed an agreement, which protects us users, and they broke it knowing the consequences.
I'm not saying that what he did is right, but it says that he discovered the exploit in 4.3. We are now at 5.0, going on 5.0.1.
Maybe he contacted Apple a while ago and they didn't respond. So he decided to make it public and make a mockery of them.
Maybe Apple didn't care. It is easy to create a security vulnerability in your own app... The solution is to sue the developer, blacklist the app, and remove it from the App Store. The only problem I see is that this might be able to get around Apple's blacklist, but Apple should be able to patch that particular hole if it exists.
Beyond that, I'm not sure why you would go to the trouble to create an app that people want to download or purchase and then deliberately sabotage it.
That's a pretty nasty exploit.
That being said, although Apple was well in their rights to kick him out, I think they are probably more than a bit embarrassed by this whole incident as well.
Please inform us which of Miller's "nasty exploits" ever came to pass in the real world. He comes up with these convoluted machinations, makes a big deal of calling Apple out over them, and none of them ever appears to be deployed or actually goes live in real-world malware that attacks Mac users. I think he does a great service to Apple users by finding these flaws, but he's a real prick when it comes to how he informs the world.
If more of these JS-based bugs start to come out Apple might have to pull mobileSafari access from within apps. It's not like it's as needed as it was back before App Store apps were allowed to run in the background.
Is this a JS bug? I don't see anything that indicates that.
I'm not sure what this has to do with background apps...
That's a pretty nasty exploit.
That being said, although Apple was well in their rights to kick him out, I think they are probably more than a bit embarrassed by this whole incident as well.
They didn't "kick him out." They are likely talking to him right now.
What he did was slightly irresponsible in a small way (he loves his publicity), but it's likely he has a list of everyone who downloaded the thing, has already told Apple where the flaw is, and has already given that list of users to them.
Is this a JS bug? I don't see anything that indicates that.
I'm not sure what this has to do with background apps...
I thought I read on another site he was exploiting a hole in JS.
C. Miller is a strong proponent of iOS and is a great cracker, having found many security holes in both OS X and iOS. Apple needs to set up an agreement with people like him and offer bounties for this type of thing, much like airport security testers.
He did expose it in an unprofessional and irresponsible way though. He really loves his publicity so hopefully he was just grandstanding as usual and will still be cooperating with Apple over the issue.