And for the last time, if Apple allows development tools with cross-compile capability to take root, many developers will absolutely use them because they see them as a quick, easy solution. Perhaps at first, it would be. But the next time Apple wants to move the platform to a different processor, make an API change, or update the operating system, the apps that had been cross-compiled with Adobe's tools might break, and developers who rely on Adobe's system may have to wait some time for Adobe to bring the new capabilities forward. Such apps would languish and become stagnant. And the logistics involved with any change become magnified tenfold, just as I experienced with Metrowerks CodeWarrior through the OS 9 to OS X transition and then the PPC to Intel transition. With how fast the hardware and OS landscape is changing now, Apple simply can't saddle itself with the extra baggage of third-party SDKs. This is not hypothetical. It would absolutely happen.
If the worst price to be paid by the user is the loss of a few crummy apps (because the popular ones would get ported), and the price to the developer is learning Xcode, then I'd say that's a good bargain.
The hardest, most costly solution is always better? No pain, no gain? Why do you think an app is automatically crappy because it may have been written via a cross-compiler? I don't think users give a rat's a-- about the code. All they care about is whether what it does is useful, fun, or both, and whether it works. Apps "written by hand" in native code from scratch are not immune to breaking. There are a ton of bad reviews for buggy apps on the App Store right now, and I doubt all of those apps were written via a 3rd-party compiler.

Besides, the phrase "so what" comes to mind. Any company with an investment in a popular, money-making app will promptly update its product by whatever means (further motivated by the opportunity to sell it again to existing customers: throw in a feature or two and call it an upgrade). If they choose not to update their apps, again, so what. The user has the use of the original app until it's broken by an OS change. On average they paid all of a few bucks, vs. never having the opportunity to use the app at all because it was excluded in the first place. If the app is on the expensive side, again, more incentive for the developer to update the product if it is selling well. If not enough people are buying it to make it worth keeping running, again, so what if it dies.

Do you think users feel grateful to Jobs if their favorite app disappears from the store because Jobs chose to yank it at his discretion (as can happen no matter what), vs. it dying away because it was broken by an OS update? What's the difference from the users' point of view?
Frankly, I think you're letting yourself be suckered into buying Jobs' BS on this one. I don't believe this is about managing tightly controlled hardware specs tuned to their OS to maintain an optimal "user experience," as is the legitimate model Apple applies to its computer business. I don't think it's about protecting users by keeping them safe from sub-par apps, or about ensuring apps won't break in the future (which can't be done, no matter how the code was generated). It's about Jobs' proclivity toward exclusion to a degree that goes a bit beyond rationality. It's about competing by trying to force developers to create products exclusive to the Apple mobile platform, betting that Apple's position as the 800-pound gorilla in the mobile market will make developers who can't afford the investment of developing separately for multiple platforms choose Apple. It may be an approach that is better for Apple, but it's not an approach that's better for their customers, or for developers who want to support their platform. Ironically, in the long run, what is not in the best interest of the latter two is probably not the best for Apple either.