Google is now becoming its own worst enemy. Its hands-off approach, letting the system's smarts do all the work, makes that statement a gem.
I am just surprised how long it's taking people to realize they own a product where they have no idea whether the apps they want to use and buy will actually work on the devices they own, and God forbid they update, only to find out the apps they love no longer work on their new hardware. I've been saying this for years: most bad reviews on Google Play are people saying the app doesn't work correctly on a specific phone.
wonkothesane wrote: »
On a side note: while I am sure you can find negative reviews for nearly any app, no matter which platform, I particularly feel that the iOS App Store makes it quite hard to quickly get a realistic rating overview. Many ratings are not helpful, and some even appear to be paid. A feature for marking reviews as helpful is missing, which would already make this easier.
If I'm not mistaken, Objective-C is and will continue to be fully supported, and Swift code can be embedded in / added to an Objective-C codebase and compiled alongside it. Swift may not yet support all of the object types that Objective-C now supports, but it is very good for prototyping and/or creating apps without the skill level that Objective-C requires.
I'm sure that the next WWDC will bring a better assessment of its status and utility to developers.
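To illustrate the mixing the post describes, here is a minimal sketch of Swift calling into an Objective-C class. NSString stands in for any Objective-C class; your own Objective-C code would be exposed to Swift via a bridging header instead.

```swift
import Foundation

// Swift calling into an Objective-C class (NSString, from Foundation).
// The bridge is automatic for framework classes; your own Objective-C
// code would be exposed to Swift through a bridging header instead.
let objcString = NSString(string: "Hello from Objective-C land")
let length = objcString.length   // an Objective-C method, called from Swift
print("NSString length:", length)
```

The reverse direction works too: Swift classes marked `@objc` are callable from Objective-C, which is why the two can coexist in one app during a gradual migration.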
Apple haven't switched from C to Swift, they've switched from Objective-C to Swift. Very few iOS developers write a significant amount of their code in C.
Swift isn't just matching Objective-C for performance, it's outperforming Objective-C in many recent tests.
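Claims like this usually come from micro-benchmarks along these lines — a sketch, with hypothetical sizes, and timings being whatever your hardware produces:

```swift
import Foundation

// A sketch of the kind of micro-benchmark behind performance claims.
// The element count is arbitrary; the timing is hardware-dependent.
func time(_ label: String, _ body: () -> Void) {
    let start = Date()
    body()
    print("\(label): \(Date().timeIntervalSince(start))s")
}

var numbers = (0..<200_000).map { _ in Int.random(in: 0..<1_000_000) }
time("sort 200k Ints") { numbers.sort() }
```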
Those might be tests, but they're not real-world tests. When did you last buy an app that had 5 buttons, each of which ran a different list sort?
You say they're replacing ObjC, not C, but that doesn't really make sense: ObjC is C when you're not doing OO. And you shouldn't be doing OO everywhere. OO is great for GUI control libraries, collection classes, or anywhere there's a natural hierarchy in the business domain, but elsewhere you'd just be using the tools for the sake of it.
The designers of Cocoa knew this. You can tell, looking at the class library, that they didn't make classes for absolutely everything (as you get in a language like Java); they made classes where OO really added something, and otherwise you can just take an off-the-shelf C library and link it into your program (one example I used recently was the Bignum C library, because ObjC didn't have a BigInteger class). C and ObjC are complementary and meant to be chopped and changed within the same program. So Swift is replacing both, and it's fair to compare C performance with Swift performance.
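For what it's worth, the same chop-and-change works from Swift: C functions are callable directly, no wrapper class needed. A minimal sketch, using strlen and sqrt from the C standard library as stand-ins for any linked-in C code:

```swift
import Foundation  // pulls in the C standard library on both Darwin and Linux

// Calling plain C functions directly from Swift.
// Swift bridges the String literal to the `const char *` that strlen expects.
let n = strlen("complementary")
print("strlen:", n)

// The C math library works the same way -- link in any C library and call it.
let root = sqrt(2.0)
print("sqrt(2):", root)
```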
Yep. People say it's market size that attracts developers, but really it's money, with the assumption being that a bigger market means more money. But sometimes a smaller market of higher end devices will have more to spend.
Writing an app in Swift doesn't exclude the use of C, just like writing an app in Objective-C doesn't exclude the use of C. Swift is very much a replacement for Objective-C, not C. This is a point repeatedly made by Apple's engineers at WWDC 2014.
As for when to use Objective-C and when to use C, I disagree to an extent. Objective-C has features like ARC that make it worthwhile in most situations. Unless you're writing code where performance is far and away the most important factor, or you're accessing Foundation-level APIs, Objective-C should be the default option.
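For readers unfamiliar with ARC: it frees objects automatically once the last strong reference disappears, with no manual retain/release. A minimal Swift sketch (the Connection class is hypothetical, purely for illustration):

```swift
import Foundation

// A minimal sketch of what ARC buys you: objects are freed automatically
// when the last strong reference disappears -- no retain/release calls.
// The Connection class is hypothetical, just for illustration.
final class Connection {
    let name: String
    init(name: String) { self.name = name }
    deinit { print("\(name) closed") }   // ARC invokes this for us
}

weak var observer: Connection?           // weak: doesn't keep the object alive
do {
    let c = Connection(name: "db")       // one strong reference
    observer = c
    print("inside scope, gone?", observer == nil)   // false
} // c leaves scope, ARC releases it, deinit runs
print("after scope, gone?", observer == nil)        // true
```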
I would say that Swift is a year or two away from widespread adoption. Lots of teams haven't even modernised their Objective-C code yet.
As for Apple: it is definitely skimping on the hardware. It could probably go quad-core right now. It could use more RAM.
But it's got a long-term plan: get the software improvements in now, and when hardware specs in the Android world trend toward smaller and smaller updates, which is just a few years from now, Apple will be selling cheap or mid-range iPhones that are just as fast, and newer devices that are much faster in real-world tests. There's no need to optimise both hardware and software at the same time, after all. Getting the optimisation in first will make iOS vastly better when Apple does ship higher-specced hardware.
Did they even need to extend the lead?
I have these little Android devices I use for product testing, and I noticed there is nothing to stop me from going to Google Play and buying (paying for) a major game title that absolutely, positively will not run on my device, or probably on 90% of all existing devices running something called "Android". My Android devices are so awful they usually crash when even attempting to load something from Google Play.
There may be "games" out there... but just like the ancient, antiquated PC game market, don't bother unless you have the absolute latest overclocked hardware to run them.
I just don't understand where people get the nerve to make comments like this. What do you have to back up this statement? What access to Apple's internal engineers do you have that fuels your comment here?
You know nothing of what they can or cannot do at this stage, nor do you know their reasons for what they do. Let's leave it at that.
I know lots more than you. Are you saying there is something in iOS which prohibits it from running on Quad Core (or higher?), or accessing more than 1GB of RAM?
Absent possible battery-life problems (which I doubt in any case), there isn't. And I was praising Apple's strategy, not condemning it. It makes sense not to use the highest-specced hardware if your software team is so good it doesn't matter. When the hardware race evens out, Apple will be way ahead in software.
I can back up the statement. It has nothing to do with performance and everything to do with the bottom line and profit margins. All you have to do is run a task manager on your iPad and you can see it needs 2 GB of RAM to run certain software. Safari is a pig on the iPad and lags all the time.
I don't understand where people get the nerve to say it doesn't need more RAM, as if that wouldn't be a benefit with zero downside. What proof do you have that it doesn't need more RAM? What also makes me laugh is people talking about how the Apple TV is going to kill the PS4 or Xbox One. That is the biggest joke on earth. With the latest iOS upgrade it can't even mirror iOS devices anymore. With the exception of getting new apps, the Apple TV is about as advanced as a Roku streaming stick.
Even the highest-end games on iOS or Android are ones I was playing six years ago on a PC or Xbox.
pmz wrote: »
Did they even need to extend the lead?
I have these little Android devices I use for product testing, and I noticed there is nothing to stop me from going to Google Play and buying (paying for) a major game title that absolutely, positively will not run on my device, or probably on 90% of all existing devices running something called "Android".
Hang on. Although I think that Apple is deliberately not going to market with the best chips it could, the stats show it is way ahead of the competition in real-world tests on mobile; that's the purpose of this article. It is therefore keeping its powder dry for the future, a clever move, even if I would prefer more RAM myself.
richl wrote: »
Cherry-picking negative reviews makes me sad. Every developer gets them, even for the best apps. As an app developer they make me simultaneously frustrated and glad that I can't reply to them.
I'd prefer to see some statistics on what percentage of reviews are negative or some reporting of the game's overall score on each platform. You could argue that Android users will rate apps more generously because they don't have access to iOS-quality apps but a broken experience is a one-star rating on any platform.
asdasd wrote: »
I know lots more than you.
Oh yeah. That's the argument. Xbox. Right.
Xbox is for 13-year-olds.