If Bay Trail is so great, why does AnandTech give absolutely no detail on battery life? At 2.5 watts max, the Z3740 is still 2.5 times more power hungry than Apple's SoC.
That was the Z3770 being tested at 2.4GHz (turbo). Anand was using an Intel prototype; it will not be coming to market. Final device specifications are up to manufacturers, and they will be launching with improved drivers.
The Clover Trail powered ThinkPad Tablet 2 was able to offer a ~10 hour battery life in a chassis thinner and lighter than the iPad 4. Bay Trail is more powerful and more efficient than Clover Trail.
That was the Z3770 being tested at 2.4GHz (turbo). Anand was using an Intel prototype; it will not be coming to market. Final device specifications are up to manufacturers, and they will be launching with improved drivers.
So what you say is: there is no proof yet.
Quote:
The Clover Trail powered ThinkPad Tablet 2 was able to offer a ~10 hour battery life in a chassis thinner and lighter than the iPad 4. Bay Trail is more powerful and more efficient than Clover Trail.
From the reviews I've read around the net, the battery life is more like 5 to 8 hours. Being lighter and thinner is not a feat considering its poor build quality; all reviews mention how flimsy the plastic is.
The Clover Trail powered ThinkPad Tablet 2 was able to offer a ~10 hour battery life in a chassis thinner and lighter than the iPad 4. Bay Trail is more powerful and more efficient than Clover Trail.
Not saying this isn't a compelling new chip, but the tablet you referenced has a graphics core two generations older than the iPad 4's and a third as many pixels, yet only just manages to achieve the same battery life. I think you made the point you were arguing against as far as Intel's CPU efficiency goes….
From the reviews I've read around the net, the battery life is more like 5 to 8 hours. Being lighter and thinner is not a feat considering its poor build quality; all reviews mention how flimsy the plastic is.
How do you infer that means there is no proof? The proof was reviewed and it will only be better in a production model from an OEM. Blocking your ears and closing your eyes will not help your cause. The performance has been known and tested on a number of different devices.
Engadget rated it at 10:27 on their battery test.
"In our standard rundown test (video looping, WiFi on, fixed display brightness) we got 10 hours and 27 minutes of runtime"
http://www.engadget.com/2013/02/11/lenovo-thinkpad-tablet-2-review/
Not saying this isn't a compelling new chip, but the tablet you referenced has a graphics core two generations older than the iPad 4's and a third as many pixels, yet only just manages to achieve the same battery life. I think you made the point you were arguing against as far as Intel's CPU efficiency goes….
The PowerVR SGX5445 is in the same generation of Series 5 XT cores. Intel had the GPU at its max clock of 500MHz+. The device had a lower-resolution display, but it also had other features such as a digitizer and an active Wacom stylus. The SoC used quite a bit more power than Bay Trail and was manufactured on a 32nm process.
The battery inside the ThinkPad Tablet 2 is also smaller than the one inside the iPad 4:
30Wh (TT2) vs 43Wh (iPad 4)
The A6X was never an efficient SoC, but Apple made up the difference with the large battery.
This analysis is pretty bad.

First, on the advantages of 64 bit to iOS, most of the analysis focuses on features of the 64 bit chip that are unrelated to the fact that it is 64 bit. You could have those same features on 32 bit. Hence it is really discussing the advantages of a 64 bit ARM in particular, not 64 bits.
There are a couple of advantages to 64bit that are barely mentioned: faster at high precision math, and faster at handling >4GB memory in a single application.
Whoopee. For a few apps, the math is a big deal. In the future, >4GB memory will be of some interest.
The Android analysis gets an F; it is extremely wrong.
I won't go into all the errors; it would take too much work.
Critically: the assertion is that it would be harder to port Android (and Android apps) to 64 bit. The opposite is true!
Porting Android requires porting *one* piece of code: Dalvik. Applications don't even have to be touched, and there's no cost (unlike iOS) for using apps targeted at 32 bit.
If we were to believe the author's bizarre argument (apps aren't native code so they are harder to port), we'd have to turn upside down the whole reason people went to high level languages and VM's!
In fact, it is the apps that use native code (C) that might, depending on the app, require some porting if they were to run as 64 bit apps. This is just like *all* Objective C apps - every one has the potential to require code changes for 64 bit. Some will actually require it even though they don't need 64 bit at all.
The rest of the Android stuff may sound fine to Apple fan bois, but to those of us working in the Android space, it's just nonsense.
fine. so let's wait and see what actually gets done - or not - ok?
The PowerVR SGX5445 is in the same generation of Series 5 XT cores. Intel had the GPU at its max clock of 500MHz+. The device had a lower-resolution display, but it also had other features such as a digitizer and an active Wacom stylus. The SoC used quite a bit more power than Bay Trail and was manufactured on a 32nm process.
The battery inside the ThinkPad Tablet 2 is also smaller than the one inside the iPad 4:
30Wh (TT2) vs 43Wh (iPad 4)
The A6X was never an efficient SoC, but Apple made up the difference with the large battery.
Sorry, I was looking at last year's Clover Trail. Yeah, these two are more similar in GHz/watt than I was expecting...
And they both suck
- Apple is shipping 64 bit iPhones today with upgraded Apple apps, with iPads sure to follow next month.
- This already improves performance and supports new features (like the major advances in the camera app).
- They will sell 100+ million 64 bit iPhones/iPads in the next year.
- so developers will rush to produce upgraded/enhanced 64 bit apps for iPhone/iPad, because ...
- iPhone/iPad owners will pay for noticeably enhanced apps.
basically, everything the Android fan people here are saying boils down to "coulda, woulda, shoulda." but the truth is, Apple just cleaned Android's clock, and they know it.
Yes, but... all of the 'known for sure' is only vaguely related to 64 bit architecture. The same could be said if they had stuck with 32 bit architecture and just gone to an upgraded 32 bit CPU. Sure, 64 bit is better, but right now, with all those devices mentioned, it is only slightly better than 32 bit.
So yes, this may indeed be a fine thing for the iOS ecosystem. But not because it is 64 bit. That is likely to only impact a very few apps that can take advantage of the wider math registers. Everything else is related to the future, which is just as iffy as anything said about Android.
As an engineer, I get really frustrated when marketing hype is used to blow up an engineering change into something gigantic, when it isn't. I get annoyed with articles that tout advantages that don't even exist, and slam the competition with arguments that are not only wrong, but actually backwards, painting the competitor's advantage (ease of porting) as negative.
64 bit is real, but it is minor. It won't make or break Apple or Android this year or next. Beyond that, it could make a difference, but somehow you don't allow speculation if it's about Android, so you can't do it for Apple.
How do you infer that means there is no proof? The proof was reviewed and it will only be better in a production model from an OEM. Blocking your ears and closing your eyes will not help your cause. The performance has been known and tested on a number of different devices.
Engadget rated it at 10:27 on their battery test.
"In our standard rundown test (video looping, WiFi on, fixed display brightness) we got 10 hours and 27 minutes of runtime"
OK, I'll give it to you on the battery life; I wasn't up to date on those new tablets. Still, Intel can't win this battle: they can move to a newer (smaller) fab process, but that only leaves more room for further enhancement of ARM SoCs as well.
Quote:
Originally Posted by LAKings33
The A6X was never an efficient SoC, but Apple made up the difference with the large battery.
That is baseless: comparing to what? Adding a third GPU core and doubling the RAM channels doesn't make an SoC inefficient relative to an equivalent one. The CPU accounts for only a fraction of power usage; lighting the screen is what drains the battery most. With higher pixel density the panel lets less light through, therefore it needs a stronger backlight.
Wrong. It has already been stated that KitKat, Android's soon-to-be-released new version, will have 64 bit support, and Samsung clearly stated its next phone, likely the Galaxy S5, will have a 64 bit processor. These things don't happen overnight. They'd already been working on them before the iPhone and iOS announcements.
As an engineer, I get really frustrated when marketing hype is used to blow up an engineering change into something gigantic, when it isn't. I get annoyed with articles that tout advantages that don't even exist, and slam the competition with arguments that are not only wrong, but actually backwards, painting the competitor's advantage (ease of porting) as negative.
64 bit is real, but it is minor. It won't make or break Apple or Android this year or next. Beyond that, it could make a difference, but somehow you don't allow speculation if it's about Android, so you can't do it for Apple.
I admit I've advocated for 64 bit a lot lately, and while it may not have sounded that way, I do not think this is a world-changing breakthrough or puts Apple way, way ahead of the competition; it's a normal evolution of a long-predicted roadmap. But I still love these architecture transitions: on the developer side they normally cause a lot of cleanup of old legacy relics, and for users they give the platform new breath.
Besides, there are not many ways to add power to a CPU; where others have chosen core-multiplication tactics, I find Apple's approach more forward-looking.
To be fair, iOS was also designed initially for low-memory low-powered devices like the original iPhone. Moreover, it debuted mainly as a platform for web apps, and only later were provisions for third-party apps added in.
To be fair, iOS is based on OS X, which is a full 64 bit Unix OS. The first version was called "iPhone OS" and was renamed to iOS once Apple and Cisco worked out the purchase of the iOS name from Cisco. It is true that the first iteration of iOS was designed to run web apps, but it was always based on OS X's Unix core. Here is a quote from Wikipedia on iOS's history:
Quote:
iOS is derived from OS X, with which it shares the Darwin foundation and various application frameworks. iOS is Apple's mobile version of the OS X operating system used on Apple computers.
In fact, iOS runs on the Darwin Mach microkernel, the same as Mac OS X; Darwin 14.0 is running iOS 7.
Yes, but... all of the 'known for sure' is only vaguely related to 64 bit architecture. The same could be said if they had stuck with 32 bit architecture and just gone to an upgraded 32 bit CPU. Sure, 64 bit is better, but right now, with all those devices mentioned, it is only slightly better than 32 bit.
So yes, this may indeed be a fine thing for the iOS ecosystem. But not because it is 64 bit. That is likely to only impact a very few apps that can take advantage of the wider math registers. Everything else is related to the future, which is just as iffy as anything said about Android.
As an engineer, I get really frustrated when marketing hype is used to blow up an engineering change into something gigantic, when it isn't. I get annoyed with articles that tout advantages that don't even exist, and slam the competition with arguments that are not only wrong, but actually backwards, painting the competitor's advantage (ease of porting) as negative.
64 bit is real, but it is minor. It won't make or break Apple or Android this year or next. Beyond that, it could make a difference, but somehow you don't allow speculation if it's about Android, so you can't do it for Apple.
honestly, just "no, wrong." the major improvements in the 5s camera software image processing alone simply would not be possible without the 64 bit computation power. now, we can't see with our own eyes how much a difference they make until this weekend so, yes, the jury is still out. but taking photos is certainly one of the top 3 uses of smartphones, and enabling "dummies" like me to get really good pix under all conditions - lighting is usually far less than optimal - without any extra effort is extraordinarily important for consumers.
you're really missing the point. what matters most of all is what us "dummy" users - not you engineers - never have to even think about, because it Just Works.
Because they've done no such thing? Restricting developers to a set and configuration of buttons is exactly the opposite of the idea of iOS devices.
I apologize that you did not hear, but they did this past summer when they introduced iOS 7. Please read this article:
http://appleinsider.com/articles/13/06/13/apple-working-with-logitech-and-moga-for-mfi-game-controllers-details-framework-at-wwdc
Multiple people playing on a single iPhone or even an iPad wouldn't be a very good experience. That is why I still maintain that this is for the TV. And the reason AirPlay doesn't feel the same is because it doesn't work as well. I have all Apple products, including my TimeMachine wireless router and from time to time it lags especially with multiple players. This is just my experience though and maybe with a software update it will be remedied. And I believe that for certain styles of games, buttons and joysticks are superior.
But it’s not an Apple product.
Well, that’s how consoles do it. Four people crowd around one single controller and they…
Oh, wait.
I never said it was an Apple product, although Apple did release these DESIGNS and created the APIs to use them, which means they intend people to use them with their products as an option. And consoles crowd people around a much larger screen, called a TV, with MULTIPLE controllers. A small screen, even with multiple controllers, is a sub-par experience. Again, this is why I would imagine that a standalone controller is truly meant for a larger screen than an iPad.
I just… I don’t get why this was so difficult a conclusion to come to:
Use. Multiple. iDevices. On. The. Same. TV.
That would be nice, if I had enough money to buy four iDevices to connect to a single ATV. A standalone controller would offer a lower-cost solution for this exact reason. Besides, as I said, when friends do come over and we use multiple iDevices on a single ATV over AirPlay, it LAGS!
A dedicated TV gaming platform with a lower cost standalone controller would provide a better overall experience.
Comments
It’s over 15% and rising.
The same policies that… aren’t true, you mean.
Hush.
Yeah, no.
Yeah… no. Again. Just wrong.
“Everything is better”… If you’re an idiot, sure.
lol.
Go crap all over MacRumors. They prefer your drivel to real conversation.