What they're really saying is that we use the phone intensively only part of the time. If you measured those intense-use periods, the battery life might increase 10% (e.g. 5 hrs goes to 5 h 30 of intense use). But once you factor in all the time it just sits in your pocket etc., it's a much lower amount (e.g. 15 hrs goes to 15 h 30 of average use, a 3.3% difference).
edit: actually that's wrong. If it was 10% intense use, then average use would also be 10%.
15h would go to 16.5 hours average use. My logic was flawed (and Apple says it's 2-3%).
Ahh well.
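For what it's worth, the flawed step is easy to see in a couple of lines (using the hypothetical 5 h intense / 15 h average figures from the comment): the mistake was adding a fixed 30 minutes to the average figure instead of scaling it by the same 10%.

```python
# Toy battery-life arithmetic (hypothetical numbers from the comment above).
intense_old, intense_new = 5.0, 5.5      # hours of intense use, +10%
average_old = 15.0                       # hours of average use

flawed_average_new = average_old + (intense_new - intense_old)   # add a fixed 30 min
scaled_average_new = average_old * (intense_new / intense_old)   # scale by the same 10%

print(f"flawed:  {flawed_average_new:.1f} h  (+{(flawed_average_new/average_old - 1)*100:.1f}%)")
print(f"scaled:  {scaled_average_new:.1f} h  (+{(scaled_average_new/average_old - 1)*100:.1f}%)")
# flawed:  15.5 h  (+3.3%)
# scaled:  16.5 h  (+10.0%)
```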
I think you are correct in your first analysis. During normal use, the chips aren't often in the thermal range where the difference in their architecture causes the thermal-dissipation effects that make them diverge in power consumption. That's why Apple is correct to claim the difference is small. It's only when you ramp up the demand on the chips, which is what benchmarks do, that the power consumption begins to diverge and the battery-life issue manifests.
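One way to picture this reasoning is a toy power model (all numbers invented for illustration) in which the two chips draw identical power at light load and only diverge above a benchmark-like load threshold:

```python
# Toy model (all numbers invented): both chips draw the same power at light
# load, and only diverge once load pushes them into the thermal range a
# benchmark lives in.

def power_watts(load, divergence=0.0):
    """Hypothetical power draw at a given load (0..1)."""
    base = 0.3 + 2.0 * load                 # shared baseline
    if load > 0.8:                          # heavy, benchmark-like load
        base *= 1.0 + divergence            # the less efficient chip pays extra
    return base

# A day that is mostly idle/light with a sliver of heavy use: (load, hours)
day = [(0.05, 20.0), (0.4, 3.0), (0.95, 1.5)]

def energy(divergence):
    return sum(power_watts(load, divergence) * hours for load, hours in day)

good, bad = energy(0.0), energy(0.10)       # 10% worse only when hot
print(f"whole-day energy gap: {(bad/good - 1)*100:.1f}%")
# whole-day energy gap: 2.3%
```

The point of the sketch: a 10% gap under sustained heavy load dilutes to a couple of percent over the whole day, because heavy load is only a small slice of it.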
Why isn't everyone taking this opportunity to bash Samsung? It must have caused much frustration when people couldn't bash Samsung's superior LCD panels and SSD performance, so I figured this chip story would give the Samsung haters some much-needed release.
On the backs of those ten thousand "repurposed" employees it's. Just. Too. Sad.
Samsung offered a lower price for the A9 on its 14nm FinFET process. That's why Apple bought half of the A9s from Samsung, even though they knew TSMC's A9 on 16nm FinFET would be better than Samsung's.
The intense use in normal operation is not the same as a benchmark that loops the CPU.
Only maybe raytracing or a similar long-running, non-interactive operation would be comparable.
When you play an intense game, it's generally still not maxing out the CPU.
You don't do a lot of that on a smartphone, unless maybe you're editing 4K clips many times a day on your phone...
Also, even during intense use, like an FPS shooter, you're still possibly not getting into the range where it would make a big difference compared to a benchmark (maybe it's in those situations that you get a 5-10% difference, while the rest of the day there is no difference at all, thus on average you get 2-3%). Those monster GPUs on the iPhone are way overspecced for current or even near-future games, and those games are not CPU-bound.
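As a rough sanity check of that guess (the gaming share and the gaps are invented, and this linear weighting is only an approximation of how battery-life percentages combine), a 5-10% gap that only shows up while gaming averages out like this:

```python
# Weighted-average sketch (all shares invented): a 5-10% battery-life gap
# during gaming, and no gap at all during the rest of the day.

def daily_difference(gaming_share, gaming_gap):
    """Overall gap if the gap only shows up during the gaming share of use."""
    return gaming_share * gaming_gap

for gap in (0.05, 0.10):
    print(f"{gap:.0%} gaming gap, 30% of use gaming -> "
          f"{daily_difference(0.30, gap):.1%} overall")
# 5% gaming gap, 30% of use gaming -> 1.5% overall
# 10% gaming gap, 30% of use gaming -> 3.0% overall
```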
People could just run a popular FPS game on iOS at max settings through a level on a loop and see the result. That would be a more realistic edge case than the benchmarks. I'd expect a 10% difference for an avid gamer.
The funny thing is that initially, when people thought the Samsung chip had the best performance (before the battery numbers), Android idiots were all over that, but now they're probably hiding... Or have done a complete cognitive-dissonance thing where they blame Apple for Samsung's chip not being that great, although the 6s's performance itself is still within range.
Well, my Samsung Note 5 with the 14nm chip gets stellar performance and battery life. Smooth, fast, and it runs cool all day long. Maybe it's not Samsung's issue. Maybe it's some inefficiency with another component inside the Samsung-chip iPhone 6s that the other variant doesn't have?
It's easy to bash Samsung, especially around here. But given that the latest Samsung devices run exceptionally well and efficiently, maybe it's not the chip itself. Just maybe.
Well, my Note 5 runs fast, efficient and cool, and I get awesome battery life, even under heavy use. I have a 14nm chip with the big.LITTLE architecture on board. Couldn't be happier.
The tests did not find any speed difference between the two chips. So it is probably energy leakage, as some poster here said.
A 2-3% battery life increase is still a fair bit.