AppleInsider › Forums › Mobile › iPad › Reported fourth-gen iPad benchmarks show faster processor, quad-core GPU

Reported fourth-gen iPad benchmarks show faster processor, quad-core GPU

post #1 of 41
Thread Starter 
A post from Primate Labs, developer of the Geekbench benchmarking tool, reveals what are believed to be the first scores from Apple's newest 9.7-inch iPad, showing that the tablet's A6X processor more than doubles the composite score of the third-generation iPad.

iPad Benchmarks
Reported fourth-generation iPad benchmarks. | Source: Geekbench


According to Primate Labs' John Poole (via MacRumors), a device with the identifier "iPad3,4" logged a benchmark test on Sunday, with the device boasting an A6X chip clocked at 1.39GHz paired with 1GB of memory. This is similar to the die architecture found in the iPhone 5's A6 SoC, but that processor runs at a lower 1.3GHz to conserve energy.

Besides the higher clock speed, the A6X appears to be leveraging quad-core graphics, compared to the triple-core configuration seen in the iPhone 5, to power the high-resolution Retina display.

Benchmark Comparison


Overall, the fourth-generation iPad achieved a Geekbench score of 1757, outperforming the iPhone 5 by ten percent and more than doubling the scores seen with the third-generation iPad and iPad 2.
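For what it's worth, the reported figures hang together arithmetically. The third-gen score below is a purely hypothetical stand-in, since the article gives only the ratio:

```python
# Quick arithmetic check on the reported composites. The iPad 4 score
# is from the article; the third-gen iPad figure (~750) is a
# hypothetical stand-in for illustration only.
ipad4 = 1757
iphone5 = ipad4 / 1.10   # "outperforming the iPhone 5 by ten percent"
ipad3 = 750              # hypothetical, for illustration only

print(round(iphone5))    # 1597
print(ipad4 / ipad3 > 2) # True: "more than doubling"
```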

Apple unveiled the refreshed 9.7-inch tablet at a special event last week that also saw the debut of the 7.9-inch smaller iPad mini.
post #2 of 41
Here I was loving the speed and fluidity of my 5 and I discover my pre-ordered iPad 4 will put it to shame?

God bless Apple's marketing department for making these devices seem so quick to us all!
If you want to make enemies, try to change something.
post #3 of 41

"...showing that the performance of the tablet's A6X processor more than doubles the composite score of the third generation iPad." Just as Tim Cook predicted.

 

 


Edited by AnalogJack - 10/30/12 at 1:18am
post #4 of 41

Seriously: still caring about power statistics?

I thought these days were long over.

 

At least for anything not being a PC. Sorry, but I fail to see the consumer benefit of these statistics. As long as it does what it needs to do and offers new possibilities for developers, who cares about how much more power it has?

post #5 of 41
@pinolo - as you say, it all depends on doing what it needs to do.

You have to remember that these days there are a lot of people who buy an iPad to play games on. It has become a serious mobile gaming platform. As soon as the new possibilities are realised by the developers, users of the older iPads will start to find more and more of the new titles to be either unavailable or perform poorly on their hardware.

When this happens only 6 months after release, it's a great way to anger a proportion of the current customer base. To add to these woes, Apple's refurb store is now selling the iPad I bought 6 months ago at a substantial discount, causing its second-hand value to be heavily eroded. Bang goes the low(ish) cost upgrade path.

Just another perspective - Yes, it's a bit "PC", no, I don't expect everyone to agree with me, but there is a proportion of the user base that is slightly rubbed the wrong way by this behaviour.

I know tech moves on, but this is silly. My 3 year old iMac i5 is coming up for an upgrade with the new ones this year, but at the same price point, the new i5 isn't going to have double the performance of my current one, and that's after 3 years of progress, not 6 months.

Short changed? Yeah... a bit... Live with it? I guess so.
post #6 of 41

there is a LOT of development around ARM.

 

x86/IA-64 Intel is stagnating. Intel is only one company, but ARM processors are designed and made by more than 20 companies. It's that huge. Of course we will see fast progress here, and not so much on Intel's x86/IA-64.

post #7 of 41
Quote:
Originally Posted by Jemster View Post

@pinolo - as you say, it all depends on doing what it needs to do.
You have to remember that these days there are a lot of people who buy an iPad to play games on. It has become a serious mobile gaming platform. As soon as the new possibilities are realised by the developers, users of the older iPads will start to find more and more of the new titles to be either unavailable or perform poorly on their hardware.
When this happens only 6 months after release, it's a great way to anger a proportion of the current customer base. To add to these woes, Apple's refurb store is now selling the iPad I bought 6 months ago at a substantial discount, causing its second-hand value to be heavily eroded. Bang goes the low(ish) cost upgrade path.
Just another perspective - Yes, it's a bit "PC", no, I don't expect everyone to agree with me, but there is a proportion of the user base that is slightly rubbed the wrong way by this behaviour.
I know tech moves on, but this is silly. My 3 year old iMac i5 is coming up for an upgrade with the new ones this year, but at the same price point, the new i5 isn't going to have double the performance of my current one, and that's after 3 years of progress, not 6 months.
Short changed? Yeah... a bit... Live with it? I guess so.

The point is, would you buy the discounted old iPad 3 when a new iPad 4 that's more than twice as fast is available? I wouldn't. And in that case you're not short-changed. You just have slower hardware.

And slow only in comparison, because in an absolute sense the iPad 3 is a 'beast', and you will be able to play the new games because game engines are scalable and only some games push the iPad to the max.

The reason performance doubled in such a short time is that the feature size of the current chips is 32nm instead of 45nm. The iPad 3 has the old technology, simply because chips with the new feature size were not ready at the time.

Why did Apple release the iPad 3? I don't know for sure, of course, but it seems likely that a Retina display iPad was a must for Apple, because it would have been too late if they had waited half a year. That seems about right, since several other tablets have high-resolution displays now.

I bought the iPad 3 because of its Retina display, and that's still the best choice possible.
It's an incredible display.

J.
post #8 of 41

All well and good, but how does it stack up against those quad-core Tegras?  It's claimed that Apple's hardware is weak when compared to the mighty Android quad-core tablets and that Apple is falling far behind the general processor curve.

post #9 of 41
Apple apparently is taking a parallel track to that of Android devices. Regular customers may not care about this, but it matters a lot within the development fold. The part about "as long as it does what it needs to" and "offers new possibilities to developers" stems directly from such statistics.
post #10 of 41
Quote:
Originally Posted by Constable Odo View Post

All well and good, but how does it stack up against those quad-core Tegras?  It's claimed that Apple's hardware is weak when compared to the mighty Android quad-core tablets and that Apple is falling far behind the general processor curve.

I see just the opposite with claims that Apple's total package trounces anything offered by competitors.

melior diabolus quem scies
post #11 of 41
Quote:
Originally Posted by pinolo View Post

Seriously: still caring about power statistics?


I thought these days were long over.

At least for anything not being a PC. Sorry, but I fail to see the consumer benefit of these statistics. As long as it does what it needs to do and offers new possibilities for developers, who cares about how much more power it has?

The consumer benefit is not having to deal with the general lagginess of Android tablets.

Furthermore, as iOS develops, there are more and more games and other apps that require high performance.
Quote:
Originally Posted by Constable Odo View Post

All well and good, but how does it stack up against those quad-core Tegras?  It's claimed that Apple's hardware is weak when compared to the mighty Android quad-core tablets and that Apple is falling far behind the general processor curve.

Who claims that? Please cite a reputable source that says that Apple's product performance falls significantly behind any Android product.

Not just arbitrary specs (i.e., number of cores), but actual performance. Apple generally wins.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #12 of 41

The quad-core Tegra 3 is slower than Apple's A5... yes, the one in the iPad mini. 

 

post #13 of 41
Why is the article written as if the quad-core graphics is a surprise? The iPad "3" had quad-core graphics too, didn't it? That tidbit wasn't mentioned in the article.
post #14 of 41
Quote:
Originally Posted by Gatorguy View Post

Quote:
Originally Posted by Constable Odo View Post

All well and good, but how does it stack up against those quad-core Tegras?  It's claimed that Apple's hardware is weak when compared to the mighty Android quad-core tablets and that Apple is falling far behind the general processor curve.
I see just the opposite with claims that Apple's total package trounces anything offered by competitors.
I read Odo as being sarcastic...
It's the heat death of the universe, my friends.
post #15 of 41

If this Geekbench score doubles again, it will be as fast as my 2008 MacBook. When that happens I'll be interested in tablets more than ever. I'll buy one as an e-reader now, but for work capability such as video editing I'll wait for a more capable tablet.
 

post #16 of 41
Quote:
Originally Posted by oomu View Post

there is a LOT of development around ARM.

x86/IA-64 Intel is stagnating. Intel is only one company, but ARM processors are designed and made by more than 20 companies. It's that huge. Of course we will see fast progress here, and not so much on Intel's x86/IA-64.
Linux is being used/updated by numerous companies too, and it still hasn't done anything to challenge OS X, let alone Windows.
post #17 of 41
Let's say you were building your own PC (like many of us do). You have a choice of the following two CPU/GPU combinations:

1. Performance is 100 for CPU and 100 for GPU.
2. Performance is 90 for CPU and 200 for GPU.

Which do you take? Any PC builder with half a brain takes the second one and gives up a slight CPU advantage for a huge GPU advantage.

The first one is the quad core GS 3, while the second one is the iPhone 5.

Android users have been sticking to their Geekbench scores like flies on.... The GS3 is slightly faster than the A6 when using Geekbench, but gets trounced in the graphical benchmarks. Now with the Exynos 5 vs the A6X we'll see the same thing. Slightly better CPU, but lagging in GPU. And again we're going to see Android fans talk about Geekbench and ignore GLBench.

Gotta stick with what works for you, I guess.
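That 100/100-versus-90/200 choice can be made concrete with a toy composite score. The 50/50 weighting below is an illustrative assumption, not how Geekbench or any graphics benchmark actually weights anything:

```python
# Toy composite of normalized CPU and GPU scores; the equal weighting
# is an assumption for illustration, not any real benchmark's formula.
def composite(cpu, gpu, cpu_weight=0.5):
    return cpu_weight * cpu + (1 - cpu_weight) * gpu

balanced = composite(100, 100)   # option 1: 100 CPU / 100 GPU
gpu_heavy = composite(90, 200)   # option 2: 90 CPU / 200 GPU

print(balanced, gpu_heavy)       # 100.0 145.0
```

Shift cpu_weight toward 1.0 and option 1 wins; shift it toward the GPU and option 2 runs away with it, which is the whole argument in miniature.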
post #18 of 41
Quote:
Originally Posted by EricTheHalfBee View Post

Let's say you were building your own PC (like many of us do). You have a choice of the following two CPU/GPU combinations:
1. Performance is 100 for CPU and 100 for GPU.
2. Performance is 90 for CPU and 200 for GPU.
Which do you take? Any PC builder with half a brain takes the second one and gives up a slight CPU advantage for a huge GPU advantage.

Yes, a PC builder with half a brain might do what you are suggesting.

A PC builder with a whole brain would look at their requirements, the costs, expandability, and so on before making an arbitrary decision based solely on one number. Not everyone would benefit from a faster GPU. Not everyone would benefit from a faster CPU, either. It comes down to what you're trying to accomplish and who constitutes your target market.

Furthermore, with PCs, the scenario you outline is not likely to occur. For the most part, the GPU and CPU of PCs are independent and can be selected independently. Even with integrated GPUs, the two go in lockstep - normally, when you move to a faster CPU, you also get a faster GPU. The tradeoff you're discussing rarely occurs in the PC market.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #19 of 41
Quote:
Originally Posted by EricTheHalfBee View Post

Let's say you were building your own PC (like many of us do). You have a choice of the following two CPU/GPU combinations:
1. Performance is 100 for CPU and 100 for GPU.
2. Performance is 90 for CPU and 200 for GPU.
Which do you take? Any PC builder with half a brain takes the second one and gives up a slight CPU advantage for a huge GPU advantage.
The first one is the quad core GS 3, while the second one is the iPhone 5.
Android users have been sticking to their Geekbench scores like flies on.... The GS3 is slightly faster than the A6 when using Geekbench, but gets trounced in the graphical benchmarks. Now with the Exynos 5 vs the A6X we'll see the same thing. Slightly better CPU, but lagging in GPU. And again we're going to see Android fans talk about Geekbench and ignore GLBench.
Gotta stick with what works for you, I guess.

Has the GPU been benchmarked yet on the dual-core Exynos 5? You mention it's lagging, but I haven't seen any scores yet myself. Link?


Edited by Gatorguy - 10/30/12 at 6:47am
melior diabolus quem scies
post #20 of 41
Agreed, it does not look good, but looking at how the competition is gaining ground, they probably had no choice but to upgrade the hardware. It also allows them to position the iPad mini better.
post #21 of 41
Quote:
Originally Posted by Jemster View Post

I know tech moves on, but this is silly. My 3 year old iMac i5 is coming up for an upgrade with the new ones this year, but at the same price point, the new i5 isn't going to have double the performance of my current one, and that's after 3 years of progress, not 6 months.
Short changed? Yeah... a bit... Live with it? I guess so.

 

Talk to Intel about that one. Their processors have been improving, but the CPU side has been moving at a much slower pace than the graphics. Which is probably the right call, considering how shittastic their graphics were for ages.

post #22 of 41
Originally Posted by zak_the_great View Post
Agreed, it does not look good, but looking at how the competition is gaining ground, they probably had no choice but to upgrade the hardware.

 

It's twice as frigging fast. NO ONE is gaining ground. Could you people just stop LYING for a few days?!

post #23 of 41
Quote:
Originally Posted by EricTheHalfBee View Post

Let's say you were building your own PC (like many of us do). You have a choice of the following two CPU/GPU combinations:
1. Performance is 100 for CPU and 100 for GPU.
2. Performance is 90 for CPU and 200 for GPU.
Which do you take? Any PC builder with half a brain takes the second one and gives up a slight CPU advantage for a huge GPU advantage.
The first one is the quad core GS 3, while the second one is the iPhone 5.
Android users have been sticking to their Geekbench scores like flies on.... The GS3 is slightly faster than the A6 when using Geekbench, but gets trounced in the graphical benchmarks. Now with the Exynos 5 vs the A6X we'll see the same thing. Slightly better CPU, but lagging in GPU. And again we're going to see Android fans talk about Geekbench and ignore GLBench.
Gotta stick with what works for you, I guess.

Your post makes no sense, since Apple's UI is so much faster. The S3 loses in every single way: speed, graphics, screen, camera, weight, etc.

post #24 of 41
Quote:
Originally Posted by jragosta View Post


Yes, a PC builder with half a brain might do what you are suggesting.
A PC builder with a whole brain would look at their requirements, the costs, expandability, and so on before making an arbitrary decision based solely on one number. Not everyone would benefit from a faster GPU. Not everyone would benefit from a faster CPU, either. It comes down to what you're trying to accomplish and who constitutes your target market.
Furthermore, with PCs, the scenario you outline is not likely to occur. For the most part, the GPU and CPU of PCs are independent and can be selected independently. Even with integrated GPUs, the two go in lockstep - normally, when you move to a faster CPU, you also get a faster GPU. The tradeoff you're discussing rarely occurs in the PC market.

 

When someone "builds" their PC they always have $xxx and look at the cost of the CPU/GPU and split their budget up accordingly, allocating a certain amount to each purchase. And when they do they will gladly give up a little in the CPU to get a lot in the GPU.

 

This is what Apple did when they looked at making an SoC and decided to put their emphasis ($$$) on the GPU side of things. Apple has done this for all their SoCs, and they've always had a slight disadvantage in CPU compared to their Android counterparts while having a significant advantage in the GPU.

 

And guess what else Apple always has an advantage in? Lag/smooth operation. The entire user interface is based on graphics. But wait, isn't Windows built on graphics too? Yes, but it's not the same. A Windows program has lots of screen real estate, and the way objects are drawn and represented is different from a mobile device. On a smartphone, for example, you do a lot of scrolling and pinch/zoom. This requires graphical power, not CPU power. A good comparison would be Aero on Windows - it requires more power than the "normal" user interface. Moving a window while maintaining the entire window contents takes more power than simply moving the "outline" of the window and redrawing the contents when you stop.

 

I always hear the haters saying you don't need GPU power since it's only for games. This is utter BS. It's relied upon heavily to make your device operate smoothly.

post #25 of 41

Wait a second.  A 4th generation product is faster than an older generation?  No way.  These have to be fake benchmarks.  

post #26 of 41

Seeing as the last benchmarks published by Geekbench showed the A6-equipped triple-core GPU iPhone 5 trouncing every other mobile device out there in almost every aspect but a few where it was bested by the iPad 3 and once by the LG Optimus G, it seems obvious that the iPad 4 is going to be pushing the boundaries out even further ahead of everyone for the time being.

 

Not even the battery-sucking modified Intel Atom 1.6GHz Medfield processor on the XOLO X900 could come close to the iPhone 5, let alone the new fondle-slab.

 

Back to the drawing board for the competition.

 
 
 
 
 
 
post #27 of 41
Quote:
Originally Posted by EricTheHalfBee View Post


Linux is being used/updated by numerous companies too, and it still hasn't done anything to challenge OS X, let alone Windows.

 

Good point.

A big difference between the ARM and Linux movement is that ARM isn't 'open source'.

Another difference is that ARM is a company and Linux isn't.

My guess is that the GNU (copyleft/right whatever) license is as much a hindrance as the 'openness' is a help.

But note that Linux (and open source software in general) is a big success on the server side.

 

J.

post #28 of 41
Quote:
Originally Posted by pedromartins View Post

Your post makes no sense, since Apple's UI is so much faster. The S3 loses in every single way: speed, graphics, screen, camera, weight, etc.

 

See my post above.

post #29 of 41
Quote:
Originally Posted by EricTheHalfBee View Post

When someone "builds" their PC they always have $xxx and look at the cost of the CPU/GPU and split their budget up accordingly, allocating a certain amount to each purchase. And when they do they will gladly give up a little in the CPU to get a lot in the GPU.

You're not getting it.

SOME people would give up some CPU power to get more GPU power. Others would not. It all depends on your application. There are plenty of applications where GPU performance is useless (and, given its power consumption, could even be negative).

Don't pretend that everyone has to behave the way you want them to. An intelligent designer considers their needs and performance benefits as part of the equation. No intelligent PC builder would automatically assume that more GPU power makes up for a decrease in CPU power. Sometimes it does, sometimes it doesn't.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #30 of 41

For what it's worth, Ars Technica just posted an interesting comparison of the various phone and tablet processors. http://arstechnica.com/gadgets/2012/10/need-for-speed-just-how-fast-is-googles-new-nexus-4/

post #31 of 41

When I play a computer game, I bring down graphical settings as much as necessary to ensure fluid gameplay.

I want it to "operate smoothly" at all times.

 

But then I have this one PC gamer friend who always turns up all the video settings to max regardless of how it affects his FPS.

He insists that he prefers it to look better and happily sacrifices fluid gameplay to that end.

Weird, right?

 

Quote:
Originally Posted by jragosta View Post

There are plenty of applications where GPU performance is useless (and, given its power consumption, could even be negative).

 

I'm with you, but it's a big facepalm that you aren't offering any examples here.

post #32 of 41
Quote:
Originally Posted by brutus009 View Post

I'm with you, but it's a big facepalm that you aren't offering any examples here.

 

Writing.  Sound production.  Coding.  Just a few.

post #33 of 41
Quote:
Originally Posted by brutus009 View Post

When I play a computer game, I bring down graphical settings as much as necessary to ensure fluid gameplay.
I want it to "operate smoothly" at all times.

But then I have this one PC gamer friend who always turns up all the video settings to max regardless of how it affects his FPS.
He insists that he prefers it to look better and happily sacrifices fluid gameplay to that end.
Weird, right?

No one ever denied that some applications require all the GPU horsepower you can throw at them. I was simply arguing against EricTheHalfBee's argument that ALL applications would benefit from more GPU power - even at the expense of less CPU power. That's just not the case.
Quote:
Originally Posted by brutus009 View Post

I'm with you, but it's a big facepalm that you aren't offering any examples here.

All sorts of things. Email. Coding. Scientific applications. Sound processing. Office apps.

For many of these, not only would you pay the price of a 10% reduction in CPU power with no benefit (since they don't use the GPU much), but the doubled GPU power would increase battery consumption, so you'd also have shorter battery life.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #34 of 41
Quote:
Originally Posted by GadgetCanada View Post

The iPad 4 looks very tempting but I'm still going to hold out for the iPad 5 hoping it has an IGZO display and crazy battery life. I'm really hoping that Apple focuses on reducing the latency for the touch screen so pen input is real-time. Right now, I find all tablets have a slight lag from fast pen inputs.

Way more than that: the iPad 5 will be a different design, more in line with the iPhone 5 and iPad mini. That style is too gorgeous not to continue throughout the line. The iPad 4 looks like a relic next to the styling of the mini.

2014 27" Retina iMac i5, 2012 27" iMac i7, 2011 Mac Mini i5
iPad Air 2, iPad Mini 2, iPhone 6 Plus, iPhone 6, iPod Touch 5
Time Capsule 5, (3) AirPort Express 2, (2) Apple TV 3

post #35 of 41
Quote:
Originally Posted by jragosta View Post


No one ever denied that some applications require all the GPU horsepower you can throw at them. I was simply arguing against EricTheHalfBee's argument that ALL applications would benefit from more GPU power - even at the expense of less CPU power. That's just not the case.
All sorts of things. Email. Coding. Scientific applications. Sound processing. Office apps.
For many of these, not only would you pay the price of a 10% reduction in CPU power with no benefit (since they don't use the GPU much), but the doubled GPU power would increase battery consumption, so you'd also have shorter battery life.

 

Quote:
Originally Posted by jragosta View Post


You're not getting it.
SOME people would give up some CPU power to get more GPU power. Others would not. It all depends on your application. There are plenty of applications where GPU performance is useless (and, given its power consumption, could even be negative).
Don't pretend that everyone has to behave the way you want them to. An intelligent designer considers their needs and performance benefits as part of the equation. No intelligent PC builder would automatically assume that more GPU power makes up for a decrease in CPU power. Sometimes it does, sometimes it doesn't.

 

I get it fine. I also get, like usual, you like to take one specific point and run with it while completely missing the bigger picture.

 

First off, I never said that ALL applications would benefit. Why are you twisting what I say (or making things up outright) to try and prove your point? I said "it's relied upon heavily to make your device run smoothly." You want to argue against that point?

 

Do you know how many of the "examples" you gave actually do use the GPU? You might be interested to know that a GPU isn't just used for displaying graphics on a device. Many applications leverage the math capabilities of a GPU to do other work, and sound processing is one. So are "scientific applications". The iPhone uses the GPU for a lot more than displaying graphics, and you likely use it all the time without even knowing it.
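To illustrate the kind of non-graphics math being described: applying a fade-out to audio samples is one independent multiply per sample, exactly the data-parallel shape a GPU accelerates. Plain Python stands in for the GPU kernel here; the workload shape is the point, not the execution speed:

```python
import math

# One multiply per sample, no dependency between samples: the shape of
# work that maps naturally onto thousands of GPU threads. (A real
# implementation would dispatch this as a compute kernel; plain Python
# is used here only to show the structure.)
n = 10_000
samples = [math.sin(2 * math.pi * 440 * i / 44_100) for i in range(n)]
gains = [1.0 - i / n for i in range(n)]            # linear fade-out
faded = [s * g for s, g in zip(samples, gains)]    # independent per-sample op

print(len(faded))   # 10000
```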

 

I've been building rigs for 20+ years and follow all the popular modding sites. I believe I'm 100% correct when I say the majority of builders place a significant emphasis on the video card. Not all, but most. What was the last rig you or anyone else on AI built? List your processor, video card, memory, storage and motherboard (the 5 main things you need). I'd be curious to see what decisions people made.

 

 

Regardless of what a PC builder would do (since it's your opinion against mine), one cold, hard fact remains: Apple themselves have chosen GPU over CPU in all their custom-designed SoCs. And Apple's devices are the smoothest. It's not a coincidence.

post #36 of 41
We should wonder if we'll see a 5th-gen iPad after the new year. I'm not expecting it until at least summer, as I suspect that's the earliest we'll see Rogue 6 ready in quantity. I see no strain in updating again now that they've made the move to 32nm.

This bot has been removed from circulation due to a malfunctioning morality chip.

post #37 of 41
Quote:
Originally Posted by SolipsismX View Post

We should wonder if we'll see a 5th-gen iPad after the new year. I'm not expecting it until at least summer, as I suspect that's the earliest we'll see Rogue 6 ready in quantity. I see no strain in updating again now that they've made the move to 32nm.

I wouldn't be surprised if they begin updating all the iDevices in the usual annual Fall update. iPad was the only one left out, until last week.
post #38 of 41
Quote:
Originally Posted by EricTheHalfBee View Post


I get it fine. I also get, like usual, you like to take one specific point and run with it while completely missing the bigger picture.

First off, I never said that ALL applications would benefit. Why are you twisting what I say (or making things up outright) to try and prove your point? I said "it's relied upon heavily to make your device run smoothly." You want to argue against that point?

Do you know how many of the "examples" you gave actually do use the GPU? You might be interested to know that a GPU isn't just used for displaying graphics on a device. Many applications leverage the math capabilities of a GPU to do other work, and sound processing is one. So are "scientific applications". The iPhone uses the GPU for a lot more than displaying graphics, and you likely use it all the time without even knowing it.

I've been building rigs for 20+ years and follow all the popular modding sites. I believe I'm 100% correct when I say the majority of builders place a significant emphasis on the video card. Not all, but most. What was the last rig you or anyone else on AI built? List your processor, video card, memory, storage and motherboard (the 5 main things you need). I'd be curious to see what decisions people made.


Regardless of what a PC builder would do (since it's your opinion against mine), one cold, hard fact remains: Apple themselves have chosen GPU over CPU in all their custom-designed SoCs. And Apple's devices are the smoothest. It's not a coincidence.

What you said was that ANY PC builder would make the choice to go with a faster GPU even if they have to give up CPU power. You were wrong. Look at the people who make servers. People who make blades. People who make computers for scientific computing (although that is changing slowly with some science apps using the GPU).

The fact is that you were just plain wrong. It's not a no-brainer and not something that everyone would do. If it were, no one would ever make a PC with integrated graphics. In the mobile space, EVERYONE would be using quad GPUs like Apple.

Yes, Apple did it. And I already admitted that it makes sense some times and for some applications. I was simply objecting to your simplistic blanket statement that all PC vendors would make that same choice - without even considering application, cost, etc. Even a cursory glance at the PC (and mobile) market shows that you are wrong.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #39 of 41
Quote:
Originally Posted by JeffDM View Post

I wouldn't be surprised if they begin updating all the iDevices in the usual annual Fall update. iPad was the only one left out, until last week.

That would seem to leave a hole in the Spring. A new product category coming next year?
Edited by SolipsismX - 10/30/12 at 4:51pm

This bot has been removed from circulation due to a malfunctioning morality chip.

Reply

This bot has been removed from circulation due to a malfunctioning morality chip.

Reply
post #40 of 41
Quote:
Originally Posted by SolipsismX View Post

That would seem to leave a hole in the Spring. A new product category coming next year?

Given how heavily iDevice sales are biased toward the fall quarter, it might not really be a problem. It probably makes their software development cycle easier; I don't know how they manage their manufacturing.

There's always room for an AppleTV or Mac update any time for the rest of the year.
Edited by JeffDM - 10/31/12 at 9:04am