Reported fourth-gen iPad benchmarks show faster processor, quad-core GPU


Comments

  • Reply 21 of 41
    ssquirrel Posts: 1,196 member

    Quote:

    Originally Posted by Jemster View Post



    I know tech moves on, but this is silly. My 3-year-old iMac i5 is coming up for an upgrade with the new ones this year, but at the same price point, the new i5 isn't going to have double the performance of my current one, and that's after 3 years of progress, not 6 months.

    Short-changed? Yeah... a bit... Live with it? I guess so.


     


    Talk to Intel about that one. Their processors have been improving, but the CPU side has been moving at a much slower pace than the graphics side. Which is probably the right call, considering how shittastic their graphics were for ages.

  • Reply 22 of 41


    Originally Posted by zak_the_great View Post

    Agreed, it does not look good, but looking at how the competition is gaining ground, they probably had no choice but to upgrade the hardware.


     


    It's twice as frigging fast. NO ONE is gaining ground. Could you people just stop LYING for a few days?!

  • Reply 23 of 41

    Quote:

    Originally Posted by EricTheHalfBee View Post



    Let's say you were building your own PC (like many of us do). You have a choice of the following two CPU/GPU combinations:

    1. Performance is 100 for CPU and 100 for GPU.

    2. Performance is 90 for CPU and 200 for GPU.

    Which do you take? Any PC builder with half a brain takes the second one and gives up a slight CPU advantage for a huge GPU advantage.

    The first one is the quad-core GS3, while the second one is the iPhone 5.

    Android users have been sticking to their Geekbench scores like flies on.... The GS3 is slightly faster than the A6 when using Geekbench, but gets trounced in the graphical benchmarks. Now with the Exynos 5 vs the A6X we'll see the same thing. Slightly better CPU, but lagging in GPU. And again we're going to see Android fans talk about Geekbench and ignore GLBench.

    Gotta stick with what works for you, I guess.


    Your post makes no sense, since Apple's UI is so much faster. The S3 loses in every single way: speed, graphics, screen, camera, weight, etc.

  • Reply 24 of 41

    Quote:

    Originally Posted by jragosta View Post





    Yes, a PC builder with half a brain might do what you are suggesting.

    A PC builder with a whole brain would look at their requirements, the costs, expandability, and so on before making an arbitrary decision based solely on one number. Not everyone would benefit from a faster GPU. Not everyone would benefit from a faster CPU, either. It comes down to what you're trying to accomplish and who constitutes your target market.

    Furthermore, with PCs, the scenario you outline is not likely to occur. For the most part, the GPU and CPU of PCs are independent and can be selected independently. Even with integrated GPUs, the two go in lockstep - normally, when you move to a faster CPU, you also get a faster GPU. The tradeoff you're discussing rarely occurs in the PC market.


     


    When someone "builds" their PC, they always have a budget of $xxx, look at the cost of the CPU and GPU, and split that budget accordingly, allocating a certain amount to each purchase. And when they do, they will gladly give up a little in the CPU to get a lot in the GPU.


     


    This is what Apple did when they looked at making an SoC and decided to put their emphasis ($$$) on the GPU side of things. Apple has been doing this for all their SoCs, and they've always had a slight disadvantage in CPU compared to Android counterparts while having a significant advantage in the GPU.


     


    And guess what else Apple always has an advantage in? Lag/smooth operation. The entire user interface is based on graphics. But wait, isn't Windows built on graphics too? Yes, but it's not the same. A Windows program has lots of screen real estate, and the way objects are drawn and represented is different than on a mobile device. On a smartphone, for example, you do a lot of scrolling and pinch/zoom. This requires graphical power, not CPU power. A good comparison would be Aero on Windows - it requires more power than the "normal" user interface. Moving a window while maintaining the entire window contents takes more power than simply moving the "outline" of the window and redrawing the contents when you stop.
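    To put rough numbers on that scrolling point, here is a minimal back-of-the-envelope sketch in Python. The resolution and refresh rate are the Retina iPad's published figures; the overdraw factor is an assumed value chosen purely for illustration, since real compositing workloads vary.

        # Back-of-the-envelope: pixel fills per second needed to keep a
        # full-screen scroll on a Retina iPad smooth at 60 fps. The overdraw
        # factor (layers drawn more than once per frame) is an assumption
        # for illustration only.

        width, height = 2048, 1536   # Retina iPad resolution in pixels
        refresh_rate = 60            # target frames per second
        overdraw = 2.5               # assumed average overdraw per frame

        pixels_per_frame = width * height
        fills_per_second = pixels_per_frame * refresh_rate * overdraw

        print(f"Pixels per frame: {pixels_per_frame:,}")
        print(f"Pixel fills per second at {refresh_rate} fps with "
              f"{overdraw}x overdraw: {fills_per_second / 1e6:.0f} million")

    That works out to hundreds of millions of pixel fills per second for nothing more than a smooth scroll, which is exactly the kind of embarrassingly parallel work a GPU handles well and a CPU does not.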


     


    I always hear the haters saying you don't need GPU power since it's only for games. This is utter BS. It's relied upon heavily to make your device operate smoothly.

  • Reply 25 of 41


    Wait a second.  A 4th generation product is faster than an older generation?  No way.  These have to be fake benchmarks.  

  • Reply 26 of 41


    Seeing as the last benchmarks published on Geekbench showed the A6-equipped, triple-core-GPU iPhone 5 trouncing every other mobile device out there in almost every aspect (bested in a few by the iPad 3 and once by the LG Optimus G), it seems obvious that the iPad 4 is going to push the boundaries even further ahead of everyone for the time being.


     


    Not even the battery-sucking modified Intel Atom 1.6GHz Medfield processor on the XOLO X900 could come close to the iPhone 5, let alone the new fondle-slab.


     


    Back to the drawing board for the competition.
  • Reply 27 of 41
    jnjnjn Posts: 588 member

    Quote:

    Originally Posted by EricTheHalfBee View Post





    Linux is being used/updated by numerous companies too, and it still hasn't done anything to challenge OS X, let alone Windows.


     


    Good point.


    A big difference between the ARM and Linux movement is that ARM isn't 'open source'.


    Another difference is that ARM is a company and Linux isn't.


    My guess is that the GNU (copyleft/right whatever) license is as much a hindrance as the 'openness' is a help.


    But note that Linux (and open source software in general) is a big success on the server side.


     


    J.

  • Reply 28 of 41

    Quote:

    Originally Posted by pedromartins View Post


    Your post makes no sense, since Apple's UI is so much faster. The S3 loses in every single way: speed, graphics, screen, camera, weight, etc.



     


    See my post above.

  • Reply 29 of 41
    jragosta Posts: 10,473 member
    EricTheHalfBee wrote: »
    When someone "builds" their PC, they always have a budget of $xxx, look at the cost of the CPU and GPU, and split that budget accordingly, allocating a certain amount to each purchase. And when they do, they will gladly give up a little in the CPU to get a lot in the GPU.

    You're not getting it.

    SOME people would give up some CPU power to get more GPU power. Others would not. It all depends on your application. There are plenty of applications where GPU performance is useless (and, given its power consumption, could even be negative).

    Don't pretend that everyone has to behave the way you want them to. An intelligent designer considers their needs and performance benefits as part of the equation. No intelligent PC builder would automatically assume that more GPU power makes up for a decrease in CPU power. Sometimes it does, sometimes it doesn't.
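    To make the "sometimes it does, sometimes it doesn't" point concrete, here is a minimal sketch in Python that scores the two hypothetical CPU/GPU configurations from EricTheHalfBee's example against two workload mixes. The weights are made-up assumptions chosen purely for illustration, not measurements.

        # Which config "wins" depends on how the workload splits between CPU
        # and GPU work. Performance numbers are from the example above;
        # workload weights are illustrative assumptions.

        configs = {
            "A (CPU 100 / GPU 100)": {"cpu": 100, "gpu": 100},
            "B (CPU 90 / GPU 200)": {"cpu": 90, "gpu": 200},
        }

        workloads = {
            "GPU-heavy (30% CPU, 70% GPU)": {"cpu": 0.30, "gpu": 0.70},
            "CPU-bound (95% CPU, 5% GPU)": {"cpu": 0.95, "gpu": 0.05},
        }

        for workload_name, weights in workloads.items():
            print(workload_name)
            for config_name, perf in configs.items():
                score = weights["cpu"] * perf["cpu"] + weights["gpu"] * perf["gpu"]
                print(f"  {config_name}: weighted score {score:.1f}")

    With the GPU-heavy weights, config B wins comfortably (167 vs. 100); with the CPU-bound weights, config A comes out ahead (100 vs. 95.5). Same hardware, opposite conclusions.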
  • Reply 30 of 41


    For what it's worth, Ars Technica just posted an interesting comparison of the various phone and tablet processors. http://arstechnica.com/gadgets/2012/10/need-for-speed-just-how-fast-is-googles-new-nexus-4/

  • Reply 31 of 41


    When I play a computer game, I bring down graphical settings as much as necessary to ensure fluid gameplay.


    I want it to "operate smoothly" at all times.


     


    But then I have this one PC gamer friend who always turns up all the video settings to max regardless of how it affects his FPS.


    He insists that he prefers it to look better and happily sacrifices fluid gameplay to that end.


    Weird, right?


     


    Quote:

    Originally Posted by jragosta View Post



    There are plenty of applications where GPU performance is useless (and, given its power consumption, could even be negative).


     


    I'm with you, but it's a big facepalm that you aren't offering any examples here.

  • Reply 32 of 41
    ssquirrel Posts: 1,196 member

    Quote:

    Originally Posted by brutus009 View Post


    I'm with you, but it's a big facepalm that you aren't offering any examples here.



     


    Writing.  Sound production.  Coding.  Just a few.

  • Reply 33 of 41
    jragosta Posts: 10,473 member
    brutus009 wrote: »
    When I play a computer game, I bring down graphical settings as much as necessary to ensure fluid gameplay.
    I want it to "operate smoothly" at all times.

    But then I have this one PC gamer friend who always turns up all the video settings to max regardless of how it affects his FPS.
    He insists that he prefers it to look better and happily sacrifices fluid gameplay to that end.
    Weird, right?

    No one ever denied that some applications require all the GPU horsepower you can throw at them. I was simply arguing against EricTheHalfBee's argument that ALL applications would benefit from more GPU power - even at the expense of less CPU power. That's just not the case.
    brutus009 wrote: »
    I'm with you, but it's a big facepalm that you aren't offering any examples here.

    All sorts of things. Email. Coding. Scientific applications. Sound processing. Office apps.

    For many of these, not only would you pay the price of a 10% reduction in CPU power with no benefit (since they don't use the GPU much), but the doubled GPU power would increase battery consumption, so you'd also have shorter battery life.
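    The battery point is just arithmetic: runtime is battery capacity divided by average power draw, so any extra watts a bigger GPU burns come straight out of the hours. A minimal sketch in Python; every number here is a hypothetical placeholder for illustration, not a measured figure for any real device.

        # Hypothetical illustration of how extra GPU power draw eats battery
        # life. All figures are made-up placeholders, not measurements.

        battery_wh = 42.5            # assumed battery capacity in watt-hours
        base_draw_w = 4.0            # assumed average system draw, smaller GPU
        extra_gpu_draw_w = 1.0       # assumed extra average draw, bigger GPU

        for label, draw_w in [("smaller GPU", base_draw_w),
                              ("bigger GPU", base_draw_w + extra_gpu_draw_w)]:
            print(f"{label}: {battery_wh / draw_w:.1f} hours")

    With these placeholder numbers, one extra watt of average draw costs roughly two hours of runtime.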
  • Reply 34 of 41
    andysol Posts: 2,506 member
    The iPad 4 looks very tempting but I'm still going to hold out for the iPad 5 hoping it has an IGZO display and crazy battery life. I'm really hoping that Apple focuses on reducing the latency for the touch screen so pen input is real-time. Right now, I find all tablets have a slight lag from fast pen inputs.

    Way more than that, the iPad 5 will have a different design, more in line with the iPhone 5 and iPad mini. That style is too gorgeous not to continue throughout the line. The iPad 4 looks like a relic next to the styling of the mini.
  • Reply 35 of 41

    Quote:

    Originally Posted by jragosta View Post





    No one ever denied that some applications require all the GPU horsepower you can throw at them. I was simply arguing against EricTheHalfBee's argument that ALL applications would benefit from more GPU power - even at the expense of less CPU power. That's just not the case.

    All sorts of things. Email. Coding. Scientific applications. Sound processing. Office apps.

    For many of these, not only would you pay the price of a 10% reduction in CPU power with no benefit (since they don't use the GPU much), but the doubled GPU power would increase battery consumption, so you'd also have shorter battery life.


     


    Quote:

    Originally Posted by jragosta View Post





    You're not getting it.

    SOME people would give up some CPU power to get more GPU power. Others would not. It all depends on your application. There are plenty of applications where GPU performance is useless (and, given its power consumption, could even be negative).

    Don't pretend that everyone has to behave the way you want them to. An intelligent designer considers their needs and performance benefits as part of the equation. No intelligent PC builder would automatically assume that more GPU power makes up for a decrease in CPU power. Sometimes it does, sometimes it doesn't.


     


    I get it fine. I also get that, as usual, you like to take one specific point and run with it while completely missing the bigger picture.


     


    First off, I never said that ALL applications would benefit. Why are you twisting what I say (or making things up outright) to try and prove your point? I said "it's relied upon heavily to make your device run smoothly." You want to argue against that point?


     


    Do you know how many of the "examples" you gave actually do use the GPU? You might be interested to know that a GPU isn't just used for displaying graphics on a device. Many applications leverage the math capabilities of a GPU to do other work, and sound processing is one. So are "scientific applications". The iPhone uses the GPU for a lot more than displaying graphics, and you likely use it all the time without even knowing it.
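    As a small illustration of why sound processing maps well to a GPU: most of it is per-sample arithmetic over long buffers, where every output depends on only a handful of inputs. The sketch below uses Python and NumPy on the CPU purely as a stand-in; frameworks like OpenCL or CUDA (or GPU shaders on a mobile SoC) run the same element-wise math across thousands of samples in parallel.

        # One second of a 440 Hz tone, then a per-sample gain and a simple
        # 5-tap moving-average low-pass filter. Each output sample can be
        # computed independently, which is the data-parallel shape of work
        # a GPU is built for. NumPy on the CPU is used here only as a stand-in.

        import numpy as np

        sample_rate = 44_100                        # samples per second
        t = np.arange(sample_rate) / sample_rate    # one second of timestamps
        signal = np.sin(2 * np.pi * 440 * t)        # 440 Hz sine tone

        gained = 0.5 * signal                       # per-sample gain
        smoothed = np.convolve(gained, np.ones(5) / 5, mode="same")

        print(f"Processed {smoothed.size} samples")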


     


    I've been building rigs for 20+ years and follow all the popular modding sites. I believe I'm 100% correct when I say the majority of builders place a significant emphasis on the video card. Not all, but most. What was the last rig you or anyone else on AI built? List your processor, video card, memory, storage and motherboard (the 5 main things you need). I'd be curious to see what decisions people made.


     


     


    Regardless of what a PC builder would do (since it's your opinion against mine), one cold, hard fact remains: Apple themselves have chosen GPU over CPU in all their custom-designed SoCs. And Apple's devices are the smoothest. It's not a coincidence.

  • Reply 36 of 41
    solipsismx Posts: 19,566 member
    We should wonder if we'll see a 5th-gen iPad after the new year. I am not expecting it until at least summer, as I suspect that is the earliest we'll see Rogue 6 ready in quantity. I see no rush to update again now that they've made the move to 32nm.
  • Reply 37 of 41
    jeffdm Posts: 12,954 member
    solipsismx wrote: »
    We should wonder if we'll see a 5th-gen iPad after the new year. I am not expecting it until at least summer, as I suspect that is the earliest we'll see Rogue 6 ready in quantity. I see no rush to update again now that they've made the move to 32nm.

    I wouldn't be surprised if they begin updating all the iDevices in the usual annual Fall update. iPad was the only one left out, until last week.
  • Reply 38 of 41
    jragosta Posts: 10,473 member

    EricTheHalfBee wrote: »
    I get it fine. I also get that, as usual, you like to take one specific point and run with it while completely missing the bigger picture.

    First off, I never said that ALL applications would benefit. Why are you twisting what I say (or making things up outright) to try and prove your point? I said "it's relied upon heavily to make your device run smoothly." You want to argue against that point?

    Do you know how many of the "examples" you gave actually do use the GPU? You might be interested to know that a GPU isn't just used for displaying graphics on a device. Many applications leverage the math capabilities of a GPU to do other work, and sound processing is one. So are "scientific applications". The iPhone uses the GPU for a lot more than displaying graphics, and you likely use it all the time without even knowing it.

    I've been building rigs for 20+ years and follow all the popular modding sites. I believe I'm 100% correct when I say the majority of builders place a significant emphasis on the video card. Not all, but most. What was the last rig you or anyone else on AI built? List your processor, video card, memory, storage and motherboard (the 5 main things you need). I'd be curious to see what decisions people made.


    Regardless of what a PC builder would do (since it's your opinion against mine), one cold, hard fact remains: Apple themselves have chosen GPU over CPU in all their custom-designed SoCs. And Apple's devices are the smoothest. It's not a coincidence.

    What you said was that ANY PC builder would make the choice to go with a faster GPU even if they have to give up CPU power. You were wrong. Look at the people who make servers. People who make blades. People who make computers for scientific computing (although that is changing slowly with some science apps using the GPU).

    The fact is that you were just plain wrong. It's not a no-brainer and not something that everyone would do. If it were, no one would ever make a PC with integrated graphics. In the mobile space, EVERYONE would be using quad-core GPUs like Apple does.

    Yes, Apple did it. And I already admitted that it makes sense sometimes and for some applications. I was simply objecting to your simplistic blanket statement that all PC vendors would make that same choice, without even considering application, cost, etc. Even a cursory glance at the PC (and mobile) market shows that you are wrong.
  • Reply 39 of 41
    solipsismx Posts: 19,566 member
    jeffdm wrote: »
    I wouldn't be surprised if they begin updating all the iDevices in the usual annual Fall update. iPad was the only one left out, until last week.

    That would seem to leave a hole in the Spring. A new product category coming next year?
  • Reply 40 of 41
    jeffdm Posts: 12,954 member
    solipsismx wrote: »
    That would seem to leave a hole in the Spring. A new product category coming next year?

    Given how heavily iDevice sales are biased toward the fall quarter, it might not really be a problem. It probably makes their software development cycle easier; I don't know how they manage their manufacturing.

    There's always room for an AppleTV or Mac update any time for the rest of the year.