iPad 3 - SGX 543MP2 or 600 series?


Comments

  • Reply 21 of 33
wizard69 Posts: 13,377, member
    Quote:
    Originally Posted by yow! View Post


    I think that sums it up: can Apple be so far ahead of the industry, not just in design, concept etc, but even in designing the chips themselves? I gave some factors above supporting this; but their past performance is a guide: they had a new SoC for iPad 1, and a year later, another new SoC for iPad 2.



It would be a very pleasant shock if they delivered any sort of SoC with a 600-series GPU. It would put them way out in front of the competition: perhaps six months ahead silicon-wise and a year ahead software-wise.



If they can't, I'm not terribly worried, as Apple should be able to tweak the A5 into something powerful enough to drive the iPad 3.

    Quote:

Now, that's just two data points, and there could be other factors (e.g. maybe both chips were in the pipeline for years). Crucially, and this is the point, I haven't compared that with the industry; maybe everyone else managed that too.



Well, an A5 tweaked to do the iPad 3 is still a major new chip in my mind, even if it is mostly a die shrink with a beefed-up GPU. A5X sounds more like an iPhone chip than anything, though.

    Quote:

    The A15 seems a significant upgrade, at least doubling the performance without doubling power consumption. ARM know what they are doing. Can you expand on why you think that?



Mostly because Apple can get close to 4x improvements out of the Cortex-A9 cores it is currently using. They can do that by adding two cores and doubling the clock rate; demos of Cortex-A9 cores running at 2 GHz were made years ago. Notably, this can be implemented without a lot of reworking of the system software. The A15 is targeted more at the server market anyway.
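To put rough numbers on that claim, here's a back-of-the-envelope sketch (the clock figures are assumptions, and real workloads never scale perfectly linearly with cores and clock):

```python
# Back-of-the-envelope CPU scaling: assumes performance scales linearly
# with core count and clock rate, which real workloads only approximate.
cores_now, clock_now_ghz = 2, 1.0   # A5 today: dual Cortex-A9 at ~1 GHz (assumed)
cores_new, clock_new_ghz = 4, 2.0   # hypothetical quad Cortex-A9 at 2 GHz

speedup = (cores_new / cores_now) * (clock_new_ghz / clock_now_ghz)
print(f"Theoretical peak speedup: {speedup:.0f}x")  # -> 4x
```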



By the time Apple needs a new core, ARM might have a 64-bit implementation going. If Apple were to make a major change to the system architecture, this would be a better path to follow.

    Quote:

I do agree that a CPU upgrade isn't crucial for the iPad (for this generation); but a GPU upgrade absolutely is. This is true for the 4x pixels, but also in general.



    I'm not sure if crucial is the word or not. I just don't think it is a big deal as they can get faster cores from a simple process shrink while enhancing the GPU. In effect the CPU improvements come for free with a faster GPU.

    Quote:

    Agreed. And the "A5X", and dual-track rumour etc all support this. Plus, just a retina display is enough to wow everyone. I was musing that it might be workable if the risks could be self-contained within the SoC group (and it has the massive resources of Apple behind it) - but risks like "delays" can't be self-contained.



I suspect that just getting enough displays will be a major consideration for the iPad 3 introduction, so I can see where an A5X would be a safe bet. Still, it would be a very pleasant surprise to see an A6-class processor in the iPad.

    Quote:

Hey, it wasn't meant as an insult. I was speaking in sympathy with them: to modify an existing design when there's a new architecture available that solves all the problems you're facing... that would be frustrating to me.



I just had this image of an Apple engineer sitting at a bar talking to somebody who is calling his processor half-hacked. I can see a stein of beer being poured over somebody's head because of that.

    Quote:

As a developer, I find that at some point, it's easier to start fresh than modify the existing design. Anyway, that's how I'd experience it; maybe they have a different attitude, and/or the technical issues are interesting in themselves and can inform and be used in the next architecture. Maybe I should keep my mouth shut instead of offending people.



No, keep talking; it is good to have a reasoned conversation about something that will be here in about six days.

    Quote:

    Wouldn't it be amazing if Apple was ahead of everybody in every aspect? I think it's possible but unlikely; but most of all, it's not needed for this generation of the iPad.



Not needed? Personally I will take all the improvement they can throw at the thing. Seriously, the iPad impresses me almost daily, but at times it is performance-limited to say the least. So yeah, more power is welcome. More importantly, I want the machine to have far more RAM. If that happens, the iPad could effectively replace my MBP for almost all of my portable needs.

    Quote:

And I think that's one of the secrets of Apple: they *have* the technical chops, but they only use them in service of a user outcome, instead of as an end in themselves. That's tricky. Most companies seem to go one way or the other (i.e. all technical, or all user outcome).



Apple has a long history of engineering triumphs; they are just understated about celebrating them. Even though I don't like the machine, the iMac is a good example of an engineering success. Many don't see it that way, but the whole series of iMacs was an impressive display of thinking with an open mind.
  • Reply 22 of 33
wizard69 Posts: 13,377, member
    Quote:
    Originally Posted by tipoo View Post


Even today's consoles have different advantages, though: bandwidth is a big problem on these smartphone SoCs, as is storage, not to mention controls.



A good point, but there are ways for Apple to address bandwidth issues. For example, a process shrink could free up space for a much larger cache, or they could implement a frame buffer on chip. The question is which would be more valuable: two more ARM cores, or a 32 or 64 MB buffer or RAM array on chip?



I realize that the processes used to build processors don't lead to efficient RAM arrays, but the idea here is that if one can keep the GPU from going off chip as much as possible, you not only gain in speed but save a considerable amount of power. I actually believe the power savings would be very significant, as the chip would otherwise be moving a great deal of data over the memory bus just to drive the display.
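To put numbers on that bus traffic (a rough sketch; the 2048x1536 panel, 32-bit color, and 60 Hz refresh are all assumptions about the rumored display):

```python
# Rough estimate of memory traffic just to scan out a retina-class display.
# Assumed figures: 2048x1536 panel, 4 bytes per pixel (32-bit color), 60 Hz.
width, height = 2048, 1536
bytes_per_pixel = 4
refresh_hz = 60

frame_mb = width * height * bytes_per_pixel / 2**20
scanout_mb_per_s = frame_mb * refresh_hz

print(f"One frame: {frame_mb:.0f} MB")               # ~12 MB: fits in a 32-64 MB on-chip buffer
print(f"Scanout alone: {scanout_mb_per_s:.0f} MB/s")  # ~720 MB/s of bus traffic avoided if on chip
```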



There are just so many options for Apple that I'm not convinced that getting exceptional performance at an extremely low power point is out of reach these days. The current chips are at 45 nm, and Apple could potentially jump two nodes or more, which would free up a lot of die space.

    Quote:

And today's desktop GPUs that would be in the next-gen consoles have over 10x the raw GFLOPS of the current ones (~200). The new Radeon 7870 pushes over 3 teraflops.



For us old guys this is astonishing. I can remember the days when Crays were a thing of wonder; now I can do everything a Cray-1 did on my desktop, and frankly my iPhone is darn close.

    Quote:

    I think we're still 10-12 months early for Cortex A15, so my best hope would be a faster clocked quad A9.



That won't be an issue for most. What the iPad will really need is the GPU chops and far more RAM.

    Quote:

As mentioned, Apple isn't as cozy with ARM as with Imagination Technologies, but both products are about a year out for most companies, so I hope for at least one or the other; the 600 series would put them incredibly far ahead of Android tablets. They already have a lead, but this would go from big to enormous.



Yeah, they would be far ahead if they debuted a 600-series GPU. I don't think it is totally impossible, yet not likely. However, if Apple were to push hard, this would be the place to push.

    Quote:

    Edit: of interest?




    A few more days and all our talk will be forgotten.
  • Reply 23 of 33
yow! Posts: 7, member
Oops, I misread tipoo as meaning *network* bandwidth.
  • Reply 24 of 33
    Quote:

Apple has a long history of engineering triumphs; they are just understated about celebrating them. Even though I don't like the machine, the iMac is a good example of an engineering success. Many don't see it that way, but the whole series of iMacs was an impressive display of thinking with an open mind.



    It's good to hear you say this despite the fact you don't like it.



I agonised between the Pro and iMac for far too many years. (Worried about the built-in screen going, hard drive failure... how to get into the machine, etc... GPU performance sucking, etc.)



    But I can say it's a beautiful machine, a work of art.



    I could say the same of the Mac Pro though. It's an industrial masterpiece.



    And the Mini makes me 'coo' whenever I see one in the flesh.



    ...the iOS devices themselves are 'Star Trek' technology to me. They completely turned their markets on their heads. Real paradigm shifts.



I don't like laptops. But I almost take a step back in amazement when I see the Airs sideways on... razor-sharp design.



But back to the main point. For me, a Retina iPad 3 would be amazing if they could get the 'Rogue' '600' GPU in there. With a quad CPU, it would 'blow the bloody doors off!'



I can't see Apple going backwards in performance. (I hope we don't see a situation like the iPhone 4, where the CPU really struggles with loading web pages in the browser, with choppy scrolling at times.) I'd hazard a guess that the current GPU in the iPad is powerful enough for that not to happen in general.



    I'm so getting an iPad 3.



    Lemon Bon Bon.
  • Reply 25 of 33
tipoo Posts: 1,163, member
    Quote:
    Originally Posted by yow! View Post


Oops, I misread tipoo as meaning *network* bandwidth.



Haha, yeah, no, I meant the interconnects in the system. Smartphone GPUs may be reaching the raw GFLOPS of today's consoles, but they don't have anything like the memory bandwidth to both GPU and CPU, or the bandwidth between the GPU and CPU, etc. Also, apps are limited to what, a few hundred MB at the absolute maximum? Some PS3 games are filling both layers of a Blu-ray disc for 50 GB, and some on the 360 are using multiple 9 GB DVDs and compression. Also, John Carmack said that with any dedicated gaming device, devs are given much closer control of the hardware, so it performs half again to twice as well as any other platform with the same hardware, and I tend to believe his geeky self. So if it was the MP4, like the PS Vita, we could still expect far better graphics on the Vita than the iPad if devs take the time for it. If it was Rogue, that hardware advantage would be too big even for better APIs to overcome, granted they allow more storage for apps (come on now Apple, well past time for another flash doubling; they used to do it every other generation with iPods).
  • Reply 26 of 33
yow! Posts: 7, member
    Quote:
    Originally Posted by tipoo View Post


Also, John Carmack said that with any dedicated gaming device, devs are given much closer control of the hardware, so it performs half again to twice as well as any other platform with the same hardware



In a video of a presentation where he said that [on the iPhone RAGE engine], he stressed that a factor was the separation of GPU and CPU memory (i.e. video cards having their own RAM), and that it's inefficient moving data from system RAM to GPU RAM. Consoles use the same RAM for both. Interestingly, Intel's integrated graphics does too, and he predicted great things from it as it becomes more powerful, due to this architectural difference. I believe the iPhone/iPad don't have separate CPU/GPU RAM, and so performance would be comparable to a special-purpose gaming machine with similar specs.
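As a toy illustration of why that separation hurts (every figure below is invented purely for scale, not a measured spec):

```python
# Toy model of the per-frame copy penalty with split CPU/GPU memory.
# All numbers are invented for illustration, not measured hardware specs.
frame_data_mb = 12        # e.g. one retina-sized frame's worth of texture/vertex data
copy_bandwidth_gb_s = 4   # assumed CPU -> GPU copy bandwidth

copy_ms = frame_data_mb / (copy_bandwidth_gb_s * 1024) * 1000
frame_budget_ms = 1000 / 60

print(f"Copy cost: {copy_ms:.1f} ms of a {frame_budget_ms:.1f} ms frame budget at 60 fps")
# With unified memory (consoles, iPhone/iPad), that copy simply never happens.
```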



    Thanks, I didn't know PS Vita has quad SGX 543 GPU (and quad CPU). http://en.wikipedia.org/wiki/PlayStation_Vita
  • Reply 27 of 33
tipoo Posts: 1,163, member
I think we saw the same interview.



Part of it was having the CPU and GPU on the same die, so little time is wasted sending stuff between them, but there were other advantages to dedicated game hardware, i.e. very low-level APIs that don't exist on any other platform. With a gaming console they can alter a memory location in one step; with an OS on top they have to go through further-removed APIs and take orders of magnitude more time doing it.



If you could have unified memory with enough bandwidth, that would be better than separate memory, but we've kept it separate until now because the bandwidth just wasn't there, and I don't think it will be for high-performance graphics for another few years. Note from the Vita specs you linked:



    Quote:

Memory: 512 MB RAM, 128 MB VRAM



The PS Vita has 512 MB system RAM and 128 MB video RAM for a very good reason: unified bandwidth would hinder the GPU. And the PS3 has split memory as well, 256 MB for each. The 360 was a bit of a special case, as it used what in 2005 was really only used as high-speed graphics memory for both. So really only one console (we don't talk about the Wii :P) has unified memory.





He talked specifically about AMD's plans with their Accelerated Processing Units. Today's integrated GPUs are very much just the GPU on the same die as the CPU, but eventually they will merge functional elements to use the best parts of both interchangeably, with both having the same access to memory. Theoretically that could be faster than a discrete GPU given much more bandwidth than we have today, but we have a ways to go in terms of shrinking things down before it's viable to put something as powerful as, say, a 7970 on the same die as a high-performance CPU. There's also a reason AMD's current APUs have mid-to-low-range graphics chips on them instead of high-performance ones.



tl;dr: integration is definitely the future, but we're not there yet, so I'm not sure about your claim that having everything in one SoC makes the iProducts comparable to dedicated game hardware, as most dedicated game hardware still has its own dedicated video and system RAM.
  • Reply 28 of 33
yow! Posts: 7, member
    Quote:
    Originally Posted by tipoo View Post


so I'm not sure about your claim that having everything in one SoC makes the iProducts comparable to dedicated game hardware, as most dedicated game hardware still has its own dedicated video and system RAM.



Ah, I was generalizing from the Xbox, which you note is a special case. But I do think that squeezing graphics performance out of the iProducts was an absolute priority, especially with the first iPhone, or else the touch interface wouldn't seem intuitive. And so I would guess Apple would use all the low-level tricks available from dedicated game hardware that they could.



The overhead of OS calls can be overcome, e.g. DirectX.



Agreed, matching current high-end cards is a long way off; but beating the Xbox 360 (which uses tech from a few generations back) seems on the cusp of being within reach.
  • Reply 29 of 33
shrike Posts: 494, member
    Conservatively, it'll be the same SGX543MP2, but maybe 50% more clock rate for a 1.5x increase in performance.



Optimistically, an SGX543MP4 or an SGX554MP2 and a 50% clock rate increase, for about a 2.5x to 4x increase in performance.



The 600-series Rogue is still a year out.
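For what it's worth, those ranges fall out of a naive cores-times-clock model (the 2x ALU count for the SGX554 versus the SGX543 is an assumption based on published specs, not a confirmed figure):

```python
# Naive GPU scaling model: performance ~ core count x clock x ALUs per core.
# Baseline is the iPad 2's SGX543MP2 at today's clock; the 2x ALU factor
# for SGX554 vs SGX543 is an assumption, not a confirmed spec.
def relative_perf(cores, clock_mult, alu_mult=1.0):
    baseline = 2 * 1.0 * 1.0  # SGX543MP2, 1x clock, 1x ALUs
    return (cores * clock_mult * alu_mult) / baseline

print(relative_perf(2, 1.5))       # SGX543MP2 +50% clock -> 1.5x
print(relative_perf(4, 1.5))       # SGX543MP4 +50% clock -> 3.0x
print(relative_perf(2, 1.5, 2.0))  # SGX554MP2 +50% clock -> 3.0x
```

Both optimistic options land around 3x under this model, comfortably inside the 2.5x to 4x range.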
  • Reply 30 of 33
    Quote:
    Originally Posted by wizard69 View Post


    If they can't drive the screen at full resolution for 3D I don't see many accepting the platform as a next generation device.



'Many'? How many? You? Hard-core tower buyers? PS3 buyers trampled over in the stampede to get one? But not the 50 million+ buyers who will storm the gates of Redmond to get one. 'Move out of the way, Ballmer!' eBay is probably lighting up with people dumping their iPad 2s for a pending iPad 3. £395 for a retina screen on a ten-inch pad? The iPad is nowhere near its critical mass, and its percentage sales improvement is near vertical.



    Sure, I'd like the Rogue class. But this iteration of the iPad 3 will smash the 2 in terms of sales. My guess. *Places a wager.



I think if we look at the current iPad 2, you have a 4x-9x increase in GPU over the original iPad. Any nominal improvement in the GPU should see the retina screen more than ably handled. The iPhone 4S is more than comfortable. There may be a case that even the current GPU could handle it (better than the GPU in my iPhone 4 handles the retina screen, for sure).



Given RAM (surely!!!), CPU AND GPU updates, one would expect the iPad 3 to handle 2D superbly well and 3D more than acceptably. (Given that the current gen of consoles only offers half-definition in 3D at 720p, and that the graphics look very smooth at high framerates... and should do on a smaller screen vs a 50-inch plasma, I don't see any reason why 720p would be a problem on an iPad 3. It wasn't for PS3 or Xbox players..?)



I guess that would be up to developers. The A5X chipset sounds more than ample for the job, especially if it's a quad core (I'd be surprised if it wasn't).



    Lemon Bon Bon.
  • Reply 31 of 33
    Quote:
    Originally Posted by Shrike View Post


    Conservatively, it'll be the same SGX543MP2, but maybe 50% more clock rate for a 1.5x increase in performance.



Optimistically, an SGX543MP4 or an SGX554MP2 and a 50% clock rate increase, for about a 2.5x to 4x increase in performance.



The 600-series Rogue is still a year out.



A5X. I guess we don't know what that really means yet. I think even a 50% clock speed increase giving a 2x performance increase will make it more than comfortable. If it's 4x, even better.



    'Rogue' sounds optimistic from what I've read on here and elsewhere. They have to have something to make the iPad 4 a worthy purchase. I'm guessing 'Rogue' will be a signature piece of technology for iPad 4.



I'm still primarily a 'Mac' guy. But I have an iPhone 4. I know some people who had iPod touches and an iPhone 4, then upgraded to the iPhone 4S and also bought an iPad 2, and don't even have a Mac! (You know, the people who swore they'd never go Apple or use those 'weird' touch screens..?)



I guess my evangelical zeal must have persuaded them. I also know a work colleague who, after an iPod touch and an iPhone 3GS... went to Crapberry, then back to an iPhone 4... and... finally caved! She bought a 2010 MacBook Air in mint condition (13-inch). She's thrilled by it. She 'just had to tell me...' Sent me a picture of it. Looks realllll nice. And those are the people who are getting into Apple.



    Lemon Bon Bon.
  • Reply 32 of 33
wizard69 Posts: 13,377, member
    Quote:
    Originally Posted by Lemon Bon Bon. View Post


'Many'? How many? You? Hard-core tower buyers? PS3 buyers trampled over in the stampede to get one? But not the 50 million+ buyers who will storm the gates of Redmond to get one. 'Move out of the way, Ballmer!' eBay is probably lighting up with people dumping their iPad 2s for a pending iPad 3. £395 for a retina screen on a ten-inch pad? The iPad is nowhere near its critical mass, and its percentage sales improvement is near vertical.



I said 'if' there! I really don't expect a performance regression, at least not a major one, because as I said it would not be accepted. It is no different than if Apple were to introduce a new laptop with a huge performance regression; sales would tank. Think about it: would you jump at the chance to buy an iPad 3 if it graphically performed worse than the iPad 2?



    I think not.

    Quote:

    Sure, I'd like the Rogue class. But this iteration of the iPad 3 will smash the 2 in terms of sales. My guess. *Places a wager.



Sales are never guaranteed. I'm not that concerned, as I think it would be easy for Apple to get the performance they need from an A5-derived chip at a smaller process, probably 28 nm. They might need to add cache/memory and a couple of GPU units, but this is no big deal. We would end up with a slightly faster machine with parity graphical performance.

    Quote:

I think if we look at the current iPad 2, you have a 4x-9x increase in GPU over the original iPad. Any nominal improvement in the GPU should see the retina screen more than ably handled. The iPhone 4S is more than comfortable. There may be a case that even the current GPU could handle it (better than the GPU in my iPhone 4 handles the retina screen, for sure).



You have four times the pixels to fill. That is a lot of data to transfer, thus they will need some modification to the A5. I actually see the problem as more of a data transfer issue than anything.
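The quadrupling itself is simple arithmetic (the panel figures are the shipping iPad 2 resolution and the rumored retina resolution):

```python
# Pixel counts: shipping iPad 2 panel vs the rumored retina panel.
ipad2_pixels = 1024 * 768     # 786,432
retina_pixels = 2048 * 1536   # 3,145,728

print(retina_pixels / ipad2_pixels)  # -> 4.0: four times the fill and transfer per frame
```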

    Quote:



Given RAM (surely!!!), CPU AND GPU updates, one would expect the iPad 3 to handle 2D superbly well and 3D more than acceptably.



RAM isn't a given. In fact, there have been very few rumors in that regard. Frankly, I agree that it is a requirement, but little has been said in that regard.

    Quote:

(Given that the current gen of consoles only offers half-definition in 3D at 720p, and that the graphics look very smooth at high framerates... and should do on a smaller screen vs a 50-inch plasma, I don't see any reason why 720p would be a problem on an iPad 3. It wasn't for PS3 or Xbox players..?)



It is a handheld device that is much closer to your eyes. It will depend upon the software, but some apps will be noticeably compromised. Notice I said apps here; not every app using the 3D capability is a game.

    Quote:

I guess that would be up to developers. The A5X chipset sounds more than ample for the job, especially if it's a quad core (I'd be surprised if it wasn't).



    Lemon Bon Bon.



If such a chip is real, it would be a marginal bump above current performance, due simply to all of those pixels. That really isn't bad, though, considering what you are getting.



In any event, two days or so till the debut. I suspect we will get dual core myself, with GPU and data-handling enhancements; mind you, with everything running at double today's clock rates, maybe even more for the GPUs.
  • Reply 33 of 33
tipoo Posts: 1,163, member
Aaaand it's the MP4; that settles that.