AppleInsider › Forums › Mac Hardware › Future Apple Hardware › iPad 3 - SGX 543MP2 or 600 series?

iPad 3 - SGX 543MP2 or 600 series?

post #1 of 34
Thread Starter 
Just wondering what the consensus on the rumor mill is for what will be in the iPad 3? I've heard the former more often, but I'm really hoping for the latter. Look at this.

Quote:
The Rogue GPU will deliver more than 210GFlops (no details as if single or double precision), deliver 350 million real polygons per second and more than five gigapixels per second visible fill rate, which translates into 13 gigapixels of effective fill rate.

This is because, unlike other rival technologies, PowerVR only processes pixels that will be displayed, leaving out hidden ones.

http://www.itproportal.com/2011/02/1...gpx-fill-rate/


That's getting into the range of mobile laptop cards like the Radeon 55** mobile. It depends on whether developers take advantage of it, of course, but the potential is there. The GPUs in the 360 and PS3 are only pushing around 200 GFLOPS too, so maybe they weren't so crazy saying console quality by 2013. It depends on the processor too, but just from a GPU standpoint we're nearly there. The SGX 543MP2? 12 GFLOPS, so 20x the performance is about right for the 600 series.

So which one of these chips do you think will be in the iPad 3? Anand from AnandTech seems to think 543MP4 for various reasons, but it wouldn't be the first time Apple gets new technology first, and they are a ~10% shareholder in Imagination Technologies, PowerVR's designer.

Another thing is that if it's the MP4 variant of the same core type, that's "only" double the performance for what's basically confirmed as 4x the pixels, so it would be like the 3GS-to-4 transition, where GPU performance often went down because of the new resolution. That would be unfortunate but bearable, but I still really hope it's the 600 series.
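The back-of-envelope arithmetic here is easy to sanity-check. A quick sketch (all the GFLOPS figures are rumored round numbers from the thread, not official specs):

```python
# Rumored/approximate figures only -- nothing here is an official spec.
sgx543mp2_gflops = 12    # rough estimate for the iPad 2's GPU
rogue_gflops = 210       # claimed figure for the Series 6 "Rogue"

# The "20x" claim is really ~17.5x on these numbers
print(rogue_gflops / sgx543mp2_gflops)   # 17.5

# And the rumored retina display is exactly 4x the pixels
print((2048 * 1536) / (1024 * 768))      # 4.0
```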
post #2 of 34
What is in the Sony Vita, the new handheld?

For me, I'd like to see Apple put the boot into Sony.

The Rogue sounds good. As good as a PS3..? That would be cool.

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

post #3 of 34
Thread Starter 
The Vita uses the SGX 543MP4, the quad-core version of the dual-core GPU in the iPad 2. It also has to push far fewer pixels than the iPad 3 would, though, so it would get better performance for the same power. Plus, we should know by now not to underestimate dedicated gaming hardware (look what the old X1900-like chip in the 360 or the 7800 in the PS3 can do); they no doubt have sophisticated APIs and give developers closer control of the hardware.
post #4 of 34
A base processor for handhelds and a more powerful processor for tablets.

What they could or will contain is an open question. At this point I suspect Apple will focus on the retina display's performance and do so with a far more powerful GPU. With only two weeks to go I'm hopeful for a major update.
post #5 of 34
Quote:
Originally Posted by tipoo View Post

The Vita uses the SGX 543MP4, the quad-core version of the dual-core GPU in the iPad 2. It also has to push far fewer pixels than the iPad 3 would, though, so it would get better performance for the same power. Plus, we should know by now not to underestimate dedicated gaming hardware (look what the old X1900-like chip in the 360 or the 7800 in the PS3 can do); they no doubt have sophisticated APIs and give developers closer control of the hardware.

From what I've seen of pictures of the PS Vita, the quad-core 543MP4 is pretty badass. I think Apple will use a quad 543MP4 too, and save the fancier PowerVR stuff for the next iteration.

I'm predicting the iPad 2X will need this quad GPU just for the Retina Display in 2D, but it will deliver better 3D graphics than the iPad 2 because games will still run at 1024x768 and be upscaled, à la the Xbox 360's Lanczos scaler, to 2048x1536. Everybody wins.
post #6 of 34
Thread Starter 
Quote:
Originally Posted by sunilraman View Post


I'm predicting the iPad 2X will need this quad GPU just for the Retina Display in 2D, but it will deliver better 3D graphics than the iPad 2 because games will still run at 1024x768 and be upscaled, à la the Xbox 360's Lanczos scaler, to 2048x1536. Everybody wins.

Also a good point: most games on the 4 don't run at native resolution, since the GPU doesn't have enough bandwidth or fill rate. So games could benefit from the faster GPU even with 4x the pixels. Benchmarks would take a hit if they ran at native, but actual games wouldn't, since developers wouldn't run at the full res. I still really want the 600 series though; those performance numbers are crazy for a sub-1W chip. I guess I'm just spoiled by technology, but only 2x the GPU performance after a year doesn't sound great anymore, after last time's 7-9x bump.
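To put numbers on that trade-off: if the GPU only doubles while the pixel count quadruples, the per-pixel budget halves. A rough sketch (the 12/24 GFLOPS figures and the 30 fps target are assumptions for illustration only):

```python
def per_pixel_budget(gflops, width, height, fps=30):
    """FLOPS available per pixel per frame, assuming all throughput goes to pixels."""
    return gflops * 1e9 / (width * height * fps)

ipad2 = per_pixel_budget(12, 1024, 768)    # SGX543MP2 at the old resolution
mp4 = per_pixel_budget(24, 2048, 1536)     # hypothetical 2x GPU at 4x the pixels

print(mp4 / ipad2)   # 0.5 -- half the per-pixel headroom at native res
```

That halved budget is exactly why games might stick to 1024x768 upscaled, even on a faster chip.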
post #7 of 34
Quote:
Originally Posted by tipoo View Post

Also a good point: most games on the 4 don't run at native resolution, since the GPU doesn't have enough bandwidth or fill rate. So games could benefit from the faster GPU even with 4x the pixels. Benchmarks would take a hit if they ran at native, but actual games wouldn't, since developers wouldn't run at the full res. I still really want the 600 series though; those performance numbers are crazy for a sub-1W chip. I guess I'm just spoiled by technology, but only 2x the GPU performance after a year doesn't sound great anymore, after last time's 7-9x bump.

If they can't drive the screen at full resolution for 3D I don't see many accepting the platform as a next generation device.
post #8 of 34
Thread Starter 
Quote:
Originally Posted by wizard69 View Post

If they can't drive the screen at full resolution for 3D I don't see many accepting the platform as a next generation device.

It's already happened; like I said, sophisticated 3D games don't run at native on the iPhone 4. If you look at benchmarks, when they run at native the 3GS comes out faster than the 4.





Not running at native is hardly a dealbreaker for Joe Consumer. Think about it: even desktop graphics cards with over 100x the power consumption of this whole SoC would struggle at the rumored resolution.
post #9 of 34
Quote:
Originally Posted by wizard69 View Post

If they can't drive the screen at full resolution for 3D I don't see many accepting the platform as a next generation device.

About a week-ish to find out?

We'll soon get to see.

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

post #10 of 34
Quote:
Originally Posted by tipoo View Post

It's already happened; like I said, sophisticated 3D games don't run at native on the iPhone 4. If you look at benchmarks, when they run at native the 3GS comes out faster than the 4.

While I'm no fan of AnandTech, I'm not surprised at the numbers there. I do wonder how much Apple's drivers are impacting things in the tests. In the end, though, this is my whole point: if graphics performance regresses, the iPad 3 will not be seen as a new-generation device. Instead it will be seen as a half step in the general direction.

So to state it another way: for the iPad 3 to be well accepted, it needs not to regress performance-wise at full resolution. It actually needs to perform better. Is that a tall order? I don't really think so, but it will require a state-of-the-art processor.
Quote:



Not running at native is hardly a dealbreaker for Joe Consumer.

I really believe that is a mistake. It will depend upon how noticeable the artifacts are, but I expect very noticeable differences between apps running at the old res and apps running at the new res.
Quote:
Think about it: even desktop graphics cards with over 100x the power consumption of this whole SoC would struggle at the rumored resolution.

You are talking about an entirely different class of performance. I believe Apple can easily get 4x better graphics performance just via the current A5 layout. How? Through process shrinks that would allow them to more than double clock rate and cores. Apple could potentially jump two or more process nodes, which would allow doubling of the clock rate without significant thermal issues. As I've mentioned before, Cortex A9 cores are already running much faster than Apple's old tech on sub-32nm processes. Likewise, Intel's new Atoms have their GPUs running extremely fast while the overall power profile has dropped.

Yes, I'm optimistic. The question is, am I out in left field or a little more rational? To put it simply, I think Apple can maintain performance parity at native resolution if they are willing to be aggressive. In any event, only a few days are left before we are all cheering or sobbing in our Wheaties. Maintaining parity in GPU performance across all those extra pixels would make the iPad 3 very much a respected upgrade. Going beyond that is a more interesting discussion; I'd be very surprised if they have performance that is significantly faster at native resolution.
post #11 of 34
Mind you, that is with current-generation GPU cores of the 545 vein. Actually gaining at native resolution might be a bit more difficult, because that would require more than doubling the current cores and clock rate.

Quote:
Originally Posted by Lemon Bon Bon. View Post

About a week-ish to find out?

Yeah I'm excited, but sadly have no budget for such a device. I suspect I will be frustrated!
Quote:
We'll soon get to see.

Lemon Bon Bon.

After the big rollout I have to wonder how bad the backlog will be. If they can deliver what I think they can, Apple may be out of stock most of the year.
post #12 of 34
Thread Starter 
That's another possibility I hadn't mentioned: a 543MP4 clocked higher than the one in the 2, so not only double the cores but a 50-100% clock increase. So even with 4x the resolution it could perform a bit better than the 2, even at native. The PS Vita's MP4 is clocked up to 400MHz, I think. That would make me happy, but still not as much as the 600 series.


I still doubt actual games will run at native, though; I think they'd be 1024x768 in HiDPI scaling mode, so they could squeeze even more performance out of it. Artifacts? Pheh, I'll deal with them for more detailed worlds, and the apps that don't need that much performance can run at native. Best of both worlds.
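The cores-plus-clock scenario is easy to sanity-check, assuming throughput scales roughly linearly with both (an idealization; real chips lose some to bandwidth limits). The 12 GFLOPS base is the rough 543MP2 estimate from earlier in the thread:

```python
def scaled_gflops(base_gflops, core_mult, clock_mult):
    # Idealized: GPU throughput scales linearly with core count and clock
    return base_gflops * core_mult * clock_mult

base = 12.0  # rough SGX543MP2 figure
for clock_mult in (1.5, 2.0):               # the 50-100% clock bump
    g = scaled_gflops(base, 2, clock_mult)  # MP4 doubles the cores
    # Compare against the 4x pixel increase: 1.0 means parity at native res
    print(g, g / base / 4)                  # 36.0 0.75, then 48.0 1.0
```

So 2x cores at 2x clock exactly keeps pace with 4x the pixels; anything less than a doubled clock leaves a per-pixel deficit.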
post #13 of 34
Quote:
Originally Posted by tipoo View Post

That's another possibility I hadn't mentioned: a 543MP4 clocked higher than the one in the 2, so not only double the cores but a 50-100% clock increase.

This makes the most sense to me! It would also be the best explanation for the A5X designation. The only potential problem would be memory bandwidth.
Quote:
So even with 4x the resolution it could perform a bit better than the 2, even at native. The PS Vita's MP4 is clocked up to 400MHz, I think. That would make me happy, but still not as much as the 600 series.

My original understanding was that the 543 could go beyond four cores.
Quote:

I still doubt actual games will run at native, though; I think they'd be 1024x768 in HiDPI scaling mode, so they could squeeze even more performance out of it.

Why not native? If you get the same basic performance why not enjoy it.
Quote:
Artifacts? Pheh, I'll deal with them for more detailed worlds, and the apps that don't need that much performance can run at native. Best of both worlds.
post #14 of 34
Thread Starter 
Quote:
Originally Posted by wizard69 View Post

Why not native? If you get the same basic performance why not enjoy it.

Same reason most console games are 720p rather than 1080p, or why 3D games like Uncharted 3 on PS3 have to reduce the level of detail when the GPU has to render effectively two scenes: developer ease, and they can do more complicated things on screen at a lower resolution because they have more power to work with, which would otherwise be spent on the extra pixels. If they can run at native, great, but sooner or later they'd hit the point where they could effectively up the power by dropping to 1024x768 upscaled mode.
post #15 of 34
tipoo, I've been thinking about the 600 series (Rogue) in the iPad 3, too. I'm currently inclined to agree with your reluctant conclusion that they'll settle for 2x cores (MP4) at 2x clock to handle the 4x pixels. This is because only one licensee has a Rogue SoC, and it's expected to be a year until it's available in products. However, Apple might be faster because (i) they can integrate SoC and product development, since both are in-house; and (ii) Apple owns 9-10% of Imagination Technologies, the GPU designer, so they might get early versions, work closely with them, etc. Hope springs eternal...

Also, if they went with a dual-core version of this GPU, it would end consoles.

BTW: That SoC with this GPU is from ST-Ericsson:

Nova A9600: 210 GFLOPS, 350M polygons/s, fill rates in excess of 13 Gpixels/s, sampling in 2011

From wiki: http://en.wikipedia.org/wiki/List_of...oducts#Series6
This has been "released to licensees" just this January 2012, but I read elsewhere that it's not due in products until 2013.
post #16 of 34
Apple could very well have an A6 SoC using this chip ready for the iPad 3, but I see it as unlikely. Contradictory, maybe, but I have no doubt that Apple partners with Imagination in the development of these cores, so they would have early access; work on the cores would have been started years ago to integrate them into a new SoC. However, the GPU core isn't the big issue here, as you still need to pair it with a suitable CPU and system architecture to support that GPU. This would be the big difference between an A5X and an A6: the A6 would have the system architecture to support the bus performance required to feed the GPU.

Quote:
Originally Posted by yow! View Post

tipoo, I've been thinking about the 600 series (Rogue) in the iPad 3, too. I'm currently inclined to agree with your reluctant conclusion that they'll settle for 2x cores (MP4) at 2x clock to handle the 4x pixels. This is because only one licensee has a Rogue SoC, and it's expected to be a year until it's available in products. However, Apple might be faster because (i) they can integrate SoC and product development, since both are in-house; and (ii) Apple owns 9-10% of Imagination Technologies, the GPU designer, so they might get early versions, work closely with them, etc. Hope springs eternal...

Also, if they went with a dual-core version of this GPU, it would end consoles.

BTW: That SoC with this GPU is from ST-Ericsson:

Nova A9600: 210 GFLOPS, 350M polygons/s, fill rates in excess of 13 Gpixels/s, sampling in 2011

From wiki: http://en.wikipedia.org/wiki/List_of...oducts#Series6
This has been "released to licensees" just this January 2012, but I read elsewhere that it's not due in products until 2013.
post #17 of 34
Quote:
Originally Posted by wizard69 View Post

[...] Work on the cores would have been started years ago to integrate them into a new SoC. However, the GPU core isn't the big issue here, as you still need to pair it with a suitable CPU and system architecture to support that GPU.

Wouldn't "integration" into the SoC include system architecture to support it? Years in the pipeline seems enough time to do that...

On your first point, you're saying the current dual-core CPU isn't suitable to feed this GPU? Bus bandwidth to memory seems more a system-architecture issue, especially considering that licensees have great leeway to tweak ARM designs; maybe that SoC design firm that Apple bought could handle it...

OTOH... Of course, Apple could also go for a Cortex A15 upgrade (as the Nova above does), which would seem to address that issue. And, if we're thinking that they could get a Rogue into the iPad 3 one year ahead of competitors, why not get A15 a year ahead too? (Apple isn't as cosy with ARM as they are with IT, though they helped found it, so they mightn't get the needed early-access).

It seems highly unrealistic to expect both upgrades (especially if the A5X photo is real); yet it also seems neatest and simplest to have them go together. So... they both wait for the iPad 4, and we get a half-hacked (aka tuned) version of the iPad 2 SoC in the iPad 3.
post #18 of 34
Quote:
Originally Posted by yow! View Post

Wouldn't "integration" into the SoC include system architecture to support it? Years in the pipeline seems enough time to do that...

Well, this would likely be why Apple has a rumored two-track development process: to have the time to engineer a major new design every two years. So yeah, they have to design the SoC to cover the memory bandwidth requirements of the GPU. At the same time, that means an all-new SoC.
Quote:
On your first point, you're saying the current dual-core CPU isn't suitable to feed this GPU? Bus bandwidth to memory seems more a system architecture issue,

That is system architecture and is exactly what I'm talking about. Frankly the CPU plays a small part here. The GPU on the other hand may require a new memory bus design, caches or other features. There are many ways for Apple to solve these issues but it does involve a bit of engineering to come up with the new architecture.
Quote:
esp. considering that licensees have great leeway to tweak ARM designs, maybe that SoC design firm that Apple bought could handle it...

I have no doubt they can handle it. The question is whether they could have this new architecture ready for launch next week. That is really hard to say; if they debut what could be called an A6 next week, they would effectively be six to nine months ahead of the industry.
Quote:
OTOH... Of course, Apple could also go for a Cortex A15 upgrade (as the Nova above does), which would seem to address that issue.

Cortex A15 doesn't have a lot to offer Apple; I believe the GPU would be the imperative. Let's face it, with the right process shrink Apple could manage a 2-4x performance jump with the current Cortex A9s.
Quote:
And, if we're thinking that they could get a Rogue into the iPad 3 one year ahead of competitors, why not get A15 a year ahead too? (Apple isn't as cosy with ARM as they are with IT, though they helped found it, so they mightn't get the needed early-access).

I suspect it will be more like six months or so. The problem is that the new processor is a big initiative, with all of the associated risks, so why not focus on the absolute must-haves? Those would be the GPU and the system architecture.
Quote:

It seems highly unrealistic to expect both upgrades (esp if the A5X photo is real); yet it also seem neatest and simplest to have them go together. So... they both wait for iPad 4, and we get a half-hacked (aka tuned) version of the iPad 2 SoC in iPad 3.

Half-hacked? Wow, I bet a few Apple engineers just lined up to tell you a thing or two. A SoC fast enough to really drive that high-resolution screen well is a major engineering effort, even if it derives from the old one. In any event, it will be most interesting to see what Apple delivers.
post #19 of 34
Thread Starter 
Quote:
Originally Posted by yow! View Post

Also, if they went with a dual-core version of this GPU, it would end consoles.

Even today's consoles have different advantages, though: bandwidth is a big problem on these smartphone SoCs, as are storage and controls. And today's desktop GPUs, which would go into the next-gen consoles, have over 10x the raw GFLOPS of the current ones' ~200; the new Radeon 7870 pushes over 3 teraflops.

I think we're still 10-12 months early for Cortex A15, so my best hope would be a faster-clocked quad A9. As mentioned, Apple isn't as cozy with ARM as with Imagination Technologies, but both products are about a year out for most companies, so I hope for at least one or the other. The 600 series would put them incredibly far ahead of Android tablets; they already have a lead, but this would go from big to enormous.

Edit: of interest?

http://www.theverge.com/2012/3/1/283...mment-on-apple
post #20 of 34
Quote:
Originally Posted by wizard69 View Post

if they debut what could be called an A6 next week, they would effectively be six to nine months ahead of the industry.

I think that sums it up: can Apple be so far ahead of the industry, not just in design, concept etc, but even in designing the chips themselves? I gave some factors above supporting this; but their past performance is a guide: they had a new SoC for iPad 1, and a year later, another new SoC for iPad 2.

Now, that's just two data points, and there could be other factors (e.g. maybe both chips were in the pipeline for years). Crucially, and this is the point, I haven't compared that with the industry; maybe everyone managed that too.

Quote:
Cortex A15 doesn't have a lot to offer Apple

The A15 seems a significant upgrade, at least doubling the performance without doubling power consumption. ARM know what they are doing. Can you expand on why you think that?

I do agree that a CPU upgrade isn't crucial for the iPad (for this generation); but GPU absolutely is. This is true for the x4 pixels; but also in general.

Quote:
The problem is the new processor is a big initiative, with all of the associated risks, so why not focus on the absolute must haves. That would be the GPU and system architecture.

Agreed. And the "A5X", and dual-track rumour etc all support this. Plus, just a retina display is enough to wow everyone. I was musing that it might be workable if the risks could be self-contained within the SoC group (and it has the massive resources of Apple behind it) - but risks like "delays" can't be self-contained.

Quote:
Half-hacked? Wow I bet a few Apple engineers just lined up to tell you a thing or two. A SoC fast enough to really drive that high resolution screen well is a major engineering effort even if it derives from the old.

Hey, it wasn't meant as an insult. I was speaking in sympathy with them: to modify an existing design when there's a new architecture available that solves all the problems you're facing... that would be frustrating to me. As a developer, I find that at some point it's easier to start fresh than to modify the existing design. Anyway, that's how I'd experience it; maybe they have a different attitude, and/or the technical issues are interesting in themselves and can inform the next architecture. Maybe I should keep my mouth shut instead of offending people.

Wouldn't it be amazing if Apple was ahead of everybody in every aspect? I think it's possible but unlikely; most of all, it's not needed for this generation of the iPad. And I think that's one of the secrets of Apple: they *have* the technical chops, but they only use them in service of a user outcome, instead of as an end in themselves. That's tricky. Most companies seem to go one way or the other (i.e. all technical, or all user outcome).
post #21 of 34
Quote:
Originally Posted by tipoo View Post

Even today's consoles have different advantages, though: bandwidth is a big problem on these smartphone SoCs, as are storage and controls.

Thanks, great points. HD games need massive textures. You could put them in the cloud, but... bandwidth. I was thinking that mystery extra component (part number) could be a game controller.

Quote:
And today's desktop GPUs, which would go into the next-gen consoles, have over 10x the raw GFLOPS of the current ones' ~200; the new Radeon 7870 pushes over 3 teraflops.

Yes, consoles are long overdue for an upgrade, but they've been making so much money they've grown complacent... which creates an opportunity for a new entrant. Also, there's a shift towards casual games, where iPhone/iPad are strong.

Alternatives: Apple could include console features in their TV, circumventing the storage and bandwidth issues.

Quote:
I think we're still 10-12 months early for Cortex A15, so my best hope would be a faster clocked quad A9.

I think a quad-core GPU, but not a quad-core CPU, because: (1) doubling the CPU doesn't double performance, but doubling the GPU does, so a quad-core GPU is a more efficient use of silicon and power; (2) CPU performance isn't needed as much as GPU performance on tablets in general, and especially for the iPad 3's retina display.


Why be terse if there's nothing to say? It sort of screams that they're working closely, doesn't it?
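That first point is basically Amdahl's law: CPU workloads have a serial fraction that caps the gain from extra cores, while pixel shading is close to perfectly parallel. A sketch (the parallel fractions here are made-up illustrative values, not measurements):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: overall speedup from running the parallel part on n cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Typical app code: suppose only ~60% parallelizes across CPU cores
print(amdahl_speedup(0.60, 2))   # ~1.43x from a second CPU core
# Pixel work is embarrassingly parallel
print(amdahl_speedup(0.99, 2))   # ~1.98x from a second GPU core
```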
post #22 of 34
Quote:
Originally Posted by yow! View Post

I think that sums it up: can Apple be so far ahead of the industry, not just in design, concept etc, but even in designing the chips themselves? I gave some factors above supporting this; but their past performance is a guide: they had a new SoC for iPad 1, and a year later, another new SoC for iPad 2.

It would be a very pleasant shock if they delivered any sort of SoC with a 600-series GPU. It would very much put them way out in front of the competition: six months ahead silicon-wise and probably a year software-wise.

If they can't, I'm not extremely worried, as Apple should be able to tweak the A5 into something powerful enough to drive the iPad 3.
Quote:
Now, that's just two data points, and there could be other factors (e.g. maybe both chips were in the pipeline for years). Crucially, and this is the point, I haven't compared that with the industry; maybe everyone managed that too.

Well, an A5 tweaked to do the iPad 3 is still a major new chip in my mind, even if it is mostly a die shrink with a beefed-up GPU. A5X sounds more like an iPhone chip than anything, though.
Quote:
The A15 seems a significant upgrade, at least doubling the performance without doubling power consumption. ARM know what they are doing. Can you expand on why you think that?

Mostly because Apple can get close to 4x improvements out of the Cortex A9 cores it is currently using. They can do that by adding two cores and doubling the clock rate; demos of Cortex A9 cores running at 2GHz were made years ago. Notably, this can be implemented without a lot of reworking of the system software. A15 is targeted more at the server market anyway.

By the time Apple needs a new core, ARM might have a 64-bit implementation going. If Apple were to make a major change to the system architecture, this would be a better path to follow.
Quote:
I do agree that a CPU upgrade isn't crucial for the iPad (for this generation); but GPU absolutely is. This is true for the x4 pixels; but also in general.

I'm not sure if crucial is the word or not. I just don't think it is a big deal, as they can get faster cores from a simple process shrink while enhancing the GPU. In effect, the CPU improvements come for free with a faster GPU.
Quote:
Agreed. And the "A5X", and dual-track rumour etc all support this. Plus, just a retina display is enough to wow everyone. I was musing that it might be workable if the risks could be self-contained within the SoC group (and it has the massive resources of Apple behind it) - but risks like "delays" can't be self-contained.

I suspect that just getting enough displays will be a major consideration for the iPad 3 introduction, so I can see where an A5X would be a safe bet. Still, it would be a very pleasant surprise to see an A6-class processor in the iPad.
Quote:
Hey, it wasn't meant as an insult. I was speaking in sympathy with them: to modify an existing design when there's a new architecture available that solves all the problems you're facing... that would be frustrating to me.

I just had this image of an Apple engineer sitting at a bar talking to somebody who is calling his processor half-hacked. I can see a stein of beer being poured over somebody's head because of that.
Quote:
As a developer, I find that at some point, it's easier to start fresh than modify the existing design. Anyway, that's how I'd experience it - maybe they have a different attitude; and/or the technical issues are interesting in themselves and can inform and be used in the next architecture. Maybe I should keep my mouth shut instead of offending people

No, keep talking; it is good to have a reasoned conversation about something that will be here in about six days.
Quote:
Wouldn't it be amazing if Apple was ahead of everybody in every aspect? I think it's possible but unlikely; but most of all, it's not needed for this generation of the iPad.

Not needed? Personally, I will take all the improvements they can throw at the thing. Seriously, the iPad impresses me almost daily, but at times it is performance-limited, to say the least. So yeah, more power is welcome. More importantly, I want the machine to have far more RAM. If that happens, the iPad could effectively replace my MBP for almost all of my portable needs.
Quote:
And I think that's one of the secrets of Apple: they *have* the technical chops, but they only use them in service of a user outcome, instead of as an end in themselves. That's tricky. Most companies seem to go one way or the other (i.e. all technical, or all user outcome).

Apple has a long history of engineering triumphs; they are just understated about celebrating them. Even though I don't like the machine, the iMac is a good example of an engineering success. Many don't see it that way, but the whole series of iMacs was an impressive display of thinking with an open mind.
post #23 of 34
Quote:
Originally Posted by tipoo View Post

Even today's consoles have different advantages, though: bandwidth is a big problem on these smartphone SoCs, as are storage and controls.

A good point, but there are ways for Apple to address bandwidth issues. For example, a process shrink could free up space for a much larger cache, or they could implement a frame buffer on chip. The question is which would be more valuable: two more ARM cores, or a 32 or 64 MB buffer or RAM array on chip?

I realize that the processes used to build processors don't lead to efficient RAM arrays, but the idea here is that if one can keep the GPU from going off-chip as much as possible, you not only gain speed but save a considerable amount of power. I actually believe the power savings would be very significant, as the chip would otherwise be moving a great deal of data over the memory bus just to drive the display.

There are just so many options for Apple that I'm not convinced getting exceptional performance at an extremely low power point is impossible these days. The current chips are at 45nm, and Apple could potentially jump two nodes or more, which would free up a lot of die space.
Quote:
And today's desktop GPUs, which would go into the next-gen consoles, have over 10x the raw GFLOPS of the current ones' ~200; the new Radeon 7870 pushes over 3 teraflops.

For us old guys this is astonishing. I can remember the days when Crays were a thing of wonder; now I can do everything a Cray-1 did on my desktop, and frankly my iPhone is darn close.
Quote:
I think we're still 10-12 months early for Cortex A15, so my best hope would be a faster clocked quad A9.

That won't be an issue for most. What the iPad will really need is the GPU chops and far more RAM.
Quote:
As mentioned, Apple isn't as cozy with ARM as with Imagination Technologies, but both products are about a year out for most companies, so I hope for at least one or the other. The 600 series would put them incredibly far ahead of Android tablets; they already have a lead, but this would go from big to enormous.

Yeah, they would be far ahead if they debuted a 600-series GPU. I don't think it is totally impossible, yet it's not likely. However, if Apple were to push hard, this would be the place to push.
Quote:
Edit: of interest?

A few more days and all our talk will be forgotten.
post #24 of 34
Oops, I misread tipoo as meaning *network* bandwidth.
post #25 of 34
Quote:
Apple has a long history of engineering triumphs; they're just understated about celebrating them. Even though I don't like the machine, the iMac is a good example of an engineering success. Many don't see it that way, but the whole series of iMacs were an impressive display of thinking with an open mind.

It's good to hear you say this despite the fact you don't like it.

I agonised between the Pro and iMac for far too many years. (Worried about the built-in screen going, hard drive failure...how to get into the machine, etc...GPU performance sucking, etc.)

But I can say it's a beautiful machine, a work of art.

I could say the same of the Mac Pro though. It's an industrial masterpiece.

And the Mini makes me 'coo' whenever I see one in the flesh.

...the iOS devices themselves are 'Star Trek' technology to me. They completely turned their markets on their heads. Real paradigm shifts.

I don't like laptops. But I almost take a step back in amazement when I see the Airs side ways on...razor sharp design.

But back to the main point. For me, a Retina iPad 3 would be amazing if they could get the 'Rogue' '600' GPU in there. With a quad CPU, it would 'Blow the bloody doors off!'

I can't see Apple going backwards in performance. (I hope we don't see a situation like the iPhone 4, where the CPU really struggles with loading web pages in the browser and scrolling gets choppy at times.) I'd hazard a guess that the current GPU in the iPad is powerful enough for that not to happen in general.

I'm so getting an iPad 3.

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

post #26 of 34
Thread Starter 
Quote:
Originally Posted by yow!

Opps, I misread tipoo as meaning *network* bandwidth.

Haha, yeah no, I meant the interconnects in the system. Smartphone GPUs may be reaching the raw GFlops of today's consoles, but they don't have anything like the memory bandwidth to both GPU and CPU, or bandwidth between the CPU and GPU, etc. Also, apps are limited to what, a few hundred MB at the absolute maximum? Some PS3 games are filling both layers of a Blu-ray disc for 50GB, and some on 360 are using multiple 9GB DVDs plus compression. Also, John Carmack said that with any dedicated gaming device, devs are given much closer control of the hardware, so it performs half again to twice as well as any other platform with the same hardware, and I tend to believe his geeky self. So if it was the MP4, like the PS Vita, we could still expect far better graphics on the Vita than the iPad if devs take the time for it. If it was Rogue, that hardware advantage would be too big for even better APIs to overcome, granted they allow more storage for apps (come on now Apple, well past time for another flash doubling; they used to do it every other generation with iPods).
post #27 of 34
Quote:
Originally Posted by tipoo

Also John Carmack said with any dedicated gaming device devs are given much closer control of the hardware so it performs half again or up to twice as good as any other platform with the same hardware

In a video of a presentation where he said that [on the iPhone RAGE engine], he stressed a factor was the separation of GPU and CPU memory (i.e. video cards having their own RAM), and that it's inefficient moving data from system RAM to GPU RAM. Consoles use the same RAM for both. Interestingly, Intel's integrated graphics does too, and he predicted great things from it as it becomes more powerful, due to this architectural difference. I believe the iPhone/iPad don't have separate CPU/GPU RAM, and so performance would be comparable to a special-purpose gaming machine with similar specs.

Thanks, I didn't know the PS Vita has a quad SGX543 GPU (and a quad CPU). http://en.wikipedia.org/wiki/PlayStation_Vita
post #28 of 34
Thread Starter 
I think we saw the same interview.

Part of it was having the CPU and GPU on the same die, so little time is wasted sending stuff between them, but there were other advantages to dedicated game hardware, i.e. very low-level APIs that don't exist on any other platform. With a gaming console they can alter a memory location in one step; with an OS on top they have to go through further-removed APIs and take orders of magnitude more time doing it.

If you could have unified memory with enough bandwidth, that would be better than separate memory, but we've kept it separate until now because the bandwidth just wasn't there, and I don't think it will be for high-performance graphics for another few years. Note from the Vita specs you linked:

Quote:
Memory: 512 MB RAM, 128 MB VRAM

the PS Vita has 512MB system RAM and 128MB video RAM for a very good reason: unified memory would hinder the GPU. And the PS3 has split memory as well, 256MB for each. The 360 was a bit of a special case, as it used for both what back in 2005 was really only used as high-speed graphics memory. So really only one console (we don't talk about the Wii :P) has unified memory.


He talked specifically about AMD's plans with their Accelerated Processing Units. Today's integrated GPUs are very much just the GPU on the same die as the CPU, but eventually they will merge functional elements to use the best parts of both interchangeably, with both having the same access to memory. Theoretically that could be faster than a discrete GPU provided there were much more bandwidth than we have today, but we have a ways to go in terms of shrinking things down before it's viable to put something as powerful as, say, a 7970 on the same die as a high-performance CPU. There's also a reason AMD's current APUs have mid-to-low-range graphics chips on them instead of high-performance ones.

tl;dr integration is definitely the future, but we're not there yet, so I'm not sure about your point that having everything in one SoC makes the iProducts comparable to dedicated game hardware, as most dedicated game hardware still has its own dedicated video and system RAM.
post #29 of 34
Quote:
Originally Posted by tipoo

so I'm not sure about when you say having everything in one SoC makes the iProducts comparable to dedicated game hardware, as most dedicated game hardware still has its own dedicated video and system RAM.

Ah, I was generalizing from the Xbox, which you note is a special case. But I do think that squeezing graphics performance out of the iProducts was an absolute priority, especially with the first iPhone, or else the touch interface wouldn't seem intuitive. And so I would guess Apple would use all the low-level tricks available from dedicated game hardware that they could.

The overhead of OS calls can be overcome, e.g. DirectX.

Agreed, matching current high-end cards is a long way off; but beating the xbox360 (that uses tech from a few generations back) seems on the cusp of being within reach.
post #30 of 34
Conservatively, it'll be the same SGX543MP2, but maybe 50% more clock rate for a 1.5x increase in performance.

Optimistically, a SGX543MP4 or a SGX554MP2 and a 50% clock rate increase for about 2.5x to 4x increase in performance.

The 600 Rogue series is still a year early.
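Those multipliers roughly line up if you assume GPU throughput scales linearly with core count and clock speed — an idealization (real scaling is always worse than linear), but fine for forum-grade estimates:

```python
# Idealized GPU scaling estimate: throughput ~ cores x clock.
# Real-world scaling is sublinear, so treat these as upper bounds.

def speedup(core_ratio, clock_ratio):
    """Estimated performance multiplier vs. the baseline SGX543MP2."""
    return core_ratio * clock_ratio

# Conservative case: same SGX543MP2, 1.5x the clock
print(speedup(1.0, 1.5))  # 1.5

# Optimistic case: SGX543MP4 (2x the cores) at 1.5x the clock
print(speedup(2.0, 1.5))  # 3.0 -- inside the quoted 2.5x-4x band
```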
post #31 of 34
Quote:
Originally Posted by wizard69

If they can't drive the screen at full resolution for 3D I don't see many accepting the platform as a next generation device.

'Many'? How many? You? Hard-core tower buyers? PS3 buyers trampled over in the stampede to get one? But not the 50 million+ buyers who will storm the gates to get one. 'Move out the way, Ballmer!' eBay is probably lighting up with people dumping their iPad 2s for a pending iPad 3. £395 for a retina screen on a ten-inch pad? The iPad is nowhere near its critical mass and its % sales improvement is near vertical.

Sure, I'd like the Rogue class. But this iteration of the iPad 3 will smash the 2 in terms of sales. My guess. *Places a wager.

I think if we look at the current iPad 2, you have a 4x-9x increase in GPU over the iPad. Any nominal improvement in the GPU should see the retina screen more than ably handled. The iPhone 4S is more than comfortable. There may be a case that the current GPU could certainly handle it (better than the GPU in my iPhone 4 handles the retina screen, for sure).

Given RAM (surely!!!), CPU AND GPU updates, one would expect the iPad 3 to handle 2D superbly well and 3D more than acceptably. (Given that the current gen of consoles only offers half def in 3D at 720p, and that the graphics look very smooth at high framerates...and should do on a smaller screen vs a 50-inch plasma, I don't see any reason why 720p would be a problem on an iPad 3. It wasn't for PS3 or Xbox players..?)
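The pixel math backs this up: rendering 3D at console-style 720p and scaling up to the rumored 2048x1536 panel shades only a fraction of the native pixels (my arithmetic, not anything confirmed about how devs would actually target the screen):

```python
# Pixel-count comparison: 720p rendering vs. native retina rendering.
# The rumored retina resolution (2048x1536) is an assumption here.

p720   = 1280 * 720     #   921,600 pixels
retina = 2048 * 1536    # 3,145,728 pixels

print(f"Retina has {retina / p720:.2f}x the pixels of 720p")  # 3.41x
print(f"720p is ~{p720 / retina * 100:.0f}% of the native load")  # ~29%
```

So a GPU that's comfortable at 720p has a lot of headroom left before it has to push the full panel in 3D.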

I guess that would be up to developers. The A5X chipset sounds more than ample for the job, especially if it's quad core (I'd be surprised if it wasn't).

Lemon Bon Bon.

post #32 of 34
Quote:
Originally Posted by Shrike

Conservatively, it'll be the same SGX543MP2, but maybe 50% more clock rate for a 1.5x increase in performance.

Optimistically, a SGX543MP4 or a SGX554MP2 and a 50% clock rate increase for about 2.5x to 4x increase in performance.

The 600 Rogue series is still a year early.

A5X. I guess we don't know what that really means yet. I think even a 50% clock speed increase to give a 2x performance increase will make it more than comfortable. If it's 4x, even better.

'Rogue' sounds optimistic from what I've read on here and elsewhere. They have to have something to make the iPad 4 a worthy purchase. I'm guessing 'Rogue' will be a signature piece of technology for iPad 4.

I'm still primarily a 'Mac' guy, but I have an iPhone 4. I know some people who have iPod touches and an iPhone 4, then upgraded to the iPhone 4S and also bought an iPad 2, and don't even have a Mac! (You know, the people who swore they'd never go Apple or use those 'weird' touch screens..?)

I guess my evangelical zeal must have persuaded them. I also know a work colleague who, after an iPod touch and an iPhone 3GS...went to Crapberry, then back to an iPhone 4...and...finally caved! She bought a 2010 MacBook Air in mint condition (13 inch). She's thrilled by it. She 'just had to tell me...' Sent me a picture of it. Looks realllll nice. And those are the people who are getting into Apple.

Lemon Bon Bon.

post #33 of 34
Quote:
Originally Posted by Lemon Bon Bon.

'Many'? How many? You? Hard-core tower buyers? PS3 buyers trampled over in the stampede to get one? But not the 50 million+ buyers who will storm the gates to get one. 'Move out the way, Ballmer!' eBay is probably lighting up with people dumping their iPad 2s for a pending iPad 3. £395 for a retina screen on a ten-inch pad? The iPad is nowhere near its critical mass and its % sales improvement is near vertical.

I said *if* there! I really don't expect a performance regression, at least not a major one, because as I said it would not be accepted. It is no different than if Apple were to introduce a new laptop with a huge performance regression; sales would tank. Think about it: would you jump at the chance to buy an iPad 3 if it graphically performed worse than the iPad 2?

I think not.
Quote:
Sure, I'd like the Rogue class. But this iteration of the iPad 3 will smash the 2 in terms of sales. My guess. *Places a wager.

Sales are never guaranteed. I'm not that concerned, as I think it would be easy for Apple to get the performance they need from an A5-derived chip at a smaller process, probably 28nm. They might need to add cache/memory and a couple of GPU units, but this is no big deal. We would end up with a slightly faster machine with parity graphical performance.
Quote:
I think if we look at the current iPad 2, you have a 4x-9x increase in GPU over the iPad. Any nominal improvement in the GPU should see the retina screen more than ably handled. The iPhone 4S is more than comfortable. There may be a case that the current GPU could certainly handle it (better than the GPU in my iPhone 4 handles the retina screen, for sure).

You have 4 times the pixels to fill. That is a lot of data to transfer, thus they will need some modifications to the A5. I actually see the problem as more of a data transfer issue than anything.
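To put a rough number on the data-transfer point (my own estimate, assuming a 4-byte-per-pixel framebuffer and a 60 Hz refresh), even just scanning the display out quadruples the bandwidth cost before any rendering traffic is counted:

```python
# Display scanout bandwidth: bytes the memory bus must move per second
# just to refresh the screen, independent of rendering work.

def scanout_mb_per_s(width, height, fps=60, bytes_per_pixel=4):
    """Framebuffer scanout bandwidth in MB/s at the given refresh rate."""
    return width * height * bytes_per_pixel * fps / (1024 ** 2)

print(scanout_mb_per_s(1024, 768))   # iPad 2:  180.0 MB/s
print(scanout_mb_per_s(2048, 1536))  # retina:  720.0 MB/s
```

And that's the floor; every frame the GPU actually draws adds its own read/write traffic on top, which is why a wider or faster memory interface on a modified A5 seems likely.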
Quote:

Given ram (Surely!!!), cpu AND GPU updates one would expect the iPad 3 handles 2d superbly well and 3D more than acceptably.

RAM isn't a given. In fact there have been very few rumors in that regard. Frankly, I agree that it is a requirement, but little has been said about it.
Quote:
(Given that the current gen of consoles only offers half def in 3D at 720p, and that the graphics look very smooth at high framerates...and should do on a smaller screen vs a 50-inch plasma, I don't see any reason why 720p would be a problem on an iPad 3. It wasn't for PS3 or Xbox players..?)

It is a handheld device that is much closer to your eyes. It will depend upon the software but some apps will be noticeably compromised. Notice I said apps here, not every app using the 3D capability is a game.
Quote:
I guess that would be upto developers. The x5 chipset sounds more than ample for the job especially if it's a quad core (I'd be surprised if it wasn't.)

Lemon Bon Bon.

If such a chip is real, it would be a marginal bump above current performance, due simply to all of those pixels. That really isn't bad, though, considering what you are getting.

In any event, two days or so till the debut. I suspect we will get dual core myself, with GPU and data-handling enhancements. Mind you, everything could be running at double today's clock rates, maybe even more for the GPUs.
post #34 of 34
Thread Starter 
Aaaand it's the MP4; that settles that.