
'A5X' CPU featured on purported Apple 'iPad 3' logic board - Page 3

post #81 of 146
Quote:
Originally Posted by apple702 View Post

Same situation: everyone is going quad-core. Honestly, I can't picture the iPhone 5 with a dual-core.

iPhone 6, and yes, I can easily, EASILY see it having just a dual-core chip.

Quote:
Stop and think: the iPhone 5 comes out late in the year, and by then everyone will be quad-core for sure.

Which, again, DOES NOT MATTER to Apple because specs DO NOT MATTER at all.

Quote:
All they will do is fall a year behind everyone else if they go by those standards.

If your only standard is "to have the (subjectively) 'best' specs", then yes.

Since Apple couldn't care less about that and doesn't sell to customers who care about it, they'll continue to sell the top three devices per unit, continue to have 75% of the mobile profit, continue to have the best customer satisfaction, and continue to have the greatest turnaround for upgrades within the same manufacturer's line of devices.

post #82 of 146
Quote:
Originally Posted by SolipsismX View Post

Actually they won't. For this year's holiday season dual-core Cortex-A15 smartphones will be the best option for CPU performance.

Let's say you're right. Again, think iPhone 5: we know that whatever the iPad 3 gets, the iPhone 5 will get. So what will the next iPhone be called? iPhone 4SS? If they go with an improved A5 chip, what other name could they call it? 4S is already taken.

For me, in order to call it iPhone 5, they need a quad-core chip and a new design.
post #83 of 146
Quote:
Originally Posted by apple702 View Post

Let's say you're right. Again, think iPhone 5: we know that whatever the iPad 3 gets, the iPhone 5 will get. So what will the next iPhone be called? iPhone 4SS? If they go with an improved A5 chip, what other name could they call it? 4S is already taken.

For me, in order to call it iPhone 5, they need a quad-core chip and a new design.

That makes no sense on so many levels, but let's start small: what makes you think the name of the device is tied to the number of cores in the CPU?

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #84 of 146
A5X for the iPad 2X ... Get it? iPhone 4S, iPad 2X

Easy peasy marketing... Twice the graphics, twice the power, twice the battery life, twice as magical.

iPad 2X. I had a sudden revelation that this won't be called the iPad 3.

Particularly with the similar form factor.

Knowing Apple, they're saving the numbers for major changes, tick-tock strategy, etc.
post #85 of 146
At the end of the day it comes down to cost/performance. Those displays will not be cheap; it will be about the display this year, with the extra power of the GPU and CPU doing just enough to drive it. Expect 1GB of RAM, and possibly 32GB of storage as standard, all within the current price structure. No other manufacturer will get close without making a loss. Apple can manufacture in volume due to market share.
post #86 of 146
Quote:
Originally Posted by Jimbo1234 View Post

At the end of the day it comes down to cost/performance. Those displays will not be cheap; it will be about the display this year, with the extra power of the GPU and CPU doing just enough to drive it. Expect 1GB of RAM, and possibly 32GB of storage as standard, all within the current price structure. No other manufacturer will get close without making a loss. Apple can manufacture in volume due to market share.

Agreed... Except for the display price. What do you think of the parts listing for the Retina display that shows it's just over $100? I don't think it's as expensive as we would have expected, since Apple has probably spent a good two years at least developing the technique and the volume that they are going to buy is massive ~ at least five million panels a month from March 2012.
post #87 of 146
Quote:
Originally Posted by sunilraman View Post

Agreed... Except for the display price. What do you think of the parts listing for the Retina display that shows it's just over $100? I don't think it's as expensive as we would have expected, since Apple has probably spent a good two years at least developing the technique and the volume that they are going to buy is massive ~ at least five million panels a month from March 2012.

Apple also gets the same cost reduction from volume on the current display, without all the R&D investment, and it's also selling at 5 million units per quarter. I see no way that these two displays cost the same today.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #88 of 146
I couldn't give a rat's ass about how many cores there are. What matters is how much of an improvement the new chip will represent compared to the previous chip. As long as it shows an impressive gain in power VS the previous chip, then all is fine. The iPad 2 is pretty damn smooth, and I'm sure that the iPad 3 will be plenty fine too.

If it is a dual core chip, then obviously Apple has their reasons. They're not morons. Maybe a higher clocked dual core chip is preferable to a lower clocked quad core? What matters is how responsive the system is and how long certain tasks take to complete.

I suggest that anybody who must have a quad core go out and buy some kind of Android tablet. I wouldn't be surprised if a one core iPad will still be smoother and more responsive than a quad core Android tablet. It wouldn't even matter if somebody released an Octo Core Android tablet, it would simply suck eight times as much.

Maybe there won't be a quad core until the next iPad. With all of the new features that are rumored to appear, something's gotta give! Or maybe not, but if Apple does manage to deliver on every single rumored feature, then the iPad 3 is going to be insane.
post #89 of 146
Quote:
Originally Posted by Rockarollr View Post

Didn't the Wall Street Journal already report that the new iPad would be sporting 4G LTE capability when it's released? Generally, the WSJ doesn't post rumors.

http://online.wsj.com/article/SB1000...googlenews_wsj

You do realize that the lack of any announcement from Apple makes that a rumor. Their general behavior doesn't matter.

Quote:
Originally Posted by Tallest Skil View Post

iPhone 6, and yes, I can easily, EASILY see it having just a dual-core chip.

It wouldn't surprise me. Apple's top priority isn't usually max CPU power.

Quote:
Originally Posted by Apple ][ View Post

If it is a dual core chip, then obviously Apple has their reasons. They're not morons. Maybe a higher clocked dual core chip is preferable to a lower clocked quad core? What matters is how responsive the system is and how long certain tasks take to complete.

I don't know how apps are set up in that regard, so I'm not going to comment on their potential to take advantage of further cores. If they go with a dual core, it's most likely related to battery life. Apple is extremely conscious of battery life across product lines, and they have held back speed for battery life at times before. It's just a matter of design prioritization. It should be smooth either way. It's just that faster components often open up different possibilities. People ran graphics software on the G3, and it ran fine at the time assuming a well-optimized setup. More power since then has allowed for more advanced features. In Apple's case the iPhone apps are just for the iPhone. The latest iPhone won't be slower running them than another theoretical phone running iOS.
post #90 of 146
Dual core or quad core - either will be an upgrade to my single-core iPad 1.

I'm just amazed a chip so small (look at the size of the bubble wrap) can have 2 cpu cores *and* support 3d graphics at 2048x1536, all without a fan.
post #91 of 146
Quote:
Originally Posted by suddenly newton View Post

at least they didn't call it the a5s

lol ...
post #92 of 146
Quote:
Originally Posted by DrDoppio View Post

So, the iPad 2X (for 2x linear increase in resolution) will have a pentacore chip.

Quote:
Originally Posted by sunilraman View Post

A5X for the iPad 2X ... Get it? iPhone 4S, iPad 2X

Easy peasy marketing... Twice the graphics, twice the power, twice the battery life, twice as magical.

iPad 2X. I had a sudden revelation that this won't be called the iPad 3.

Particularly with the similar form factor.

Knowing Apple, they're saving the numbers for major changes, tick-tock strategy, etc.

Cool, as long as you don't come around saying that I stole your idea
post #93 of 146
Quote:
Originally Posted by wizard69 View Post

Could very well be. Or it could be a base A5 with more RAM. Another possibility is an A5 engineered for a low cost iPad. That is we could see an A6 in the high resolution iPads at a higher cost with the A5X rev targeting an entry level iPad.

I think it's pretty obvious the iPad 3 will have a souped-up A5 CPU with a better graphics chip, more memory, and a higher clock speed, but still dual-core. I don't know why people expect a quad-core A6 this soon. I expect the A6 to have Cortex-A15 cores (IMO it would be a huge disappointment if it doesn't), and the simple fact is that right now no one is making those in volume yet; the first devices with A15 cores are not expected before the second half of 2012.

To think that the iPad 3 will have a quad-core A6 CPU based on the Cortex-A15 architecture is simply wishful thinking. There might be a really slim chance the iPad 3 will have a quad-core A5, but I wouldn't bet on it. It would basically be a completely different chip compared to the current A5, and I can't imagine Apple would want to waste a lot of engineering effort on such a chip if it knows the A6 will also be available this year. Personally I don't mind if it's dual-core, right now, quad-core is nothing but marketing hype and buzz. A fast dual-core CPU will handily beat a quad-core at lower clock speed in almost any typical tablet task.

Quote:
Originally Posted by wizard69 View Post

There are many possibilities here. I would be surprised, though, if Apple upgraded the GPU enough to drive a retina display and kept the processor name the same.

The current A5 dual-core GPU is also available as a quad-core GPU (the PS Vita has it, for example), so I think it's not unlikely this A5X has it.

As I've said many times before here, you don't actually need that much additional GPU power to drive the retina display, at least not to render the iOS user interface. Low poly count, low fillrate, low overdraw, lots of big surfaces and stuff that can be cached such as fonts (hence relatively low bandwidth required). I'd estimate even the A4 GPU would be able to handle it.

For games, it's a completely different story, but I don't think any will run in retina resolution at all. Even desktop-class GPU's struggle with resolutions like what the iPad 3 will likely have. I expect iPad 3 games to always run in pixel doubling mode. I don't think there are any mobile GPU's that could pull off 30 or even 60 fps in-game at 2048x1536 right now.
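
Just to put rough numbers on that (pure arithmetic, no real fillrate specs assumed), here's how the shaded-pixel throughput scales with resolution and frame rate, before counting any overdraw or shader cost:

Code:
# Back-of-the-envelope pixel throughput per resolution and frame rate.
# Pure arithmetic; overdraw and per-pixel shader cost multiply these further.
resolutions = {
    "iPad 1/2 (1024x768)": (1024, 768),
    "1080p (1920x1080)": (1920, 1080),
    "iPad 3? (2048x1536)": (2048, 1536),
}
for name, (w, h) in resolutions.items():
    pixels = w * h
    for fps in (30, 60):
        print(f"{name}: {pixels / 1e6:.2f} Mpix/frame, "
              f"{pixels * fps / 1e6:.0f} Mpix/s at {fps} fps")

2048x1536 is about 3.1 million pixels per frame, 4x the current iPad and roughly 1.5x 1080p, which is why I expect games to stick to pixel doubling.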
post #94 of 146
Quote:
Originally Posted by DrDoppio View Post

Cool, as long as you don't come around saying that I stole your idea

LOL I swear I didn't see your post before I posted mine. I guess great minds ~do~ think alike!
post #95 of 146
The serial numbers on these memory chips are quite interesting, by the way. Normally Hynix uses a numbering scheme where you can deduce the size of a memory chip from the part number, by looking at the character immediately before the 'G'. Read as a hexadecimal digit, it gives you the number of gigabits on the chip, ie: 1G = 1 gigabit, AG = 10 gigabit, etc.

These chips have a '0' in front of the G, which suggests someone does not want us to know the size of the memory on this board. I think this means it's extremely likely that this is a development or pre-production board, which might not be representative of the final iPad 3 part at all.
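
For what it's worth, here's a toy version of the decoding rule I mean (my own throwaway function, not Hynix's official scheme, and the example part numbers are made up for illustration):

Code:
def hynix_density_gbit(part_number: str):
    """Toy decoder: assume the hex character immediately before the first 'G'
    encodes the capacity in gigabits (1 -> 1 Gbit, A -> 10 Gbit, ...).
    Returns None if that character is '0' or not a hex digit, i.e. the
    density is hidden, as on the chips in the photo."""
    pn = part_number.upper()
    g_index = pn.find("G")
    if g_index <= 0:
        return None
    try:
        density = int(pn[g_index - 1], 16)
    except ValueError:
        return None
    return density or None

print(hynix_density_gbit("H5TQ1G83XXX"))  # 1 (1 Gbit) - made-up example
print(hynix_density_gbit("H5TQ0G83XXX"))  # None - density masked, like this board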
post #96 of 146
Quote:
Originally Posted by d-range View Post

The serial numbers on these memory chips are quite interesting, by the way. Normally Hynix uses a numbering scheme where you can deduce the size of a memory chip from the part number, by looking at the character immediately before the 'G'. Read as a hexadecimal digit, it gives you the number of gigabits on the chip, ie: 1G = 1 gigabit, AG = 10 gigabit, etc.

These chips have a '0' in front of the G, which suggests someone does not want us to know the size of the memory on this board. I think this means it's extremely likely that this is a development or pre-production board, which might not be representative of the final iPad 3 part at all.

Agreed.
post #97 of 146
Quote:
Originally Posted by Ireland View Post

The way I see it this particular nomenclature means it's not quad core:

: (

You can always get an ASUS Transformer Prime.

Sure it is laggy balls, but at least it is quad core.

The "paper specs mean everything" disease of android needs to stay there.

As we've seen many times over "lower spec'ed" Windows Phones and Apple devices are far more satisfying to use than fully spec'ed out android devices.

Do what you want, though.
post #98 of 146
Here are my tests on how the iPad 3 is going to do 3D gaming at 2048x1536. Indulge me this one cross-post.

1024x768: (video)

1080p HD, with virtually no drop in frame rates: (video)

Read more and watch the 1920x1080p videos at:
http://forums.appleinsider.com/showthread.php?p=2050899

post #99 of 146
Quote:
Originally Posted by Ireland View Post

It matters to me. When you're on iOS 7 it will matter. It matters.

But as I sold my iPad 2 I have no choice; I won't say I'm not :( when I am. Hope it's quad core.

The newest processes, those under 32nm, have resulted in Cortex A9 cores running at over 2GHz. This is one way for Apple to double the performance of the CPUs while providing lots of real estate for the GPU, memory, caches and other functional units.

As to iOS 7, you have real concerns. However, RAM will have a bigger impact on future OS performance than cores.
post #100 of 146
Quote:
Originally Posted by sunilraman View Post

... I guess great minds ~do~ think alike!

Either that, or some ideas are so obvious that many come across them soon enough
post #101 of 146
Quote:
Originally Posted by DrDoppio View Post

Either that, or some ideas are so obvious that many realize them soon enough

I prefer to think of us dipping our toes in the stream of universal consciousness.
post #102 of 146
Quote:
Originally Posted by SolipsismX View Post

I don't think the iPad uses VRAM, but DRAM. Since the IGP needs to allocate RAM from the system RAM I'd think 512MB would be too little for pushing 4x as many pixels.

There is an advantage in providing dedicated VRAM for the GPU, especially if integrated on the SoC. It has the potential of controlling bandwidth demands on main memory, lowering power if on chip, and freeing up cache.

Just because the GPU is on the same die as the CPU, that does not mean that one has to design it to use main memory for the frame buffer.

All that being said it would be foolish of Apple to not upgrade RAM in iPad 3. Many apps are crying out for more RAM.
post #103 of 146
Quote:
Originally Posted by cy_starkman View Post

What has that got to do with the device? Nothing

He was talking about the need for more RAM. If you don't understand the relationship Safari has with RAM and how the lack of RAM impacts Safari on the iPad then be quiet.

[insult removed]
post #104 of 146
Quote:
Originally Posted by Technarchy View Post

You can always get an ASUS Transformer Prime.

Sure it is laggy balls, but at least it is quad core.

The "paper specs mean everything" disease of android needs to stay there.

The Tegra 3 that is in the Transformer Prime is _the_ reason I'd rather see a higher-clocked A5 with a better GPU and more RAM, than an A6 quadcore part that's comparable with Tegra 3.

Looking at early benchmarks, Tegra 3 doesn't even scale linearly with clock speed, ie: a 1.4 GHz Tegra 3 is less than 40% faster than a 1.0 GHz Tegra 2. The GPU in the Tegra 3 is not even faster than the one in the Apple A5.

So rather than being first-to-market (ie: the NVidia strategy) with a chip that has 4x the same cores as the current dual-cores and only a 'somewhat better' GPU, I'd rather wait for a 'real' quad-core A6 with Cortex-A15 cores and PowerVR 6 series GPU.
post #105 of 146
Quote:
Originally Posted by wizard69 View Post

There is an advantage in providing dedicated VRAM for the GPU, especially if integrated on the SoC. it has the potential of controlling bandwidth demands on main memory, lowering power if on chip and freeing up cache.

Just because the GPU is on the same die as the CPU, that does not mean that one has to design it to use main memory for the frame buffer.

For embedded GPU's dedicated VRAM is much less required, since apart from the Tegra GPU's, all of them use tile-based rendering. This means there is no Z-buffer, and the framebuffer is always written in tiles, allowing for very efficient caching, and all rendering operations are done to on-chip tile memory that is orders of magnitude faster than e.g. DDR5 memory. In general, you can't really extrapolate anything you know about desktop GPU's to mobile GPU's because they are based on completely different technology, with completely different performance characteristics and constraints.
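
If it helps, here's a toy sketch of the idea (nothing like PowerVR's real pipeline: rectangles stand in for triangle geometry, the 32-pixel tile size is made up, and binning is massively simplified). The point is that all the overdraw happens in a tiny on-chip buffer and the external framebuffer is written exactly once per tile:

Code:
# Toy sketch of tile-based rendering: bin geometry per tile, shade each tile
# entirely in a small on-chip buffer, then write the finished tile out once.
TILE = 32  # made-up tile size in pixels

def render_tiled(width, height, rects):
    """rects: list of (x0, y0, x1, y1, color) rectangles standing in for triangles."""
    framebuffer = bytearray(width * height)  # the only copy in "main memory"
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            th, tw = min(TILE, height - ty), min(TILE, width - tx)
            # Binning: keep only geometry overlapping this tile.
            binned = [r for r in rects
                      if r[0] < tx + tw and r[2] > tx and r[1] < ty + th and r[3] > ty]
            # On-chip tile buffer: small, fast, reused for every tile.
            tile_buf = bytearray(tw * th)
            for x0, y0, x1, y1, color in binned:
                for y in range(max(y0, ty), min(y1, ty + th)):
                    for x in range(max(x0, tx), min(x1, tx + tw)):
                        tile_buf[(y - ty) * tw + (x - tx)] = color  # overdraw stays on-chip
            # Resolve: one sequential write of the finished tile to external memory.
            for y in range(th):
                start = (ty + y) * width + tx
                framebuffer[start:start + tw] = tile_buf[y * tw:(y + 1) * tw]
    return framebuffer

fb = render_tiled(64, 48, [(0, 0, 40, 30, 1), (20, 10, 64, 48, 2)])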

Quote:
Originally Posted by wizard69 View Post

All that being said it would be foolish of Apple to not upgrade RAM in iPad 3. Many apps are crying out for more RAM.

Agreed, anything less than 1 GB would be a huge disappointment.
post #106 of 146
Quote:
Originally Posted by Aizmov View Post

A) I'm a computer science major (graduate student) doing research on parallel computing. Heck, look at my sig.
B) I'm not a troll.

It is pretty clear he doesn't have a clue as to how computers work in general nor how Safari works specifically.
Quote:
The 512MB in the iPad 2 is just not enough. How do I know? Try opening multiple tabs in Safari. My RAM complaint with the iPad is legitimate.

Yep. Plus your explanations are fact based and rational.
Quote:
Why are you trying to label it as trolling? I had the original iPad and it was slooooooooow so I replaced it immediately with the iPad 2. Much better; but with lots of tabs open it does get slow at times.

Plus it significantly impacts your data usage. Reloads are not free if you are on a capped connection. Even working with big files, PDFs, hi res pics and other large data sets sucks on iPad.
Quote:
Maybe I have become spoiled by very fast machines but I'm not trolling. I will get the iPad 3, 1GB of RAM or not, because there will be performance improvements. But I sincerely hope that it has at least 1GB.

That is pretty much my position. It likely won't be on release day but iPad 3 looks to be very nice. So unless Apple really screws it up I will likely have one by summer.
Quote:
Having pages reload just because I have lots of tabs open completely ruins the experience, especially if you had some forms filled in some tab and switched to another tab and switched back just to see the page reloading.

I'd actually like to see Apple jump to 2 GB of RAM. By the time the video buffer is subtracted, the extra RAM used by bit maps is subtracted and the extra system functions like Siri subtracted, we will have just about the right amount of RAM.
post #107 of 146
Quote:
Originally Posted by d-range View Post

The Tegra 3 that is in the Transformer Prime is _the_ reason I'd rather see a higher-clocked A5 with a better GPU and more RAM, than an A6 quadcore part that's comparable with Tegra 3.

Looking at early benchmarks, Tegra 3 doesn't even scale linearly with clock speed, ie: a 1.4 GHz Tegra 3 is less than 40% faster than a 1.0 GHz Tegra 2. The GPU in the Tegra 3 is not even faster than the one in the Apple A5.

So rather than being first-to-market (ie: the NVidia strategy) with a chip that has 4x the same cores as the current dual-cores and only a 'somewhat better' GPU, I'd rather wait for a 'real' quad-core A6 with Cortex-A15 cores and PowerVR 6 series GPU.

Tegra 3 was delayed substantially and may not offer as great of an improvement as some would have liked, however it's up to 3x faster than the Tegra 2 and currently the most advanced SoC of its class. It is good enough so that Audi selected it as the application and graphics processor for its in-vehicle infotainment systems and digital instrument display.

Also, it doesn't have "4x the same cores as the current dual-cores", since that would make it 8-core, right?
post #108 of 146
Quote:
Originally Posted by sunilraman View Post

AFAIK, GPU RAM requirements don't scale like you mention... 4x the pixels does not mean 4x the RAM required.

Someplace in the machine there has to be a byte for every pixel, a frame buffer if you will, so that increases by 4x. The rest of the RAM allocated to video usage is made up of data that may or may not see an increase in RAM space depending upon exactly what that data is.

Beyond that all system and app wide bit maps move to 4x space requirements. Again this is variable and app dependent but clearly the machine will require more RAM to drive retina displays.
Quote:
Let's take the framebuffer itself. In theory, even a 8MB video card can handle 1920x1080 at 32bit colour:

Let's not. Instead, look at what iOS needs.
Quote:

So, why do we need 256MB video cards? Well, because of how 3D is implemented by GPUs ~ this was a hot topic back when beyond-720p resolutions started to be supported by PCs.

Video RAM needs vary with the apps and the system's ability to use that RAM; no big deal there.
Quote:
For example, in Oblivion, at 0xAA, even at 1600x1200, only just over 200MB VRAM was being used. And, at 640x480, ~almost~ 200MB VRAM was still being used. Memory requirements jump when implementing 2xAA and 4xAA, but not in any linear fashion:

In this context that means nothing. Not every use of a video card is 3D, plus higher resolution displays impact RAM usage far outside the video card.


In any event this stuff is nice to dream about. I actually am hoping that Apple debuts a CPU and GPU complex using a single address space, combined with a frame buffer embedded in the SoC. This should positively impact memory bandwidth, power usage and flexibility. They could beat AMD to the punch here.
post #109 of 146
Quote:
Originally Posted by sunilraman View Post

But that's not how Apple rolls. What advantage could a quad iPhone 5 offer over dual ARM with a great GPU?


Faster performance and better battery life.
post #110 of 146
Quote:
Originally Posted by d-range View Post

For embedded GPU's dedicated VRAM is much less required, since apart from the Tegra GPU's, all of them use tile-based rendering. This means there is no Z-buffer, and the framebuffer is always written in tiles, allowing for very efficient caching, and all rendering operations are done to on-chip tile memory that is orders of magnitude faster than e.g. DDR5 memory.

Eventually that frame buffer has to go out to the video device.
Quote:
In general, you can't really extrapolate anything you know about desktop GPU's to mobile GPU's because they are based on completely different technology, with completely different performance characteristics and constraints.

You can't dismiss the need for more RAM simply because the technology is different in mobile hardware. Nor can you dismiss the fact that a 4X increase in bit map sizes affects system bandwidth. In any event we will see how similar the new GPU is to the current ones fairly soon. I'm fairly certain that Apple will have to address GPU performance demands with the move to a retina screen.

The architecture for SoC processors is changing rapidly. If Apple goes with a unified address space, where both the GPU and CPU are equals on the memory bus, I suspect that structures used by the GPU would become more accessible to the CPU. A bit of a moving target if you will.
Quote:
Agreed, anything less than 1 GB would be a huge disappointment.

Yes, this is true; even 512MB on the iPad 2 was a disappointment. It would also be nice to see a significant performance increase; it is too bad you can't squeeze a GB of high-performance video RAM into the machine for both CPU and GPU.

Even though much of this thread has focused on raw performance, I'm convinced that the iPad's biggest flaw is the lack of RAM. That flaw is close to tying the lack of a USB port as the biggest frustration on the iPad.
post #111 of 146
Quote:
Originally Posted by wizard69 View Post

Someplace in the machine there has to be a byte for every pixel, a frame buffer if you will, so that increases by 4x. The rest of the RAM allocated to video usage is made up of data that may or may not see an increase in RAM space depending upon exactly what that data is.

Yeah, but in terms of the frame buffer for storing just bitmap information for each pixel, that's:

Memory in MB = (X-Resolution * Y-Resolution * Bits-Per-Pixel) / (8 * 1,048,576)

For 1024x768
that's 1024*768*32 / (8*1048576)
= only 3MB

For 2048x1536
that's 2048*1536*32/(8*1048576)
= only 12MB

4x yes, but the difference is only 8MB.
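
Same formula as above, as a quick sanity check in code:

Code:
def framebuffer_mb(width, height, bits_per_pixel=32):
    # Memory for one framebuffer, in MB (1 MB = 1,048,576 bytes)
    return width * height * bits_per_pixel / (8 * 1048576)

print(framebuffer_mb(1024, 768))   # 3.0 MB
print(framebuffer_mb(2048, 1536))  # 12.0 MB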

Quote:
Originally Posted by wizard69 View Post

Beyond that all system and app wide bit maps move to 4x space requirements. Again this is variable and app dependent but clearly the machine will require more RAM to drive retina displays.

Indeed, but my point is 4x is only in terms of the 12MB compared to 3MB. All other use of memory for the GPU is dependent on the implementation.

Quote:
Originally Posted by wizard69 View Post

In this context that means nothing. Not every use of a video card is 3D, plus higher resolution displays impact RAM usage far outside the video card.

Fair enough, but it will not be 4x the "VRAM" needed by any stretch of the imagination...

Like you say, we're just guessing, but my bone of contention is that there's no 4x requirement overall, outside of reserving 12MB instead of 3MB for the framebuffer (in the purest sense of the word).
post #112 of 146
Quote:
Originally Posted by wizard69 View Post

You can't dismiss the need for more RAM simply because the technology is different in mobile hardware. Nor can you dismiss the fact that a 4X increase in bit map sizes affects system bandwidth. In any event we will see how similar the new GPU is to the current ones fairly soon. I'm fairly certain that Apple will have to address GPU performance demands with the move to a retina screen.

I think Apple is pretty much on the ball here. Like I said, Quartz as a concept is nothing to sneeze at. Even on my iPad 2, 1000x1000 pixel lossless PNGs are easily animated in several layers in an app. Certainly Retina places a lot of demand on the GPU, but I think Apple would have it figured out for 2D. As I've said in the past, the iOS render engine is phenomenal. There is a 4x increase in ~final output resolution~ but we'll have to see how Quartz handles a lot of the other aspects. For example, if we have an app that can zoom in on something, that means when you zoom out, you could be looking at 4000x4000 pixel bitmaps in several layers to be rescaled and output to 2048x1536.

Certainly this will need 256MB of GPU RAM at a very, very rough estimate, but it may need no more than that amount of GPU RAM, say 300MB at most.

I think there should be 1GB of RAM in the iPad 2X, with the GPU taking up ~100MB to 300MB of that RAM at the most.

3D games will be limited to 1024x768 with specialised, extremely low-performance-impact upscaling to 2048x1536 ~ given it's pure 2x upscaling, they can use Lanczos or an equivalent filter optimised for that easily.

2D stuff, ballpark, 300MB at most, I don't think by design Apple would want the GPU to eat up any more than that on a 1GB RAM device.
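
Quick back-of-envelope on my own 256MB guess, assuming plain uncompressed 32-bit bitmaps (my assumption, not anything Apple has stated):

Code:
# Rough check of the ~256MB figure: a few full 4000x4000, 32-bit layers.
def layer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1048576

per_layer = layer_mb(4000, 4000)      # ~61 MB per uncompressed layer
print(per_layer, 4 * per_layer)       # 4 layers ~ 244 MB, same ballpark as 256MB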

Quote:
Originally Posted by wizard69 View Post

The architecture for SoC processors is changing rapidly. If Apple goes with a unified address space, where both the GPU and CPU are equals on the memory bus, I suspect that structures used by the GPU would become more accessible to the CPU. A bit of a moving target if you will.

In this area I have to defer to you because I'm not informed on SoC RAM addressing by CPU and GPU. I would imagine a unified address space is the best way to go since it's all one die, plus allowing a flexible use of system RAM like on a PC/Mac IGP? Dedicated memory for the GPU, or "less flexible" memory for the GPU wouldn't be a good way to go because the GPU might only need, say, less than 100MB of RAM in very basic apps, but certainly more for 2D games, let alone 3D.
post #113 of 146
Seriously guys, Cortex A15 will be a stretch in portable devices, mainly because it is likely to be far more power hungry than Cortex A9. ARM is targeting servers with that chip!

Quote:
Originally Posted by d-range View Post

The Tegra 3 that is in the Transformer Prime is _the_ reason I'd rather see a higher-clocked A5 with a better GPU and more RAM, than an A6 quadcore part that's comparable with Tegra 3.

The only thing that counts is what Apple delivers; it really makes no sense to damn an unannounced chip based on the results of somebody else's implementation of an ARM core. In the literature there has been a wide range of reports on Cortex A9's done on the new processes; just because NVidia didn't go for performance doesn't mean that Apple won't.
Quote:
Looking at early benchmarks, Tegra 3 doesn't even scale linearly with clock speed, ie: a 1.4 GHz Tegra 3 is less than 40% faster than a 1.0 GHz Tegra 2. The GPU in the Tegra 3 is not even faster than the one in the Apple A5.

One big issue with processors is the wait on RAM as clock rates increase. As such performance will be a problem unless the bandwidth problem is addressed.
Quote:
So rather than being first-to-market (ie: the NVidia strategy) with a chip that has 4x the same cores as the current dual-cores and only a 'somewhat better' GPU, I'd rather wait for a 'real' quad-core A6 with Cortex-A15 cores and PowerVR 6 series GPU.

Frankly I'd rather see Apple stay with Cortex A9 until ARM delivers a viable 64-bit core. A15 just has too many trade-offs in my mind. By carefully adjusting the architecture Apple should be able to double performance every year for the next two or three with Cortex A9 hardware.
post #114 of 146
Quote:
Originally Posted by wizard69 View Post

Seriously guys, Cortex A15 will be a stretch in portable devices, mainly because it is likely to be far more power hungry than Cortex A9. ARM is targeting servers with that chip!

The only thing that counts is what Apple delivers; it really makes no sense to damn an unannounced chip based on the results of somebody else's implementation of an ARM core. In the literature there has been a wide range of reports on Cortex A9's done on the new processes; just because NVidia didn't go for performance doesn't mean that Apple won't.

One big issue with processors is the wait on RAM as clock rates increase. As such performance will be a problem unless the bandwidth problem is addressed.

Frankly I'd rather see Apple stay with Cortex A9 until ARM delivers a viable 64-bit core. A15 just has too many trade-offs in my mind. By carefully adjusting the architecture Apple should be able to double performance every year for the next two or three with Cortex A9 hardware.

Yeah, that's why I've also said A15 will very likely not be in iOS devices until big.LITTLE, ie. most likely dual A7 with dual A15, doing the intelligent power-switching thingy. A pure dual-A15 is outside the scope of iOS devices, at least in 2012. Even ARM doesn't want to put that in phones...!
post #115 of 146
Quote:
Originally Posted by cy_starkman View Post

Really? I must be special then cause I can have many tabs open all with complex pages and it all goes nicely. Unless... the Internet is made up of varying connection speeds, with varying connection speeds to the websites I am on and a varying number of people accessing those websites.

Consider the case when all apps are using 4x as much memory for their graphics and display as they are now... 1GB is an effective minimum for a tablet with a retina display.

I'm sure a developer can chip in with real figures, but the majority of an application these days is its digital assets, including art. I wouldn't be surprised if the average application doubled or tripled its memory requirements when moving to the iPad 3. Some won't - many are data structure limited rather than graphics limited (e.g., Minecraft).
post #116 of 146
Quote:
Originally Posted by Hattig View Post

Consider the case when all apps are using 4x as much memory for their graphics and display as they are now... 1GB is an effective minimum for a tablet with a retina display.

I'm sure a developer can chip in with real figures, but the majority of an application these days is its digital assets, including art. I wouldn't be surprised if the average application doubled or tripled its memory requirements when moving to the iPad 3. Some won't - many are data structure limited rather than graphics limited (e.g., Minecraft).

Er, may I interject, I get what you're saying, but there's no 4x as much memory, it doesn't scale that way. The only 4x we know for sure is in the final output to the framebuffer, ie, 12MB required instead of 3MB. But everything up to that point does not necessarily mean 4x the memory required.

I do still however think something over 500MB is the minimum RAM, with 1GB being just nice. It's hard to imagine the GPU needing less than 100MB (but not directly 4x the RAM that the iPad 2 GPU uses).
post #117 of 146
Quote:
Originally Posted by DrDoppio View Post

Tegra 3 was delayed substantially and may not offer as great of an improvement as some would have liked, however it's up to 3x faster than the Tegra 2 and currently the most advanced SoC of its class. It is good enough so that Audi selected it as the application and graphics processor for its in-vehicle infotainment systems and digital instrument display.

Also, it doesn't have "4x the same cores as the current dual-cores", since that would make it 8-core, right?

Oh my, Audi selected it? For their in-vehicle system?

I mean, car makers are just known for using the best tech for their infotainment systems.
post #118 of 146
Quote:
Originally Posted by d-range View Post

The Tegra 3 that is in the Transformer Prime is _the_ reason I'd rather see a higher-clocked A5 with a better GPU and more RAM, than an A6 quadcore part that's comparable with Tegra 3.

Looking at early benchmarks, Tegra 3 doesn't even scale linearly with clock speed, ie: a 1.4 GHz Tegra 3 is less than 40% faster than a 1.0 GHz Tegra 2. The GPU in the Tegra 3 is not even faster than the one in the Apple A5.

The problem with Tegra 3 is memory bandwidth. For some reason NVIDIA didn't use a 64-bit memory bus, but used a 32-bit bus. This is quite crippling in many situations. The A5 uses 64-bit IIRC.

On the other hand, Tegra 3 is 80mm^2 on 40nm, and A5 is over 120mm^2 on 45nm. But a side effect is that the GPU isn't as powerful as the larger one in the A5.
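
To give a feel for why the bus width matters, here's the peak-bandwidth arithmetic (the 533MHz LPDDR2 clock is only an illustrative assumption, not a measured spec for either chip):

Code:
def peak_bandwidth_gb_s(bus_bits, mem_clock_mhz, transfers_per_clock=2):
    # Theoretical peak for a DDR-style bus, in GB/s.
    return bus_bits / 8 * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

print(peak_bandwidth_gb_s(32, 533))  # ~4.3 GB/s on a 32-bit bus
print(peak_bandwidth_gb_s(64, 533))  # ~8.5 GB/s on a 64-bit bus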
post #119 of 146
Quote:
Originally Posted by Hattig View Post

The problem with Tegra 3 is memory bandwidth. For some reason NVIDIA didn't use a 64-bit memory bus, but used a 32-bit bus. This is quite crippling in many situations. The A5 uses 64-bit IIRC.

On the other hand, Tegra 3 is 80mm^2 on 40nm, and A5 is over 120mm^2 on 45nm. But a side effect is that the GPU isn't as powerful as the larger one in the A5.



Apple's chips will be better than any multicore chip.
Apple's current iPad screen is better than any alternative.
No LTE. The iPad is better without it.

Even if the iPad 2S were basically the same as the iPad 2, with just a somewhat faster clock speed, everybody and his brother would buy it.
post #120 of 146
Quote:
Originally Posted by sunilraman View Post

Indeed, but my point is 4x is only in terms of the 12MB compared to 3MB. All other use of memory for the GPU is dependent on the implementation.

All applications will allocate memory for their UI. It is likely that each app's display is double buffered. However as applications are suspended in the background, background apps may not be double buffered, but I would need an iOS developer to confirm that.

6MB for a double buffered display is now 24MB, an 18MB difference. With ten apps running that would be 180MB more memory required for application interfaces alone. That's before the art resources the apps also use. A full-screen backdrop? Another 9MB * 10 apps. What about an app with a dozen different full screen bitmap UIs (Garageband has fancy drumsets, keyboards, etc)? Another 100MB gone.
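
Here's that arithmetic in code form, assuming 32-bit pixels and that every backgrounded app really does keep two full-screen buffers resident (which, as noted, an iOS developer would have to confirm):

Code:
MB = 1048576

def screen_buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / MB

per_app_old = 2 * screen_buffer_mb(1024, 768)    # 6 MB double buffered
per_app_new = 2 * screen_buffer_mb(2048, 1536)   # 24 MB double buffered
print((per_app_new - per_app_old) * 10)          # 180 MB extra across ten apps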

Games? Oddly enough games can make use of (real time) compressed textures far better than a 2D UI application can. Even so, higher resolution textures will require more RAM - but you may not require them in your game.

Ultimately, RAM is cheap - even low-power RAM that is PoP like the A5. It's also dropped massively in price in the past year. So if the iPad 3 has less than 1GB it would be really frustrating. Stick 2GB on the top-end 128GB iPad as a bonus.