I know you don't want to do that math, but everyone else can see the pattern pretty easily.
No, the BS is claiming that Apple may charge for iOS 5. But now I will call BS on your personal schedule for Apple, despite it being quite well understood that Apple has a one-year cycle: iOS at the Apple developer conference and the iPhone in June.
The math is that in real-life demos, nobody is complaining about the performance of the PlayBook. The only complaint is that the tbs.com Flash website was stuttery --- when there were 3-4 other apps (big apps, not widgets) actually running in the background and the PlayBook was only operating on a single core. If that's because RIM threw hardware into the PlayBook because of the Flash UI's big memory problem --- then Apple is going to be truly screwed, because RIM just bought TAT to make all native UI widgets in C++ for the next generation of BlackBerrys.
It's not without precedent: Apple has treated iOS updates differently for different devices.
Is 512MB enough? There is never enough memory if you are a developer, by definition. I wouldn't compare the 512MB to 1GB on a Xoom very directly though. That Xoom is running a robust JVM behind everything, and that means it is running a copy of that JVM in every process. That is a verified Metric Shitload of memory requirement that the app developers can never use.
Thank you!!!! Please repeat this ad nauseam in every thread that e-skater and wizzy try to make their sad case for iOS needing the same RAM as Droid.
OK. [i LOVE doing this!!!] ...iOS uses a central GUI service in the OS. That means the OS memory footprint itself shoulders the load for all the common API-provided GUI graphics. This is universally known as an efficient way to do software: write once, use many. And quite a bit of the volume isn't code bloat; it is the actual graphics elements that need to be combined. Elements that only ever need to be in memory once, anywhere.
AIR, on the other hand, pastes its whole self into every application process that makes a request of the Flash GUI services. That's known as: write once, duplicate endlessly. Every instantiation of the GUI adds its full weight again to the overall memory requirements of the machine. Every line of AIR GUI machine code, duplicated in every process. Every API-based GUI graphic, duplicated in every process....
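To put rough numbers on the two models (completely made-up sizes, just to show the shape of the math, not actual iOS or AIR measurements):

```python
# Hypothetical sizes, for illustration only.
GUI_FRAMEWORK_MB = 40   # shared GUI code + common graphic assets
APP_LOGIC_MB = 10       # memory each app actually needs for itself
N_APPS = 8

# Central GUI service: the framework is resident once, apps link against it.
shared_total = GUI_FRAMEWORK_MB + N_APPS * APP_LOGIC_MB

# Runtime pasted into every process: each app carries its own full copy.
duplicated_total = N_APPS * (GUI_FRAMEWORK_MB + APP_LOGIC_MB)

print(shared_total)      # 120 (MB)
print(duplicated_total)  # 400 (MB)
```

The gap only widens as more apps run, which is exactly the multitasking scenario where tablets feel the squeeze.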
I'm no techy, but this seems to back up my hunch why AIR/Flash is such a memory hog.
I'm not debating your knowledge here Hiro, but a call goes out to the other tech guys here to verify this rather elegant and simple explanation.
BTW: in regards to my post above, regardless of whether iOS "needs" 1GB or has 1GB, I firmly believe that iOS will be able to do more with the memory than a similarly spec'd Android device... also because of Hiro's explanation of the JVM on which Android is based.
Utter nonsense. Unless you have some evidence that 1080p video playback, which is a pointless feature, was actually planned...
Well, I'd say that the slide that said 1080p in Very Large Letters during Steve's presentation implies that 1080p playback is a feature of the iPad 2.
And saying that this is a pointless feature is bizarre - look around you. Every damn video camera no matter how cheap, most new compact cameras, and DSLRs all record 1080p. All TV sets advertised in the advertising material sneaked into my morning paper (I hate that) are 1080p, and it isn't even called out except for the very cheapest sets. In short, the consumer electronics world around you is 1080p, including the only HD distribution format, so 1080p output would be a reasonable assumption even without that 1080p slide.
Folks really need to read about the purpose of RAM before bashing a device because it has x amount instead of y. My 3GS, with its 600MHz CPU and 256MB, runs plenty well, including Infinity Blade with little if any lag. My quad-core i7 with 12GB running Win7 has its moments of lag as much as my MacBook with its Core 2 Duo and 4GB.
True. SSDs make a big difference to the perceived responsiveness of even a very fast computer. They're not yet at a price/gigabyte that's acceptable to most people, but that will change.
It is? How so? Is there a special USE_MEMORY_EFFICIENTLY flag in Darwin designed especially for the iPad?
Don't know if this has been answered yet, but here's the answer.
- The iPad kernel & OS is compiled to run on ... the iPad. And that's it. All of the memory-gobbling modular abstraction features simply don't exist. Anybody who has ever done any degree of device-specific programming can tell you that there's a very large efficiency improvement that comes from going device-specific -- at least 2x. Android has to include a lot of these features that iOS does not -- let's get serious, Android builds heavily on Java, the biggest hog of them all. As a result, iOS will always run faster and cleaner than Android will.
Other things:
- The ARMv7 is 32-bit, rather than 64-bit like modern x86s, so pointers are half the size.
- The ARMv7 has a memory manager that is very agile, so it can pack memory structures more efficiently than most other chips.
- The graphics chipset has its own RAM, IIRC, so game performance is tied to that.
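To put a number on the pointer-size point above, here's a back-of-the-envelope sketch (the field counts are made up, and real compilers add alignment padding on top of this):

```python
# A hypothetical linked-list node holding two pointers and a 4-byte int,
# ignoring alignment padding for simplicity.
POINTERS = 2
INT_BYTES = 4

node_32bit = POINTERS * 4 + INT_BYTES   # 4-byte pointers on ARMv7
node_64bit = POINTERS * 8 + INT_BYTES   # 8-byte pointers on x86-64

print(node_32bit, node_64bit)  # 12 20
```

For pointer-heavy structures (lists, trees, object graphs), that per-node difference compounds across millions of allocations.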
I'm not sure if there is a cost-effective 1GB RAM chip that can fit on top of the A5 MCM stack. Moreover, DRAM drains quite a lot of power in relation to the rest of the components in that A5 MCM stack. There's no reason to have more, unless you need more. It just adds cost and drains power.
First, Android has absolutely nothing to do with this discussion; the iPad's need for more memory than what is in iPad 1 stands on its own. The machine never shipped with enough memory, and this causes significant issues with Safari and other apps when handling large amounts of data. Sadly, iPad 1 wasn't even at parity with the iPhone 4.
Second, my comments reflect the confusion over how much RAM will actually be in iPad 2. If it ships with only 256MB then the upgrade is basically worthless. I've clearly stated multiple times now that going to 512MB would go a long way toward correcting the iPad's problems. Certainly 1GB would be better, but frankly I'd be surprised if we get that. In any event we have yet to see any solid evidence about the iPad's actual RAM implementation.
Third, there has been a lot of BS posted here about how great the iPad's memory system is; it is all garbage. The lack of demand paging does not automatically make it better or more efficient than Android. Android's problems revolve around Java more than anything. Hiro correctly pointed this out, but it really doesn't matter because, as stated above, iPad 1 needed more RAM from day one and frankly would have needed it even if Android didn't exist.
Quote:
Originally Posted by ThePixelDoc
Thank you!!!! Please repeat this ad nauseam in every thread that e-skater and wizzy try to make their sad case for iOS needing the same RAM as Droid.
PS: AI - make this a Sticky!
Again, DO NOT PUT WORDS IN MY MOUTH. The iPad's problems have nothing to do with Android. Further, I have to wonder why you aren't concerned about this issue. In iOS, RAM directly impacts what types of apps can be run successfully on an iPad, more so than on many other OSes.
Both are true. The original iPad suffers dramatically from too little RAM, at least in some apps like Safari. However, comparisons to Android must consider differences in overhead between the platforms.
Additional RAM is the reason I'll be in line to replace my iPad on Friday.
Quote:
Well, I'd say that the slide that said 1080p in Very Large Letters during Steve's presentation implies that 1080p playback is a feature of the iPad 2.
And saying that this is a pointless feature is bizarre - look around you. Every damn video camera no matter how cheap, most new compact cameras, and DSLRs all record 1080p. All TV sets advertised in the advertising material sneaked into my morning paper (I hate that) are 1080p, and it isn't even called out except for the very cheapest sets. In short, the consumer electronics world around you is 1080p, including the only HD distribution format, so 1080p output would be a reasonable assumption even without that 1080p slide.
You mean the slide where SJ is talking about mirrored output from an iPad through the HDMI adapter cable, but you just forgot to mention that? (at around 24 minutes) Clearly he was talking about the cable, but you chose to vaguely misrepresent that to try to support your point? Funny how the iPad detractors around here feel the need to engage in dishonest misrepresentations.
The reason it's a pointless feature is that cabling your iPad to a TV to watch movies is a stupid idea. It's fine for giving a presentation, but it's just dumb to do this at home when there are so many better options.
Quote:
First, Android has absolutely nothing to do with this discussion; the iPad's need for more memory than what is in iPad 1 stands on its own. The machine never shipped with enough memory, and this causes significant issues with Safari and other apps when handling large amounts of data. Sadly, iPad 1 wasn't even at parity with the iPhone 4.
Second, my comments reflect the confusion over how much RAM will actually be in iPad 2. If it ships with only 256MB then the upgrade is basically worthless. I've clearly stated multiple times now that going to 512MB would go a long way toward correcting the iPad's problems. Certainly 1GB would be better, but frankly I'd be surprised if we get that. In any event we have yet to see any solid evidence about the iPad's actual RAM implementation.
Third, there has been a lot of BS posted here about how great the iPad's memory system is; it is all garbage. The lack of demand paging does not automatically make it better or more efficient than Android. Android's problems revolve around Java more than anything. Hiro correctly pointed this out, but it really doesn't matter because, as stated above, iPad 1 needed more RAM from day one and frankly would have needed it even if Android didn't exist.
Again, DO NOT PUT WORDS IN MY MOUTH. The iPad's problems have nothing to do with Android. Further, I have to wonder why you aren't concerned about this issue. In iOS, RAM directly impacts what types of apps can be run successfully on an iPad, more so than on many other OSes.
I agree any platform can always use more memory. I would love to see a couple GB in something like an iPad to do some really kick-ass stuff. But I don't see a major problem because it isn't there yet.
The biggest reason the iPad could use more memory is that it will make more apps available from a wider cross-section of developers. This is a little bit of a cut against the wider dev community, because it implies that being efficient and precise with memory management is something many devs are not good at. Unfortunately that is the case, sometimes because of ability, but probably more because of ship-date and budget pressures. Getting something working and out the door is often more important than optimizing memory usage.
That is where the desktop is today: memory usage is pretty much a tertiary thought, and the mindset is bleeding over into the mobile space, where the battery tech hasn't caught up to the task of powering gobs of GB yet. I personally think Apple could do some great things with more use of file mapping and a backing store to the flash, and I don't know what the continuing motivation is for avoiding it. I have doubts that SSD lifecycle is a valid concern; backing stores are by design not written to very often compared to the operations of the rest of the machine. Maybe they don't want to sacrifice up to a couple GB of SSD space for it.
Well, it's BS that we need to get this info from a fourth party. Further, it is BS in the sense that Apple publishes many specs for the iPad but prefers to screw over the consumer with respect to this one important parameter.
RAM is very important; if Apple had stayed with 256MB the upgrade would be worthless. From the day it debuted, iPad 1 has suffered from a lack of RAM.
OK. Educate me. Why is this as big a deal as you are claiming?
Quote:
I personally think Apple could do some great things with more use of file mapping and a backing store to the flash, and I don't know what the continuing motivation is for avoiding it. I have doubts that SSD lifecycle is a valid concern; backing stores are by design not written to very often compared to the operations of the rest of the machine. Maybe they don't want to sacrifice up to a couple GB of SSD space for it.
Could you elaborate on this -- it's been muchos años (1971) since I learned about paging, virtual memory, etc.
Quote:
OK. Educate me. Why is this as big a deal as you are claiming?
Look at it this way: iPad 1 had less RAM available to the user than the iPhone 3GS. Due to the way iOS uses RAM for user data, this directly impacts what capability can be written into an app.
I'm not at all sure why this is so difficult for people to understand. Have you never purchased a shrink-wrapped piece of software and looked at the list of requirements? Many apps have minimum RAM requirements. Even more have recommended RAM sizes. This data is there because RAM impacts app usability and capability. The problem is even more acute on iOS devices because there is no paging or swap. Contrary to opinions expressed here, forcing the programmer to manage all data in RAM himself does not make a program faster or more reliable. Rather, the impact is often the opposite, or it simply leads to programs not being written.
In any event this has been gone over again and again, so please tell us what you don't understand. Further, why are people so damn willing to let Apple get away with this nonsense? Especially considering that Apple publishes far more useless specs but fails to be honest about this one parameter.
Quote:
I agree any platform can always use more memory. I would love to see a couple GB in something like an iPad to do some really kick-ass stuff. But I don't see a major problem because it isn't there yet.
Well, it depends upon context. For example, if you intend to use an iPad 1 with Safari over 3G, the lack of RAM is very significant. Not only do page reloads, at the drop of a hat, slow things down, you also end up using a good portion of your bandwidth. I wouldn't be surprised if some users report lower data-usage numbers when switching to iPad 2. Assuming, that is, that iPad 2 actually has more memory in the first place.
Again though we need to focus on context here and how the user surfs with Safari. One person might not see much difference in data usage while another might see data usage cut to a small fraction of his current iPad 1 usage.
Note that I'm looking at this from the point of view of a user, not a programmer or developer. What I see as sad here is that the iPad has been marketed as a way to access the net, yet of the supplied tools on the device, Safari is the poorest performing of the lot.
Quote:
The biggest reason the iPad could use more memory is that it will make more apps available from a wider cross-section of developers. This is a little bit of a cut against the wider dev community, because it implies that being efficient and precise with memory management is something many devs are not good at.
Not at all! The problem is pretty straightforward: some apps simply require more memory than others. One developer may be able to shave the app footprint relative to another, but you still will end up with apps that don't fit comfortably into 120MB of RAM.
Safari is my favorite example here, because while it does work well for many, just as many find the constant page reloads to be a real pain in the neck. Other apps such as image editors and CAD-like programs haven't really gotten traction on the iPad. It isn't a question of programmer skill if the iPad doesn't have room for larger image buffers, for example.
Quote:
Unfortunately that is the case, sometimes because of ability, but probably more because of ship-date and budget pressures. Getting something working and out the door is often more important than optimizing memory usage.
Many of the apps written by iOS developers have been written by one- and two-man shops, which implies a different set of pressures than those faced by people working in a corporate setting.
In any event, let's revisit image editing here. It is my understanding that there are such programs available for the iPad, though I haven't used any. It would be very interesting to talk to those developers to get an in-depth report on what it was like to develop for the iPad. My impression is that much code would have to be written simply to shoehorn the app into the limited memory. Code that wouldn't have to be written if the programmer was free to create a couple of 16MPixel image buffers in memory. The question then becomes this: is it better for the programmer to spend his time on app code, or on memory management code and the tricky algorithms that result from a lack of RAM?
You are right in the sense that getting something to work is important and often leads to ignoring performance and memory management. But here is the rub: once you have something working, what is likely to get tackled next, memory usage or performance? I'd say 9 times out of 10 it will be performance, and optimizing performance often leads to using even more memory.
Quote:
That is where the desktop is today: memory usage is pretty much a tertiary thought, and the mindset is bleeding over into the mobile space, where the battery tech hasn't caught up to the task of powering gobs of GB yet.
Yes, this is true and is why Apple has to carefully optimize the RAM available to the system; the last thing I want to see is a regression in iPad battery lifetimes. One thing to keep in mind here, though, is that Apple isn't exactly focused on tight memory usage either. iOS actually has a very rich set of APIs that continues to grow. The growing SDK is good for users in that such growth exposes new features and capabilities, but it also adds more memory usage.
Quote:
I personally think Apple could do some great things with more use of file mapping and a backing store to the flash, and I don't know what the continuing motivation is for avoiding it. I have doubts that SSD lifecycle is a valid concern; backing stores are by design not written to very often compared to the operations of the rest of the machine.
I'm not sure what Apple is up to here, but I do believe that hardware limitations play a role. Also, depending upon the load on a system, the backing store can be the most heavily accessed storage on a machine. This is why swap is often on its own partition or disk on Unix systems. So if your hardware doesn't support wear leveling, I could see it as a big concern. Battery life is also a problem, as going off-chip uses a lot of energy.
Part of the problem here is that Apple seems to have built the original versions of iOS for a slightly different portable-device market than the iPad. That is, one could reasonably expect iOS to have more capability on the iPad than it does on the iPhone or Touch, as the needs are different. Apple, though, has chosen to keep the OS the same, memory-management-wise, across all platforms.
Now the question is will that change in the future.
Quote:
Maybe they don't want to sacrifice up to a couple GB of SSD space for it.
I'm not sure the couple of GB of flash space is even an issue. I suspect that this is another one of those issues where we will never completely know Apple's mindset. What is interesting here is that with Apple's single-user app model, the swap space required is effectively limited in size. I still think the major issues revolve around wear and power, but I would love to know exactly why Apple has done things the way they have.
In any event, iPad 2 reports, benchmarks and experiences are starting to filter onto the web. It will be a very interesting week.
Quote:
Well, it depends upon context. For example, if you intend to use an iPad 1 with Safari over 3G, the lack of RAM is very significant. Not only do page reloads, at the drop of a hat, slow things down, you also end up using a good portion of your bandwidth.
The forced (unnecessary) page reloads should get better with more memory. They could get A LOT better with a proper backing store. As for 3G vs WiFi load speeds, that's not a memory issue.
Quote:
I wouldn't be surprised if some users report lower data-usage numbers when switching to iPad 2. Assuming, that is, that iPad 2 actually has more memory in the first place.
yup
Quote:
Again though we need to focus on context here and how the user surfs with Safari. One person might not see much difference in data usage while another might see data usage cut to a small fraction of his current iPad 1 usage.
Note that I'm looking at this from the point of view of a user, not a programmer or developer. What I see as sad here is that the iPad has been marketed as a way to access the net, yet of the supplied tools on the device, Safari is the poorest performing of the lot.
I don't find it as horrible as you do. I see the same behaviors, but they just don't seem as heinous to me as you make them out to be. Just chalk that up to our mileages/tolerances varying.
Quote:
The problem is pretty straightforward: some apps simply require more memory than others.
Until you do something that needs that memory dynamically, this isn't very hard to get around for a good coder not on an insane schedule. The stuff that will need gobs of memory dynamically isn't well suited for a platform not dedicated to wholesale computation. Not all tasks will run well on an iPad, even if it is possible to get them to run. This should really be a small minority of apps overall.
Quote:
One developer may be able to shave the app footprint relative to another, but you still will end up with apps that don't fit comfortably into 120MB of RAM.
The VAST majority of apps don't need even that much at once. And harping on an original iPad/iPhone 3 limitation is being a bit dated. The real breaker is that if Carmack can make RAGE work in that 120MB world using JIT streaming and procedural texture generation, it shows that the old paradigm of loading everything up front is just so much laziness. We can talk all night about systems that do amazing things with TINY amounts of RAM because they recompute on the fly rather than storing it all in memory someplace.
More memory is always better for devs, but most of the time that can do amazing things even without it.
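For anyone who wants to see the recompute-instead-of-store idea in miniature, here's a toy sketch (a made-up checkerboard-style pattern, nothing to do with RAGE's actual megatexture pipeline):

```python
WIDTH = HEIGHT = 1024

def texel(x, y):
    """Compute a texel value on demand instead of storing it."""
    return (x ^ y) & 0xFF

# Stored approach: the whole texture lives in RAM up front.
stored = [[texel(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]

# Procedural approach: only the texels actually sampled are computed,
# and the results agree with the stored copy.
assert stored[3][5] == texel(5, 3)

# Memory comparison, assuming 1 byte per texel for the stored version:
stored_bytes = WIDTH * HEIGHT   # ~1 MB resident, forever
print(stored_bytes)             # 1048576
```

The procedural version keeps essentially nothing resident; it spends a few ALU operations per sample instead, which is exactly the cycles-for-RAM trade discussed in this thread.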
Quote:
Safari is my favorite example here, because while it does work well for many, just as many find the constant page reloads to be a real pain in the neck. Other apps such as image editors and CAD-like programs haven't really gotten traction on the iPad. It isn't a question of programmer skill if the iPad doesn't have room for larger image buffers, for example.
CAD is an input issue. Now that people are starting to find some 3rd party styluses usable for diagramming and annotating by hand I think the CAD stuff may be able to come along too. But fingers were/are just too sloppy and unpredictable even with a very good digitizer.
Quote:
Many of the apps written by iOS developers have been written by one- and two-man shops, which implies a different set of pressures than those faced by people working in a corporate setting.
OK, but...
Quote:
In any event, let's revisit image editing here. It is my understanding that there are such programs available for the iPad, though I haven't used any. It would be very interesting to talk to those developers to get an in-depth report on what it was like to develop for the iPad. My impression is that much code would have to be written simply to shoehorn the app into the limited memory. Code that wouldn't have to be written if the programmer was free to create a couple of 16MPixel image buffers in memory. The question then becomes this: is it better for the programmer to spend his time on app code, or on memory management code and the tricky algorithms that result from a lack of RAM?
The apps don't take much RAM at all. What these guys need is a good API for file mapping into the VM system, so they can handle scrolling across the content and cache changes for later writing. The low latencies of flash compared to a spinning disk make that a reasonable approach, without the pain that Photoshop users felt for a decade-plus.
Quote:
You are right in the sense that getting something to work is important and often leads to ignoring performance and memory management. But here is the rub: once you have something working, what is likely to get tackled next, memory usage or performance? I'd say 9 times out of 10 it will be performance, and optimizing performance often leads to using even more memory.
Yup, that's the general way: trade tons of cycles for RAM usage. You would be surprised, though, how often that gets screwed up; the memory latency of hitting RAM is FAR higher than the extra half-dozen cycles it takes to add or multiply a couple of numbers. I don't know why some profilers lead people down those paths, whether it's misinterpretation or not capturing the block.
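A tiny sketch of that trade (the cycle and latency figures in the comments are ballpark folklore, not measurements):

```python
# Two ways to get x*x for small x: a precomputed table vs just multiplying.
SQUARES = [x * x for x in range(1024)]   # trades RAM for "saved" cycles

def square_lookup(x):
    # A load that can miss cache: a trip to DRAM costs on the order of
    # 100 ns, dwarfing the multiply it was supposed to save.
    return SQUARES[x]

def square_compute(x):
    # A handful of ALU cycles, no memory traffic at all.
    return x * x

assert square_lookup(37) == square_compute(37) == 1369
```

Same answer both ways; whether the table actually wins depends entirely on whether it stays hot in cache, which is the part profilers tend to hide.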
Quote:
I'm not sure what Apple is up to here, but I do believe that hardware limitations play a role. Also, depending upon the load on a system, the backing store can be the most heavily accessed storage on a machine. This is why swap is often on its own partition or disk on Unix systems. So if your hardware doesn't support wear leveling, I could see it as a big concern. Battery life is also a problem, as going off-chip uses a lot of energy.
Actually it's contention on the backing-store disk from normal operations, not regular access to the backing store, that causes problems (unless the system is overloaded and for all intents and purposes thrashing -- nothing other than cutting the load can fix that). The theories you have on why backing stores and file mapping aren't implemented in iOS are reasonable enough, although I think wear leveling is a bit overblown by folks using consumer machines. Killing an SSD by cell failure would take some very dedicated, hyperactive use; sure, it could be done, but...
Quote:
Could you elaborate on this -- it's been muchos años (1971) since I learned about paging, virtual memory, etc.
TIA,
Dick
Some is buried in the post above, most of the rest would be speculation.
But I can do that with the best of them so...
If iOS implemented the backing store back into the virtual memory system, some chunks of not-very-recently-used memory could be written to the backing-store swapfile, rather than having to kill a whole process to steal its memory. SSD latency is wonderful (~50 microseconds): a good 500 times less than the first-byte read latency of a rotating hard drive, and a good 200 times below the delay threshold that humans would start to notice (~10ms). That leaves tons of time to get things back into main memory if they are really needed and then work on them.
File mapping is somewhat similar: you just map the file's bytes into the address space, but leave them in secondary storage instead of bringing them all into RAM immediately. The same access times apply, and if you aren't trying to do heavy serial computation on that file-mapped range of addresses, you don't kill the processing speed. For scrolling over a large image it should do all right from an SSD, not so much from a rotating disk (just ask the PS users who didn't have HUGE amounts of RAM and had to use the Adobe-supplied swap mechanisms). The difficulty is that you can't safely write back into the file-mapped file, so you add complexity caching any changes and managing them as the user scrolls around.
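For the curious, here's a minimal sketch of the file-mapping idea using Python's `mmap` module on a scratch file; the copy-on-write mapping mirrors the "cache your changes yourself" wrinkle described above:

```python
import mmap
import tempfile

# Create a scratch "image" file to stand in for a large asset on flash.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(bytes(range(256)) * 4096)   # 1 MB of data
    path = f.name

with open(path, "r+b") as f:
    # Map the file into the address space; pages are faulted in on demand
    # rather than read into RAM up front.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_COPY)
    # Random access anywhere in the "image" without loading the whole file.
    assert mm[0] == 0 and mm[300] == 44
    # ACCESS_COPY: writes land in private pages and never reach the file,
    # so the app must track and flush real edits itself.
    mm[0] = 99
    mm.close()

with open(path, "rb") as f:
    assert f.read(1) == b"\x00"   # underlying file is unchanged
```

On desktop OSes a shared mapping could write through instead; the copy-on-write variant is shown here because it matches the edit-caching complexity the post describes.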
Why Apple isn't doing this is the hot question. I'm sure they have what they feel is a decent answer, but if we had it, it would illuminate the timetable for when this becomes the art of the possible again.
The problem is not lack of RAM - but rather video memory bandwidth.
C.
How do you know? Have you developed games for the PS3? Additional video memory bandwidth only helps if the GPU is powerful enough to take advantage of it, and the amount of bandwidth available to the GPU in the PS3 is fine. A Radeon X1800 with 200 GB/s of memory bandwidth would not be a whole lot more capable than one with the default. Nvidia and AMD (then ATI) chose the memory bandwidth carefully for that reason, though third parties can add to the memory amount if they want. So no, it's doubtful the PS3's GPU is starved for bandwidth. More video memory, however, would have allowed more textures, likely more true 1080p games (most are upscaled right now), etc.
Quote:
How do you know? Have you developed games for the PS3? Additional video memory bandwidth only helps if the GPU is powerful enough to take advantage of it, and the amount of bandwidth available to the GPU in the PS3 is fine. A Radeon X1800 with 200 GB/s of memory bandwidth would not be a whole lot more capable than one with the default. Nvidia and AMD (then ATI) chose the memory bandwidth carefully for that reason, though third parties can add to the memory amount if they want. So no, it's doubtful the PS3's GPU is starved for bandwidth. More video memory, however, would have allowed more textures, likely more true 1080p games (most are upscaled right now), etc.
I have my name on a few PS3 games - but I am not a PS3 programmer. I do come into contact with a lot of engineers who are very clear on this.
If you allocate a full 1080p frame buffer on the PS3, the amount of available bandwidth to the video memory is pretty much saturated (meaning the game has to rely on lower-resolution textures).
The problem exists on the 360 but is less severe. Therefore most engineers opt to render to a 720p frame buffer, which is upscaled onto the display on both platforms.
Even then, I have heard stories where the memory bandwidth issue on the PS3 remains a problem at this resolution. This sometimes means that developers have to dial down the texture resolution on the PS3 SKU, or adopt a cheaper anti-aliasing solution.
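Back-of-the-envelope arithmetic on just the frame-buffer writes makes the 1080p-vs-720p pressure concrete (assuming 4 bytes per pixel and 60 fps; real rendering touches each pixel many times over with overdraw, blending, and AA, which multiplies these numbers several-fold):

```python
BYTES_PER_PIXEL = 4
FPS = 60

def framebuffer_gbps(width, height):
    """Bandwidth to write each pixel once per frame, in GB/s."""
    return width * height * BYTES_PER_PIXEL * FPS / 1e9

print(round(framebuffer_gbps(1920, 1080), 2))  # 0.5
print(round(framebuffer_gbps(1280, 720), 2))   # 0.22
```

A single write pass looks small next to ~20 GB/s of memory bandwidth, but multiply it by overdraw, depth and color reads, multiple render targets, and texture fetches sharing the same bus, and the 2.25x pixel count of 1080p over 720p is exactly the kind of budget item that gets cut.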
I have heard stories that Sony doesn't like to see cross-platform games looking better on the 360, so they often "encourage" developers to make equivalent visual-quality and resolution reductions in the 360 games.
So despite these technical differences, the resulting games are, as often as not, very similar.
Quote:
The problem exists on the 360 but is less severe. Therefore most engineers opt to render to a 720p frame buffer, which is upscaled onto the display on both platforms.
All of which proves my point, not yours. The 360 and PS3 both have a bit over 20GB/s of bandwidth from memory going to the GPU, the difference being that the 360's is shared, so it can use up to 512MB if the developers prefer (not realistic, but they can), while the PS3's is dedicated, so it has 256MB fixed. If it's less of a problem on the 360, that just proves it's capacity, not bandwidth, that's becoming the issue.
360
PS3
Before anyone asks: that EDRAM is a small 10MB chip for performance-loss-free AA.
Quote:
All of which proves my point, not yours. The 360 and PS3 both have a bit over 20GB/s of bandwidth from memory going to the GPU, the difference being that the 360's is shared, so it can use up to 512MB if the developers prefer (not realistic, but they can), while the PS3's is dedicated, so it has 256MB fixed. If it's less of a problem on the 360, that just proves it's capacity, not bandwidth, that's becoming the issue.
Most engineers seem to agree with the sentiments in the article I posted...
Quote:
From what we've uncovered today, it's pretty clear that the 360 demonstrates superior technical prowess when it comes to handling the wide and open-world nature of Red Dead Redemption. Unlike in GTAIV, a far-reaching field of view is absolutely required in order for a game like RDR to accurately represent the time and period in which it's set. And it's also exactly the type of engine that is ill suited to the PS3's lack of bandwidth and vertex shader power.
But to be fair, for most consumers, these differences are barely noticeable.
Comments
OK. [i LOVE doing this!!!]
I know you don't want to do that math, but everyone else can see the pattern pretty easily.
No the BS is claiming Apple may charge for iOS 5. But now I will call BS on your personal schedule for Apple despite it being quite well understood that Apple has a one year cycle on iOS at the Apple Dev Conference and iPhone in June.
The math is that in real-life demos, nobody is complaining about the performance of the Playbook. The only complaint is that the tbs.com Flash website was stuttery --- when there were 3-4 other apps (big apps, not widgets) actually running in the background and the Playbook was only operating on a single core. If that's because RIM threw hardware into the Playbook because of the Flash UI's big memory problem --- then Apple is going to be truly screwed, because RIM just bought TAT to make all-native UI widgets in C++ for the next generation of BlackBerrys.
It's not without precedent: Apple has treated iOS updates differently for different devices.
Is 512MB enough? There is never enough memory if you are a developer, by definition. I wouldn't compare the 512MB to 1GB on a Xoom very directly though. That Xoom is running a robust JVM behind everything, and that means it is running a copy of that JVM in every process. That is a verified Metric Shitload of memory requirement that the app developers can never use.
Thank you!!!! Please repeat this ad nauseam in every thread that e-skater and wizzy try to make their sad case for iOS needing the same RAM as Droid.
PS: AI - make this a Sticky!
OK. [i LOVE doing this!!!] ...iOS uses a central GUI service in the OS. That means the OS memory footprint itself shoulders the load for all the common API-provided GUI graphics. This is universally known as an efficient way to do software: write once, use many. And quite a bit of the volume isn't code bloat, it is the actual graphics elements that need to be combined. Elements that only ever need to be in memory once, anywhere.
AIR, on the other hand, pastes its whole self into every application process that makes a request of the Flash GUI services. That's known as: write once, duplicate endlessly. Every instantiation of the GUI adds its full weight again to the overall memory requirements of the machine. Every line of AIR GUI machine code, duplicated in every process. Every API-based GUI graphic, duplicated in every process....
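A back-of-envelope sketch of the two models; every size below is invented purely for illustration, not measured from iOS or AIR:

```python
# Hypothetical model: a shared GUI service keeps ONE copy of the toolkit
# (code + stock artwork) in the OS, while an AIR-style runtime embeds a
# full copy in every app process. Sizes are made-up placeholders.

TOOLKIT_MB = 40        # assumed toolkit code + graphic elements
APP_PRIVATE_MB = 15    # assumed per-app private state (same either way)

def shared_service_total(n_apps):
    # Write once, use many: one toolkit copy, plus each app's own state.
    return TOOLKIT_MB + n_apps * APP_PRIVATE_MB

def embedded_runtime_total(n_apps):
    # Write once, duplicate endlessly: every process carries the toolkit.
    return n_apps * (TOOLKIT_MB + APP_PRIVATE_MB)

for n in (1, 4, 8):
    print(n, shared_service_total(n), embedded_runtime_total(n))
```

With one app the two come out identical; at eight apps the duplicated runtime costs well over twice as much, and the gap only widens from there.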
I'm no techy, but this seems to back up my hunch why AIR/Flash is such a memory hog.
I'm not debating your knowledge here Hiro, but a call goes out to the other tech guys here to verify this rather elegant and simple explanation.
BTW: in regards to my post above, regardless of whether iOS "needs" 1GB or has 1GB, I firmly believe that iOS will be able to do more with the memory than a similarly spec'ed Android device... also because of Hiro's explanation of the JVM on which Android is based.
All seems logical to me.
Utter nonsense. Unless you have some evidence that 1080p video playback, which is a pointless feature, was actually planned...
Well I'd say that the slide that said 1080p in Very Large Letters during Steve's presentation implies that 1080p playback is a feature of the iPad 2.
And saying that this is a pointless feature is bizarre - look around you. Every damn videocamera no matter how cheap, most new compact cameras and DSLRs, all record 1080p. All TV-sets advertised in the advertising material sneaked into my morning paper (I hate that), are 1080p, and it isn't even referred to except for the very cheapest sets. In short, the consumer electronics world around you is 1080p, including the only HD distribution format, so 1080p output would be a reasonable assumption even without that 1080p slide.
Folks really need to read about the purpose of RAM before bashing a device because it has x amount instead of y. My 3GS runs just fine with its 600MHz CPU and 256MB, including Infinity Blade, with little if any lag. My quad-core i7 with 12GB running Win7 has its moments of lag as much as my MacBook with its Core 2 Duo and 4GB.
True. SSDs make a big difference to the perceived responsiveness of even a very fast computer. They're not yet at a price/gigabyte that's acceptable to most people, but that will change.
It is? How so? Is there a special USE_MEMORY_EFFICIENTLY flag in Darwin designed especially for the iPad?
Don't know if this has been answered yet, but here's the answer.
- The iPad kernel & OS is compiled to run on ... the iPad. And that's it. All of the memory-gobbling modular abstraction features simply don't exist. Anybody who has ever done any degree of device-specific programming can tell you that there's a very large efficiency improvement that comes from going device-specific -- at least 2x. Android has to include a lot of these features that iOS does not -- let's get serious, Android builds heavily on Java, the biggest hog of them all. As a result, iOS will always run faster and cleaner than Android will.
Other things:
- The ARMv7 is 32-bit, rather than 64-bit like modern x86s, so pointers are half the size.
- The ARMv7 has a memory manager that is very agile, so it can pack memory structures more efficiently than most other chips.
- The graphics chipset has its own RAM, IIRC, so game performance is tied to that.
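The pointer-size point in rough numbers; the structure and node count here are hypothetical, just to show the halving:

```python
# A pointer-heavy structure (say, a tree node with two links) costs half
# as much link overhead on a 32-bit ARMv7 as on a 64-bit machine.
PTR32, PTR64 = 4, 8          # bytes per pointer

nodes = 1_000_000            # hypothetical node count
links_per_node = 2

overhead32 = nodes * links_per_node * PTR32   # 8,000,000 bytes
overhead64 = nodes * links_per_node * PTR64   # 16,000,000 bytes
print(overhead32, overhead64)
```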
I'm not sure if there is a cost-effective 1GB RAM chip that can fit on top of the A5 MCM stack. Moreover, DRAM drains quite a lot of power in relation to the rest of the components in that A5 MCM stack. There's no reason to have more, unless you need more. It just adds cost and drains power.
Second, my comments reflect the confusion over how much RAM will actually be in iPad 2. If it ships with only 256MB then the upgrade is basically worthless. I've clearly stated multiple times now that going to 512MB would go a long way to correcting iPad's problems. Certainly 1GB would be better but frankly I'd be surprised if we get that. In any event we have yet to see any solid evidence about iPad's actual RAM implementation.
Third, there has been a lot of BS posted here about how great iPad's memory system is; it is all garbage. The lack of demand paging does not automatically make it better or more efficient than Android. Android's problems revolve around Java more than anything. Hiro correctly pointed this out but it really doesn't matter because, as stated above, iPad 1 needed more RAM from day one and frankly would have needed it even if Android didn't exist.
Thank you!!!! Please repeat this ad nauseam in every thread that e-skater and wizzy try to make their sad case for iOS needing the same RAM as Droid.
PS: AI - make this a Sticky!
Again DO NOT PUT WORDS IN MY MOUTH. iPad's problems have nothing to do with Android. Further, I have to wonder why you aren't concerned about this issue? In iOS, RAM directly impacts what types of apps can be run successfully on an iPad, more so than on many other OSes.
Both are true. The original iPad suffers dramatically from too little RAM, at least in some apps like Safari. However, comparisons to Android must consider differences in overhead between the platforms.
Additional ram is the reason I'll be in line to replace my iPad on Friday.
Well I'd say that the slide that said 1080p in Very Large Letters during Steve's presentation implies that 1080p playback is a feature of the iPad 2.
And saying that this is a pointless feature is bizarre - look around you. Every damn videocamera no matter how cheap, most new compact cameras and DSLRs, all record 1080p. All TV-sets advertised in the advertising material sneaked into my morning paper (I hate that), are 1080p, and it isn't even referred to except for the very cheapest sets. In short, the consumer electronics world around you is 1080p, including the only HD distribution format, so 1080p output would be a reasonable assumption even without that 1080p slide.
You mean the slide where SJ is talking about mirrored output from an iPad through the HDMI adapter cable, but you just forgot to mention that? (at around 24 minutes) Clearly he was talking about the cable, but you chose to vaguely misrepresent that to try to support your point? Funny how the iPad detractors around here feel the need to engage in dishonest misrepresentations.
The reason it's a pointless feature is that cabling your iPad to a TV to watch movies is a stupid idea. It's fine for giving a presentation, but it's just dumb to do this at home when there are so many better options.
First, Android has absolutely nothing to do with this discussion; iPad's need for more memory than what is in iPad 1 stands on its own. The machine never shipped with enough memory, and this causes significant issues with Safari and other apps when handling large amounts of data. Sadly, iPad 1 wasn't even at parity with iPhone 4.
Second, my comments reflect the confusion over how much RAM will actually be in iPad 2. If it ships with only 256MB then the upgrade is basically worthless. I've clearly stated multiple times now that going to 512MB would go a long way to correcting iPad's problems. Certainly 1GB would be better but frankly I'd be surprised if we get that. In any event we have yet to see any solid evidence about iPad's actual RAM implementation.
Third, there has been a lot of BS posted here about how great iPad's memory system is; it is all garbage. The lack of demand paging does not automatically make it better or more efficient than Android. Android's problems revolve around Java more than anything. Hiro correctly pointed this out but it really doesn't matter because, as stated above, iPad 1 needed more RAM from day one and frankly would have needed it even if Android didn't exist.
Again DO NOT PUT WORDS IN MY MOUTH. iPad's problems have nothing to do with Android. Further, I have to wonder why you aren't concerned about this issue? In iOS, RAM directly impacts what types of apps can be run successfully on an iPad, more so than on many other OSes.
I agree any platform can always use more memory. I would love to see a couple GB in something like an iPad to do some really kick-ass stuff. But I don't see a major problem because it isn't there yet.
The biggest reason iPad could use more memory is that it will make more apps available from a wider cross section of developers. This is a little bit of a cut against the wider dev community because it implies that being efficient and precise with memory management is something many devs are not good at. Unfortunately that is the case, sometimes because of ability, but probably more because of ship-date and budget pressures. Getting something working and out the door is often more important than optimizing memory usage.
That is where the desktop is today: memory usage is pretty much a tertiary thought, and the mindset is bleeding over into the mobile space, where the battery tech hasn't caught up to the task of powering gobs of GB yet. I personally think Apple could do some great things with more use of file mapping and a backing store to the flash, and I don't know what the continuing motivation is for avoiding it. I have doubts that SSD lifecycle is a valid concern; backing stores are by design not written to very often compared to the operations of the rest of the machine. Maybe they don't want to give up a couple GB of SSD space for it.
Well it's BS that we need to get this info from a fourth party. Further, it is BS in the sense that Apple does publish many specs for the iPads but prefers to screw over the consumer with respect to this one important parameter.
RAM is very important; if Apple had stayed with 256MB the upgrade would be worthless. From the day it debuted, iPad 1 has suffered from the lack of RAM.
Ok. Educate me. Why is this as big a deal as you are claiming?
I personally think Apple could do some great things with more use of file mapping and a backing store to the flash, and I don't know what the continuing motivation is for avoiding it. I have doubts that SSD lifecycle is a valid concern; backing stores are by design not written to very often compared to the operations of the rest of the machine. Maybe they don't want to give up a couple GB of SSD space for it.
Could you elaborate on this -- it's been muchos años (1971) since I learned about paging, virtual memory, etc..
TIA,
Dick
Ok. Educate me. Why is this as big a deal as you are claiming?
Look at it this way: iPad 1 had less RAM available to the user than the iPhone 3GS. Due to the way iOS uses RAM for user data, this directly impacts what capability can be written into an app.
I'm not at all sure why this is so difficult for people to understand. Have you never purchased a shrink-wrapped piece of software and looked at the list of requirements? Many apps have minimal RAM requirements. Even more have recommended RAM sizes. This data is there because RAM impacts app usability and capability. The problem is even more acute on iOS devices because there is no paging or swap. Contrary to opinions expressed here, forcing the programmer to manage all data in RAM himself does not make a program faster or more reliable. Rather, the impact is often the opposite, or it simply leads to programs not being written.
In any event this has been gone over again and again so please tell us what you don't understand. Further why are people so damn willing to let Apple get away with this nonsense? Especially considering that Apple publishes far more useless specs but fails to be honest about this one parameter.
I agree any platform can always use more memory. I would love to see couple GB in something like an iPad to do some really kick-ass stuff. But I don't see a major problem because it isn't there yet.
Well it depends upon context. For example, if you intend to use an iPad 1 with Safari over 3G, the lack of RAM is very significant. Not only do page reloads, at the drop of a hat, slow things down, you end up using a good portion of your bandwidth. I wouldn't be surprised if some users report lowered data-usage numbers when switching to iPad 2. That is, given that iPad 2 actually has more memory in the first place.
Again though we need to focus on context here and how the user surfs with Safari. One person might not see much difference in data usage while another might see data usage cut to a small fraction of his current iPad 1 usage.
Note that I'm looking at this from the point of view of a user, not a programmer or developer. What I see as sad here is that iPad has been marketed as a way to access the net, yet of the supplied tools on the device Safari is the poorest performing of the lot.
The biggest reason iPad could use more memory is that it will make more apps available from a wider cross section of developers. This is a little bit of a cut against the wider dev community because it implies that being efficient and precise with memory management is something many devs are not good at.
Not at all! The problem is pretty straightforward: some apps simply require more memory than others. One developer may be able to shave the app footprint relative to another, but you will still end up with apps that don't fit comfortably into 120MB of RAM.
Safari is my favorite example here because while it does work well for many, just as many find the constant page reloads to be a real pain in the neck. Other apps such as image editors and CAD-like programs haven't really gotten traction on the iPad. It isn't a question of programmer skill if the iPad doesn't have room for larger image buffers, for example.
Unfortunately that is the case, sometimes because of ability, but probably more because of ship date and budget pressures. Getting something working and out the door is often more important than optimizing memory usage.
Many of the apps written by iOS developers have been written by one- and two-man shops, which implies a different set of pressures than those working in a corporate setting face.
In any event, let's revisit image editing here. It is my understanding that there are such programs available for iPad, though I haven't used any. It would be very interesting to talk to those developers to get an in-depth report on what it was like to develop for the iPad. My impression is that much code would have to be written simply to shoehorn the app into the limited memory. Code that wouldn't have to be written if the programmer was free to create a couple of 16MPixel image buffers in memory. The question then becomes this: is it better for the programmer to spend his time on app code, or on memory-management code and the tricky algorithms that result from a lack of RAM?
You are right in the sense that getting something to work is important and often leads to ignoring performance and memory management. But here is the rub: once you have something working, what is likely to get tackled next, memory usage or performance? I'd say 9 times out of 10 it will be performance, and optimizing performance often leads to using even more memory.
That is where the desktop is today: memory usage is pretty much a tertiary thought, and the mindset is bleeding over into the mobile space, where the battery tech hasn't caught up to the task of powering gobs of GB yet.
Yes, this is true, and it is why Apple has to carefully optimize the RAM available to the system; the last thing I want to see is a regression in iPad battery lifetimes. One thing to keep in mind here, though, is that Apple isn't exactly focused on tight memory usage either. iOS actually has a very rich set of APIs that continues to grow. The growing SDK is good for users in that such growth exposes new features and capabilities, but it also adds more memory usage.
I personally think Apple could do some great things with more use of file mapping and a backing store to the Flash and I don't know what the continuing motivation is for avoiding it. I have doubts that SSD lifecycle is a valid concern, backing stores are by design not written to very often compared to the operations of the rest of the machine.
I'm not sure what Apple is up to here, but I do believe that hardware limitations play a role. Also, depending upon the load on a system, the backing store can be the most heavily accessed storage on a machine. This is why swap is often on its own partition or disk on Unix systems. So if your hardware doesn't support wear leveling, I could see it as a big concern. Battery life is also a problem, as going off-chip uses a lot of energy.
Part of the problem here is that Apple seems to have built the original versions of iOS for a slightly different portable-device market than the iPad's. That is, one could reasonably expect iOS to have more capability on the iPad than it does on iPhone or Touch, as the needs are different. Apple, though, has chosen to keep the OS the same, memory-management wise, across all platforms.
Now the question is will that change in the future.
Maybe they don't want to give up up to a couple GB SSD space for it.
I'm not sure the couple of GB of flash space is even an issue. I suspect that this is another one of those issues where we will never completely know Apple's mindset. What is interesting here is that with Apple's single-user app model the swap space required is effectively limited in size. I still think the major issues revolve around wear and power, but I would love to know exactly why Apple has done things the way they have.
In any event, iPad 2 reports, benchmarks and experiences are starting to filter into the web. It will be a very interesting week.
Well it depends upon context. For example, if you intend to use an iPad 1 with Safari over 3G, the lack of RAM is very significant. Not only do page reloads, at the drop of a hat, slow things down, you end up using a good portion of your bandwidth.
The forced (unnecessary) page reloads should get better with more memory. They could get A LOT better with a proper backing store. As for 3G vs WiFi load speeds, that's not a memory issue.
I wouldn't be surprised if some users report lowered data-usage numbers when switching to iPad 2. That is, given that iPad 2 actually has more memory in the first place.
yup
Again though we need to focus on context here and how the user surfs with Safari. One person might not see much difference in data usage while another might see data usage cut to a small fraction of his current iPad 1 usage.
Note that I'm looking at this from the point of view of a user, not a programmer or developer. What I see as sad here is that iPad has been marketed as a way to access the net, yet of the supplied tools on the device Safari is the poorest performing of the lot.
I don't find it as horrible as you do. I see the same behaviors, but they just don't seem as heinous to me as you make them out to be. Just chalk that up to our mileages/tolerances varying.
The problem is pretty straightforward: some apps simply require more memory than others.
Until you do something that needs that memory dynamically, this isn't very hard to get around for a good coder not on an insane schedule. The stuff that will need gobs of memory dynamically isn't well suited to a platform not dedicated to wholesale computation. Not all tasks will run well on an iPad even if it is possible to get them to run. This should really be a small minority of apps overall.
One developer may be able to shave the app footprint relative to another, but you will still end up with apps that don't fit comfortably into 120MB of RAM.
The VAST majority of apps don't need even that much at once. And harping on an original iPad/iPhone 3GS limitation is being a bit dated. The real clincher is that if Carmack can make RAGE work in that 120MB world using JIT streaming and procedural texture generation, it shows that the old paradigm of loading everything up front is just so much laziness. We can talk all night about systems that do amazing things with TINY amounts of RAM because they recompute on the fly rather than store it all in memory someplace.
More memory is always better for devs, but most of the time they can do amazing things even without it.
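The recompute-instead-of-store tradeoff in miniature; the sine table here is just a stand-in for any precomputed asset:

```python
import math

N = 100_000

# Store-everything: the whole table stays resident for the program's lifetime.
table = [math.sin(i * 0.001) for i in range(N)]

def lookup(i):
    return table[i]

# Recompute-on-demand: near-zero resident memory, a few extra cycles per call.
def procedural(i):
    return math.sin(i * 0.001)

print(lookup(1234) == procedural(1234))  # same answer, very different RAM
```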
Safari is my favorite example here because while it does work well for many, just as many find the constant page reloads to be a real pain in the neck. Other apps such as image editors and CAD-like programs haven't really gotten traction on the iPad. It isn't a question of programmer skill if the iPad doesn't have room for larger image buffers, for example.
CAD is an input issue. Now that people are starting to find some 3rd party styluses usable for diagramming and annotating by hand I think the CAD stuff may be able to come along too. But fingers were/are just too sloppy and unpredictable even with a very good digitizer.
Many of the apps written by iOS developers have been written by one- and two-man shops, which implies a different set of pressures than those working in a corporate setting face.
OK, but...
In any event, let's revisit image editing here. It is my understanding that there are such programs available for iPad, though I haven't used any. It would be very interesting to talk to those developers to get an in-depth report on what it was like to develop for the iPad. My impression is that much code would have to be written simply to shoehorn the app into the limited memory. Code that wouldn't have to be written if the programmer was free to create a couple of 16MPixel image buffers in memory. The question then becomes this: is it better for the programmer to spend his time on app code, or on memory-management code and the tricky algorithms that result from a lack of RAM?
The apps don't take much RAM at all. What these guys need is a good API for file mapping into the VM system so they can handle scrolling across the content and cache changes for later writing. The low latencies of flash compared to a spinning disk make that a reasonable approach without the pain that Photoshop users felt for a decade-plus.
You are right in the sense that getting something to work is important and often leads to ignoring performance and memory management. But here is the rub: once you have something working, what is likely to get tackled next, memory usage or performance? I'd say 9 times out of 10 it will be performance, and optimizing performance often leads to using even more memory.
Yup, that's the general way, trade tons of cycles for RAM usage. You would be surprised though how often that gets screwed up and the memory latency of hitting RAM is FAR higher than the extra half dozen cycles it takes to add or multiply a couple numbers. I don't know why some profilers lead people down those paths, whether it's misinterpretation or not capturing the block.
I'm not sure what Apple is up to here, but I do believe that hardware limitations play a role. Also, depending upon the load on a system, the backing store can be the most heavily accessed storage on a machine. This is why swap is often on its own partition or disk on Unix systems. So if your hardware doesn't support wear leveling, I could see it as a big concern. Battery life is also a problem, as going off-chip uses a lot of energy.
Actually it's contention on the backing-store disk from normal operations, not regular access to the backing store, that causes problems (unless the system is overloaded and for all intents and purposes thrashing -- nothing other than cutting the load can fix that). The theories you have on why backing stores and file mapping aren't implemented in iOS are reasonable enough, although I think wear leveling is a bit overblown by folks using consumer machines. Killing an SSD by cell failure would take some very dedicated, hyperactive use; sure, it could be done, but...
Could you elaborate on this -- it's been muchos años (1971) since I learned about paging, virtual memory, etc..
TIA,
Dick
Some is buried in the post above, most of the rest would be speculation.
But I can do that with the best of them so...
If iOS implemented the backing store back into the virtual memory system, some chunks of not-very-recently-used memory could be written to the backing store swapfile rather than having to kill a whole process to steal its memory. SSD latency is wonderful (~50 microsec), a good 500 times less than first-byte read latency from a rotating hard drive, and a good 200 times below the delay threshold that humans would start to notice (~10ms). That leaves tons of time to get things back into main memory if they are really needed and then work on them.
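Those ratios check out arithmetically; the rotating-disk figure below is the one implied by the quoted multipliers, not a measurement:

```python
SSD_S = 50e-6     # ~50 microseconds to first byte, as quoted
HDD_S = 25e-3     # implied rotating-disk first-byte latency (500 x SSD)
HUMAN_S = 10e-3   # rough threshold where a delay becomes noticeable

print(round(HDD_S / SSD_S))     # SSD vs spinning disk
print(round(HUMAN_S / SSD_S))   # headroom under the human threshold
```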
File mapping is somewhat similar: you just map the file's bytes into the address space, but leave it in secondary storage instead of bringing it all into RAM immediately. The same access times apply, and if you aren't trying to do heavy serial computations on that file-mapped range of addresses you don't kill the processing speed. For scrolling over a large image it should do all right from an SSD, not so much from a rotating disk (just ask the PS users who didn't have HUGE amounts of RAM and had to use the Adobe-supplied swap mechanisms). The difficulty is you can't safely write back into the file-mapped file, so you add complexity caching any changes and managing them as the user scrolls around.
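A minimal read-only sketch of the idea, using Python's `mmap` as a stand-in for the kind of iOS API being wished for here (the temp-file "image" is a fabricated example):

```python
import mmap, os, tempfile

# Build a throwaway "image" file to map (1 MB of fake pixel data).
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(bytes(range(256)) * 4096)

with open(path, "rb") as f:
    # The file's bytes appear in the address space; pages fault in on
    # demand instead of read() copying the whole file into RAM up front.
    view = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    sample = (view[0], view[300], len(view))   # touches only the pages needed
    print(sample)
    view.close()
os.remove(path)
```

Scrolling across a big image then becomes a matter of indexing into `view`; only the regions actually touched are resident.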
Why Apple isn't doing this is the hot question. I'm sure they have what they feel is a decent answer, but it would be illuminating as to the future timetable of when this becomes the art of the possible again if we had that answer.
The problem is not lack of RAM - but rather video memory bandwidth.
C.
How do you know? Have you developed games for the PS3? Additional video memory bandwidth only helps if the GPU is powerful enough to take advantage of it; the amount of bandwidth available to the GPU in the PS3 is fine. A Radeon X1800 with 200 GB/s of memory bandwidth would not be a whole lot more capable than one with the default. Nvidia and AMD (then ATI) chose the memory bandwidth carefully for that reason, but third parties can add to the memory amount if they want. So no, it's doubtful the PS3's GPU is starved for bandwidth. More video memory, however, would have allowed more textures, likely more true 1080p games (most are upscaled right now), etc.
How do you know? Have you developed games for the PS3? Additional video memory bandwidth only helps if the GPU is powerful enough to take advantage of it; the amount of bandwidth available to the GPU in the PS3 is fine. A Radeon X1800 with 200 GB/s of memory bandwidth would not be a whole lot more capable than one with the default. Nvidia and AMD (then ATI) chose the memory bandwidth carefully for that reason, but third parties can add to the memory amount if they want. So no, it's doubtful the PS3's GPU is starved for bandwidth. More video memory, however, would have allowed more textures, likely more true 1080p games (most are upscaled right now), etc.
I have my name on a few PS3 games - but I am not a PS3 programmer. I do come into contact with a lot of engineers who are very clear on this.
If you allocate a full 1080p frame buffer on the PS3, the amount of available bandwidth to the video memory is pretty much saturated (meaning the game has to rely on lower-resolution textures).
The problem exists on the 360 but is less severe. Therefore most engineers opt to render to a 720p frame buffer, which is upscaled onto the display on both platforms.
Even then I have heard stories where the memory bandwidth issue on the PS3 remains a problem at this resolution. This sometimes means that developers have to dial-down the texture resolution on the PS3 SKU. Or adopt a cheaper anti-aliasing solution.
I have heard stories that Sony doesn't like to see cross-platform games looking better on the 360, so they often "encourage" developers to make an equivalent visual-quality reduction on the 360 version.
So despite these technical differences, the resulting games are, as often as not, very similar.
This is worth a read.
http://imagequalitymatters.blogspot....ption-360.html
C.
The problem exists on the 360 but is less severe. Therefore most engineers opt to render to a 720p frame buffer, which is upscaled onto the display on both platforms.
All of which proves my point, not yours. The 360 and PS3 both have a bit over 20GB/s of bandwidth from memory to the GPU, the difference being that the 360's is shared so it can use up to 512MB if the developers prefer (not realistic, but they can), while the PS3's is dedicated so it has 256MB fixed. If it's less of a problem on the 360, that just proves it's capacity, not bandwidth, that's becoming the issue.
[360 memory-architecture diagram]
[PS3 memory-architecture diagram]
Before anyone asks, that EDRAM is a small 10MB chip for performance-loss-free AA.
All of which proves my point, not yours. The 360 and PS3 both have a bit over 20GB/s of bandwidth from memory to the GPU, the difference being that the 360's is shared so it can use up to 512MB if the developers prefer (not realistic, but they can), while the PS3's is dedicated so it has 256MB fixed. If it's less of a problem on the 360, that just proves it's capacity, not bandwidth, that's becoming the issue.
Most engineers seem to agree with the sentiments in the article I posted...
From what we've uncovered today, it's pretty clear that the 360 demonstrates superior technical prowess when it comes to handling the wide and open-world nature of Red Dead Redemption. Unlike in GTAIV, a far-reaching field of view is absolutely required in order for a game like RDR to accurately represent the time and period in which it's set. And it's also exactly the type of engine that is ill suited to the PS3's lack of bandwidth and vertex shader power.
But to be fair, for most consumers, these differences are barely noticeable.
C.