Intel's siege on ARM begins.

Posted:
in Future Apple Hardware edited January 2014
Intel Moorestown platform speedy and ready for Smartphones and Tablets



This is an exhaustive review so I'll cut right to the summary. Read as much as you want to gain further insight.



Quote:

Final Words

This isn't your netbook's Atom. Thanks to an incredible amount of integration, power management and efficiency, Moorestown has the potential to be the most exciting thing to hit the smartphone market since the iPhone. If Intel can deliver a platform that offers greater than 2x the performance of existing smartphones in the same power envelope it has a real chance of winning the market.



The problems are obvious. Intel is the underdog here: it has no foothold in the market, and the established OSes are currently very ARM-optimized. Not only does Intel have to get Moorestown off the ground, it also needs a win on the software side. MeeGo has to, er, go somewhere if this is going to work out. The one thing I will say is that the expected rarely pans out. The smartphone market in 5 years won't look like an extension of what we see today. Apple and Google dominating the market and running ARM processors is where we are today; I'm not convinced that's where we'll be tomorrow.



Intel's Sean Maloney, heir apparent to the CEO throne, said that to succeed Intel needs 3 out of the top 5 handset guys and a bunch of alternative players. He added, "we feel like we're in good shape for that." In the next 12 - 18 months we should see that come to fruition.



For me however it's more about software and design wins. Intel needs to be in an iPhone, or at least something equally emotionally captivating. It needs a halo product. I believe Intel has the right approach here with Moorestown. To be honest, I've seen the roadmap beyond it and it's very strong. The technology is there. We just need someone to put it to use and that's the part that isn't guaranteed.





ARM Holdings...you better hang on.

Comments

  • Reply 1 of 21
    1337_5l4xx0r1337_5l4xx0r Posts: 1,558member
    Jon Stokes basically sums up my thoughts in his article: they mention how low-power it can be, and how high-performing it can be, but never in the same sentence.



    Quote:

    If you compare Intel's power consumption numbers to its performance numbers, you'll notice that there's zero overlap. Nowhere does Intel reveal how much power Moorestown draws when running Quake 3 at 100FPS, or when executing the SunSpider JavaScript benchmark in two seconds. That's because the power draw in those situations is certain to be astronomic by smartphone standards. In other words, as long as you're doing a bunch of stuff that either doesn't involve the main SoC at all (e.g., talking, idling), or will let you shut down all but a few specialized blocks (e.g., 1080p video playback, audio playback), you can expect to have something approaching a normal smartphone experience. But when you actually use the main SoC like a real x86 processor... well, Intel is silent on those power numbers, and that's almost certainly because they're ugly.



    The reason for Moorestown's decidedly unsmartphone-like range of power/performance points is that Intel relied heavily on power gating and other, very fine-grained dynamic power optimization techniques. This means that under ideal conditions, with most of the complex and power-hungry parts of the platform either shut down or severely throttled, the SoC's low leakage means that Moorestown's power profile will look much like that of a more integrated, simpler ARM platform. But as you start turning on and upclocking the parts of the SoC that can deliver the kinds of performance results that Intel touts in its slides, Moorestown's power usage will spike up out of smartphone territory and into the lower reaches of the netbook stratosphere. In other words, you can't get something for nothing, despite the fact that Intel can make it look like you do by selectively pairing some nothing-based power numbers with some something-based performance numbers.



    Do you want a hot phone burning your ear?
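    Stokes's no-overlap point is essentially the textbook dynamic-power relation P ≈ C·V²·f at work: power gating shrinks the effective switched capacitance at idle, but running the whole SoC flat out restores it. A toy model with made-up numbers (illustrative assumptions only, not Intel's figures):

```python
# Toy model of why an SoC's idle and peak power can differ so much.
# Dynamic CMOS power scales as P = C * V^2 * f; power gating removes
# whole blocks from C. All figures below are illustrative assumptions.

def dynamic_power(cap_farads, volts, hz):
    """Classic switching-power estimate for a CMOS block."""
    return cap_farads * volts**2 * hz

# Hypothetical SoC: most blocks gated off at idle, everything on at full tilt.
idle = dynamic_power(cap_farads=0.1e-9, volts=0.75, hz=100e6)  # a few blocks, throttled
peak = dynamic_power(cap_farads=2.0e-9, volts=1.10, hz=1.5e9)  # all blocks, upclocked

print(f"idle ~{idle*1000:.1f} mW, peak ~{peak*1000:.0f} mW, ratio {peak/idle:.0f}x")
```

    With these (hypothetical) parameters the peak/idle ratio is in the hundreds, which is why pairing idle power numbers with peak performance numbers tells you so little.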
  • Reply 2 of 21
    backtomacbacktomac Posts: 4,579member
    Things get really interesting at 32nm. I wonder why Intel aren't more aggressive at getting Atom fabbed at the 32nm node.
  • Reply 3 of 21
    programmerprogrammer Posts: 3,458member
    Quote:
    Originally Posted by backtomac View Post


    Things get really interesting at 32nm. I wonder why Intel aren't more aggressive at getting Atom fabbed at the 32nm node.



    Because their 32nm fabs are busy churning out their real money makers.





    I think Intel has a long way to go before they can really take a shot at ARM's position in smartphones and other high-mobility devices. And ARM has a lot of room to keep moving ahead before Intel closes in on them. ARM is open to licensing their cores and building custom SoC devices... this is something that Intel hasn't shown a willingness to do yet. It's about a lot more than just the processor.
  • Reply 4 of 21
    hmurchisonhmurchison Posts: 12,425member
    Yes



    I think Intel's "Walled Garden" is not enticing enough considering how much flexibility comes with ARM cores from licensees.



    Intel's Moorestown will generate some eye-popping numbers, but the chip proliferation is a concern. I know Apple has been constantly reducing the component count with each new iteration of the iPhone. I don't see how they or others would want a less integrated platform, hardware-wise.
  • Reply 5 of 21
    bitemymacbitemymac Posts: 1,147member
    Quote:
    Originally Posted by backtomac View Post


    Things get really interesting at 32nm. I wonder why Intel aren't more aggressive at getting Atom fabbed at the 32nm node.



    For Intel, 45nm produces the highest yields, and 32nm still isn't mature enough to match the profit levels of the 45nm process.



    Consumer grade chips will have to wait until the 32nm process matures and the yield improves.
  • Reply 6 of 21
    wizard69wizard69 Posts: 13,377member
    Really we should wait for actual devices that run similar software, then do some side-by-side testing. I'm sure a few companies will get duped into implementing this in a cell phone, but I can hardly see the product being successful. As has been pointed out, trashing the battery every time you use the device makes for a bad consumer experience.



    Now for tablets I see a different dynamic. Performance will become an issue on these devices as the market heats up. Performance however isn't the only attraction; the eventual ability to go beyond 32-bit is. Right now Apple is getting away with selling the iPad with far too little RAM, but that won't last forever. Some may think that more than 4GB of RAM in a tablet is unattainable in the near future. I would argue it isn't that far off.



    Depending upon how you look at Apple's iPad tablet you might see Apple try to implement a real Mac tablet or even a low-cost notebook with this chip. It actually might give a burst of sales to the netbook makers too. Still, in a cell phone it would be stupid.



    The flip side here is this: why would Apple even bother after the investment already made in ARM-based devices? Apple has a very long history with ARM, and ARM is an extremely good fit in mobile, especially in the made-to-order SoC market.









    Dave
  • Reply 7 of 21
    programmerprogrammer Posts: 3,458member
    Quote:
    Originally Posted by wizard69 View Post


    Some may think that more than 4GB of RAM in a tablet is unattainable in the near future. I would argue it isn't that far off.



    It's a few years off... but they'll be able to support >4GB, just not >4GB per process. That will extend the life of a 32-bit processor a bit. There just isn't a clamoring demand for 64-bit apps on the desktop, never mind in a tablet or other mobile device. And like PPC, ARM would suffer a performance hit going to 64-bit... unlike x86, where going 64-bit is partly motivated by a performance win. I'd guess that we're looking at close to a decade before it becomes a real issue (based on Moore's Law, and that's assuming it continues unbounded, which is looking less likely than ever before). And in the decade timeframe I would suggest that we'll see more radical changes in processors than this.
  • Reply 8 of 21
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by Programmer View Post


    It's a few years off... but they'll be able to support >4GB, just not >4GB per process. That will extend the life of a 32-bit processor a bit.



    I don't think the current ARM architecture supports any of the common addressing schemes used to supply a machine with multiple 4GB segments. Admittedly my info is real thin on this, and my experience is with embedded ARM processors. Given that that might be true, would it not be easier for ARM to simply extend the architecture to 64 bits? The way I understand it, power-wise it doesn't impact the ALU all that much, but of course it makes moving data around more expensive. Note too, they don't need 64-bit addressing to the outside world; they could go a long way by simply adding 8 bits to the current addresses.

    Quote:

    There just isn't a clamoring demand for 64-bit apps on the desktop, nevermind in a tablet or other mobile device.



    That depends upon who you talk to and what they are doing. For many of Apple's core users 64-bit is a big deal on the desktop. Admittedly the tablet world is very different. However, in my opinion Apple's one big mistake is shipping the iPad with too little memory. 4GB isn't required today, but I could see the platform growing rapidly if it had even more RAM to support a wider array of apps. Of course right at the moment it doesn't make a difference when Apple sells out every week.

    Quote:



    And like PPC, ARM would suffer a performance hit going to 64-bit... unlike x86 where going 64-bit is partly motivated by a performance win. I'd guess that we're looking at close to a decade before it becomes a real issue (based on Moore's Law, and that's assuming that it continues unbounded which is looking less likely than ever before). And in the decade timeframe I would suggest that we'll see more radical changes in processors than this.



    It isn't a given, because you don't know if the instruction size would change to 64 bits.





    Dave
  • Reply 9 of 21
    programmerprogrammer Posts: 3,458member
    Quote:
    Originally Posted by wizard69 View Post


    I don't think the current ARM architecture supports any of the common addressing schemes used to supply a machine with multiple 4GB segments. Admittedly my info is real thin on this, and my experience is with embedded ARM processors. Given that that might be true, would it not be easier for ARM to simply extend the architecture to 64 bits? The way I understand it, power-wise it doesn't impact the ALU all that much, but of course it makes moving data around more expensive. Note too, they don't need 64-bit addressing to the outside world; they could go a long way by simply adding 8 bits to the current addresses.



    The important thing is that adding support for multiple 32-bit process spaces doesn't require any ISA changes, nor does it impact the user programming model (i.e. apps don't notice). User visible segments aren't required. It also means you can add just a few bits to the kernel visible ISA and hardware, which is much cheaper than going to a full 64-bit ISA.
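    The scheme described here (user code keeps 32-bit addresses; only the kernel-managed translation grows wider) can be sketched as a toy page-table lookup. This is an illustration of the idea only, not ARM's actual mechanism, and the page-table contents are made up:

```python
# Toy illustration of giving each process a full 32-bit address space
# while the machine has more than 4 GB of physical RAM: the extra
# address bits live only in the kernel-managed translation, so user
# code never sees them. A sketch of the concept, not a real MMU.

PAGE_SHIFT = 12  # 4 KB pages

# Per-process page tables: virtual page number -> physical frame number.
# proc_b's frame sits above the 4 GB line (frame numbers >= 2**20).
page_tables = {
    "proc_a": {0x00000: 0x0000001},   # below 4 GB
    "proc_b": {0x00000: 0x1000002},   # above 4 GB, needs bits past 32
}

def translate(proc, vaddr):
    """Map a 32-bit virtual address to a 40-bit physical address."""
    assert 0 <= vaddr < 2**32, "user addresses stay 32-bit"
    vpn, offset = vaddr >> PAGE_SHIFT, vaddr & (2**PAGE_SHIFT - 1)
    paddr = (page_tables[proc][vpn] << PAGE_SHIFT) | offset
    assert paddr < 2**40  # 8 extra physical bits, as suggested above
    return paddr

# The same 32-bit pointer lands in two different physical homes:
print(hex(translate("proc_a", 0x123)), hex(translate("proc_b", 0x123)))
```

    The apps see identical 32-bit pointers; only the kernel's tables know that one process lives above the 4 GB line.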



    Quote:

    That depends upon who you talk to and what they are doing. For many of Apple's core users 64-bit is a big deal on the desktop. Admittedly the tablet world is very different. However, in my opinion Apple's one big mistake is shipping the iPad with too little memory. 4GB isn't required today, but I could see the platform growing rapidly if it had even more RAM to support a wider array of apps. Of course right at the moment it doesn't make a difference when Apple sells out every week.



    The tablet world is extremely different... for the user model that the iPad is aimed at. Apple isn't targeting everyone with the iPad. They have a very clear model of what it's for, and that's why it's so effective. Everyone else's tablets are more of a kitchen-sink approach where they are still thinking it's just a different-form-factor notebook... which has a market, but a much, much smaller market. This isn't idle speculation either -- Apple has outsold pretty much all other tablets in history combined. In a month.



    Quote:

    It isn't a given, because you don't know if the instruction size would change to 64 bits.



    It wouldn't. No other extension to a 32-bit ISA has gone to 64-bit instructions. There is no reason to do that. The size of an instruction determines how many instructions you can have, how many parameters you can support, and how many registers you can address. None of these things change going to 64-bit (except in x86's variable-length instruction scheme, where immediate-mode operands are supported, leading to really enormous instructions... but ARM is a RISC architecture).
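    The bit-budget argument can be made concrete: a fixed 32-bit instruction word has room for only so many opcode and register fields, and none of those fields need to widen just because the registers become 64 bits wide. The field layout below is illustrative, not any real ISA's encoding:

```python
# How a fixed 32-bit instruction word budgets its bits (an illustrative
# three-register layout, not any real ISA's encoding). None of these
# fields widen when the *registers* go from 32 to 64 bits.

FIELDS = {"opcode": 10, "rd": 5, "rs1": 5, "rs2": 5, "flags": 7}

assert sum(FIELDS.values()) == 32  # everything must fit in one word

num_opcodes   = 2 ** FIELDS["opcode"]   # distinct instructions
num_registers = 2 ** FIELDS["rd"]       # addressable registers

print(f"{num_opcodes} opcodes, {num_registers} registers per operand")
```

    The instruction count, operand count, and register count are all functions of the field widths, not of the register width, which is why a 64-bit mode doesn't force 64-bit instructions.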
  • Reply 10 of 21
    spotonspoton Posts: 645member
    Switching from RISC will only buy a little more time before thermal issues set in anyway.
  • Reply 11 of 21
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by Programmer View Post


    The important thing is that adding support for multiple 32-bit process spaces doesn't require any ISA changes, nor does it impact the user programming model (i.e. apps don't notice). User visible segments aren't required. It also means you can add just a few bits to the kernel visible ISA and hardware, which is much cheaper than going to a full 64-bit ISA.



    There is no doubt in my mind that this will work, that is, to have multiple 32-bit segments. As you indicate, this can be transparent to the apps running under the OS. The problem is that processes are still 32-bit apps.



    Quote:



    The tablet world is extremely different... for the user model that the iPad is aimed at. Apple isn't targeting everyone with the iPad. They have a very clear model of what it's for, and that's why it's so effective.



    For the most part I have to agree with this, except when it comes to the amount of RAM in the iPad. Leaving an app with less addressable RAM than what is in an iPhone or Touch device is very bad on Apple's part.



    I know there is no free lunch in the world of electronics but this machine should have come with more RAM or at least the option for more.

    Quote:

    Everyone else's tablets are more of a kitchen-sink approach where they are still thinking it's just a different-form-factor notebook... which has a market, but a much, much smaller market. This isn't idle speculation either -- Apple has outsold pretty much all other tablets in history combined. In a month.



    I'm not a big fan of the kitchen sink approach either but I don't see RAM as even fitting into this discussion. RAM, well more of it, would simply have made for a wider range of potential software running on the device.

    Quote:

    It wouldn't. No other extension to a 32-bit ISA has gone to 64-bit instructions. There is no reason to do that. The size of an instruction determines how many instructions you can have, how many parameters you can support, and how many registers you can address.



    Interestingly, ARM is the only company I can think of that actually shrank its instruction size, by supporting Thumb instructions.



    As to 32-bit instructions in a 64-bit architecture, you do run into issues mapping additional instructions onto an old instruction set, that is, instructions to support the additional registers and operations. A common approach is an extended instruction that takes an additional word. While this isn't a 64-bit instruction per se, it is 64 bits' worth of data.

    Quote:

    None of these things change going to 64-bit (except in x86 variable length instruction scheme where immediate mode operands are supported, leading to really enormous instructions... but ARM is a RISC architecture).



    Actually the only architecture that was really designed from the beginning to support both 32 and 64 bit hardware is PowerPC. While ARM is also RISC I'm not convinced that it has enough unimplemented instructions available to implement 64 bit operations completely. There is only so much that can be encoded into a 32 bit word, so the rational approach would be to use two 32 bit words to define new instructions. This does increase code space for 64 bit operations but modern processors can easily handle the extra instruction length without an impact on the pipelines.



    Frankly to discuss this further I'd have to find some reference materials. I believe that ARM has more or less completely used all of the possible Thumb instructions but I'm not sure what the status of the ARM instructions is.



    Dave
  • Reply 12 of 21
    programmerprogrammer Posts: 3,458member
    Quote:
    Originally Posted by wizard69 View Post


    For the most part I have to agree with this, except when it comes to the amount of RAM in the iPad. Leaving an app with less addressable RAM than what is in an iPhone or Touch devices is very bad on Apples part.



    The iPad has the same physical memory as the 3GS (and the 4G, it appears, according to the rampant leaks). The major difference is the larger display buffer, which isn't all that much really. Any remaining delta will be due to a higher OS impact, which can be optimized.



    Quote:

    I know there is no free lunch in the world of electronics but this machine should have come with more RAM or at least the option for more.



    Cost & power. Their price point and battery life are pretty stellar. 256MB is quite a lot of space... developers have just gotten lazy. A bit of memory-optimization work can make an enormous difference, improve performance at the same time, and means you have a cheaper device and longer battery life. For the vast majority of apps, it's plenty. For other apps... either this isn't the right device (get an iPad Pro), or find an innovative solution.





    Quote:

    Interestingly here ARM is the only company I can think of that actually shrunk its instruction size by supporting Thumb instructions.



    And they did that precisely because compact instructions are crucial in small, mobile devices. Doubling the size of instructions is a huge price to pay, and unnecessary. If they really want to support 64-bit mode eventually, they can simply have a completely different instruction encoding in 64-bit mode. It actually takes very few new instructions to support. You're right that Thumb mode is pretty much full, but Thumb mode doesn't need to support 64-bit mode.
  • Reply 13 of 21
    splinemodelsplinemodel Posts: 7,311member
    Intel doesn't have a business model that can succeed in the embedded space. ARM does. End of story.



    The fact is, if there's a choice, every developer on the planet will run, not walk, away from x86. Intel has already completely fucked the embedded space by making little endian the generic standard for code portability -- every DSP coder on Earth hates them for it. Combine ARM's very clean architecture with their extremely empowering licensing model, and I just don't see a way for Intel to compete.
  • Reply 14 of 21
    trmorrntrmorrn Posts: 5member
    why change what has been the golden apple?
  • Reply 15 of 21
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by Programmer View Post


    The iPad has the same physical memory as the 3GS (and the 4G, it appears, according to the rampant leaks). The major difference is the larger display buffer, which isn't all that much really. Any remaining delta will be due to a higher OS impact, which can be optimized.



    Well, that all depends upon what you mean by "all that much". The fact that the iPad starts out with less RAM for apps than the current iPhone worries me. Maybe it can be optimized a bit, but one reality that we have to keep an eye on is that the larger screen requires more data space simply because of all those pixels.

    Quote:





    Cost & power. Their price point and battery life is pretty stellar. 256MB is quite a lot of space... developers have just gotten lazy.



    There is a lot of truth in the point that developers have gotten lazy. However, on the flip side I can see lots of people, programmers, developers, or whatever, saying why waste my time. The reality is silicon is cheap. This doesn't even take into account that some apps simply won't fit into the available RAM easily.



    Quote:

    A bit of memory optimization work can make an enormous difference, improve performance at the same time, and means you have a cheaper device and longer battery life. For the vast majority of apps, its plenty. For other apps... either this isn't the right device (get an iPad Pro), or find an innovative solution.



    Well, that iPad Pro might come earlier than you think, mostly because the RAM allocation leaves Apple open to aggressive competition. {If the competition ever catches up}



    In any event, this chronic lack of RAM is a real Apple problem across the product line.

    Quote:









    And they did that precisely because compact instructions are crucial in small, mobile devices. Doubling the size of instructions is a huge price to pay and unnecessary. If they really want to support 64-bit mode eventually, they can simply have a completely different instruction encoding in 64-bit mode. It actually takes very little new instructions to support. You're right that thumb mode is pretty much full, but the thumb mode doesn't need to support 64-bit mode.



    To be honest I don't see a demand from Apple for a 64-bit mode ARM in the next two years. I suspect that the next big jump will come from multi-core. I'd love to see Apple get garbage collection working on iPhone OS so that we can have something like MacRuby running directly on the hardware. A self-hosted development environment is exactly what the iPad needs.





    Dave
  • Reply 16 of 21
    programmerprogrammer Posts: 3,458member
    Quote:
    Originally Posted by wizard69 View Post


    Well, that all depends upon what you mean by "all that much". The fact that the iPad starts out with less RAM for apps than the current iPhone worries me. Maybe it can be optimized a bit, but one reality that we have to keep an eye on is that the larger screen requires more data space simply because of all those pixels.



    Most apps don't need bitmapped images, they are better off doing primitive-based drawing and that makes them more resolution independent.



    Quote:

    There is a lot of truth in the point that developers have gotten lazy. However on the flip side I can see lots of people, programmers developers or whatever saying why waste my time. The reality is silicon is cheap. This doesn't even take into account that some apps simply won't fit into the available ram easily.



    This is a bit ridiculous... desktop machines a decade ago had less available RAM and did highly sophisticated things -- more than what is expected of the iPad today. The majority of apps that you might want to run on an iPad simply don't need that much memory. Based on a quick bit of research we're talking about something on the order of 50-100 megabytes. Really guys, that's TONS of memory. I've done quite sophisticated apps in sub-10 MB machines. Thousands of excellent games shipped on the 32MB PS2 and 24MB GameCube. And in both cases they were missing some of the capabilities in iPhoneOS that help with the memory pressure.



    Quote:

    Well, that iPad Pro might come earlier than you think, mostly because the RAM allocation leaves Apple open to aggressive competition.



    It'll come because they'll want a bigger margin from people willing to pay for more RAM and hardware features (like a front facing camera). The vast majority of people in the market will have no issue at all with 256MB of RAM.



    Quote:

    To be honest I don't see a demand from Apple for a 64-bit mode ARM in the next two years. I suspect that the next big jump will come from multi-core. I'd love to see Apple get garbage collection working on iPhone OS so that we can have something like MacRuby running directly on the hardware. A self-hosted development environment is exactly what the iPad needs.



    I agree about the lack of 64-bit need... my original reply said a decade before it'll really be needed! Moore says double every two years; we need 5 doublings to exceed 4 GB... ergo, 10 years.
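    That back-of-the-envelope estimate checks out, taking the iPad's 256 MB as the baseline:

```python
# Programmer's Moore's-Law estimate, starting from the iPad's 256 MB:
# doubling every two years, how long until mobile RAM exceeds 4 GB?

ram_mb, years = 256, 0
while ram_mb <= 4096:    # 4 GB = 4096 MB; stop once we exceed it
    ram_mb *= 2          # one Moore's-Law doubling...
    years += 2           # ...every two years

print(f"{years} years ({ram_mb} MB)")
```

    Five doublings take 256 MB past 4 GB, and at two years per doubling that is the ten-year figure quoted above.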



    Not sure what you mean by "get garbage collection working in iPhoneOS"? You mean implement the Obj-C GC mechanism introduced in Obj-C 2.0? That has nothing to do with GC implementations that come with dynamic languages like Ruby. Apple has clearly indicated that they want "native" applications, but I have a hard time seeing how they are going to prohibit Obj-C applications with embedded dynamic languages that use their own GC scheme. Such a "self-hosted" environment has potential for customizing particular applications, but I have to agree with Apple that you wouldn't want to be doing from-scratch app development in a dynamic language.
  • Reply 17 of 21
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by Programmer View Post


    Most apps don't need bitmapped images, they are better off doing primitive-based drawing and that makes them more resolution independent.



    This of course is true, but frankly it doesn't mean that the use of bitmaps ceases. Further, you still need buffers to draw the images to.

    Quote:

    This is a bit ridiculous... desktop machines a decade ago had less available RAM and did highly sophisticated things -- more than what is expected of the iPad today.



    Well, that depends upon what you expect from the iPad. Besides that, I don't really think desktop x86 systems had the GUI responsiveness of the iPad. If nothing else, it is extremely impressive what Apple has accomplished with the A4.



    The lack of RAM dramatically limits what can be produced for the iPad. I suspect this is intentional.

    Quote:

    The majority of apps that you might want to run on an iPad simply don't need that much memory. Based on a quick bit of research we're talking about something on the order of 50-100 megabytes. Really guys, that's TONS of memory. I've done quite sophisticated apps in sub-10 MB machines. Thousands of excellent games shipped on the 32MB PS2 and 24MB GameCube. And in both cases they were missing some of the capabilities in iPhoneOS that help with the memory pressure.



    I have a very long history with computing hardware in various forms. This goes all the way back to Vic 20s on the PC side and Texas Instruments 5TI controllers on the industrial side. The common theme on all the platforms was the lack of RAM. It is a huge problem when a system isn't upgradeable. As to Apple equipment, I was an original 68K Mac Plus owner, and again memory was an issue almost immediately.



    Now I completely realize we are talking about substantially more RAM in this rev of the iPad. But we are also talking about a platform running a version of Unix with an SDK that isn't known for being speedy. In any event, 120 MB is just the new 640K.

    Quote:







    It'll come because they'll want a bigger margin from people willing to pay for more RAM and hardware features (like a front facing camera). The vast majority of people in the market will have no issue at all with 256MB of RAM.



    I don't think most users even know how much RAM is in the device. Nor do people walking into an Apple store understand what RAM is.



    As an aside I was in an Apple store earlier tonight, lusting madly over an iPad, and happened to hear a conversation between another customer and a clerk. The customer asked if the iPad had enough memory to surf the Internet and then started talking GB. People simply don't understand the difference between RAM and secondary storage anymore.



    Now we all know the iPad surfs the net pretty well, especially with a fast WiFi connection. A fast connection masks the redownloading of sites. On a 3G connection, though, the ability to buffer pages would be far more useful. That is only part of the issue though, because the lack of RAM makes things like Flash on the iPad impossible.

    Quote:







    I agree about the lack of 64-bit need... my original reply said a decade before it'll really be needed! Moore says double every two years, need 5 doublings to exceed 4 GB... ergo, 10 years.



    I don't want to predict when 64-bit processors and the associated RAM will become viable in an iPad device. The thing is, Intel isn't that far off for processor hardware; it is mostly a power-reduction effort now. Low-power RAM is a separate issue, but process shrinks happen here just the same. Now, current software demands on tablets don't require GBs of RAM, but from the marketing standpoint it is an area where other manufacturers can compete. In some regards it will result in far better performance when compared to an iPad.



    Of course none of this competitive hardware exists right now, but considering Apple's success there have to be a few crash projects in the competitors' labs right now. Since the iPad is a grand-slam success, the competition will be looking for ways to exploit the device's few weaknesses.

    Quote:



    Not sure what you mean by "get garbage collection working in iPhoneOS"? You mean implement the Obj-C GC mechanism introduced in Obj-C 2.0?



    Last I knew there is no garbage collection on iPhone OS 3.x; it handles memory management by reference counting. Over on the MacRuby list, bits of discussion have taken place to figure out how to deal with that. I wouldn't be surprised to see garbage collection in the future as iPhone processors become more powerful.

    Quote:

    That has nothing to do with GC implementations that come with dynamic languages like Ruby. Apple has clearly indicated that they want "native" applications, but I have a hard time seeing how they are going to prohibit Obj-C applications with embedded dynamic languages that use their own GC scheme.



    MacRuby is a fork of sorts built upon the Mac runtime. It is a very interesting concept, but it is barely ready for Mac development, much less iPhone. It is worth looking into and could be "the" scripting platform for the Mac in the future.

    Quote:

    Such a "self-hosted" environment has potential for customizing particular applications, but I have to agree with Apple that you wouldn't want to be doing from-scratch app development in a dynamic language.



    That is pretty much MacRuby's goal. Granted, it is a ways off from release 1, but the whole project is driven by Apple. It depends upon what self-hosted means, but right now MacRuby is just another language for the Xcode environment. Nothing would prevent a dedicated development environment though.





    Dave
  • Reply 18 of 21
    programmerprogrammer Posts: 3,458member
    Quote:
    Originally Posted by wizard69 View Post


    This of course is true, but frankly it doesn't mean that the use of bitmaps ceases. Further, you still need buffers to draw the images to.



    True, but those don't actually amount to all that much... especially on the iPad where you're really only looking at one app at a time, and it doesn't have as heavily windowed a display as the Mac.
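    The framebuffer arithmetic backs this up: even at the iPad's resolution, one full 32-bit screen buffer is a small slice of 256 MB. A quick check, assuming 4 bytes per pixel (real compositing may keep a few such buffers, but the order of magnitude is what matters):

```python
# How much of the iPad's 256 MB does the larger display buffer eat?
# Assumes 4 bytes/pixel (32-bit color); the OS may hold a handful of
# such buffers for compositing, but the scale is what matters here.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

ipad   = framebuffer_mb(1024, 768)   # iPad screen
iphone = framebuffer_mb(480, 320)    # 3GS-era iPhone screen

print(f"iPad {ipad:.1f} MB vs iPhone {iphone:.2f} MB per buffer")
```

    Roughly 3 MB per iPad buffer versus about 0.6 MB per iPhone buffer: a real difference, but small against 256 MB of RAM.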



    Quote:

    Well, that depends upon what you expect from the iPad. Besides that, I don't really think desktop x86 systems had the GUI responsiveness of the iPad. If nothing else, it is extremely impressive what Apple has accomplished with the A4.



    GUI responsiveness doesn't arise from having lots of memory... quite the opposite, actually. These days, the more memory you need to process, the slower your processing.



    Quote:

    The lack of RAM dramatically limits what can be produced for the iPad. I suspect this is intentional.



    Sounds like a conspiracy theory to me. I maintain that it is merely the sweet-spot that the boys at Apple have currently identified.



    Quote:

    I have a very long history with computing hardware in various forms. This goes all the way back to VIC-20s on the PC side and Texas Instruments 5TI controllers on the industrial side. The common theme on all the platforms was the lack of RAM. It is a huge problem when a system isn't upgradeable. As to Apple equipment, I was an original 68K Mac Plus owner, and again memory was an issue almost immediately.



    You and me both. The 128K Mac was a bear to use and I'm glad I'll never have to flip another floppy.



    Quote:

    Now I completely realize we are talking about substantially more RAM in this rev of the iPad. But we are also talking about a platform running a version of Unix with an SDK that isn't known for being speedy. In any event, 120 MB is just the new 640K.



    This is a non-linear situation. Going from 128K to 128M enables far more applications than going from 128M to 128G. Indeed, given the bandwidth constraints on RAM (relative to the potential rate of computation), a 128G application wouldn't be something you'd want to use. Just consider how long it would take to fill that memory!
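    As a back-of-the-envelope check on that claim, the fill time scales directly with capacity over bandwidth. The bandwidth figures below are illustrative assumptions for the sake of the arithmetic, not measurements of any real device:

    ```python
    # Time to write every byte of a given amount of RAM once at an
    # assumed memory bandwidth. Both bandwidth figures are hypothetical.

    GIB = 1024 ** 3  # bytes in a gibibyte

    def fill_time_seconds(ram_bytes, bandwidth_bytes_per_s):
        """Seconds needed to fill RAM once at the given bandwidth."""
        return ram_bytes / bandwidth_bytes_per_s

    # 128 MB at an assumed 1 GB/s: filled almost instantly.
    print(fill_time_seconds(128 * 1024 ** 2, 1 * GIB))   # 0.125 s

    # 128 GB at an assumed 10 GB/s: painfully long for an interactive device.
    print(fill_time_seconds(128 * GIB, 10 * GIB))        # 12.8 s
    ```

    Even with bandwidth scaled up tenfold, touching all of a 128G working set takes orders of magnitude longer than touching a 128M one, which is the non-linearity being described.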





    Quote:

    I don't think most users even know how much RAM is in the device. Nor do people walking into an Apple store understand what RAM is.



    Good. That's as it should be. The device should be judged (by the consumer) on what you can do with it versus what it costs (and numerous other non-RAM size factors).





    Quote:

    Now we all know the iPad surfs the net pretty well, especially with a fast WiFi connection. A fast connection masks the redownloading of sites. On a 3G connection, though, the ability to buffer pages would be far more useful. That is only part of the issue, though, because the lack of RAM makes things like Flash on the iPad impossible.



    Browsers typically cache pages to the file system, so it's not necessarily about the amount of RAM. Adobe's Flash has plenty of issues, including excessive RAM usage.





    Quote:

    Of course none of this competitive hardware exists right now, but considering Apple's success there have to be a few crash projects in the competitors' labs right now. Since iPad is a considerable grand slam success, the competition will be looking for ways to exploit the device's few weaknesses.



    Back in January when iPad was announced, techie people all around were going on and on about how its specs were inferior to this and that new tablet announced at CES. The iPad's price point raised eyebrows, but most techies were going on at length about "it's too limited, it's too closed, it doesn't have this, it doesn't have enough of that". They all missed the point. And that's why only the iPad has passed a million sales. In a month.





    Quote:

    Last I knew there is no garbage collection on iPhone OS 3.x. Last I knew, iPhone OS handled memory management by reference counting. Over on the Mac Ruby list, bits of discussion have taken place to figure out how to deal with that. I wouldn't be surprised to see garbage collection in the future as iPhone processors become more powerful.



    Mac Ruby is a fork of sorts built upon the Mac runtime. It is a very interesting concept, but barely ready for Mac development, much less iPhone. It is worth looking into and could be "The" scripting platform for the Mac in the future.



    Those are pretty much Mac Ruby's goals. Granted, it is a ways off from release 1, but the whole project is driven by Apple. It depends upon what self-hosted means, but right now Mac Ruby is just another language for the Xcode environment. Nothing would prevent a dedicated development environment, though.



    Scripting environments continue to be a dime a dozen. I can't say that yet another one interests me. If they are stuck waiting for an Apple GC solution (which will come eventually, but as it is RAM-sensitive it will take a while) then it's even less interesting. I'd just toss Lua into an app and go from there.
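    For readers unfamiliar with the retain/release model the thread keeps referring to, here is a toy sketch of how reference counting works. This is an illustration in Python with made-up names, not Apple's actual Objective-C API:

    ```python
    # Toy model of iPhone OS-era retain/release reference counting.
    # Class and method names are illustrative, not Apple's.

    class RefCounted:
        def __init__(self):
            self.refcount = 1        # creation implies one owner
            self.deallocated = False

        def retain(self):
            """A new owner claims the object."""
            self.refcount += 1
            return self

        def release(self):
            """An owner gives up its claim; last one out deallocates."""
            self.refcount -= 1
            if self.refcount == 0:
                self.deallocated = True

    obj = RefCounted()
    obj.retain()    # a second owner takes a reference
    obj.release()   # first owner is done; object survives
    obj.release()   # last owner is done; object is deallocated
    print(obj.deallocated)  # True
    ```

    The point being debated is that an embedded dynamic language (Lua, Ruby) typically brings its own tracing garbage collector instead, and has to interoperate with a host runtime that counts references like this.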
  • Reply 19 of 21
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by Programmer View Post


    True, but those don't actually amount to all that much... especially on the iPad where you're really only looking at one app at a time, and it doesn't have as heavily windowed a display as the Mac.







    GUI responsiveness doesn't arise from having lots of memory... quite the opposite, actually. These days, the more memory you need to process, the slower your processing.



    On this one I have to disagree. The more of the GUI that you can keep in memory ready to go, the more responsive the device will be, even on iPad, where trips to secondary storage for data are relatively quick and of low latency.

    Quote:

    Sounds like a conspiracy theory to me. I maintain that it is merely the sweet-spot that the boys at Apple have currently identified.



    Could be. I just think it is more likely an issue of engineering in limitations so that the device doesn't take too much market share from the laptop lineup.

    Quote:

    You and me both. The 128K Mac was a bear to use and I'm glad I'll never have to flip another floppy.



    Nor do I want to repeat what I was paying for a puny hard disk back then. It is interesting, though, that hard disks have actually caught up with the needs of many users; even RAM in desktop machines these days allows for viable operation of the computer.

    Quote:





    This is a non-linear situation. Going from 128K to 128M enables far more applications than going from 128M to 128G. Indeed, given the bandwidth constraints on RAM (relative to the potential rate of computation), a 128G application wouldn't be something you'd want to use. Just consider how long it would take to fill that memory!



    Who is talking about 128GB of RAM? Going to 512MB on the iPad would be fantastic, as just about all of it would be available to the user's apps. In effect, instead of 1xxMB of RAM for user apps you would end up with 36xMB of RAM. This is nothing to sneeze at, yet should still be doable in the multiple-chip module.
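    The budget behind those figures can be laid out explicitly. The OS footprint of roughly 150MB used here is an assumption chosen to be consistent with the "1xx" figure in the post, not a published number:

    ```python
    # Rough iPad RAM budget. The OS reserve is an assumed figure,
    # not something Apple has documented.

    def user_available_mb(total_mb, os_reserve_mb=150):
        """RAM left for user apps after an assumed fixed OS footprint."""
        return total_mb - os_reserve_mb

    print(user_available_mb(256))  # 106 -- the "1xxMB" case
    print(user_available_mb(512))  # 362 -- the "36xMB" case
    ```

    Under that assumption the OS footprint stays fixed, so doubling total RAM more than triples what an app can actually use, which is why the upgrade matters more than the raw doubling suggests.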

    Quote:







    Good. That's as it should be. The device should be judged (by the consumer) on what you can do with it versus what it costs (and numerous other non-RAM size factors).



    That is fine for people that don't care, but that isn't all of consumer space. My primary problem with Apple here is that they go out of their way to hide the spec. To me it is downright sleazy, and honestly I hate using that word in conjunction with Apple.



    Not to use another car analogy, but most people don't care about a car's specs, and those cars are often sold without regard to the internals. However, to some the particulars are important, be it for practical reasons (power to pull a trailer) or bragging rights at the bar. So auto manufacturers provide all the specs to the consumer freely. The thing is, the auto manufacturers know that the buyers are for the most part stupid with respect to the engineering of the vehicle. At the same time, though, they try to appeal to the user with more demanding needs, often offering options to tailor the vehicle to a specific need.



    In contrast, Apple's approach is to assume that all consumers are stupid. It is actually an insult to have an important parameter, like installed RAM, hidden away.

    Quote:







    Browsers typically cache pages to the file system, so it's not necessarily about the amount of RAM. Adobe's Flash has plenty of issues, including excessive RAM usage.



    Which brings us back to the reality that iPad doesn't have enough RAM. Right now it couldn't support Flash even if Apple wanted to.

    Quote:







    Back in January when iPad was announced, techie people all around were going on and on about how its specs were inferior to this and that new tablet announced at CES. The iPad's price point raised eyebrows, but most techies were going on at length about "it's too limited, it's too closed, it doesn't have this, it doesn't have enough of that". They all missed the point. And that's why only the iPad has passed a million sales. In a month.



    It isn't so much missing the point as seeing the market differently. iPad will continue to evolve; as such, I would expect to see the hardware change somewhat in the future. For a gen-one product, iPad is pretty darn good, but that doesn't mean it can't be improved. RAM is just the easy upgrade.

    Quote:







    Scripting environments continue to be a dime a dozen. I can't say that yet another one interests me. If they are stuck waiting for an Apple GC solution (which will come eventually, but as it is RAM-sensitive it will take a while) then it's even less interesting. I'd just toss Lua into an app and go from there.



    Yeah, they pop up faster than ants at a picnic. The difference here is that Apple supports Mac Ruby, thus the potential for Mac Ruby to become THE scripting language on the Mac and possibly the iPhone OS devices. I'm not even a real fan of Ruby, but would likely take a strong interest if Apple were to say this is the way on the Mac. The justification is the same as for learning Objective-C.







    Dave
  • Reply 20 of 21
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by wizard69 View Post


    On this one I have to disagree. The more of the GUI that you can keep in memory ready to go, the more responsive the device will be, even on iPad, where trips to secondary storage for data are relatively quick and of low latency.



    While you are correct, once you're past a few tens of megabytes even the biggest GUI is all in memory. If it's not, then your GUI is going to be grotesquely inefficient, and it will run slowly simply because there is too much memory to be processed efficiently. The GUI you see in most apps is a tiny amount of memory, and applications aren't really that much code... what usually scales up in size is the data model, and even that isn't overly large for the vast majority of apps that are going to run well on a 1GHz ARM. Games are the biggest memory hogs, usually, because of their complex 3D models and multitude of texture maps -- yet we're already at Wii-calibre capabilities on the iPad.



    Quote:

    Could be. I just think it is more likely an issue of engineering in limitations so that the device doesn't take too much market share from the laptop lineup.



    Price, battery, size... I don't think you have to look any farther to justify the iPad's current configuration. And doubling the tablet's RAM isn't going to substantially change its cannibalization of the laptop market... it is form factor and price that are doing the lion's share of that.



    Quote:

    Who is talking about 128GB of RAM? Going to 512MB on the iPad would be fantastic, as just about all of it would be available to the user's apps. In effect, instead of 1xxMB of RAM for user apps you would end up with 36xMB of RAM. This is nothing to sneeze at, yet should still be doable in the multiple-chip module.



    I was making the point that there are diminishing returns in increasing the available memory. There is a sweet spot determined by incremental benefit, cost, power, heat, and size. I think Apple chose wisely, and has the added advantage of using the same A4 in the iPad and the iPhone.



    Quote:

    That is fine for people that don't care, but that isn't all of consumer space. My primary problem with Apple here is that they go out of their way to hide the spec. To me it is downright sleazy, and honestly I hate using that word in conjunction with Apple.



    It took me one Google search to find the information I needed. Seems like a non-issue to me.



    Quote:

    Which brings us back to the reality that iPad doesn't have enough RAM. Right now it couldn't support Flash even if Apple wanted to.



    It could if Flash were competently designed and implemented (Apple's HTML5 implementation does quite nicely, thank you). Flash is such a mess, it's remarkable. Adobe's little empire can't fold up and collapse soon enough, IMO... and this isn't from listening to Apple & MS slag it; I have enough first-hand experience with it to hate it on its own demerits.



    Quote:

    For a gen-one product, iPad is pretty darn good, but that doesn't mean it can't be improved. RAM is just the easy upgrade.



    And I don't disagree. Next year they'll rev it with an A5 processor that will have 512MB RAM, 2 cores, twice as good a GPU, and lower power consumption.