Apple's A5 processor could pave the way for larger chip trend

Comments

  • Reply 81 of 120
    One of the big issues for Intel is that the company is set up to sell $150 CPUs. The ARM cores we are talking about cost more like $15-20. Intel needs to sell more expensive chips because it carries the huge overhead of very state-of-the-art fabs. But the fact of the matter is that mainframes were more or less replaced by minicomputers, which were replaced by PCs, which were replaced by laptops and now tablets. By not playing in this market, Intel is making itself irrelevant. I think in the next 2-4 years we will not have Intel-based machines at home, though Intel will hang on to the server market for much longer.
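    To put some rough numbers on that point (every figure below is an illustrative assumption, not an actual Intel or Samsung number), here is the kind of back-of-envelope wafer math that makes a $15-20 part awkward for a business built around $150 parts:

    Code:
    /* Back-of-envelope wafer economics.  All numbers are assumptions chosen
     * for illustration: wafer cost, die sizes, yields and selling prices
     * are NOT real Intel or Samsung figures. */
    #include <stdio.h>

    int main(void) {
        const double wafer_area_mm2 = 70000.0; /* rough usable area of a 300mm wafer */
        const double wafer_cost_usd = 5000.0;  /* assumed fully loaded wafer cost */

        struct { const char *name; double die_mm2, asp_usd, yield; } chips[] = {
            { "desktop x86 CPU (assumed)", 216.0, 150.0, 0.80 },
            { "ARM SoC (assumed)",         120.0,  18.0, 0.85 },
        };

        for (int i = 0; i < 2; i++) {
            double good_die = (wafer_area_mm2 / chips[i].die_mm2) * chips[i].yield;
            double revenue  = good_die * chips[i].asp_usd;
            printf("%-26s ~%4.0f good die/wafer, ~$%6.0f revenue, ~%4.1fx wafer cost\n",
                   chips[i].name, good_die, revenue, revenue / wafer_cost_usd);
        }
        return 0;
    }

    With those made-up numbers, a wafer of $18 SoCs brings in a small fraction of what a wafer of $150 CPUs does, which is exactly the problem for a company carrying the fixed costs of leading-edge fabs.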
  • Reply 82 of 120
    alandail Posts: 755 member
    Quote:
    Originally Posted by ash471 View Post


    What you say is certainly true for the very near term (e.g., 2 years or so). However, what if we look further out? Will ARM be 64-bit? Six cores? A 2 GHz clock? I think the point being made is that ARM is accelerating fast and the improvements in Intel chips are becoming less important. The issue isn't that Intel can't keep up (they can). Intel's problem is that their increases in horsepower are becoming increasingly irrelevant to people's computing needs. At some point we may cross a line where ARM provides sufficient horsepower and Intel's advantage is irrelevant. Keep in mind that despite Intel's prowess, it has struggled to reduce power consumption. It also doesn't have a terribly good track record with SoCs. Thus, there could very well be a time (maybe in the next 5 years) when ARM starts to replace chips in the mobile PC realm and potentially someday in consumer desktops.



    I think the failure of ARM in netbooks had a lot to do with (1) premature use of ARM (it still isn't there), (2) the lack of an OS optimized to take advantage of ARM, and (3) Microsoft using its marketing power to push Windows XP. With MS out of the picture, an OS optimized for ARM, and more advanced ARM chips, we may see a different result.



    Also, I think the major difference between iOS and OS X is touch vs. mouse optimization. You will never see OS X on a touch device and you will never see iOS on a Mac. However, there is no inherent reason why Apple will prefer one CPU architecture over another. Of course, switching isn't trivial, but Apple has shown that it can carry out a switch without a hitch. I think the real issue will always be a balance between computing power, power consumption, and cost. Right now Intel has such an advantage in the computing power (needed computing power) that it outweighs power consumption and cost in Macs. But things are changing fast........



    Well, there are three big differences between switching from PowerPC to Intel and switching from Intel to ARM.

    1 - the switch from PowerPC to Intel involved using the next generation processor that was faster, thus Rosetta could do a reasonable job emulating PowerPC apps. In this case the move wouldn't be to a more powerful processor, thus emulation wouldn't work as well. This is a minus.

    2 - Modern OS X apps are cocoa, which is a lot easier to move to new architectures with universal binaries than Carbon apps were. This is a plus.

    3 - unlike prior architecture changes, this isn't being suggested as a complete migration, but rather as a move to supporting dual architectures -- something they have just gotten away from by killing Rosetta in Lion. Again, this isn't as bad now that Carbon apps are pretty much history.



    I'm not sure how much sense it makes for MacBooks as I see iPads filling that need. But perhaps there's a case for a less expensive iMac going this route as some people still need desktop machines with large screens without really needing faster CPUs than they have today.
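    On point 2 above, a universal (fat) binary mostly comes down to compiling the same source once per architecture and gluing the slices together; the app code itself rarely has to care which slice it is in. A minimal sketch (the build commands are illustrative and assume an ARM toolchain/SDK is available, which is not a given for a Mac target today):

    Code:
    /* app.c - the same source builds into both slices of a fat binary;
     * architecture-specific branches like this are the exception. */
    #include <stdio.h>

    int main(void) {
    #if defined(__x86_64__) || defined(__i386__)
        printf("running the Intel slice of the fat binary\n");
    #elif defined(__arm__) || defined(__aarch64__)
        printf("running the ARM slice of the fat binary\n");
    #else
        printf("running on some other architecture\n");
    #endif
        return 0;
    }

    /* Illustrative build steps:
     *   clang -arch x86_64 app.c -o app_x86
     *   clang -arch armv7  app.c -o app_arm
     *   lipo -create app_x86 app_arm -output app
     * At launch the loader picks the slice that matches the CPU. */

    That is essentially what point 2 is getting at: for Cocoa apps, adding an architecture is mostly a build setting rather than a port.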
  • Reply 83 of 120
    Quote:
    Originally Posted by solipsism View Post


    LOL I'd love to see a water-cooled smartphone.



    I do believe that all current cell phone models can be water cooled --- once.
  • Reply 84 of 120
    hattig Posts: 860 member
    I don't see Apple moving Macs to ARM in the near future. Several things will be needed for this to occur:



    1. Single-threaded performance increases. ARM Cortex A9 is great for what it is, and outperforms an Atom at the same clock speed. But Atom isn't anything like a 3GHz+ Sandy Bridge. ARM Cortex A15 will be better of course, but not good enough - except maybe in a MacBook Air (or an iPad netbook type configuration - an iPad in the MBA form factor with the ability to run Mac apps).



    2. 64-bit. ARM Cortex A9 is strictly 32-bit all over. ARM Cortex A15 will extend the addressing range, but apps will still be 32-bit; systems will be able to have more than 4GB of memory, and each app will be able to have a full 4GB memory range. Full 64-bit capability isn't there, though, and ARM hasn't released any plans for it yet. MIPS, an ARM competitor, does have a 64-bit version, so I expect ARM is working hard on it internally, but extending an architecture to 64 bits isn't a trivial thing. (A small illustration of the 32-bit/4GB point follows this list.)



    3. Non-embedded-style CPUs. 128-bit memory buses and SO-DIMM support. PCIe integration. SATA 3. USB 3. And so on. At least Apple is in control of this - they have a 64-bit memory bus on the A5 (albeit internal, with layered DRAM). These designs will be a completely different branch from the AX 'embedded' chips; they will use more power and not be suitable for the iPad or below. I assume these will start appearing around 2014-15.
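    As an aside on point 2, here is what "32-bit apps on a system with more than 4GB" means in practice: each process is still capped by its pointer width, while the MMU's physical addressing (40 bits in the Cortex-A15 material) grows past 4GB. A quick sketch:

    Code:
    /* The per-process 4GB cap vs. extended physical addressing.
     * Build this once as a 32-bit binary and once as 64-bit to see
     * the per-process figure change. */
    #include <stdio.h>

    int main(void) {
        unsigned bits = 8u * (unsigned)sizeof(void *);
        if (bits < 64) {
            /* A process can only name 2^bits bytes of virtual address space,
             * no matter how much physical RAM the machine has. */
            printf("%u-bit pointers: at most %llu GB of address space per process\n",
                   bits, (unsigned long long)((1ULL << bits) >> 30));
        } else {
            printf("%u-bit pointers: the per-process address space is not the limit\n", bits);
        }
        /* 40-bit physical addressing (the Cortex-A15 extension) lets the OS
         * use this much RAM even while every app stays 32-bit: */
        printf("40-bit physical addressing covers %llu GB of RAM\n",
               (unsigned long long)((1ULL << 40) >> 30));
        return 0;
    }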



    Let's call them the E series, and assume the first generation will be for a cost-reduced MacBook Air type product (which might also carry different branding than Mac). The transistor budget will be around 1 billion transistors, and the die size could be up to 150mm^2. I could see us getting a single, 20nm-ish chip: four or eight ARM Cortex A15+ cores, each with 1MB of L2 cache, and PowerVR 6 series graphics (let's assume at least SGX543MP16 performance, running at 4x the clock speed too - i.e., 32x faster than the A5). It will have an external memory bus - 128-bit, driving DDR3+ memory. It may be that I/O will be delegated to an external chip, with PCIe3 integrated on the CPU connecting to discrete graphics and the I/O chip.



    It would be quite exciting to see the internal product plans at Apple. In the meantime we will just have to live with anticipation, releases and teardowns!



    Please note that I believe the A5 was designed to drive a Retina Display iPad, hence the excessive graphics. The bonus is that the excessive graphics give anti-aliasing for free in games, giving a more-Retina effect anyway. I don't expect 2012's A6 to actually offer more graphics capability; I expect it to be more about the die shrink to 32nm or 28nm. It could have four ARM cores, however.
  • Reply 85 of 120
    d-range Posts: 396 member
    Quote:
    Originally Posted by wizard69 View Post


    And running pathetically slow with little capability to tackle the bigger jobs. I just don't think you have an understanding of just how far behind ARM is performance wise, such machines would simply not meet the performance needs of many Mac users.



    Performance of current ARM designs isn't actually that bad; the latest chips (including the A5) are faster than, for example, current Intel Atoms for many tasks, and nettops and netbooks with Atoms have been sold by the millions. The gap compared to a high-end x86 chip from Intel or AMD is huge, not denying that, but for the typical garden-variety computing that many people do, a competent ARM SoC would probably suffice and have a few advantages over faster x86 chips.



    So while I don't see Apple actually switching to ARM for Macs, I don't rule out the possibility that they could introduce a MacBook Air type of laptop that runs on an ARM SoC. They could maybe even use a dual-CPU solution that lets you boot your computer either on the ARM chip or on the x86, depending on your computing needs. This could also be an interesting idea for iOS application compatibility, by the way.
  • Reply 86 of 120
    abarry Posts: 31 member
    Quote:
    Originally Posted by extremeskater View Post


    Apple didn't build the A5 chip, Samsung built the A5.



    "Designed in California, built in Korea"!
  • Reply 87 of 120
    notme Posts: 5 member
    Quote:
    Originally Posted by Ssampath View Post


    One of the big issues for Intel is that the company is set up to sell $150 CPUs. The ARM cores we are talking about cost more like $15-20. Intel needs to sell more expensive chips because it carries the huge overhead of very state-of-the-art fabs. But the fact of the matter is that mainframes were more or less replaced by minicomputers, which were replaced by PCs, which were replaced by laptops and now tablets. By not playing in this market, Intel is making itself irrelevant. I think in the next 2-4 years we will not have Intel-based machines at home, though Intel will hang on to the server market for much longer.



    I wrote this the other day and I think it fits perfectly:



    http://mattrichman.tumblr.com/post/4.../intel-is-dead
  • Reply 88 of 120
    Quote:
    Originally Posted by notme View Post


    I wrote this the other day and I think it fits perfectly:



    http://mattrichman.tumblr.com/post/4.../intel-is-dead



    Intel is treating its one entry in this field, the Atom, as a stepchild. The graphics on the chip are crippled, Intel places major restrictions on how people can use it, and it is still made on a 45nm process when most of their other chips are at 32nm and heading to 22nm. What works in one era does not work in another. It looks very much like Intel is going to lose this.
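    The node gap alone is worth spelling out: with idealized scaling, density goes as the inverse square of the feature size, so 45nm to 32nm is roughly a doubling and 22nm roughly doubles it again. Real scaling is worse, so treat these as upper bounds on what Atom gives up by staying on 45nm:

    Code:
    /* Idealized transistor-density scaling between process nodes. */
    #include <stdio.h>

    int main(void) {
        const double base_nm = 45.0;
        const double nodes_nm[] = { 32.0, 22.0 };
        for (int i = 0; i < 2; i++) {
            double gain = (base_nm / nodes_nm[i]) * (base_nm / nodes_nm[i]);
            printf("45nm -> %2.0fnm: ~%.1fx the transistors in the same area (ideal)\n",
                   nodes_nm[i], gain);
        }
        return 0;
    }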
  • Reply 89 of 120
    notme Posts: 5 member
    Quote:
    Originally Posted by Ssampath View Post


    Intel is treating its one entry in this field, the Atom, as a stepchild. The graphics on the chip are crippled, Intel places major restrictions on how people can use it, and it is still made on a 45nm process when most of their other chips are at 32nm and heading to 22nm. What works in one era does not work in another. It looks very much like Intel is going to lose this.



    They probably will. x86 is way too bloated for use in a tablet or a phone. If they don't completely reboot their operations they're dead.
  • Reply 90 of 120
    splinemodel Posts: 7,311 member
    Quote:
    Originally Posted by abarry View Post


    "Designed in California, built in Korea"!



    This is still preferable to "designed in California, built in China." SK is probably 50 years ahead of China in regard to societal development. Most people don't know this, but in the '80s SK had something of a silent revolution which brought in democracy and free-market policies. Since then it has been up and up.





    Quote:
    Originally Posted by notme View Post


    They probably will. x86 is way too bloated for use in a tablet or a phone. If they don't completely reboot their operations they're dead.



    This is an interesting thing. The x86 is a bloated, ugly architecture. However, it can be streamlined nicely, which is the way it has been implemented in every desktop/laptop chip since the first Centrinos. The problem is, this streamlining requires a _fixed_ number of gates for translation logic, so it can scale up well, but not down. In other words, I seriously doubt ARM has a chance to contend for the desktop/laptop market -- Intel is already so entrenched, and every innovation they make is effectively standardized and rolled into gcc before you have a chance to blink. By the same token, I seriously doubt x86 has a chance in the mobile market -- ARM is so entrenched, yadda yadda.
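    One way to see why a fixed chunk of translation logic "scales up but not down" is to treat it as a roughly constant power cost and compare it to shrinking power budgets. The 0.5W figure below is an assumed placeholder, not a measurement:

    Code:
    /* A fixed decode/translation cost as a share of different power budgets:
     * noise in a desktop chip, painful in a phone chip.  The 0.5W overhead
     * is an assumption for illustration only. */
    #include <stdio.h>

    int main(void) {
        const double decode_watts = 0.5;                            /* assumed fixed cost */
        const double budgets_w[]  = { 95.0, 35.0, 10.0, 2.0, 1.0 }; /* desktop .. phone */
        for (int i = 0; i < 5; i++)
            printf("%5.1fW budget: fixed overhead is ~%4.1f%% of the chip's power\n",
                   budgets_w[i], 100.0 * decode_watts / budgets_w[i]);
        return 0;
    }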



    The A15 is, in my opinion, maybe not the best way for ARM to go. I am still green here, and I need to do more homework, but IMHO ARM's best opportunity is to focus more energy on scaling down. We are moving more computing power to the edge, and the first example here was the smartphone/tablet/Kindle revolution. Now, the wave-riders are basically just scaling these up, feeding the marketing machine for bigger-faster in the same way that the PC market used to do 10 years ago. But the big winners will be the people who start the next wave, and that is sure to be something smaller and lighter than before. What's next: smartcards ... err, super-smartcards. I'm serious. The technologies required to produce a credit-card-sized computing platform, with wireless I/O and even a display and buttons, are all coming together this year. The fun part about this technology is that it's another 1000x reduction in the way you think about power: PCs are concerned with watts, phones with milliwatts, and cards with microwatts. For microwatt tech, the sweet spot is somewhere between 90 and 130nm, and there are loads of fabs now at this spec.
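    The watts-to-milliwatts-to-microwatts jump is easier to feel as a battery-life calculation. Using a typical coin-cell capacity from a datasheet (about 220mAh at 3V) and assumed average draws for each class of device:

    Code:
    /* How long one coin cell lasts at each power class.  The draws are
     * assumed round numbers for a PC-class, phone-class and card-class load. */
    #include <stdio.h>

    int main(void) {
        const double cell_joules = 0.220 * 3.0 * 3600.0;  /* ~220mAh at 3V */
        const double draws_w[]   = { 1.0, 1e-3, 10e-6 };
        const char  *labels[]    = { "1 W", "1 mW", "10 uW" };
        for (int i = 0; i < 3; i++) {
            double days = cell_joules / draws_w[i] / 86400.0;
            printf("%-6s average draw: one coin cell lasts ~%.1f days\n", labels[i], days);
        }
        return 0;
    }

    At a few microwatts the cell outlasts the product, which is the point at which scavenging ambient energy instead of carrying a battery starts to look realistic.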
  • Reply 91 of 120
    addabox Posts: 12,665 member
    Quote:
    Originally Posted by Splinemodel View Post


    The A15 is, in my opinion, maybe not the best way for ARM to go. I am still green here, and I need to do more homework, but IMHO ARM's best opportunity is to focus more energy on scaling down. We are moving more computing power to the edge, and the first example here was the smartphone/tablet/Kindle revolution. Now, the wave-riders are basically just scaling these up, feeding the marketing machine for bigger-faster in the same way that the PC market used to do 10 years ago. But the big winners will be the people who start the next wave, and that is sure to be something smaller and lighter than before. What's next: smartcards ... err, super-smartcards. I'm serious. The technologies required to produce a credit-card-sized computing platform, with wireless I/O and even a display and buttons, are all coming together this year. The fun part about this technology is that it's another 1000x reduction in the way you think about power: PCs are concerned with watts, phones with milliwatts, and cards with microwatts. For microwatt tech, the sweet spot is somewhere between 90 and 130nm, and there are loads of fabs now at this spec.



    Interesting thoughts, but the problem with scaling down is that we're human beings with human-sized parts, and there is a natural lower limit to the size of things we can manipulate.



    I think the cell phone is already as small as anyone wants to go with an integrated device, so the future there will be in packing more power into the same power/size envelope.



    Even smaller devices that use wireless I/O don't seem to solve any problem. If it's about having a lot of computing power in a negligible package, why not just put that power on the display device, which is going to be necessary regardless? If it's about having your computing environment on your person, isn't that more likely to become cloud-based? And I certainly don't want to be carrying around all my data on a tiny card, which is going to be replicated on remote servers or my home storage anyway.



    In other words, in a world moving towards ever more mobile thin clients, the emphasis is going to be on the display and interconnectivity, not on tiny key fobs that can't do anything without hooking up to something else, and can't do anything that a connected device with a display can't do just as well. At least until the display device becomes a contact lens.
  • Reply 92 of 120
    ssampath Posts: 10 member
    Quote:
    Originally Posted by Splinemodel View Post




    This is an interesting thing. The x86 is a bloated, ugly architecture. However, it can be streamlined nicely, which is the way it has been implemented in every desktop/laptop chip since the first Centrinos. The problem is, this streamlining requires a _fixed_ number of gates for translation logic, so it can scale up well, but not down. In other words, I seriously doubt ARM has a chance to contend for the desktop/laptop market -- Intel is already so entrenched, and every innovation they make is effectively standardized and rolled into gcc before you have a chance to blink. By the same token, I seriously doubt x86 has a chance in the mobile market -- ARM is so entrenched, yadda yadda.



    The A15 is, in my opinion, maybe not the best way for ARM to go. I am still green here, and I need to do more homework, but IMHO ARM's best opportunity is to focus more energy on scaling down. We are moving more computing power to the edge, and the first example here was the smartphone/tablet/Kindle revolution. Now, the wave-riders are basically just scaling these up, feeding the marketing machine for bigger-faster in the same way that the PC market used to do 10 years ago. But the big winners will be the people who start the next wave, and that is sure to be something smaller and lighter than before. What's next: smartcards ... err, super-smartcards. I'm serious. The technologies required to produce a credit-card-sized computing platform, with wireless I/O and even a display and buttons, are all coming together this year. The fun part about this technology is that it's another 1000x reduction in the way you think about power: PCs are concerned with watts, phones with milliwatts, and cards with microwatts. For microwatt tech, the sweet spot is somewhere between 90 and 130nm, and there are loads of fabs now at this spec.



    I don't think ARM needs to be in the desktop/notebook market. Intel has those, but they will become smaller markets as the phone/tablet markets become bigger. In fact, I think more smartphones shipped than PCs this quarter for the first time.



    From desktops to laptops to tablets, it looks like there is a 5-10x reduction in wattage at each step: desktops at about 100W, laptops at about 20W, and tablets at about 2W. Will the next jump be to something that uses less than 0.5W? Each of these transitions has taken 5-10 years, and I am sure new technologies will come along to make this happen.
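    Written out, the steps described above look like this (the wattages are ballpark class-of-device figures, and the last one is just an extrapolation):

    Code:
    /* Per-step power reduction from desktop to whatever comes after tablets.
     * All values are rough class-of-device numbers; the "next?" value is a guess. */
    #include <stdio.h>

    int main(void) {
        const char  *steps[] = { "desktop", "laptop", "tablet", "next?" };
        const double watts[] = { 100.0, 20.0, 2.0, 0.3 };
        for (int i = 1; i < 4; i++)
            printf("%s -> %s: %.0fW -> %.1fW (about %.0fx less)\n",
                   steps[i - 1], steps[i], watts[i - 1], watts[i], watts[i - 1] / watts[i]);
        return 0;
    }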
  • Reply 93 of 120
    splinemodel Posts: 7,311 member
    Quote:
    Originally Posted by addabox View Post


    ... If it's about having your computing environment on your person, isn't that more likely to become cloud-based? ...



    And I certainly don't want to be carrying around all my data on a tiny card, which is going to be replicated on remote servers or my home storage anyway. ...



    In other words, in a world moving towards ever more mobile thin clients ...



    Whether you think you like it or not, it is already starting. The cloud computing vision is mostly a marketing ploy by the IBMs, Ciscos, and HPs of the world. They want to sell big, expensive servers and crap like that. The problem is that cloud computing just doesn't scale, and it is horribly energy-inefficient even when it does work. Pushing the query down, down, down ends up working better and using, well, no grid power at all, because at the small level it is easy to scavenge energy.
  • Reply 94 of 120
    addabox Posts: 12,665 member
    Quote:
    Originally Posted by Splinemodel View Post


    Whether you think you like it or not, it is already starting. The cloud computing vision is mostly a marketing ploy by the IBMs, Ciscos, and HPs of the world. They want to sell big, expensive servers and crap like that. The problem is that cloud computing just doesn't scale, and it is horribly energy-inefficient even when it does work. Pushing the query down, down, down ends up working better and using, well, no grid power at all, because at the small level it is easy to scavenge energy.



    I disagree. Powerful mobile devices plus ubiquitous connectivity are moving us in this direction. The smartphone is going to be the most widely deployed computing platform in the world soon enough, and it simply makes sense to use devices like these in conjunction with network services and data. No one is going to carry all of their documents on a phone, much less on a credit card, but it does make sense to use a small device to access remote data -- even if that data is on a server in your home.



    And that still doesn't obviate the need for a display, which can't get smaller than it is now and remain useful. I could see a device as thin as a credit card, with the entire face a touch screen, but that's just an evolution of the current smartphone design.
  • Reply 95 of 120
    ash471 Posts: 705 member
    Quote:
    Originally Posted by alandail View Post


    Well, there are three big differences between switching from PowerPC to Intel and switching from Intel to ARM.

    1 - the switch from PowerPC to Intel involved using the next generation processor that was faster, thus Rosetta could do a reasonable job emulating PowerPC apps. In this case the move wouldn't be to a more powerful processor, thus emulation wouldn't work as well. This is a minus.

    2 - Modern OS X apps are cocoa, which is a lot easier to move to new architectures with universal binaries than Carbon apps were. This is a plus.

    3 - unlike prior architecture changes, this isn't being suggested as a complete migration, but rather as a move to supporting dual architectures -- something they have just gotten away from by killing Rosetta in Lion. Again, this isn't as bad now that Carbon apps are pretty much history.



    I'm not sure how much sense it makes for MacBooks as I see iPads filling that need. But perhaps there's a case for a less expensive iMac going this route as some people still need desktop machines with large screens without really needing faster CPUs than they have today.



    I find number 3 to be quite persuasive. I agree that it wouldn't be good for Apple to support two architectures with the same OS for anything but a temporary transition period. Even though Apple is the expert in ARM on mobile, ARM in desktops and servers is a whole different ball game. Mobile users will readily accept functionality restrictions in exchange for ultra-portability. There is absolutely no way a Mac Pro user is going to accept those compromises (which is why I own an iPad and a Mac Pro). I doubt Apple would try to make the switch to ARM on their own. If Apple switches to ARM in desktops, it will probably be a transition that a fair number of PC manufacturers make... which means such a change is iffy and certainly not happening anytime soon. I know the ARM community is excited about competing with Intel in desktops and servers. But until we see how Intel performs with GPU integration and increased power efficiency, and what ARM can do to increase clock speeds and computing power, there is no way to predict what will happen. Should be interesting...
  • Reply 96 of 120
    ash471 Posts: 705 member
    To know whether an ARM chip would be useful in a MacBook Air, we need to consider how much battery weight is added to run an Intel chip instead of an ARM chip. I suppose there is a certain screen size where the power requirements for the screen begin to dwarf the power requirements for the CPU. That cutoff may be in the range of 11-13 inches. If that is the case, switching from Intel to ARM makes much less sense in MacBooks and almost no sense in desktops. I'm increasingly convinced that the switch to ARM in MacBooks doesn't make much sense unless the entire PC industry moves in that direction.
  • Reply 97 of 120
    clesen Posts: 2 member
    I've done extensive research on this topic, as I found that the iPad was unable to keep up with my processing needs, especially when I was watching movies, which is one of the main reasons I purchased the iPad. Should I even bother with the iPad 2?
  • Reply 98 of 120
    aaronj Posts: 1,595 member
    Quote:
    Originally Posted by fisherman View Post


    I do believe that all current cell phone models can be water cooled --- once.



    Having lost my very first Mac, a 17" TiPb, by kicking a can of Coke onto it while asleep, I find this post makes me very sad.



    J/K



    Quote:
    Originally Posted by DominoXML View Post


    The big advantage for Apple is the two-way approach to optimization. Starting with the A4, they not only optimized the software for the chip but also began to optimize the SoC for the needs of their software.



    So, this is going to be an incredibly stupid, uninformed question, but I'll ask anyways:



    Does this have to do with how the APIs are developed for developers, given that Apple knows all the ins and outs of the hardware? Or am I missing the boat entirely?



    Thanks.
  • Reply 99 of 120
    mikemikeb Posts: 113 member
    Quote:
    Originally Posted by Splinemodel View Post


    The A15 is, in my opinion, maybe not the best way for ARM to go. I am still green here, and I need to do more homework, but IMHO ARM's best opportunity is to focus more energy on scaling down. We are moving more computing power to the edge, and the first example here was the smartphone/tablet/Kindle revolution. Now, the wave-riders are basically just scaling these up, feeding the marketing machine for bigger-faster in the same way that the PC market used to do 10 years ago. But the big winners will be the people who start the next wave, and that is sure to be something smaller and lighter than before. What's next: smartcards ... err, super-smartcards. I'm serious. The technologies required to produce a credit-card-sized computing platform, with wireless I/O and even a display and buttons, are all coming together this year. The fun part about this technology is that it's another 1000x reduction in the way you think about power: PCs are concerned with watts, phones with milliwatts, and cards with microwatts. For microwatt tech, the sweet spot is somewhere between 90 and 130nm, and there are loads of fabs now at this spec.



    Mark of the beast!



    I'm not all that religious, but your post got me thinking of how the "mark of the beast" described in the Book of Revelation could theoretically become possible in modern times with microwatt ARM technology. I first thought of the "super-smartcard" application that you described in a non-MOTB way, but what if you lose the card? Well, you wouldn't easily lose your smartcard if it's embedded in your arm! Sure, there are chips the size of a grain of rice that can be implanted into your house pet. However, the device simply spits out a number when a handheld reader is put to the pet's neck. With ARM SoC's, things can get really advanced compared to today's primitive remote embedded chip devices. For demonstrative purposes, we'll call the device "BeastChip".



    And it wouldn't be super-hard to sell the concept to anyone that isn't a privacy or religious advocate: A GPS receiver built into the BeastChip's SoC would help find you if you were kidnapped, or just got lost like this child. Fugitives would be easier to track down and be arrested. If you had a teenage daughter, you could electronically access her location at any time with a smartphone or PC (imagine the peace of mind that could bring). A WiFi chip within the SoC would be used to assist in GPS location when a person is indoors or otherwise out of the GPS satellites' LOS (because your teenage daughter might be in the basement of a friend's house, after her curfew).



    The WiFi chip would also double as a way to connect to "BeastNet", the new all-electronic payment system that, as the system matures and is phased in, will become the required form of payment for anything. And by anything, I mean it'll replace the bulky EZ-Pass toll booth collection device currently placed on your car's windshield. And since the BeastNet system will become required payment for everything, it'll eliminate the need for toll booths on toll roads. Currently, you need to slow down and wait in line, possibly for a long time, to pay a toll, right? Well, now you'll pass under a human-free enclosure at full speed, and the proper toll will be deducted from your BeastNet account. Think this "e-tollbooth" is crazy? The state of Maryland recently introduced this exact concept on their newest toll road (picture of an "e-tollbooth").



    Additionally, entering the subway wouldn't require the need to print a farecard or use an old-fashioned credit-card-sized smartcard. Just walk through the designated "BeastWay" open gate, just like the e-tollbooth on the roads, and the proper fare is deducted from your BeastNet account.



    Now how to power the device? Some scientists recently came up with the idea of creating power by harnessing internal body movements. Now it begins to make sense why Revelation would talk about how the mark must be located on the forehead or forearm. Both locations likely have a high enough concentration of blood vessels, whose movement created by one's pulse would be enough to constantly power a BeastChip-like device, even during sleep, while the head or arm might not be moving much. You wouldn't need a battery, eliminating the worry of how an old battery would affect performance over a potentially 100-year lifecycle. The WiFi antenna would serve as the needed antenna for all communications between the SoC and the outside world. No RFID chip is needed.



    (Placing the device on one's chest wouldn't make sense, because the electronics would likely interfere with pacemakers. Even those who don't have a pacemaker one day might have one the next.)



    And because there are billions of people on Earth, any BeastChip would have to be extremely mass-production friendly. If 90-130nm fabs are so common, then that makes it easy to mass-manufacture the technology quickly enough to globally roll it out in five years or less.



    (No, I wouldn't want one put in me.)
  • Reply 100 of 120
    splinemodel Posts: 7,311 member
    Not sure this is worth getting so far off topic on, but I'll bite.



    Quote:
    Originally Posted by addabox View Post


    I disagree...



    Obviously you are free to disagree, but this push to cards is already in motion. Smartphones are not all the same, and of the total phone market, smartphones are a small percentage still. It is easier to distribute card computers to users, which are guaranteed to work with the system they are configured for, than to depend on smartphones. Big card issuing companies are betting their futures on better cards, because NFC is severely disrupting the way these companies have made money in the past.



    Quote:
    Originally Posted by mikemikeb View Post


    I'm not all that religious, but your post got me thinking of how the "mark of the beast" described in the Book of Revelation could theoretically become possible in modern times with microwatt ARM technology...



    A card is far from an implanted device. People used to carry around calculators, now phones. The number of cards in the wallet will probably stay constant, but some will be card computers. I don't see how a user-activated card is an invasion of privacy. E911 on your phone could be considered an invasion of privacy. A card with a button on it, probably not.