Apple's A5 processor could pave the way for larger chip trend


Comments

  • Reply 41 of 120
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by ksec View Post


    Well, regarding die size: I was referring to the fact that Apple doesn't care about die size as much as Nvidia does, due to cost issues.



    TSMC's 40nm LP is actually much better than what other people think. But Samsung holds all the licenses needed to create Apple's SoC.



    The cost of the A5 comes from other analysts and iSuppli, which predict somewhere in the $20 - $30 range. So I took the middle of the estimate, $25.



    The thing is, the A5 is a multi-chip module, a stack of three chips if I remember correctly. This impacts pricing versus a single-die implementation. I would lean towards $40 for each assembled A5. Basically they sandwich three state-of-the-art chips in the A5 package.
  • Reply 42 of 120
    d-range Posts: 396 member
    Quote:
    Originally Posted by EgisCodr View Post


    I guess you guys don't realize that the Tegra 2 is last year's chip. You are comparing the A5 to a chip from 2010. The next chip will be out in products in June. Code-named Kal-El, it is the first quad-core processor on the market. Five times faster than the Tegra 2.



    That is THIS YEAR's chip. Cannot wait to see how the A5 stacks up.



    Apple would be better served buying the chips from Nvidia. By 2014, Nvidia will have chips 100 times faster than Tegra 2. This is an arena Apple will not be able to keep up in.



    I guess just about everything has already been said by the people before me, but I just wanted to add that I can't believe you are falling for the same hot air NVidia is blowing with Tegra 3 again. It's like you are buying into the same PR bluff three times in a row.



    First, Tegra was going to be the killer SoC from NVidia that would show the whole embedded establishment how it was done. After a year of posturing about how great it was, it finally ended up in just one mildly popular device from a big manufacturer (the Zune). It never delivered on any of its promises, because before everyone found out it wasn't so great after all, NVidia was already touting the Tegra 2 as the next big thing. This went on for another year, NVidia continually bragging about a chip no one could actually buy yet, leaving no opportunity unused to put the Tegra 2 name everywhere, even in their 'own' Tegra 2 app store. Now devices are shipping with this fabulous chip, and surprise surprise: the CPU part is nothing special, and the GPU part is severely underperforming; it's almost a year behind the curve compared to Imagination's offerings.



    Now, instead of blindly following the same 'NVidia is going to rock the ARM SoC world' mantra a third time, realize that there are companies with a whole decade more experience designing ARM CPUs and GPUs. NVidia might be a big name in desktop GPUs, but embedded graphics is a whole different ballgame, with completely different design requirements and performance metrics; it's almost incomparable to desktop GPUs in terms of how you build them and what makes them fast and efficient. Desktop GPUs are mainly brute force, throwing more hardware at the problem. Mobile GPUs are nothing like that.



    When I was graduating I worked on a video decoding chip that was supposed to be used for 3D graphics somewhere in the future. Back then (10 years ago), at the company where I was graduating (it was bought by Intel a few months ago, by the way), everyone was already looking at Imagination, because even then they were the only credible mobile GPU company. TEN years ago, and they have been steadily improving all these years up to now. NVidia is just entering this market and hoping their desktop GPU experience will give them a wildcard entry, but it won't. It will take them at least 5 years to close the gap with Imagination, if they ever do.



    NVidia is not going to take over the world with their ARM SoCs; they should be happy if they are even still in that business in 5 years, instead of quietly retreating and going back to what they are good at. I have to admit I'm impressed by the way NVidia has managed to spin public opinion (at least in the tech world) into believing they have a great product line-up and roadmap, because in reality they are barely catching up.
  • Reply 43 of 120
    gatorguy Posts: 24,651 member
    FWIW, there's more than one flavor of Tegra 2/graphics combo. The fact that a specific Apple chipset in a specific device benchmarks better than a specific Tegra 2 in a specific device doesn't mean that all combinations will have the same results, does it? And Tegra isn't even the latest mobile chipset in use.



    http://www.engadget.com/2011/04/02/q...ompetition-in/
  • Reply 44 of 120
    dominoxml Posts: 110 member
    Quote:
    Originally Posted by EgisCodr View Post


    I guess you guys don't realize that the Tegra 2 is last year's chip. You are comparing the A5 to a chip from 2010. The next chip will be out in products in June. Code-named Kal-El, it is the first quad-core processor on the market. Five times faster than the Tegra 2.



    That is THIS YEAR's chip. Cannot wait to see how the A5 stacks up.



    Apple would be better served buying the chips from Nvidia. By 2014, Nvidia will have chips 100 times faster than Tegra 2. This is an arena Apple will not be able to keep up in.



    To be honest, I also believed that NVIDIA's Tegra chips would rule the mobile chip world, but by now I think it was mostly NVIDIA marketing buzz.



    Looking back, they claimed that the first Tegra-equipped Zune HD would crush the iPod Touch in gaming performance. Unfortunately it wasn't able to show this superiority, and the Zune failed to attract developers and customers.



    Then there was word that the Tegra 2 would crush the competition. But besides the lack of games for, e.g., the Xoom, it never showed a significant advantage over the A4 in graphics performance.

    That's because it's not only raw calculation power that counts; things like memory bandwidth, fill rate, drivers and core system optimization also matter when it comes to decent performance.



    Apple's big advantage is that it optimizes in both directions. Starting with the A4, they not only optimized the software for the chip but also began to optimize the SoC for the needs of their software.



    I think the A5 is an impressive result of this multi-synergy, multi-optimization strategy, which ensured that games like Real Racing 2 were able to unleash its power from day one.



    Even if NVIDIA packs an impressive number of cores into the Tegra 3, it seems unlikely that devices like a Xoom 2 will be able to take advantage of this glory anytime soon. Android might get better at multicore efficiency, but the OS will have a hard time catching up with Apple, which already has technologies like a GPU-accelerated UI, GCD and OpenCL in place.



    The big advantage for Apple and their 3rd-party devs is that they can easily unleash the power of the A5 or an upcoming A6 chip, while Google and Android devs will have to cater to a multitude of SoCs.



    Apple concentrates on improving the ARM Cortex-A and PowerVR lines, which ensures that devs have a future-proof compatibility guarantee for optimized code and only have to adopt new SoC features when favorable.



    Therefore I don't believe that the Tegra-Android combo will catch up anytime soon.
  • Reply 45 of 120
    nvidia2008 Posts: 9,262 member
    Apple can easily clock the A5 lower for iPod and iPhone use. As for space, I'm sure they can squeeze it in, that's what they do best.
  • Reply 46 of 120
    gatorguy Posts: 24,651 member
    Domino, Google is making chipset standardization a priority in its Android development. I think they've taken a cue from Apple and realized that having their software running on different chipset combos makes for an uneven user experience: some see great results while others are so-so. Expect to see a more unified approach from Android over the next year, and that includes chipsets.
  • Reply 47 of 120
    nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by solipsism View Post


    LOL I’d love to see a water-cooled smartphone.



    Fandroids would love it!



    Edit: Damn, pipped by ThePixelDoc
  • Reply 48 of 120
    shrike Posts: 494 member
    Quote:
    Originally Posted by solipsism View Post


    Excellent points, but could you explain this one in more detail? I'm wondering why fewer metal layers would create a larger chip.



    Transistors must be connected together with wires for them to be called integrated circuits! The challenge, of course, is how you wire hundreds of millions of transistors together.



    The metal layers are the wiring for the transistors. The transistors are etched, drawn onto the silicon wafer; then the wires are deposited in layers on top of the transistor layer. My armchair understanding of this (I'm no EE, btw) is that by using more metal layers, the CPU die can be made more compact.



    So, at the 10,000 ft level: if you laid down the wires at the same level as the transistors, the various blocks in the CPU (fetchers, decoders, integer units, caches, registers, execution units, etc.) would have to be floor-planned in such a way as to minimize die size, since there would be millions of wires in between them. This would make for a pretty big die in any case.



    Well, you can reduce the die size by taking some of the wiring off this first layer and depositing it in another layer above. This allows the die to shrink some by removing the area occupied by the moved wiring. The blocks can then be replanned to occupy the space more compactly, in an optimization dance between the transistor floor plan and the wiring layers. By adding more metal layers, or wiring/interconnect layers, the manufacturer can further optimize die size. There is a point of diminishing returns, and other factors (yield) also come into play.



    In a 6-metal-layer CPU, you have 1 level of transistors, then 6 levels of wiring connecting the transistors. So a CPU is really a 3-dimensional construct, like a 7-story building where the first floor is the transistors and the remaining 6 floors are nothing but wiring.
  • Reply 49 of 120
    kevin.mcintyre
    Quote:
    Originally Posted by Shrike View Post


    That's just a little bit of exaggeration there, man. Tegra 3 is 5x faster than Tegra 2? And June? Nvidia ARM SoCs will be 100x faster in 2014?



    How about: Tegra 3 will be out in 2H 2011. The rumors are saying August, not June. There are no known design wins for it yet; they'd better announce one soon if a product is going to ship soon. In system throughput, a 1 GHz Tegra 3 has the potential to be 2x faster than a 1 GHz Tegra 2, and the GPU has the potential to be 5x faster than the Tegra 2's.



    In 2014, 3 years away, ARM SoCs will be at 18nm to 20nm process nodes, representing 4x to 5x the transistor count of Tegra 2. This will translate to about 2x to 3x the single-threaded performance (there isn't much low-hanging fruit for single-threaded performance after the A5) and 5x to 10x the multi-threaded performance (quad-core with 2-way SMT). GPUs will be much more power-consumption limited than desktop GPUs are, and I think 5x to 10x is the most one could expect there as well. You could probably have 100x GPU performance (desktop SLI rigs today are likely already there), but I doubt they are going to fit in handheld products.



    Nvidia's press release roadmap is great and all, but it's a press release roadmap. You should inject some realism instead of propagandizing it.



    Tegra 3 is going to be built on the same 40 nm process as Tegra 2. The thing has about 50% more transistors than Tegra 2. It'll need to be operating at ~25% lower voltages and be power-gated up the wazoo to fit inside the same power envelope (~0.5 watts) as the Tegra 2 in order to fit inside handhelds. People should be skeptical that they can do this; quad-core SoCs probably won't fit until the 28nm to 32nm nodes, and that means 2012.
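
    To put rough numbers on that claim (a back-of-envelope sketch, not measured data): dynamic power scales roughly as C·V²·f, so 50% more switching capacitance at ~25% lower voltage lands in about the same envelope. The figures below are the post's own estimates.

    ```typescript
    // Rough dynamic-power sanity check: P ~ C * V^2 * f.
    // The numbers are the post's estimates (50% more transistors,
    // ~25% lower voltage, same clock), not measured Tegra figures.

    interface Chip { relCap: number; relVolt: number; relFreq: number; }

    const tegra2: Chip = { relCap: 1.0, relVolt: 1.0,  relFreq: 1.0 };
    const tegra3: Chip = { relCap: 1.5, relVolt: 0.75, relFreq: 1.0 };

    // Relative dynamic power, normalized to Tegra 2 = 1.0.
    const relPower = (c: Chip): number => c.relCap * c.relVolt ** 2 * c.relFreq;

    console.log(relPower(tegra2).toFixed(2)); // "1.00"
    console.log(relPower(tegra3).toFixed(2)); // "0.84" -> roughly the same ~0.5 W envelope,
    // ignoring leakage, which the lower voltage and power gating would also help with.
    ```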



    According to Nvidia's web site (http://blogs.nvidia.com/2011/02/tegr...ile-processor/)



    Nvidia are claiming the quad core (Kal-El) is sampling now and launches in August 2011.



    As to running in the same power budget: this will really depend on how they have exploited power gating, DVFS and the quad cores, and how the OS load-balances between the cores. No mean feat, I can tell you. However, for many use cases a quad-core A9 vs. a dual-core A9 should be far more power efficient, as the four cores will be running at a lower operating point more of the time, which gives less heat and less leakage (so better active time).
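
    The "four cores at a lower operating point" argument follows from the same relation, since under DVFS the voltage can drop roughly with the frequency. A deliberately idealized sketch (perfectly parallel work, V proportional to f, leakage ignored):

    ```typescript
    // Toy DVFS comparison: same throughput from 2 fast cores vs. 4 slow ones.
    // Assumes a perfectly parallel workload and V roughly proportional to f,
    // so per-core dynamic power scales ~ f^3. Purely illustrative.

    const totalRelPower = (cores: number, relFreq: number): number =>
      cores * relFreq ** 3; // per-core P ~ C * (V ~ f)^2 * f

    const dual = totalRelPower(2, 1.0); // 2 cores at full clock
    const quad = totalRelPower(4, 0.5); // 4 cores at half clock, same total throughput

    console.log(dual); // 2
    console.log(quad); // 0.5 -> ~4x less dynamic power for the same (ideal) throughput.
    // Real workloads aren't perfectly parallel and leakage sets a floor, which is
    // why power gating and how the OS load-balances across cores matter so much.
    ```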



    http://twitter.com/kevinmcintyre09
  • Reply 50 of 120
    Quote:
    Originally Posted by wizard69 View Post


    There are several things the article didn't consider.



    1.

    The article didn't take into account Apple's FAST logic, which trades space for lower power and higher performance.



    2.

    The CPU is assembled in a module that stacks RAM on top of it. The chips need to line up properly to accomplish this.



    This is common practice in the industry; most semicos I have dealt with offer this as an option.



    3.

    Nobody has a sound idea as to what is actually included on the A5. Chipworks can guess at possible functionality, but they won't get 100% of it. So the question is how much of this comparison is one-to-one.



    The X-ray analysis, or whatever it is called, is pretty good for comparison. So they have a fair idea of A4 vs. A5, and most of the major blocks are distinguishable.



    4.

    I will take a trade-off that gives me better performance and lower power any day. Customer satisfaction means more than chip size!!!!



    Me too
  • Reply 51 of 120
    aquatic Posts: 5,602 member
    Quote:
    Originally Posted by Swift View Post


    It showed at last year's CES. There are a couple of half-assed tablets on the market now with that chip. Not before this year, though they've told us all multiple times that it was coming. You can tell the difference between "shipping" and "in the design stages," can't you?



    A quad chip? Will it finally run Flash?



    My single-core 1.2 GHz Droid 2 Global runs Flash smoothly. I watch the Daily Show on the official website with it. So I'm guessing a quad-core ARM Android device will amply run Flash. Very amply.



    Now if the A5 and iPad are so fast, why isn't Apple allowing you to have Flash? That is the question.
  • Reply 52 of 120
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by kevin.mcintyre View Post


    Quote:
    Originally Posted by Shrike


    [...]

    How about Tegra 3 will be out in 2H 2011. The rumors are saying August, not June.

    [...]




    According to Nvidia web site (http://blogs.nvidia.com/2011/02/tegr...ile-processor/)



    Nvidia are claiming the quad core (Kal-El) is sampling now and launches in August 2011.



    [...]



    Sounds like he's getting his info from the same place you are. Since you both have it, that tells me the August launch for Tegra 3 is the most accurate. Now, this is for production of the chip, not for a shipping product utilizing the chip, right? So when are we to expect new devices utilizing Tegra 3? I assume by the end of the year we'll see a couple trying to claim the prize of being first, but that CES 2012 will be swarmed with another round of iPad-killers sporting Tegra 3.
  • Reply 53 of 120
    sessamoid Posts: 182 member
    Quote:
    Originally Posted by Aquatic View Post


    My single-core 1.2 GHz Droid 2 Global runs Flash smoothly. I watch the Daily Show on the official website with it. So I'm guessing a quad-core ARM Android device will amply run Flash. Very amply.



    Now if the A5 and iPad are so fast, why isn't Apple allowing you to have Flash? That is the question.



    I can watch the Daily Show on the official website on my iPhone, too. But I don't need to have Flash to watch it.
  • Reply 54 of 120
    cgc0202 Posts: 624 member
    Quote:
    Originally Posted by Aquatic View Post


    My single-core 1.2 GHz Droid 2 Global runs Flash smoothly. I watch the Daily Show on the official website with it. So I'm guessing a quad-core ARM Android device will amply run Flash. Very amply.



    Now if the A5 and iPad are so fast, why isn't Apple allowing you to have Flash? That is the question.



    That was settled a long time ago, if you had ever followed the various discussions, but you are too lazy to do that, which makes your point irrelevant. I have several browsers (Camino, Firefox, Chrome, Opera) set up differently, some with Flash, others with Flash restricted.



    Comparing the same website, e.g. the New York Times, I get a better viewing experience in the browsers where Flash is restricted -- no stalling, much better battery consumption, etc.



    One very big bonus: many ads and useless animations still use Flash, so I get to view the New York Times without having to suffer those ads and animations. Do you know how satisfying it is to be freed of most of those ads and animations? I have photos to prove this experiment.



    The last observation alone is a big argument NOT to have Flash -- in my book.



    One question that you should answer: can you name ten (10) very popular (most-visited) websites that do not yet have Flash alternatives? Do you even know what that is - a Flash alternative?



    Do you see many here dying of envy with your Flash?



    So, enjoy your Flash but do it in your own time. Or, have an orgy with Flash with other Droids, and Xooms. Oh wait, the Xoom owners do not yet have Flash.



    Quote:
    Originally Posted by sessamoid View Post


    I can watch the Daily Show on the official website on my iPhone, too. But I don't need to have Flash to watch it.





    And that is because any show, website, app creator or any endeavor that wants to reach a larger audience cannot ignore the roughly 200 million iOS devices already around (a number that may reach around 250-300 million by the end of this year). The owners of these iOS devices are known to be willing to spend their money and buy stuff at a premium, not just use free apps.



    There is no Flash on those iOS devices. The solutions for creators: don't use Flash, or create Flash alternatives (e.g., HTML5) for iOS. That is how YouTube works -- it has both Flash and non-Flash versions, the latter for iOS devices. Many others do the same, and others simply do not use Flash anymore. That is what I am seeing on many of the sites I visit myself.
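
    For anyone wondering what a "Flash alternative" actually looks like: the same H.264 file can be offered through an HTML5 video element, with Flash kept only as a fallback for older browsers. A minimal sketch; the file names and the "player" container ID are made-up placeholders, not any particular site's markup.

    ```typescript
    // Sketch of serving H.264 via an HTML5 <video> element, falling back to
    // Flash only when the browser can't play it. File names and the container
    // ID ("player") are hypothetical placeholders.

    function embedVideo(containerId: string, mp4Url: string, swfUrl: string): void {
      const container = document.getElementById(containerId);
      if (!container) return;

      const video = document.createElement("video");
      video.controls = true;
      video.width = 640;
      video.height = 360;

      // canPlayType returns "" when the codec/container isn't supported.
      if (video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"') !== "") {
        const source = document.createElement("source");
        source.src = mp4Url;
        source.type = "video/mp4";
        video.appendChild(source);
        container.appendChild(video);
      } else {
        // Legacy fallback: embed the Flash player instead.
        const flash = document.createElement("embed");
        flash.setAttribute("src", swfUrl);
        flash.setAttribute("type", "application/x-shockwave-flash");
        flash.setAttribute("width", "640");
        flash.setAttribute("height", "360");
        container.appendChild(flash);
      }
    }

    embedVideo("player", "episode.mp4", "player.swf");
    ```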



    I do not use Flash on any of the websites I have created. Joomla (a very popular CMS used by millions of website creators all over the world) does not include Flash as a native component.



    QED



    CGC
  • Reply 55 of 120
    shrike Posts: 494 member
    Quote:
    Originally Posted by Ssampath View Post


    I think the question is what value-add Nvidia provides in the ARM food chain if an upstart (in the chip business) like Apple can outdo them with the A5. Also, there are at least four other companies doing ARM chips - Marvell, Samsung, TI and Qualcomm. It is not clear to me that this is going to be a great business for Nvidia.



    For stock investors, I'd submit there is no value in investing in any of these companies. I wouldn't.



    The market is fairly saturated today, so there isn't a lot of opportunity for chip companies to make handsome profits, especially on $15 parts.



    Theoretically there is a scenario where ARM becomes the de facto standard for computing: servers, desktops, laptops, handhelds and embedded. This could allow an increase in per-part revenue: desktop/laptop chips would be in the $100 to $500 range and server parts in the $3000 to $5000 range. A company could make handsome profits by selling into these markets and sweeping away the existing competitors in this space.



    Maybe it is Nvidia; maybe it could be Qualcomm or TI. Maybe Intel will smarten up, license ARM and fab ARM chips again, and it'll be them. That's an interesting investment scenario, and if you choose the right company, you can get 10:1 returns. If only Intel would smarten up.



    But.



    ARM is a licensed architecture; it's not owned by a fab company. In such a scenario I can't see how one company can come to dominate (except for Intel). All companies are essentially at parity in terms of architecture and fabs, so how does one company dominate? (Intel is the exception. They could do it, but it looks like they'll let x86 cut their legs out from under them before they move.)



    Agree with you. Chip companies are a poor investment. As for why Nvidia would want to do it, well, you have to give them credit for wanting to dominate. It is more business for them and represents a growth revenue stream.
  • Reply 56 of 120
    sheff Posts: 1,407 member
    Quote:
    Originally Posted by extremeskater View Post


    Apple didn't build the A5 chip, Samsung built the A5.



    Touché.
  • Reply 57 of 120
    alandail Posts: 772 member
    Quote:
    Originally Posted by Aquatic View Post


    My single-core 1.2 GHz Droid 2 Global runs Flash smoothly. I watch the Daily Show on the official website with it. So I'm guessing a quad-core ARM Android device will amply run Flash. Very amply.



    Now if the A5 and iPad are so fast, why isn't Apple allowing you to have Flash? That is the question.



    Your first answer:



    Quote:
    Originally Posted by sessamoid View Post


    I can watch the Daily Show on the official website on my iPhone, too. But I don't need to have Flash to watch it.



    The question for you: why do you think having standard H.264 video wrapped in a Flash wrapper, whose only purpose is to force you to use proprietary, closed Flash, is a good thing? Why do you think it's a good thing for Adobe to dictate release schedules for companies? How many years did it take Adobe to release a working Flash?



    I think you'd be surprised at just how many sites still use Flash when you access them from a desktop computer or an Android device, yet work just fine without Flash on an iOS device. Why do they still use Flash? They want you to install it so their Flash ads will run. I wish they would stop doing that, because Flash is the only thing that ever crashes on my computer, and it crashes daily.



    So you have an "open" phone so you can extend the life of a needless proprietary wrapper for H.264, to get more ads and crash more computers. Is there even one thing Flash does that couldn't be accomplished without Flash (aside from crashing computers)?
  • Reply 58 of 120
    sockrolid Posts: 2,789 member
    Apple bought PA Semi just for their chip design talent back in 2008. It cost Apple $278 million, and now their investment is finally starting to pay off. The A5 chip is the fastest mobile SoC on the market, it runs cool, it is economical with battery power, and it costs Apple far less than an off-the-shelf chip from, say, Intel.



    And down the road, it's not out of the realm of possibility for Apple to unify its Mac and iOS device hardware to all use some quad- or 6-core ARM variant, especially for portables like the MacBook Air. Of course, that would require porting Mac OS X to the AX chip of the future. But don't forget that Mac OS X ran on RISC chips from the start. Been there, done that transition.



    If I were to make a silly wild-ass guess, I'd say that Apple is quietly developing the RISC-based AX version of Mac OS. They'll sit back and watch Microsoft botch their attempt to port Windows 8 to Tegra. It'll be buggy, the Tegra chip will run hot, and laptops running the combination will get poor battery life. And as we've seen, Microsoft has a poor track record of providing apps with backward compatibility. Two words: "XP Mode."



    Apple could watch the Windows 8 + Tegra dumpster fire get out of control, then drop the bomb: Mac OS 11 running on 4- and 6-core AX chips, Grand Central Dispatch balancing the load perfectly, MacBooks running as cool as iPads with enormously long battery life, and apps shipping as fat binaries just like they did in the 68k-to-PowerPC and PowerPC-to-Intel transitions.



    Should be fun to watch.
  • Reply 59 of 120
    rec Posts: 217 member
    You guys feed the trolls too much.
  • Reply 60 of 120
    dominoxml Posts: 110 member
    Quote:
    Originally Posted by Gatorguy View Post


    Domino, Google is making chipset standardization a priority in it's Android development. I think they've taken a cue from Apple and realized that having their software running on different chipset combos makes for an uneven user experience. Some see great results while others are so-so. Expect to see a more unified approach from android over the next year, and that includes chipsets.



    Sounds reasonable at first glance, but I think this move is pretty complicated.



    Where to draw the line? Tegra is OK for now, but what about Qualcomm SoCs with Adreno GPUs, or Samsung Hummingbirds with the SGX 540 or better?

    If this is already too much differentiation, should Samsung abandon their own chips for the Tegra in their new tablets?



    The second problem is Google's system of electing launch partners, like they did with Moto and NVIDIA for Honeycomb. These partners invested a lot in the Xoom, and as compensation they seem to get exclusivity for some months.



    Neither Moto nor NVIDIA are interested in providing their hard work for free to their competitors. Perhaps this is one of the reasons why the HC code is still closed and has to be "cleaned" a bit.



    Why should Samsung, LG, NVIDIA and all the others invest heavily in a system over which Google is willing to exercise complete control?

    I'm sure all these Partners would prefer to optimize Android for their chips and systems.



    And third: it seems that the "open always wins" and "don't be evil" stuff is just the marketing buzz of an advertising company. While Android fans seem not to be much affected by this move, Google lost a lot of credibility inside the open source community and among their partners.

    It was the broad support of the community and the huge variety of partners that enabled the success of Android.



    Taking a cue from Apple and switching to a model of hand-picked partners is kicking the others in the a$$.



    Which leads me to the fourth point:

    Google has more or less officially admitted that HC was rushed to market.

    So the hard cuts seem necessary to keep pace.



    We're talking about a different business model for Android 3.x than that of 2.x.



    I'm not convinced that it will be equally successful because it starts with some displeased partners questioning Google's trustworthiness.