Official: Larrabee a pile of fail

Posted:
in Future Apple Hardware edited January 2014
http://www.hardmac.com/news/2010/05/...HardMac.com%29



Quote:

While the project is now dead, Intel intends to strongly improve the performance of its current graphics chipset solutions, and looking at what is currently available, that is a serious challenge for Intel. To summarize: the three-year-old GeForce 9400M chipset delivers a higher level of graphics performance than the latest solution from Intel, so it is difficult to imagine Intel being able to compete with discrete GPUs from ATI and NVIDIA anytime soon. Intel has to hurry, as AMD is working on its Fusion project, which aims to combine on the same die a multicore CPU and a Radeon-like processor, both communicating closely. We know that Apple is carefully following the development of Fusion technology, and it could be the solution adopted by Apple (and others) for notebooks. If both CPU giants are now trying to pair a CPU with a GPU-like unit, NVIDIA is going to be left alone, and might decide to acquire VIA, the last remaining x86 CPU maker...







Now Intel officially has no GPU strategy, save for the fail that is GMA integrated graphics, which only 'sells' because it is bolted onto all of Intel's new CPUs, whether customers want it or not.



God, I want to slap Intel.

Comments

  • Reply 1 of 46
    1337_5l4xx0r Posts: 1,558 member
    http://arstechnica.com/business/news...proof-gpus.ars



    In the words of Jon Stokes at Ars:

    Quote:

    Intel's Bill Kircos opens a blog post on the state of Intel graphics with a complaint about the vagueness of a term:



    '"Graphics." I think the high-tech industry needs to do a better job defining this omnibus word.'



    Perhaps it's Intel that needs to do a better job—a better job of delivering graphics, not just "graphics," which is what the chipmaker is offering now.



    Kircos' post confirms that Intel's discrete GPU product, codenamed Larrabee, is still dead. More importantly, it shows that Intel still doesn't understand that it has a major problem in integrated graphics.



    Intel's IGPs have been a leaden albatross around the neck of mobile graphics for years, and now that they're in the same package as the CPU (earning them the moniker IPGs), they're impossible to escape if you want Intel inside your laptop. Just like when Intel decreed that the vast majority of users didn't need accurate floating-point division, the chipmaker is now informing us that the "vast majority" of users don't need OpenCL, DirectX 11, or DirectCompute. Users should be happy with the ability to run HD video—this is what most people are doing with their GPUs.



  • Reply 2 of 46
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by 1337_5L4Xx0R View Post


    Now Intel officially has no GPU strategy, save for the fail that is GMA integrated graphics, which only 'sells' because it is bolted onto all of Intel's new CPUs, whether customers want it or not.



    God, I want to slap Intel.



    Yup, they need slappin' alright. They hold NVidia in a choke hold so they can force their useless IGPs on people, and fill us with promises that future developments in Larrabee will be the ultimate graphics solution, while NVidia tells it like it is and says Intel can't do it. Then Intel proves they can't do it.



    Why can't Intel see sense and just make a strong partnership already? It is better to live by each other's happiness than by each other's misery. Instead of leaving one of the best graphics companies out in the cold and pushing out shamefully inadequate chips, they could kill two birds with one stone.
  • Reply 3 of 46
    wizard69 Posts: 13,377 member
    Seriously, if only to send a message to Intel. The GMA lock-in is doing great damage to Intel, especially when the GPU is becoming just as important as the CPU in a modern PC.



    As to AMD's Fusion specifically, I see this chip enabling many new form factors. It might not even see its first implementation in laptops. Such a chip could make one nice Apple TV, for example, or a real Mac OS X tablet, or a number of other interesting projects.



    The potential is there, though I expect Apple to implement more ARM-based hardware than many are expecting. So even though I think Fusion would be an excellent chip for the Apple TV, I just see Apple re-imagining many of its products with ARM-based hardware.



    Dave
  • Reply 4 of 46
    programmer Posts: 3,458 member
    Be careful not to read too much into the Intel announcement.
  • Reply 5 of 46
    1337_5l4xx0r Posts: 1,558 member
    Well, on the bright side, if Intel doubles their GPU's capabilities every processor revision (tick), in about four years we should have GeForce 9400M performance.
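
    For illustration, a rough C++ sketch of that compound-doubling arithmetic (the gap figure is a placeholder, not a benchmark, so treat it as back-of-the-envelope only):

    #include <cmath>
    #include <cstdio>

    int main() {
        // Placeholder assumption: a 9400M-class part is ~4x today's Intel IGP.
        const double gap = 4.0;

        // If performance doubles once per revision, the gap closes as 2^n,
        // so the number of revisions needed is log2(gap), rounded up.
        const int revisions = static_cast<int>(std::ceil(std::log2(gap)));
        std::printf("Revisions needed to close a %.0fx gap: %d\n", gap, revisions);
        return 0;
    }

    With that made-up 4x gap it comes out to two doublings, which lines up with the 'about four years' guess above if each doubling takes a couple of years.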



    Someday they may even support a modern implementation of OpenGL.



    Wait, is that a bright side?
  • Reply 6 of 46
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by Programmer View Post


    Be careful not to read too much into the Intel announcement.



    You mean in the sense that it's not dead? Sure they will focus on developing IGPs now using what they've learned, develop the software side of things and possibly turn Larrabee into a Tesla competitor but pretty much none of that matters for the following reasons:



    - it will take ages for them to get their IGPs to match the competition; they aren't even at 9400M level, and the 320M is double that, so as the last poster said, they are maybe 3 or 4 generations behind

    - a software model is great, but what's the point if you can't use it? They aren't going to start running graphics code on the CPU anytime soon as there aren't enough cores. They'd have to run around 6 threads in parallel per physical core to come close to an IGP, and even then you're using up all of the CPU

    - a Tesla competitor is a nice idea in theory, but it's the hardware that's needed. NVidia ships Tesla machines with over 500 computing cores



    The option I'd like to see is that they just buy or partner with NVidia and then they'd deliver something worthwhile.



    Ultimately, I would actually like to see the scenario of one set of cores running both graphics and computational software - it's all computation no matter how you slice it - so at least that way you don't have redundant hardware.
  • Reply 7 of 46
    Quote:
    Originally Posted by Marvin View Post


    - a Tesla competitor is a nice idea in theory but it's the hardware that's needed. NVidia ship Tesla machines with over 500 computing cores



    NVIDIA cores ≠ AMD (GPU) cores ≠ Intel (Larrabee) cores
  • Reply 8 of 46
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by Marvin View Post


    You mean in the sense that it's not dead?



    No -- in the sense that you're not informed about the whole story, nor the realities of hardware & software development.
  • Reply 9 of 46
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by iMacmatician View Post


    NVIDIA cores ≠ AMD (GPU) cores ≠ Intel (Larrabee) cores



    I know, I just mean it's a lot of performance based on what we know of NVidia's typical setups. Plus we know Larrabee scaled linearly and how many cores it needed to run certain games. It depends on the clock speed used too. Power consumption and cost also have to be taken into consideration.
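
    As a rough C++ sketch of why raw 'core' counts aren't comparable across vendors - peak throughput also depends on how wide each core is and how fast it's clocked - here's the shape of the calculation, with placeholder figures rather than real specs:

    #include <cstdio>

    struct Chip {
        const char* name;
        int    cores;           // marketing "cores" (placeholder)
        int    lanes_per_core;  // SIMD lanes / ALUs per core (placeholder)
        double clock_ghz;       // clock speed (placeholder)
    };

    // Peak throughput, assuming 2 flops per lane per cycle (multiply-add style).
    double peak_gflops(const Chip& c) {
        return c.cores * c.lanes_per_core * c.clock_ghz * 2.0;
    }

    int main() {
        const Chip many_narrow = {"many narrow cores", 512, 1, 1.3};
        const Chip few_wide    = {"fewer wide cores",   32, 16, 2.0};
        std::printf("%s: ~%.0f GFLOPS peak\n", many_narrow.name, peak_gflops(many_narrow));
        std::printf("%s: ~%.0f GFLOPS peak\n", few_wide.name, peak_gflops(few_wide));
        return 0;
    }

    With these invented numbers the chip with far fewer 'cores' comes out ahead, which is the whole point of the ≠ above.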



    The Larrabee that was going to come out performed around the same as the GTX 285 and Intel said they were disappointed at the stage it had reached.



    Quote:
    Originally Posted by Programmer


    No -- in the sense that you're not informed about the whole story, nor the realities of hardware & software development.



    If we got all the details of the Larrabee development cycle, the decisions made and the future plans, would it change the current outcome though? Intel marketed upcoming ATI/NVidia competition and then cancelled it; it doesn't matter what the reasons are, just the results.
  • Reply 10 of 46
    backtomac Posts: 4,579 member
    Quote:
    Originally Posted by Programmer View Post


    No -- in the sense that you're not informed about the whole story, nor the realities of hardware & software development.



    Hey don't hold out. Spread your wisdom around.
  • Reply 11 of 46
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by Marvin View Post


    If we got all the details of the Larrabee development cycle, decisions made and the future plans, would it change the current outcome though? Intel marketed upcoming ATI/NVidia competition and they cancelled it, it doesn't matter what the reasons are, just the results.



    Perhaps the results aren't what you think they are though.
  • Reply 12 of 46
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by Programmer View Post


    Perhaps the results aren't what you think they are though.



    I thought that's what you were getting at, but the blog post from the Intel exec in the second post lays out Intel's view of the whole situation. The full post is here:



    http://blogs.intel.com/technology/20...phics-rela.php



    "Our current 2010 Intel® Core? processors integrate what we call Intel HD Graphics, and offer a best-in-class solution for the vast majority of how we all use our computers... Intel?s processor graphics will continue to be enhanced - with more surprises - in our 2011 Intel Core processor family, code-named Sandy Bridge.



    1. Our top priority continues to be around delivering an outstanding processor that addresses every day, general purpose computer needs and provides leadership visual computing experiences via processor graphics.



    2. We are also executing on a business opportunity derived from the Larrabee program and Intel research in many-core chips. This server product line expansion is optimized for a broader range of highly parallel workloads in segments such as high performance computing. Intel VP Kirk Skaugen will provide an update on this next week at ISC 2010 in Germany.



    3. We will not bring a discrete graphics product to market, at least in the short-term. As we said in December, we missed some key product milestones. Upon further assessment, and as mentioned above, we are focused on processor graphics, and we believe media/HD video and mobile computing are the most important areas to focus on moving forward.



    4. We will also continue with ongoing Intel architecture-based graphics and HPC-related R&D and proof of concepts.



    We're interested in your feedback. Do you use your laptop more for media and content viewing, creation and/or high-end gaming? Is size, weight, screen resolution, battery life, high-end gaming and/or price your most important factor(s) when buying PCs, laptops, netbooks and other PC-like devices?"




    Intel HD Graphics does indeed offer the best solution for power consumption, but the 'vast majority' they are talking about is 50% of the market. They also constantly under-deliver on software support. If they resolve that with Sandy Bridge and manage to get the IGP to scale up to 320M levels while supporting OpenCL, yet keep the low power consumption, then sure, their IGPs will be worthwhile. But they said similar things about Intel HD Graphics too, and Apple has opted to put a 320M in its low-end machines because Intel's GPUs don't offer a good enough experience, nor do they support OpenCL.



    The Larrabee project is moving to server products for now and we'll find out what that's all about over the next 4 days.



    The discrete GPU is not coming to market, the focus is on IGPs, and the primary aim is to improve video playback - which it always has been with Intel, and it's why they get such a bad reputation for everything else a GPU is used for.



    Every time Intel come out with a new GPU it's the same story. There are suggestions that things will be different this time round and we should give them a chance but they have proved themselves wrong so many times in the past that it's very difficult to believe that they have anything to come back with.



    If they do deliver an IGP that matches a 320M (double the performance Intel HD Graphics has now) with Sandy Bridge and it fully supports OpenCL, it could be game over for NVidia in laptops. But Intel has its own parallel computing language (Ct) that it wants to push forward now too:



    http://software.intel.com/en-us/data-parallel/



    Future-proof so long as you keep buying Intel products.



    http://software.intel.com/en-us/foru...ad.php?t=72505



    Q: Does Intel have any functional OpenCL CPU/GPU implementation as AMD or NVIDIA, pls?

    A: No, the Ct beta test would give you an idea of one of the directions of development.

    Q: Ct is cool, but not multivendor. I need an OS-portable, multivendor and standard API, not another proprietary one.

    A: I could use your portable programming setup tomorrow. We are scheduled for a presentation on openCL in the next few weeks. We'll see if it gives any information we can mention.



    That was in March so maybe the event they mean is the ISC event.
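
    On the OpenCL question in that thread, here's a minimal C++ sketch of how you'd check which OpenCL platforms and devices a machine actually exposes (it assumes a vendor SDK is installed; on Mac OS X the header is <OpenCL/opencl.h> rather than <CL/cl.h>):

    #include <CL/cl.h>
    #include <cstdio>

    int main() {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
            std::printf("No OpenCL platforms found.\n");
            return 1;
        }
        if (num_platforms > 8) num_platforms = 8;

        for (cl_uint i = 0; i < num_platforms; ++i) {
            char platform_name[256] = {0};
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                              sizeof(platform_name), platform_name, NULL);
            std::printf("Platform: %s\n", platform_name);

            cl_device_id devices[8];
            cl_uint num_devices = 0;
            if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_ALL,
                               8, devices, &num_devices) != CL_SUCCESS)
                continue;
            if (num_devices > 8) num_devices = 8;

            for (cl_uint d = 0; d < num_devices; ++d) {
                char device_name[256] = {0};
                cl_device_type type = 0;
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(device_name), device_name, NULL);
                clGetDeviceInfo(devices[d], CL_DEVICE_TYPE,
                                sizeof(type), &type, NULL);
                std::printf("  %s device: %s\n",
                            (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other", device_name);
            }
        }
        return 0;
    }

    Whether an Intel platform shows up at all is exactly what the question above was getting at.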
  • Reply 13 of 46
    programmer Posts: 3,458 member
    It's all about the timelines you're looking at.
  • Reply 14 of 46
    backtomac Posts: 4,579 member
    It's starting to smell like FUD, Programmer.
  • Reply 15 of 46
    g-news Posts: 1,107 member
    If anything, that announcement is good, because it will make the charges Intel is suing Nvidia over look a bit different. Intel won't be able to close their platform to dedicated GPUs for long if they only have integrated options to offer themselves. They will either suffer financially, or the judges will rule against them in favor of the economy and the consumer.
  • Reply 16 of 46
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by backtomac View Post


    Its starting to smell like FUD, Programmer.



    Sorry, my hands are tied.
  • Reply 17 of 46
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by G-News View Post


    If anything, that announcement is good, because it will make the charges Intel is suing Nvidia with look a bit different. Intel won't be able to close their platform for dedicated GPUs for long, if they only have integrated options to offer themselves. They are either going to suffer financially, or the judges will rule against them in favor of the economy and the consumer.



    Questions of Larrabee aside, you may want to consider what direction chip (and GPU) design is going. The discrete GPU may simply not have a future. This is what nVidia is worried about because, unlike Intel and AMD, they can't create x86 cores (and even if they were licensed to do so, they have no experience or track record of doing so). What judges rule is immaterial if there is no discrete GPU market -- nVidia will be reduced to (at best) licensing their designs and using ARM cores (locking them out of the PC market). And before you start raving about how inferior integrated GPUs are, consider that the reasons for this are largely historical. Integrating devices together, if you have a sufficient transistor budget and design capability, is an advantage because it eliminates inter-chip bottlenecks. Moore's Law, world-leading process technology, recent heavy investment in software development, and years of R&D into Larrabee ensure that Intel is going to dominate in this space. Why do you think AMD bought ATI? To compete with Intel.
  • Reply 18 of 46
    1337_5l4xx0r Posts: 1,558 member
    IGPs, and particularly Intel's implementation of them, suck for numerous reasons.



    The drivers for GMA are reportedly poor. In other words, the GMAs are capable of more than they actually deliver; they just don't do it.



    The transistor count is minuscule. Doubling the transistor count every year means that in five years, the transistor count will still be minuscule.



    The potential for OpenCL is nonexistent on Intel's GMAXXX.



    The implementation does less than half what a 9400M does, and the 9400M is now obsoleted by the 320M, which doubles the performance again.*



    The GMA eats main memory bandwidth in a significant way, and main memory, compared to VRAM, is far slower - we're talking WELL OVER 100GB/sec slower.
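
    A crude C++ sketch of how you'd ballpark sustained main-memory bandwidth on your own machine (illustrative only - a real measurement needs care with caches, NUMA and compiler optimisation):

    #include <chrono>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    int main() {
        const size_t bytes = 512u * 1024u * 1024u;   // 512 MB working set per buffer
        std::vector<char> src(bytes, 1), dst(bytes, 0);

        const int reps = 4;
        const std::chrono::steady_clock::time_point t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < reps; ++i)
            std::memcpy(&dst[0], &src[0], bytes);
        const std::chrono::steady_clock::time_point t1 = std::chrono::steady_clock::now();

        const double secs = std::chrono::duration<double>(t1 - t0).count();
        // Each copy reads and writes the whole buffer, so count the bytes twice.
        const double gib = 2.0 * reps * bytes / (1024.0 * 1024.0 * 1024.0);
        std::printf("~%.1f GiB/s sustained copy bandwidth\n", gib / secs);
        return 0;
    }

    Compare the number this prints against the hundreds of GB/sec quoted for high-end graphics cards and the gap is obvious.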



    Main memory latency is also far worse than VRAM's, which means that by the time the data arrives, it may be too late.



    Graphics cards by ATI and NVidia are not a static target. By the time Intel catches up, it will be five years down the line and NV and ATI will have exponentially faster products then, too.



    A dedicated GPU will not only perform significantly better than IGP, it also reduces the burden on the rest of the system. IGPs increase that burden.



    So for the next five years at least, I can say without a doubt that Intel's graphics not only suck, but will continue to do so compared to ATI and NVidia.



    And anyone who thought running graphics ops on a generic x86 core was a great idea that would 'floor' dedicated custom cores, 200+GB/sec of dedicated memory bandwidth, 400+ programmable shader pipes and so on - well, you have your answer.



    string Larrabee = "Fail";



    Newsflash: these problems are not going away.



    *The 9400M is itself an IGP, so any comparison to it must keep in mind that it, too, is crippled. It is not a 'real' graphics card.



    Did Intel learn anything from Larrabee? Undoubtedly. Will we see implementations of some of those experiments in the consumer space? Almost certainly. Will IGP, some day, in some form, be 'good enough' as compared to real graphics accelerators? Definitely.



    Will that day come anytime soon? No.



    Intel aren't even capable of Fail when it comes to GPUs. And their attitudes and roadmaps reflect this.
  • Reply 19 of 46
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by 1337_5L4Xx0R View Post


    Newsflash: these problems are not going away.



    *The 9400M, itself, is an IGP. So any comparisons to it must keep in mind that it, too is crippled. It is not a 'real' graphics card.



    Did Intel learn anything from Larrabee? Undoubtedly. Will we see implementations of some of those experiments in the consumer space? Almost certainly. Will IGP, some day, in some form, be 'good enough' as compared to real graphics accelerators? Definitely.



    Will that day come anytime soon? No.





    As I said above, it depends on your timeline. The problems aren't going away, but the priority of solving them (and how much transistor budget to dedicate to them) has been increasing steadily. OpenCL was mentioned above, and it reflects the convergence of general-purpose and graphics computing. Intel has been investing heavily in its Terascale projects, and Larrabee is much closer to that than to previous and current Intel IGP efforts. Intel looks at the world with a longer timeline than most people and companies do... so, as I said, don't read more into them canceling the discrete GPU part than is warranted. They cancelled it because their initial effort didn't compete with where nVidia and ATI are now... but consider that Intel started way behind and has caught up exponentially.
  • Reply 20 of 46
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by Programmer View Post


    As I said above, it depends on your timeline.



    Sure, things will be very different given enough time. Even 5-10 years from now we may all have SSDs, GPUs with >100 cores even in the low-end, and 32-core CPUs in the low-end. If you get 32 x86 cores (or at least 32 threads) in the low-end running at 2-3GHz, you might not need a GPU, just the right software model to run on them - and Larrabee certainly seems like the right type of software to use. But as consumers, the only stuff that matters is what we can buy within the next year.
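
    A toy C++ sketch of that 'graphics as just another workload on lots of x86 cores' idea - split a framebuffer across however many hardware threads the machine reports. It's a flat procedural fill, not a real renderer, so purely illustrative:

    #include <cstdint>
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        const int width = 1280, height = 720;
        std::vector<uint32_t> framebuffer(static_cast<size_t>(width) * height);

        unsigned n = std::thread::hardware_concurrency();
        if (n == 0) n = 4;  // fall back if the runtime can't tell us

        std::vector<std::thread> workers;
        for (unsigned t = 0; t < n; ++t) {
            workers.push_back(std::thread([&framebuffer, width, height, n, t] {
                // Each thread shades an interleaved band of scanlines.
                for (int y = static_cast<int>(t); y < height; y += static_cast<int>(n))
                    for (int x = 0; x < width; ++x)
                        framebuffer[static_cast<size_t>(y) * width + x] = 0xFF000000u | ((x ^ y) & 0xFF);
            }));
        }
        for (size_t i = 0; i < workers.size(); ++i)
            workers[i].join();

        std::printf("Shaded %dx%d pixels on %u threads\n", width, height, n);
        return 0;
    }

    More cores means more scanlines shaded per pass, which is the sense in which '32 cores and the right software model' could stand in for a GPU.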



    Quote:
    Originally Posted by Programmer View Post


    Intel looks at the world with a longer timeline than most people and companies do... so, as I said, don't read more into them canceling the discrete GPU part than is warranted. They cancelled them because their initial efforts didn't compete with where nVidia and ATI are now... but consider that Intel started way behind and have caught up exponentially.



    Yeah, it's quite good that they caught up to a GTX 285 when they did but it was noted as having a 300W TDP or something like that. Dreamworks are making the move from AMD to Intel + Larrabee and will have advanced 3D workflows to drive the platform forward so that it eventually pays off for consumers:



    http://www.315weixiu.com/2010/05/27/...to-super-bowl/



    I think the ideal platform should minimize redundancy in the hardware. So many people pay for MBPs and Mac Pros with capable GPUs, yet apps like Quicktime, Final Cut, After Effects and so on don't really use them to any benefit for most tasks, so you still sit waiting for hours on encodings to finish. Then if you run a 3D game or something, typically the GPU maxes out and the CPU doesn't do very much, though sometimes it's still CPU-bound.



    I'm sure some of Intel's Larrabee demos were actually running on the Xeon CPUs. I'd expect a 12-core setup to be able to run graphics as fast as an 8800GT. The impressive feature is not the performance but the fact it's x86 code - that's really where Larrabee wins. I was quite impressed with the ray-tracing demo they had:



    http://www.techreport.com/discussions.x/17641



    It was slower than other ray-tracing demos from ATI/NVidia, but you can take absolutely any code and run it on Larrabee, so you don't have to use OpenGL or DirectX or difficult workarounds built on them. A game could bundle a custom rasterization process and it would be portable.
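
    To make the 'any plain x86 code can be the renderer' point concrete, here's a minimal C++ ray/sphere intersection - the kind of routine a toy software ray tracer is built from, with no OpenGL or DirectX anywhere (illustrative only):

    #include <cmath>
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    static double dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Distance along the ray to the first hit, or -1 if the ray misses.
    // The ray direction is assumed to be normalised.
    static double ray_sphere(const Vec3& origin, const Vec3& dir,
                             const Vec3& center, double radius) {
        const Vec3 oc = {origin.x - center.x, origin.y - center.y, origin.z - center.z};
        const double b = 2.0 * dot(oc, dir);
        const double c = dot(oc, oc) - radius * radius;
        const double disc = b * b - 4.0 * c;
        if (disc < 0.0) return -1.0;
        return (-b - std::sqrt(disc)) / 2.0;
    }

    int main() {
        const Vec3 eye = {0, 0, 0}, dir = {0, 0, 1}, sphere = {0, 0, 5};
        std::printf("Hit at t = %.2f\n", ray_sphere(eye, dir, sphere, 1.0));  // expect 4.00
        return 0;
    }

    Loop that over every pixel and you have a (slow) renderer written entirely in ordinary code, which is exactly the flexibility being described above.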



    NVidia and AMD are going the same direction though and AMD is probably going to be the strongest player to begin with. Their Fusion products look very interesting and targeted at notebooks as well as desktops:



    http://sites.amd.com/us/fusion/APU/Pages/fusion.aspx

    http://www.pcgameshardware.com/aid,6...Die-shot/News/



    Intel won't rival that with their HD IGP junk, as Fusion is based on AMD's 5000-series GPU.