Intel's Larrabee is DEAD* - What this means for future Macs

Posted in Future Apple Hardware (edited January 2014)
*Disclaimer: Okay, it's not dead. If you believe Intel. But now that I've got your attention... there's still big news:

http://anandtech.com/weblog/showpost.aspx?i=659

http://anandtech.com/cpuchipsets/showdoc.aspx?i=3686



1. Intel's Integrated Graphics are rubbish.

2. Larrabee is supposed to be the next fantastic GPU out of Intel

3. From 2010 onwards, Intel CPUs will ship packaged or on-die with Intel GPUs.



Which means.



On your Mac laptops, welcome back to Intel Integrated Graphics. Just when OpenCL and GPGPU were supposed to take off.
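
For context on why that stings: OpenCL enumerates compute devices at runtime, and if it can't find a capable GPU the work quietly lands back on the CPU. Below is a minimal, hypothetical sketch of that device check, assuming the stock OpenCL 1.0 C API (the OpenCL/opencl.h header Snow Leopard ships); it's purely an illustration, not anything Apple or Intel have published.

```c
/* Minimal sketch: ask OpenCL for a GPU, and fall back to the CPU if the
 * machine only has a non-OpenCL-capable IGP. Assumes the OpenCL 1.0 C API
 * (OpenCL/opencl.h on Mac OS X 10.6, CL/cl.h elsewhere). */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    char name[256];

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        printf("No OpenCL platform found at all\n");
        return 1;
    }
    /* Prefer a real GPU; on an Intel-IGP-only laptop this call fails
     * and everything "GPGPU" ends up running on the CPU instead. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        printf("No OpenCL-capable GPU, falling back to the CPU device\n");
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL) != CL_SUCCESS)
            return 1;
    }
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    printf("OpenCL work would run on: %s\n", name);
    return 0;
}
```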



Poor Apple. The 9400M was a good offering and still is, more than a year on. Moving to Arrandale (Core i3, i5, etc.) for Mac laptops, Apple now has to either bite the bullet and accept worse-than-9400M graphics (Arrandale will have something similar to Intel's X4500 GPU)... or bite the thermals and put in an Nvidia or ATI *discrete* GPU.



Well, thanks to Intel f*king Nvidia in the a55 and locking them out of the chipset business, Apple now loses out. 9400M MacBooks and MacBook Pros are looking like decent investments going into the next two years.



It will be very interesting to see what Apple brings out with their Mac laptops in 2010. Arrandale has distinct speed advantages, but thermals and a crappy integrated GPU will be a bitter pill to swallow. Don't forget as well that even on the discrete side, the 9600M GT is an old, old GPU architecture: it is more or less a rebranded 8600, and that architecture goes back at least three years.



It's funny, on one side you've got people saying, who needs GPUs? Nobody really plays demanding 3D games and most good games are all playable on consoles anyway.



Then on the other side you've got everybody pushing GP-GPU as the next best thing since x86 was dreamed up. In a few short years Microsoft Word might even be GPU-accelerated, if you're delusional enough to believe the current hype (not to say that Microsoft isn't going to pull this out of its hat -er, I mean... butt).



Then Nvidia says, forget all these mainstream GPUs for desktops and laptops, we'll just make a behemoth and sell that to the high-performance-computing niche.



Somebody please help with my sanity. WTF IS GOING ON ??????

Comments

  • Reply 1 of 50
    nvidia2008 · Posts: 9,262 · member
    What a washout and vapour-hype machine Intel's Larrabee has been. Now we're waiting until 2011 or 2012 to see any tangible GPU/GPGPU stuff out of Intel. Do any of you believe Intel can still pull it off, or is there just going to be more and more vapourware?



    Looks like I'm not going to buy an Arrandale or Sandy Bridge Mac in 2010 unless I really have to replace my MacBook Aluminium. I've got my sights set on a 256GB *SSD*, either an Intel one (*cough, actually, Intel SSDs are pretty decent*) or another reputable, Mac-compatible model, to be my "new Mac" in 2010. That is, a fast SSD and I'm set for 2010 with my existing 2GHz MacBook Alu. Just gotta wait for the price to come down to USD $300 for 256GB. Then I'll be all over it like white on rice.
  • Reply 2 of 50
    nvidia2008 · Posts: 9,262 · member
    Apparently according to http://www.brightsideofnews.com/news...-graphics.aspx



    Apple wants 32nm Arrandales *without* the graphics part. Which still means Apple can't use the 9400M or evolutions thereof. So it could be Apple using Intel Arrandales *without* the rubbish IGP (more tension between Apple and Intel)... AND using a possibly-underclocked/crippled discrete GPU from Nvidia or ATI.



    Good fun all around.
  • Reply 3 of 50
    What this means is that Intel continues its strong history of epic fails in the GPU business. It also means that vendors will start to breathe down Intel's neck now that future chips are supposedly 'joined at the hip' with GMA 950 (aka codename Fail), which actually makes the NVidia 9400 look blazing fast in comparison.



    Honestly, I could smell the stench of Larrabee a mile away two years ago.



    Edit: Honestly, Intel has brass balls to integrate the GMA 950 (which has 1987 levels of GPU power) and block NForce/Nvidia, when they clearly DON'T EVEN HAVE AN F*ING GPU STRATEGY.
  • Reply 4 of 50
    vinea · Posts: 5,585 · member
    Given that Larrabee wasn't going to be a mobile part anytime soon, probably not a whole lot... (to answer the thread title).



    Shame, though. I was hoping it would have made a nice high-performance computing part.
  • Reply 5 of 50
    benroethig · Posts: 2,782 · member
    Apple could just go back to what they were doing during the PowerPC days: using entry-level dedicated GPUs.
  • Reply 6 of 50
    The way I saw it, Larrabee was always intended first to be a very parallel processor for high-performance computing. But Intel wanted to spread the costs out by producing it in greater volume, which meant selling to consumers.



    So they concocted this "GPGPU" scheme, but it would only have been a middling GPU. When they compared it to Nvidia and ATI, it only matched their previous-generation high-end cards for power, and that isn't good enough for the price they wanted to sell it at in 2010.



    Basically, the world isn't ready for what Intel wants to sell, which is a graphics processor that focuses on general-purpose computing first and graphics second. Intel is hoping that future manufacturing processes will let them cram more cores onto Larrabee faster than ATI and NV can cram more "cores" on to their GPUs, and that they can get the software market more on board.
  • Reply 7 of 50
    wizard69 · Posts: 13,377 · member
    Quote:
    Originally Posted by nvidia2008 View Post


    *Disclaimer: Okay, it's not dead. If you believe Intel. But now that I've got your attention... there's still big news:

    http://anandtech.com/weblog/showpost.aspx?i=659

    http://anandtech.com/cpuchipsets/showdoc.aspx?i=3686



    This is actually good news because Larrabee would have been a pretty useless part in my opinion.

    Quote:



    1. Intel's Integrated Graphics are rubbish.



    Well yeah but Larrabee wasn't really targeted at that market.

    Quote:

    2. Larrabee is supposed to be the next fantastic GPU out of Intel



    More accurately, it was to be Intel's first dedicated GPU chip. It wasn't the next anything.

    Quote:

    3. From 2010 onwards, Intel CPUs will ship packaged or on-die with Intel GPUs.



    Not if their biggest customers revolt. Let's face it, Intel GPUs aren't marketable on anything more than bottom-end hardware.

    Quote:

    Which means.



    On your Mac laptops, welcome back to Intel Integrated Graphics. Just when OpenCL and GPGPU were supposed to take off.



    It doesn't mean anything of the kind; Apple has choices to make and they can reject Intel's offerings. Further, there have been rumors for at least two months now of Arrandale CPUs without the integrated GPU.



    Also, from a technical standpoint, placing a high-end GPU on the same chip as the CPU is pretty stupid, mainly due to the concentration of heat into too small a space. Integrating a GPU with the CPU only makes sense on hardware destined for the low end right now.

    Quote:



    Poor Apple. The 9400M was a good offering and still is, more than a year on. Moving to Arrandale (Core i3, i5, etc.) for Mac laptops, Apple now has to either bite the bullet and accept worse-than-9400M graphics (Arrandale will have something similar to Intel's X4500 GPU)... or bite the thermals and put in an Nvidia or ATI *discrete* GPU.



    The 9400M is a very good value for low-end hardware. Unfortunately, it looks like NVidia burnt the bridge with respect to their relationship with Apple. In any event, you are making assumptions here about what Intel will deliver in the Arrandale lineup and just how bad the GPU integrated on board the MCM will be. Personally, I don't consider the integration of a GPU on the CPU chip to even be good engineering; it has all the appearances of being a form of lockout. If so, Intel will have its misbehavior corrected; NVidia is not completely out of the picture yet.



    As to thermals, a discrete GPU actually has advantages with respect to distributing power out over a wider area of the board. Further, modern discrete GPUs from the likes of ATI are actually low power when targeted at laptops.

    Quote:

    Well, thanks to Intel f*king Nvidia in the a55 and locking them out of the chipset business, Apple now loses out. 9400M MacBooks and MacBook Pros are looking like decent investments going into the next two years.



    The only company with a real potential to lose here is Intel. Put too much control on your product and customers will look elsewhere. Further, if that control looks like anti-competitive practice then the governments become interested. Apple, on the other hand, has plenty of options.

    Quote:



    It will be very interesting to see what Apple brings out with their Mac laptops in 2010. Arrandale has distinct speed advantages, but thermals and a crappy integrated GPU will be a bitter pill to swallow. Don't forget as well that even on the discrete side, the 9600M GT is an old, old GPU architecture: it is more or less a rebranded 8600, and that architecture goes back at least three years.



    ATI should have plenty of hardware for future Macs that will fit into Apple's power envelopes. It appears that Apple gave NVidia the heave-ho and has lost interest in that company's products.

    Quote:



    It's funny, on one side you've got people saying, who needs GPUs? Nobody really plays demanding 3D games and most good games are all playable on consoles anyway.



    Nobody with a modicum of intelligence is saying anything like that at all. In fact, equating the need for a good GPU with the desire to run games is beyond stupid for any modern OS. When you hear people talk like that you need to correct them.

    Quote:



    Then on the other side you've got everybody pushing GP-GPU as the next best thing since x86 was dreamed up. In a few short years Microsoft Word might even be GPU-accelerated, if you're delusional enough to believe the current hype (not to say that Microsoft isn't going to pull this out of its hat -er, I mean... butt).



    I'm not at all sure where you got that idea. Things like Word gain easily from technology like GCD; there is no need to go the extra step to GPGPU acceleration.
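
    To put that GCD point in concrete terms: coarse CPU-side task parallelism like the sketch below is all an app like Word really needs, and it never touches the GPU. This is a minimal, hypothetical example against the libdispatch C API that ships with Snow Leopard; check_page() is a made-up stand-in for real per-page work such as spell checking, not any real Word/Office API.

    ```c
    /* Minimal sketch: fan per-page work out across all CPU cores with GCD.
     * Assumes libdispatch as shipped with Mac OS X 10.6 (compile with -fblocks);
     * check_page() is a hypothetical placeholder, not a real application API. */
    #include <dispatch/dispatch.h>
    #include <stdio.h>

    static void check_page(size_t page) {
        /* placeholder for real work, e.g. spell-checking one page */
        printf("checked page %zu\n", page);
    }

    int main(void) {
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* dispatch_apply runs the block once per index, spread over the
         * available cores, and waits for all iterations to finish. */
        dispatch_apply(100, queue, ^(size_t page) {
            check_page(page);
        });
        return 0;
    }
    ```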

    Quote:



    Then Nvidia says, forget all these mainstream GPUs for desktops and laptops, we'll just make a behemoth and sell that to the high-performance-computing niche.



    I don't think NVidia has said anything to that effect at all. Their new high-performance GPU is just that: a NEW platform. The tech seen there can easily trickle down to the lower end of the lineup. It actually makes sense.

    Quote:



    Somebody please help with my sanity. WTF IS GOING ON ??????



    Why worry about it? Seriously, it is not an issue until Apple delivers new hardware. That will reveal the path they have taken. Once unveiled, the product can be effectively judged. In any event, I suspect Arrandale will be a harder sell for Intel than they imagined, as a CPU saddled with a junk GPU is just bad.





    Dave
  • Reply 8 of 50
    Marvin · Posts: 15,324 · moderator
    Intel clearly saw ATI and NVidia pushing for a fully programmable rendering pipeline and thought they could get there quicker by pushing x86 cores that they already know into a package. Their software developments produced some nice looking demos:



    http://techreport.com/discussions.x/17641

    http://www.projectoffset.com/index.p...d=58&Itemid=12



    It seems they just couldn't get the hardware design right. Now, if Intel and NVidia could just settle their differences and join, maybe they'd be able to put their collective heads together and build something amazing instead of bickering over who has valid licenses. All that happens in the latter scenario is NVidia gets unfairly squeezed out of the market, consumers lose good products, and Intel loses a lot of respect and money on R&D with no return.



    If they join, NVidia puts their decades of driver and hardware efforts to good use, and Intel finally becomes renowned for having competitive graphics options, as they'd ship only NVidia designs and would ensure the manufacturing is up to code.



    "Bad news to Proyect Offset . Larrabee is delayed or cancelled - cancelled only the first desing -



    http://news.cnet.com/8301-13924_3-10409715-64.html

    http://arstechnica.com/hardware/news...me-in-2010.ars



    Probably there will be no Intel discrete graphics for the next year and a half or two years, because Intel will have to make a new design. And this will affect Project Offset."
  • Reply 9 of 50
    Quote:
    Originally Posted by Marvin View Post


    Intel clearly saw ATI and NVidia pushing for a fully programmable rendering pipeline and thought they could get there quicker by pushing x86 cores that they already know into a package. Their software developments produced some nice looking demos:



    http://techreport.com/discussions.x/17641

    http://www.projectoffset.com/index.p...d=58&Itemid=12



    It seems they just couldn't get the hardware design right. Now, if Intel and NVidia could just settle their differences and join, maybe they'd be able to put their collective heads together and build something amazing instead of bickering over who has valid licenses. All that happens in the latter scenario is NVidia gets unfairly squeezed out of the market, consumers lose good products, and Intel loses a lot of respect and money on R&D with no return.



    If they join, NVidia puts their decades of driver and hardware efforts to good use, and Intel finally becomes renowned for having competitive graphics options, as they'd ship only NVidia designs and would ensure the manufacturing is up to code.



    "Bad news to Proyect Offset . Larrabee is delayed or cancelled - cancelled only the first desing -



    http://news.cnet.com/8301-13924_3-10409715-64.html

    http://arstechnica.com/hardware/news...me-in-2010.ars



    Probably there will be no Intel discrete graphics for the next year and a half or two years, because Intel will have to make a new design. And this will affect Project Offset."



    Intel should pull an AMD/ATI and give their on-board video about 256MB or more of its own RAM, so it can be faster and you don't have crap video eating system RAM.
  • Reply 10 of 50
    I would like to see AMD/ATI in my Mac.
  • Reply 11 of 50
    Quote:
    Originally Posted by wizard69 View Post


    I don't think NVidia has said anything to that effect at all. Their new high-performance GPU is just that: a NEW platform. The tech seen there can easily trickle down to the lower end of the lineup. It actually makes sense...



    I agree with your points except the above one. The Nvidia design with their GTX280 and similar NEVER made it to the laptop in any reasonable fashion. It's as impossible as a PowerPC G5 laptop. Too big, too hot, too power hungry.



    Starting out even higher with Fermi, as a beastly high-performance computing part, and somehow thinking that will ever make it to a low-power, high-performance laptop part competitive with ATI's offerings by even the fourth calendar quarter of 2010, would be, I would say, quite delusional at this stage. Especially with Fermi's delays, TSMC's 40nm process yielding very poorly, etc. IMO.
  • Reply 12 of 50
    aizmov · Posts: 989 · member
    It means nothing.
  • Reply 13 of 50
    http://www.brightsideofnews.com/news...-graphics.aspx



    Given that Apple isn't exactly renowned for being on the bleeding edge of graphics performance or options, the fact that they have boycotted the integrated Arrandale CPU/GPU tells you something.



    It tells you that Intel's graphics accelerators are complete, utter crap, unsuitable for even entry-level 2010 graphics.
  • Reply 14 of 50
    nvidia2008 · Posts: 9,262 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R View Post


    http://www.brightsideofnews.com/news...-graphics.aspx



    Given that Apple isn't exactly renowned for being on the bleeding edge of graphics performance or options, the fact that they have boycotted the integrated Arrandale CPU/GPU tells you something.



    It tells you that Intel's graphics accelerators are complete, utter crap, unsuitable for even entry-level 2010 graphics.



    That's my concern, which some of the posters here are not so worried about.



    What if Apple can't get Intel to sell them Arrandales without the GPU?



    Apple is not the one with the big stick at the negotiation table. Apple could continue to use Core 2s, but by March 2010, when everyone else has Core i3-, i5- and i7-branded laptops, Apple can't lag behind too much. And by mid-2010, Core 2s will be more or less discontinued in favour of either Atoms or the Core i3, i5 and i7.



    Apple can't switch to AMD CPUs; in the mobile space they just won't cut it, even though on the desktop they're great value for money.



    Intel could realistically play hardball and tell Apple: take it or leave it. They *know* Apple can't go to AMD for mobile; the performance-per-watt won't come close to Arrandale's.



    Another thing is that Apple has less leverage since Nvidia chipsets are totally out of the question; Apple *has* to use the Intel chipset.



    Intel could be laughing their a** off right now, teasing Apple, "Who you gonna call now?". Locking out Nvidia was not only for desktops, but critical for domination in the more important portable and ultramobile/netbook markets, particularly with Arrandales having the GPU on-package.



    Intel is going to use its huge marketing muscle to convince many laptop brands and customers that their all-in-one super CPU+GPU Core is going to be *the* big thing in 2010, munching away at Nvidia and ATI discrete solutions.



    Pretty evil overall.



    So, maybe I'm worried for nothing: Intel gives Apple Arrandales-without-GPU, Apple puts in simple dedicated graphics, and everyone can go home and sleep well at night.



    Or, Intel gets ballsy and tells Apple: you've really got to buy into the Calpella platform. Then Apple has to bite the bullet and deal with a junk part on the CPU package, find a suitable discrete GPU, ditch their investment (time, money, drivers, etc.) in the 9400M chipset, deal again with Intel-exclusive chipsets and with deactivating/making some use of the integrated GPU... you get the picture.



    Yeah, my 2GHz 9400M MacBook Alu is looking real good right now. Bring on a more GCD- and OpenCL-optimised OS and apps, a fast 256GB SSD at USD$300, and 6GB of RAM within USD$200, and I'd be able to cruise through 2010 and see what's up in 2011.



    But being Apple, they'd probably whip something out next year that you just gotta have anyway, to hell with the specifications. I think it will be called the "MagicBook".
  • Reply 15 of 50
    aizmov · Posts: 989 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R View Post


    http://www.brightsideofnews.com/news...-graphics.aspx



    Given that Apple isn't exactly renowned for being on the bleeding edge of graphics performance or options, the fact that they have boycotted the integrated Arrandale CPU/GPU tells you something.



    It tells you that Intel's graphics accelerators are complete, utter crap, unsuitable for even entry-level 2010 graphics.





    Apple should use AMD's. Stick it to Intel.
  • Reply 16 of 50
    Hi Folks;



    Forgive me for not having the link but Cringely is saying that Intel wants to buy NVidia.
  • Reply 17 of 50
    vinea · Posts: 5,585 · member
    Quote:
    Originally Posted by nvidia2008 View Post


    That's my concern, which some of the posters here are not so worried about.



    What if Apple can't get Intel to sell them Arrandales without the GPU?



    Then they'll get Arrandales with GPUs and not use the GPU for much.



    It's a rumor anyway, and Apple has never fully done the Centrino platform (Calpella). There's no way that Apple has rejected Arrandale across the entire lineup as that rumor suggests. Unless they go Sandy Bridge... and that seems like a bridge too far.
  • Reply 18 of 50
    http://www.cringely.com/2009/12/intel-will-buy-nvidia/ is the link geneking7320 refers to. I believe the rumor; I just think that leaves everyone in an awkward situation. Intel is king at laptop CPUs. NVidia laptop GPUs are power-hogs that are several generations behind their desktop parts.



    AMD makes good desktop CPUs. ATI make current, low-power GPUs, like the mobile 5XXX series expected in January.



    As a 3D artist I care far more about GPUs than the average user. 256MB of VRAM is considered weak in 3D, but super smokin' to Apple.
  • Reply 19 of 50
    nvidia2008 · Posts: 9,262 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R View Post


    http://www.cringely.com/2009/12/intel-will-buy-nvidia/ is the link geneking7320 refers to. I believe the rumor; I just think that leaves everyone in an awkward situation. Intel is king at laptop CPUs. NVidia laptop GPUs are power-hogs that are several generations behind their desktop parts.



    AMD makes good desktop CPUs. ATI make current, low-power GPUs, like the mobile 5XXX series expected in January.



    As a 3D artist I care far more about GPUs than the average user. 256MB of VRAM is considered weak in 3D, but super smokin' to Apple.



    Yeah, that's the problem. Intel buying Nvidia at this stage is an unwise move because, like you say, WTF can Nvidia offer anyway? Ion, Tegra and so on, okay, all sexy and whatnot, but the core of the issue is getting access to low-power, 40nm-and-below, decently powerful mobile GPUs. Which Nvidia doesn't have (it's just rebrands/die-shrinks of the 8 series), and will not have for a while (because of waiting for Fermi, then waiting for Fermi to trickle down to mobile GPUs, which may never happen).



    Cringely has an interesting take that Intel is messing about with Nvidia so as to make buying them easier. It won't be an easy buy, and IMHO it may have to be a hostile takeover. Sounds like it could be messy.



    ATI should have excellent mobile GPUs over the next few years; however, brand- and marketing-wise they're up against the Intel and Nvidia brainwash juggernauts. Additionally, ATI can't even get enough 40nm chips because TSMC simply hasn't been able to deliver decent yields.



    More and more it sounds like Apple will either get Arrandales without GPU or suck it up and pair the Arrandale laptops with an ATI 40nm GPU.



    This will be interesting in 2010. Like I said, just when all this CPU+GPU promise is coming about, it sounds like decent laptop manufacturers, at least in the mobile space, could end up straddling two very different camps: an Intel CPU with an AMD GPU.



    I'm still confused about this whole thing BTW, so I understand my post may not be too coherent.



    Is anyone hoping someone will bring a case against Intel, arguing that forcing companies to swallow a crap GPU when buying a CPU is like Microsoft "forcing" you to use IE with Windows? In the latter case, the end user is slowly winning thanks to all the litigation. In the former case...
  • Reply 20 of 50
    nvidia2008 · Posts: 9,262 · member
    The flipside to all this is that even if Intel doesn't buy Nvidia, they could bulldoze their way through 2010 and even 2011 with a lousy GPU bundled with the CPU... because most people buying laptops wouldn't really care, netbooks would still be somewhat popular, and the CPU is powerful enough (as Intel is now claiming) to do a lot of HD decoding and so on.



    For gamers, you'd mostly be on a console anyway (or a PC desktop with a big, hot GPU), and for 3D artists etc. you'd be using a desktop with a minimum of 1GB of VRAM.