Intel suit halts development of future Nvidia chipsets

Posted in Future Apple Hardware, edited January 2014
Nvidia, the maker of Intel-compatible chipsets for Apple's line of Macs, has announced it will cease development of future chipsets until its suit with Intel is settled, expected sometime in 2010.



The announcement made this week means that Nvidia has placed its nForce chipset line on hold, pending the outcome of Intel's suit against the chip maker, according to PC Magazine. Intel has alleged that a previous chipset agreement between it and Nvidia does not apply to the Core or Nehalem series of processors.



"We have said that we will continue to innovate integrated solutions for Intel's FSB architecture," said Robert Sherbin, spokesman for Nvidia. "We firmly believe that this market has a long healthy life ahead. But because of Intel's improper claims to customers and the market that we aren't licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we'll postpone further chipset investments."



A year ago, Apple officially made the move to Nvidia chipsets with the GeForce 9400M G integrated controller, a single chip of which 70 percent is devoted to graphics processing functions. It was then that Apple embraced Nvidia's MCP79 platform, paired with Intel Core processors. The platform was later extended to iMacs and Mac minis.



Earlier this year, Intel sued Nvidia in an attempt to stop the company from developing compatible chipsets for future generation Intel processors. Many of Nvidia's gains -- including the partnership with Apple -- have amounted to Intel's loss.







This summer, Apple was rumored to be abandoning Nvidia chips in its Macs following a contract fight, though nothing official came of it.



Nvidia's recent announcement of the Fermi architecture, geared toward the scientific community and not PC graphics, has led some to believe that the company is changing its business strategy and moving away from the high-end gaming market. Nvidia has denied those assumptions.

Comments

  • Reply 1 of 69
    Crybabies. Build a better chipset.
  • Reply 2 of 69
    801 Posts: 271 member
    And this affects the rollout of new products, or current production, how?
  • Reply 3 of 69
    Quote:
    Originally Posted by 801 View Post


    And this affects the rollout of new products, or current production, how?



    Nah, Nvidia already has the license it needs for the Core series, I think...
  • Reply 4 of 69
    Quote:
    Originally Posted by xwiredtva View Post


    Nah, Nvidia already has the license it needs for the Core series, I think...



    But that license will be moot when Core i5 and i3 mobile CPUs hit the market in Q1 2010. And I don't think any of Intel's IGPs have anywhere near the performance of nVidia's GeForce 9400M. It's going to be a huge step back in GPU performance, much like when Apple moved from the dedicated GPU of the PPC Mac Mini to the horrible performance in the GMA 900 in the first Intel Mac Mini.
  • Reply 5 of 69
    techno Posts: 737 member
    Didn't Apple invest in (buy) a company to make its own chips at some point? Nvidia is just shooting itself in the foot.
  • Reply 6 of 69
    MacPro Posts: 19,817 member
    If Apple was thinking of a change this summer, one would think they must have had a plan B back then.
  • Reply 7 of 69
    winter Posts: 1,238 member
    Maybe ATI can step up?
  • Reply 8 of 69
    Bring on the ATI, baby!!!!!!!! Never liked Nvidia cards on a Mac.
  • Reply 9 of 69
    Quote:
    Originally Posted by xwiredtva View Post


    Crybabies. Build a better chipset.



    You obviously cannot read.



    nVidia would be foolish to spend millions of dollars developing new chipsets they might never be allowed to sell. It's far smarter to spend their time and money developing a new product.



    So if they lose in court they'll already have a new business to take the place of the PC chipset one.



    If they win in court they can decide whether it's worth it to go back into competition with Intel in the chipset business. They've proven in the past that they can make better chipsets than Intel and make good profit on them so it wouldn't take long to make up for lost time.
  • Reply 10 of 69
    mario Posts: 348 member
    Wouldn't it be better if they stopped using integrated graphics anyway, and moved to discrete cards in all Macs?
  • Reply 11 of 69
    MacTripper Posts: 1,328 member
    This is Intel wanting to take over the graphics chip market with their integrated graphics.



    Expect future Macs to have considerably less performance. This move is forcing Apple's hand, but the writing has been on the wall for quite some time, ever since AMD bought ATI.



    PC 3D gaming is nearly dead anyway in favor of consoles.







    Road warriors: get your MacBook Pro with a matte screen, SuperDrive and separate graphics now, before they are all turned into glorified glossy netbooks (MacBook Airs) that can't do squat.



    Netbook sales are predicted to increase greatly over the holidays this year.





    This is the new trend: inexpensive, stripped-down, entirely machine-produced, with all components on the logic board, wrapped in a plastic shell (or metal for premium Macs), and sold for a little less than older models, but with a hell of a lot more profit margin.



    Sad days of performance are upon us.





    iMacs with integrated graphics are going to go over like a lead balloon.



    I think Apple is going to rock the world with an iTablet/iMac combination. The tablet doubles as the monitor when it's in the iMac cradle and taps off the quad core/integrated graphics.



    Don't expect to get any good 3D performance out of the cradle though.
  • Reply 12 of 69
    Well, Apple can use Intel chipsets...and put dedicated cards on them for the GPU.



    Remember when the Mac mini used to cost less with PPC and a dedicated GPU? Now with Intel/Nvidia chipsets it costs more. *Shrugs.* So what. GPUs that do way more and cost a 'nominal' amount are out there. Heck, the recent low-end ATI card is well under £100...and it's the kind of card Apple should have in a low-end consumer desktop...



    I'd like Apple to stop taking the skinflint road and truly put the GPU first in its specs. They've been lousy with GPU choice and value for money for freakin' years.



    Lemon Bon Bon.
  • Reply 13 of 69
    Quote:

    Sad days are upon us.





    iMacs with integrated graphics are going to go over like a lead balloon.



    £950 for an iMac with integrated crappics. They're already here, I'm afraid. Versus a PC tower in PC World with a far better CPU (quad core) and a decent GPU with at least a gig of VRAM on it...with a monitor.



    As opposed to the £695 iMac with an integrated GPU they had a few years back.



    That's a £255 increase(!) in the base iMac in roughly two years?



    'Only Apple'.



    Lemon Bon Bon.
  • Reply 14 of 69
    Vista was their opportunity to really build mindshare and market share for the Mac. They could have done better.



    Margin huggers and a 'false god' consumer desktop line.



    Lemon Bon Bon.
  • Reply 15 of 69
    If you can't beat em, sue.
  • Reply 16 of 69
    charlituna Posts: 7,217 member
    Quote:
    Originally Posted by MacTripper View Post


    iMacs with integrated graphics are going to go over like a lead balloon.





    And what about iMacs with non-integrated graphics? I mean, isn't that what the geeks around here have been demanding, saying that if Apple doesn't go that way, the company is doomed?



    It is my understanding that what Intel is putting a halt to is the single chipset that the local geeks don't like. Not that Apple couldn't put two chips, one Intel and one with whatever graphics they choose, into the systems.



    So this is actually a potential forcing of Apple's hand and a win for you geeks (assuming that Apple wishes to stay with Nvidia for graphics in one form or another).



    Of course, this is a chipset that probably wouldn't go into any machines for another year, so by then the issue could be resolved.
  • Reply 17 of 69
    Quote:
    Originally Posted by twistedarts View Post


    Bring on the ATI, baby!!!!!!!! Never liked Nvidia cards on a Mac.



    I think you're confused about the difference between an integrated chipset and a graphics processing unit. Nobody is stopping Apple from picking ATI or nVidia to supply discrete GPUs.



    What's happening is Intel is preventing anyone else from making chipsets for the Core i family of CPUs.



    Intel never has and never will allow AMD (which owns ATI) to make compatible chipsets, because they compete against each other in the CPU business.



    In the past, Intel did have a licensing agreement that permitted nVidia to make chipsets for Intel processors. What's in question now is whether that agreement covers all Intel processor and bus designs or whether, as Intel claims, it only covers processors that use the Front Side Bus design to communicate with a support chip commonly referred to as the northbridge.



    Anyone wanting to use the old Core 2 series processors can either buy a matching Intel chipset or one from another company that makes them like nVidia. Apple chose nVidia because their chipset works well and includes an integrated GPU that's good enough for low to mid range customers. That saves both Apple and their customers money because one chip does it all.



    Anyone wanting to use Intel Core i series processors is required to purchase Intel chipsets and then either live with truly awful Intel graphics or spend more money and buy separate GPUs from ATI or nVidia.
  • Reply 18 of 69
    benroethig Posts: 2,782 member
    Remember, people: getting everything from one company without competition is always better.
  • Reply 19 of 69
    MacTripper Posts: 1,328 member
    Quote:
    Originally Posted by twistedarts View Post


    Bring on the ATI, baby!!!!!!!! Never liked Nvidia cards on a Mac.





    Having owned Mac towers, I've found the Nvidia cards better made than ATI ones.



    Nvidia is more focused on the upper end and quality, while ATI seemed focused on mass-produced, cheaper PC types.



    Of course, the Nvidia cards cost more and mostly came in OEM machines, as ATI would sell more to end users as upgrades.





    It's a mixed bag. I would get an Nvidia card with the machine and wind up switching to ATI over the years.





    Of course, those days are nearly over; 3D gaming is done on cheaper consoles now.
  • Reply 20 of 69
    Intel felt a little insecure about their graphics *cough-slow* processors, and was beginning to feel threatened by the emerging power of the GPGPU. I understand why they'd want to squeeze out the competition (you can read a little more about it here: http://www.siliconmadness.com/2009/0...clarkdale.html), but I'm disappointed with how Intel is dealing with it, and they come off looking like a douche. Not to mention it hurts consumers. Intel should have picked up the slack long ago and developed more competitive GPUs.



    I hope Nvidia unveils a new GPGPU running OpenCL that trounces anything Intel has to offer. There are desktop units out there that are getting close to supercomputer speeds with just a few discrete GPUs: http://www.joeandmotorboat.com/2008/...supercomputer/



    Hey Apple, I know it's probably a few years off, but how about we put a few discrete GPGPUs in an MBP and rewrite OS X entirely in OpenCL?