
Intel suit halts development of future Nvidia chipsets

post #1 of 70
Thread Starter 
Nvidia, the maker of Intel-compatible chipsets for Apple's line of Macs, has announced it will cease development of future chipsets until its suit with Intel is settled sometime in 2010.

The announcement made this week means that Nvidia has placed its nForce chipset line on hold, pending the outcome of Intel's suit against the chip maker, according to PC Magazine. Intel has alleged that a previous chipset agreement between it and Nvidia does not apply to the Core or Nehalem series of processors.

"We have said that we will continue to innovate integrated solutions for Intel's FSB architecture," said Robert Sherbin, spokesman for Nvidia. "We firmly believe that this market has a long healthy life ahead. But because of Intel's improper claims to customers and the market that we aren't licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we'll postpone further chipset investments."

A year ago, Apple officially made the move to Nvidia chipsets with the GeForce 9400M G integrated controller, a single chip of which 70 percent is devoted to graphics processing functions. It was then that Apple embraced Nvidia's MCP79 platform, married with Intel Core processors. The platform was later extended to iMacs and Mac minis.

Earlier this year, Intel sued Nvidia in an attempt to stop the company from developing compatible chipsets for future generation Intel processors. Many of Nvidia's gains -- including the partnership with Apple -- have amounted to Intel's loss.



This summer, Apple was rumored to be abandoning Nvidia chips in its Macs following a contract fight, though nothing official came of it.

Nvidia's recent announcement of the Fermi architecture, geared toward the scientific community and not PC graphics, has led some to believe that the company is changing its business strategy and moving away from the high-end gaming market. Nvidia has denied those assumptions.
post #2 of 70
Crybabies. Build a better chipset.
post #3 of 70
And this affects the rollout of new products, or present production, how?
post #4 of 70
Quote:
Originally Posted by 801 View Post

And this affects the rollout of new products, or present production, how?

Nah, nvidia already has the license it needs for the Core series I think...
post #5 of 70
Quote:
Originally Posted by xwiredtva View Post

Nah, nvidia already has the license it needs for the Core series I think...

But that license will be moot when Core i5 and i3 mobile CPUs hit the market in Q1 2010. And I don't think any of Intel's IGPs have anywhere near the performance of nVidia's GeForce 9400M. It's going to be a huge step back in GPU performance, much like when Apple moved from the dedicated GPU of the PPC Mac mini to the horrible performance of the GMA 950 in the first Intel Mac mini.
post #6 of 70
Didn't Apple buy a company to make its own chips at some point? Nvidia is just shooting itself in the foot.

post #7 of 70
If Apple was thinking of a change this summer, they must have had a plan B back then, one would think.
post #8 of 70
Maybe ATI can step up?
post #9 of 70
bring on the ATI baby!!!!!!!! Never liked nvidia cards on a mac.
post #10 of 70
Quote:
Originally Posted by xwiredtva View Post

Crybabies. Build a better chipset.

You obviously cannot read.

nVidia would be foolish to spend millions of dollars developing new chipsets they might never be allowed to sell. It's far smarter to spend their time and money developing a new product.

So if they lose in court they'll already have a new business to take the place of the PC chipset one.

If they win in court they can decide whether it's worth it to go back into competition with Intel in the chipset business. They've proven in the past that they can make better chipsets than Intel and make good profit on them so it wouldn't take long to make up for lost time.
post #11 of 70
Wouldn't it be better if they stopped using integrated graphics anyway, and moved to discrete cards in all Macs?

post #12 of 70
This is Intel wanting to take over the graphics chip market with their integrated graphics.

Expect future Macs to have considerably less performance. This move is forcing Apple's hand, but the writing has been on the wall for quite some time, ever since AMD bought ATI.

PC 3D gaming is nearly dead anyway in favor of consoles.



Road warriors: get your MacBook Pro with a matte screen, SuperDrive and separate graphics now, before they are all turned into glorified glossy netbooks (MacBook Airs) that can't do squat.

Netbook sales are predicted to increase greatly over the holidays this year.


This is the new trend: inexpensive, stripped down, all machine-produced, with everything on the logic board, wrapped in a plastic shell (or metal for premium Macs) and sold at a slightly lower price than older models, but with a hell of a lot more profit margin.

Sad days for performance are upon us.


iMacs with integrated graphics are going to go over like a lead balloon.

I think Apple is going to rock the world with an iTablet/iMac combination. The tablet doubles as the monitor when it's in the iMac cradle and taps off the quad core/integrated graphics.

Don't expect to get any good 3D performance out of the cradle though.
post #13 of 70
Well, Apple can use Intel chipsets... and put dedicated GPU cards on them.

Remember when the Mac mini used to cost less with PPC and a dedicated GPU? Now with Intel/Nvidia chipsets it costs more. *Shrugs.* So what. GPUs that do way more and cost a 'nominal' amount are out there. Heck, the recent low-end ATI card is well under £100... and it's the kind of card Apple should have in a low-end consumer desktop...

I'd like Apple to stop taking the skinflint road and truly put the GPU first in its specs. They've been lousy with GPU choice and value for money for freakin' years.

Lemon Bon Bon.

post #14 of 70
Quote:
Sad days are upon us.


iMacs with integrated graphics are going to go over like a lead balloon

£950 for an iMac with integrated crappics. They're already here, I'm afraid. Versus a PC tower in PC World with a far better CPU (quad core) and a decent GPU with at least a gig of VRAM on it... with monitor.

As opposed to the £695 iMac with an integrated GPU they had a few years back.

That's a £255 increase(!) in the base iMac inside two-ish years?

'Only Apple'.

Lemon Bon Bon.

post #15 of 70
Vista was their opportunity to really build mindshare and market share for the Mac. They could have done better.

Margin huggers and a 'false god' consumer desktop line.

Lemon Bon Bon.

post #16 of 70
If you can't beat em, sue.
post #17 of 70
Quote:
Originally Posted by MacTripper View Post

iMacs with integrated graphics are going to go over like a lead balloon


And what about iMacs with non-integrated graphics? I mean, isn't that what the geeks around here have been demanding, and saying that if Apple doesn't go that way, the company is doomed?

It is my understanding that what Intel is putting a halt to is the single-chip set that the local geeks don't like, not that Apple couldn't put two chips, one Intel and one of whatever graphics they choose, into the systems.

So this is actually a potential forcing of Apple's hand and a win for you geeks (assuming that Apple wishes to stay with Nvidia for graphics in one form or another).

Of course, this is a chipset that probably wouldn't go into any machines for another year, so by then the issue could be resolved.

post #18 of 70
Quote:
Originally Posted by twistedarts View Post

bring on the ATI baby!!!!!!!! Never liked nvidia cards on a mac.

I think you're confused about the difference between an integrated chipset and a graphics processing unit. Nobody is stopping Apple from picking ATI or nVidia to supply discrete GPUs.

What's happening is Intel is preventing anyone else from making chipsets for the Core i family of CPUs.

Intel never has and never will allow AMD (which owns ATI) to make compatible chipsets, because they compete against each other in the CPU business.

In the past Intel did have a licensing agreement that permitted nVidia to make chipsets for Intel processors. What's in question now is whether that agreement covers all Intel processor and bus designs or whether, as Intel claims, it only covered processors that used the Front Side Bus design to communicate with a support chip commonly referred to as the northbridge.

Anyone wanting to use the old Core 2 series processors can either buy a matching Intel chipset or one from another company that makes them, like nVidia. Apple chose nVidia because their chipset works well and includes an integrated GPU that's good enough for low- to mid-range customers. That saves both Apple and their customers money because one chip does it all.

Anyone wanting to use Intel Core i series processors is required to purchase Intel chipsets and then either live with truly awful Intel graphics or spend more money and buy separate GPUs from ATI or nVidia.
post #19 of 70
Remember, people: getting everything from one company without competition is always better.
post #20 of 70
Quote:
Originally Posted by twistedarts View Post

bring on the ATI baby!!!!!!!! Never liked nvidia cards on a mac.


Having owned Mac towers, I've found the Nvidia cards better made than ATI ones.

Nvidia is more focused on the upper end and quality; ATI seemed to be focused on mass-produced, cheaper PC types.

Of course, the Nvidia cards cost more and came in fewer, mostly OEM, machines, as ATI would sell more to end users as upgrades.


It's a mixed bag. I would get an Nvidia card with the machine and wind up switching to ATI over the years.


Of course, those days are nearly over; 3D gaming is done on cheaper consoles now.
post #21 of 70
Intel felt a little insecure about their graphics *cough-slow* processors, and was beginning to feel threatened by the emerging power of the GPGPU. I understand why they'd want to squeeze out the competition (you can read a little more about it here: http://www.siliconmadness.com/2009/0...clarkdale.html), but I'm disappointed with how Intel's dealing with it, and they come off looking like a douche. Not to mention it hurts consumers. Intel should have picked up the slack long ago and developed more competitive GPUs.

I hope Nvidia unveils a new GPGPU running OpenCL that trounces anything Intel has to offer. There are desktop units out there that are getting close to supercomputer speeds with just a few discrete GPUs: http://www.joeandmotorboat.com/2008/...supercomputer/

Hey Apple, I know it's probably a few years off, but how about we put a few discrete GPGPUs in an MBP and rewrite OS X entirely in OpenCL?
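
For anyone curious what "writing it in OpenCL" actually looks like, here's a rough vector-add sketch in C. It's only an illustration under my own assumptions (the kernel, names and build line are made up, not anything from Apple or Nvidia); on Snow Leopard it should build with something like gcc vecadd.c -framework OpenCL.

Code:
/* Minimal OpenCL sketch: add two arrays on whatever GPU (or CPU) the
 * driver exposes.  Error checking omitted for brevity. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *kSource =
    "__kernel void vec_add(__global const float *a,\n"
    "                      __global const float *b,\n"
    "                      __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    int i;
    for (i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    /* Prefer a GPU; fall back to the CPU device if there isn't one. */
    if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Build the kernel from source, then copy the input arrays to the device. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vec_add", NULL);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    /* One work-item per element, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]);
    return 0;
}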
post #22 of 70
Quote:
Originally Posted by xwiredtva View Post

Crybabies. Build a better chipset.

They did and Apple dumped Intel's chipsets for them. Intel just won't allow anyone else to build chipsets for their chips anymore. Nvidia would have to make their own CPUs.
post #23 of 70
Quote:
Originally Posted by Bregalad View Post

You obviously cannot read.

nVidia would be foolish to spend millions of dollars developing new chipsets they might never be allowed to sell. It's far smarter to spend their time and money developing a new product.

So if they lose in court they'll already have a new business to take the place of the PC chipset one.

If they win in court they can decide whether it's worth it to go back into competition with Intel in the chipset business. They've proven in the past that they can make better chipsets than Intel and make good profit on them so it wouldn't take long to make up for lost time.

I was referring to Intel.
post #24 of 70
Quote:
Originally Posted by BenRoethig View Post

Remember, people: getting everything from one company without competition is always better.

Are you serious?!?! Competition encourages businesses to provide better products/ services... while trying to become the most efficient, thus reducing costs/ prices.

If you got everything from one company without competition, they wouldn't feel compelled to make anything better, and they'd jack up the price because you wouldn't have other choices.

Please tell me you were being sarcastic
post #25 of 70
Quote:
Originally Posted by BenRoethig View Post

They did and Apple dumped Intel's chipsets for them. Intel just won't allow anyone else to build chipsets for their chips anymore. Nvidia would have to make their own CPUs.

Or put AMD CPUs in bed with nVidia... Can you hear the screaming yet?
post #26 of 70
This is Nvidia confidently committing full support to ARM versus an immediate shutdown of support for x86 (AMD included, along with Intel).

Nvidia has probably already won sockets in the new iTablet (ARM), and when that iTablet ships, the performance gap (speed and efficiency) between ARM/iTablet and x86/Mac will likely be very noticeable.

Nvidia is just doubling down on ARM, and Intel will cry uncle soon after the iTablet testbed numbers are published.
post #27 of 70
Quote:
Originally Posted by 801 View Post

And this affects the rollout of new products, or present production, how?

This will not affect anything Apple currently uses (the 9400M chipset). It means that Nvidia will not make any chipsets for future Intel processors (i3/i5/i7), something we already knew.

So, come next year, it will be back to Intel integrated graphics for the Mac mini and MacBook.
post #28 of 70
Quote:
Originally Posted by xwiredtva View Post

Or put AMD CPUs in bed with nVidia... Can you hear the screaming yet?

Nvidia and AMD used to be big partners, but NV hasn't done much work on modern chipsets for AMD since AMD merged with ATI. Supporting AMD would be supporting their chief graphics competitor, and so far they have refused to do it.
post #29 of 70
Quote:
Originally Posted by FuturePastNow View Post

This will not affect anything Apple currently uses (the 9400M chipset). It means that Nvidia will not make any chipsets for future Intel processors (i3/i5/i7), something we already knew.

So, come next year, it will be back to Intel integrated graphics for the Mac mini and MacBook.

Or, the introduction of the MacBiggie, which has real upgradeable cards instead of integrated graphics. The MacBiggie will be 10x10x4, have 2x HDMI and a quad-core for $1199. It's gonna be awesome. I saw it on the internet.
post #30 of 70
Quote:
Originally Posted by MacTripper View Post

PC 3D gaming is nearly dead anyway in favor of consoles.

Wrong, actually... It's not nearly dead, or on the way out or anything.
post #31 of 70
Quote:
Originally Posted by BenRoethig View Post

They did and Apple dumped Intel's chipsets for them. Intel just won't allow anyone else to build chipsets for their chips anymore.

Seems a tad anti-competitive.
post #32 of 70
Quote:
Originally Posted by chronster View Post

Wrong, actually... It's not nearly dead, or on the way out or anything.

Yes it is.

For example, all the retail game stores used to sell PC and Mac games; now they sell only console games.

Computer processors are now less powerful and multi-core to reduce heat, since they can't get them to go any faster, which doesn't lend itself very well to advances in 3D game programming: very few functions in a constantly changing game engine can be passed off onto other cores and realize a performance gain.


The Xbox uses three PowerPC G5 processors; the PS3 uses a Cell processor with up to nine cores. Lots of heat for lots of performance.

Computers are trending toward cooler, more portable designs with integrated graphics, like netbooks.

Computers are designed to do a lot of different things at once; a 3D console is designed to do one thing very well.

A 3D gaming console can be had for a few hundred; it would take a few thousand for a gaming PC even close to a console in performance. This means more people can afford a 3D gaming device, thus more games.
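
To put some code behind the multi-core point above, here's a toy sketch of a frame loop (my own illustration, with made-up names and numbers, not taken from any real engine): the independent per-entity physics splits cleanly across threads, while the order-dependent game logic stays on one core. Build with something like gcc game.c -lpthread.

Code:
/* Toy frame loop: per-entity physics integration parallelizes cleanly,
 * the rest of the frame stays on one thread. */
#include <pthread.h>
#include <stdio.h>

#define ENTITIES 1000
#define THREADS  2

static float positions[ENTITIES], velocities[ENTITIES];

struct slice { int begin, end; };

/* No cross-entity dependencies here, so this is the easy part to thread. */
static void *integrate(void *arg) {
    struct slice *s = arg;
    int i;
    for (i = s->begin; i < s->end; ++i)
        positions[i] += velocities[i] * 0.016f;   /* one 60 Hz tick */
    return NULL;
}

int main(void) {
    pthread_t tid[THREADS];
    struct slice s[THREADS];
    int i, t, frame;

    for (i = 0; i < ENTITIES; ++i) { positions[i] = 0.0f; velocities[i] = (float)i; }

    for (frame = 0; frame < 3; ++frame) {
        /* Fan the physics work out across the cores... */
        for (t = 0; t < THREADS; ++t) {
            s[t].begin = t * ENTITIES / THREADS;
            s[t].end   = (t + 1) * ENTITIES / THREADS;
            pthread_create(&tid[t], NULL, integrate, &s[t]);
        }
        for (t = 0; t < THREADS; ++t)
            pthread_join(tid[t], NULL);

        /* ...while game logic, AI and rendering submission are order-dependent
         * and typically stay single-threaded -- the bottleneck described above. */
        printf("frame %d: entity 999 at %.1f\n", frame, positions[999]);
    }
    return 0;
}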
post #33 of 70
Quote:
Originally Posted by MacTripper View Post

Yes it is.

For example, all the retail game stores used to sell PC and Mac games; now they sell only console games.

Computer processors are now less powerful and multi-core to reduce heat, since they can't get them to go any faster, which doesn't lend itself very well to advances in 3D game programming: very few functions in a constantly changing game engine can be passed off onto other cores and realize a performance gain.


The Xbox uses three PowerPC G5 processors; the PS3 uses a Cell processor with up to nine cores. Lots of heat for lots of performance.

Computers are trending toward cooler, more portable designs with integrated graphics, like netbooks.

Computers are designed to do a lot of different things at once; a 3D console is designed to do one thing very well.

A 3D gaming console can be had for a few hundred; it would take a few thousand for a gaming PC even close to a console in performance. This means more people can afford a 3D gaming device, thus more games.

Steam seems to be doing fine.

I don't play as many games as before, but there is more quality and less quantity these days, unlike consoles.
post #34 of 70
Intel should be like "Hey, look what Nvidia is doing with our chipset! Turning it into gold! That's sweet, they're helping Intel look better than we can ourselves. Win-win!"
post #35 of 70
Quote:
Originally Posted by uniqueness-template View Post

Seems a tad anti-competitive.

You think?

Quote:
Originally Posted by CdnBook View Post

Are you serious?!?! Competition encourages businesses to provide better products/ services... while trying to become the most efficient, thus reducing costs/ prices.

If you got everything from one company without competition, they wouldn't feel compelled to make anything better, and they'd jack up the price because you wouldn't have other choices.

Please tell me you were being sarcastic

I was being sarcastic. It just has parallels with another situation close to us.
post #36 of 70
Quote:
Originally Posted by palegolas View Post

Intel should be like "Hey, look what Nvidia is doing with our chipset! Turning it into gold! That's sweet, they're helping Intel look better than we can ourselves. Win-win!"

Or they're thinking, "that money could be ours".
post #37 of 70
D: Sorry to bring this up again, but I guess I'm not going to be playing Diablo III on an MBP at the rate things are going?
post #38 of 70
Reading through the thread, it seems some folks are confused about what is going on with the lawsuit. This graphic gives a decent outline of what Intel is doing to change its architecture with Westmere. IMO Intel is purposely leaving Nvidia out of the mix. The key point is that Nvidia will no longer provide a northbridge controller/GPU for Westmere.



The second question is how that affects Apple. Apple was using the 9400M to improve graphics performance in designs that didn't use discrete graphics, so we need to compare the next-gen Intel graphics against Nvidia's solution. Intel was licensing SLI from Nvidia, so any system with discrete graphics will see little change from the current situation.



The iMac and MacBook Pro have included the option for discrete graphics, so this really affects the Mac mini and MacBook of the future. If you're buying those machines, you probably are not looking for a graphics powerhouse. If nothing else, there is a die shrink involved, though not all the way to 32nm. While Westmere processors will be 32nm, the graphics, and other functions that used to be done by the northbridge, will be made on a 45nm process. The smaller transistors enable much higher performance. While the G45 had 10 shader cores, the new Intel GPU increases that to 12. A number of performance-limiting issues have now been resolved, so we should see much more competitive performance from Intel's graphics vs. the 9400M.

Hopefully this all comes with a lower price.
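
If anyone wants to see what their own machine reports, here's a rough C sketch that simply asks OpenCL which devices exist and how many compute units each claims. How "compute units" maps onto Intel's shader cores is my assumption here, and the driver's numbers won't line up exactly with marketing core counts.

Code:
/* List the OpenCL devices the driver exposes, with their compute-unit
 * counts and memory sizes.  Illustrative only; error checks omitted. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id plat;
    cl_device_id devs[8];
    cl_uint ndev = 0, i;

    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_ALL, 8, devs, &ndev);

    for (i = 0; i < ndev; ++i) {
        char name[256];
        cl_uint units = 0;
        cl_ulong mem = 0;
        /* Name, parallel compute units, and global memory for each device. */
        clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        clGetDeviceInfo(devs[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof units, &units, NULL);
        clGetDeviceInfo(devs[i], CL_DEVICE_GLOBAL_MEM_SIZE,
                        sizeof mem, &mem, NULL);
        printf("%s: %u compute units, %llu MB\n",
               name, (unsigned)units, (unsigned long long)(mem >> 20));
    }
    return 0;
}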
post #39 of 70
Quote:
Originally Posted by MacTripper View Post

Yes it is.

For example, all the retail game stores used to sell PC and Mac games; now they sell only console games.

As mentioned, PC games, and console games for that matter, are moving to digital distribution. The reason store shelves are filled with console games is that the used-game market makes retailers more money, and you can't resell a PC game.

Quote:
Computer processors are now less powerful and multi-core to reduce heat, since they can't get them to go any faster, which doesn't lend itself very well to advances in 3D game programming: very few functions in a constantly changing game engine can be passed off onto other cores and realize a performance gain.

Less powerful than what, exactly? I'm sure a single Nehalem core can beat a single Xenon core any day of the week. Or a single core of any other consumer processor.

Quote:
The Xbox uses three PowerPC G5 processors; the PS3 uses a Cell processor with up to nine cores. Lots of heat for lots of performance.

As I'm sure you know, even the initial Core Duo processors were faster per clock than the PPC970, and the SPEs of the Cell are certainly not comparable to a fully featured core. The graphics cards in the consoles are almost four generations old. Frankly, it's silly to expect a $200 box to be able to outperform a $1000 gaming PC.

Quote:
Computers are trending toward cooler, more portable designs with integrated graphics, like netbooks.

Perhaps so, but that is a different market segment. Why would a gamer trend toward a netbook? They're a cheap novelty.

Quote:
Computers are designed to do a lot of different things at once; a 3D console is designed to do one thing very well.

Not unless you're still living in 2002. The current trend is for consoles to act as media extenders and/or media hubs.

Quote:
A 3D gaming console can be had for a few hundred; it would take a few thousand for a gaming PC even close to a console in performance. This means more people can afford a 3D gaming device, thus more games.

Really? I can put together a computer for less than $500 that can beat today's consoles on performance, crappy console ports aside. Remember, the Xbox 360 and PS3 play at low to medium resolutions.
post #40 of 70
Quote:
Originally Posted by MacTripper View Post

Yes it is.

For example, all the retail game stores used to sell PC and Mac games; now they sell only console games.

Computer processors are now less powerful and multi-core to reduce heat, since they can't get them to go any faster, which doesn't lend itself very well to advances in 3D game programming: very few functions in a constantly changing game engine can be passed off onto other cores and realize a performance gain.


The Xbox uses three PowerPC G5 processors; the PS3 uses a Cell processor with up to nine cores. Lots of heat for lots of performance.

Computers are trending toward cooler, more portable designs with integrated graphics, like netbooks.

Computers are designed to do a lot of different things at once; a 3D console is designed to do one thing very well.

A 3D gaming console can be had for a few hundred; it would take a few thousand for a gaming PC even close to a console in performance. This means more people can afford a 3D gaming device, thus more games.

Quote:
Originally Posted by al_bundy View Post

Steam seems to be doing fine.

I don't play as many games as before, but there is more quality and less quantity these days, unlike consoles.

Several things here.

Steam has a very limited game selection and only offers the PC versions, even of games that had Mac versions. Also, there is the issue of the verification method Steam uses for those games.

MacTripper is right in that console gaming has stomped PC gaming into the ground. Go to Best Buy, Hastings, GameStop, or even Walmart and compare the number and type of console games to PC games.
Go online and the reality that Sturgeon's Law is alive and well in the PC gaming market is obvious, especially if you look at the cheap Flash and game-creation-program market. The good-to-crap ratio is just as bad in the PC world as on the consoles--you just don't see it as much in the brick-and-mortar stores, as they will go with what sells. So you get a mixture of hits with casual 'why did they make this' titles thrown in.