There's one point that makes Intel buying Nvidia less interesting than it sounds. Intel would have gotten access to competitive high-end cards... NOT. Not until Fermi, and not until Fermi is proven able to scale downwards.
The flagship GTX260, 275, 285, 295 and the well-performing GTS250 (9800GTX+ more or less) are all very popular among the fanboys and also among the moderate gamers/ 3D enthusiasts/ CUDA users.
Nvidia is not interested in supplying more of these (and they have said so themselves) until Fermi. Calendar Q1 2010.
From Fudzilla:
http://www.fudzilla.com/content/view/16767/1/
No high-end to make money
Nvidia's partners are in trouble, and it's not an isolated case. They depend on good sales of high-end cards, and that part of the market is nonexistent at this time. You simply cannot buy any high-end GT200-based chips from Nvidia, and this has been the case since August.
Selling mainstream cards like the Geforce 210 and Geforce GT 220 is not something most partners can make money from. In fact, they make only pennies on such cards, and the cards don't fare well against midrange options like the Geforce 9500GT and 9400GT, which usually end up cheaper on the market.
Geforce GT 240 is selling at acceptable levels, but again partners don't have GTX 260, GTX 285 and GTX 295 cards in stock, something that would help them make some profit.
To further clarify, some partners might actually go under. We don't want to name any names at this point, but particularly in Europe, some big names might disappear while waiting for Fermi.
IT STARTED SOONER THAN I EXPECTED. With Larrabee dead-ish, I mentioned just a short while ago that Intel would start putting muscle behind their rubbish integrated GPUs, particularly with parts like Arrandale and Clarkdale where, guess what, the GPU comes bundled with the CPU; resistance is futile.
In such a short amount of time, the Intel hype engine has been engaged.
http://www.xbitlabs.com/news/video/d...ort_GPGPU.html
According to an Intel spokesman, its next-generation Clarkdale central processing unit with integrated graphics core will support video encoding using the graphics processing engine via a driver update. Despite the lack of initial support for the feature, which will be added through a driver update, there is confidence in Intel's forthcoming integrated desktop platform.
Guess what, now Intel does GPGPU! W00t. Now with video encoding that's been on ATI and Nvidia platforms for, hmm... a while now... Will the Intel IGP be fully OpenCL compliant? Support DX10.1 at anything more than 5 frames per second? Maybe, who knows? Who cares? Intel graphics FTW...!!!
Notice the best part: It's all coming... In a driver update. ROFLMA RIGHT OFF.
Not that Larrabee was ever supposed to be in Clarkdale and Arrandale, but the shine has definitely worn off Intel a bit, so they are trying to counter that by talking up the IGP on the CPUs launching in a few weeks. The mobile market is huge, and Intel is going to want to stuff as many 32nm chips and chipsets into it in 2010 as it can.
The marketing juggernaut has taken off and is at your doorstep.
Are you mostly having this conversation with yourself?
It's kinda like Twitter. Except I get to talk much more.
Quote:
Originally Posted by mdriftmeyer
Forgive me for not having the link but Cringely is saying that Intel wants to buy NVidia. That would bring up huge Anti-Trust ramifications as Intel would have a stranglehold on both the GPU and CPU markets.
Well, that's the really amazing thing... Intel has totally slipped under the Anti-Trust radar here... OK, not really, they have settled a lot of stuff with AMD.
But Intel is doing it again.
(A) Chipsets: Nobody can make chipsets for Intel CPUs from 2010, more or less (Core 2 is being phased out early next year, of course, with Clarkdale and Arrandale). Maybe AMD could, but they sure as heck won't. Intel gave Nvidia the shaft and locked them out. Think about it: the most dominant CPU company will now also be the most dominant chipset manufacturer.
(B) Graphics: Intel is forcing you to accept a GPU part along with what is predominantly a CPU part. You have no choice, more or less, with Clarkdale and Arrandale. It's not so bad now because the GPU is a separate 45nm die in the same package as the CPU. When it's on-die (Sandy Bridge), there's really no turning back. So it's a *huge* abuse of dominance. Imagine you're Apple or HP and you want a decent AMD/ATI or Nvidia GPU. Guess what, you have to suck it up while Intel force-bundles a junk GPU with what you actually want, a good CPU. You have to fiddle with it (hybrid mode or something, if that is even possible) or disable it outright when adding a discrete GPU. For the mid to low end, this is a total lockout of AMD/ATI and Nvidia. There's no bargaining AFAIK (apart from rumours): Intel says, hey, you want the fastest CPUs per watt out there, you've got to take this other part; take it or leave it entirely.
Intel settling all that stuff with AMD is good, but it is also a bit of a head fake to slip their chipset and graphics bullying under the radar.
I wonder if alarm bells are going off at the usual Department of Justice, European Commission, or whatever, or if they are still dealing with the Intel vs AMD stuff.
As I understand it, Nvidia isn't even bothering to compete with Intel or AMD/ATI now; they've said, f*** it, we're not going to make chipsets anymore beyond Core 2. They're not going to make chipsets for AMD CPUs. They're not making high-end GPUs until Fermi trickles down to the consumer level.
2010 may be a little more interesting than I initially suspected.
Laptop manufacturers are putting the GT 240M in with the mobile Core i7. The 13" motherboard will be a bit cramped but they just need to drop the optical drive. I'd much rather have a GeForce GT 130/230/240M or 9600M GT that I use regularly than an optical drive I use maybe once a month.
Even if they make two models like they do with the Mini server. MacBook option 1 is Arrandale with an optical drive and crappy Intel graphics; option 2 is Arrandale with a 9600M GT or similar and no optical drive. They can even push the motherboard to both sides without the optical drive and offer more ports on option 2, e.g. 4 x USB 2.
Option 1 might actually sell better and the majority of Apple's sales are the 13" machines.
Quote:
Originally Posted by Marvin
Laptop manufacturers are putting the GT 240M in with the mobile Core i7. The 13" motherboard will be a bit cramped but they just need to drop the optical drive. I'd much rather have a GeForce GT 130/230/240M or 9600M GT that I use regularly than an optical drive I use maybe once a month.
Even if they make two models like they do with the Mini server. MacBook option 1 is Arrandale with an optical drive and crappy Intel graphics; option 2 is Arrandale with a 9600M GT or similar and no optical drive. They can even push the motherboard to both sides without the optical drive and offer more ports on option 2, e.g. 4 x USB 2.
Appreciate the input and keeping this thread alive.
Now, however, you guys are willing to settle for this?
Firstly, the 130/230/240M/9600 are all rebrands/die-shrinks of years-old technology. ATI's 40nm parts, particularly the 5-series mobile GPUs, should deliver a class above at similar price points and should be strongly considered. At the very least it's clearly new tech/architecture.
Secondly, if the MacBook goes with Option 1 above, then AFAIK you're going to lose OpenCL support and go back to worse graphics, a step back of about two years.
I'm beginning to sound like a certain Teckstud, but I'm not accepting a MacBook/Pro with anything less than a 40nm 5-series ATI GPU that can deliver Nvidia GTX260/ATI 4830-class performance (about 30-50% better than the 9600M GT, very roughly). Maybe it's a good thing; it will stop me from buying a new Mac next year. Maybe I'm just spoilt by having played games with desktop GPUs.
The graphics in the MacBook Pro are now starting to get really old for "pro-level", or maybe the consumer is not so demanding in that regard. By early 2010 the MacBook Pro 15" (ideally) should be able to deliver Nvidia GTX260 or ATI 4830-class performance within the same thermals as the 9600M GT. Nvidia's not going to be able to deliver this because the GTX260, 280 etc. tech cannot be made into mobile form, and there is no research in this direction as they move to Fermi. Even the 240M or so is a rebranded/die-shrunk Nvidia 9500 ~ 9800. Apologies if I don't have the exact details.
For Montevina (Core 2) discrete GPUs it seems ATI may have the upper hand:
http://www.fudzilla.com/content/view/16805/1/
But surprisingly, according to Fudzilla, the Calpella (Arrandale) discrete GPU market will be 80% dominated by Nvidia, because ATI was (and still is) in the process of readying their 40nm 5-series Mobility Radeon GPUs:
http://www.fudzilla.com/content/view/16804/1/
There have been leaks about HP using 5-series Mobility Radeons:
http://www.engadget.com/2009/12/09/h...u-and-netflix/
"Kicking off the new year in style will be Pavilion desktops featuring a choice between ATI's Radeon HD 5350 (code named Evora Cedar), which will have HDMI, DVI and VGA ports along with 1GB of onboard memory, or the juicier Radeon HD 5570 (aka Jaguar), which bids adieu to VGA in favor of DisplayPort and bumps up the memory allowance to 2GB."
That's right, a totally overkill 2GB of VRAM on a mainstream GPU part.
Will this range of ATI GPUs be used in Mac laptops next year? Will there be enough chips available for laptop manufacturers? Will Apple stick with Nvidia GPUs with Arrandale? Interesting times.
Intel's Larabee is DEAD* - What this means for future Macs
*Disclaimer: Okay, it's not dead. If you believe Intel. But now that I've got your attention... there's still big news:
http://anandtech.com/weblog/showpost.aspx?i=659
http://anandtech.com/cpuchipsets/showdoc.aspx?i=3686
1. Intel's Integrated Graphics are rubbish.
2. Larrabee is supposed to be the next fantastic GPU out of Intel.
3. 2010 and onwards will result in Intel CPUs packaged or on-die with Intel GPUs.
Which means:
On your Mac laptops, welcome back to Intel Integrated Graphics. Just when OpenCL and GPGPU were supposed to take off.
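To make the OpenCL point concrete, here's a rough sketch of what "OpenCL support" means from an application's point of view: the app enumerates compute devices and takes the GPU if one shows up. This is an illustration only, assuming the third-party pyopencl bindings; it's not anything Apple ships, and the device names in the comments are just examples.
Code:
# Minimal sketch, assuming the third-party pyopencl bindings are installed.
# An app enumerates OpenCL devices and prefers a GPU; an Intel IGP with no
# OpenCL driver leaves only the CPU fallback.
import pyopencl as cl

def pick_compute_device():
    gpus, cpus = [], []
    for platform in cl.get_platforms():
        try:
            gpus += platform.get_devices(device_type=cl.device_type.GPU)
        except cl.Error:
            pass  # this platform exposes no GPU devices
        try:
            cpus += platform.get_devices(device_type=cl.device_type.CPU)
        except cl.Error:
            pass  # or no CPU devices
    if gpus:
        return gpus[0]  # e.g. a 9400M/9600M GT or an ATI part would land here
    return cpus[0] if cpus else None  # IGP-only machine: back to the CPU

device = pick_compute_device()
print("OpenCL work goes to:", device.name if device else "nothing available")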
Poor Apple. The 9400M was a good offering and still is after more than a year. Moving to Arrandale (Core i3, i5, etc.) for Mac laptops, Apple now has to either bite the bullet and accept worse-than-9400M graphics (Arrandale will have something similar to Intel's X4500 GPU)... or bite the thermals and put in an Nvidia or ATI *discrete* GPU.
Well, thanks to Intel f*king Nvidia in the a55 and locking them out of the chipset business, Apple now loses out. 9400M MacBooks and MacBook Pros are looking like decent investments going into the next two years.
It will be very interesting to see what Apple brings out with their Mac laptops in 2010. Arrandale has distinct speed advantages, but the thermals and a crappy integrated GPU will be a bitter pill to swallow. Don't forget as well that even on the discrete side the 9600M GT is an old, old architecture in GPU terms, since it is more or less a rebranded 8600, an architecture that goes back at least three years.
It's funny, on one side you've got people saying, who needs GPUs? Nobody really plays demanding 3D games and most good games are all playable on consoles anyway.
Then on the other side you've got everybody pushing GPGPU as the best thing since x86 was dreamed up. In a few short years Microsoft Word might even be GPU-accelerated, if you were delusional enough to believe the current hype (not to say that Microsoft isn't going to pull this out of its hat... er, I mean, butt).
Then Nvidia says, forget all these mainstream GPUs for desktops and laptops, we'll just make a behemoth and sell it to the high-performance-computing niche.
Somebody please help with my sanity. WTF IS GOING ON ??????
What's going on?
Er...Nvidia are not on top of their game. Ati seem to be. Tables have turned.
I can't see Apple using Intel's crappics.
People don't have money to spend on £300-500 house warmer gpus?
Ati got bought. Personally, I felt Nvidia getting bought by Intel was a matter of time. Nvidia are getting squeezed and Intel will get a better price if Nvidia stock drops?
I'm not sure this affects Apple that much. They used discrete gpus in the days of PPC. And I don't see why they can't now. Ati offer plenty of choice. Apple's historic gpu partner since the 'Ati Rage Alien Face Sucker' gpu days. (If we think the 4850 in the iMac was a side grade...remember the days when Apple would have the same gpu in their products for what seemed like years? Nvidia was a distant dream way back then...)
Larrabee isn't 'alive' to consumers. So I'm not sure I can call it 'dead.' Sure, more competition would be good in the gpu space. And Larrabee seemed interesting. But if gpu computing is that important going forwards... why not just buy Nvidia with pocket change? Lots of engineering talent. Expertise in GL drivers. OpenGL IS important on the iPhone, and computing is going truly mobile, not honking great big dinosaur towers with wind-tunnel coolers on £500 gpus. Why buy one when you can have a PS3 for half the price?
But it's not like Apple uses the greatest and latest choice of gpus. Or offers said choice. They have an eclectic use of gpus, frequently stale and out of date... often with underperforming parts with a lack of vram compared to the competition.
Lemon Bon Bon.
Quote:
Originally Posted by nvidia2008
Firstly, the 130/230/240M/9600 are all rebrands/die-shrinks of years-old technology. ATI's 40nm parts, particularly the 5-series mobile GPUs, should deliver a class above at similar price points and should be strongly considered. At the very least it's clearly new tech/architecture.
Power usage and cost would be important factors, but if the ATI cards can perform better, consume the same or less, and cost the same or less, then it makes perfect sense to use them instead. Apple and NVidia seemed to be forming a relationship for a brief period, but then the desktop line switched back to ATI cards; none of the new iMacs offer dedicated NVidia cards.
Quote:
Originally Posted by nvidia2008
Secondly, if the MacBook goes with Option 1 above, then AFAIK you're going to lose OpenCL support and go back to worse graphics, a step back of about two years.
I think the new Intel chips are supposed to support GPGPU processing; the performance will be poor, but that's where the other option's selling point comes in. It's a case of "what do you want, a good GPU or an optical drive?" instead of "here's a DVD drive (when others have Blu-Ray) and poor graphics, and that's it."
Quote:
Originally Posted by nvidia2008
Will this range of ATI GPUs be used in Mac laptops next year? Will there be enough chips available for laptop manufacturers? Will Apple stick with Nvidia GPUs with Arrandale? Interesting times.
I'd much prefer it if it were a case of NVidia having great-value, high-performance mobile chips that Apple can use, and ATI too, so either way we get a good deal. The current situation is not quite so certain.
I just hope the Larrabee experience gave Intel a wake-up call and they stop trying to hold the real graphics companies down.
... It's a case of "what do you want, a good GPU or an optical drive?" instead of "here's a DVD drive (when others have Blu-Ray) and poor graphics, and that's it"...
Man, the next refresh of MacBook Pros had better have Blu-Ray drives, at the very least as a custom configuration. "Bag of Hurt" won't cut it in 2010.
I don't see it being a big competitor to the iTunes Store, because, well, Apple has to decide whether Macs are more important than the iTunes Store. I think Blu-Ray as an option on the 15" and 17" MacBook Pros strikes a nice balance of "It's there if you want it, but we're not saying ignore the iTunes Store."
People want their choices in this media mashup over the next few years. Physical, digital, social.
Also, Apple *has to* bring in Blu-Ray support in Final Cut Studio somehow, right??? What the heck is going on there? I mean, an end-to-end ingest-to-Blu-Ray solution for the Mac Pro and MacBook Pro 15" and 17" should be critical for what Apple wants to do. Heck, imagine if iMovie '10 could burn Blu-Ray discs. Maybe globally Blu-Ray adoption is not as high as we think. But anyway, that's a whole 'nother thread... I know FCS can output files that you then send off to Blu-Ray disc producers. But for pros, burning Blu-Ray on your Mac would be great. Again, a whole 'nother thread on Apple and the Pro market.
...But it's not like Apple uses the greatest and latest choice of gpus. Or offers said choice. They have an eclectic use of gpus, frequently stale and out of date...often with underperforming parts with a lack of vram compared to the competition...
Yeah, it has been eclectic; that's an interesting way of putting it. They tried to get away with too many 9400M integrated GPU models in the previous iMac range. Before that, years of GMA 950 were horrible, absolutely horrible.
However, the 9400M in the 13" MacBook/Pro is fine if you don't play any demanding games, and OpenCL and hardware H.264 decoding show promise. As for the iMac 27", the ATI 4850 512MB is reasonably decent for an all-in-one.
The current MacBook Pro 15" with the 9600M GT and 256MB of VRAM is starting to move into "stale" territory. But... we would have to look at users of the MBP 15" and 17": how many are gamers or need more powerful GPUs?
I think my mind is starting to go round in circles with the whole GPU scene. On one hand, there's the promise of "Hey, it's going to rock your GPGPU world, so, get on board with a sexy GPU"... On the other hand, there really aren't the killer apps or even many apps.
I think the signs point to Apple turning to discrete (albeit low-end) GPUs in the Macbook and Mac mini when Nvidia chipsets cease to be an option. Those will probably be the Geforce G210M or GT220M.
Yeah, it has been eclectic, that's an interesting way of putting it.
I was being polite. See my signature.
Er... but with Ati's next round of gpu updates and a thermal 'shrink', I'd be hoping for a better gpu in the next iMac refresh that can push that 2560x1440 display better. And with 1 gig of VRAM. Even if their die shrink only offers 4870-style performance, that wouldn't be too bad. But I'd be hoping for something better than that. They have 27 inches of space to spread the gpu heat across now...
Sure, Apple's gpu pace is glacial, but even following that pace...we'll have Crysis running relatively smoothly on a top end iMac inside two years? I'd take that.
I've run Champions Online fairly smoothly at native resolution on my iMac with settings at medium, and the 8800 GS is how old now? At least a couple of years. It's old tech. It runs the Resident Evil 5 demo at 30 fps native too. Looks good. The 4850 is a decent card. But it is low end now. And has been for over a year.
By the time Apple finally releases a better gpu for the iMac, we should be able to comfortably run games/high-end apps at 2560x1440 at medium to high settings inside the next 2 years. Maybe even next year. Some may argue the 4850 already can. But I seem to recall benches that showed it running out of breath at higher resolutions. Ergo, why I'll wait until next year or the year after before upgrading this iMac. It's more than good enough.
Lemon Bon Bon.
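As a rough aside on what "pushing that display" actually involves, here's the raw pixel math; nothing is assumed beyond the published panel resolutions:
Code:
# Pixel-count comparison: the 27" iMac's native 2560x1440 versus common panels.
resolutions = {
    'MacBook Pro 15" (1440x900)': 1440 * 900,
    '1680x1050': 1680 * 1050,
    '1920x1080 (Full HD)': 1920 * 1080,
    'iMac 27" (2560x1440)': 2560 * 1440,
}
imac = resolutions['iMac 27" (2560x1440)']
for name, pixels in resolutions.items():
    ratio = imac / pixels
    print(f"{name}: {pixels / 1e6:.2f} MP (2560x1440 pushes {ratio:.2f}x as many pixels)")
So 2560x1440 is roughly 1.8x the pixels of 1080p and nearly 3x a 15" MacBook Pro panel, which squares with those benches showing the 4850 running out of breath at higher resolutions.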
I think my mind is starting to go round in circles with the whole GPU scene. On one hand, there's the promise of "Hey, it's going to rock your GPGPU world, so, get on board with a sexy GPU"... On the other hand, there really aren't the killer apps or even many apps.
Yeah, you're telling me. I've pulled out what seems like what's left of my hair over Apple and GPUs. Does it really matter?
Let's face it, you allude to it: are there any killer apps out there for gpu computing yet? Yeah? But they're on a PS3 that costs £250, which is far cheaper than a £350-500 gpu that is itself part of a £1500-2000 gaming rig. PS3 wins.
Macs. Gpus. Generally quite rubbish in all models. 9400 is reasonable for entry model I guess. Or was a year ago. Now we have the 4850 costing peanuts...it no longer is 'reasonable' to me.
The good news is that by the time gpu computing becomes important, and by the time software developers finally catch up with quad cores and gpus, Apple will have much better 'rubbish' gpus in their machines, which will seem more powerful than today's rubbish gpus because developers will actually be using the entire rig's potential to do, er... useful stuff. Ergo, things will seem less rubbish than they are now. i.e. we have an iMac which now has a quad core. That consigns the low-end Mac Pro to irrelevance. You can have a machine with a better gpu and a gorgeous screen for less! And with next year's bump to the iMac range? That value equation is going mainstream.
Sure, they may seem moderate in performance compared to bleeding-edge Nvidia cards, but Nvidia may be bought by Intel by then. And we'll notice it less because the iMac will be, by then, a workstation-capable computer for the mainstream.
And if you want more performance? You'll plug in the dual Octo Mac Pro to run it through the iMac's screen. Instant render farm.
If you want a cheap PC tower for gaming? You'll buy a PS3 with a hi-def TV, which will probably still be better than most PC gaming rigs... Or Apple will have conquered casual gaming to such a degree that people won't care about Crysis 5 and will be having fun on much less 'serious' games while the PC gaming market is consigned to irrelevance. PS4 or Xbox 720? Even M$ seems to be hedging their bets. Talk about bailing on your own markets... not like M$ to do that, eh?
Workstation performance and gaming are going mainstream. Apple knows this. Keep watching.
By the time it does... even I won't care about mid-tower Macs anymore.
Lemon Bon Bon.
Honestly, if Apple doesn't step up their game, and we know they won't, my next Mac will be a Hackintosh Mid tower with a 1GB+ 5XXX Radeon card.
Christ, it's frustrating being a Mac user. But at least Hackintoshes are an option. Nothing sucks like paying the Apple tax for a machine that has a GPU that was obsolete years ago. I say that while typing on a Core 2 Duo MacBook Pro with an ATI X1600 128MB VRAM... that came out a year after the previous MacBook Pro Core Duo with an ATI X1600 128MB VRAM.....
My next Mac might be an HP http://www.engadget.com/2009/12/09/h...u-and-netflix/
One thing I forgot to mention about the ATI 5750 and 5770: when I say they are comparable to the 4850 and 4870, I mean the 4850 *1GB VRAM* and 4870 *1GB VRAM*.
So if the new iMacs come with the 5750 and 5770, those are 1GB cards (it would be cruel of Apple to somehow fish out 512MB versions of those cards, because that is a very non-standard configuration). So the ATI 5750 and 5770 at 1GB VRAM each would see maybe 20-50% performance improvements compared to the ATI 4850 512MB VRAM... at really good thermals.
So yeah, for the iMacs, a 5750 1GB should be standard on the next revision of the 27", with an option for a 5770 1GB or 5850 1GB... Again, Apple & GPUs, who knows.
...But with Ati's next round of gpu updates and a thermal 'shrink', I'd be hoping for a better gpu in the next iMac refresh that can push that 2560x1440 display better. And with 1 gig of VRAM...
ATI's 5-series is a new architecture and a die shrink. It's in a totally different class now in terms of performance-per-watt compared to anything Nvidia currently offers, and of course Intel is not even worth mentioning.
The 5850 and 5870 1GB VRAM are quite phenomenal, benchmarks and reviews around the web all corroborate that point.
If this is an option in the next round of iMacs then you really are going to see reasonable 2560x1440 graphics.
Just remember the 5750 and 5770 are slightly different propositions because they only have a 128-bit bus instead of a 256-bit bus. But they are now usually paired with 1GB of VRAM and they have good thermals.
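To put a rough number on that bus-width trade-off: peak memory bandwidth is just bus width times effective memory clock. Here's a back-of-the-envelope sketch; the clocks are the commonly quoted reference figures for the retail cards (my assumption, not anything confirmed for Apple's configurations):
Code:
# Back-of-the-envelope peak memory bandwidth: bus width (bits) x effective clock.
# The effective memory clocks below are assumed reference/retail values.
def peak_bandwidth_gb_s(bus_bits, effective_clock_mhz):
    bytes_per_transfer = bus_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

cards = {
    "Radeon HD 4850 (256-bit, ~2000 MHz effective GDDR3)": (256, 2000),
    "Radeon HD 5750 (128-bit, ~4600 MHz effective GDDR5)": (128, 4600),
    "Radeon HD 5770 (128-bit, ~4800 MHz effective GDDR5)": (128, 4800),
    "Radeon HD 5850 (256-bit, ~4000 MHz effective GDDR5)": (256, 4000),
}
for name, (bus, clock) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(bus, clock):.0f} GB/s")
On those figures the 5750/5770 actually sit at or slightly above the 4850 in raw bandwidth despite the narrower bus, thanks to GDDR5; it's the 256-bit GDDR5 parts like the 5850 that are in a different league.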
But in any case, like I said, if the next iMacs get an ATI 5750 1GB VRAM you'll be seeing very roughly 20%-30% improvements in performance (at least in terms of gaming).