
Apple dumping Intel chipsets for NVIDIA's in new MacBooks - Page 2

post #41 of 100
Quote:
Originally Posted by solipsism View Post

Isn't this what OpenCL was going to address in SL?

I'm not sure if this was going to be a capability of OpenCL or not. If I had to guess, I'd say no, considering OpenCL sits at the OS layer rather than at the low-level hardware/BIOS layer that would be needed to allow this functionality. Almost always, laptop chipsets/motherboards have come with either integrated graphics *OR* a discrete card, not both. The capability to switch back and forth was recently developed by both AMD and Nvidia, and requires explicit motherboard support.
However, assuming Apple implements this capability, OpenCL should be able to use either the integrated graphics or the discrete card, since they both use the same general architecture/driver model/virtual instruction set/etc.
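To make the "use either GPU" idea concrete, here is a rough, purely illustrative Python sketch of how a runtime could split one data-parallel job across both an integrated and a discrete GPU. The device names and GFLOPS figures are invented for the example, not real specs:

```python
# Hypothetical sketch: partition a data-parallel workload across two GPUs
# in proportion to each device's (assumed) throughput. All numbers made up.

def split_work(total_items, devices):
    """Return (device_name, item_count) pairs, proportional to throughput."""
    total_gflops = sum(d["gflops"] for d in devices)
    shares = []
    assigned = 0
    for i, d in enumerate(devices):
        if i == len(devices) - 1:
            n = total_items - assigned          # remainder goes to the last device
        else:
            n = total_items * d["gflops"] // total_gflops
        shares.append((d["name"], n))
        assigned += n
    return shares

devices = [
    {"name": "integrated (9300M-class)", "gflops": 54},   # assumed figure
    {"name": "discrete (9600M GT)",      "gflops": 120},  # assumed figure
]
print(split_work(1_000_000, devices))
```

The discrete card, being roughly twice as fast in this toy model, gets roughly twice the work; a real OpenCL runtime would do something analogous when both devices are exposed.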
post #42 of 100
Quote:
Originally Posted by SpamSandwich View Post

Correct me if I'm wrong, but AMD is in the CPU business, not the graphics accelerator cards or other higgledy-piggledy (sorry if I'm getting too technical ) business.

I'm sure I'm not the first, but yeah, consider yourself corrected.
post #43 of 100
AMD owns ATi, so some people refer to ATi as AMD nowadays.

By the way, I don't get why people want to play high-end games on a notebook that is not meant for gaming. If you want to play games, get a Windows notebook or PC: much more variety and way cheaper. Besides, portable gaming notebooks are categorized as Desktop Replacement Notebooks; they're heavy, big, and expensive, and the battery life sucks, but they're portable. The MBP was never meant to be a DRN.

Oh yeah, don't forget that mobile GPUs are weaker than their desktop counterparts, so there is no way you can play high-end games on a notebook unless you are willing to sacrifice a lot of stuff.

Besides, if playing high-end games is so important, why go for a notebook? Notebooks are not meant for high-end gaming. Get a real desktop gaming rig, or if you are like me, get a PlayStation 3.

Quote:
Apple just doesn't like "Centrino" because every man and his dog makes laptops off that. Apple is tired of competing on specs with such a wide range of competitors, IMO.

Yup, and people like to say you are paying extra for PC internals.....

P.S.: Take note that I do play games, but I consider myself a casual gamer, and the 8600M GT is already sufficient.
Apple is a hardware company, dont believe me? Read this Article!. For those who understand my message, help me spread this info to those who dont get it.
post #44 of 100
Quote:
Originally Posted by nvidia2008 View Post

Thanks. Is that data from NotebookCheck? There is some variance in the data, since 3DMark06, especially, is influenced strongly by RAM and CPU. Nonetheless, thanks for the info.
IMO, the most important thing to look at is the 3DMark06 score. I know it's not perfect, but in my view it is the most relevant in modern graphic card comparisons.
Let's take the 8600M GT. By mobile gaming standards, it is really not bad, and pushes 3000 3DMark06. The 9600M GT, goes up to 5000 3DMark06s. That's really pretty impressive for mobile. If you've got Hybrid mode, then that reduces power drain in MBPs a lot.
The 9300, while not great, does 1000 to 2000 3DMark06s. Which means reasonable, acceptable performance on Medium settings for Mac games, and Low-to-Medium settings for latest hardcore 3D Windows games. That really ain't that bad.
Could Apple actually be reversing its haughty stance on games? With the iPhone and iPod becoming game platforms in and of themselves, now portable Macs??? *shocked*

Yep, notebookcheck.com. I agree, the data can be somewhat weird, like how certain cards outperform cards that they shouldn't based on the clocks and number of SPs, but oh well what can you do.

Anyways, do you know anything more about the integrated chip in the new motherboard? Do you think it will be very similar to a 9300M G discrete card, better, or worse? It will almost certainly not have any dedicated memory, right? So perhaps it will be a bit slower than the 9300M G discrete if the specs are the same.
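For what it's worth, the 3DMark06 figures quoted above can be lined up directly (approximate scores; since 3DMark06 is also CPU/RAM sensitive, treat the ratios as ballpark only):

```python
# Rough relative comparison using the approximate 3DMark06 scores quoted
# above. Treat the ratios as ballpark only.
scores = {"9300M (integrated)": 1500, "8600M GT": 3000, "9600M GT": 5000}
baseline = scores["8600M GT"]
for card, score in scores.items():
    print(f"{card}: {score / baseline:.2f}x the 8600M GT")
```

So the rumored 9600M GT would be roughly 1.7x the current MBP's 8600M GT, and even the integrated part would be about half of it, which is a huge step up from GMA territory.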


Quote:
Originally Posted by nvidia2008 View Post

Guys, for mobile gaming, 9300 on a 13", and a 9600 on a 15", is pretty darn good already. *snip*
Can you imagine, for the first time, GRAPHICS AND GAMING on the MACBOOK is going to be an Apple Advantage, if this is all true! This is HUGE! *snip*

Yeah, not bad at all. I've been grumpy about the Intel GMA nonsense since day one of the Macbook. Coming from PC land, I have always been used to extensive configurability and having the freshest components, so it's been difficult to adjust to the Mac way of things.

On another note, is there a reason why Apple doesn't offer a mobile Quadro card on the Macbook Pro? Since the Mac Pro has one, I assume the drivers have already been worked out for the extended OpenGL features. Given that nVidia seems to cripple parts of the OpenGL performance of the GeForce line to get you to upgrade (at least on Windows), a Quadro 2700M/3700M would certainly be nice... Does anyone else know anything about this?
post #45 of 100
Quote:
Originally Posted by wheelhot View Post

AMD owns ATi, so some people refer to ATi as AMD nowadays.

By the way, I don't get why people want to play high-end games on a notebook that is not meant for gaming. If you want to play games, get a Windows notebook or PC: much more variety and way cheaper. Besides, portable gaming notebooks are categorized as Desktop Replacement Notebooks; they're heavy, big, and expensive, and the battery life sucks, but they're portable. The MBP was never meant to be a DRN.

Oh yeah, don't forget that mobile GPUs are weaker than their desktop counterparts, so there is no way you can play high-end games on a notebook unless you are willing to sacrifice a lot of stuff.

Besides, if playing high-end games is so important, why go for a notebook? Notebooks are not meant for high-end gaming. Get a real desktop gaming rig, or if you are like me, get a PlayStation 3.

I just responded to this same exact inquiry....

1) Graphics acceleration is NOT limited to gaming. First of all, the entire OSX interface runs on OpenGL. More importantly, professional video/graphics applications (based on CoreImage and other frameworks) utilize OpenGL for accelerated graphics processing and rendering. Remember the "Pro" in Macbook Pro?? 3D modeling, animation, CAD/CAM, video editing, compositing, etc all need decent graphics acceleration. Even Photoshop is supposedly going to be hardware accelerated in CS4.

2) Apple's OpenCL will make it much easier for applications to utilize the parallel processing capability of modern GPUs to offload certain types of processing.

3) Many people CAN'T AFFORD buying a dedicated computer just for gaming AND a Macbook/MB Pro. They want ONE device that can do it all.

4) nVidia's graphics chipsets have much better hardware HD video decoding than Intel's GMA. The image quality and enhancement features are better (iDCT, deinterlacing, upscaling, etc) and they use a lot less CPU power during decoding. (Even hardware video decoding uses some CPU time.. and the amount depends on the implementation. AMD's GPUs are even better than nVidia in this regard)
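On point 2, the kind of work OpenCL offloads is data-parallel: the same small kernel applied to many elements at once. As a CPU-only stand-in (no GPU or OpenCL involved), here is the same pattern sketched with Python's standard-library thread pool; the "kernel" is a made-up per-pixel operation:

```python
# OpenCL runs one small kernel over many elements in parallel on the GPU.
# CPU-only illustration of that pattern using the stdlib thread pool.
from concurrent.futures import ThreadPoolExecutor

def kernel(pixel):
    # toy per-element "shader": brighten and clamp an 8-bit value
    return min(255, int(pixel * 1.2))

frame = list(range(256))                      # stand-in for image data
with ThreadPoolExecutor() as pool:
    out = list(pool.map(kernel, frame))
print(out[:4], out[-1])
```

A real OpenCL kernel would express `kernel` in OpenCL C and let the driver fan it out over the GPU's stream processors instead of CPU threads.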
post #46 of 100
Quote:
Originally Posted by AHeneen View Post

I'm not a gamer, but why would you need more power than that? Why wouldn't you just get a desktop? If there are more powerful gpu's out there, there is probably a good reason Apple doesn't include one: too much heat would require a larger heatsink and Apple doesn't want to make a thicker laptop, it is not compatible with the current hardware on the MBP, it is too hard/not possible to write drivers for it, etc. At any rate, you can be certain it will not be finding its way into a MacBook!

As a non-tech guy, here's why I'm hoping we get the top-of-the-line nVidia 9800M GTX (http://www.nvidia.com/object/geforce_m_series.html):

Growing popularity of Second Life and other types of MMO type games that demand high res
Growth in using avatars (like in Second Life) to make purchases - expectation is that we'll use avatars to make more and more purchases online
Growth in demand for Blu-ray/HD video
Shift from text to video messages over the next 5 years (text will become like email - something for old people )
Advances in high-bandwidth Internet access (developing in the UK, and at CERN in Geneva, Switzerland) will bring holographic iChat
Rapid growth in the personal use of HD video to post to YouTube, and other sites

and, well, for those who know/care, there is Crysis....

My 17" PB is now running Leopard - I've been able to stay well below the cutting edge in hardware yet current on OS upgrades. Nice job Apple! Keep pushing the hardware specs when you release new products.

The demand on mobile graphics will grow steeply in the near future. Again, I hope Apple pushes the edge here. Yes, you could have a desktop with a large GPU, but why? I want fewer computers, not more.

<soapbox>
Blog: PowerConferenceCalls.com
post #47 of 100
Quote:
Originally Posted by winterspan View Post

The Geforce 9100M is the currently known integrated/motherboard graphics chipset for nVidia laptop motherboards. According to the information known, the macbook 'MCP79' board will have a new integrated graphics chipset based on some version of their discrete Geforce 9300M/9400M cards, but it will depend on the specific motherboard model that Apple uses. If I were to guess, I'd bet the performance will be a bit better than the existing "9300M G" (which would be nearly 2X as fast as the latest Intel GMA). Hopefully the Macbook Pro also uses an nVidia board and Apple implements their "Hybrid Power" technology. It will allow the Macbook Pro to turn off its discrete GPU when not in heavy use and use the integrated chipset instead to save power.

I put together this graphic just to serve as a rough performance comparison across nVidia's latest mobile GPU generation.

notes:
- On the graph I included the 8600GT, the current Macbook Pro card, for comparison. I believe the current Macbook uses the GMA X3100.
- SP stands for "Stream Processors" (shaders), which are the SIMD processing units that make up nVidia's modern GPUs.
- Nvidia GPUs have three separate clock frequencies -- one each for the main clock, shader processor clock, and memory clock. On the graph, only the shader clock is listed.
- 3DMark05 and 3DMark06 are standard graphics performance benchmarks.
- I did the best I could and double-checked the data, but I can not make *ANY GUARANTEES* about the accuracy of the information on the chart.


To add some of the pieces that are missing from the chart above...
Intel GMA X4500HD (GM45)
— Shader Processors = 10
— Core Clock = 533MHz
— H.264 Decode Acceleration = Yes
— TDP =12W

Intel GMA X3100 (GM965)
— Shader Processors = 8
— Core Clock = 500MHz
— H.264 Decode Acceleration = No
— TDP = 13.5W

Intel GMA 950 (945G)
— Shader Processors = ?
— Core Clock = 400MHz
— H.264 Decode Acceleration = No
— TDP = 20W

http://en.wikipedia.org/wiki/Intel_G...s_and_chipsets
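Those Intel IGP specs can be turned into a very crude throughput-per-watt comparison. Shaders × clock is only a proxy; real performance depends on architecture, drivers, and memory bandwidth, so this is illustrative only:

```python
# The Intel IGP specs listed above, as data, plus a crude throughput proxy
# (shaders x core clock) and a per-watt figure. Illustrative only.
igps = {
    "GMA X4500HD": {"shaders": 10, "clock_mhz": 533, "tdp_w": 12.0},
    "GMA X3100":   {"shaders": 8,  "clock_mhz": 500, "tdp_w": 13.5},
}
for name, s in igps.items():
    proxy = s["shaders"] * s["clock_mhz"]
    print(f"{name}: {proxy} shader-MHz, {proxy / s['tdp_w']:.0f} per watt")
```

Even by this rough yardstick the X4500HD is only ~33% ahead of the X3100 per watt, which is why a 9300M/9400M-class part at roughly 2x the 3DMark06 score would be a much bigger jump than another GMA generation.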
PS: Looking at these results, unless NVIDIA -and- Apple have something previously unannounced and significant to offer for the future of Apple, I just don't see a move to NVIDIA as a very likely scenario come next Tuesday.
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #48 of 100
Quote:
Originally Posted by wheelhot View Post

AMD owns ATi, so some people refer to ATi as AMD nowadays.

By the way, I don't get why people want to play high-end games on a notebook that is not meant for gaming. If you want to play games, get a Windows notebook or PC: much more variety and way cheaper.

If I were a betting man, I'd wager that Apple has discovered the long-sought manna that has finally paid off from their tireless systems-integration design... apps and games sold through iTunes! They will likely scale up with more high-end games and apps that will play on larger, more powerful Apple computers, and they will sell all of it to you over Wi-Fi, Internet, and cellular networks.

Apple builds upon successes, and I've a feeling they will let us know exactly how many millions of apps and games they've pushed to users. This will break wide open the next frontier for all software sales via iTunes... just in time, too, since digital movie sales and rentals have not rolled out quite as rapidly as they'd hoped.

Proud AAPL stock owner.

 

GOA

post #49 of 100
Quote:
Originally Posted by winterspan View Post

Graphics acceleration is beneficial to MUCH more than just gaming, particularly with professional media applications that use OpenGL/CoreImage for transforms and rendering...

So what are the chances of the Mac Pro getting SLI capability? SLI would provide more bang for the buck than a single Quadro card. SLI can also support 2 Quadro cards. Surely, 2 Quadro cards running SLI would be more powerful than a single Quadro card.
post #50 of 100
Questions and concerns I have about using NVIDIA over Intel chipsets:
  1. Is NVIDIA's track record in this area strong enough to make it a valid choice for Apple? Remember, Apple uses very few top-end components from each supplier, so they need to have sufficient supply, which also means that any manufacturing issues may result in excessive recalls, hurting the company in the long run.

  2. How much smaller is this combined North and Southbridge chipset than Intel's offerings? Combining chips into one is great, but if the result is only a couple percent difference, it becomes a futile reason for the change.

  3. How much faster GPU performance can we expect over Intel's GMA X4500? What about the other chipset options, like lower-power WiFi, WiMAX, H.264/VC-1 encoding/decoding, and other nifty aspects of Montevina?

  4. I've read plenty of pros regarding GPU performance with NVIDIA, but even the GMA 950 was never an issue for me, so what potential cons are there in Apple moving to NVIDIA chipsets?
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #51 of 100
Quote:
3) Many people CAN'T AFFORD buying a dedicated computer just for gaming AND a Macbook/MB Pro. They want ONE device that can do it all.

Then get a Windows notebook? It's cheap and you get better specs than an MB or MBP. Except that you will lose the beauty of OS X, but hey, it's a compromise.

Quote:
2) Apple's OpenCL will make it much easier for applications to utilize the parallel processing capability of modern GPUs to offload certain types of processing.

True but you don't need the best GPU for that.

Quote:
1) Graphics acceleration is NOT limited to gaming. First of all, the entire OSX interface runs on OpenGL. More importantly, professional video/graphics applications (based on CoreImage and other frameworks) utilize OpenGL for accelerated graphics processing and rendering. Remember the "Pro" in Macbook Pro?? 3D modeling, animation, CAD/CAM, video editing, compositing, etc all need decent graphics acceleration. Even Photoshop is supposedly going to be hardware accelerated in CS4.

True, and again you don't need the best GPU for that. And so far the Intel GMA is coping well with the OS X interface.

I'm not saying that the MB shouldn't have a proper GPU or that the MBP couldn't have an improved GPU, but it's not a big deal not having the best GPU. Most people seem to make a big deal about it, and sadly the reasoning most of these people give is playing high-end games.

I understand the need for a good GPU for certain software, but you usually don't need the best. Maybe Apple should offer it as a configure-to-order option (that is not going to be cheap). There is one thing I still wonder, though: like some users have said, how come there is no Nvidia Quadro M for MBP users? The only reason I can think of is the lack of popular Windows engineering software like SolidWorks.
Apple is a hardware company, dont believe me? Read this Article!. For those who understand my message, help me spread this info to those who dont get it.
post #52 of 100
Quote:
Originally Posted by Haggar View Post

So what are the chances of the Mac Pro getting SLI capability? SLI would provide more bang for the buck than a single Quadro card. SLI can also support 2 Quadro cards. Surely, 2 Quadro cards running SLI would be more powerful than a single Quadro card.

Zero. There is zero chance of the Mac Pro getting SLI capability. I'll go out on a short limb and say that there is zero chance of Apple using SLI in any capacity on any of its machines, the only possible exception being HybridPower.
post #53 of 100
Quote:
Originally Posted by Fishyesque View Post

Does it say otherwise anywhere?

In other words, yes.

What a rude response. Any idiot should have been able to reason that the person asking the question did not know that the de facto meaning of the term "chipset" is that of the set of chips that support graphics and I/O into and out of the CPU. As such, what sort of jerk responds to an innocent question of that sort with a response such as "Does it say otherwise anywhere"? You have provided reinforcement for the stereotype of a computer nerd who is pathologically lacking in the basic cognitive skills that are required for normal social interaction with people. You need to get away from your computer for a while and learn how to interact with people instead.

The question asked by newtomac08 and also by Elixer was a perfectly reasonable question for anyone not familiar with the de facto meaning of the term "chipset" to ask. Someone needed to ask it in order that someone else (Mr. H) would be prompted to provide some clarification. Mr. H was also unnecessarily rude. He wrote:

"No, that's why it's called a 'chipset' not a processor. Look at that word: 'chipset'. It's what it says on the tin: a set of chips."

What sort of garbage reasoning is that? He implicitly proposes that by virtue of the word "chipset", it should be obvious to anyone who reads English that the processor is excluded. That is absurd. To anyone who is not familiar with the de facto meaning of the word, it is perfectly logical to assume that "chipset" includes the processor, which is a chip, and which is certainly included in the set of chips on the circuit board. Duh. That is precisely why the question was asked. Mr. H also needs to walk away from his keyboard for a while and learn something about people and how to interact with them.

I just have no patience for emotionally stunted twits whose parents allowed them to spend most of their developmental years sitting in front of a computer without ever recognizing the need for their social development. It is truly pathetic that today we have an entire generation of these types that, notwithstanding their technological capabilities, will never be worth a plugged nickel to anyone other than themselves.
post #54 of 100
Nice rant.

Are you done for now? The kid just gave a response; so what if it was a little snotty? I think it is you who needs a little "social development" if you think that everyone in the world is peachy and wants to be your best friend. Just let it go.
post #55 of 100
Great post, kaiser_soze. People with the "Comic Book Guy" mentality and its accompanying illusion of superiority really need some perspective.
post #56 of 100
A 9600M GT would be kind of disappointing. It is an improvement over the 8600M GT, but really not that much, considering the MBP already used the GDDR3 version and the 9600M GT still only has 32 Stream Processors, just clocked higher. I was hoping for a 9700M GT.

Apple really should incorporate an ATI Mobility Radeon 4670, especially with all this talk of OpenCL. Only ATI's HD3xxx and HD4xxx series and nVidia's GT2xx series support 64-bit double-precision floats. nVidia's 8xxx and 9xxx series only do 32-bit floats, so while they may be fast in games, they would be more limited in GPGPU applications. Although it may not show through in games, due to driver limitations in exploiting ATI's 5-way architecture, the 4670's 320 Stream Processors really should show their power in GPGPU applications over the higher-clocked 32 Stream Processors in most of nVidia's mainstream mobile lineup. The Mobility Radeon 4xxx series is supposed to launch before the year is out, and I can think of no bigger event for ATI to gain momentum than to be in Apple's MBP and take part in the upcoming Apple event.
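The 32-bit vs 64-bit float point is easy to demonstrate: squeezing a value through single precision, as those 8xxx/9xxx-era parts would have to for GPGPU math, visibly loses digits. In Python, using struct to round-trip through a C float:

```python
# Why 64-bit double support matters for GPGPU: a value stored in 32-bit
# precision loses digits that survive in 64-bit. Round-trip through a
# C float via struct to see the rounding.
import struct

x = 0.1
as_f32 = struct.unpack("f", struct.pack("f", x))[0]   # squeeze through float32
print(x)        # 0.1
print(as_f32)   # 0.10000000149011612
print(abs(x - as_f32))
```

For games, that error is invisible; for iterative scientific kernels, it compounds, which is why fp64 support is the headline GPGPU feature here.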

And in terms of switching between integrated and discrete graphics cards, Intel's Montevina also supports this feature if OEMs decide to use it. Intel's implementation is also GPU agnostic between ATI and nVidia GPUs, which is beneficial to avoid locking into one vendor.

EDIT: In terms of switching between integrated and discrete graphics card, I mean the feature that nVidia calls Hybrid Power where the discrete GPU can be shut off to save power and the screen is fed by the IGP. Montevina supports this feature and can work between Intel IGPs and ATI GPUs or Intel IGPs and nVidia GPUs. Intel doesn't support Hybrid SLI of course, but I don't think it matters anyways since Hybrid SLI or what nVidia calls GeForce Boost won't work with anything higher than a GeForce 9500M G anyways. Only Hybrid Power works with mainstream or high-end GPUs, which Montevina supports.
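The Boost/Hybrid Power split described above can be summarized as a small policy function; the numeric tier cutoff is a simplification for illustration:

```python
# GeForce Boost (IGP + discrete rendering together) only applies up to
# roughly the 9500M G class; Hybrid Power (shut the discrete GPU off and
# drive the screen from the IGP) also covers mainstream/high-end parts.
# The numeric tier is a simplification for illustration.
def hybrid_features(gpu_tier):
    feats = ["Hybrid Power"]            # works with mainstream/high-end too
    if gpu_tier <= 9500:                # e.g. 9300M, 9400M, 9500M G
        feats.append("GeForce Boost")
    return feats

print(hybrid_features(9300))   # ['Hybrid Power', 'GeForce Boost']
print(hybrid_features(9600))   # ['Hybrid Power']
```

So for a hypothetical 9600M GT MBP, only the power-saving half of Hybrid SLI would apply, which matches the Montevina comparison above.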
post #57 of 100
Quote:
Originally Posted by wheelhot View Post

or if you are like me get a Playstation 3.

I do hope you're getting LittleBigPlanet when it comes out this month!

Quote:
Originally Posted by kaiser_soze View Post

What a rude response.

Hahahahaha

Relax man. Sorry to have burst a fuse of yours. If you noticed, I just responded to Elixir, not the other guy. Me and Elixir bicker all the time on here, (well we used to, the Blu-ray thread isn't as active anymore) it's always fun! It'd be weird if I was too kind to him.

And I'm far from a nerd. In fact, I'm rather lacking in knowledge as opposed to all these people on here, there's a lot that people say that I don't grasp, as I've not read up on the stuff.

Sorry to have bugged you.

In fact, feel free to read through other posts I've made.

And you know more about chipsets than I do haha
post #58 of 100
Hmmm... What the rumours seem to say is a 9300 or 9400 discrete GPU for the MacBook. Apple may or may not use GeForce Boost/Hybrid Power, as the 9300 and 9400 may not be that insanely power-draining... We'll see.

As for the MacBook Pro, that could have Hybrid Power so it turns off the 9600 GPU whenever more basic operations are going on.

As for mobile Quadro, Apple seems to be very cautious about its pro offerings, so I wouldn't expect it anytime soon. It is a bit unfair because mobile and desktop Quadro, like you say, are just uncrippled or somehow more stable GeForces anyways.


Quote:
Originally Posted by winterspan View Post

Yep, notebookcheck.com. I agree, the data can be somewhat weird, like how certain cards outperform cards that they shouldn't based on the clocks and number of SPs, but oh well what can you do.

Anyways, do you know anything more about the integrated chip in the new motherboard? Do you think it will be very similar to a 9300M G discrete card, better, or worse? It will almost certainly not have any dedicated memory, right? So perhaps it will be a bit slower than the 9300M G discrete if the specs are the same.

Yeah, not bad at all. I've been grumpy about the Intel GMA nonsense since day one of the Macbook. Coming from PC land, I have always been used to extensive configurability and having the freshest components, so it's been difficult to adjust to the Mac way of things.

On another note, is there a reason why Apple doesn't offer a mobile Quadro card on the Macbook Pro? Since the Mac Pro has one, I assume the drivers have already been worked out for the extended OpenGL features. Given that nVidia seems to cripple parts of the OpenGL performance of the GeForce line to get you to upgrade (at least on Windows), a Quadro 2700M/3700M would certainly be nice... Does anyone else know anything about this?
post #59 of 100
For those of you who say that most people can't afford to buy two computers for separate uses, you might be surprised by how inexpensive a decent gaming PC is these days.

For $600, you can build a pretty solid gaming machine, and $899 (the rumored price) buys you a MacBook with an integrated or low-end dedicated chip. About $1,500 for both is not too shabby. I paid that much just for my MBP.

I think that the biggest issue with the integrated Intel GPUs is not 3D graphics performance but HD decoding. The 4500HD supposedly does a solid job with HD content.

Contrary to some claims, most budget notebooks under $899 still use integrated graphics or low-end stuff from nVidia/ATi. There is no serious gaming laptop under $1000; in fact, most serious gaming laptops are over $1500.
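The arithmetic behind the two-machine argument, using the figures from the posts above: a budget gaming tower plus the rumored entry MacBook lands right at the price of one MacBook Pro.

```python
# Two-machine total vs. one MacBook Pro, using the figures quoted above.
gaming_pc = 600          # self-built gaming tower
macbook = 899            # rumored entry MacBook price
mbp = 1500               # what the poster paid for their MBP
combo = gaming_pc + macbook
print(combo, "vs", mbp, "-> difference:", combo - mbp)
```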
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
post #60 of 100
Quote:
Originally Posted by applebook View Post

For those of you who say that most people can't afford to buy two computers for separate uses, you might be surprised by how inexpensive a decent gaming PC is these days.

Very true. Building it yourself is so much less expensive. I built my friend's computer; it cost about that much, and it's a great gaming machine.
post #61 of 100
What about the goddamn Macbook Pros?????
There's absolutely no word on the MBP, which sucks ass.
15.4" Glossy Macbook Pro, Santa Rosa 2.4Ghz, 4Gb DDR2 667, 160Gb HDD + 1Tb Time Capsule, 8Gb iPhone - Wireless redefines sexy!
post #62 of 100
Why would Apple use a 9600M GT when they could use a 9650M GT which is much faster and 55nm?
post #63 of 100
Quote:
Originally Posted by Machead99 View Post

Why would Apple use a 9600M GT when they could use a 9650M GT which is much faster and 55nm?

Why does Apple do anything? I hope it is a 9650M GT, but who knows nowadays. Until the Creation receives His Blessing, who knows what products make it and what they have.

Pls pls pls pls pls 9650M GT??? Come on Apple....!!!
post #64 of 100
Quote:
Originally Posted by nvidia2008 View Post

Hmmm... What the rumours seem to say are a 9300 or 9400 discrete GPU for the MacBook...

Everything I have read says the graphics on the macbook are definitely going to be integrated... but based upon the 9300M discrete card or its performance level. Oddly enough, there is no "9400" -- nVidia actually uses "9400" to describe the performance level created when you combine the 9100M integrated graphics with a 9300M discrete card.

I know, confusing, right? nVidia and their crazy naming schemes... *sigh*...

Quote:
Originally Posted by wheelhot View Post

True but you don't need the best GPU for that.
*snip* I'm not saying that the MB shouldn't have a proper GPU or that the MBP couldn't have an improved GPU, but it's not a big deal not having the best GPU. Most people seem to make a big deal about it, and sadly the reasoning most of these people give is playing high-end games. *snip*
I understand the need of a good GPU for certain softwares but you usually don't need the best.

In both the Macbook and the Macbook Pro, we are not talking about the "best". The macbook currently uses the *worst* (mainstream) graphics solution on the market. Going to a new integrated chipset from nVidia is hardly overkill. Many media applications that use CoreImage/OpenGL can't even RUN on a macbook. Literally, it's not about bad performance -- it won't even LOAD.

The Macbook Pro's 8600GT isn't a total slouch, but it is certainly not a "pro" card. Apple should really offer a decent mobile Quadro card as an option.


Quote:
Originally Posted by Haggar View Post

So what are the chances of the Mac Pro getting SLI capability? SLI would provide more bang for the buck than a single Quadro card. SLI can also support 2 Quadro cards. Surely, 2 Quadro cards running SLI would be more powerful than a single Quadro card.

"bang for the buck" is not what Quadro cards are about. They are professional grade cards that are optimized for very high-reliability and high-accuracy rendering for CAD/CAM/3D modeling. The drivers are much more rigorously verified and offer many advanced OpenGL features not available in Geforce cards. On the other hand, Geforce cards are optimized for high frame-rates and performance, not accuracy and reliability.
post #65 of 100
Quote:
Originally Posted by solipsism View Post

Questions and concerns I have about using NVIDIA over Intel chipsets:
How much smaller is this combined North and Southbridge chipset than Intel's offerings? Combining chips into one is great, but if the result is only a couple percent difference, it becomes a futile reason for the change.

I have no idea.. I'm assuming they wouldn't have gone to the effort if it didn't offer significant power and/or size reduction.

Quote:
Originally Posted by solipsism View Post

How much faster GPU performance can we expect over Intel's GMA X4500? What about the other chipset options, like lower-power WiFi, WiMAX, H.264/VC-1 encoding/decoding, and other nifty aspects of Montevina?

If it runs at the level of the 9300M/9400M like nVidia has been claiming, it should be twice as fast or more on 3dMark06. I don't have info off-hand, but I remember reading more than one article that criticized Intel for weak performance on H264/VC-1 decoding versus nVidia (and even more so against ATI's superb implementation). I'm pretty sure I was reading about the X4500.. but possibly could have been X3100. Time to do some more research...

Quote:
Originally Posted by solipsism View Post

I've read plenty of pros regarding GPU performance with NVIDIA, but even the GMA 950 was never an issue for me, so what potential cons are there in Apple moving to NVIDIA chipsets?

Well, many would say that the macbook isn't meant for certain applications, but I'd have to ask whether you have used the GMA 950 for any type of DCC/video/audio applications. I've read some of Apple's pro apps won't even boot on the X3100!

Intel has certainly never been popular for their chipsets, but I don't know.. Maybe nVidia's one-chip solution will have bugs, or their drivers will be weak, or it will run hot.. I guess anything is possible, but remember, this is Apple...

Especially given that they will face a totally new chipset situation when Nehalem rolls around (no FSB, and Intel won't give nVidia the IP rights for QuickPath), I don't think they would roll this transition out unless they were absolutely certain it provides an advantage...
post #66 of 100
Quote:
Originally Posted by winterspan View Post

I have no idea.. I'm assuming they wouldn't have gone to the effort if it didn't offer significant power and/or size reduction.

Perhaps the flexibility that nVidia is offering. The Intel-Apple relationship seemed a little strained since the MacBook Air launch.

Quote:
Originally Posted by winterspan View Post

If it runs at the level of the 9300M/9400M like nVidia has been claiming, it should be twice as fast or more on 3dMark06. I don't have info off-hand, but I remember reading more than one article that criticized Intel for weak performance on H264/VC-1 decoding versus nVidia (and even more so against ATI's superb implementation). I'm pretty sure I was reading about the X4500.. but possibly could have been X3100. Time to do some more research...

The X4500 has full hardware decoding support (http://www.phoronix.com/scan.php?pag..._x4500hd&num=1).

Quote:
Originally Posted by winterspan View Post

...I've read some of Apple's pro apps won't even boot on the X3100!...

Many pro apps won't, in fact.

Quote:
Originally Posted by winterspan View Post

Intel has certainly never been popular for their chipsets, but I don't know.. Maybe nVidia's one-chip solution will have bugs, or their drivers will be weak, or it will run hot.. I guess anything is possible, but remember, this is Apple...

I expect some first-revision bugs, and probably a few lawsuits (I'm not being flippant here). It's a risk that Apple has to take on. But in all seriousness, nVidia does make some kickass stuff; they compete with Intel's chipsets and with AMD/ATI's chipsets and GPUs. nVidia has PhysX as well...

Quote:
Originally Posted by winterspan View Post

Especially given that they will have a totally new chipset situation when Nehalem rolls around (no FSB, and Intel won't give nVidia the IP rights for QuickPath), I don't think they would roll this transition out unless they were absolutely certain it provides an advantage...

Something must have spooked Apple about Nehalem, if Apple is going to take at least a one-year detour from Intel chipsets...?

Quote:
Originally Posted by winterspan View Post

Everything I have read says the graphics on the MacBook are definitely going to be integrated... but based upon the 9300M discrete card or on its performance level. Oddly enough, there is no "9400" -- nVidia actually uses "9400" to describe the performance specification that is created when you combine the 9100M integrated graphics with a 9300M discrete card... I know, confusing, right? nVidia and their crazy naming schemes... *sigh*...

Ah... good catch. There is no 9400. So we are looking at Hybrid SLI and Hybrid Power on the MacBooks, according to this line of rumours: an integrated 9100M and a discrete 9300M G (or 9300M GS -- probably 9300M G). I expect this to bench around 1500 to 2000 in 3DMark06.

Edit:
The MCP79 is still "secret" so far, so it could be an integrated GPU with a 9300- or 9400-esque level of performance. It could be 65nm, it could be 55nm... Do we have *any* corroborating evidence on what the heck is in the MCP79?
post #67 of 100
To repeat, there is one more thing Intel doesn't have. Nvidia has it built into the GPU now, even the lowly 9300M G, if I read correctly.

PhysX.

Beyond games, what if your MacBook was a nice little high-powered Physics simulator? I mean, it ain't gonna process Hadron collider data or anything, but PhysX acceleration is rather impressive.

Intel has Havok but it is not accelerated in the X4500 GPU, AFAIK...
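To make it concrete, here's a CPU-only sketch in plain Python of the kind of work a GPU physics engine like PhysX parallelizes: stepping many independent particles forward in time. The function names and numbers here are purely illustrative, not anything from the PhysX SDK; the point is that each particle's update is independent, which is exactly why a GPU's many cores chew through it so fast.

```python
# Hypothetical sketch: semi-implicit Euler integration of many particles
# under gravity. A GPU physics engine would run one thread per particle.

def euler_step(positions, velocities, dt, gravity=-9.81):
    """Advance every particle by one time step (velocity first, then position)."""
    new_pos = []
    new_vel = []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt                      # apply gravity to vertical velocity
        new_pos.append((x + vx * dt, y + vy * dt))
        new_vel.append((vx, vy))
    return new_pos, new_vel

# 1000 particles dropped from rest at y = 100, simulated for 10 frames at 60 fps
pos = [(0.0, 100.0)] * 1000
vel = [(0.0, 0.0)] * 1000
for _ in range(10):
    pos, vel = euler_step(pos, vel, 1.0 / 60.0)
```

Scale that loop body out to tens of thousands of interacting bodies per frame and you can see why offloading it to a parallel chip matters.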

PhysX driver support for Windows should be here around now...
http://blog.laptopmag.com/nvidia-unv...-support-by-q3

I'm going to find and watch pretty PhysX demo videos.

Because if the MacBook can do that, at least initially in Boot Camp, I am going to say, OMFG. Remember, though, for Windows under Boot Camp you have to find the latest drivers yourself and tinker a little: laptop drivers are usually provided through the manufacturer, which tends to pick older, possibly more stable versions and doesn't update them often.

So. The big question is now back to Apple. What is OpenCL? Will PhysX ever be used in OpenCL or OpenPhysics or something like that?

Could the GPU really be as powerful as the CPU? HD, H.264 Decode/Encode, Physics simulations, other great apps..?
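For anyone wondering what OpenCL actually is: the core idea is that you write a tiny "kernel" that runs once per data element (a "work-item"), and the runtime fans those out across whatever compute device is available, CPU cores or GPU cores alike. Here's a plain-Python sketch of that execution model; the function names are made up for illustration and this is not the real OpenCL C API.

```python
# Illustrative only: mimicking OpenCL's data-parallel model in plain Python.
# In real OpenCL, kernel_scale_add would be a kernel in OpenCL C and
# enqueue would be the runtime dispatching work-items across GPU cores.

def kernel_scale_add(gid, a, b, out, scale):
    """One work-item: computes out[gid] = scale * a[gid] + b[gid]."""
    out[gid] = scale * a[gid] + b[gid]

def enqueue(kernel, global_size, *args):
    """Stand-in for the runtime: launch one work-item per index 0..global_size-1."""
    for gid in range(global_size):
        kernel(gid, *args)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
enqueue(kernel_scale_add, len(a), a, b, out, 2.0)
# out is now [12.0, 24.0, 36.0, 48.0]
```

Because the work-items are independent, the same kernel can run on an integrated GPU, a discrete GPU, or the CPU, which is presumably why Apple is pushing it in Snow Leopard.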

If within 10.5.5+ or 10.6 you could access PhysX, and write a really amazing K-12+University physics education program/platform, that would be darn cool.

Nice PhysX game demo:
http://www.youtube.com/watch?v=tFzcBbdQEh4
post #68 of 100
Heh, an $800 MB with decent integrated graphics kills the Mini as a product as it is.

So, we can hopefully expect either a mini rev soon after this launch or a new Macbook for me.

I have a Rev A MBP so I have zero qualms about a 1st rev MB based on nvidia.

So how much faster could a new MB be over my 1st gen 17" MBP?
post #69 of 100
Another cool PhysX video:
http://www.youtube.com/watch?v=SS_8VIPGJsY
http://www.youtube.com/watch?v=sNZbOEPEWFU
Just a demo, but the finished product screams "playable on a MacBook" all over it. Hmm... interesting possibilities.

This one is quite interesting, basic demo but fun because of the squashable frogs and bunnies.
http://www.youtube.com/watch?v=gMQDPLcqt8s
post #70 of 100
Quote:
I do hope you're getting LittleBigPlanet when it comes out this month!

Nooo, then I won't have any more free time....

Quote:
In both the Macbook and the Macbook Pro, we are not talking about the "best". The macbook currently uses the *worst* (mainstream) graphics solution on the market. Going to a new integrated chipset from nVidia is hardly overkill. Many media applications that use CoreImage/OpenGL can't even RUN on a macbook. Literally, it's not about bad performance -- it won't even LOAD.

True, but you're missing what I'm trying to say about the MBP: people aren't happy with what the rumored MBP GPU is going to be (it's only slightly better than the 8600M GT, though for me that GPU is good enough for most usage). They want better, and most of the people I've heard and read want one of the best mobile GPUs in the MBP. Their reason? Games... "I want to play Crysis" is one of them.

I honestly thought Apple would be using Intel's Larrabee in the future...
Apple is a hardware company, dont believe me? Read this Article!. For those who understand my message, help me spread this info to those who dont get it.
post #71 of 100
Quote:
Originally Posted by guinness View Post

Yeah it is.

The 8800GT gets about twice as many FPS as an 8600GTS for example.

http://www.anandtech.com/video/showdoc.aspx?i=3140&p=8

Off topic, sorry: this is for any of the gamer experts. I have to run both my ACDs on the 8800GT in an 8-core Mac Pro, even though I have an additional graphics card, because Apple's pro apps refuse to work correctly across two graphics cards. FCP doesn't allow preview on a second monitor if it's on a second card, as just one example. How much of a performance hit does sharing the 8800GT impose when only one of the ACDs is being used for games?
From Apple ][ - to new Mac Pro I've used them all.
Long on AAPL so biased
"Google doesn't sell you anything, they just sell you!"
post #72 of 100
Quote:
Originally Posted by wheelhot View Post

I honestly thought Apple would be using Intel's Larrabee in the future...

They probably will, it is almost a year away.
post #73 of 100
Quote:
Originally Posted by solipsism View Post

Questions and concerns I have about using NVIDIA over Intel chipsets:
  1. Is NVIDIA's track record in this area strong enough to make it a valid choice for Apple?

That's the real question here, and thank goodness someone finally asked it.

Latest Nvidia commentary off the web this morning:

Quote:
In July, a red-faced Nvidia announced it would be eating up to $200 million in repair, return and replacement costs because some significant number of its notebook graphics chips made it into the market with a flaw in the die/packaging material that caused a high failure rate. Apple, being an Nvidia customer, immediately asked about the chips in its Macs, and was assured at the time that it was unaffected by the problem processors. But, according to a new customer support post today, Apple did some poking around of its own and found that some MacBook Pros manufactured between May 2007 and September 2008 are indeed harboring vulnerable versions of Nvidia's GeForce 8600M GT graphics processor, with failure signaled by distorted or absent video. Apple said that MacBook owners so afflicted will be eligible for free repairs within two years of the original date of purchase, and that those who had already paid for such repairs would be reimbursed.

Despite its troubles (which include a shareholders' lawsuit over the handling of the faulty-chip issue), Nvidia still has at least one loyal fan base, though not one it would want to embrace: hackers (see The Law of Unintended Consequences: a graphic example). Seems the massively parallel processing capabilities of the graphics chips also lend themselves well to brute-force cracking, and the euphemistically named "password recovery" software sold by Russian firm Elcomsoft puts that power in the hands of the ill-intentioned masses. The latest warning of the ramifications comes from Global Secure Systems, which says the hardware-software combination renders Wi-Fi's WPA and WPA2 encryption systems pretty much useless. Using the graphics processors, hackers can break the commonly used wireless encryption schemes 100 times faster than with conventional microprocessors, GSS officials said. "This breakthrough in brute force decryption of Wi-Fi signals by Elcomsoft confirms our observations that firms can no longer rely on standards-based security to protect their data. As a result, we now advise clients using Wi-Fi in their offices to move on up to a VPN encryption system as well," said GSS managing director David Hobson. "Brute force decryption of the WPA and WPA2 systems using parallel processing has been on the theoretical possibilities horizon for some time - and presumably employed by relevant government agencies in extreme situations - but the use of the latest NVidia cards to speed up decryption on a standard PC is extremely worrying."

At this point, I believe Mac users should be thoughtful, concerned and watchful, rather than giddy, elated and jumping for joy when it comes to Apple's choice of more Nvidia product to roll out new MacBooks and MacBook Pros.
Hot tub blonde, pouring champagne: "Say when..." Dangerfield: "Right after this drink."
Reply
Hot tub blonde, pouring champagne: "Say when..." Dangerfield: "Right after this drink."
Reply
post #74 of 100
Quote:
Originally Posted by winterspan View Post

"bang for the buck" is not what Quadro cards are about. They are professional grade cards that are optimized for very high-reliability and high-accuracy rendering for CAD/CAM/3D modeling. The drivers are much more rigorously verified and offer many advanced OpenGL features not available in Geforce cards. On the other hand, Geforce cards are optimized for high frame-rates and performance, not accuracy and reliability.

So your response to why Apple does not offer SLI on the Mac Pro is "Quadro cards are professional cards". How does that address the issue at all?

Well, if a Quadro card is so great and so professional, then wouldn't 2 Quadro cards running SLI be even better? Does running a pair of Quadro cards in SLI make the Quadro less "professional" somehow? If Apple wants Macs to be the best platform for graphics and design, then shouldn't Macs have the best graphics options available? An Apple logo alone does not make up for subpar hardware.
post #75 of 100
Quote:
Originally Posted by wheelhot View Post

Then get a Windows notebook? It's cheap and you get better specs than an MB or MBP. Except that you will lose the beauty of OS X, but hey, it's a compromise.

Which gaming notebooks are those, that cost less than the MacBook with better specs?
post #76 of 100
Quote:
Originally Posted by solipsism View Post

Questions and concerns I have about using NVIDIA over Intel chipsets:


[*] How much smaller is this North and Southbridge combined chipset than Intel's offerings? Combining chips into one is great, but if the result is only a couple percent difference it becomes a futile reason for the change.

[*] How much faster GPU performance can we expect over Intel's GMA X4500? What about the other chipset options like lower-power WiFi, WiMAX, H.264/VC1 encoding/decoding, and other nifty aspects to Montevina?

[*] I've read plenty of pros regarding GPU performance with NVIDIA, but even GMA950 was never an issue for me, so what potential cons are there in Apple moving to NVIDIA chipsets?


Hopefully Apple has tested the chipsets enough to ensure their durability. If I were to be concerned, I would be less worried about the machine breaking right when I first bought it, and more about it breaking right after my AppleCare expires.

About your questions on performance: is there any way to really know the answers, since the chipset hasn't been formally introduced?
post #77 of 100
I would trust Nvidia in the new Macs for one simple reason .....

If they get this wrong, they are in serious, serious trouble.

This chipset is going to be as important as their chips in the original Xbox; it needs to JUST WORK.

The solder issue in the existing MacBook Pros and other notebooks was an expensive mistake, but it was just one issue, and we shouldn't question everything they release based on it.

Apple generally releases 'appliance'-type PCs: limited configurations, extremely limited hardware choices, and no choice of OS. This makes them very similar to an Xbox in many ways, and I think they will get it right, mainly because they have done so before... and it really isn't that hard to design a solid 'appliance'.

It's true that Dell and HP only offer one OS choice as well, but that OS (Vista) has to deal with zillions of hardware variations, making it a much less reliable OS overall.

MBs and MBPs are like the cell phones of the PC market, and if Nvidia can't get this right (personally, I think they can), their future is in serious question.

And that alone should give you some extra confidence.
post #78 of 100
I think one of the key points here is that they'll be thoroughly updating the Macbooks and that my current black 1st gen Macbook will be utterly trounced in the graphics department by any of the rumored chipset upgrades. Color me excited. I'll be upgrading for certain this time.
Do you realize that fluoridation is the most monstrously conceived and dangerous Communist plot we have ever had to face? - Jack D. Ripper
Reply
Do you realize that fluoridation is the most monstrously conceived and dangerous Communist plot we have ever had to face? - Jack D. Ripper
Reply
post #79 of 100
Quote:
Originally Posted by nvidia2008 View Post

Beyond games, what if your MacBook was a nice little high-powered Physics simulator? I mean, it ain't gonna process Hadron collider data or anything, but PhysX acceleration is rather impressive.

I don't think PhysX is supported on the mobile chips though. The Macbooks will still be low end but just much more usable for the majority of apps.

Quote:
Originally Posted by nvidia2008 View Post

So. The big question is now back to Apple. What is OpenCL? Will PhysX ever be used in OpenCL or OpenPhysics or something like that?

Could the GPU really be as powerful as the CPU? HD, H.264 Decode/Encode, Physics simulations, other great apps..?

I think that on the lower-end chips, they may be able to roughly halve standard encoding times relative to the CPU in the MacBook. It won't be quite what we get from the much higher-end chips, but still a good improvement.

Quote:
Originally Posted by digitalclips

How much hit does the sharing the GT8800 make on the performance when only using one of the ACDs for games?

I would say none if it's not drawing updates to it. You could simply unplug one and test the framerate but I reckon you will see no drop in performance.

Quote:
Originally Posted by vinea

Heh, an $800 MB with decent integrated graphics kills the Mini as a product as it is.

So, we can hopefully expect either a mini rev soon after this launch or a new Macbook for me.

A Mini with the Nvidia chip based on the 9300/9400 should be enough to stop (my) complaints about the Mini being crippled. I think it has to get Nvidia chips to be OpenCL-compatible, or it will have to be discontinued. If it does get bumped, GPU performance will go up by a factor of about 5-10.

If they have a good GPU at £399, I would quite like to see an SSD option using OCZ drives, or at least 7200 rpm drives. The laptop drive kills the Mini's performance too. An easier-to-access design would help greatly. I wish they'd make it so that the lid just flipped up, you simply slotted RAM in at the front, and the drive sat on a tray that could be pulled out.
post #80 of 100
Quote:
Originally Posted by Bancho View Post

I think one of the key points here is that they'll be thoroughly updating the Macbooks and that my current black 1st gen Macbook will be utterly trounced in the graphics department by any of the rumored chipset upgrades. Color me excited. I'll be upgrading for certain this time.

Are you unconcerned about the WiFi security holes accessible via the Nvidia devices?
Hot tub blonde, pouring champagne: "Say when..." Dangerfield: "Right after this drink."