AppleInsider › Forums › Mac Hardware › Future Apple Hardware › NVIDIA platform claimed likely for new MacBook line

NVIDIA platform claimed likely for new MacBook line

post #1 of 62
Thread Starter 
Stoking the fires just days before Apple revamps its portable line, the source of a previous claim now believes NVIDIA's mainboard and discrete chipsets will form the backbone of the MacBook range update.

Where he had previously focused on an educated guess as to what Apple would use for its next notebook upgrade, PC Perspective's Ryan Shrout now claims to be certain that NVIDIA's MCP79 chipset in its various forms will represent the basis of any new MacBooks to be introduced next week.

The writer doesn't directly cite the contacts he claims to have but is confident that Apple will use NVIDIA's full-size notebook platform, the MCP79MX, for both the standard MacBook and at least a 15-inch MacBook Pro. As with the current lineup, the 13-inch MacBook would use integrated graphics, but this time the GeForce 9300M or 9400M built into the chipset; either of these is believed to significantly outperform Intel's current GMA series architecture. MacBook Pros would add a discrete graphics processor from the GeForce 9600M series.

For the MacBook Air, Apple will supposedly use a very low-power version of the mainboard components, the MCP79U, to keep power low at the expense of a slight dip in video performance.

Shrout supports his assertions by highlighting NVIDIA's launch plans. The chip designer has not only kept silent on MCP79 -- an exception for a company normally vocal about its technology -- but has recently been forced to confirm that it will unveil the finished product on October 15th, a day after Apple's planned MacBook event.

Likewise, NVIDIA's graphics technology is also being marketed in areas that favor Apple: the GeForce 9-series is considered ideal for Apple's proposed OpenCL general computing standard and has been sold as a means of accelerating certain tasks in Adobe's new CS4 suite.

However, Shrout partly calls into question his analysis by pointing to a graphic on NVIDIA's website that appears to resemble an unfamiliar MacBook; the image has been present on NVIDIA's site for months as an abstract representation of the company's mobile graphics.

Even if over-eager to prove his case, the journalist's arguments provide incidental support to news first broken by AppleInsider in the summer, when it was learned that Apple would likely use a non-reference platform for its upcoming round of MacBook models. At the time, NVIDIA was suggested as one possible candidate to replace Intel's stock components and give Apple a definitive performance edge.

Regardless of which choices Apple has made, they will become apparent this coming Tuesday.
post #2 of 62
No comment. Watching the hockey! Go Habs Go!
post #3 of 62
Interesting, but I think I'll just wait until Tuesday to find out for sure. At this point, speculation just seems a little useless in that there is now a definite amount of time before we will actually know everything about these new notebooks.
post #4 of 62
The cool thing, if this is true, is that the MacBook Pro will get Hybrid SLI (a speed boost from using the integrated graphics as well as the discrete GPU), plus hopefully the power-saving feature which automatically switches back to the integrated GPU and shuts off the discrete GPU when load is low.
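The switching behavior described here can be sketched as a toy policy. This is purely illustrative: the function and names below are hypothetical stand-ins, not NVIDIA's actual HybridPower driver logic.

```python
# Toy sketch of a HybridPower-style policy: stay on the integrated
# GPU on battery or under light load, and only power up the discrete
# GPU when demand crosses a threshold. All names are hypothetical.

INTEGRATED = "GeForce 9400M (integrated)"
DISCRETE = "GeForce 9600M (discrete)"

def gpu_for_load(gpu_utilization, on_battery, threshold=0.6):
    """Prefer the integrated GPU on battery or under light load."""
    if on_battery or gpu_utilization < threshold:
        return INTEGRATED   # discrete GPU stays powered down
    return DISCRETE         # heavy load on AC: enable the discrete part

print(gpu_for_load(0.2, on_battery=True))   # light load on battery -> integrated
print(gpu_for_load(0.9, on_battery=False))  # heavy load on AC -> discrete
```

Per the NotebookCheck quote later in the thread, the real feature initially required a manual tool to switch; automatic switching in the driver came later.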
post #5 of 62
I can understand why Apple would do this, if true. It would be an attempt to shut up the people complaining about Intel's integrated graphics, and placate the people demanding a replacement for the 12" Powerbook.
post #6 of 62
Did they not pay attention to the headlines? MacBook Pros failed due to defective NVIDIA chips, and now an NVIDIA platform is claimed to be used for the new MacBook line. Duh!

Maybe Apple got a great deal on those defective NVIDIA chips for the new line!
post #7 of 62
Quote:
Did they not pay attention to the headlines? MacBook Pros failed due to defective NVIDIA chips, and now an NVIDIA platform is claimed to be used for the new MacBook line. Duh!

Maybe Apple got a great deal on those defective NVIDIA chips for the new line!

Dude, so you got issues, so what? At least they are willing to replace it for you. So what, go to ATI now? Which is owned by AMD, which is Intel's competitor.

Quote:
I can understand why Apple would do this, if true. It would be an attempt to shut up the people complaining about Intel's integrated graphics, and placate the people demanding a replacement for the 12" Powerbook.

True

Quote:
The cool thing if this is true is the MacBook Pro will get hybrid SLI (a speed boost from the using the integrated graphics as well as the discrete GPU) plus hopefully the power saving feature which automatically switches back to the integrated GPU and shuts off the discrete GPU when load is low.

I would like that
Apple is a hardware company, dont believe me? Read this Article!. For those who understand my message, help me spread this info to those who dont get it.
post #8 of 62
Enough to push me to buy one, but leaving me wondering: where is my UP MB in a 10" form factor??

I hope the MB's get multi touch, that will simply rock.
Hard-Core.
post #9 of 62
So this guy is basing his theory on a bit of clipart on a presentation. What a load.
post #10 of 62
And I wouldn't be the least bit surprised if all these Nvidia rumors turned out to be bogus, either. They make sense to me but that is never an indicator of what Apple will do.
post #11 of 62
I like that AMD made a formal commitment to OpenCL back in August, and that this overview [last page] cites the greatest market penetration for parallel processing on the GPU via OpenCL.

PDF: http://ati.amd.com/products/firepro/...code_final.pdf
post #12 of 62
Quote:
Originally Posted by merdhead View Post

So this guy is basing his theory on a bit of clipart on a presentation. What a load.

Actually, from reading the story, he seems to be basing it on a calendar. Given Apple's penchant for secrecy (although I suspect Steve is now playing around w/ all the rumours), NVIDIA having a major product release the day after Apple releases what are supposedly "game-changing" new laptop systems pretty much forces you to assume 1+1 will = 2.
post #13 of 62
Quote:
Originally Posted by floccus View Post

Actually, from reading the story, he seems to be basing it on a calendar. Given Apple's penchant for secrecy (although I suspect Steve is now playing around w/ all the rumours), NVIDIA having a major product release the day after Apple releases what are supposedly "game-changing" new laptop systems pretty much forces you to assume 1+1 will = 2.

See, that's just silly. Apple is a fairly minor customer for Nvidia. Nvidia isn't going to schedule its releases around Apple's plans.*

*Unless they are. We'll find out very soon.
post #14 of 62
Quote:
Originally Posted by FuturePastNow View Post

I can understand why Apple would do this, if true. It would be an attempt to shut up the people complaining about Intel's integrated graphics, and placate the people demanding a replacement for the 12" Powerbook.

The integrated graphics never seemed to be a real issue for me. It's better than the discrete offering Apple had in the old ATI Mobility Radeon 9550 in the late iBooks. The X3100 can play my HD video just fine and the specs for the X4500 are considerably better.

But I'm not a GPU-guy. I don't do any major video editing or play any graphics intensive games so I am definitely out of my league here. So can anyone tell me how moving to NVIDIA will affect the power usage over Intel's X4500 which has an even lower TDP than X3100? What will this offer the average MacBook user? Does this mean that even the MacBook will get DL-DVI, HDMI or DisplayPort connector capable of running 30" ACDs?

It just seems like a huge gamble and price increase for Apple for something that most MacBook users don't care about or need.
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #15 of 62
Quote:
Originally Posted by FuturePastNow View Post

See, that's just silly. Apple is a fairly minor customer for Nvidia. Nvidia isn't going to schedule its releases around Apple's plans.*

*Unless they are. We'll find out very soon.

What we don't know is how much Apple is paying Nvidia to be the first to use these chipsets. If they are as game-changing as they say, that might be worth quite a bit to a company like Apple. If Apple is giving them a pretty good cut, I would imagine it would be in Nvidia's best interest to work around Apple's schedule.
post #16 of 62
Quote:
Originally Posted by FuturePastNow View Post

See, that's just silly. Apple is a fairly minor customer for Nvidia. Nvidia isn't going to schedule its releases around Apple's plans.*

*Unless they are. We'll find out very soon.

How many cheap machines have discrete GPUs? Apple is taking over 30% of ALL consumer PC sales revenue in the US. They are also taking over 66% of ALL consumer unit sales from $1000+ PCs in the US. If they go exclusively with NVIDIA and offer it on even their best-selling MacBooks, I think it would be a huge win for NVIDIA. Similar to the way that Intel is the only CPU vendor Apple uses, and that Apple only uses the higher-end CPUs and showcases new chips in its flagship and unique offerings, as cheap, effective viral marketing.
post #17 of 62
Quote:
Originally Posted by FuturePastNow View Post

See, that's just silly. Apple is a fairly minor customer for Nvidia. Nvidia isn't going to schedule its releases around Apple's plans.*

*Unless they are. We'll find out very soon.

One thing Apple does is that it always becomes a major customer, even if their purchases from vendors are minor.
post #18 of 62
Quote:
Originally Posted by nvidia2008 View Post

One thing Apple does is that it always becomes a major customer, even if their purchases from vendors are minor.

Overall, their purchases may be minor, but their purchases of a specific model chip would seem to be excessive compared to other OEMs. Apple is forecast to sell some 12M units this year or next and has only a handful of options for their HW. As well as the free advertising (for better or worse) that comes along with being Apple.

PS: I'm still saying that X4500 is most likely for MB and MBAs.
post #19 of 62
Quote:
Originally Posted by FuturePastNow View Post

See, that's just silly. Apple is a fairly minor customer for Nvidia. Nvidia isn't going to schedule its releases around Apple's plans.....

Nah. Nvidia is blowing a lot of smoke about these new chipsets. For Apple to debut its premiere line using them is a big deal, if true.
In Windows, a window can be a document, it can be an application, or it can be a window that contains other documents or applications. Theres just no consistency. Its just a big grab bag of monkey...
post #20 of 62
Quote:
Originally Posted by hittrj01 View Post

What we don't know is how much Apple is paying Nvidia to be the first to use these chipsets. If they are as game-changing as they say, that might be worth quite a bit to a company like Apple. If Apple is giving them a pretty good cut, I would imagine it would be in Nvidia's best interest to work around Apple's schedule.

Apple probably doesn't have to pay them anything (other than the actual cost of the product). Apple being the first to use Nvidia chip sets in this way would be invaluable to Nvidia. If anything Nvidia should be paying Apple.
post #21 of 62
Quote:
Originally Posted by paxman View Post

No comment. Watching the hockey! Go Habs Go!

I'm watching the Red Sox.


In Los Angeles.


I hope the MBs have great GPUs.

Get over it Apple.

Hope they do.
post #22 of 62
Quote:
But I'm not a GPU-guy. I don't do any major video editing or play any graphics intensive games so I am definitely out of my league here. So can anyone tell me how moving to NVIDIA will affect the power usage over Intel's X4500 which has an even lower TDP than X3100? What will this offer the average MacBook user? Does this mean that even the MacBook will get DL-DVI, HDMI or DisplayPort connector capable of running 30" ACDs?

It just seems like a huge gamble and price increase for Apple for something that most MacBook users don't care about or need.

You've got a point. If the NVIDIA integrated GPU consumes more power than the Intel integrated one, then the MB will possibly have a shorter battery life, unless Apple negates this with a more energy-efficient system.

The reason Apple would want to put a proper integrated GPU in it is that in the iBook days it used ATI, which is a GPU manufacturer. Intel is not known for its GPUs, so people complain about the MB's graphics. Besides, the MB is now selling like hot cakes; by changing to an NVIDIA GPU, Apple gives old iBook owners a reason to upgrade.

I wonder what would happen if the new MB gets listed as Amazon's no. 1 best-selling laptop, beating those puny netbooks :amazed:
post #23 of 62
Quote:
Originally Posted by solipsism View Post

The integrated graphics never seemed to be a real issue for me. It's better than the discrete offering Apple had in the old ATI Mobility Radeon 9550 in the late iBooks. The X3100 can play my HD video just fine and the specs for the X4500 are considerably better.

But I'm not a GPU-guy. I don't do any major video editing or play any graphics intensive games so I am definitely out of my league here. So can anyone tell me how moving to NVIDIA will affect the power usage over Intel's X4500 which has an even lower TDP than X3100? What will this offer the average MacBook user? Does this mean that even the MacBook will get DL-DVI, HDMI or DisplayPort connector capable of running 30" ACDs?

It just seems like a huge gamble and price increase for Apple for something that most MacBook users don't care about or need.

Did you watch a video on your MacBook? Did you use iPhoto to increase the brightness on a picture? Voila, you just became a "GPU-guy". I'm not being silly here; a GPU of good performance, e.g. the 9300 or 9400, will have many overall benefits, which may not be immediately realised in the next few months. I'm not "feeling" the X4500 yet, but of course it remains a possibility... a smaller possibility, in my view.

Quote:
Originally Posted by Virgil-TB2 View Post

Apple probably doesn't have to pay them anything (other than the actual cost of the product). Apple being the first to use Nvidia chip sets in this way would be invaluable to Nvidia. If anything Nvidia should be paying Apple.


Yup, the way Apple operates is to make you give Apple a nice discount, then still feel like Apple is doing *you* the favour. Standard Apple operating procedure.
post #24 of 62
Quote:
Originally Posted by nvidia2008 View Post

Did you watch a video on your MacBook?

All the time. I even watch HD video encoded with x264, with 5.1-channel AAC, encapsulated as MKV files. Until the latest Perian update I had to use VLC Player, but now QT works quite well. Except if there are built-in subtitles, which requires turning off the subtitle video track from Command+J, since the menu option doesn't show a subtitle option.

I've even played 1080p content while using Exposé and running other apps, and I have never had an issue with playback stuttering, even when using ScreenFlow.app to capture to demonstrate the power of Core Animation.

The only thing I haven't tried is using an external, power-sucking Blu-ray drive to play back video on a MacBook, but that is pointless until we get HDCP support with Montevina (or NVIDIA?)
post #25 of 62
So NVDA at 6 bucks atm, which is pretty damn cheap, but then they have been blowing their own feet off very well of late, anyone feel game? I might try a quick trade but maxed out on AAPL. Ultimately wouldn't want to be long on NVDA.
post #26 of 62
Quote:
Originally Posted by solipsism View Post

All the time. I even watch HD video encoded with x264, with 5.1-channel AAC, encapsulated as MKV files. Until the latest Perian update I had to use VLC Player, but now QT works quite well. Except if there are built-in subtitles, which requires turning off the subtitle video track from Command+J, since the menu option doesn't show a subtitle option.

I've even played 1080p content while using Exposé and running other apps, and I have never had an issue with playback stuttering, even when using ScreenFlow.app to capture to demonstrate the power of Core Animation.

The only thing I haven't tried is using an external, power-sucking Blu-ray drive to play back video on a MacBook, but that is pointless until we get HDCP support with Montevina (or NVIDIA?)

Interesting. Do you think MKV with Perian and QT is using quite a bit of the GPU or is it still mostly CPU bound... I haven't actually kept myself informed on this. Sounds like CoreAnimation is using a lot of GPU there...?

As per my post, we need a bit more of the nitty-gritty detail on how much Core Animation etc. will try to get out of the GPU before it falls back to having the CPU render.
post #27 of 62
Quote:
Originally Posted by Captain Jack View Post

So NVDA at 6 bucks atm, which is pretty damn cheap, but then they have been blowing their own feet off very well of late, anyone feel game? I might try a quick trade but maxed out on AAPL. Ultimately wouldn't want to be long on NVDA.

Ooh... Very risky to fiddle with NVDA right now, IMHO.
post #28 of 62
This is a very odd rumor considering almost all the NVidia rumors on the PC sites are about them getting out of the chipset business altogether.

http://www.dailytech.com/article.aspx?newsid=13142

I don't know, we will see. NVIDIA's chipsets are so-so in performance and somewhat power-hungry:

http://www.tomshardware.com/reviews/...I,1977-26.html

So unless the new set is some radical breakthrough, I am dubious that there are any significant performance advantages, especially for a notebook system. Intel's next-gen integrated graphics chip probably moves closer to NVIDIA's dedicated performance than NVIDIA moves toward Intel's chipset power/performance.
post #29 of 62
Quote:
In conjunction with the Nvidia 9100M G integrated graphics, the 9600M GT supports Hybrid-SLI (only HybridPower). HybridPower is a technique to choose between the integrated and dedicated graphics core, if performance or battery runtime is needed. This works only in Windows Vista. Up to now the user has to use a tool to switch between the GPUs. Later Nvidia wants to switch automatically in the drivers. GeForceBoost is not supported with this card, as there would be no performance gain.
Quoted from NotebookCheck.com

Sounds promising if future MBP uses these.
post #30 of 62
Quote:
Originally Posted by solipsism View Post

Overall, their purchases may be minor, but their purchases of a specific model chip would seem to be excessive compared to other OEMs. Apple is forecast to sell some 12M units this year or next and has only a handful of options for their HW. As well as the free advertising (for better or worse) that comes along with being Apple.

PS: I'm still saying that X4500 is most likely for MB and MBAs.

Plus, they are able to pay their bills without much worry over cash. Now that is a good customer.
post #31 of 62
I really hope this means nVidia will actually bother with Apple as a real customer from now on. The last SEVERAL point releases containing GPU driver updates for OS X with nVidia hardware have been an absolute nightmare while the ATI machines have been fine.
post #32 of 62
Quote:
Originally Posted by solipsism View Post

The X3100 can play my HD video just fine and the specs for the X4500 are considerably better.

Are you sure? Does that include 1080p Blu-rays? For instance, my old PC played H.264 720p videos fine, but 1080p Blu-rays were just slideshows...
And when playing these videos, what is actually doing the job? Your integrated GPU or your CPU? If the latter, having the CPU at full throttle during the whole movie is not really helping the power usage...

Quote:
But I'm not a GPU-guy. I don't do any major video editing or play any graphics intensive games so I am definitely out of my league here.

You don't play games, but you use Core Animation, desktop effects and the like... And these rely on the GPU, or on the CPU emulating the missing functions.
It's even worse if you intend to use Boot Camp to run another OS on your notebook.

Anyway, using the GPU for work outside the strictly 3D world is the next trend. It started with the GUI: both Vista and Core Animation do that. It has also started with scientific calculations: NVIDIA now sells GPUs without any graphics output, so you can fit half a dozen into a computer for scientific calculation. Photo and video software can also use the power of the GPU, which is after all a matrix calculation unit, to speed up filters. What takes seconds on the CPU could be calculated in real time on the GPU. I would not be surprised to see this kind of use in software like Lightroom, Photoshop or their plugins...
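As a rough illustration of why filters map so well to the GPU (a plain-Python stand-in, not actual OpenCL or Photoshop code): each output pixel depends only on its own input pixel, so a GPU can compute thousands of them at once.

```python
def brighten(pixels, gain=1.2):
    """Per-pixel brightness filter. Each output value depends only on
    one input value, which is exactly the data-parallel shape that an
    OpenCL kernel would run across thousands of GPU threads at once."""
    return [min(255, round(p * gain)) for p in pixels]

row = [0, 100, 200, 250]
print(brighten(row))  # [0, 120, 240, 255] -- each element computed independently
```

A CPU walks this list one element at a time; a GPU would assign one thread per pixel, which is why such filters can run in real time there.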
post #33 of 62
Quote:
Originally Posted by solipsism View Post

The only thing I haven't tried is using an external, power-sucking Blu-ray drive to play back video on a MacBook, but that is pointless until we get HDCP support with Montevina (or NVIDIA?)

That's beyond pointless, even if you got that HDCP support. Why would anyone ever want to watch Blu-ray on a MacBook? Its screen is so small, and not only that, its screen is so far from being HD!
And as far as power-sucking goes, you obviously don't own an Apple TV. My Apple TV right now is not powered up and is generating heat as I type. And when it's on, I swear you could melt cheese on it.
I still don't get why you have such a grudge against Blu-ray??
post #34 of 62
Ah man... if they only put a 9600M GT in the MacBook Pro... I might actually switch to PC. That card is already out of date.

9650M GS at LEAST, I mean come on! Why would they put in a 9600M GT over a 9650M GT, which is far, far more powerful and yet has the same power draw? :(((
post #35 of 62
The 9650M is better than the 9600? Isn't that supposed to be the other way round?

The 9600 is the successor of the 8600, btw.
post #36 of 62
Quote:
Originally Posted by teckstud View Post

That's beyond pointless, even if you got that HDCP support. Why would anyone ever want to watch Blu-ray on a MacBook? Its screen is so small, and not only that, its screen is so far from being HD!

Some people only want to have to buy one copy of a movie; it's pointless having to get it on Blu-ray and DVD just so you can watch it while travelling.
post #37 of 62
Quote:
Originally Posted by jfanning View Post

Some people only want to have to buy one copy of a movie; it's pointless having to get it on Blu-ray and DVD just so you can watch it while travelling.

Yeah... I don't mind having an external FW400 Blu-Ray player/reader to connect to my MacBook.

720p, or even just 1080p downscaled to 720p, is still nice on the MacBook compared to the crappy resolution of DVDs.

DVDs were great, but once you start watching even "just" 720p TV and movies and trailers on your PC/Mac... It's really hard to go back.
post #38 of 62
Quote:
Originally Posted by solipsism View Post

The integrated graphics never seemed to be a real issue for me. It's better than the discrete offering Apple had in the old ATI Mobility Radeon 9550 in the late iBooks. The X3100 can play my HD video just fine and the specs for the X4500 are considerably better.

But I'm not a GPU-guy. I don't do any major video editing or play any graphics intensive games so I am definitely out of my league here. So can anyone tell me how moving to NVIDIA will affect the power usage over Intel's X4500 which has an even lower TDP than X3100? What will this offer the average MacBook user? Does this mean that even the MacBook will get DL-DVI, HDMI or DisplayPort connector capable of running 30" ACDs?

It just seems like a huge gamble and price increase for Apple for something that most MacBook users don't care about or need.

Oh, you're right that the Intel graphics are perfectly fine for the Macbook, but I get the feeling that they were a sticking point for a lot of people (then again, internet forums are full of complaints about everything). According to Intel, the X4500HD is able to decode 1080p Blu-Ray in hardware, by the way. Not that you'd be able to play it on a Macbook's screen. X4500 can also handle a dual-link DVI port.

I don't know how power consumption compares, but they should both be in the same range. One advantage Nvidia's IGPs have is that they are a single chip. There is no southbridge chip, which means manufacturers can design a smaller, simpler motherboard. But that's only an advantage until late next year when Intel will debut an all-new mobile platform.
post #39 of 62
Quote:
Originally Posted by wheelhot View Post

The 9650M is better than the 9600? Isn't that supposed to be the other way round?

9600 is the successor of 8600 btw.

9650M GT >> 9650M GS >>>>>>>> 9600M GT
post #40 of 62
Quote:
Originally Posted by FuturePastNow View Post

According to Intel, the X4500HD is able to decode 1080p Blu-Ray in hardware, by the way. Not that you'd be able to play it on a Macbook's screen. X4500 can also handle a dual-link DVI port.

Assuming that Apple will offer BRD in some way (which I don't think they will), I can't help but wonder if Apple would still go the micro- or mini-DVI route when people may occasionally want to connect to their HDTVs, not just a 30" ACD. DisplayPort, and to a lesser degree HDMI, seem like the most likely options.

Quote:
I don't know how power consumption compares, but they should both be in the same range. One advantage Nvidia's IGPs have is that they are a single chip. There is no southbridge chip, which means manufacturers can design a smaller, simpler motherboard. But that's only an advantage until late next year when Intel will debut an all-new mobile platform.

Apple is notorious for looking farther ahead than other vendors. But they have no choice, as the new machines they make now will be the standard for premium PCs for the next 3 years. Sure, the internal components change, but switching from an Intel chipset that works well enough to NVIDIA, just to switch back to Intel a year or two down the road, doesn't sound like Apple. How much better would NVIDIA's integrated solutions be? Personally, with all the issues with NVIDIA, I would not buy a new Mac next week if they are the ones supplying the chipset, until it gets thoroughly tested for several months.