
Apple rumored to switch to Nvidia graphics for new MacBook Pros

post #1 of 64
Thread Starter 
New reports claim to have confirmed that Apple's upcoming MacBook Pros will eschew AMD graphics processors in favor of GPUs from Nvidia.

The Cupertino, Calif., company's current 15-inch and 17-inch MacBook Pro models make use of AMD's Radeon HD 6750M and 6770M graphics chips. The 13-inch model features integrated Intel graphics.

According to The Verge, a "trusted source" has confirmed that Apple will switch to Nvidia as the supplier of its discrete GPUs for the MacBook Pro. ABC News' Joanna Stern, formerly of The Verge, reported separately that, according to her publication's own sources, the next generation of MacBook Pros will sport Nvidia graphics chips.

Monday's reports counter a rumor from March that Apple had decided to drop Nvidia's Kepler graphics cards from a "large number" of its next-gen MacBook Pros over supply issues.

Chatter surrounding Apple's Mac plans has picked up in recent weeks. Benchmarks purportedly from unreleased MacBook Pro and iMac models were spotted on the GeekBench site late last week. Piper Jaffray analyst Gene Munster issued a note speculating that new MacBooks and iMacs will arrive by June. The inconsistently accurate DigiTimes claimed that component suppliers expect new MacBook models to launch in June.

[Image: Apple's current 15-inch and 17-inch MacBook Pros feature AMD graphics.]


Bloomberg also chimed in on Monday with its own sources, claiming that Apple would unveil its new MacBook Pros at the Worldwide Developers Conference in June. According to the report, the next-generation laptops will be thinner and feature Retina Display-like screens.

AppleInsider's own sources revealed in February that Apple is planning a radical redesign of its professional notebooks that should slim down its MacBook Pros and make them more like the MacBook Air.
post #2 of 64
At least we're no longer hearing rumours about Apple dropping the discrete GPU in their MBPs in favour of Intel's integrated GPUs.

This bot has been removed from circulation due to a malfunctioning morality chip.
post #3 of 64

I'm still not sure I trust nVidia after the 8600M fiasco.

post #4 of 64
Quote:
Originally Posted by Tallest Skil View Post

I'm still not sure I trust nVidia after the 8600M fiasco.

Yep, seems like a bad move on Apple's part, especially when AMD seems to have had the better track record over the last couple of years.
post #5 of 64
Quote:
Originally Posted by Tallest Skil View Post

I'm still not sure I trust nVidia after the 8600M fiasco.

I like their desktop graphics cards. Aside from that, digitimes has been trolling everyone all day today.

post #6 of 64

Makes sense considering all the work that has been going on with the Nvidia drivers for ML

post #7 of 64
Welcome back, flickering screen. Yay!
post #8 of 64

After the 8600 though, Apple still used several other Nvidia chips... the 9400, 9600, 320M and 330M all came after that, and none of them had the issues of the 8600.

post #9 of 64

If the new MBPs are released around June 11th... when would they be in stores, ready to be bought?

post #10 of 64
Quote:
Originally Posted by wizard69 View Post


Yep, seems like a bad move on Apples part. Especially when AMD seems to have a better track record of the last couple of years.

 

Quote:
Originally Posted by SolipsismX View Post

At least we're no longer hearing rumours about Apple dropping the discrete GPU in their MBPs in favour of Intel's integrated GPUs.

 

Quote:
Originally Posted by hmm View Post

I like their desktop graphics cards. Aside from that, digitimes has been trolling everyone all day today.

 

In any case, the discrete GPU industry is on its way out. Only certain 15" MBPs and the 17"-class MBPs will have discrete GPUs. AMD and Nvidia blew it, and Intel steamrolled them, legally and illegally (e.g. locking out Nvidia). In the meantime, PowerVR and ARM are eating everyone's breakfast, lunch and, soon, dinner.

 

As I said before, my username is perhaps the peak of discrete GPUs (Even then the 8600M fiasco was quite bad, though that G92 GPU design was superb).

 

I have the famed Nvidia 320M in my MBP 13" 2010. It's okay, but compared to an integrated-Intel MBP 13", nothing great.

 

So many laptop discrete GPUs are such crippled versions of their desktop brethren that the difference between them and Intel is nothing more than marketing. 2GB of VRAM on a useless laptop discrete GPU? That's like those $300 HDMI cables.

 

Throw in the whole "casual gaming" phenomenon and that's the killing blow to the discrete GPU industry.

 

Consider this: just at the time when GPUs became ever more hot, heavy, expensive and noisy to support ever more complex, risky and expensive (software and hardware) games, people gravitated towards simpler, alternative "casual" games. The "perfect storm" that crushed almost all discrete GPU dreams.

 

The only decent GPUs in the future will be:

 

1. Megalithic desktop 500W-1KW multi-card setups, ie. niche stuff

 

2a. Intel Integrated which will benefit from modest improvements in GPU architecture but huge gains in CPU power (ie. GPGPU not so essential because CPU still handles a lot of tasks, including custom routines for video encoding said to be the forte of GPUs, which is now bollocks)

 

2b. Intel Integrated which will benefit a lot from process improvements which currently outpace anything TSMC/ AMD/ Nvidia can achieve

 

3. PowerVR which will come in from the ground up, ie. iPad 3/4/5 GPU is the next great gaming GPU.

 

So on one hand we have niche "high-performance" stuff that has no real-world mainstream application, aging gaming consoles with now very paltry graphics, "next-gen" gaming consoles which are still dicey in terms of business "models", and Intel Integrated which is sufficient for mainstream computing but nowhere near gaming-class.

 

And on the other hand... iPad. 'Nuff said.


Edited by nvidia2008 - 5/14/12 at 9:02pm
post #11 of 64
Quote:
Originally Posted by BillyLavoie View Post

Makes sense considering all the work that has been going on with the Nvidia drivers for ML

 

Like hell it does. Nvidia's mobile chips pale in comparison to AMD's on price and power, not to mention OpenCL performance.

post #12 of 64
Unless Nvidia has something up their sleeves, I wouldn't want a mobile GK107 in a MacBook. The GK107 would be faster than the 6770M, but we are only talking about a ~20% difference. Since the new GPU would be built on 28nm, I expect something much more powerful.

There are only two kinds of people in this world.

Those who don't understand Apple and those who misunderstood Apple.
post #13 of 64
Quote:
Originally Posted by mdriftmeyer View Post

 

Like hell it does. Nvidia's mobile chips pale in comparison to AMD's on price and power, not to mention OpenCL performance.

 

True, AMD/ATI turned their boat around 180 degrees with the 5000 series, particularly the flagship performance/watt/price king, the 5850. Nvidia has been simply coasting sideways at best since the G92 (8600, 9600, GT250/260/whatever).

 

Still, AMD's drivers are horrible (endless updates; Need For Speed: Shift and other games inexplicably laggy), while Nvidia supposedly has better drivers, but again, laptop discrete GPUs aren't that fantastic for the cost.

 

OpenCL is a great idea, but in practice has met little success. Folding@Home on GPU was for many years incredibly unstable, OpenCL or not.

post #14 of 64
Quote:
Originally Posted by ksec View Post

Unless Nvidia has something up their sleeves, I wouldn't want a mobile GK107 in a MacBook. The GK107 would be faster than the 6770M, but we are only talking about a ~20% difference. Since the new GPU would be built on 28nm, I expect something much more powerful.

 

That's assuming they can get over their endless process issues with TSMC or whoever the fab-du-jour is. And for some reason Samsung's fabs have not really broken into the market for small-process-node, great performance-per-watt laptop/desktop-class GPUs.

 

It appears that due to "competition" everyone is going in five different directions, and the "farm" (as Steve Jobs put it) has all these animals that don't make sense. You've got an obese chicken that doesn't lay eggs, an anorexic pig that produces milk, and a horse with three legs that poops spinach coming out of the barn. (Not to mention Sony's Cell processor, which is now kinda like an upside-down cow - no surprise, since the PS3 looks like it's made for grilling hamburgers.)

 

An industry-wide chaos that causes indigestion for the consumer.

 

In terms of making kickass GPUs: Intel has the best fabs. PowerVR and ARM have the best mobile architectures. AMD has the best high-performance desktop/laptop architecture. Apple has the best stable software platform.

 

Wouldn't it be a better world where AMD designs the high-performance desktop/laptop architecture, ARM and PowerVR design the mobile architecture, Intel fabs them all, and Apple features them on iOS, Mac and gaming platforms, e.g. iPad, AppleTV? Tell me this doesn't sound attractive to almost all consumers.

 

Of course, if most of the world had this kind of stellar collaboration, we could probably eradicate poverty by 2050. That said, in the tech world at least, it is possible, usually with a visionary leader such as Steve Jobs. So, will the next Steve Jobs please stand up, please stand up, please stand up?

 

RIP Voodoo PC, by the way; the dude in charge talked up a lot of hype post-buyout by HP. Sadly, nothing much has come of it since.

 

On a final note, who needs a beyond-average GPU in a laptop anyway? Only certain niches (aka "verticals"). Gaming has potential, but Windows is nonsense compared to the simplicity and tradeability of Xbox, PS and Wii, and Mac titles are, as always, progressing but still laughable.


Edited by nvidia2008 - 5/14/12 at 9:25pm
post #15 of 64

I just started a video production course with a four-year-old MacBook. I will need a better computer after I graduate. What hardware is most important for editing and creating video? I read about the GPU and the CPU and how both are improving over time. At this time discrete graphics processors seem to be ideal for video games or watching movies, but how do they figure into editing and creating movies?

 

Having the most RAM possible is probably a good thing for all computing tasks. I just want to know what hardware to concentrate on when I shop for my next computer. Experts, please advise me. Screen size isn't important on a laptop because I use an external monitor at my desk.
 

post #16 of 64
Quote:

I have the famed Nvidia 320M in my MBP 13" 2010. It's okay, but compared to an integrated-Intel MBP 13", nothing great.

 

The 320M is a more powerful GPU than Intel's HD 3000. It is only the better CPU in the newer MacBook Pro 13" that makes them faster. From a GPU standpoint, the current 13" was a downgrade.

 

 

Quote:
The new 13in MacBook Pros and their Intel HD Graphics 3000 processors weren't that impressive in our games tests, scoring lower than the older 13in systems with Nvidia GeForce 320M integrated graphics. In the 1024-by-768-resolution Call of Duty test, the 13in 2.3GHz Core i5 MacBook Pro displayed 26 fps (frames per second) on average, while the 13in 2.7GHz Core i7 MacBook Pro averaged 27 fps. Those results are well below the 33 fps displayed by the older 13in 2.66GHz Core 2 Duo MacBook Pro with Nvidia graphics.

 

While a single game is far from a perfect benchmark, it is not hard to find other tests that back that up.

 

-kpluck

Do you use MagicJack?

The default settings will automatically charge your credit card each year for service renewal. You will not be notified or warned in any way. You can turn auto renewal off.
post #17 of 64
Quote:
Originally Posted by kpluck View Post

The 320M is a more powerful GPU than Intel's HD 3000. It is only the better CPU in the newer MacBook Pro 13" that makes them faster. From a GPU standpoint, the current 13" was a downgrade.

 

 

While that is true, that's also the spectre discrete GPUs face. The 13" was a downgrade, no doubt, but how many people "suffered" as a result? Since CPU power massively improved, the GPU downgrade was not really felt (aside from in high-end games, which are rare for Mac users).

 

Even for games, given the way the ports are done, Psychonauts is unplayable on my 320M, even though it's not Intel and even though that game is, well, very, very old.

 

So while I enjoy the 320M's advantages in, say, OpenGL Photoshop, where zoom levels are antialiased/sampled properly, increasingly Intel GPUs can do this too.

 

iPhoto and iMovie can leverage the GPU, but on my MBP 13" (320M, Core 2 Duo), using the GPU for certain iPhoto and iMovie tasks is ~slower~ compared to Sandy Bridge.

 

I guess I'm proposing that the promise of discrete GPUs has been obliterated by poor real-world implementation.

 

The discrete GPU companies have also now painted themselves into a corner:

 

How is this 100W-200W+ behemoth ever going to benefit our new mobile, tablet, sleek and slim lifestyle while still pushing ever greater interactive experiences?

 

[Image: Nvidia GeForce GTX 670]

post #18 of 64
Quote:
Originally Posted by mdriftmeyer View Post

 

Like hell it does. Nvidia's mobile chips pale in comparison to AMD's on price and power, not to mention OpenCL performance.

In laptops maybe, and not by much. Where are you getting your information?

Quote:
Originally Posted by Smallwheels View Post

I just started a video production course with a four-year-old MacBook. I will need a better computer after I graduate. What hardware is most important for editing and creating video? I read about the GPU and the CPU and how both are improving over time. At this time discrete graphics processors seem to be ideal for video games or watching movies, but how do they figure into editing and creating movies?

 

Having the most RAM possible is probably a good thing for all computing tasks. I just want to know what hardware to concentrate on when I shop for my next computer. Experts, please advise me. Screen size isn't important on a laptop because I use an external monitor at my desk.
 

You are asking the wrong questions. You need to do some research, because it depends on the application and how you will be using it. We're well past the days of "what computer do I need for X application?" I wouldn't worry about the computer until you're close to graduation anyway, as requirements will change in that time.

Quote:
Originally Posted by nvidia2008 View Post

 

True, AMD/ATI turned their boat around 180 degrees with the 5000 series, particularly the flagship performance/watt/price king, the 5850. Nvidia has been simply coasting sideways at best since the G92 (8600, 9600, GT250/260/whatever).

 

Still, AMD's drivers are horrible (endless updates; Need For Speed: Shift and other games inexplicably laggy), while Nvidia supposedly has better drivers, but again, laptop discrete GPUs aren't that fantastic for the cost.

 

OpenCL is a great idea, but in practice has met little success. Folding@Home on GPU was for many years incredibly unstable, OpenCL or not.

I disagree. OpenCL has seen some nice adoption, and it's extremely useful in many more things today. More people benefit from a strong gpu and OpenCL support today than would have a few years ago.

Quote:
Originally Posted by ksec View Post

Unless Nvidia has something up their sleeves, I wouldn't want a mobile GK107 in a MacBook. The GK107 would be faster than the 6770M, but we are only talking about a ~20% difference. Since the new GPU would be built on 28nm, I expect something much more powerful.

Is this year's AMD significantly better? Mobile GPUs were NVidia's weakest point. It's a troll rumor anyway. Remember a few months ago? We had the same rumor, then a rumor they couldn't satisfy Apple, now another rumor of NVidia, all from the same source. Laptops debuting later than initially expected by the masses = many, many fabricated rumors.

post #19 of 64
Quote:
Originally Posted by Smallwheels View Post

I just started a video production course with a four-year-old MacBook. I will need a better computer after I graduate. What hardware is most important for editing and creating video? I read about the GPU and the CPU and how both are improving over time. At this time discrete graphics processors seem to be ideal for video games or watching movies, but how do they figure into editing and creating movies?

 

Having the most RAM possible is probably a good thing for all computing tasks. I just want to know what hardware to concentrate on when I shop for my next computer. Experts, please advise me. Screen size isn't important on a laptop because I use an external monitor at my desk.
 

 

Well, in my opinion, if you are using your Mac for serious stuff, you have to consider it in totality. First, the hard disk, which is often overlooked: an SSD is absolutely essential, with Thunderbolt to an external RAID 0 (or 0+1 or something like that; somebody else probably knows better). Next, RAM: indeed, don't skimp on it. Start off with 8GB, and use any "Apple-compatible" RAM; it's just not recommended to get regular PC or "value" RAM.

 

Next, the CPU... In this case you want as fast as you can afford, because encoding, decoding, compression and so on are very CPU-intensive.

 

For the GPU, in this case yes, discrete GPUs are your only option.

 

Since you'll be getting a Mac, and you don't need a big screen, the choice is pretty simple. Get the ~fastest possible~, i.e. the best CPU and best GPU you can put into a custom MacBook Pro 15" with 8GB of RAM and a 300GB+ SSD. You save some of the money that would go towards the 17" laptop by instead "spec'ing out" the 15" MBP to the max.

 

My 2 cents :)

post #20 of 64
The GPU is important for rendering complex video, but you also need a good CPU to complement it. If you only get a good GPU and a crappy CPU, you will hit a bottleneck because your computer won't be able to keep up. You should spend roughly the same amount on the CPU and GPU, so look them up and make sure your GPU is not more powerful.
You were correct in saying that you need lots and lots of speedy RAM. This is very important for video editing.
post #21 of 64
Quote:

Originally Posted by hmm View Post

I disagree. OpenCL has seen some nice adoption, and it's extremely useful in many more things today. More people benefit from a strong gpu and OpenCL support today than would have a few years ago.

 

 

 

Interesting. Name me 5 OpenCL OS X 10.7 apps on the Mac App Store that qualify for a decent, mainstream Mac experience. Core Image doesn't count; it has to be mostly OpenCL, that is, the app calls OpenCL directly rather than indirectly through Core Image.
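
For anyone wondering what "calling OpenCL directly" actually looks like, here's a minimal, illustrative sketch of the host-side boilerplate on OS X (a toy kernel that squares an array; untested, error handling mostly omitted):

Code:
/* Minimal OpenCL host program: squares 16 floats on the GPU.
 * Build on OS X with: cc square.c -framework OpenCL
 * (other platforms use <CL/cl.h> and link against the OpenCL library) */
#include <OpenCL/opencl.h>
#include <stdio.h>

static const char *src =
    "__kernel void square(__global float *buf) {\n"
    "    size_t i = get_global_id(0);\n"
    "    buf[i] = buf[i] * buf[i];\n"
    "}\n";

int main(void) {
    float data[16];
    size_t n = 16;
    for (size_t i = 0; i < n; i++) data[i] = (float)i;

    /* Grab the first GPU device (Apple's implementation accepts a NULL platform) */
    cl_device_id dev;
    clGetDeviceIDs(NULL, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Compile the kernel from source at runtime */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "square", NULL);

    /* Copy data to the device, run one work-item per element, read back */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    for (size_t i = 0; i < n; i++) printf("%g ", data[i]);
    printf("\n");

    clReleaseMemObject(buf); clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}

All that device/context/queue/compile ceremony is exactly what Core Image hides from you, which is part of why so few Mac App Store apps bother with raw OpenCL.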

 

"Analog" on the Mac App Store is really cool, but it's using my GPU for Core Image calls to do the filtering, and it's laggy at times, if it used pure CPU power on a higher-end Sandy Bridge MBP, it might be faster.

 

And with Ivy Bridge just around the corner, the case for discrete GPUs, given the sheer power of Ivy Bridge's CPU components, is diminishing fast for ~mainstream laptop~ computing.

 

As far back as 2005, desktop sales jumped off a cliff, and at that point they took discrete GPUs with them.

 



Edited by nvidia2008 - 5/14/12 at 10:38pm
post #22 of 64

At this stage as well, the Ivy Bridge CPU and GPU benchmarks are off the charts for anything ever conceived as an on-die CPU+GPU:

http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review

 

Crysis Warhead, Metro 2033, Dirt 3 at playable frame rates by Intel Integrated Graphics is nothing short of unbelievable given how bad they used to be.

 

I think at this stage only the high-end MBP 15" and high-end MBP 17" Ivy Bridge MBPs will have discrete GPUs.

 

I mean, Ivy Bridge appears to clearly outpace the ATI 5450, the staple low-but-not-horrible-end class of discrete GPU.

 

If Intel can ramp production suitably, Ivy Bridge is pretty much the nail in the coffin for discrete GPUs, and AMD will struggle in the mid to higher prices.

 

Also, while games are useful for benchmarking the integrated GPU, for most mainstream use it's more than enough, provided driver support is adequate.

 

We're talking 30%-40% improvements over Sandy Bridge. Imagine, playing a DX11 game on an Intel Integrated GPU... albeit at 720p... but still... this is big.

 

And if you look at general compute of the ~GPU alone~ for Ivy Bridge, pretty much game over for AMD and Nvidia in the mid- to high-priced mainstream-use laptops:

 

[Chart: Ivy Bridge GPU compute benchmarks]

 

You buy one chip. You want a fast CPU? It's in that one chip. You want a reasonable GPU for graphics and compute? It's in the same one chip. You want optimised video encoding? Also in the same chip.

 

Game over.


Edited by nvidia2008 - 5/14/12 at 10:59pm
post #23 of 64
Quote:
Originally Posted by kpluck View Post

The 320M is a more powerful GPU than Intels HD3000. It is only the better CPU in the newer MacBook Pro 13" that makes them faster. From a GPU standpoint, the current 13" was a downgrade.

 

While a single game is far from a perfect benchmark, it is not hard to find other tests that back up that up.


Indeed. Not only is the Intel GPU less capable, it is also buggy. Running the Pixel City screensaver on the 2010 MacBook Air was smooth as can be; on the 2011 model, the fans kick into high gear after a short while and there are always artifacts on the screen. That was never a problem with the 2010 model.

 

.tsooJ

post #24 of 64
Quote:
Originally Posted by gyorpb View Post


Indeed. Not only is the Intel GPU less capable, it is also buggy. Running the Pixel City screensaver on the 2010 MacBook Air was smooth as can be; on the 2011 model, the fans kick into high gear after a short while and there are always artifacts on the screen. That was never a problem with the 2010 model.

 

.tsooJ

 

Well, it is quite nice, but on my MBP with the 320M it's running at 15fps on an external 1680x1050 monitor, so the point is fairly moot. On the MBP 13" screen natively it's not too bad. Again, if anyone has examples of great OpenCL GPU-utilising apps, that would be nice.

 

Also, if Intel's drivers improve, the Ivy Bridge GPU will make most discrete GPUs unnecessary. On Windows at least, based on AnandTech's testing, game compatibility and drivers for the most part seem quite alright, and the Ivy Bridge GPU benches quite well against discrete cards. Nothing to suddenly turn the tide against discrete GPUs for enthusiast PC gaming, but for everything else, there is definitely something interesting happening with Ivy Bridge.

 

Drivers are a fair point, so I hope Apple and Intel get it right with Ivy Bridge, because they will be the basis of the fastest Mac laptops ever made, on the best and most elegant laptop platform ever created (think MacBook Air-esque design coming to a pro-level 15" laptop).

post #25 of 64

The fact that NVidia has won the race for the next MacBook Pro round with its Fermi architecture has been known since the beginning of the year.

From a GPGPU standpoint, Fermi is going to provide extended-precision (80-bit) floating point capability.

But I have also heard that NVidia is going to drop support for OpenCL in favor of CUDA.

All of that remains to be confirmed.

Predictions are perilous, especially about the future. - Niels Bohr
post #26 of 64
Quote:
Originally Posted by doh123 View Post

After the 8600 though, Apple still used several other Nvidia chips... the 9400, 9600, 320M and 330M all came after that, and none of them had the issues of the 8600.

The 320M was a great IGP. One thing I don't like about AMD GPUs is they run very hot. The X1900XT cards were problematic in the old Mac Pros. Their drivers are also not as good as NVidia's.

Someone benchmarked the NVidia 650M in an Alienware laptop here:

http://forum.notebookreview.com/alienware-m14x/661207-650m-preliminary-benchmarks-up-67-faster-than-555m.html

The fastest MBP GPU, the 6770M, gets 23fps in Metro 2033 on high; the 650M is at 35fps. These benchmarks vary a lot, but one interesting comment was that the GPU didn't go over 60 degrees during the test, presumably even after overclocking. This was beside a quad-core i7.

The 6770M, on the other hand, can go a fair bit higher at stock:

https://discussions.apple.com/thread/3633821?start=0&tstart=0

A number of separate reports online say 80-90 degrees. It makes sense to go with a cool chip if you have a smaller enclosure. There's going to be more room without the optical drive, but the cooler the better.

Here's the entire NVidia lineup:

http://www.geforce.com/whats-new/articles/geforce-600m-notebooks-efficient-and-powerful/

Even the 640M gets 30FPS in Metro 2033 on high.

I could see them using the 630M in the middle Mini, the 640M in the entry 15" and the 650M in the high-end MBPs. There is a GTX 680M coming on June 5th that would be suitable for the 27" iMac (might explain the delay):

http://www.legitreviews.com/news/13130/

Looks like a full refresh at WWDC.
post #27 of 64
Quote:
Originally Posted by nvidia2008 View Post

 

Game over.

 

But computer monitors are about to quadruple in pixels (hopefully starting with Macs in the next few weeks).

 

And 3D graphics are hardly lifelike yet. I think these discrete video card makers could create demand for their products by inventing and giving away new algorithms for e.g. lifelike movement, lifelike skin, lifelike trees, that require the compute power of their devices. Make it so that people aren't just comparing FPS, but that things look totally different altogether on an expensive card.

post #28 of 64
Quote:

Originally Posted by nvidia2008 View Post

 

Also, if Intel's drivers improve, the Ivy Bridge GPU will make most discrete GPUs unnecessary. On Windows at least, based on AnandTech's testing, game compatibility and drivers for the most part seem quite alright, and the Ivy Bridge GPU benches quite well against discrete cards. Nothing to suddenly turn the tide against discrete GPUs for enthusiast PC gaming, but for everything else, there is definitely something interesting happening with Ivy Bridge.

 

Drivers are a fair point, so I hope Apple and Intel get it right with Ivy Bridge, because they will be the basis of the fastest Mac laptops ever made, on the best and most elegant laptop platform ever created (think MacBook Air-esque design coming to a pro-level 15" laptop).

If Intel's drivers improve, indeed. Not when.

 

And rather than just delivering functional drivers with new hardware, how about fixing the existing ones? I'm not about to write off a six-month-old MacBook Air simply because Intel decided to move on to the Next Big Thing. Granted, I never had issues in everyday use of the MacBook Air, but the rendering errors in the Pixel City screen saver are plentiful, embarrassing and telling.

 

.tsooJ

post #29 of 64

CUDA. 4 little letters that make the scientific computing community want Nvidia rather than OpenCL on another platform.

No, you won't find that especially important in the App Store, or in the latest games (because the GPU is already busy), but for the PRO wanting/using an MBP it is a killer spec.

post #30 of 64
If this is true, then it would be a decision that might come back to burn Apple due to the 8600 GPU fiasco... I hope not, because (while poor form) the late Steve Jobs would have outright said no... (burned once, forever forgotten).
The only reason they would be going with Nvidia would be Nvidia's faster drivers, and that Nvidia could supply enough for the laptops. Perhaps this is the reason the GTX 680 and GTX 670 cannot be found for purchase on Newegg or elsewhere...
post #31 of 64

The GT 650M was found in the developer software. I'd say that chip is a good bet: a significantly higher 3DMark Vantage score than today's MBPs' stock GPUs. And it's a true Kepler, not a re-badge.


http://www.geforce.com/hardware/desktop-gpus/geforce-gt-650m

 

[Image: Nvidia GeForce 600M lineup]

post #32 of 64
Quote:
Originally Posted by haar View Post
I hope not, because (while poor form) the late Steve Jobs would have outright said no... (burned once, forever forgotten).
 

 

Which is why after Microsoft ripped off Mac OS while they were working on software for the new Mac OS, Apple made sure to never again do business with Microsoft.  Oh wait...  :)  Yeah, I don't think reality is what you think it is.

post #33 of 64
Quote:
Originally Posted by Marvin View Post

I could see them using the 630M in the middle Mini, the 640M in the entry 15" and the 650M in the high-end MBPs. There is a GTX 680M coming on June 5th that would be suitable for the 27" iMac (might explain the delay):
http://www.legitreviews.com/news/13130/
Looks like a full refresh at WWDC.

 

A 630M in a refreshed mini this summer would be very nice.

post #34 of 64
Quote:
Originally Posted by Tallest Skil View Post

I'm still not sure I trust nVidia after the 8600M fiasco.

 

I was thinking the same thing. 

 

That, and since when was ABC News a sure thing for knowing about any tech, much less Apple? Something tells me we might find out that their trusted source is DigiTimes.

A non tech's thoughts on Apple stuff

(She's family so I'm a little biased)
post #35 of 64
Quote:
Originally Posted by hmm View Post

In laptops maybe, and not by much. Where are you getting your information?
A few watts here and there make for a big difference in battery life. Beyond that, AMD has been producing their latest series of laptop GPUs for a couple of months now. Producing, notably, without all the rumored reliability issues NVidia has been having.
Quote:
You are asking the wrong questions. You need to do some research, because it depends on the application and how you will be using it. We're well past the days of "what computer do I need for X application?" I wouldn't worry about the computer until you're close to graduation anyway, as requirements will change in that time.
More importantly, computers will change dramatically in that time. I suspect, though, that his MacBook may be underpowered for the educational uses he has lined up for it.
Quote:
I disagree. OpenCL has seen some nice adoption, and it's extremely useful in many more things today. More people benefit from a strong gpu and OpenCL support today than would have a few years ago.
OpenCL has been one of Apple's greatest success stories. Even Adobe is switching over to it. As to GPU acceleration in general, each iteration of the Mac OS continues to leverage the GPU to a greater extent. I'm not sure why the significance of OpenCL is so underestimated in these forums.
Quote:
Is this year's AMD significantly better? Mobile GPUs were NVidia's weakest point. It's a troll rumor anyway. Remember a few months ago? We had the same rumor, then a rumor they couldn't satisfy Apple, now another rumor of NVidia, all from the same source. Laptops debuting later than initially expected by the masses = many, many fabricated rumors.
I would have to say yes, AMD is better. Their GPUs run at lower power, are better OpenCL machines and have not suffered from reliability problems like NVidia's. AMD's drivers "MAY" be less reliable, but much of the noise with respect to AMD drivers is the result of issues many years old. That doesn't mean they are in any way perfect, driver-wise, but NVidia hasn't exactly been wonderful on the Mac platform. Besides, it is hard to tell who is responsible for what at Apple. Remember, Apple is the company that is way behind on OpenGL support and other features.

NVidia may be the new play at Apple but I'm not convinced that they are worth it. Maybe for niche uses NVidia has advantages but for all around usage I'd prefer AMD.
post #36 of 64
Quote:
Originally Posted by Tallest Skil View Post

I'm still not sure I trust nVidia after the 8600M fiasco.

Tell me about it. Mine died a couple months ago. I didn't have money for a replacement or for a new Mac, so I bought a Zotac AD04 and... I just found out a few weeks ago that I could have had it fixed for free in an Apple Store. FFFFFFFFFFFFUUUUUUUUUUUUUUUUU!!!!!!!!!!!!!!!!!

post #37 of 64
Quote:
Originally Posted by heffeque View Post

Tell me about it. Mine died a couple months ago. I didn't have money for a replacement or for a new Mac, so I bought a Zotac AD04 and... I just found out a few weeks ago that I could have had it fixed for free in an Apple Store. FFFFFFFFFFFFUUUUUUUUUUUUUUUUU!!!!!!!!!!!!!!!!!
Do you still have the Mac? Did we learn a lesson?

This bot has been removed from circulation due to a malfunctioning morality chip.
post #38 of 64
Quote:
Originally Posted by nvidia2008 View Post

Interesting. Name me 5 OpenCL OS X 10.7 apps on the Mac App Store that qualify for a decent, mainstream Mac experience. Core Image doesn't count; it has to be mostly OpenCL, that is, the app calls OpenCL directly rather than indirectly through Core Image.
Talk about trying to limit the discussion in your favor! Since when does the Mac App Store cater to high-performance apps? Further, do you really expect people to go out and do such research for you?
Quote:
"Analog" on the Mac App Store is really cool, but it's using my GPU for Core Image calls to do the filtering, and it's laggy at times, if it used pure CPU power on a higher-end Sandy Bridge MBP, it might be faster.
"might be faster". That is a real firm position to take. As for Sandy Bridge being faster for core image that is very doubtful, given that the SB chip is coupled with a modern GPU.
Quote:
And with Ivy Bridge just around the corner, the case for discrete GPUs, given the sheer power of Ivy Bridge's CPU components, is diminishing fast for ~mainstream laptop~ computing.
For a low-end laptop, maybe. For a mainstream machine it is much more of a mixed bag. Further, with HiDPI we might actually see a performance regression. Here is the reality: the Ivy Bridge GPU still sucks; it doesn't even outperform AMD's year-old APU.
Quote:
As far back as 2005, desktop sales jumped off a cliff, and at that point they took discrete GPUs with them.

I don't deny that discrete GPUs become harder to justify with each iteration of compute hardware. However, integrated GPUs (Ivy Bridge & Trinity) are a long way from providing replacement functionality in the likes of the MBPs. That will likely change in a couple of years, but right now we still need discrete GPUs.

As to OpenCL, I'm not sure why you constantly poo-poo it. OpenCL is perhaps one of Apple's greatest success stories from a developer perspective. The movement to OpenCL has taken place throughout the industry, as it is the best open solution out there for leveraging the computing resources often found in GPUs. I really see no basis for your position.
post #39 of 64

nvidia2008, you need your head checked.

 

 

Quote:
Originally Posted by nvidia2008 View Post

 

 

 

In any case, the discrete GPU industry is on its way out. Only certain 15" MBPs and the 17"-class MBPs will have discrete GPUs. AMD and Nvidia blew it, and Intel steamrolled them, legally and illegally (e.g. locking out Nvidia). In the meantime, PowerVR and ARM are eating everyone's breakfast, lunch and, soon, dinner.

 

Are you from the future? 2016? Because as far as I know, that's the only scenario where such a statement could possibly be true.

 

As I said before, my username is perhaps the peak of discrete GPUs (Even then the 8600M fiasco was quite bad, though that G92 GPU design was superb).

 

I have the famed Nvidia 320M in my MBP 13" 2010. It's okay, but compared to an integrated-Intel MBP 13", nothing great.

 

Not only is that incorrect (it's better than Sandy Bridge graphics), but even if it weren't, that'd be nothing more than expected, seeing as that GPU was really low-end and the HD3000 is, oh I don't know, a generation newer?

 

So many laptop discrete GPUs are such crippled versions of their desktop brethren that the difference between them and Intel is nothing more than marketing. 2GB of VRAM on a useless laptop discrete GPU? That's like those $300 HDMI cables.

 

Bullshit, and any quick look at Wikipedia or benchmarks at AnandTech proves it.

 

Throw in the whole "casual gaming" phenomenon and that's the killing blow to the discrete GPU industry.

 

Casual gaming is not going to kill AAA titles. That's simply a thoughtless assertion.

 

Consider this: just at the time when GPUs became ever more hot, heavy, expensive and noisy to support ever more complex, risky and expensive (software and hardware) games, people gravitated towards simpler, alternative "casual" games. The "perfect storm" that crushed almost all discrete GPU dreams.

 


BULLSHIT. Double one, at that:

1) GPUs are getting more and more power efficient and smaller. Look at the size of a Nvidia GTX 670.
2) People are NOT "gravitating towards simpler games"; it's people who never played games that are being drawn into casual gaming thanks to smartphones. They'd never have spent money on a gaming gadget otherwise - casual or not - and the few exceptions to that rule are ex-Nintendo DS users, who don't fit in with the market we're talking about.

 

Also, I enjoy how Nintendo's suffering with all this. Maybe then they'll stop selling gimmicky, outdated hardware that sports only one or two quality titles.

 

The only decent GPUs in the future will be:

 

1. Megalithic desktop 500W-1KW multi-card setups, ie. niche stuff

Nope.

 

2a. Intel Integrated which will benefit from modest improvements in GPU architecture but huge gains in CPU power (ie. GPGPU not so essential because CPU still handles a lot of tasks, including custom routines for video encoding said to be the forte of GPUs, which is now bollocks)

 

2b. Intel Integrated which will benefit a lot from process improvements which currently outpace anything TSMC/ AMD/ Nvidia can achieve

 

3. PowerVR which will come in from the ground up, ie. iPad 3/4/5 GPU is the next great gaming GPU.

 

So on one hand we have niche "high-performance" stuff that has no real-world mainstream application, aging gaming consoles with now very paltry graphics, "next-gen" gaming consoles which are still dicey in terms of business "models", and Intel Integrated which is sufficient for mainstream computing but nowhere near gaming-class.

 

And on the other hand... iPad. 'Nuff said.

You're implying that AAA titles will vanish and that the future of gaming is based on Angry Birds lookalikes running on iPads?

I feel an urge to call names... Let's just settle on "THAT'S JUST INANE BS"

 

 

Quote:
Originally Posted by nvidia2008 View Post

 

On a final note, who needs a beyond-average GPU in a laptop anyway? Only certain niches (aka "verticals"). Gaming has potential, but Windows is nonsense compared to the simplicity and tradeability of Xbox, PS and Wii, and Mac titles are, as always, progressing but still laughable.

 

Oh scratch that... Angry Birds on iPads... AND CONSOLES? Considering the rumored specs for the "next-gen" consoles are based on last year's mid-end standard GPUs, many notebooks will potentially outperform these new consoles at launch. You know who you can thank for that?

Discrete GPUs.

 

 

Quote:
Originally Posted by nvidia2008 View Post

 

While that is true, that's also the spectre discrete GPUs face. The 13" was a downgrade, no doubt, but how many people "suffered" as a result? Since CPU power massively improved, the GPU downgrade was not really felt (aside from in high-end games, which are rare for Mac users).

 

I didn't buy a MacBook last year because it's insane that a $1000+ machine doesn't have a goddamned real GPU. I own the same MBP you do.

 

Even for games, given the way the ports are done, Psychonauts is unplayable on my 320M, even though it's not Intel and even though that game is, well, very, very old.

 

So while I enjoy the 320M's advantages in, say, OpenGL Photoshop, where zoom levels are antialiased/sampled properly, increasingly Intel GPUs can do this too.

 

iPhoto and iMovie can leverage the GPU, but on my MBP 13" (320M, Core 2 Duo), using the GPU for certain iPhoto and iMovie tasks is ~slower~ compared to Sandy Bridge.

 

So there's this one task in which Sandy Bridge uses both CPU and GPU, and the computer that uses just the GPU, and is one year older, LOSES? SHOCKING.

 

I guess I'm proposing that the promise of discrete GPUs has been obliterated by poor real-world implementation.

 

Poorly implemented by Apple, you mean.

 

The discrete GPU companies have also now painted themselves into a corner:

 

How is this 100W-200W+ behemoth ever going to benefit our new mobile, tablet, sleek and slim lifestyle while still pushing ever greater interactive experiences?

 

[Image: Nvidia GeForce GTX 670]

 

I could put up a picture of a hexacore Intel Extreme i7 and ask the same thing. If we're comparing apples to oranges...

 
Quote:
Originally Posted by nvidia2008 View Post

At this stage as well, the Ivy Bridge CPU and GPU benchmarks are off the charts for anything ever conceived as an on-die CPU+GPU:

http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review

 

Not impressive considering how on-die GPUs have always been unusable.

 

Crysis Warhead, Metro 2033, Dirt 3 at playable frame rates by Intel Integrated Graphics is nothing short of unbelievable given how bad they used to be.

 

With a monstrous CPU dragging it, relatively low resolutions, and every single thing turned down to low.

I thought Mac users didn't settle for mediocrity. Guess I'm wrong.

 

I think at this stage only the high-end MBP 15" and high-end MBP 17" Ivy Bridge MBPs will have discrete GPUs.

A $1500 pro notebook. Without discrete graphics.

**** off.

 

I mean, Ivy Bridge appears to clearly outpace the ATI 5450, the staple low-but-not-horrible-end class of discrete GPU.

 

If Intel can ramp production suitably, Ivy Bridge is pretty much the nail in the coffin for discrete GPUs, and AMD will struggle in the mid to higher prices.

 

Also, while games are useful for benchmarking the integrated GPU, for most mainstream use it's more than enough, provided driver support is adequate.

 

We're talking 30%-40% improvements over Sandy Bridge. Imagine, playing a DX11 game on an Intel Integrated GPU... albeit at 720p... but still... this is big.

 

And if you look at general compute of the ~GPU alone~ for Ivy Bridge, pretty much game over for AMD and Nvidia in the mid- to high-priced mainstream-use laptops:

 

Not only are you comparing it with old hardware, you're assuming that Intel can keep improving their GPUs by 40% every single year until they become comparable to mid-end discrete GPUs, while Nvidia and AMD do nothing but stand still babbling like idiots. That's not going to happen.

Unless of course you think that everyone but people who can afford $2200 MacBook Pros can use low-end GPUs without a problem.

In which case, I recommend you visit the mental hospital.

 

[Chart: Ivy Bridge GPU compute benchmarks]

 

You buy one chip. You want a fast CPU? It's in that one chip. You want a reasonable GPU for graphics and compute? It's in the same one chip. You want optimised video encoding? Also in the same chip.

 

Game over. Look at the damned graphic. A low-to-mid-end GPU that's two generations older is beating the crap out of the HD4000. I'm not asking for Crysis Warhead on Ultra with 4x anti-aliasing. I'm asking for a GPU that can play modern games (which could have had much higher minimum requirements were the consoles not dragging the industry down and slowing its progress) at native resolutions and AT LEAST not on low settings.

 

The 640M does that, and it fits in an ultrabook with reasonable power consumption. Does it matter that it's not in the same chip?

iPhone 4S 64GB, Black, soon to be sold in favor of a Nokia Lumia 920
Early 2010 MacBook Pro 2.4GHz, soon to be replaced with a Retina MacBook Pro, or an Asus U500
post #40 of 64
Quote:
Originally Posted by Lukeskymac View Post

Not only are you comparing it with old hardware, you're assuming that Intel can keep improving their GPUs by 40% every single year until they become comparable to mid-end discrete GPUs, while Nvidia and AMD do nothing but stand still babbling like idiots. That's not going to happen.

 

 

Have you looked into Haswell next year? The move from Sandy Bridge to Ivy Bridge boosted the number of execution units for the GPU from 12 to 16. Haswell will be making the jump from 16 to 40. Even if we only got half that multiplier in performance increase, that would be a 125% improvement, which would make that 31fps you linked up nearly 70. Yeah, with the jump to Haswell, they will certainly be continuing the extreme increase in performance for GPUs tied to their chips.
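
Spelling out that arithmetic (and assuming, roughly, that frame rate scales linearly with GPU performance): a 125% improvement is a 2.25x multiplier, so 31 fps × (1 + 1.25) = 69.75 fps, i.e. nearly 70.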

 

Yes, I realize that Haswell is a year out and the best card in that linked graphic was a GT 440, but currently that card sells for 65-80 bucks. Imagine the entire sub-$100 video card market disappearing because AMD and Intel have better GPUs included in their CPUs. That is a lot of money to be lost by the GPU companies.
