nVidia are big in autonomous vehicle tech, particularly processors. I imagine Apple are interested in them for those products, not their desktop and laptop graphics.
I would really like to see more NVIDIA GPUs in Macs. Just my opinion, but I think NVIDIA graphics are a little better these days. They're certainly better from a power standpoint.
Tell that to all the MacBook Pro owners from a few years back! I was very lucky and Apple replaced my motherboard, but the vast majority of folks I knew ended up with large aluminum paperweights. I'd like to see a genuine, 100% Apple-designed GPU; I'd trust Apple more than either AMD or Nvidia.
As an owner of a 2011 MBP, I can say that with my third failing AMD Radeon HD 6750M (second this year) and the existing replacement program through the end of 2016, I'm not sure why you'd single out Nvidia as being problematic.
OK, they are both problematic then. The generation before yours was Nvidia. My point was that GPUs from third parties have been the #1 issue with otherwise amazingly well-made and reliable laptops from Apple, hence my hope that Apple gets into the GPU game themselves.
The MacBook Pro used to be cutting-edge technology, but Cook has neutered it in favor of toys for college kids. It's 4 years behind the hardware curve, which is embarrassing. How about a 17-inch model with 64GB RAM, a top-class GPU, 5TB SSDs, and an 8- or 10-core Intel CPU for audiovisual pros on the go? Apple could sell these for 4 grand and up like hotcakes.
If there will be a discrete GPU option for future 15" models of the MacBook Pro, it will almost certainly be a BTO option. I think the Mac Pro and the iMac are more likely candidates for any Apple-specific work that Nvidia might be doing.
this is, by far, the most interesting rumor I've heard in what feels like years!!
How?
First of all, it is Mac related. Not a lot of rumors are these days. And second... well... I'd almost given up hope for a decent Mac Pro with NVIDIA GPUs. Whatever this rumor means, my hopes are up at least 1%.
I'd love to see a return of the hardware to the base of the iMac (which could be as thin as a MacBook Pro), with a razor-thin display taking the place of the current design. Moving the hardware back to the base would also give it more stability. Wireless charging is a possibility too, but I'd not bet on that part.
Congratulations. You've completely removed any way to VESA mount that iMac, which you've been able to do ever since moving from the lamp shade iMac G4.
I wouldn't put it past nVidia to post something like this just to bump their stock price.
Possible. GPU sales in general for AMD and Nvidia are pretty sluggish, and Nvidia's initiatives to break out have not been successful. In the end, though, I suspect that the only place Nvidia will be going is inside a Mac Pro upgrade, and only as an option. The fact is AMD's solutions are a better fit for how Apple uses GPUs.
I'd really hate to see Apple return to flip-flopping GPU suppliers every year like they did in the past. That really doesn't serve anybody well.
On the other hand, AMD could be as big a factor in the holdup of new MacBook Pros as Intel is. For any new design, Apple really needs a 14nm-class GPU in the MBP, and the mobile variants are slow in coming.
The problem with GPUs in general is that they are bleeding-edge performance devices that operate right at the edge of what a process can support. Running a chip hotter than hell doesn't lead to long life. GPU manufacturers can get away with this due to users falling all over themselves to upgrade to the latest chip release. The problem then is how you design long-lifespan systems with chips that won't run that long reliably. In the Mac Pro they attacked the problem by lowering clock rates a bit (it doesn't take much).
Frankly, I don't see this as a big deal. AMD or Nvidia, it doesn't matter; I don't expect to see a lot of discrete chips going into Macs in the future. The only thing that might change this is a big move by Apple into VR support. Virtual reality, though, is already losing steam from what I can see, at least in the visor/goggles approach to VR. Given that, though, a Mac Mini replacement designed to support a high-performance GPU to play in the VR world would be interesting. Of course, that would require a Mini in a bigger chassis.
For Apple to truly get what they want out of their devices, they could partner with Nvidia similarly to how they partnered with PowerVR for mobile. Apple seems to do its best work when it focuses on its strengths complemented with other companies' strengths.
I don't care who makes the GPU (assuming Apple quality); I just want to see Apple design and support them. GPUs have long been the Achilles' heel of Macs.
Yep for as long as I can remember.
Every time Apple has chased a performance GPU, it has bitten them in the ass. I suspect they go midstream just to have reliable machines. In any event, even with Apple initiatives like Metal and OpenCL, I really don't see them going after performance. Let's face it: Apple's drivers are often the worst on any platform; some of the Linux open-source drivers are known to deliver higher performance. Apple has never put a priority on performance.
Of course, there is a flip side here: Apple's drivers are pretty reliable, and so is the hardware in general.
They are both problematic; that is the problem with trying to wring max performance out of the latest chip technology. From the standpoint of reliability, neither AMD nor Nvidia are standouts. Intel's GPUs may be slow, but I've yet to hear of a reliability problem with the hardware. Discrete GPUs, on the other hand, seem to be designed for a very short lifespan.
I suspect that user expectations, though, are starting to have an impact on GPU manufacturers. Most users no longer need nor expect to upgrade a GPU card every year. For one, the incremental increase in performance isn't there. We will see a big boost with the 14nm chips, but another process shrink could be a ways off into the future. We could be seeing 3-4 years between GPU offerings that really justify the upgrade costs. The other issue is that for many, GPU performance is good enough. So without the rapid upgrade cycles, having GPU cards fail is no longer an option, as people expect much longer run times out of their hardware.
Whatever it is Apple and nVidia are up to in the GPU department you can bet your bottom dollar a $500 Windows PC will still utterly cream any Mac in a pixel pushing parade.
Many of us would still like to see a return to a configurable Mac Pro, but one which is priced below what many see as fraudulent. OS X is only as good as the weakest link in the hardware chain. That, unfailingly, is GPU performance.
The Mac Pro's high price may seem excessive, but Intel is partly to blame there. Frankly, Apple is missing a massive market that could be filled by a computer built like the Mac Pro with a single desktop chip and a single GPU card. Something that could sell rationally in the $1200 to $1500 range for the base machine. Still expensive, but there are enough pluses to justify the cost.
Crazy thought: why not use the pixel-pushing power of the A10 or A9x used in the iPad Pro? Small. Efficient. Powerful. Cheap.
Powerful... but not powerful enough for use in a Mac. I do think this will eventually happen, though. I'm sure Apple is working on its own Mac processors.
Right now, I'd be very sad indeed if Apple did that.
Ultra-mobile processing simply isn't going to cut it in notebooks and desktops.
However... Imagine Apple taking PA Semi's design prowess and applying that toward desktop class chips (not iPhone version of "desktop class" - real desktop class) that have the design efficiency that the A Series does. It would be a revolution. Apple would not only beat Intel, etc., but they would have the most powerful, most efficient, custom chips around, able to do whatever they want, whenever they want. And if they did this to GPUs also - let's say with a reference from Nvidia or whomever...
You'd have the most compact, powerful, efficient designs ever. And it would be exclusive to the Mac platform.
One problem though - most big app developers would be reluctant to code for it as the port work would be monumental. And no, we don't want the Mac to be dominated by limited iOS style apps. Sure, it would be great to see some of the UI efficiency and intuitiveness come to Mac, but not the dumbed down limitations that so many iOS apps carry in comparison.
It wouldn't be sad at all!
You make a bunch of assumptions here that have little basis in fact, or you strangely contradict yourself. Building a desktop or laptop based on the A-series architecture is what people talk about and want here. That doesn't mean iPhone-class performance. What it means is 15-watt, 28-watt, and higher-class processors that deliver considerably more performance due to the ability to run the chips faster. Combined with other minor enhancements like bigger caches, performance would be very good even with a minor engineering effort. Note that this is for higher-end machines; the A series would already be a better choice in the MacBook, no upgrades needed.
This nonsense about "big app developers" is getting a bit old. An overwhelming number of apps for the Mac would simply need a cross compile (done automatically by Xcode). Very few apps these days are written in such a way that they are hardware dependent. Also, what is this nonsense about limited iOS apps on a Mac? The expectation is that an ARM-based Mac would run Mac OS and thus have identical capabilities. As long as the machines come with 8-16 GB of RAM, we would not have any limitations.
The port wouldn't be monumental except for virtualization software such as Parallels, VMWare Fusion, or VirtualBox: just about nobody is programming in machine language anymore; it would be a simple recompile.
What's the news here? Apple has routinely alternated between ATI/AMD and nVidia; the rest is a hyped up job ad, designed to elicit enthusiasm from applicants who couldn't land a job at Apple...
I don't get where this idea comes from that getting Mac OS working on an ARM machine would be difficult. Same for apps. Yes, a few might have issues, but the overwhelming majority of them would be recompiles. I really don't think people understand how well LLVM, Clang, and now Swift are optimized for running on different platforms. I have to wonder if people even realize that a Linux port of Swift exists. Combine that with optimizations in Xcode and the app stores, and Apple could move to an alternative CPU architecture overnight.
As for virtualization, I do believe ARM supports that in hardware; the only difference is that virtualization would support alternative operating systems running on ARM.
As for machine language or assembly, you are right that it is hardly ever used these days. More importantly, Apple initiatives like OpenCL highlight just how easy it is to get good-enough support for alternative architectures into an app. One has basically four GPU manufacturers (three for just the Macs) to worry about in Apple land, with those manufacturers having dramatically different generations of hardware, yet a developer can easily leverage all of them.
In general, a developer COULD build x86-specific software for Apple's hardware, but there are huge incentives not to do so. Any developer with any sense would want his libraries to be transferable/buildable for iOS.