Intel combines Core with AMD Radeon on one package
http://techreport.com/news/32792/intel-brings-a-core-cpu-and-radeon-gpu-together-on-one-package
Huh. Spitballing: Apple basically single-handedly asked for Intel's eDRAM GPUs, starting with the first Iris Pro 5200, and they're still one of the Iris Plus line's biggest users. Since Nvidia has nearly doubled Intel's performance per watt on graphics (the MX150 beats the Iris 650 by roughly double at similar wattage), I think it's very possible Apple pushed for this solution.
Comments
According to Ars Technica, the Intel CPUs will retain their own integrated graphics as well -- Anandtech concurs.
I don't get why most of these tech sites say iMac for this. Am I missing something? An Apple "gaming laptop" seems more likely: MacBook X?
What's more, the now-official delay of high-volume Cannon Lake production to the second half of 2018, plus support for LPDDR4 (i.e., 32 GB RAM in a true MacBook Pro), might make a "MacBook X" with 32 GB RAM more appealing to Apple? Still an emoji-worthy long shot, but kind of intriguing.
According to Anandtech, Intel will re-enter the discrete GPU field "from top to bottom." The article argues that it basically means they are going after NVIDIA, starting with the high-end. Koduri's best experience is in integrated graphics -- that's what he did at Apple (think Retina).
If that is the case, Intel is basically saying: come to us, mix and match dies, and build your own SoC for embedded devices.
What is to stop Apple from using an A-series chip as the hub and primary graphics for the MacBook, then extending with Radeon for the Pros?
Intel is using the EMIB to connect the discrete Radeon GPU (and its high-bandwidth memory) to the CPU. It would be beyond surprising [!] if you could get the EMIB without including an Intel CPU, but Apple could definitely use EMIB to integrate an A-series or other processor into the mix...
I still think this is likely to be for a Mac mini...
It's worth noting that the HP Z2 Mini (competition for the Mac mini) used to offer the Xeon E3-1200 as one of its high-end options. Checking today, that is no longer the case -- the high end is currently Core i7-6700 or Core i7-7700. The only reason I'm aware of this is because of my sister's NYC architecture practice. There's a reason HP markets the Z2 Mini as "designed for CAD users." She uses Mac minis, but they are in need of an upgrade. Apple's semi-official assurances that the mini is not dead means she is waiting, but not so long ago she had me looking at the HP Z2 Mini as a possible option for her. It was either that or iMacs. Now we wait. Mac Mini Pro? Mac Pro mini?
To me, EMIB feels like pretty big news, especially when combined with the Radeon announcement and the advent of "Xeon E." Anandtech is among the most careful of the PC tech sites, and if they keep saying "Apple" when they talk about these developments, that carries weight with me.
Seems like maybe we have critical mass for an AI article? I just think it's pretty clear Apple has been waiting on some kind of technical development for the Mac mini -- EMIB fits the bill quite well.
Ugh. This is wrong -- you have to select the discrete NVIDIA GPU option before you can see the full array of CPU options, which include three Xeon E3-12xx...