Future Macs could adopt Intel's new, high-performance discrete graphics chips

Posted in General Discussion, edited June 2018
Intel has confirmed it intends to create its own discrete graphics processing chips by 2020, a move that opens up the possibility of Apple using discrete Intel GPUs across its laptop and desktop Mac lines.




First revealed by CEO Brian Krzanich during an analyst event in early June, the plan will see Intel offer its discrete GPU within a few years, reports MarketWatch. An official Intel Twitter account confirmed the news, first by noting the 2020 target date, then by retweeting the story.

Intel's intention is to provide the discrete GPUs to both enterprise and consumer markets. For enterprise, Intel wants its discrete GPUs used in data centers to power machine learning and AI, in much the same way existing GPU technology from AMD and Nvidia is used today.

In the consumer market, Intel will be competing directly against Nvidia and AMD, possibly by offering graphics cards for desktop computers or by supplying notebook producers with GPUs. Intel already provides integrated graphics as a feature of its processor lines, but a discrete card typically outperforms an integrated GPU, making Intel's current graphics offering appear less desirable.

Intel has seemingly already acknowledged the lower performance of its own integrated graphics: in January it revealed G-series processors that combine an Intel CPU with an onboard AMD GPU, providing the equivalent of discrete graphics performance on the same package as the processor.

An Intel G-Series processor, combining an Intel Core processor with an AMD Radeon RX Vega M GPU


Due to the significant effort required to create a new GPU architecture that can compete with AMD and Nvidia, Intel is likely to have worked on the project for some time already if it is to meet its 2020 target. In November 2017, Intel hired Raja Koduri, formerly the head of AMD's graphics arm and credited with improving the Radeon brand, to head up its graphics and compute projects.

While Nvidia and AMD's dominance of the GPU market over the last two decades may be hard for Intel to crack, the processor producer could be in a good position to create discrete GPUs that work better with its own CPUs than its rivals' parts do. This synergy could make an Intel CPU and GPU combination an attractive prospect for notebook producers, including Apple.

Current 15-inch MacBook Pro models include both Intel integrated graphics and AMD Radeon discrete graphics, switching between them depending on an application's performance needs. The introduction of a discrete Intel GPU could give Apple more options for what to include in a future MacBook Pro, especially if the Intel GPU and CPU combination offers some benefit beyond raw performance.
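That switching is visible to developers: on macOS, an app can enumerate every available GPU through Metal and pick between the integrated and discrete parts itself. A minimal Swift sketch (the preference for the discrete GPU here is illustrative, not a recommendation):

    import Metal

    // On a dual-GPU MacBook Pro, MTLCopyAllDevices() returns both the
    // integrated (low-power) Intel GPU and the discrete AMD part.
    let devices = MTLCopyAllDevices()
    for device in devices {
        print("\(device.name), low power: \(device.isLowPower)")
    }

    // Prefer the discrete GPU for heavy work; fall back to the system default.
    let gpu = devices.first { !$0.isLowPower } ?? MTLCreateSystemDefaultDevice()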

For other Mac products, Intel's integrated graphics are offered alongside AMD Radeon discrete graphics: the integrated Intel Iris Plus Graphics 640 in the entry-level iMac versus discrete AMD Radeon Pro GPUs in the higher models, while the iMac Pro offers AMD's Vega GPUs. If acceptable to Apple, there is the prospect of Intel supplying both the integrated and the discrete graphics across the board.

Intel's attempt to join the discrete GPU market could also apply pressure on AMD and Nvidia, especially considering Intel's size and experience. The sudden appearance of a viable competitor may force the incumbent industry leaders to make bigger moves forward in performance, if only to make it harder for the relative newcomer to find a consumer audience.

Comments

  • Reply 1 of 31
    onepotato
    I thought Apple was working on creating its own GPUs.
  • Reply 2 of 31
    lkrupp Posts: 10,557 member
    Got the lawn chair, got the popcorn, got the beer! Let the techie wannabe wars begin! Competition is good? Can Intel make a dent in the AMD/Nvidia duopoly? Is Apple doomed if they go with Intel? 
    edited June 2018
  • Reply 3 of 31
    rob53 Posts: 3,251 member
    This is like way too many vapor-products of the early 2000s, all speculation and few results. Intel can't even come out with a proper laptop CPU in a realistic timeframe (>32GB RAM support), so what makes anyone think they can come out with a discrete GPU that comes anywhere near the others, including anything Apple is doing? Not me.
  • Reply 4 of 31
    macxpress Posts: 5,808 member
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    They do in their A-Series chips and they work quite well, but I'm not sure how well it would work pushing the size/resolution of screens Apple uses for its Macs. Maybe they'd be good. Or, maybe they're also working on a desktop class GPU as well. 

    That being said, it would be interesting to see, if Apple releases a Mac mini and/or MacBook Air with its own CPU, whether it also has its own GPU inside as well.
    edited June 2018
  • Reply 5 of 31
    Soli Posts: 10,035 member
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    This would be competition for vendors that would also consider discrete GPUs from Nvidia and AMD, not for the GPUs Apple will be making as part of its ARM-based SoCs.


    macxpress said:
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    They do in their A-Series chips and they work quite well. 
    As far as I'm aware they've only used their internally designed GPU in one A-series chip, the A11 Bionic. I don't think it's in any older chips, and if it's in other chips, like the T-series for the Touch Bar display or the S-series for the Apple Watch display, then I'm not aware of it.
  • Reply 6 of 31
    jameskatt2 Posts: 720 member
    Obviously, Intel is eyeing the huge profits AMD and Nvidia are reaping from selling their high-end GPUs to Bitcoin and other cryptocurrency miners, who will pay higher prices than consumers. This is why Intel is getting into the discrete GPU market, not any other reason.
  • Reply 7 of 31
    macwhiz Posts: 18 member
    macxpress said:
    They do in their A-Series chips and they work quite well, but I'm not sure how well it would work pushing the size/resolution of screens Apple uses for its Macs. Maybe they'd be good. Or, maybe they're also working on a desktop class GPU as well.
    The iPad Pro has a 2732 x 2048 resolution; a 15-inch MacBook Pro is 2880 x 1800. Given that the iPhone X already drives 2436 x 1125 with an Apple-designed GPU, and the next iPad Pro may well have one too, using that GPU in a MacBook is hardly a stretch.

    I can't see Apple using an Intel discrete GPU, for many reasons:

    • Intel has a long history of designing seriously underperforming integrated GPUs;
    • Intel hasn't designed discrete GPUs for 15 years, and they weren't competitive back then;
    • Apple has supposedly been actively trying to reduce its reliance on Intel parts;
    • Apple likes to develop hardware that works hand-in-glove with its software. The Apple GPU in the A11 Bionic is purpose-built for Metal, Apple's graphics API. With Apple deprecating OpenGL in macOS Mojave in favor of Metal, it's more likely Apple would design its own built-for-Metal GPU than use Intel's built-for-DirectX GPU (see the Metal sketch after this list);
    • If Apple indeed moves to an ARM-based Mac using the A-series processor, of course it's going to use the Apple GPU that's now part of the A-series chip.
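    For readers who haven't touched it, this is roughly what "built for Metal" means in practice: a minimal Metal compute dispatch in Swift. The kernel, buffer size, and threadgroup shape here are illustrative placeholders, not anything from Apple's actual GPU work.

        import Metal

        // Grab a GPU and a command queue; on current Macs this may be an
        // Intel integrated part, an AMD discrete part, or (hypothetically)
        // Apple's own silicon.
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue() else {
            fatalError("No Metal-capable GPU found")
        }

        // A trivial kernel, compiled from source at runtime.
        let source = """
        kernel void fill(device float *out [[buffer(0)]],
                         uint id [[thread_position_in_grid]]) {
            out[id] = float(id);
        }
        """
        let library = try! device.makeLibrary(source: source, options: nil)
        let pipeline = try! device.makeComputePipelineState(
            function: library.makeFunction(name: "fill")!)

        let count = 1024
        let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                       options: .storageModeShared)!

        let commands = queue.makeCommandBuffer()!
        let encoder = commands.makeComputeCommandEncoder()!
        encoder.setComputePipelineState(pipeline)
        encoder.setBuffer(buffer, offset: 0, index: 0)
        // 16 threadgroups of 64 threads covers all 1024 elements.
        encoder.dispatchThreadgroups(
            MTLSize(width: 16, height: 1, depth: 1),
            threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
        encoder.endEncoding()
        commands.commit()
        commands.waitUntilCompleted()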
  • Reply 8 of 31
    hodar Posts: 357 member
    With Nvidia's Pascal family out, the competition is fierce. Historically speaking, every time a competitor comes out that closes the performance gap with Nvidia, all Nvidia does is update its published driver and activate another section of its chip, enabling hidden additional performance capabilities.
  • Reply 9 of 31
    It won't really matter, by that time Apple's lineup will be condensed down to the Watch as a core unit, connecting to scalable visual interfaces like Glasses and a myriad of external displays, plus the current-gen AirPods, all using in-house chips. Power-hungry graphics applications will use cloud services. I might have skipped a decade or so, but I have ADHD.
  • Reply 10 of 31
    MacPro Posts: 19,727 member
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    If by 2020 Apple is using its own CPUs, I don't see an alternative to its own GPUs (or perhaps some combination of CPU and GPU); that, or a range of Macs, some Intel, some not. Maybe not by 2020 (that's almost here!!!) but one day this has to be coming.
    edited June 2018
  • Reply 11 of 31
    jameskatt2 said:
    Obviously, Intel is eyeing the huge profits AMD and Nvidia are reaping from selling their high-end GPUs to Bitcoin and other cryptocurrency miners, who will pay higher prices than consumers. This is why Intel is getting into the discrete GPU market, not any other reason.
    Well, the days of GPUs for cryptocurrency mining are coming to an end. ASICs are going to rule the waves very soon. Already started, actually.
  • Reply 12 of 31
    rob53 Posts: 3,251 member
    Oak Ridge National Lab just announced their Summit supercomputer, rated around 200 petaflops. Here are partial specs from the top500 website:

    The IBM-built system is comprised of 4,608 nodes, each one housing two Power9 CPUs and six NVIDIA Tesla V100 GPUs. The nodes are hooked together with a Mellanox dual-rail EDR InfiniBand network, delivering 200 Gbps to each server.

    That means 4,608 x 6 = 27,648 Nvidia GPUs in one computer. This is small change compared to putting them in normal computers, but most supercomputers are getting their flops from GPUs rather than CPUs, so maybe Intel is pushing to play with the big boys. Don't see it happening with their current GPU design, however.
  • Reply 13 of 31
    williamlondon Posts: 1,324 member
    Grasping at straws to retain some semblance of relevance.
  • Reply 14 of 31
    bsimpsen Posts: 398 member
    macxpress said:
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    They do in their A-Series chips and they work quite well, but I'm not sure how well it would work pushing the size/resolution of screens Apple uses for its Macs. Maybe they'd be good. Or, maybe they're also working on a desktop class GPU as well. 
    The 12.9" iPad Pro refreshes 2732x2048 pixels at up to 120fps. That's more pixels than a 15" MacBook Pro at twice the frame rate. If we're to believe the rumors that Apple will switch Macs to A family processors in a couple years, I seen no reason to think Apple will use Intel's discrete GPUs, unless Apple is actually a co-designer of those discrete GPUs and will use Intel as the foundry for future A family processors.
    edited June 2018
  • Reply 15 of 31
    Soli Posts: 10,035 member
    bsimpsen said:
    macxpress said:
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    They do in their A-Series chips and they work quite well, but I'm not sure how well it would work pushing the size/resolution of screens Apple uses for its Macs. Maybe they'd be good. Or, maybe they're also working on a desktop class GPU as well. 
    The 12.9" iPad Pro refreshes 2732x2048 pixels at up to 120fps. That's more pixels than a 15" MacBook Pro at twice the frame rate. If we're to believe the rumors that Apple will switch Macs to A family processors in a couple years, I seen no reason to think Apple will use Intel's discrete GPUs, unless Apple is actually a co-designer of those discrete GPUs and will use Intel as the foundry for future A family processors.
    I see no reason to assume any discrete GPUs for any Apple silicon at this point. I also don't think we should assume they will be A-series SoCs, when they will surely need to be designed for a very different type of device, including, but not limited to, 16GiB+ RAM.
  • Reply 16 of 31
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    And you also thought that Apple holds all the patents on electronics. You might be wrong.
    edited June 2018
  • Reply 17 of 31
    While Apple's A-series CPUs are now running neck and neck with Intel's 4-core i7 CPUs, their mobile GPUs are still only 1/10th the speed of a current-generation Nvidia GPU. Apple is slowly catching up, but they have a long way to go. Not a complaint, as I am extremely impressed with Apple's mobile CPUs and GPUs, but Apple would need a discrete GPU before it can use its A-series processors in its laptop or desktop computers.
  • Reply 18 of 31
    mcdave Posts: 1,927 member
    onepotato said:
    I thought Apple was working on creating its own GPUs.
    This will make it harder, as they could have cashed/crashed in on the sleeping giants, but with Intel stoking them?…

    That said, I think Apple’s GPU/APU focus would be on mobile/portable Metal and image/video transcoding rather than compute numbers and gaming frame-rates.
  • Reply 19 of 31
    mcdave Posts: 1,927 member
    While Apple's A-series CPUs are now running neck and neck with Intel's 4-core i7 CPUs, their mobile GPUs are still only 1/10th the speed of a current-generation Nvidia GPU. Apple is slowly catching up, but they have a long way to go. Not a complaint, as I am extremely impressed with Apple's mobile CPUs and GPUs, but Apple would need a discrete GPU before it can use its A-series processors in its laptop or desktop computers.
    This is an unfair comparison. Apple's GPUs to date have been integrated and constrained to ~8W devices. They currently achieve higher performance per watt than either AMD or Nvidia. Most people caught the 1.3x improvement over the A10 GPU but missed the halving of core count and power consumption in doing so, i.e. a generational increase of 2.6x per watt.
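    Spelling out the arithmetic behind that last figure: 1.3x the performance at 0.5x the power is 1.3 / 0.5 = 2.6x the performance per watt.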

    Apple's family 4 GPUs are currently delivering around 110 Gflops and a Geekbench compute score of 5K per core. The PowerVR architecture they're based on is limited to 16 cores (memory bandwidth?), and GPU performance is highly scalable, so even a cautious upgrade could see current tech matching the RX Vega M. Upgrade the memory architecture (HBM2), increase the power budget and double/quadruple the core count, and you're looking at a GTX/Vega killer.

    The business case for Apple discrete graphics plus the T-series in a 15" MBP would be: no additional power or cost budget, release control, Metal optimisation, strengthened ARM cores for an iOS 'simulator' plus OS core/first-party app execution, and a precursor to the ARM transition.

    The question should be: why wouldn't they go this way?
  • Reply 20 of 31
    Honest question: Is there a technical reason why an integrated GPU can't be as good as any discrete GPU?