After eating AMD & Nvidia's mobile lunch, Apple Inc could next devour their desktop GPU business

Comments

  • Reply 41 of 52
Marvin Posts: 15,443 moderator
    oxonrich wrote: »
Ha - 8 hours on 7 machines for that image. So, that's why no one uses Cinema 4D for real animation work.

    All CPU render times are crazy when you compare them to real-time engines:


[Three embedded videos demonstrating real-time engine output]


    While they can't do the same effects yet, it's the difference between 0.03 seconds for a frame vs minutes/hours for a frame, 4-5 orders of magnitude faster. Memory is a limit just now but when it's not, I could see people ditching CPU-based engines (people who don't have huge renderfarms anyway). Artistic ability combined with a high quality engine with fast turnaround will give better results than endlessly tweaking more physically accurate engines. When it comes to UHD content, the traditional workflows just won't be feasible.
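For a sense of that scale, here is the arithmetic behind the orders-of-magnitude claim. The 0.03s real-time figure is from the post above; the CPU frame times are assumed examples, not measurements from the videos:

```python
import math

# Illustrative frame times only -- the CPU figures are assumptions.
realtime = 0.03          # seconds per frame for a real-time engine (~33 fps)
cpu_fast = 5 * 60        # a CPU renderer at 5 minutes per frame
cpu_slow = 8 * 60 * 60   # a heavy frame at 8 hours

for label, t in (("5 min/frame", cpu_fast), ("8 h/frame", cpu_slow)):
    ratio = t / realtime
    print(f"{label}: {ratio:,.0f}x slower (~10^{math.log10(ratio):.0f})")
# 5 min/frame: 10,000x slower (~10^4)
# 8 h/frame: 960,000x slower (~10^6)
```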

    There really ought to be a middle ground between the two by now. Real-time engines where you can enable some CPU processing for advanced effects and have them work together and not require every frame to come out in real-time but also not hours.
    oxonrich wrote: »
I really hope the memory can get that high on the next generation of rMBP and nMP. But then the old MP had 8 DIMMs and they've scrapped 4 to make it smaller. There is nothing to stop them doing that again and you never know with Apple until it actually happens.

    They used 4 memory slots per CPU. With Intel increasing the core count per CPU, it made sense to drop to one CPU because the price for two was way beyond what Apple had ever charged for a Mac Pro. They'd need two motherboards and would hardly ever use the dual socket one. Haswell EP already goes up to 18-core/36-thread in a single chip. Broadwell and Skylake can go even further and DDR4 offering up to 512GB should be enough for anything.

If Broadwell EP is still coming then the next Mac Pro likely won't support 5K displays, but if Intel drops Broadwell in Q1 2016 and goes with Skylake, there could be an option for 24-core/48-thread on one CPU.
  • Reply 42 of 52
Quote:
All CPU render times are crazy when you compare them to real-time engines:

Maybe, but that's a ridiculously long time for a very simple image. I could do that on a CPU in a fraction of the time in at least three different render engines.

     

Quote:
Haswell EP already goes up to 18-core/36-thread in a single chip.

Not in any Mac Pro I can order.

     

Quote:
If Broadwell EP is still coming then the next Mac Pro likely won't support 5K displays

I don't know why it wouldn't. The current Mac Pro supports three 4K monitors at the same time, so a 5K monitor should be relatively easy for it.

  • Reply 43 of 52
Marvin Posts: 15,443 moderator
    oxonrich wrote: »
Maybe, but that's a ridiculously long time for a very simple image. I could do that on a CPU in a fraction of the time in at least three different render engines.

It might not have been done as a production example; it was mainly to show off splitting one frame up. My main point was that you can do that to alleviate memory constraints.
    oxonrich wrote: »
    Haswell EP already goes up to 18-core/36-thread in a single chip. 

    Not in any Mac Pro I can order.

That one probably won't ever be an option as the chip is over $6k, but that's the core count Intel's at just now so I expect a future revision to be close to that. They'll never offer the highest-end that anyone can buy; they cater to groups that are large enough to be worth catering to. The current setup means that if 99.5% buy an 8-core or less, accommodating the remainder doesn't mean having a different motherboard.
    oxonrich wrote: »
I don't know why it wouldn't. The current Mac Pro supports three 4K monitors at the same time, so a 5K monitor should be relatively easy for it.

The bandwidth over a single connection is just a bit high. If they employed some sort of lossless compression they could manage it OK, but that introduces latency. They could run it at 50Hz rather than 60Hz. It also depends on the GPU output. Apparently the 5K iMac does it in a completely custom way - custom chips, likely doubling up two DisplayPort 1.2 outputs, which you can't do easily externally. If it was possible on Broadwell, it would be possible on the current model, but they haven't done it and you can't run the current iMac as a 5K display the way you could the old 1440p models. Dell's 5K display needs two inputs.
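A quick payload-only check against DisplayPort 1.2's usable data rate shows how tight it is; note that even 50Hz at full 24 bits per pixel doesn't quite fit, which is why compression or a custom link comes up. Real links also need blanking intervals on top of these figures:

```python
# Raw pixel payload only; blanking intervals push the real requirement higher.
def gbits_per_sec(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

DP12_USABLE = 17.28  # Gbit/s data rate of DisplayPort 1.2 (4 lanes of HBR2)

for hz in (60, 50):
    need = gbits_per_sec(5120, 2880, hz)
    verdict = "over" if need > DP12_USABLE else "under"
    print(f"5120x2880 @ {hz} Hz: {need:.2f} Gbit/s ({verdict} the DP 1.2 limit)")
# 5120x2880 @ 60 Hz: 21.23 Gbit/s (over the DP 1.2 limit)
# 5120x2880 @ 50 Hz: 17.69 Gbit/s (over the DP 1.2 limit)
```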
  • Reply 44 of 52

I've got to admit, I don't really get your point. You seem to be showing me what can be done in Cinema 4D and Unreal Engine, neither of which I use, to try and justify the hardware choices Apple have made, but both are irrelevant to my current usage.

     

You would never split a frame up when rendering a large animation because of the overhead involved. Split frame rendering, or the equivalent in other render engines, is used for large resolution single frame renders. When you're rendering thousands of frames, you're much better off rendering separate frames on separate machines rather than using multiple machines to render each frame. In your Cinema 4D example, this would save you 11 hours even if your full animation was only 7 frames long. If your animation was 700 frames long, you've just saved more than 4 weeks of rendering time by not using split frame rendering.
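A toy scheduling model makes the trade-off concrete. It takes the 8-hours-on-7-machines Cinema 4D figure as roughly 56 machine-hours per frame; the 15% split-frame overhead (tile merging, duplicated scene setup per machine) is an assumed number picked purely for illustration:

```python
import math

# Toy model: split-frame rendering pays a parallel overhead on every frame;
# whole-frame distribution does not. The 15% overhead is an assumption.
def wall_clock_hours(frames, machines, hours_per_frame, split_frame=False,
                     overhead=0.15):
    if split_frame:
        # Every machine gangs up on each frame in turn.
        return frames * hours_per_frame / machines * (1 + overhead)
    # Each machine renders whole frames on its own.
    return math.ceil(frames / machines) * hours_per_frame

F, N, T = 700, 7, 56   # frames, machines, hours per frame on one machine
split = wall_clock_hours(F, N, T, split_frame=True)   # 6440 h
whole = wall_clock_hours(F, N, T)                     # 5600 h
print(f"split-frame: {split:.0f} h, whole-frame: {whole:.0f} h, "
      f"saved: {(split - whole) / 24 / 7:.1f} weeks")  # saved: 5.0 weeks
```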

     

I've been paying plenty of attention to Unreal Engine and other equivalents. They may have a role in my future work but they're a long way from being ready to do it, on both the hardware and software fronts.

     

Quote:
They used 4 memory slots per CPU. With Intel increasing the core count per CPU, it made sense to drop to one CPU because the price for two was way beyond what Apple had ever charged for a Mac Pro. They'd need two motherboards and would hardly ever use the dual socket one. Haswell EP already goes up to 18-core/36-thread in a single chip.

You're trying to justify choices they made 18 months ago with products they don't offer even today. The software I use benefits from multiple cores because it's been written for Pro usage and to take advantage of the best hardware available. The highest spec machine they offer is now limited to 12 cores when the competition are offering 36-core machines. To say this doesn't make sense because of price is ridiculous. Even for those people who can only justify an 8-core processor, it would currently be cheaper to buy dual E5-1650 v3 3.5GHz processors and put them on a dual socket motherboard than it would be to buy a single E5-1680 v3 3.2GHz.

     

If they really wanted to make an actual Pro machine, maybe that's what they should be doing: making the base model dual socket. The motherboard could be the same for everyone, scaling up from 8 cores to 36. The current 4 and 6 core models don't offer large performance differences over a Retina iMac anyway; when they do, it's generally down to the extra RAM or graphics cards rather than the processor. They would get the cost benefits of a single motherboard design, and would not be limiting the performance to those who can actually use it. Form factor taking precedence has resulted in the trashcan. It looks cool and it doesn't take up much space, but it doesn't offer the best performance out there.

     

Maybe they need to reintroduce a rack mounted machine for people who want to stick with OSX but can benefit from the extra performance available. The more I think about it, the less the nMP makes sense. The performance increase over the Retina iMac isn't big enough for high end consumers (or Pros with lower requirements) to make the leap, and the max spec performance is poor compared to the competition. You can say it will get better with Broadwell and Skylake all you want, but so will the competition. The form factor has limited them and we will probably be stuck with it for 5-10 years.

     

But we digress from graphics cards and the original article. If Apple make their own, they need to make them for Pros, not just consumers. Or just accept that the Mac Pro is now the Mac Consumer Plus.

  • Reply 45 of 52
Marvin Posts: 15,443 moderator
    oxonrich wrote: »
The highest spec machine they offer is now limited to 12 cores when the competition are offering 36-core machines. To say this doesn't make sense because of price is ridiculous. Even for those people who can only justify an 8-core processor, it would currently be cheaper to buy dual E5-1650 v3 3.5GHz processors and put them on a dual socket motherboard than it would be to buy a single E5-1680 v3 3.2GHz.

    A $6400 processor isn't Apple's competition:

    http://h30094.www3.hp.com/product/sku/11112625

The CPUs alone would be $12800, plus the power draw would be nearly 300W. Some dual options would be cheaper than what Apple offers, which is Intel's fault - they're the ones pricing both. We don't know what prices Apple are getting from Intel either.
    oxonrich wrote: »
If they really wanted to make an actual Pro machine, maybe that's what they should be doing: making the base model dual socket.

    They could do that with the current design but it's not really necessary, especially the more the core counts increase.
    oxonrich wrote: »
Form factor taking precedence has resulted in the trashcan. It looks cool and it doesn't take up much space, but it doesn't offer the best performance out there.

They never offered the best performance out there. It was a case of looking at what they offered and realising the form factor didn't fit with the parts they used. There's not enough of a market for them to bother with crazy spec machines because so few people buy them anyway (and not frequently either), and those people can buy two or more machines. If you want 36 cores then you don't have to get a $15k HP; you can get 3x 12-core MPs for $19.5k, and that's without an update for 13 months. Or mix and match: have a cheaper HP for grunt work and use a Mac front-end. Macs aren't render nodes, they're workstations; the only way they'll be both is with GPU processing, which the nMP is focusing on.
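Per-core, the figures quoted in this thread (taken as stated, not independently checked) work out like this:

```python
# Per-core arithmetic on the prices quoted above.
hp_price, hp_cores = 15_000, 36      # the 36-core HP mentioned above
mp_price, mp_cores = 6_500, 12       # one 12-core Mac Pro
n_macs = 3                           # 3 x 12 = 36 cores

print(f"HP:   ${hp_price / hp_cores:.0f} per core")                       # $417
print(f"Macs: ${n_macs * mp_price / (n_macs * mp_cores):.0f} per core")   # $542
# About a 30% per-core premium for the Macs -- though the three machines
# also come with six GPUs, three SSDs and three chassis.
```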

    If you want to be able to get fast output then it'll happen on GPUs, not CPUs. I know it's not now but it'll happen.

    http://blog.mentalray.com/2014/07/07/mental-rays-gpu-accelerated-gi-engine-prototype/

One of the comparisons there was a 6-core CPU taking 13 hours vs the Quadro GPU taking 37 minutes - roughly a 21x speedup. The memory isn't an issue as they aren't doing the texture sampling on the GPU, the same way they did on Avatar and, I think, Tintin: they used PantaRay on CUDA for lighting and RenderMan on Xeon for the final output.
    oxonrich wrote: »
The performance increase over the Retina iMac isn't big enough for high end consumers (or Pros with lower requirements) to make the leap, and the max spec performance is poor compared to the competition. You can say it will get better with Broadwell and Skylake all you want, but so will the competition. The form factor has limited them and we will probably be stuck with it for 5-10 years.

This has been said for years - they always priced quads above the iMacs; the form factor isn't limiting them. Price-performance ratios are competitive. It's not as if these things are impulse buys. If you need a 12-core Mac then they make one. If you want a 36-core Mac, buy 3 just now, or wait until the single-CPU core count increases and it'll be cheaper. A 12-core MP today is $6500, but if they jump to 18 or 24 cores in a year then what was the benefit in investing in a $15k 36-core HP today? If the work pays enough for the hardware then invest in the right hardware, but Apple caters their lineup to the largest markets. It's good that they even offered an MP at all because financially, it's irrelevant to them.
    oxonrich wrote: »
If Apple make their own, they need to make them for Pros, not just consumers. Or just accept that the Mac Pro is now the Mac Consumer Plus.

    They would make their own GPUs for OpenCL computation rather than gaming. The current design with PCIe SSDs, 6x 4K capable TB output is clearly not for consumers.
  • Reply 46 of 52
Quote:
A $6400 processor isn't Apple's competition:

Apple offer powerful computers to creative professionals. HP/Dell/Boxx offer powerful computers to creative professionals. How exactly is this not the competition?

     

Quote:
The CPUs alone would be $12800, plus the power draw would be nearly 300W

And it would not be the most expensive Mac my company has ever bought. And I doubt it would be the most power hungry.

     

Quote:
which is Intel's fault

What??! Economic theory: they sell more quad cores, which, amongst other things, makes them cheaper to produce. Apple could take advantage of that and offer a dual socket system that uses them. Or just offer the expensive 8 core. Intel are not at "fault" for that. But heaven forbid that someone blame Apple for their decisions. You sound like a fanboy blindly defending their every choice.

     

Quote:
They could do that with the current design but it's not really necessary, especially the more the core counts increase.

It's not really necessary to you. Which part of this do you not get? I am not looking for a certain core count. Intel (or AMD, or even Apple themselves) are not going to come out with a processor that hits the magical number and is all of a sudden fast enough. I want as much processing power as you can fit in a single machine, accessible to the operating system at once, at a cost that is economically viable to me. A single processor does not do that.

     

Quote:
They never offered the best performance out there

I'm not sure that's exactly true, but even if it is, why not start?! You only have to look at their iOS products to see that dominating the high end is what Apple aim to do. They've had plenty of opportunity to release a cheap iPhone. The closest they got, the iPhone 5c, lasted about a year and still outperformed a lot of phones on the market at the same time. And, for those people who wanted and could afford more, they offered the market leading high end model.

     

Quote:
There's not enough of a market for them to bother with crazy spec machines

Apple don't sit there and say there isn't a market. They create a best in class product and make a new, better or larger market than existed before. Corporate IT departments are making more effort than ever to support Apple's existing software. They are selling more Macs than ever. They are decimating the low end computer market by offering mobile devices that are powerful enough for the majority of people.

Are you really telling me that if they made a genuinely best in class server/workstation it wouldn't sell in significant numbers? People have built supercomputers from Xserves in the past. Why not again? They're already writing apps for IBM. Write enterprise software and then offer the best hardware out there to run it on. Show enterprise that they can have a top to bottom iOS/OSX environment. Conveniently for me, high end servers make kick ass creative machines. Fast I/O. Massive CPU performance. Add great OpenCL performance and we're really talking.

     

Quote:
those people can buy two or more machines

It's clear that you have only read about producing 3D content and have never actually done it for a living. There are some things that simply cannot be spread across multiple computers, or suffer from too much overhead to do so efficiently. People who do this always want more power in the computer at their desk.

     

Quote:
Macs aren't render nodes, they're workstations

Why can't they be? If they can offer the best performance per dollar, they easily could be. We use a Windows based render farm: 450 high spec render nodes, with 150 machines being rotated out each year. We would buy an OSX based system in a heartbeat if it could compete on price and performance density. They don't come close. But there is no reason that they couldn't if they wanted to. Apple have the purchasing power to compete on price with anyone and the designers to work out the density. Our smaller, mobile render farm which goes on location actually does run on Mac Minis. We have 24 of them running in a mobile system rack. Unfortunately, the new Mac Mini has probably killed that as a viable option for us with its dual core processors and soldered memory, for no space gain!

     

Quote:
One of the comparisons there was a 6-core CPU taking 13 hours vs the Quadro GPU taking 37 minutes. The memory isn't an issue as they aren't doing the texture sampling on the GPU

Fantastic. Video memory isn't an issue anymore. As long as you are happy to accept that you can only add 25 million tris. I'll just turn off the other 99.5% of my scene. Or we could just break up the render into 1000s of render passes and re-comp it all back together in post. What read/write speed do you think we'd need, and how much memory, for 1000+ layers of 4K 32 bit footage to play back in real time?
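Worked through, the rhetorical question lands where intended. This assumes UHD frames and "32 bit" meaning 32 bits per channel (float); if it meant 32 bits per pixel total, divide everything by four and the numbers are still absurd:

```python
# Assumptions: UHD 4K frames (3840x2160), RGBA at 32 bits per channel,
# uncompressed playback of every layer simultaneously.
width, height = 3840, 2160
channels, bytes_per_ch = 4, 4
layers, fps = 1000, 24

frame_bytes = width * height * channels * bytes_per_ch
print(f"{frame_bytes / 2**20:.0f} MB per layer per frame")                    # 127 MB
print(f"{frame_bytes * layers / 2**30:.0f} GB to hold one composite frame")  # 124 GB
print(f"{frame_bytes * layers * fps / 2**30:,.0f} GB/s sustained read")      # 2,966 GB/s
# Roughly 3 TB/s of sustained throughput -- orders of magnitude beyond any
# workstation's storage or memory bus, which is the point being made.
```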

     

Quote:
they used PantaRay on CUDA for lighting

CUDA. That's awesome for anyone who wants to work on a Mac Pro. These software packages aren't all interchangeable. If we switched to a new render engine every time there was a new must have feature, we'd spend months each year re-training people. We have experts on the software we do use and we slowly evolve what that software is when it makes sense to our business.

     

Quote:
the form factor isn't limiting them.

The form factor is absolutely limiting them. The performance gap between the Mac Pro and top end iMac has been narrowing for years. But the old Mac Pro still made sense for thousands of people because they either wanted dual processor extreme performance, or you could do things with it that you simply couldn't do with the iMac: fill it with hard drives in RAID configurations, add external video processing or sound cards. These things are now all taken care of externally over Thunderbolt, which means you can do it on the iMac just as easily. All apart from that dual processor for extreme performance, which they've handily removed from the Mac Pro as well. Great. Through their own revolutionary design, they have made the market for their own product smaller than it was before. The only real point to it is the graphics cards, and the software to make the most of them doesn't really exist yet, apart from their own FCPX, which lost a large amount of its pro user base because they released a consumer focussed version of it in 2011. Seriously! You can keep pointing out more render engines if you like, but I'd be quite surprised if they weren't using CUDA based technology.

     


Quote:
then what was the benefit in investing in a $15k 36-core HP today?

Because it would be the fastest machine available at the time and would enable me to meet the needs of my clients. If we buy the mid-range machine, by the end of a 3 year cycle it's struggling to keep up. You keep talking as if the content we produce has a fixed performance requirement and a certain core count will fix it. We've already discussed the huge jump in processing needs to go from 1080p to 4K. Consumer cameras are filming in 4K. Pro cameras are shooting in 6K and up. Bit depths are increasing. Poly counts are up. Clients' expectations are up. Hardware is struggling to keep up, let alone get ahead of the curve.

     

Quote:
Apple caters their lineup to the largest markets

What absolute rubbish. The largest phone market is the high-end, is it? Android dominates the largest market. Apple dominate the high end/high margin.

     

Quote:
It's good that they even offered an MP at all because financially, it's irrelevant to them.

Perfect - the creative industry that kept Apple alive pre-iOS is now apparently irrelevant to them. I don't think Tim Cook would agree with you though.

     

Quote:
They would make their own GPUs for OpenCL computation rather than gaming.

Who on earth mentioned gaming?! That well known staple of OSX use.

     

Quote:
The current design with PCIe SSDs, 6x 4K capable TB output is clearly not for consumers.

It supports 3x 4K screens. One per Thunderbolt bus. Keep up.

  • Reply 47 of 52
Marvin Posts: 15,443 moderator
    oxonrich wrote: »
Apple offer powerful computers to creative professionals. HP/Dell/Boxx offer powerful computers to creative professionals. How exactly is this not the competition?

The CPUs you were talking about are in a different price bracket. They never sold a machine with $12800 worth of processors and never would have, because they don't have enough customers requesting it for it to be worth doing. If you put in a request for 10,000 units with a $150m upfront order renewed yearly, they might consider it. They obviously never had such a request; they're not going to turn down great business opportunities if they're available.
    oxonrich wrote: »
I want as much processing power as you can fit in a single machine, accessible to the operating system at once, at a cost that is economically viable to me. A single processor does not do that.

Neither do dual processors - you'd be better with quad processors. You just decided that dual processors are a good compromise. Apple decided that since they only ever offered 12 cores, a single processor was an acceptable compromise. They can see Intel's roadmap, and the core count on a single CPU is good enough for a workstation in that price bracket.

At that price bracket, they are competitive. Apple's base 12-core costs $6499; Dell's closest match for price is a 14-core for about $6049, which is down to them using the latest CPUs and just one GPU. That's not enough to persuade someone at that price bracket away from the Mac platform.

    The Mac Pro isn't just catering to raw CPU users, it's also for other high-end productive work like editing, compositing, audio work.
    oxonrich wrote: »
    I'm not sure that's exactly true, but even if it is, why not start?!

Are you really telling me that if they made a genuinely best in class server/workstation it wouldn't sell in significant numbers? People have built supercomputers from Xserves in the past. Why not again?

They gave the reason for dropping the XServe, and it was because people weren't buying them. You just have to look at the average selling prices of server hardware: HP's average is around $1600, which is not best-in-class by a long way, and yet they sell the most units.

The total units for servers/workstations is around 1-3 million per quarter across all vendors, as opposed to 80 million consumer units. The volumes negate the advantage of the margins.
    oxonrich wrote: »
There are some things that simply cannot be spread across multiple computers, or suffer from too much overhead to do so efficiently. People who do this always want more power in the computer at their desk.

We use a Windows based render farm: 450 high spec render nodes, with 150 machines being rotated out each year.

I'm not following you here: I said to buy multiple machines, you said that's a bad idea, and then you say you have 450 nodes. If you have a renderfarm, why are you trying to render 4K animations on a single workstation? Did you mean you wanted Apple to make a server-like machine that could replace some of your nodes and that you'd also use as a workstation? The servers would surely be best as a different design, like a blade, but you wouldn't use that design for the workstation.
    oxonrich wrote: »
We would buy an OSX based system in a heartbeat if it could compete on price and performance density. They don't come close.

Why don't they come close? I just priced a $15k 36-core HP vs 3x 13-month-old Mac Pros with 36 cores totalling $19.5k. That's close. They're not going to sell them with no profit margin.
    oxonrich wrote: »
Our smaller, mobile render farm which goes on location actually does run on Mac Minis. We have 24 of them running in a mobile system rack. Unfortunately, the new Mac Mini has probably killed that as a viable option for us with its dual core processors and soldered memory, for no space gain!

I didn't like seeing the Mini lose the quad option, but people clearly weren't buying it.
    oxonrich wrote: »
Video memory isn't an issue anymore. As long as you are happy to accept that you can only add 25 million tris. I'll just turn off the other 99.5% of my scene. Or we could just break up the render into 1000s of render passes and re-comp it all back together in post. What read/write speed do you think we'd need, and how much memory, for 1000+ layers of 4K 32 bit footage to play back in real time?

    Memory still needs to move to a shared model and it's being worked on:

    http://developer.amd.com/community/blog/2015/01/15/opencl-2-0-fine-grain-shared-virtual-memory/

Apple's setup with the Mac Pro could perhaps move to a custom shared memory model where the GPUs share RAM with the CPU over a very high bandwidth link. The CPU probably doesn't need as much bandwidth as the GPU, so they could put, say, 32GB on each GPU card. The CPU would use both to get 64GB; a portion would be mirrored for real-time uses, and compute could treat them separately. There's latency to consider too, though.
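For a flavour of what the linked OpenCL 2.0 work enables, here is a minimal sketch of fine-grain shared virtual memory via pyopencl. It assumes an OpenCL 2.0 capable device and driver (most hardware of this era doesn't qualify), and the kernel and array are illustrative only:

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# One allocation, directly visible to both CPU and GPU -- no explicit copies.
flags = cl.svm_mem_flags.READ_WRITE | cl.svm_mem_flags.SVM_FINE_GRAIN_BUFFER
ary = cl.svm_empty(ctx, flags, 16, np.float32)
ary[:] = np.arange(16, dtype=np.float32)       # host writes in place

prg = cl.Program(ctx, """
__kernel void twice(__global float *a) { a[get_global_id(0)] *= 2; }
""").build()

prg.twice(queue, ary.shape, None, cl.SVM(ary))
queue.finish()
print(ary)                                     # host reads the result in place
```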

You seem to have decided that GPU computation is not and never will be relevant to you, but the potential performance benefit is huge. I know potential means nothing until it's production-ready, but the Mac Pro design caters for this scenario pretty well.
    oxonrich wrote: »
These software packages aren't all interchangeable. If we switched to a new render engine every time there was a new must have feature, we'd spend months each year re-training people.

If you're stuck on one piece of software then you have to cater the hardware to it, and if that excludes Apple, so be it. Apple doesn't have to cater their hardware to a single software package or usage case.
    oxonrich wrote: »
The performance gap between the Mac Pro and top end iMac has been narrowing for years. But the old Mac Pro still made sense for thousands of people because they either wanted dual processor extreme performance, or you could do things with it that you simply couldn't do with the iMac.

    The old one only went up to 12-core too. At most it was only ever 3x the performance of the iMac. The current 12-core MP is still around the same ratio vs the iMac.
    oxonrich wrote: »
What absolute rubbish. The largest phone market is the high-end, is it? Android dominates the largest market. Apple dominate the high end/high margin.

They do dominate the high margin segment, but they sold around the same number of smartphones as Samsung last quarter. They don't cater to groups where they'll sell hardly any units, which is why the XServe is gone, as is the 17" MBP.
    oxonrich wrote: »
Perfect - the creative industry that kept Apple alive pre-iOS is now apparently irrelevant to them. I don't think Tim Cook would agree with you though.

The creative industry nearly killed Apple pre-iPod/iMac. Before the iMac, they were months from bankruptcy. People keep rewriting this part of history. When Steve Jobs went back to Apple, it was the iMac that turned it around and it immediately outsold every other model. He's not sitting with a Mac Pro on top of him:

[embedded photo]

Steve wanted to get rid of the high-end and it was Tim Cook who decided otherwise. Tim even said recently in an interview that the Mac Pro doesn't do very well for them but they want to keep making it.
    oxonrich wrote: »
Who on earth mentioned gaming?! That well known staple of OSX use.

You mentioned consumer use; what other consumer use is there for a powerful GPU? OS X is used for gaming:

    http://appleinsider.com/articles/10/06/05/steam_survey_finds_more_than_8_of_gamers_use_apples_mac_os_x

The real share will be higher than that, because some of those playing the Windows versions will be doing it on a Mac.
    oxonrich wrote: »
It supports 3x 4K screens. One per Thunderbolt bus. Keep up.

All 6 ports are capable of running a 4K display. I know you can't run 6 at once, but that doesn't make it a consumer platform. Every computer has a limit; falling short of the highest spec you can imagine is not the cutoff point between a consumer product and a professional product.
  • Reply 48 of 52
Excellent article again. DED and HD (@Asymco) are the only original thinkers writing tech at the moment.
  • Reply 50 of 52

It doesn't make sense for Apple to decouple the GPU from the CPU in the A(X) processor and manufacture it separately for use in their desktop and laptop computers. It would seem much more logical for Apple to release a MacBook Air with the A(X) SoC in its entirety than to build a motherboard for an Intel x86 processor with additional hardware to integrate an Apple GPU.

     

Apple's desktop and laptop sales are only a fraction of their mobile products and I don't expect to see a lot of activity in that arena. Apple has the resources available to create an ARM processor with serious brawn, enough to compete with Intel's Xeon, but it makes no sense to do so as it would remove the focus from Apple's mobile products, where all the momentum is and where Apple makes its fortunes.

     

Intel makes powerful processors for the desktop, but that market is now in serious decay. My own iMac is barely used, primarily just for backing up my iPhone, and I have no plans to upgrade it for a long time. Granted, I do not need the grunt to perform serious rendering or video, I don't do gaming, and I no longer need the Intel processor to run Windows as my workplace is iPad and iPhone friendly. I am able to do everything I need with an iPad and a real keyboard, which is easy to pair over Bluetooth.

     

Apple has no compelling reason or need to migrate the Mac Pro or MacBook Pro to ARM. The Air might migrate for the sake of better battery life, and I myself would be in the market for a MacBook Air that could run on batteries for 12 hours. As I mentioned, I have no need for the Intel x86 processor and suspect the majority of Air users don't either.

The A(X) processor with a GPU superior to Intel's integrated graphics would likely ease the Air's transition anyway.

     

The pro versions are a different story, and I don't expect Apple to migrate them to an A(X) processor in the near or moderate term.

     

Over the long term, Apple will have a decided advantage over Intel. There just won't be enough of a desktop market for x86 to justify Intel's very significant R&D budget in a shrinking market. Samsung is now dedicating double what Intel is to its CPU R&D budget. I just don't see any way for Intel to move forward.

     

Apple is pretty absolute about this and won't be using x86 processors in their mobile products. Samsung has what now seems an insurmountable lead in the manufacture of SoCs. Intel has spent billions chasing a market that has completely eluded them and there is no way for them to get on board. They needed to purchase an ARM license and go after Apple's business; they didn't, and the rest will become history.

     

However, Intel will still have Apple's pro business for now. It doesn't mean much for the longer term, though; it's now really only a matter of time before the A(X) series reaches parity with the Xeon. No amount of wishful thinking or Intel technobabble will stop that from happening. Apple and Samsung have too much momentum, extremely talented engineers and the desire to succeed above all else. When that day comes, all of Apple's business will move to ARM/PowerVR. Not even the legacy base of installed Windows programs can stop it. And Apple won't have to separate the PowerVR GPU from the ARM CPU for that to happen. When it does, Nvidia and ATI will be displaced automatically.

  • Reply 51 of 52
hmm Posts: 3,405 member
Quote:
Originally Posted by herbivore

However, Intel will still have Apple's pro business for now. It doesn't mean much for the longer term, though; it's now really only a matter of time before the A(X) series reaches parity with the Xeon. No amount of wishful thinking or Intel technobabble will stop that from happening. Apple and Samsung have too much momentum, extremely talented engineers and the desire to succeed above all else. When that day comes, all of Apple's business will move to ARM/PowerVR. Not even the legacy base of installed Windows programs can stop it. And Apple won't have to separate the PowerVR GPU from the ARM CPU for that to happen. When it does, Nvidia and ATI will be displaced automatically.

It's unlikely that Apple is anywhere near the biggest customer of either AMD or Nvidia. If anything, their survival requires that they outlast Intel. Just look at the current lineup: the MacBook Air doesn't contain them; only a small percentage of 15" MacBook Pro configurations contain Nvidia; the Minis do not contain them; and they have been removed from the least expensive iMacs. As it stands, Nvidia isn't in that many Macs. AMD effectively sells two GPUs per Mac Pro, but I have no idea how many are sold - it's obviously their lowest volume line, with the possible exception of the Mini.

  • Reply 52 of 52
Nvidia and AMD are very likely much more threatened by Apple's mobile chips. It will be interesting to see how the A9 stacks up against the Tegra X1 in graphics performance.

If the benchmarks regarding the Samsung Exynos 7420 are true and the chip really is in the same ballpark as Intel's Core i5 in CPU performance, then Intel really is in trouble. The link is here:

    sonictech101.blogspot.com/2015/01/how-fast-are-tegra-k1-x1-and-exynos.html?m=1

If this is true and the Apple A9 beats the Exynos 7420 in CPU performance along with the Tegra X1 in GPU performance, it is entirely conceivable that we'll see Apple launch an ARM powered MacBook in the very near future.

    I just can't see Apple building a standalone GPU based on a PowerVR design for their Mac Pros.

    At the rate those mobile processors are improving, Apple will be moving their whole line over to ARM. They may continue to release Intel based Macs for a time in order to maintain compatibility, but really, for how long?

With most software development now occurring on iOS or Android on ARM, how long are Intel, or even Microsoft for that matter, going to remain relevant?

    I think I would much rather have a Mac mini running on an A9 processor with a real GPU than an i7 with Intel graphics.

I guess I just don't see the reasoning behind separating the GPU from the A(X) processor and eating AMD's and Nvidia's lunch. Intel's lunch is about to get eaten. That's where I see things logically headed.

Samsung is about to cause Intel a great deal of horrible, agonizing pain. It is not going to be pretty, and I have many friends who work for Intel. I just never realized that ARM was this close to the Core processors in performance.