Apple silicon Mac documentation suggests third-party GPU support in danger

Comments

  • Reply 21 of 86
    netrox Posts: 1,422 member
    An SoC is ALWAYS faster than the equivalent discrete parts, assuming the components have equal specs. The reason is that signals have less distance to travel.
     
  • Reply 22 of 86
    Detnator Posts: 287 member
    Sidtech said:
    JinTech said:
    Or you could read it as saying that this is what the Apple Silicon processor is directly compatible with. It doesn't say they will not support third-party. I could see Apple using their own GPU for primary tasks but third-party for beefier tasks. Do we really think that Apple could compete with a GPU like the AMD Radeon Pro Vega II for professionals?
    Well, this is Apple we are talking about, a company whose ego knows no bounds. After all, Apple convinced itself that real pros would be perfectly content with the abysmal disaster that was the butterfly keyboard(s), just as it bet in 2013 on the trash can Mac Pro and its GPU options. 

    It would seem that, excluding the Mac Pro, all their Macs will in future come equipped with integrated graphics on the Apple Silicon SoC. That is not a bad thing at all, as devices like the Xbox and PS4 are perfectly functional examples of SoCs working fine, but those consoles come equipped with fast GDDR6 memory, and my question is how Apple will handle high-bandwidth memory on its Macs.
    Yeah... I’m sure they have no idea what they’re doing, and they’re going to release these Macs with really slow memory, effectively wasting all the rest of the performance capabilities. 

    I mean, we here know about the weakest link concept. I’ll email Tim right now and try to get him up to speed. God I hope he listens. Otherwise Apple is doomed!  Doomed I tell you. 

    /s
    😉 
  • Reply 23 of 86
    Detnator Posts: 287 member
    keithw said:
    I'm finding it extraordinarily difficult to believe that Apple could replace the existing Xeon-based Mac Pro (or even iMac Pro) with comparable performance & expandability in only two years.  Low end, no problem at all. 
    I’ll bet dollars to donuts they already have working prototypes of these in their labs, and those prototypes are already wiping the floor with today’s Macs. There’s no way they’d be announcing this before that. 
  • Reply 24 of 86
    KITA Posts: 393 member
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCIe, it will likely only mean Apple cards work with it, and it will mean the expansion is a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and the smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    What evidence do you have that would prove Apple is on a path to beat NVIDIA or AMD?

    Both have strong development roadmaps, access to the latest nodes from TSMC/Samsung, industry-wide support, and well-optimized drivers. 

    It's not impossible and perhaps they will, but so far you haven't provided even the slightest bit of evidence to back up that claim.
  • Reply 25 of 86
    JinTech said:
    Do we really think that Apple could compete with a GPU like the AMD Radeon Pro Vega II for professionals?
    YES

    It's easy to forget that this is not the Steve Jobs Apple of long ago that struggled to rise up with limited resources. It's now Tim Cook's $1.5 trillion juggernaut. Cram the market value of Intel, AMD and NVIDIA together and Apple could buy them several times over with cash on hand, and throw Qualcomm in for shits and giggles.

    Apple has immense resources and they are being underestimated, again. They are not going to produce a "me too" chip. I'm betting they will blow away all expectations. Cutting Intel and AMD out will dramatically reduce costs (no middlemen) and allow for optimizations that simply cannot be achieved with 3rd-party hardware. On top of that, Apple doesn't have to produce whole families of CPU and GPU chips to suit dozens upon dozens of customers, since they are their only customer, and that requires significantly fewer resources than Intel, AMD and NVIDIA have to throw at the problem.
  • Reply 26 of 86
    KITA Posts: 393 member
    rob53 said:
    The A12Z Bionic has up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want, any way they want. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation; it's actually extremely overpriced and underpowered. It might only be good in certain applications optimized under macOS; outside of that, nope.

    Repost from another thread:

    After Effects



    While Macs often perform fairly well, in After Effects there is simply no argument that a PC workstation is both faster and significantly less expensive. Compared to the $20k Mac Pro we tested, a $4k PC using an Intel Core i9 9900K and NVIDIA GeForce 2080 Ti ended up being about 5% faster overall, while a $5.5k PC using an AMD Threadripper 3960X is about 18% faster. Even compared to the much better priced iMac Pro, a PC that costs $1K less is going to be about 35% faster.

    What this means is that you can get the same or faster performance from a properly configured PC at a quarter (or less) the cost of a Mac Pro. With an application like After Effects where you can distribute renders across multiple machines using plugins like BG Render Max or RenderGarden, this isn't even about just getting similar performance at a lower price point. You can decrease your render times by 4-5x by purchasing multiple PCs and using network rendering to split up the work between each system. This only improves render performance (not live playback), but also gives you a ton of flexibility to have renders running on multiple machines while simultaneously working on other comps on your primary workstation.

    Or, you can simply save that $15k and spend it on a new car, home remodel, or a really, really fancy vacation.

    https://www.pugetsystems.com/labs/articles/After-Effects-performance-PC-Workstation-vs-Mac-Pro-2019-1718/

    Photoshop


    Since Photoshop is largely unable to take advantage of higher CPU core counts, there often isn't much of a difference between most modern mid/high-end CPUs - and that applies for a Mac just as much as it does for a PC workstation. Overall, if Photoshop is your primary concern, you can get about 10% higher performance from one of our $4,200 Puget Systems workstations with either an AMD Ryzen 3900X or Intel Core i9 9900K compared to the $19,599 Mac Pro (2019) we tested.

    Now, is 10% going to be a game-changer for your workflow? Probably not - it is right on the edge of what you might be able to notice in everyday work. The main takeaway here is not necessarily the performance alone, but rather how much you have to pay to get it. Even if you forget the Mac Pro and go with the much more reasonably priced iMac Pro, you are still likely to pay about twice the cost for equivalent performance.

    https://www.pugetsystems.com/labs/articles/Photoshop-performance-PC-Workstation-vs-Mac-Pro-2019-1716/

    Premiere Pro


    Since there are so many reasons why either a Mac or a PC may be right for you, we generally try to focus on the straight performance results and not tell you which you should purchase. But in this case, the Mac Pro is so underwhelming that it is hard to not simply say "Don't buy a Mac Pro for Premiere Pro".

    This isn't like our Photoshop testing where the Mac Pro was only a hair slower than a PC, or our After Effects testing where a PC can easily be 20% faster at a much lower cost. This time, we are talking a PC being up to 50% faster on average for 1/3 the cost. We understand that there is a lot of benefit to staying in the Apple ecosystem if you also have an iPhone, MacBook, etc., but that is a huge amount of performance and cost savings you will be giving up to get a Mac Pro.

    By skipping the Mac Pro and going with a PC, you could easily save $14,000 which could be used for a host of other things to improve your workflow. Maybe you can finally upgrade your reference monitor to a really nice Eizo or Flanders Scientific model. Or use it as an opportunity to move to a central NAS storage unit from LumaForge. Or just take a couple months off to recharge. And this isn't taking into account the amount of money you might be able to earn due to the higher performance of a PC.

    https://www.pugetsystems.com/labs/articles/Premiere-Pro-performance-PC-Workstation-vs-Mac-Pro-2019-1719/

  • Reply 27 of 86
    netrox Posts: 1,422 member
    KITA said:
    [quotes Reply 26 above in full, including the Puget Systems After Effects, Photoshop, and Premiere Pro comparisons]

    It's likely that Apple Silicon will outperform the current offerings. There are too many performance bottlenecks that come from cobbling fragmented devices together into a system, and it would be nice to have Apple address all those shortcomings with its own SoC. 


  • Reply 28 of 86
    cpsro Posts: 3,198 member
    Has anyone gotten the beta 2 update on their DTK? Since setup, my DTK has always reported "Unable to check for updates," which makes me think something is wrong, that the system can't even check for an update.
  • Reply 29 of 86
    Detnator Posts: 287 member
    jonahlee said:
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCIe, it will likely only mean Apple cards work with it, and it will mean the expansion is a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and the smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    Instead of just trolling the posters, how about commenting on the substance? So you think Apple can beat high-end GPUs from NVIDIA and AMD? What makes you think so? I would love for it to happen. It seems to me that supporting 3rd-party graphics, at least in the short term, would be smart as Apple works on its own GPU hardware, but the chart does not make that seem likely.
    He’s not trolling. He’s got a great point. And he doesn’t really need to elaborate because we’ve been down this road before. But I’ll elaborate for him.  He’s just saying in two lines what I’ll now say in a load of paragraphs. 

    The question is, why shouldn't Apple be able to beat out the market leaders?  As I noted above, I’ll bet they already have. 

    But the problem is now I’ll be called a fanboy for blindly believing Apple can do no wrong. No. This is no more complicated than history repeating itself. 

    Sure, Apple’s made some mistakes - butterfly keyboard, trash can Mac Pro, iPod HiFi (lol that one was a bit of a farce wasn’t it). But other than people/companies that really never have any impact on the world, who hasn’t had failures? Anyone who knows anything about success knows it doesn’t come without risk. 

    Apple’s “courage” is laughed at because Phil Schiller applies that word to the wrong things. But that trash can Mac Pro was amazing in its time for what it was - it just wasn’t scalable, so it stagnated.

    But this switch to their own silicon IS incredibly courageous because they DO dare to take on the market leaders. It’s risky because they have to think of everything and cover everything. But as I said above they’re not making this announcement without having the prototypes of the whole lineup already working. And history has proven multiple times that when Apple makes a big change they do NOT do it half-assed (e.g. compare with Microsoft’s ARM Windows), they get it right - or at least right enough - and it changes the world. Again. Almost every one of their successful products has done that, from the original Apple I to the iDevices of today.

    Note I said their successful products. Sure they’ve had some failures along the way, but that’s ok. You can’t succeed without risk and you can’t risk properly without failing some of that time. Those failures are just as important as the successes because of what they learn and then change. They f’d up the Mac Pro with the trash can. They admitted it (eventually) and then made radical changes for the next one, and that’s an incredible machine, and a success. Then the failures fade into the archives of the universe and the successes that are left change the universe.  For the better. 

    History has also proven that every time they do this some people and groups completely underestimate and scoff and ridicule it. Until they turn out to be wrong and then they copy Apple.  How many laptops looked anything like the original MBA before it? And how about a year or two after it came out? And now every sub-notebook looks like it (except the hybrid convertible things). How many phones looked or functioned anything like the iPhone before the iPhone? And now?

    Those same people underestimated the other transitions. OSX when it first came out got terrible reviews and lots of ridicule. Some people just don’t like change. But then OSX changed the world. Again. (Not just because it was a revolutionary new computer OS but also because it was the foundation of iOS in iPhones and iPads that changed the world). 

    Bottom line: rest assured Apple knows what they’re doing and they obviously have the resources to implement what they know. These new Macs are going to change the world. Again. Possibly more so than any computer before them. 

    For one they’ll be more powerful. But two: Don’t underestimate how much having all their devices from the watch up to the MP on the same silicon (including GPU and other chips, not just CPU) will impact things. Running iPad apps directly and natively on a Mac is just the tip of the iceberg with what they could do with their entire lineup now. 

    These new Macs will do to the PC world what the iPhone and iPad did to the mobile device world. And it’s about time. 

    Call me a fanboy if you want, but nothing I’ve said above is blind faith. Nor is it “Apple can do no wrong.” It’s history repeating itself. And Apple can and does do plenty wrong. But they do enough right, and they don’t do these BIG changes without thoroughly figuring it out before they even announce it.  And with what they do do right, they change the world. And now they’ll do it again. I for one am really looking forward to it. 

    Back to the original point: As for supporting third party graphics, they don’t need to. Per history, there’s no way they announced this without already having their graphics kicking ass in the labs - same as with the CPUs. 
  • Reply 30 of 86
    MplsP Posts: 3,931 member
    The software developers in the crowd can answer this for me:
    Apple has shown that it can design processors. Even if we assume they can also design a graphics processor with capabilities on par with AMD or Nvidia, what does that mean for the software? If a developer has a piece of software that was designed to work with a 'traditional' graphics card, will it need to be re-written to actually work well with Apple's processor? Requiring software to be re-written would mean less software that is optimized and less software in general.

    Also, there's the problem of scale. Every Mac sold will use the Apple processor, but the majority of people buying Macs don't really care about graphics processing. That makes the return on investment for designing and developing a high-end graphics processor more difficult. By using a 3rd-party graphics card you let the other company do the development, and only the people who are interested in the capabilities pay for it.
  • Reply 31 of 86
    jdb8167 Posts: 626 member
    cpsro said:
    Has anyone gotten the beta 2 update on their DTK? Since setup, my DTK has always reported "Unable to check for updates," which makes me think something is wrong, that the system can't even check for an update.
    I'm pretty sure that discussing anything DTK-related here is breaking an NDA. You should have access to the Universal App Quick Start section of the Developer Forums; use that for DTK discussions, despite how awful the new developer forums are.
  • Reply 32 of 86
    macplusplus Posts: 2,112 member
    dedgecko said:
    rob53 said:
    The A12Z Bionic has up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want, any way they want. Look at the Mac Pro. It's a fantastic workstation. 
    Exactly. Look at their Afterburner Card. That’s a freaking monster, right there!
    Afterburner is not a GPU. It is an off-the-shelf FPGA with different strengths, and vertex calculation is not one of them.
    Well, if Apple wants to reinvent the 3D world, why don't they just ask the countless game development studios?
    Apple has nothing to learn from game studios. During the keynote we saw how Rise of the Tomb Raider performed on the most rudimentary form of Apple Silicon, namely the A12Z. And that was even after Rosetta translation of its Intel code!
  • Reply 33 of 86
    MplsP said:
    The software developers in the crowd can answer this for me:
    Apple has shown that it can design processors. Even if we assume they can also design a graphics processor with capabilities on par with AMD or Nvidia, what does that mean for the software? If a developer has a piece of software that was designed to work with a 'traditional' graphics card, will it need to be re-written to actually work well with Apple's processor? Requiring software to be re-written would mean less software that is optimized and less software in general.

    Also, there's the problem of scale. Every Mac sold will use the Apple processor, but the majority of people buying Macs don't really care about graphics processing. That makes the return on investment for designing and developing a high-end graphics processor more difficult. By using a 3rd-party graphics card you let the other company do the development, and only the people who are interested in the capabilities pay for it.
    Think about it this way.

    For several years now Apple has implemented a graphics layer called Metal.  The name implies that it is a very direct interface to graphics cards.  But it is not completely direct, because Apple does not control the graphics cards and so there has to be a translation layer between what Apple does and the card's implementation.

    So imagine if Apple controls graphics.  Then Metal really is metal — the cards will be designed from the ground up to implement Metal directly.  Pow!  That should mean far superior performance to traditional cards.  Unless you don't think Apple can afford to hire the world's best graphics engineers?

    And of course the neural network features alone show that Apple has proven expertise in multi-core processor implementation.

    So to directly answer your question, Apple graphics programmers use Metal.  So whatever they write will run without changes on the new platform.  Just a lot faster, that's all.  Bam!
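
    To make that concrete, here is a minimal sketch of a Metal compute pass (my own illustration; the addArraysOnGPU and add_arrays names are made up, not anything from Apple's documentation). The point is that the app only ever talks to MTLDevice and friends, so the same source should keep working whether the GPU behind that device is an AMD card today or an Apple-designed one later:

    import Metal

    // Hypothetical example: add two float arrays on whatever GPU the system provides.
    func addArraysOnGPU(_ a: [Float], _ b: [Float]) throws -> [Float] {
        guard let device = MTLCreateSystemDefaultDevice(),   // vendor-agnostic handle to "the GPU"
              let queue = device.makeCommandQueue() else {
            fatalError("No Metal-capable GPU available")
        }

        // Illustrative kernel compiled at runtime; a real app would ship it precompiled.
        let source = """
        #include <metal_stdlib>
        using namespace metal;
        kernel void add_arrays(device const float* a   [[buffer(0)]],
                               device const float* b   [[buffer(1)]],
                               device float*       out [[buffer(2)]],
                               uint i [[thread_position_in_grid]]) {
            out[i] = a[i] + b[i];
        }
        """
        let library  = try device.makeLibrary(source: source, options: nil)
        let pipeline = try device.makeComputePipelineState(
            function: library.makeFunction(name: "add_arrays")!)

        let byteCount = a.count * MemoryLayout<Float>.stride
        let bufA   = device.makeBuffer(bytes: a, length: byteCount, options: [])!
        let bufB   = device.makeBuffer(bytes: b, length: byteCount, options: [])!
        let bufOut = device.makeBuffer(length: byteCount, options: [])!

        // Encode and run the compute pass; nothing here names a GPU vendor.
        let commands = queue.makeCommandBuffer()!
        let encoder  = commands.makeComputeCommandEncoder()!
        encoder.setComputePipelineState(pipeline)
        encoder.setBuffer(bufA,   offset: 0, index: 0)
        encoder.setBuffer(bufB,   offset: 0, index: 1)
        encoder.setBuffer(bufOut, offset: 0, index: 2)
        encoder.dispatchThreads(MTLSize(width: a.count, height: 1, depth: 1),
                                threadsPerThreadgroup: MTLSize(width: 32, height: 1, depth: 1))
        encoder.endEncoding()
        commands.commit()
        commands.waitUntilCompleted()

        let out = bufOut.contents().bindMemory(to: Float.self, capacity: a.count)
        return Array(UnsafeBufferPointer(start: out, count: a.count))
    }

    // Usage: try addArraysOnGPU([1, 2, 3], [10, 20, 30]) should yield [11.0, 22.0, 33.0].

    Whether the driver underneath is a thin shim or a thick translation layer is Apple's problem, not the app's; that is why code written against Metal should carry over to Apple GPUs without changes.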

  • Reply 34 of 86
    indiekiduk Posts: 381 member
    In danger? It's gone completely. ARM Macs don't even have PCIe
  • Reply 35 of 86
    jdb8167 Posts: 626 member
    In danger? It's gone completely. ARM Macs don't even have PCIe
    There are no ARM Macs yet. Apple has released a very limited Developer Transition Kit that is minimally useful to developers for testing their macOS software on the new Apple Silicon. It is impossible to reliably infer anything about upcoming Apple Silicon Macs based on the DTK. Apple has been very clear on this. The DTK does not represent new Apple hardware in any way. It is based on the iPad Pro's SoC, first released in 2018, with a minor adjustment in 2020 to enable an additional GPU core.

    And point of fact, Apple SoCs have used PCIe to interface with NVMe flash since the iPhone 6s.
  • Reply 36 of 86
    See, the thing that worries me most is the little things in Apple's actions already. If they have behind-the-scenes test hardware for all their configurations already, why couldn't they manage to get Thunderbolt into their developer machine? Does that mean Thunderbolt is gone for them, or that they couldn't get it working? Either way is not a good thing, considering how much they have leaned on Thunderbolt for pro work over the past years.

    And the whole idea of no external graphics card, just everything on the one chip, again means less chance of expansion and more chance of having to keep upgrading the machines every few years. The thing that was so great about the old Mac Pro was the ability to upgrade the graphics card, and graphics on the chip are not going to be upgradeable.

    And then there is the whole issue of getting companies to support it. It is great that Adobe is working on it, but what about Avid? It took them this long to support Catalina, and that is with AMD graphics.
  • Reply 37 of 86
    viclauyyc Posts: 849 member
    netrox said:
    An SoC is ALWAYS faster than the equivalent discrete parts, assuming the components have equal specs. The reason is that signals have less distance to travel.
     
    True. But just look at current CPUs and GPUs, how much power they need and how much heat they generate. 

    Even if Apple can work magic, it will make more sense to separate the CPU and GPU, not to mention the amount of power a GPU needs. 

    The Apple Silicon Mac is not an iPhone. The form factor is not that important, especially for the Mac Pro. Unless Apple wants to make a Mac Pro the size of a Mac mini.

  • Reply 38 of 86
    jonahlee said:
    See, the thing that worries me most is the little things in Apple's actions already. If they have behind-the-scenes test hardware for all their configurations already, why couldn't they manage to get Thunderbolt into their developer machine? Does that mean Thunderbolt is gone for them, or that they couldn't get it working? Either way is not a good thing, considering how much they have leaned on Thunderbolt for pro work over the past years.

    And the whole idea of no external graphics card, just everything on the one chip, again means less chance of expansion and more chance of having to keep upgrading the machines every few years. The thing that was so great about the old Mac Pro was the ability to upgrade the graphics card, and graphics on the chip are not going to be upgradeable.

    And then there is the whole issue of getting companies to support it. It is great that Adobe is working on it, but what about Avid? It took them this long to support Catalina, and that is with AMD graphics.
    The A12Z already worked with USB-C, so it was a no-brainer for the developer hardware, and there was no point adding Thunderbolt since I'm sure they're going to use a new chip for the production units. I'm guessing eGPUs will work over Thunderbolt with drivers once they are available, but that's anybody's guess.

    I'm sure Adobe and Avid will come along for the ride once they see how this new hardware outperforms x86. Technically all of their software will already run after a few days of modifications, so they'll already be there. Xcode is processor-agnostic, so it's not like they have to rewrite everything from the ground up, and they'll be able to optimize over time as new versions come out.
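
    To give a feel for why the port is mostly a recompile, here is a rough sketch (my own illustration; acceleratedSum is a made-up name, not Adobe or Avid code). The same Swift source builds for both x86_64 and arm64, an #if arch(...) branch is only needed where something is genuinely architecture-specific, and Xcode folds both slices into a single universal binary:

    import Foundation

    // Hypothetical helper, for illustration only.
    func acceleratedSum(_ values: [Double]) -> Double {
        #if arch(arm64)
        // Apple Silicon slice: an arm64/NEON-tuned path could live here.
        return values.reduce(0, +)
        #elseif arch(x86_64)
        // Intel slice: an AVX-tuned path could live here.
        return values.reduce(0, +)
        #else
        return values.reduce(0, +)
        #endif
    }

    print(acceleratedSum([1.0, 2.0, 3.5]))   // 6.5 on either architecture

    Everything that doesn't touch the architecture directly, which is most application code, compiles for both slices from the same source.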
  • Reply 39 of 86
    fastasleep Posts: 6,420 member
    jonahlee said:
    See, the thing that worries me most is the little things in Apple's actions already. If they have behind-the-scenes test hardware for all their configurations already, why couldn't they manage to get Thunderbolt into their developer machine? Does that mean Thunderbolt is gone for them, or that they couldn't get it working? Either way is not a good thing, considering how much they have leaned on Thunderbolt for pro work over the past years.
    Couldn't manage to or had no need to?
  • Reply 40 of 86
    hexclock Posts: 1,256 member
    Beats said:
    I say go ahead. I'm confident Apple can leap ahead of the industry.

    Sidtech said:
    JinTech said:
    Or you could read it as saying that this is what the Apple Silicon processor is directly compatible with. It doesn't say they will not support third-party. I could see Apple using their own GPU for primary tasks but third-party for beefier tasks. Do we really think that Apple could compete with a GPU like the AMD Radeon Pro Vega II for professionals?
    Well, this is Apple we are talking about, a company whose ego knows no bounds. After all, Apple convinced itself that real pros would be perfectly content with the abysmal disaster that was the butterfly keyboard(s), just as it bet in 2013 on the trash can Mac Pro and its GPU options. 

    It would seem that, excluding the Mac Pro, all their Macs will in future come equipped with integrated graphics on the Apple Silicon SoC. That is not a bad thing at all, as devices like the Xbox and PS4 are perfectly functional examples of SoCs working fine, but those consoles come equipped with fast GDDR6 memory, and my question is how Apple will handle high-bandwidth memory on its Macs.

    1-post troll, not sure if you'll ever return, but:
    Apple did good. You guys read too many tech-biased articles, it becomes part of your knowledge, then you flood the internet with your ignorance and convince yourselves you're intelligent. The same trolls call the iPhone a "failure" if it only sells to 2% of the world's population.

    "they are would be perfectly content with the abysmal disaster that was their butterfly keyboard(s) "

    Discontinued. Move on.

    "Trash can Mac" a cheap shot to ignore how brilliant a device is. Same as "toilet paper" Homepod, "giant iPod" iPad, and "chocolate bar" phone of the past. You eat up the SHIT these tech writers feed you and regurgitate it onto the public while thinking you're superior.

    Get lost.
    I love my “trash can”. That thing still easily handles anything I can throw at it with Logic. 
    It certainly feels plenty fast compared to what my G5 did at the same age. 