madan

About

Banned

Username: madan
Joined:
Visits: 29
Last Active:
Roles: member
Points: 309
Badges: 0
Posts: 103
  • Editorial: Will Apple's $6k+ Mac Pro require brainwash marketing to sell?

    Remember that it's 5,999 PLUS TAX and AppleCare.  With those additions, that computer almost hits 7,000.  If you upgrade the RAM and the measly 256 GB of storage yourself, you're looking at another 500 dollars.  And that's BEFORE you even look at a real graphics card.  The Mac Pro's 580 is only about 30% faster than the integrated graphics in AMD's higher-end APUs like the 3400G, and 30% over integrated graphics isn't "powerful".  So by the time you sink another 1,000+ into a Vega 2 card, you're looking at at least 8,500 dollars (probably closer to 9,000).

    And even then, you could build a Mac with 90% of that performance for a quarter of the price.
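    For what it's worth, here's that back-of-envelope math as a quick Python tally. The tax rate, the AppleCare price, and the upgrade costs are illustrative assumptions based on the rough figures above, not quotes:

```python
# Back-of-envelope tally for a "usable" base Mac Pro (all figures approximate).
# The 8.5% tax rate and the AppleCare price are assumptions for illustration only.
base_price  = 5999                    # base Mac Pro
tax         = base_price * 0.085      # assumed ~8.5% sales tax
applecare   = 499                     # assumed AppleCare+ price
ram_storage = 500                     # DIY RAM + storage estimate from the post
vega_ii     = 1000                    # rough minimum for a Vega II class card

total = base_price + tax + applecare + ram_storage + vega_ii
print(f"Estimated all-in cost: ${total:,.0f}")   # lands around $8,500
```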


  • Editorial: Will Apple's $6k+ Mac Pro require brainwash marketing to sell?

    You know, come to think of it: you could get an iMac Pro with 64 GB of RAM, a base Xeon, more storage and a Vega 56 that beats the base Mac Pro over the head, and it STILL COMES WITH A 5K LG PANEL BUILT IN.  How nuts is that?

    Sure, it doesn't come with 12 TB3 lanes, but seriously, you're probably not going to need that.  A base Xeon wouldn't be able to handle that throughput anyway.  So honestly, the iMac Pro is a better deal, because in 5-6 years you just buy another iMac Pro and you get a whole new system, PLUS A WHOLE NEW MONITOR to boot, for the same price.  Like I said, the Mac Pro only makes "sense" once you start cracking the 20,000 USD threshold, once you start putting in GPU and CPU configurations that can handle the kind of bandwidth and performance an iMac Pro just can't touch.  But the base system?  A Vega 56 is 30% faster than the Mac Pro's base Radeon 580, and the iMac Pro costs LESS and COMES WITH A MONITOR.
  • Editorial: Will Apple's $6k+ Mac Pro require brainwash marketing to sell?

    MacPro said:
    madan said:
    Unfortunately, the Mac Pro has a distinct issue with its value curve.  It's a horrible value at its base price, and it only becomes a good value as the price climbs toward the astronomical.

    At its 6,000 USD base price tag, the computer is a joke.  The base Xeon it uses was about 1,200 bucks (on release).  It was blessed with 240 dollars of ECC RAM (on release).  It has a nice, airflow-centric case, to be sure; good solid steel/aluminum cases often run 200-300 USD.  Even if we counted the Mac Pro's case as a 500 dollar case and its default M.2 storage as 240 dollars, we'd still only be sitting at about 3,000 dollars in parts for the system.  The Radeon 580 is a mere 200 dollar card (even on release).

    That means you're effectively paying ~3,000 dollars for a power supply and motherboard, which is kinda nuts.  The power supply itself is about 200 bucks at most (actually less) and the fans can't be more than 100 bucks.  So you're buying a motherboard, albeit an ultra-bleeding-edge one, for around 2,700 USD, which is highway robbery.
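    To make that arithmetic explicit, here's a rough sketch of the component accounting in Python. Every figure is the approximate release-era estimate from this post, not an official bill of materials:

```python
# Rough release-era component estimates from the post (USD); not an official BOM.
parts = {
    "base Xeon":          1200,
    "ECC RAM":             240,
    "case (generous)":     500,
    "256 GB M.2 storage":  240,
    "Radeon 580":          200,
}
psu_and_fans = 200 + 100              # the post's estimates for PSU and fans

base_price  = 5999
parts_total = sum(parts.values())     # 2,380; the post rounds this up to ~3,000
board_only  = base_price - parts_total - psu_and_fans

print(f"Identifiable parts: ${parts_total:,}")
# ~$3,300 by strict arithmetic, ~$2,700 using the post's rounded ~3,000 figure
print(f"Left over for the motherboard: ~${board_only:,}")
```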

    Yes, the special component of the Mac Pro isn't the CPU or the GPU (although the Mac Pro can top out with sky-high Xeons and absolutely monstrous Arcturus-precursor dual Vega 2s); it's the motherboard.  The base system doesn't ship with any of that super hardware, though.  Yes, the motherboard accommodates 1.5 TB of ECC RAM.  Yes, it can run almost a dozen Thunderbolt 3 lanes.  Yes, it can feed GPUs power both through the slot and through adapters.  Yes, the Pro Vega 2 is a beast of a card, dwarfing the Radeon VII's already ludicrous 16 GB of HBM2.  But you get NONE of that with a 6,000 dollar base system.

    With a 6,000 dollar base system, you get an amazing motherboard that might never be used.  You get a low-end Xeon that is outperformed by most Core i9s (Xeon reliability is worth 800 dollars?!).  You get a GPU that is budget-class by today's standards (the MacBook Pro's Vega GPU is about as fast as a 565-570, which itself is only 10-15% slower than the Mac Pro's 580...).  And you get a bunch of overbuilt components, like the PSU, that may never be fully used unless you upgrade the system yourself down the line.

    You could build a DIY computer with pretty much identical performance for less than 1,500 dollars.  No, I'm not kidding.  Sure, it's not upgradeable with ECC RAM.  Sure, it doesn't have 12 TB3 lanes or 10-gigabit Ethernet.  No, it doesn't have a ridiculously overpowered PSU for a system that draws under 300 watts.  But still, you're buying a system with such low specs that all those upgrade-friendly touches are pointless unless you spend thousands more upgrading the system anyway.


    Sure, you can get a great high-end Xeon and push the RAM to 1.5 TB.  Yes, two Pro Vega 2s are absolutely nuts, with a max of 128 GB of HBM2.  But that system costs 50k.  The base system gets you NOTHING, and it's 6,000 USD.  For workflow alone, a computer a quarter of the price will do the job.

    So yes, the Mac Pro may be a great machine at the high end, but anyone who buys it at the low end had better not convince themselves they're getting a supercomputer, because it's a budget system at most, and they're paying 4-10x as much for the privilege of the Apple emblem.
    I agree the base config is not ideal.  I am hoping a DIY RAM upgrade is possible, as I don't want to pay Apple RAM prices, and I am used to 64 GB in the trash can, so I'd want at least that.  The GPU choice is still open in my mind until I see pricing, but I suspect even the base is a leap from my dual AMD FirePros.  8 or 12 cores would be enough for me for sure.  That all said, in five years this machine will still be totally configurable; an iMac Pro isn't.
    You can totally upgrade the RAM and GPU yourself.  The problem is that anything worthy of that motherboard is going to run you thousands of dollars, which means you're looking at an 8k system.  Again, that kind of workflow lends itself to mission-critical server work, not prosumer production.  The Radeon 580 is about 20% faster than the D700s in the old Mac Pro.  That's it.  Sure, you only have one GPU, so the support is probably better, but a Radeon 580 is a budget card.  If you move to a Pro Vega 2, you're looking at at least a 1,000 dollar increase in price.  That's because the card is basically an up-RAMmed Radeon VII, which MSRP'd at 700 USD.  It's basically an MI60 on steroids.

    In 6 years, the system's CPU will be woefully underpowered.  The GPU will be upgradeable.  But you're paying almost 10,000 USD for the privilege if you do it correctly.
  • Apple's macOS Catalina causing problems with select eGPU setups

    eGPUs are actually a great solution for the average user, albeit an expensive one.  You get the portability of a slim laptop or small desktop coupled with the grunt of a desktop graphics card.  The fact that Apple has run into an eGPU snag with an update isn't a big deal; Windows updates have broken eGPUs too, both of the Boot Camp/Mac variety and of the Windows-Razer variety.  They've broken eGPUs through updates 1903.1862.300, 329, 356, 387/8/9 & 418.  They've broken eGPUs for TWO MONTHS.

    As for nVidia GPUs... there is no problem with avoiding them, as there's really nothing you're missing out on with nVidia support other than CUDA, and:

    A.  That's a software issue.  There's no reason other applications can't adopt OpenCL, Vulkan or, yes, Metal 2 for advanced compute API capabilities.  The reason people push CUDA so hard is that nVidia paid a bunch of developers to adopt it in the first place.  nVidia, for those who don't know, is pretty sleazy.  Metal 2 competes favorably with DX 12 in graphics API performance and isn't far behind CUDA in compute performance.  The reason Metal 2 isn't more prevalent in applications is that Apple hasn't paid everyone under the table, and nVidia is as dirty as it gets.

    B.  CUDA is dead in the water.  Even with CUDA and dedicated hardware, AMD cards, including Vega, Vega 2 and Arcturus, are far, far better at FP and compute.  How much better?  I use my Radeon VII eGPU setup for several math/science test suites and I get benched performance in line with a 2080 Ti that costs 100% more.  nVidia is really a gaming GPU maker and not much more.  If you stripped CUDA support out of some apps (specifically some video apps), you'd quickly see that in brute compute performance through OpenCL or Metal 2, AMD would bury nVidia pretty much everywhere except gaming, where it has parity or a slight disadvantage.  Oh, and AMD cards are not only more well-rounded, they're cheaper to boot.  While a Radeon VII is about the same as a 2080 and only about 5% behind a 2080 Super in games, it's up to 50% faster *on average* in compute, FP, and mining.  You know, work/money stuff.  All for less than a 2080.

    This isn't Apple's fault.  nVidia refused to let Apple support its GPUs in eGPUs without pushing the CUDA platform, and Apple told them to get drenched.  This happened AFTER the fiasco where nVidia shipped hundreds of THOUSANDS of defective 8600 GT mobility chips to Apple that were failing and cost Apple millions in replacement costs.  If anyone's to blame for nVidia not appearing in a Mac, look to nVidia and stop blaming Apple.
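    On the point that apps could target vendor-neutral compute APIs instead of CUDA, here's a minimal sketch of what that looks like with OpenCL through the pyopencl bindings. The kernel and array sizes are purely illustrative; the same code runs on AMD, Nvidia or Intel GPUs without caring who built the card:

```python
# Minimal vendor-agnostic GPU compute via OpenCL (pyopencl): a trivial
# element-wise add that runs unchanged on AMD, Nvidia or Intel devices.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx   = cl.create_some_context()      # picks whatever OpenCL device is present
queue = cl.CommandQueue(ctx)
mf    = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out   = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out)
print("Device:", ctx.devices[0].name)
print("Max error:", float(np.max(np.abs(result - (a + b)))))
```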
  • AMD launches RX 5000-series graphics cards with 7nm Navi GPUs

    There is a second myth that AMD is doing terribly in performance.

    Some important tidbits.
    A. Depending on the metric or tool, Nvidia may drastically outperform AMD.  However, you could just as easily select a tool like LuxMark, where an AMD Radeon VII on the 19.4.1 drivers obliterates a 2080 Ti that costs DOUBLE the price.

    The truth is that, on average, for most tasks, a Vega 56 with the newest drivers and a mild undervolt/overclock will match a reference 1080.  A Vega 64 with NO modifications is about 5% faster than a 1080, on average.  A Radeon VII is about 5-7% slower than a 2080, for about 50-100 USD less.  However, the VII has only 2 full driver revisions versus Turing's 7.  It's obvious that AMD has a lot of room to grow, especially for a GPU that has 60 CUs, 64 ROPs, an absolutely ridiculous 1 TB/s of bandwidth and 16 GB of HBM2.

    People need to stop pushing the propaganda that Nvidia outperforms AMD across the board.  The fact is this is a lie.  Nvidia cards may, in games, on average, be 5-10% more powerful, and they may sip less power and be slightly quieter (10 dB), but they also, on average, cost about 5-10% more.  And the fact that, even today, GCN is better at compute, thanks to generalized CUs versus purpose-built Turing cores, means that Vega and even Polaris will eventually outperform comparable Nvidia cards given drivers and time.

    We have seen this over and over again.  The 6970?  Started slower than the 780; now it practically trades with the 780 Ti.  The 7970?  The Radeon Fury X?
    The Radeon 580?  5% slower than the 1060 at launch; now almost 10% faster in most games.
    The Radeon 570?  Now faster than a 1650, despite the 1650's "newer architecture", and definitively faster than a 1060 3GB despite costing 25% *less at launch*.

    The Vega 64 started 5-10% slower than a 1080, and now, without even an undervolt, a Vega 64 is on average about 5% faster in most titles.  And it absolutely destroys the 1080 on 80% of compute tasks, and that's even with Nvidia having the benefit of CUDA over OpenCL.

    B.  For production, development and, yes, even gaming, sinking 800 dollars into a card with 8 GB of GDDR6 is just dumb.  We've had this stupid circle jerk about graphics memory for YEARS now.  The result is that, while the amount of VRAM usually doesn't matter the year a card comes out, it invariably matters eventually.  Cards with 2-3 GB of RAM are useless today, even if their core architecture is comparable to midrange cards; their performance most certainly is *not*.

    Pointing to one synthetic benchmark is equally dumb.  There are already AMD fanboy YouTube channels getting 10-15% additional performance out of VIIs, and that's with only second-generation drivers.  Navi this year is going to drop a Vega 48-type card with better bandwidth and thermals at a 1.25x+ IPC increase.  That puts it squarely in the Vega 64 category, and that type of card would be only 5% slower than a 2070.  If they can drop that card at 350-400 USD, Nvidia is dead in the water.

    And yes, I'd take a Radeon V7 over a 2080 any day of the year.  I know...I have one.
