madan

About

Banned
Username
madan
Joined
Visits
29
Last Active
Roles
member
Points
309
Badges
0
Posts
103
  • Apple's macOS Catalina causing problems with select eGPU setups

    eGPUs are actually a great solution for the average user, albeit an expensive one. You get the portability of a slim laptop or small desktop coupled with the grunt of a desktop graphics card. The fact that Apple has run into an eGPU snag with this update isn't a big deal. Windows updates have broken eGPUs too, both of the Boot Camp/Mac variety and the Windows/Razer variety. They've broken eGPUs through the 1903 builds: 18362.300, .329, .356, .387/8/9 and .418. They've broken eGPUs for TWO MONTHS at a time.

    As for Nvidia GPUs... there's no problem with avoiding them, because there's really nothing you're missing out on other than CUDA, and:

    A. It's a software issue. There's no reason other applications can't adopt OpenCL, Vulkan or, yes, Metal 2 for advanced compute. The reason people push CUDA so hard is that Nvidia paid a bunch of developers to adopt it in the first place; Nvidia, for those who don't know, is pretty sleazy. Metal 2 competes favorably with DX12 in graphics API performance and isn't far behind CUDA in compute performance. The reason Metal 2 isn't more prevalent in applications is that Apple hasn't paid everyone under the table, and Nvidia is as dirty as it gets.

    B. CUDA is dead in the water. Even with CUDA and dedicated hardware, AMD cards, including Vega, Vega 2 and Arcturus, are far, far better at FP and compute. How much better? I use my Radeon VII eGPU setup for several math/science test suites and I get benched performance in line with a 2080 Ti that costs 100% more. Nvidia really makes gaming GPUs and not much more. If you stripped CUDA support out of some apps (specifically some video apps), you'd quickly see that in brute compute through OpenCL or Metal 2, AMD would bury Nvidia pretty much everywhere except gaming, where it's at parity or a slight disadvantage. And AMD cards aren't just better rounded, they're cheaper to boot. While a Radeon VII costs about the same as a 2080 and is only about 5% behind a 2080 Super in games, it's up to 50% faster *on average* in compute, FP and mining. You know, work/money stuff. All for less than a 2080.

    This isn't Apple's fault. Nvidia refused to support Apple's eGPU push without forcing its CUDA platform along with it, and Apple told them to get drenched. This happened AFTER the fiasco where Nvidia shipped hundreds of THOUSANDS of defective 8600M GT mobile GPUs to Apple that kept failing and cost Apple millions in replacement costs. If anyone's to blame for Nvidia not appearing in a Mac, look to Nvidia and stop blaming Apple.
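    For anyone who thinks GPU compute has to mean CUDA, here's a rough sketch of what a vendor-agnostic compute dispatch looks like through Metal in Swift. This is only an illustration I threw together (the kernel and names are made up, not from any shipping app), but it runs unchanged on any Metal-capable GPU, AMD included, with no Nvidia anything required:

    ```swift
    import Metal

    // The kernel is plain Metal Shading Language; nothing in it is vendor-specific.
    let kernelSource = """
    #include <metal_stdlib>
    using namespace metal;

    kernel void doubleValues(device float *data [[buffer(0)]],
                             uint id [[thread_position_in_grid]]) {
        data[id] = data[id] * 2.0f;
    }
    """

    // Whatever Metal-capable GPU is present (AMD, Intel, ...) becomes the device.
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue() else {
        fatalError("No Metal-capable GPU found")
    }

    // Compile the kernel at runtime and build a compute pipeline.
    let library = try! device.makeLibrary(source: kernelSource, options: nil)
    let pipeline = try! device.makeComputePipelineState(
        function: library.makeFunction(name: "doubleValues")!)

    // Put some input data in a CPU/GPU-shared buffer.
    var input: [Float] = (0..<1024).map(Float.init)
    let buffer = device.makeBuffer(bytes: &input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // Dispatch 1024 threads in groups of 64 and wait for the result.
    let commands = queue.makeCommandBuffer()!
    let encoder = commands.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(pipeline)
    encoder.setBuffer(buffer, offset: 0, index: 0)
    encoder.dispatchThreadgroups(MTLSize(width: 16, height: 1, depth: 1),
                                 threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
    encoder.endEncoding()
    commands.commit()
    commands.waitUntilCompleted()

    // Read back; the answer is the same no matter whose GPU ran it.
    let results = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
    print("data[10] =", results[10])   // 20.0
    ```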
  • AMD launches RX 5000-series graphics cards with 7nm Navi GPUs

    KITA said:
    xsmi said:
    The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD."  It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
    Games, Mike.  Don't you know that the ONLY thing GPUs are good for is games?  The things AMD cards are better at (they're compute beasts) don't matter.
    NVIDIA:

    The first software providers debuting acceleration with NVIDIA RTX technology in their 2019 releases include:

    • Adobe Dimension & Substance Designer
    • Autodesk Arnold & VRED
    • Chaos Group V-Ray
    • Dassault Systèmes CATIA Live Rendering & SOLIDWORKS Visualize 2019
    • Daz 3D Daz Studio
    • Enscape Enscape3D
    • Epic Games Unreal Engine 4.22
    • ESI Group IC.IDO 13.0
    • Foundry Modo
    • Isotropix Clarisse 4.0
    • Luxion KeyShot 9
    • OTOY Octane 2019.2
    • Pixar Renderman XPU
    • Redshift Renderer 3.0
    • Siemens NX Ray Traced Studio
    • Unity Technologies Unity (2020)
    And yet Nvidia can't be bothered to release drivers for Mojave.

    So the point stands.  Nvidia doesn't care about Mac users.


  • AMD launches RX 5000-series graphics cards with 7nm Navi GPUs

    There is a second myth that AMD is doing terribly in performance.

    Some important tidbits.
    A. Depending on the metric or tool, Nvidia may drastically outperform AMD.  However, you could just as easily pick a tool like Luxmark, where a Radeon VII on Catalyst 19.4.1 obliterates a 2080 Ti that costs twice as much.

    The truth is, on average, for most tasks, a Vega 56 with the newest drivers and a mild undervolt/overclock will match a reference 1080.  A Vega 64 with NO modifications is about 5% faster than a 1080, on average.  A Radeon VII is about 5-7% slower than a 2080, for about 50-100 USD less.  However, the VII has only had 2 full driver revisions, versus Turing's 7.  It's obvious that AMD has a lot of room to grow, especially for a GPU with 60 CUs, 64 ROPs, an absolutely ridiculous 1 TB/s of bandwidth and 16 GB of HBM2.

    People need to stop pushing the propaganda that Nvidia outperforms AMD on GPUs across the board.  The fact is, it's a lie.  Nvidia cards may, in games, be 5-10% faster on average, and they may sip less power and run slightly quieter (10 dB), but they also cost about 5-10% more on average.  And the fact that, even today, GCN is better at compute, thanks to generalized CUs versus purpose-built Turing cores, means that Vega and even Polaris cards will eventually outperform comparable Nvidia cards with drivers and time.

    We have seen this over and over again.  The 6970? Started slower than the 780; now it practically trades blows with the 780 Ti.  The 7970? The Radeon Fury X?
    The Radeon 580? 5% slower than the 1060 at launch; now almost 10% faster in most games.
    The Radeon 570? Now faster than a 1650, despite the 1650's "newer architecture", and definitively faster than a 1060 3GB despite costing 25% *less at launch*.

    The Vega 64 started 5-10% slower than a 1080 and now, without even an undervolt, it's about 5% faster on average in most titles.  And it absolutely destroys the 1080 in 80% of compute tasks, even with the 1080 getting the benefit of CUDA versus OpenCL.

    B.  For production, development and, yes, even gaming, sinking 800 dollars into a card with 8 GB of GDDR6 is just dumb.  We've had this stupid circle jerk about graphics memory for YEARS now.  The result is that, while it usually doesn't matter the year a card comes out, it always, invariably, matters eventually.  Cards with 2-3 GB of RAM are useless today, even if their core architecture is comparable to midrange cards.  Their actual performance most certainly is *not*.

    Pointing to one synthetic benchmark is equally dumb.  There are already AMD fanboy YouTube channels getting 10-15% additional performance out of their V7s, and that's with only second-generation drivers.  Navi this year is going to drop a Vega 48-type card with better bandwidth and thermals at roughly 1.25x the IPC.  That puts it squarely in Vega 64 territory.  A card like that would be only 5% slower than a 2070.  If they can ship it at 350-400 USD, Nvidia is dead in the water.

    And yes, I'd take a Radeon V7 over a 2080 any day of the year.  I know...I have one.
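    If you want to sanity-check the value argument, here's the same back-of-the-envelope math in code form. To be clear, the performance and price numbers below are just the ballpark claims from this post (a VII roughly 5-7% behind a 2080 for 50-100 USD less, with assumed launch prices), not benchmark data:

    ```swift
    import Foundation

    // Rough value comparison using the post's own ballpark figures.
    struct Card {
        let name: String
        let relativePerf: Double   // RTX 2080 = 1.00 in games, per the claims above
        let priceUSD: Double       // assumed launch prices, for illustration only
    }

    let cards = [
        Card(name: "RTX 2080",   relativePerf: 1.00, priceUSD: 799),
        Card(name: "Radeon VII", relativePerf: 0.94, priceUSD: 699),
    ]

    for card in cards {
        // Relative gaming performance delivered per 100 dollars spent.
        let value = card.relativePerf / card.priceUSD * 100
        print("\(card.name): \(String(format: "%.3f", value)) perf per $100")
    }
    ```

    Even giving the 2080 its gaming edge, the perf-per-dollar comes out in the VII's favor on those numbers, and that's before you count the compute and FP work the card actually gets bought for.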

  • AMD launches RX 5000-series graphics cards with 7nm Navi GPUs

    kruegdude said:
    It would be interesting to really know what's keeping the NVIDIA drivers from being approved. I've read all the linked articles, and the articles linked from those, and it's a lot of inference based on engineering sources who say they don't know who at the upper levels of Apple management decided to exclude the drivers, or why, plus some historical information about the problems Apple has had with its use of NVIDIA chips. There's also a vague statement from NVIDIA saying that Apple is not approving their drivers, but that doesn't say anything.
    This is a myth.  If you frequent Nvidia's developer forums, you'll see that individuals have contacted both Apple and Nvidia about the drivers.  It turns out Apple hasn't denied any drivers; what's happened is that Nvidia hasn't submitted any.  Only a small number of people use eGPUs and are therefore running Nvidia cards with their Apple computers.  Even fewer are using hackintoshed systems.  The truth is that Nvidia doesn't give a damn about those users.

    There is also an inherent animosity between Apple and Nvidia stemming from Jobs' decision to move away from Nvidia after the 8-series mobile GPU fiasco in the MacBook Pros.  Secondly, Apple has pretty much stated its intention to completely deprecate OpenCL and to avoid other compute technologies like CUDA.  It's also ditching OpenGL entirely, and it was sluggish to provide APIs that could let Metal interoperate with AMD's Mantle (not that it matters now, since Mantle is dead according to AMD itself).

    What this all means is that for pure compute, as well as for graphics APIs, Apple is "eating its own dog food".  It has gone whole hog into its own proprietary compute and Metal 2 graphics implementations, leaving CUDA out in the cold.  That is obviously a burr in Nvidia's side, worsening an already poisoned relationship.

    However, the point stands.  If you want to blame someone for the lack of Nvidia drivers, blame Nvidia.  They haven't submitted any to Apple that support Turing.  That Apple isn't the one blocking things is easy to see: Kepler and Pascal cards were supported under High Sierra, but after the move to Mojave, Nvidia never shipped updated web drivers, and a bunch of cards stopped working properly and started crashing in productivity apps.
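    You can see how Metal-centric the whole stack is from any Mac app: as far as macOS software is concerned, the only GPUs that exist are the ones enumerated through Metal. Here's a tiny Swift sketch of that (just an illustration, assuming macOS 10.13 or later):

    ```swift
    import Metal

    // Every GPU a Mac app can use shows up here, and only if its vendor ships a
    // Metal driver that's part of the OS. On Mojave that means AMD and Intel.
    for device in MTLCopyAllDevices() {
        let kind = device.isRemovable ? "eGPU" : (device.isLowPower ? "integrated" : "discrete")
        print("\(device.name) [\(kind)], headless: \(device.isHeadless)")
    }
    ```

    Run that on a Mojave box with a Turing card in an eGPU enclosure and the card simply never appears in the list, because there's no Metal driver for it to appear with.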


  • Apple's new 2019 iMacs deliver twice the speed as previous model, Vega graphics option

    madan said:
    ElCapitan said:
    This is what an incremental update looks like!  :)

    Hopefully Apple has learned a lesson or two and will keep these incremental updates coming at regular, reasonably predictable intervals. Meanwhile, there is a lot to like here for many existing and new users.

    No, there isn't.  This iMac is a joke.  And this is coming from a huge Apple fan who has not one but TWO 27" iMacs in his house.


    The new model is practically identical in performance to the model that shipped TWO years ago.


    I specced a machine because I was looking for a new computer, and for 3,000 dollars I could be saddled with a Core i5, 8 GB of RAM and a Vega 48 that trades blows with a 150 dollar RX 590.

    Only a blithering moron thinks that's good hardware for the money.


    Jesus Christ, what a letdown.  What happened to the new 3800 series?  Why not move the Vega 64, or hell, if they're being cheap, the 56, down to the iMac space and simply use the Vega 7 for the Pro?


    No, instead we're stuck with this nonsense: a fiercely overpriced toy machine.  I have no idea what I'm going to do now.
    Yes, if you only look at the hardware and forget about the Apple software and ecosystem that comes with it -- along with the years of support.

    Sure.  The software is worth something.  It's NOT worth a 100% markup.  That monitor is worth 1,300 dollars.  The rest of the computer, even with the Vega 48 and 16 GB of RAM, is hardly worth 700 dollars.  Paying a few hundred more for Apple build quality and software? Sure.  Paying A THOUSAND more over a 2,000 dollar machine, for WORSE performance?

    No, that's dumb.