madan

About

Banned
Username: madan
Joined:
Visits: 29
Last Active:
Roles: member
Points: 309
Badges: 0
Posts: 103
  • AMD launches RX 5000-series graphics cards with 7nm Navi GPUs

    kruegdude said:
    It would be interesting to really know what’s keeping the NVIDIA drivers from being approved. I’ve read all the linked articles and the articles linked from those articles and it’s a lot of inference based on information from sources in engineering that say they don’t know who or why the decision to exclude the drivers has been made in upper levels of Apple management plus some historical information about the problems Apple has had with their use of NVIDIA chips.  There’s also a vague statement from NVIDIA saying that Apple is not approving their drivers but that doesn’t say anything. 
    This is a myth.  If you frequent Nvidia's developer forums, you'll see that individuals have contacted both Apple and Nvidia about the drivers.  It turns out that Apple hasn't denied any drivers; Nvidia simply hasn't submitted them.  Only a small number of people use eGPUs and, therefore, Nvidia cards with their Apple computers.  Even fewer use hackintoshed systems.  The truth is that Nvidia doesn't give a damn about those users.

    There is also an inherent animosity between Apple and Nvidia stemming from Jobs' decision to move away from Nvidia after the 8-series mobility part fiasco in the MacBook Pros.  Secondly, Apple has pretty much stated its intention to completely deprecate OpenCL and avoid other compute technologies like CUDA.  They're also ditching OpenGL completely and were sluggish to provide APIs that could let Metal interoperate with AMD's Mantle (not that it matters now, since Mantle is dead according to AMD themselves).

    What this all means is that for pure compute, as well as for graphics performance APIs, Apple is "eating its own dogfood."  They've gone whole hog into their own proprietary compute and Metal 2 graphics implementations, leaving CUDA out in the cold.  This is obviously a burr in Nvidia's side, worsening an already poisoned relationship.

    However, the point stands.  If you want to blame someone for the lack of Nvidia drivers, blame Nvidia.  They haven't submitted any to Apple that can support Turing.  This is eminently provable: Kepler and Pascal cards were supported under High Sierra, but after the move to Mojave, a bunch of cards stopped working properly and started crashing in productivity apps.


    fastasleep, watto_cobra
  • Hands-on with Apple's new Core i9 iMac 5K with Vega graphics: benchmarks and first impress...

    That doesn't refute anything I said.  Literally.

    The Vega 48 is still a problem.
    williamlondon
  • Hands-on with Apple's new Core i9 iMac 5K with Vega graphics: benchmarks and first impress...

    neilm said:
    This machine has so far impressed us -- especially when it comes to comparing against the iMac Pro. If you are in the new market, and don't need the upgradability of the iMac Pro, this is a powerful machine that gets you near the same performance at a much lower cost.

    "Upgradability of the iMac Pro"? What upgradability would that be?

    You can't even add RAM to the iMac Pro, at least not without almost complete disassembly and throwing your warranty out of the window in the process. 
    I believe Andrew was referring to the socketed processor in the iMac Pro, whereas the iMac 5K does not have the same. I'll talk to him about it and see.
    Neilm, as Mike said, I was referring to what you could upgrade the iMac Pro to in Apple's configurator while purchasing -- not yourself. Such as an 18-core processor, 256GB of RAM, or the Pro Vega 64X graphics with 16GB of VRAM. Apple won't let you add anywhere near this amount of power to the standard iMac 5K. So when you compare the base model to this spec'd-out model, they are quite close. If you are looking for serious performance, you still need to go iMac Pro and pick up those upgrades.
    I love how everyone keeps ignoring the elephant in the room.

    Everyone keeps benching and speccing the Core i9.  No one is doubting that the Core i9 is impressive.  The problem is you're spending 3000 dollars on a computer that has a Vega 48, and those graphics benchmarks are a joke.  Literally 5-10% better than an RX 580 (a 150 dollar card).  It's an embarrassment to spend that much money on a system with LESS THAN half the performance of a 1080 or 2070.  The machine has literally ZERO future-proofing the day you buy it.

    It's like a dude who hits the gym and works out his arms, chest, and back but can only squat 125.  And eGPUs are not a real solution, because for 800 dollars on top of an already astronomically expensive 3000 dollar Mac, you're getting 1070-class performance, which would only be about 20% faster than what's in the iMac.  I suppose you could pony up for a Radeon VII in an eGPU, but then you're spending well over 1000 dollars for 1070 Ti/1080 performance and your computer is running well over 4000.  Why not just go for an iMac Pro at that point and bury that system?

    The Vega 48 is the problem.  The Vega 48 is the problem.

    When will people stop disingenuously posting CPU benchmarks?  A computer is only as fast as its slowest processing unit, and the iMac's GPU is a joke for the price they're charging.  Literally... a JOKE.




    williamlondon
  • Editorial: Apple is making us wait for a new iMac for no good reason

    Dave Kap said:
    For no good reason? How do you determine what is a good reason?
    This article is a hatchet-job on Apple for no good reason.

    Apple is waiting because iMacs don't use mobile CPUs, and Intel's desktop parts are between cycles right now.  It makes no sense to produce a new iMac with identical specs and just a small generational speed bump in Intel CPUs.  The 2013, 2015, and 2017 models all have desktop-class CPUs, not mobile ones.  That's a prime reason Apple is waiting.

    They may also be ironing out the technologies and economies of scale for the new screen.  Screens that rich and dense are extremely difficult to produce and to roll into an AiO price point.

    The biggest reason Apple is waiting is that the 2017 iMacs use AMD GPUs: the 570 and 580.  What would they upgrade to two years later?  Apple doesn't expect users to be clueless and purchase brand-new computers based on a 7xxx-to-8xxx Intel speed bump.  They also expect a noticeable, significant increase in graphics horsepower.  What AMD GPU would they sneak into the iMac to make that happen?

    Surely they wouldn't just put another 580 in there, would they?  You're going to pay 2000 dollars for a computer that gets nuked by an RTX 2060 or a 1660 Ti?  A 2000 dollar computer with an almost three-year-old midrange graphics card?  Only an abject fool does that.  And Apple doesn't use Nvidia, so that eliminates the 1070, 1070 Ti, 1080, 1660 Ti, 2060, 2070, and 2080.

    AMD only has FOUR graphics cards faster than the 580: the 590, which is only about 8% faster with the newest drivers (and has 20% more TDP, no less); the Vega 56 (iMac Pro); the Vega 64 (iMac Pro); and the Radeon VII, which is faster than any of the aforementioned and is an 800 dollar card.

    There simply isn't a GPU available for Apple to use in its TWO-year upgrade of the iMac, and using a THREE-year-old MIDRANGE GPU in a premium 2000 dollar computer is absolutely unacceptable.

    That leaves us -- and Apple -- waiting for AMD and the rumored Navi-based 3080, which reportedly uses about the same wattage as a 580 but offers 1070 Ti/1080 performance.

    That is what we're waiting for.


    Dave Kap, tmay, stompy, applesnoranges, elijahg, Pylons, watto_cobra
  • If you think Tim Cook is 'robbing' you, then so was Steve Jobs

    To pick a nit:

    Consistent gross margins don't tell me anything about changes to the affordability of products. One doesn't need to be a financial analyst to figure out that the price of a 15" MacBook Pro is substantially higher, even after inflation, than it was five years ago. If the reason for that isn't growing margins, then obviously costs have also increased. Maybe Apple has a problem with cost control and/or spending decisions?
    Or maybe the ratio of costs to selling prices has remained steady for cutting-edge technologies.  If it were milk or toasters we were talking about, then I could understand your implication that input costs should not be increasing and perhaps should even be going down.  But Apple is doing the same thing today as it was 10, 15, 20 years ago: developing new products with increasing performance and capabilities.  It seems that will always remain the same percentage of total costs.
    I hear you, except that price increases are accelerating compared to the past. Why are costs so much higher lately than they used to be?

    It may well be that this is just how much it costs to make fancy-pants computers now. I'm neither qualified nor adequately informed to offer an opinion about what Apple should or could do. All I'm saying is the current approach is moving the income level required to be an Apple user even higher. Our middle-class household can no longer afford the products we used to buy on a three-year cycle. Maybe I need to just accept that and walk away. I hope not, though.

    It's worse than that.  Not only are the products quickly escaping lower-middle-class household budgets, but upper-middle-class and even modestly wealthy households are hard pressed to justify the cost.

    Example.

    I'm in the market for a next-gen iMac.  I'm looking at the 2019.  A Core i7 is fine; I'm sure they'll have an 8th- or 9th-gen chip in there by then.  I'm sure they'll have 16GB of DDR4.  The screen is spectacular, and that's OK.  Storage is fine.  But a lot of my work (3D modeling and real-time texture rendering) requires a beefy graphics card, and the RX 580 I have now is good (but not great).  I expect the new iMac to have 1080-class performance two years after the RX 580 iMac.  At least 1080.

    Let's assume that, since Apple refuses to contract with Nvidia, AMD is the only GPU supplier they have (which Soli thinks they probably also co-develop, nyuk nyuk).  The 680 isn't ready yet.  And if it isn't ready by next May-June when the iMac launches, Apple may just shove another 580 into the high-end non-Pro iMac.  That means I'd be looking at the same performance as the 2017 model for 2500-3000 dollars.  Ridiculous.

    My point is, it's not just that the price is eliminating middle-tier families from purchasing Apple products (although that's likely to happen), but also that it's shooing away professionals and prosumers who can get Wintel systems at the same price that, no, may not run macOS, but are literally 100% faster.  We can see that situation plain as day with the new Mini.

    It's not that Apple is simply more expensive than ever before.  It's that they seem to offer less than ever before for those high prices.
    elijahg, muthuk_vanalingam