Apple's newly updated Mac desktops feature only ATI graphics


Comments

  • Reply 21 of 56
irnchriz Posts: 1,564 member
Well, the new iMacs should last longer than the 24" models with the nVidia 8800 chipset. I personally know of six of these that all packed in with the exact same issue: the nVidia graphics card blew up. It's the same issue they had with the 8600 chips in the MacBooks, but nVidia didn't accept responsibility for the shoddy manufacturing of their cards in the iMac (at least not yet).



I would be hard pressed to take any Mac with an nVidia chipset after their recent performance.
  • Reply 22 of 56
    Quote:
    Originally Posted by toddr View Post


    Apple has been pushing their top end models for scientific computing. Those of us in that community have been waiting for the update thinking that it would support the double precision capability that comes with Fermi.



    Is it possible to put the Fermi GPU into the latest MacPro and access it through OpenCL (or CUDA)?



    Doesn't the HD5870 also have native 64-bit floating point capability?



    Quote:
    Originally Posted by zonk3r View Post


    Nothing wrong with ATI gfx in a pro machine.



    The only problem ATI has had is with their drivers which never seem to reflect the quality of their hardware. Sadly this is true of both their own Windows drivers and the Apple supplied drivers for OS X.



I think going with ATI is a smart choice from a driver standpoint. The recent track record of nVidia's OS X drivers hasn't been good: Bioshock had performance issues on 10.6.0 and 10.6.1 until they were fixed in 10.6.2, and Valve and Blizzard have recommended that nVidia GPU users stay away from 10.6.4 due to driver issues. ATI also seems to be putting in the effort to make sure their GPUs get a full lifespan in OS X: the X1000 series is still in good shape and supported in Valve's titles and Starcraft II, whereas nVidia's equivalent 7000 series is not supported, due to incomplete and buggy drivers.
  • Reply 23 of 56
    Quote:
    Originally Posted by desides View Post


    They appear to be the full-fledged desktop-class GPUs.



I will wager any amount of money that they are Mobility parts, for a variety of reasons, not the least of which is that CNET's review identifies the low-end 21" model that way.

When AI verifies that, it should update this article.



The obvious choice for the bottom model would be something from the current 5xxx generation, below the 5670 in the model above. The desktop 5570 is only slightly less powerful than the 4670 (~6%), but it has DirectX 11 threading, tessellation, etc., not to mention OpenGL 4 (should Apple get around to supporting it... in whatever decade).

    Why the choice of the 4670? ATI doesn't make a Mobility Radeon HD5570.
  • Reply 24 of 56
mdriftmeyer Posts: 7,142 member
    I'd imagine the legal issues with RAMBUS played a part in their decision, not to mention the strides AMD has made with their lineup of GPGPU cards.
  • Reply 25 of 56
mdriftmeyer Posts: 7,142 member
    Quote:
    Originally Posted by ltcommander.data View Post


    Doesn't the HD5870 also have native 64-bit floating point capability?





I think going with ATI is a smart choice from a driver standpoint. The recent track record of nVidia's OS X drivers hasn't been good: Bioshock had performance issues on 10.6.0 and 10.6.1 until they were fixed in 10.6.2, and Valve and Blizzard have recommended that nVidia GPU users stay away from 10.6.4 due to driver issues. ATI also seems to be putting in the effort to make sure their GPUs get a full lifespan in OS X: the X1000 series is still in good shape and supported in Valve's titles and Starcraft II, whereas nVidia's equivalent 7000 series is not supported, due to incomplete and buggy drivers.



    Yes, the HD5870 supports the entire OpenGL 4.1 specification and the OpenCL 1.1 specification.
  • Reply 26 of 56
sheff Posts: 1,407 member
I guess Apple got pissed about Intel forcing their own graphics down their throat and decided to retaliate. nVidia is just collateral damage, since AMD probably asked for ATI to be used if future CPUs are to be exclusive/priority for Apple.



I think it would be a good move, since Intel already has big customers like Dell and HP, while AMD is always second best and could use a major player rocking their silicon. Plus it could be a boon for AMD innovation, as Apple will push them to come up with better chips for their computers.



One problem spot is that AMD is entering the mobile space as well, and here Apple chose to go with the A4, which means AMD is SOL, but it's not gonna be as big a deal to them as it was to Intel, which had quite a special relationship with Apple for a while.



I would be happy to see AMD inside the Macs.
  • Reply 27 of 56
mdriftmeyer Posts: 7,142 member
    Quote:
    Originally Posted by sheff View Post


I guess Apple got pissed about Intel forcing their own graphics down their throat and decided to retaliate. nVidia is just collateral damage, since AMD probably asked for ATI to be used if future CPUs are to be exclusive/priority for Apple.



I think it would be a good move, since Intel already has big customers like Dell and HP, while AMD is always second best and could use a major player rocking their silicon. Plus it could be a boon for AMD innovation, as Apple will push them to come up with better chips for their computers.



One problem spot is that AMD is entering the mobile space as well, and here Apple chose to go with the A4, which means AMD is SOL, but it's not gonna be as big a deal to them as it was to Intel, which had quite a special relationship with Apple for a while.



I would be happy to see AMD inside the Macs.



I think the Bulldozer-based CPUs would be the first AMD architecture that Apple seriously tests with Mac OS X/OS X Server.
  • Reply 28 of 56
guinness Posts: 473 member
    Quote:
    Originally Posted by vinney57 View Post


Two factors: Apple is getting pissed with Intel, and nVidia is dragging their heels on OpenCL.



    It's the only sensible choice (currently) if Apple wishes/gets around to adding OpenGL 4.x support.



The Nvidia GTX 480/470 (GF100) were way, way too power hungry, and the 460 (GF104) just shipped. AMD had a 6+ month lead for this generation of GPUs, and the only thing Nvidia really has them beat in is tessellation, but not many games/apps are currently taking advantage of it.
  • Reply 29 of 56
programmer Posts: 3,409 member
    • ATI drivers used to be very problematic... a long time ago. Now they have their ups and downs just like nVidia does.

    • This year's ATI hardware is quite good on the desktop, and generally better than nVidia's if you include power, heat, cost, etc.

    • These ATI GPUs fully support OpenCL and in some ways do it better than nVidia's. CUDA is entirely nVidia-specific and should go away over time as OpenCL reaches full maturity.

    • Adopting ATI doesn't bear any relationship to adopting AMD x86 processors.

    • These ATI parts are real desktop parts.

    • Apple maintains relationships with both GPU makers because they tend to jockey back and forth and every year it is a new equation. Who knows, perhaps next year ATI will be in mobile with Fusion and nVidia will be king of the discrete desktop again.

    • Some people will always invent something to gripe about.

  • Reply 30 of 56
oxygenhose Posts: 236 member
I doubt Apple is pissed at Nvidia, but the latter probably didn't have the product Apple wanted.



Almost every time the Mac Pros are updated, Apple adds more graphics card options within 6 months.



The lot of you need to put down the bongs and stop taking X-Files reruns so seriously. Sheesh!
  • Reply 31 of 56
mj web Posts: 918 member
So Intel can buy Nvidia at a discount.
  • Reply 32 of 56
So Adobe comes out with an app that uses the CUDA-based Nvidia cards... this is something ONLY the Mac Pro can do... yet Apple in their infinite wisdom releases new Mac Pros with only unsupported ATI cards. Apple, you're idiots and racist against Adobe, that's what I say, and progress



    No BluRay.



Hey, I love Apple, but stupid is stupid, and Apple is being idiotic. Great way to kill a product line: price it out of reach of most users and alienate the main users of the product. This is what Apple really wants. Listen to/watch the Steve TED discussion and draw your own conclusions.





    SP
  • Reply 33 of 56
programmer Posts: 3,409 member
    Quote:
    Originally Posted by Slackpacker View Post


So Adobe comes out with an app that uses the CUDA-based Nvidia cards... this is something ONLY the Mac Pro can do... yet Apple in their infinite wisdom releases new Mac Pros with only unsupported ATI cards. Apple, you're idiots and racist against Adobe, that's what I say, and progress



Adobe should be using OpenCL instead. It is very similar to CUDA, has Apple's (and AMD's, nVidia's, Intel's) official support, and would allow their app to run on all machines, not just a minority. Don't lay this at Apple's feet -- the plan has been clear for over a year, and Adobe chose not to pay heed.
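To make the "very similar to CUDA" point concrete, here is a purely illustrative side-by-side of the same trivial vector-add kernel in both APIs. This is a sketch only: the host-side setup (context, queue, memory transfers) is omitted, so neither fragment is a complete runnable program.

```c
/* CUDA version: compiles with nvcc, runs on nVidia GPUs only. */
__global__ void vec_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  /* this thread's index */
    if (i < n) c[i] = a[i] + b[i];
}

/* OpenCL version: compiled at runtime for nVidia, ATI/AMD, or a CPU. */
__kernel void vec_add(__global const float *a, __global const float *b,
                      __global float *c, int n) {
    int i = get_global_id(0);  /* this work-item's index */
    if (i < n) c[i] = a[i] + b[i];
}
```

The kernel logic is nearly line-for-line identical; the practical difference is that OpenCL source is compiled at runtime for whatever device is present, which is exactly the portability argument being made here.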
  • Reply 34 of 56
charlituna Posts: 7,195 member
    Quote:
    Originally Posted by John.B View Post


It just seems strange that there was this huge marketing push with OpenCL and NVIDIA when Snow Leopard was released, and now there are no desktops except the mini with NVIDIA cards.



That may be due to the various licensing issues with NVIDIA that have popped up over the last 6-8 months.



    Quote:
    Originally Posted by Slackpacker View Post


    So Adobe comes out with an app that uses the CUDA based Nvidia Cards ...




It is my understanding that CUDA is something just for NVIDIA cards, making it nonstandard. It makes sense to me that Apple would not support it, or at least not as their primary target. Apple is moving toward open formats and standards as their dominant choice.



    If Adobe chooses not to support the open standards they are the foolish ones.



    Quote:

    No BluRay.




I'm sorry, but at this point, given all that Apple has said and done (and not done), believing that they will ever put Blu-ray in their machines is what's stupid. They have all but said "there will never be Blu-ray in our machines, even if you paid us to do it."



    Quote:

    Hey I love Apple but stupid is stupid and Apple is being idiotic.




Again, I have to disagree. What is stupid is this notion that Apple should listen to the whims and wants of a small minority of users and cater to them, rather than go with the majority of buyers.



    Quote:

    Great way to Kill a product line by pricing it out of reach of most users.



You say that, and yet their products are selling at high rates. Perhaps folks are willing to pay more up front due to a perceived benefit over a longer period than they get with Windows-based machines.
  • Reply 35 of 56
christophb Posts: 1,429 member
    Quote:
    Originally Posted by Slackpacker View Post


Apple, you're idiots and racist against Adobe, that's what I say, and progress



    SP



    Wait, huh? Bigots maybe?
  • Reply 36 of 56
    I get the OpenCL argument, but is it really mature enough for prime time? Remember Adobe's implementation of CUDA was in development for years. I posted about this "open" logic here:



    http://nofilmschool.com/2010/07/appl...-new-mac-pros/



"I can hear Steve Jobs touting Apple's openness by saying Adobe's decision to go with a proprietary nVidia technology is 'closed,' whereas ATI graphics cards run on 'open' standards like OpenCL. Yes, implementing nVidia's CUDA architecture was a proprietary decision on Adobe's part, but let's put Apple's own reasoning to work here. Similar to Apple's argument for keeping Flash off of iOS, Adobe made a decision based on performance: nVidia's CUDA is more mature than OpenCL at this point in time. Regarding Adobe Flash, Steve Jobs himself stated, 'we have routinely asked Adobe to show us Flash performing well on a mobile device, any mobile device, for a few years now. We have never seen it.' So in an effort to improve the performance of their device, Apple went with a Flash-free approach because there wasn't an extant example of Flash performing well. Similarly, in an effort to improve the performance of their editing suite, Adobe went with CUDA because there wasn't an extant example of OpenCL performing well. It's the same argument, but apparently it doesn't go both ways."
  • Reply 37 of 56
jeffdm Posts: 12,946 member
    Quote:
    Originally Posted by Slackpacker View Post


Apple, you're idiots and racist against Adobe, that's what I say, and progress



How is it racist? Do you know what it means for someone to be racist? Don't misuse strong words just because it suits your needs; it only serves to water down their meaning.
  • Reply 38 of 56
docno42 Posts: 3,238 member
    Quote:
    Originally Posted by wizard69 View Post


    I'd add a third factor in that AMD simply hasn't had the production problems NVidia has had.



    Bingo. Frankly I'm surprised Apple has stuck with Nvidia this long.



And Apple's Pro apps have traditionally worked better with ATI anyway. Waiting for that 5870 to become a stand-alone upgrade option for us existing Mac Pro owners!
  • Reply 39 of 56
ssquirrel Posts: 1,196 member
    Quote:
    Originally Posted by Slackpacker View Post


So Adobe comes out with an app that uses the CUDA-based Nvidia cards... this is something ONLY the Mac Pro can do... yet Apple in their infinite wisdom releases new Mac Pros with only unsupported ATI cards. Apple, you're idiots and racist against Adobe, that's what I say, and progress





Why exactly is it something that only the Mac Pro can do? Aren't the NVIDIA cards in the laptop line and the prior iMac line CUDA capable?
  • Reply 40 of 56
avidfcp Posts: 381 member
    Quote:
    Originally Posted by Bill P. View Post


    Checking the Apple store, I didn't see the ATI Radeon HD 5870 card offered as an option on either the Pro line or the iMacs. Am I missing something?



    Thanks



I don't want to start a Hackintosh vs. Mac Pro debate, but with the hacks now having scripts that allow a retail install, no hacking required, how much of a hit does it take not using nVidia boards designed to work excellently with Adobe Master Suite? This could turn into a very serious matter. Anyone have nVidia benchmarks? I read it blows FCP out of the water, but I want the facts first.