Horrific GPUs


Comments

  • Reply 41 of 58
    jbljbl Posts: 555member
    For someone who is completely ignorant about graphics cards and system architecture, can someone please tell me what applications or situations would benefit from a 9800? (Or even the 9600?) I thought the idea of Quartz Extreme was to unload much of the Quartz drawing system onto the graphics card. Thus I would think lots of things would benefit from a faster graphics card. From the responses to this thread, however, I take it that is wrong?
  • Reply 42 of 58
    powerdocpowerdoc Posts: 8,123member
    Quote:

    Originally posted by JBL

    For someone who is completely ignorant about graphics cards and system architecture, can someone please tell me what applications or situations would benefit from a 9800? (Or even the 9600?) I thought the idea of Quartz Extreme was to unload much of the Quartz drawing system onto the graphics card. Thus I would think lots of things would benefit from a faster graphics card. From the responses to this thread, however, I take it that is wrong?



QE works on my G4 533 and its lame GeForce 2 MX card: lame because even the much-derided 5200 smokes it.

So I doubt that any of the cards Apple ships is a problem for Quartz Extreme.

Modern cards are so fast in 2D (compared to the first 8 MB video cards) that it hardly matters, even if the 9800 Pro is certainly faster than the 9600 Pro (due to the higher clock speed of the GPU). The RAMDACs of these different cards are nearly the same, and are quite sufficient to drive any screen up to 1600 by 1200 over VGA (more over DVI).
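The RAMDAC point is easy to sanity-check with a little arithmetic. A rough sketch in Python, assuming a ~30% blanking overhead and a 400 MHz RAMDAC; both are ballpark figures for cards of this era, not quoted specs:

```python
# Rough sanity check: pixel clock needed to drive a given analog (VGA)
# mode vs. a typical RAMDAC. The ~30% blanking overhead and the 400 MHz
# RAMDAC figure are assumptions, not specs quoted from Apple or ATI.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.30):
    """Approximate pixel clock in MHz for an analog display mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

clock = pixel_clock_mhz(1600, 1200, 60)
print(f"1600x1200@60Hz needs ~{clock:.0f} MHz pixel clock")  # ~150 MHz
print("fits within a 400 MHz RAMDAC:", clock < 400)
```

Even at 1600x1200 the required pixel clock is well under half of what a typical RAMDAC of this generation can do, which is why the cards hardly differ here.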

In 2D there are also quality differences between video cards. Matrox, for example, produces lame 3D cards in the PC world but state-of-the-art 2D image quality.

ATI is supposed to have better image quality than Nvidia, but Nvidia's latest products have improved. I don't know the relative quality of the 5200; some members here said it sucks. I don't know, and will check PC sites to confirm. Anyway, over DVI the 2D image quality logically should not change much between video cards.



For video playback both ATI and Nvidia have their own technology. They are close, but ATI's seems to be better and smoother according to certain websites that have made comparisons.



The major issue is 3D rendering. That's the only area you should consider when making a choice. Here there is no doubt: take the more powerful video card...



  • Reply 43 of 58
    cindercinder Posts: 381member
    I'm glad the G5s don't ship with super powered video cards.



Because if I want to play games I'll select a better card myself, or get one aftermarket.





Show me a high-end PC that ships with a $400 video card in its standard config.

    Show me.
  • Reply 44 of 58
    klinuxklinux Posts: 453member
  • Reply 45 of 58
    klinuxklinux Posts: 453member
Also, on a related note, a popular platform on the PC side is the nForce2 (soon to be nForce3), which has a GF4 MX-level GPU right on the motherboard. This is a great way to lower cost, especially for people who do not need a dedicated graphics card. People who do, whether for business (Quadro) or gaming (FX or ATI Pro), can choose whatever card suits their purpose.



First, it is unlikely, but I would like to see IBM and Apple come up with a solution like this, as it is a great way to lower cost.



Second, I recall hearing in the presentation that they forced the Xeon to do the graphics processing, and through the jerkiness of the demos it showcased the tremendous power of the G5. That is all well and good, but no one who does graphics work or gaming would ever think of using the CPU instead of a dedicated GPU for the graphical heavy lifting! What is up with that?



    Just a few thoughts...
  • Reply 46 of 58
    maxcom29maxcom29 Posts: 44member
    Quote:

    Originally posted by moazam

    Built it yourself eh?



I take it that it doesn't have that ever-so-important "workstation credential" either?



    -M




Never said it was a "workstation," but by current industry standards it is a high-end PC, which leaves the new G5s as high-end Macs with poorly configured default GPUs.

I will not use Pro/ENGINEER or some CAD package on this machine, where certification would be nice. But it sure is nice to launch Office XP in the blink of an eye, launch Photoshop in a mere 6 seconds, and get pretty awesome performance in games.
  • Reply 47 of 58
    powerdocpowerdoc Posts: 8,123member
The video cards are not so bad (benchmarks).



The video card of the single-processor G5 tower is the GeForce FX 5200 Ultra (325/325); compared to the regular 5200 (250/250), it's not a crippled card like the lame MX series. The card supports DirectX 9 (the GeForce4 MX supported only DirectX 7).

It performs better in many cases than the Radeon 9000 Pro.



The Radeon 9600 Pro is better than the GeForce 5600 Ultra in many benchmarks.



Horrific GPU is not a good term. Not so long ago Macs had terrible GPUs: I remember when the Macs were stuck with a Rage 128 Pro.

There is also a lot of difference between the GeForce 2 MX card of the regular iMac and the "basic" 5200 Ultra card.



The Power Mac G5 ships with low-end and midrange modern video cards, with the option of a top-end card like the Radeon 9800 Pro (only in 128 MB, but 256 MB does not bring any advantage in any benchmark I have ever seen).
  • Reply 48 of 58
    bungebunge Posts: 7,329member
    Quote:

    Originally posted by Powerdoc

The Power Mac G5 ships with low-end and midrange modern video cards, with the option of a top-end card like the Radeon 9800 Pro (only in 128 MB, but 256 MB does not bring any advantage in any benchmark I have ever seen).



Couldn't Quartz Extreme use more VRAM, even if that would never show up on a benchmark?
  • Reply 49 of 58
    powerdocpowerdoc Posts: 8,123member
    Quote:

    Originally posted by bunge

Couldn't Quartz Extreme use more VRAM, even if that would never show up on a benchmark?



When Apple introduced Quartz Extreme, they said it needs a GeForce or Radeon video card or better, with a minimum of 16 MB of VRAM and 32 MB optimal.

I doubt that any 2D software, even with transparencies, requires more than 64 MB.
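Those numbers are easy to sanity-check, since Quartz Extreme keeps each window's backing store as a 32-bit texture in VRAM. A back-of-envelope sketch, where the window sizes and count are invented for illustration rather than anything Apple published:

```python
# Rough estimate of VRAM used by Quartz Extreme window backing stores.
# The window sizes and count below are made-up illustrative numbers;
# QE stores each window as a 32-bit (4 bytes/pixel) texture.
BYTES_PER_PIXEL = 4  # 32-bit ARGB

def window_vram_mb(width, height):
    return width * height * BYTES_PER_PIXEL / 2**20

# e.g. a full 1600x1200 desktop plus a handful of large windows
desktop = window_vram_mb(1600, 1200)     # ~7.3 MB
windows = 8 * window_vram_mb(1024, 768)  # 24 MB
print(f"total ~{desktop + windows:.0f} MB")
```

That lands right around Apple's "32 MB optimal" figure, and it takes an implausible pile of huge windows to get anywhere near 64 MB.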
  • Reply 50 of 58
    kraig911kraig911 Posts: 912member
It's all about FPU in the pro world; graphics cards don't mean sh*t for render times, previews, and photo editing. They just make the windows move more smoothly... What an expensive game machine... go buy an Xbox or something.
  • Reply 51 of 58
    macroninmacronin Posts: 1,174member
    Quote:

    Originally posted by kraig911

It's all about FPU in the pro world; graphics cards don't mean sh*t for render times, previews, and photo editing. They just make the windows move more smoothly... What an expensive game machine... go buy an Xbox or something.



    Wrong!



    With the release of Maya v5.0, there is now the ability to use the GPU (nVidia) as the target for hardware rendering...



    Makes for some REALLY fast 8-pass frame renders...



And it keeps the passes available as separate layers, for further comping later...



    I mean like substantially less than one minute, on a dual 1.25GHz G4 with the Nvidia GeForce 4Ti OpenGL card...



    But this is only with the recent release of Maya v5.0...



    Would LOVE to see what could be done with a QuadroFX2000 in a dual G5 box...!



    ;^p
  • Reply 52 of 58
    Quote:

    Originally posted by Powerdoc

    The powermac G5 is shipped with low end and middle range modern video card, with the possibility to have a top end card like the radeon 9800 pro (only in 128 mb, but the 256 mb does not bring any advantage in any benchmark i have ever seen).



Well, doing the "fast switch" cube effect on a dual Cinema HD system (meaning the GPU has to handle four 1920x1200 textures spinning around) could take quite a chunk of VRAM. Surely not 256 MB, but extra memory can still help with things such as Doom III (the one game whose developers never stopped to ask "will it run on an existing machine").
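The texture budget for that effect is easy to estimate. A quick sketch, assuming 32 bits per pixel; that bit depth is my assumption, since the actual texture format isn't documented:

```python
# Sanity check on the fast-user-switching cube: four full-screen
# 1920x1200 textures at 32 bits per pixel. The 4 bytes/pixel figure
# is an assumption, not a documented spec.
def texture_mb(width, height, bpp_bytes=4):
    return width * height * bpp_bytes / 2**20

cube = 4 * texture_mb(1920, 1200)
print(f"four 1920x1200 textures: ~{cube:.0f} MB")  # ~35 MB
```

So even this worst case fits comfortably in a 128 MB card, which supports the point that 256 MB buys little for desktop effects.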
  • Reply 53 of 58
    lemon bon bonlemon bon bon Posts: 2,383member
I was initially disappointed by the cards. But they ain't bad, and they perform much better than the previous bundles... you could probably get a decent game of Doom 3 going with lowered settings.



Tear through Photoshop. Better than an ATI Rage for Lightwave by a mile!

Any one of these machines' cards is light-years ahead of the ATI Rage in terms of 3D pushing power.



Want the cutting edge? Get the ATI 9800. With 128 megs of VRAM and a bunging 4 gigs of RAM in yer dual-2GHz G5... no complaints! Take on the PC world! 'Cept from PC weenies suffering from penis envy...



Play Unreal Tournament 2003 with ease.



I doubt I'll last much past August before buying the top end.



    It seems Lemon Bon Bon's four year wait is coming to an end...



    Lemon Bon Bon
  • Reply 54 of 58
    kraig911kraig911 Posts: 912member
Mmm, a Quadro FX on a dual G5. I'm sure they are going to do it; I mean, it's stupid if they don't. The AGP bus they put in there supports it too.
  • Reply 55 of 58
    bungebunge Posts: 7,329member
    Quote:

    Originally posted by Lemon Bon Bon

I doubt I'll last much past August before buying the top end.



    Music to my ears!
  • Reply 56 of 58
    qaziiqazii Posts: 305member
    Quote:

    Originally posted by Powerdoc

When Apple introduced Quartz Extreme, they said it needs a GeForce or Radeon video card or better, with a minimum of 16 MB of VRAM and 32 MB optimal.

I doubt that any 2D software, even with transparencies, requires more than 64 MB.




This may be true right now. But will QE or its successors need more RAM? Is Apple going to offload more and more to the graphics card?
  • Reply 57 of 58
    matsumatsu Posts: 6,558member
Gamers, for crissakes: buy consoles, a much better investment. For the kinds of games that make "computer" gaming different, a GF2 MX is enough.



Personally, I think the cards on offer right now are pretty good. IIRC, someone who actually understands this stuff explained that the way the Mac uses OpenGL, you can actually get comparable performance out of the better mainstream cards without going to specialized pro cards, which pretty much use the same chips anyway??? I'm sure that's not accurate the way I phrased it, but maybe someone who knows can explain?
  • Reply 58 of 58
    mmmpiemmmpie Posts: 628member
I'm not worried about the cards. There are three types of users:

a) consumers: they aren't going to be buying G5s, and for the few that do, the 5200 is heaps faster than any other GPU in the consumer machines.

b) professionals: as long as they stick with a digital display they should be happy. Analog output on any of the cards is a bit disappointing.

c) gamers: buy PCs.



It is good that the Radeon 9200 isn't offered. It's not a fully programmable card like the 5200.



I'm sure that a future release of QE, maybe even in Panther, will use the programmability of the GPU to render PDF. That will really speed up the system. I also bet that it will only work on cards that support pixel and vertex shaders 2.0 or higher (no DirectX 8 cards need apply).
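For reference, here is how the cards discussed in this thread line up on programmability. The table is my own summary of the DirectX shader model each chip advertises, not an official compatibility list:

```python
# Illustrative lookup of shader-model support for the cards discussed
# in this thread; the generations listed are my summary of the DirectX
# feature level each chip advertises, not an official table.
SHADER_MODEL = {
    "GeForce4 MX":     1.0,  # DX7-class, essentially fixed-function
    "Radeon 9200":     1.4,  # DX8.1 pixel shaders only
    "GeForce FX 5200": 2.0,  # full DX9 programmability
    "Radeon 9600 Pro": 2.0,
    "Radeon 9800 Pro": 2.0,
}

def could_run_gpu_pdf(card, required=2.0):
    """Hypothetical check: would this card meet a shader 2.0 floor?"""
    return SHADER_MODEL.get(card, 0.0) >= required

print([c for c in SHADER_MODEL if could_run_gpu_pdf(c)])
```

By this rough cut, every card Apple offers in the G5 clears a shader 2.0 bar, while the 9200 and the GeForce4 MX would not.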