New ATI Graphics Card... Will be?


Comments

  • Reply 41 of 59
    zapchud Posts: 844member
    Any proof of all this, g3pro?
  • Reply 42 of 59
    g3pro Posts: 669member
    Quote:

    Originally posted by Zapchud

    Any proof of all this, g3pro?





    yeah, check the news stories at anandtech.com, nvnews.net, sharkyforums.com.
  • Reply 43 of 59
    xype Posts: 672member
    Quote:

    Originally posted by g3pro

    Being a serious fanboy? How about being pissed that 'one particular card company' * cough * ATi * cough * actually had the balls to tell game developers not to produce games using any new features because ATi didn't want to be behind in the feature race. That's anti-progress. I'm sure you enjoy having a video card manufacturer advising game developers to retard design, right?



    Uhm, that's the same thing nVidia is doing to ATi with their "The Way It's Meant to Be Played" mumbo-jumbo, telling developers not to use ATi-specific extensions, or to code nVidia-only paths.



    Anyhow, you sound like a fanboy and are getting more upset than is good for your heart. Apple will choose what they deem best and offer an alternative for their customers. At the moment that means ATi for the G5, the big PowerBooks, eMacs and iBooks, and nVidia for the iMacs and the 12" PowerBook. Since they can't afford to offer 10 different cards per model you'll only see 2-3 choices, where one company might get two cards to sell and the other only one.



    And please don't come with your driver-quality babble, since ATi is doing great work on their Mac drivers and people are quite happy with them (they're more bothered by the changes Apple makes to its *GL APIs now and then). I've seen ATi react very well when developers complained, seek help with testing, and do quite a bit of work to make their RADEON family work right and the developers happy. If you're not one of those developers, or don't have contact with any of them, you're not really qualified to rant about it.
  • Reply 44 of 59
    g3pro Posts: 669member
    Quote:

    Originally posted by xype

    Uhm, that's the same thing nVidia is doing to ATi with their "The Way It's Meant to Be Played" mumbo-jumbo, telling developers not to use ATi-specific extensions, or to code nVidia-only paths.





    The Way It's Meant to Be Played is a service nVidia performs for game developers to make sure there are no nVidia-related graphics problems at launch, and that the game runs as well as it can (because nVidia's shader compiler requires optimizations to make sure texels and shaders are computed when they should be). I have no idea what this "ATi-specific extensions" thing you're talking about is. Are you sure you know what you're talking about?





    Too bad Apple doesn't give much of a choice for OEMs.
  • Reply 45 of 59
    airsluf Posts: 1,861member
    Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that, but they go ahead and shut down my posting privileges, but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. The bullshit flag is flying; I won't abide by lying and the coddling of liars who go off-site, create accounts differing by a single letter from my handle with the express purpose to deceive, and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.



    Now I guess they should have banned me rather than just shutting off posting privileges, because kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.



    Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
  • Reply 46 of 59
    programmer Posts: 3,467member
    Quote:

    Originally posted by g3pro

    The Way It's Meant to Be Played is a service nVidia performs for game developers to make sure there are no nVidia-related graphics problems at launch, and that the game runs as well as it can (because nVidia's shader compiler requires optimizations to make sure texels and shaders are computed when they should be). I have no idea what this "ATi-specific extensions" thing you're talking about is. Are you sure you know what you're talking about?



    Yes, he does. I've heard nVidia say these things myself. ATI has just as many ATI-specific extensions as nVidia does, and with their 9700-9800 they supported a few common capabilities far better than nVidia did at the time. Both companies have their own agendas and aren't to be trusted.
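    For readers wondering what "vendor-specific extensions" means in practice: an OpenGL application queries the driver's extension string and picks a render path accordingly. A minimal sketch in C; the extension names are real OpenGL extensions of the era, but the path-selection logic and the hard-coded sample string are illustrative (a real program would get the string from glGetString(GL_EXTENSIONS)):

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Exact-token match against a space-separated OpenGL extension string.
       A plain strstr() is not enough: "GL_ATI_fragment_shader" would also
       match inside a longer extension name. */
    static int has_extension(const char *ext_list, const char *name)
    {
        size_t len = strlen(name);
        const char *p = ext_list;
        while ((p = strstr(p, name)) != NULL) {
            int starts = (p == ext_list || p[-1] == ' ');
            int ends   = (p[len] == ' ' || p[len] == '\0');
            if (starts && ends)
                return 1;
            p += len;
        }
        return 0;
    }

    int main(void)
    {
        /* Illustrative stand-in for glGetString(GL_EXTENSIONS). */
        const char *exts = "GL_ARB_multitexture GL_ATI_fragment_shader "
                           "GL_ARB_fragment_program";

        if (has_extension(exts, "GL_NV_fragment_program"))
            printf("using nVidia path\n");
        else if (has_extension(exts, "GL_ATI_fragment_shader"))
            printf("using ATi path\n");
        else
            printf("using generic ARB path\n");
        return 0;
    }
    ```

    This is exactly why both vendors lobby developers: a game that only writes the GL_NV_* path falls back to a slower generic path on ATi hardware, and vice versa.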
  • Reply 47 of 59
    job Posts: 420member
    I guess this is a good place to ask, so:



    What the hell is this??



    http://eshop.macsales.com/Catalog_It...m=ATI100433022



    A Radeon 9000 with 128MB VRAM?
  • Reply 48 of 59
    xype Posts: 672member
    Quote:

    Originally posted by job

    A Radeon 9000 with 128MB VRAM?



    Must be a typo; I think the next low-end card would more likely be a 9600 or something similar.
  • Reply 49 of 59
    oldmacfan
    Quote:

    Originally posted by xype

    Must be a typo; I think the next low-end card would more likely be a 9600 or something similar.



    I just thought it was a modified version for Macs only.
  • Reply 50 of 59
    xype Posts: 672member
    Quote:

    Originally posted by oldmacfan

    I just thought it was a modified version for Macs only.



    Could be, too. But since the PC world is getting their X800 soon I think selling a 9000 for the Mac folk might look a tad stupid (especially at $159.99).
  • Reply 51 of 59
    job Posts: 420member
    Quote:

    Originally posted by xype

    Could be, too. But since the PC world is getting their X800 soon I think selling a 9000 for the Mac folk might look a tad stupid (especially at $159.99).



    Hopefully it's a typo and they actually meant a 9600 with 128 MB VRAM.
  • Reply 52 of 59
    zapchud Posts: 844member
    Now that the cards seem to be out, testing has been conducted, and it's clear that the X800 XT and the 6800 are pretty neck and neck performance-wise, while the X800 requires only one card slot and significantly less power, but lacks the PS 3.0 support of the 6800.



    I'd take the ATi card. :-)
  • Reply 53 of 59
    mattyj Posts: 898member
    Ditto, I'd take the Ati card.
  • Reply 54 of 59
    powerdoc Posts: 8,123member
    I have read a review of the X800 XT and Pro. The X800 XT is the fastest card on the market for today's games. The nVidia drivers certainly have more room for improvement than the ATI ones, but ATI has a small edge with today's games.

    In some games the limiting factor is the CPU anyway (Unreal Tournament 2003, for example).
  • Reply 55 of 59
    rhumgod Posts: 1,289member
    Unless you are a twelve-year-old sitting in a room playing games all day long, I really don't think you're ever going to see a great difference between bleeding-edge cards anyway.



    (Even if you were a twelve-year-old, you probably still wouldn't see a difference!)
  • Reply 56 of 59
    lemon bon bon Posts: 2,383member
    x800 for me.



    Cooler. Smaller. Doesn't need its own power station to run.



    Ati seems to have gone smaller and cooler; Nvidia bigger, hotter and more power-hungry.



    What's amazing is that ATI didn't have to crank up the R300 core that much to get a card that humbles the hype-machine GeForce 6. (Everybody was impressed that it offered x% more juice than the previous Nvidia, but the ATI is offering a consistent 100% over the previous best ATI card in gaming benches. Even the ATI 'Pro' card seems to humble the Nvidia card in some benches...)



    How long will it be before we have a gig of ram on graphics cards..?



    Looking at some of the graphics demos on Ati/Nvidia cards... could it be only another couple of years before we don't bother with pre-rendered stuff?



    The ati card is apparently pushing 800 million polygons per second?







    Will Ati break the billion barrier first?
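    For a sense of scale, the quoted 800 million polygons per second is a peak marketing figure (sustained rates in real games are far lower), but even as a theoretical ceiling it implies a huge per-frame budget. A quick back-of-the-envelope check, assuming that 800M/s number:

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* Peak polygon throughput quoted in this thread for the X800. */
        const double polys_per_sec = 800e6;

        /* Theoretical per-frame polygon budget at common frame rates. */
        const int fps[] = { 30, 60, 85 };
        for (int i = 0; i < 3; i++)
            printf("%3d fps -> %.1f million polygons/frame\n",
                   fps[i], polys_per_sec / fps[i] / 1e6);
        return 0;
    }
    ```

    At 60 fps that's roughly 13 million polygons per frame, orders of magnitude beyond what any 2004 game actually pushes once textures, shaders and game logic enter the picture.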



    If Apple releases a dual 2.5 GHz G5 with one of these babies? That would be a hot machine. Maybe this is what Apple have been waiting for regarding the Rev B PowerMac launch?



    Heh, the years of being stuck with 'face-hugging' Ati Rage 128 cards are truly behind us. Steve Jobs and Nvidia certainly lit a torch under Ati's ass. No more third-class-citizen treatment from Ati, eh?



    I think the Ati is the more impressive card. And by the time game developers start taking advantage of the shader 3 stuff, I'm sure Ati will be pouring on the agony with the R500 (which is shaping up to be a truly amazing card if this is what Ati can do with the R400).



    Come on IBM/Apple. 9xx 3 gigger so I can get on with handing out my cash to you. I've important stuff to do...playing Doom 3 and Half Life 2 (if we get a copy...) ahem...I mean...video work...ah...an...Photoshopping...and serious 3d....



    Blush.



    Lemon Bon Bon
  • Reply 57 of 59
    lemon bon bon Posts: 2,383member
    Another thing: what about DirectX 10?



    Will OpenGL 2.0 match it? When is OpenGL 2.0 due to arrive?



    Will it have its own shader 3-style gubbins? (A lot of fuss has been made about Nvidia supporting this sort of stuff in M$'s gaming APIs.)



    Lemon Bon Bon
  • Reply 58 of 59
    pb Posts: 4,255member
    Quote:

    Originally posted by Lemon Bon Bon

    Another thing: what about DirectX 10?



    Will OpenGL 2.0 match it? When is OpenGL 2.0 due to arrive?







    There was initially speculation that Panther might include OpenGL 2.0, but this soon proved wrong. And I don't know when it will actually arrive.



    Now, even if OpenGL 2.0 matches DirectX 10, the benefit would not be that great for Mac games that are ported from DirectX to OpenGL; the porting overhead will still be there. Pure OpenGL games may be another story.
  • Reply 59 of 59
    telomar Posts: 1,804member
    There's really no noticeable improvement in image quality with games that use Shader Model 3.0 yet. The benefit comes mainly from the extended programmability, which really no game utilises fully yet, not even those that would claim otherwise.



    There is also nothing stopping the hardware behind the SM 3.0 APIs from being used with OpenGL. A lot of what it means to be SM 3.0 compliant is a statement that the hardware operates at a certain precision and meets certain specifications.