Forget nVidia....these are the goods!

Posted in Future Apple Hardware, edited January 2014
Wow, I come back, look at ATI's site, and both the Radeon 9000 Pro and Radeon 9700 (R300) have been announced for Mac! And it looks like it will be in the same time period as the PC versions. The 9000 Pro looks to be available on both platforms pretty soon.



This is awesome. Especially considering how long it took for the Mac to get the Radeon and then Radeon 8500. Good news. Could this be a hint at some graphics options for the new towers?



This thing wipes the floor with the GF4 Ti 4600. Go check out anandtech to see the test results. Looks nice.

Comments

  • Reply 1 of 33
    taboo Posts: 128 member
    [quote]Originally posted by TigerWoods99:

    <strong>Wow, I come back, look at ATI's site, and both the Radeon 9000 Pro and Radeon 9700 (R300) have been announced for Mac! And it looks like it will be in the same time period as the PC versions. The 9000 Pro looks to be available on both platforms pretty soon.



    This is awesome. Especially considering how long it took for the Mac to get the Radeon and then Radeon 8500. Good news. Could this be a hint at some graphics options for the new towers?



    This thing wipes the floor with the GF4 Ti 4600. Go check out anandtech to see the test results. Looks nice.</strong><hr></blockquote>



    Umm... when ATI announced the Radeon card, it was in June/July (I forget which year). The cards, both Mac and PC, were supposed to be available in September. When September rolled around, lo and behold, the PC version was available, but the Mac version was not even available as a pre-order. It wasn't until December that the AGP version became available, and a few months beyond that for the Mac PCI version. Neither one had stable drivers (ATI's big failing in both worlds) for several more months.

    In other words, don't jump for joy quite yet - it's all vaporware 'til it hits the stores.
  • Reply 2 of 33
    cowofwar Posts: 98 member
    It doesn't matter. They're 8x AGP cards, so the current systems wouldn't even be able to support them. Plus, the G4 CPU would hold back performance, since the CPU wouldn't be able to keep up with the GPU.
  • Reply 3 of 33
    penhead Posts: 45 member
    Yup, those Radeons look pretty tasty for those of us who have $3-6k to spend on a graphics card



    The low-end model whips some 4MX ass as well; as far as crippled GPUs go, I guess that's a good thing



    [quote] Plus, the G4 CPU would hold back performance, since the CPU wouldn't be able to keep up with the GPU. <hr></blockquote>



    Not true. This is precisely why we have GPUs: to offload the CPU in graphics-intensive operations.



    And if there's an update to the G4 Power Macs any time soon, I'm pretty sure they will have AGP 8x. In either case, the R300 should be the fastest thing around on a 4x bus as well.
  • Reply 4 of 33
    telomar Posts: 1,804 member
    [quote]Originally posted by cowofwar:

    <strong>It doesn't matter. They're 8x AGP cards, so the current systems wouldn't even be able to support them. Plus, the G4 CPU would hold back performance, since the CPU wouldn't be able to keep up with the GPU.</strong><hr></blockquote>



    I don't believe you even managed to get one fact in that whole post correct. The cards are actually 2x/4x/8x AGP cards, so virtually any of the G4 towers would support them; you just wouldn't see optimum performance out of them. In fact, even with 8x AGP the card will be constrained by bandwidth.



    There's always a point at which the CPU holds back the GPU, but I don't believe you'd truly find the CPU to be the major constraint so much as memory bandwidth.
  • Reply 5 of 33
    woozle Posts: 64 member
    While these cards can do some pretty mean triangle numbers, they also support several 'vertex amplification' techniques.

    ATI has TRUFORM, and claims to support displacement mapping (with DX9 - it might not be available under OpenGL).



    With these techniques the CPU can generate a relatively sparse vertex stream and the GPU will fill it in to use up all that triangle capability.



    At any rate, games aren't going to generate that sort of detail for a long time to come. The hardware is far outstripping the software at the moment. I believe it is finally hitting the limits of the sort of fast improvements we have seen for the last 3-4 years. GPUs have caught up with CPUs in terms of chip complexity, and will find themselves bound by Moore's law in the future.
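[Editor's note] The "vertex amplification" idea above can be sketched in a few lines. This is an illustrative toy using plain midpoint subdivision, not ATI's actual TRUFORM algorithm (TRUFORM fits curved N-patches and moves new vertices along interpolated normals); `midpoint` and `subdivide` are hypothetical helper names.

```python
# Toy "vertex amplification": the CPU submits a sparse triangle list and
# the tessellator fills in detail. One midpoint-subdivision pass turns
# each triangle into 4, quadrupling triangle count with no extra CPU work.

def midpoint(a, b):
    # Average two vertices component-wise.
    return tuple((ai + bi) / 2 for ai, bi in zip(a, b))

def subdivide(tri):
    # Split one triangle into four using its edge midpoints.
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

sparse = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]      # what the CPU sends
amplified = [t for tri in sparse for t in subdivide(tri)]
print(len(sparse), "->", len(amplified))             # 1 -> 4
```

Run the pass again on `amplified` and you get 16 triangles, which is the point: detail grows geometrically on the GPU side while the AGP traffic stays flat.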
  • Reply 6 of 33
    xype Posts: 672 member
    [quote]Originally posted by penhead:

    <strong>Yup, those Radeons look pretty tasty for those of us who have $3-6k to spend on a graphics card

    </strong><hr></blockquote>



    Considering the 9700 is estimated to sell at $400, it seems you want to buy more than 10 of them?
  • Reply 7 of 33
    penhead Posts: 45 member
    ooops.
  • Reply 8 of 33
    xype Posts: 672 member
    [quote]Originally posted by penhead:

    <strong>ooops.</strong><hr></blockquote>



    It's only good - I wonder to whom ATI will be selling the high-end cards that you probably saw (I think they call them Radeon 8800 or so).
  • Reply 9 of 33
    lemon bon bon Posts: 2,383 member
    <a href="http://www.tomshardware.com/graphic/02q2/020418/vgacharts-05.html" target="_blank">http://www.tomshardware.com/graphic/02q2/020418/vgacharts-05.html</a>



    Apparently, the Radeon 9700 buries these scores at the highest resolutions and with anti-aliasing on!



    In fact, with these features on, it's twice as fast as the GeForce4 Ti!!!



    Just hope the next round of 'G' cpus from Apple can feed these babies...



    Lemon Bon Bon
  • Reply 10 of 33
    telomar Posts: 1,804 member
    [quote]Originally posted by woozle:

    <strong>At any rate, games aren't going to generate that sort of detail for a long time to come. The hardware is far outstripping the software at the moment. I believe it is finally hitting the limits of the sort of fast improvements we have seen for the last 3-4 years. GPUs have caught up with CPUs in terms of chip complexity, and will find themselves bound by Moore's law in the future.</strong><hr></blockquote>

    The graphics card companies always release their latest offering ahead of the software support. DirectX 9 will appear around the end of the year, though, with the first games based on it probably appearing around 2H 2003, give or take.



    As for the future of graphics cards, I wouldn't expect a slowdown just yet. Certainly both ATI and nVidia have their roadmaps planned out and aren't slowing down for at least the next 18+ months.



    That said, one of these cards would certainly last you a while in terms of usability.
  • Reply 11 of 33
    programmer Posts: 3,458 member
    [quote]Originally posted by woozle:

    <strong>While these cards can do some pretty mean triangle numbers, they also support several 'vertex amplification' techniques.

    ATI has TRUFORM, and claims to support displacement mapping (with DX9 - it might not be available under OpenGL).

    </strong><hr></blockquote>



    They should be exposed via extensions in OpenGL. They currently are on the PC, where such things aren't really policed, and I think Apple is finally going to allow them to be exposed on the Mac. Certainly the Apple-driven shader extensions will hopefully arrive soon.



    [quote]<strong>

    With these techniques the CPU can generate a relatively sparse vertex stream and the GPU will fill it in to use up all that triangle capability.

    </strong><hr></blockquote>



    That certainly helps -- especially in getting around the AGP bottleneck. Depending on the kind of geometry work being done, and how well the app and driver are coded, it is possible for the hardware to reach across the AGP bus and read the geometry straight out of main memory so that the CPU doesn't have to do any of the work. An Xserve-style architecture is great for this, since the GPU can use all of that extra bandwidth the G4 can't... without getting in the way of the G4.



    [quote]<strong>

    At any rate, games aren't going to generate that sort of detail for a long time to come. The hardware is far outstripping the software at the moment. I believe it is finally hitting the limits of the sort of fast improvements we have seen for the last 3-4 years. GPUs have caught up with CPUs in terms of chip complexity, and will find themselves bound by Moore's law in the future.</strong><hr></blockquote>



    Console games are already starting to have this level of detail, but who knows if we'll see any of those ports on the Mac, since it's such a small market. The 9700 is capable of some really amazing things, and it'll be interesting to see if any developers really take advantage of it for non-game applications. I have a feeling Apple might, since it could mitigate their current disadvantage in raw CPU performance. ATI's new software technology promises to take Maya & RenderMan shaders and move them directly into the GPU's hardware shaders... in which case the number of people capable of creating them suddenly increases dramatically.



    I don't think GPU development will slow just yet. Their clock rates are lower, they are only on a 0.15-micron process, and in some ways their designs are simpler and more "regular" than CPU designs. The POWER4 is a 1 GHz, 170-million-transistor monster, for example, so matching it would be a 50+% increase in transistor count for ATI... and a tripling of the clock rate. The pixel and poly rate numbers may stop increasing at the rate they have up until now... but with the advent of shaders (and many other new capabilities), there are other dimensions to graphics performance now. Since graphics is such an easily parallelizable task, we'll also see multi-chip boards. The main limitation is memory bandwidth.
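[Editor's note] The POWER4 comparison checks out if you plug in the commonly cited R300 figures - roughly 107 million transistors and a 325 MHz core clock. Treat those two numbers as assumptions; the POWER4 figures (170 million transistors, 1 GHz) come from the post itself.

```python
# Sanity check of the headroom claim: how far would ATI have to scale
# the R300 to match the POWER4? (R300 specs here are assumed figures.)
r300_transistors, r300_clock = 107e6, 325e6      # assumed: ~107M @ 325 MHz
power4_transistors, power4_clock = 170e6, 1e9    # from the post: 170M @ 1 GHz

transistor_increase = power4_transistors / r300_transistors - 1
clock_ratio = power4_clock / r300_clock

print(f"{transistor_increase:.0%} more transistors")  # ~59%, i.e. "50+%"
print(f"{clock_ratio:.1f}x the clock rate")           # ~3.1x, i.e. a tripling
```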



    I can't wait to get my hands on one of these, but I'm waiting for the next PowerMac before I drop the money on it.





    BTW: don't forget nVidia... apparently the NV30 will have 20% more transistors than ATI's offering, which suggests it'll be even more powerful than the 9700.



    [ 07-20-2002: Message edited by: Programmer ]
  • Reply 12 of 33
    Well, only a few years ago I was dreaming of 3D cards like these coming to the Mac with a nice version of Maya.



    We're here. 325,000 polys per sec. That's amazing!



    And there's the NV30 to come after the 9700. I guess ATI gets to hog the spotlight... for now. I heard nVidia was having problems getting its next-gen card out. Certainly, ATI is giving them a serious run for their money now!



    Nice to see the GeForce4 MX in the top iMac. A card that ain't too far behind the ATI 8500 in my Athlon PC.



    It'll be interesting to see benches of the GeForce4 MX on the 800 MHz G4 iMac... running Quake or Maya. The GeForce4 MX hangs with the GeForce3 Ti in many instances. Not too far away in the benches at all.



    The iMac is starting to make a nice prosumer 3D machine. (Certainly, the GeForce4 MX is far superior to the GeForce2 MX I used in my Athlon a couple of years ago... and that did okay!)



    I'm seriously tempted by the latest iMac widescreen. Even a Maya company guy was lusting after it! I don't blame him! I know I want a 'G5' POWERMac but the widescreen iMac has my knees wobbling.



    But with a possible next-gen CPU on the cards within the next year... I give pause before dishing out almost £2,000 on a new system. Memory, CPU, bus... case... all these things are on the imminent 'up'. I don't want to be left with out-of-date mobo tech.



    I hope Apple delivers the 'G5' by San Fran next year or a little after. I could really use that power in 3D rendering.



    It's time for the next level in power on the Mac. Those August 'power'Macs...? .75?



    Lemon Bon Bon
  • Reply 13 of 33
    ebby Posts: 3,110 member
    [quote]Originally posted by Programmer:

    <strong>An Xserve style architecture is great for this since the GPU can use all of that extra bandwidth that the G4 can't use... without getting in the way of the G4.</strong><hr></blockquote>



    Just wondering, but could you use an Xserve as a normal desktop computer? (All games/programs will run on it, right?) Or does the software limit it to server-type functions?
  • Reply 14 of 33
    lemon bon bon Posts: 2,383 member
    I've heard an Xserve. BOY! Is it... NOISY!!!



    Like a giant hairdryer!



    Lemon Bon Bon
  • Reply 15 of 33
    ebby Posts: 3,110 member
    Oh, I was thinking it was more like a laptop. Whoops.
  • Reply 16 of 33
    leonis Posts: 3,427 member
    If I have to make a choice, I will always pick ATI rather than nVidia.



    I find the image sharpness, color accuracy, etc. are usually better on ATI cards.
  • Reply 17 of 33
    programmer Posts: 3,458 member
    [quote]Originally posted by Lemon Bon Bon:

    <strong>Well, only a few years ago I was dreaming of 3D cards like these coming to the Mac with a nice version of Maya.



    We're here. 325,000 polys per sec. That's amazing!</strong><hr></blockquote>



    What?! I think you meant 325,000 polys per frame at 60 fps... right? The Radeon 9700 ought to be able to sustain that kind of throughput, depending on the shader complexity. They're not kidding when they say it can render, in real time, CG movies from a couple of years ago.



    [ 07-21-2002: Message edited by: Programmer ]
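[Editor's note] The per-frame vs. per-second distinction is worth multiplying out, since it's the crux of the correction above:

```python
# 325,000 polygons per frame at a sustained 60 frames per second is a
# very different claim from 325,000 polygons per second.
polys_per_frame = 325_000
fps = 60
polys_per_second = polys_per_frame * fps
print(f"{polys_per_second:,} polys/sec")  # 19,500,000
```

So the sustained rate being discussed is 19.5 million polygons per second, roughly 60x the figure as originally written.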
  • Reply 18 of 33
    barto Posts: 2,246 member
    [quote]Originally posted by Ebby:

    <strong>



    Just wondering, but could you use a Xserve as a normal desktop computer? (All games/programs will run on it, right?) Or does the software limit it to server-type functions?</strong><hr></blockquote>



    You can run it as a normal desktop computer. Put some feet on it and pretend it's a noisy LC 475!



    Of course, it's hopeless for even running the GUI unless you get the Radeon 8500 version.



    Barto
  • Reply 19 of 33
    jakeman Posts: 22 member
    ATI loves announcing products 3-4 months before they ship; nVidia announces products and they ship within a week. The NV30 is due to come out fairly soon. I would be willing to bet the NV30 will be available before the new Radeons, and that it will outperform them.
  • Reply 20 of 33
    telomar Posts: 1,804 member
    NV30 won't be out until very late in the year. ATI's offering will ship before you ever see nVidia's this time.