R300 to be officially unveiled July 18...

Posted:
in Future Apple Hardware edited January 2014
... a day after Steve's keynote.



This was reported by c't - known to most for the latest PPC SPEC benchmark - and they are fairly objective and do not engage in rumor-mongering.

Comments

  • Reply 1 of 39
    penhead Posts: 45 member
    Most websites seem to agree on a summer schedule for the R300, but this means ATi might screw up again and give us some insight into which systems it might be in before the Stevenote.



    Also, the R300 is the only card thus far able to run Doom 3! How cool is that for the Mac as a game platform?



    [some r300 info: <a href="http://www.anandtech.com/showdoc.html?i=1632" target="_blank">Anandtech's E3 roundup</a>]
  • Reply 2 of 39
    cinder Posts: 381 member
    Uh, it's not that cool.



    Doom3 isn't out.



    By the time it is, there will be plenty more cards that will run it.
  • Reply 3 of 39
    applenut Posts: 5,768 member
    [quote]Originally posted by penhead:

    <strong>

    Also, the R300 is the only card thus far able to run Doom 3! How cool is that for the Mac as a game platform?



    [some r300 info: <a href="http://www.anandtech.com/showdoc.html?i=1632" target="_blank">Anandtech's E3 roundup</a>]</strong><hr></blockquote>



    That's not true. Even Carmack has said that Doom 3 will be playable on a GeForce 2 MX.



    Maybe it will be the first card to play it with all settings maxed at a respectable frame rate.
  • Reply 4 of 39
    hmurchison Posts: 12,146 member
    I remember reading it has more to do with software issues. The R300 sounds like it will be a nice card... I can't wait.
  • Reply 5 of 39
    nathan22t Posts: 317 member
    GeForce 3/Xbox and the Radeon 8500 represent the level of card needed for maximum impact (all features turned on with decent/good frame rates) in Doom 3. The next Nvidia card (which may or may not be named the GF5) as well as the R300 will both debut this summer. These more powerful cards will most likely add little to the Doom experience beyond additional frames per second. Carmack chose the R300 to run the E3 demos of Doom 3 because it simply offered better performance than the next-generation Nvidia card. Carmack feels Nvidia became sidetracked with the Xbox and that this is the reason they fell behind in performance, though he has lost no respect for the company and its products, which he strongly supports.



    The gf/gf2/gf4mx and Radeon (cards with T&L and other advanced features, but no pixel/vertex shaders) will run a separate build of the game which has less spectacular effects, but is fully playable nonetheless.



    [ 06-17-2002: Message edited by: nathan22t ]
  • Reply 6 of 39
    eupfhoria Posts: 257 member
    No game should be playable with all of the effects on and maxed at a respectable FPS on current hardware when it is released.



    It is one of those things that allows the game to grow: it's not until a year later that you can even run the game at full quality...
  • Reply 7 of 39
    xaqtly Posts: 450 member
    I hope Doom 3 will run acceptably well on my iMac/800... I know it won't have the highest frame rates or the most graphical goodies, but all I ask is a stable frame rate and maybe just some of the goodies.
  • Reply 8 of 39
    kecksy Posts: 1,002 member
    Actually, the nVidia GeForce 3 running at 640x480 with all the effects at 50% just breaks 30 fps. It is unlikely, then, that the R300 or NV30 will be able to run Doom at its absolute maximum. The leap in GPU power needed to hit 60+ fps at double that resolution and detail will not come for at least another generation.



    [ 06-17-2002: Message edited by: Keeksy ]
  • Reply 9 of 39
    nathan22t Posts: 317 member
    [quote]Originally posted by Keeksy:

    <strong>Actually, the nVidia GeForce 3 running at 640x480 with all the effects at 50% just breaks 30 fps.

    </strong><hr></blockquote>



    Your statement approaches truth, but isn't quite there. Quotes from Carmack:



    "We are aiming to have a GF3 run Doom with all features enabled at 30 fps."



    "Our "full impact" platform from the beginning has been targeted at GF3/Xbox level hardware. Slower hardware can disable features, and faster hardware gets higher frame rates and rendering quality."
  • Reply 10 of 39
    programmer Posts: 3,409 member
    [quote]Originally posted by nathan22t:

    <strong>



    Your statement approaches truth, but isn't quite there. Quotes from Carmack:



    "We are aiming to have a GF3 run Doom with all features enabled at 30 fps."



    "Our "full impact" platform from the beginning has been targeted at GF3/Xbox level hardware. Slower hardware can disable features, and faster hardware gets higher frame rates and rendering quality."</strong><hr></blockquote>



    Roughly speaking, if Doom 3 is fill-rate limited (likely), the nv30 should then be able to run 1280x960 @ 30 Hz. [email protected] Hz (60 fps is meaningless on a typical computer display unless it happens to be set to a 60 Hz refresh rate). The gf4Ti has about double the gf3's memory bandwidth, and the nv30 will double that again. The vertex rate probably won't be an issue, but if it is, then the nv30 ought to be more than 4x faster than the gf3.



    It'll be interesting to see if id takes advantage of any of the advanced capabilities of the new hardware though -- that will make a bigger difference than just raising the resolution.
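    Programmer's fill-rate arithmetic above can be sanity-checked with a quick back-of-the-envelope script. This is a sketch built purely from the figures claimed in this thread (GF3 at 640x480 @ ~30 fps as the baseline; gf4Ti at roughly 2x the gf3's bandwidth, nv30 at 2x that again), not from measured specs:

```python
# Back-of-the-envelope pixel-throughput scaling, assuming the game is
# purely fill-rate limited (the assumption made in the post above).

# Baseline claimed in the thread: a GF3 runs Doom 3 at 640x480 @ ~30 fps.
BASELINE_PIXELS_PER_SEC = 640 * 480 * 30

# Relative memory-bandwidth multipliers asserted in the post
# (gf4Ti ~2x gf3, nv30 ~2x gf4Ti); these are assumptions, not specs.
BANDWIDTH_MULTIPLIER = {"gf3": 1, "gf4ti": 2, "nv30": 4}

def fill_limited_fps(card: str, width: int, height: int) -> float:
    """Frame rate a card could sustain at a given resolution if fill
    rate (pixel throughput) were the only bottleneck."""
    budget = BASELINE_PIXELS_PER_SEC * BANDWIDTH_MULTIPLIER[card]
    return budget / (width * height)

print(fill_limited_fps("nv30", 1280, 960))  # 30.0 -- matches the post's estimate
```

    Under this model, quadrupling bandwidth buys either 4x the pixels at the same frame rate or the same pixels at 4x the frame rate, which is exactly the trade-off being argued over in this thread.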
  • Reply 11 of 39
    yurin8or Posts: 120 member
    [quote]Originally posted by nathan22t:

    <strong>Carmack choose the R300 to run the e3 demos of Doom3 because it simply offered better performance than the next generation Nvidia card. </strong><hr></blockquote>



    Actually, I read that the card offered to Carmack by Nvidia was just a souped-up 4600... not a next-generation card at all.
  • Reply 12 of 39
    programmer Posts: 3,409 member
    [quote]Originally posted by yurin8or:

    <strong>Actually, I read that the card offered to Carmack by Nvidia was just a souped-up 4600... not a next-generation card at all.</strong><hr></blockquote>



    Yes, this is correct. nVidia is the last to announce their new hardware this year (Matrox, 3DLabs, and ATI got there first). Of the other three I think only ATI has actual hardware, and they are probably a couple of months ahead of nVidia this time around. It doesn't matter a whole lot, really... the window is quite narrow and there is a big difference between shipping samples/demos and shipping in quantity.
  • Reply 13 of 39
    eskimo Posts: 474 member
    [quote](60 fps is meaningless on a typical computer display unless it happens to be set to a 60Hz refresh rate). <hr></blockquote>



    Modern video drivers and games allow the frame rate to be rendered independently of the refresh rate. I'm not a video guru, so I don't know how it works, but I do know that you are no longer limited by monitor refresh rates as far as rendering to the screen.
  • Reply 14 of 39
    far Posts: 17 member
    Where can I find specs of the R300?
  • Reply 15 of 39
    tigerwoods99 Posts: 2,633 member
    The graphics card industry sure doesn't slow down, it only heats up. I hope the R300 will be introduced on the Mac side at the same time; it would be a great option to have in one of the new PowerMacs. Matrox's new card looks really interesting; originally I thought it was just a bunch of BS, but the latest issue of Maximum PC (Mac bashers to the fullest) had a big article about the Parhelia and how it can power 3 displays. Any word on when this card will officially be out? I've heard that they will support Mac OS X.
  • Reply 16 of 39
    programmer Posts: 3,409 member
    [quote]Originally posted by Eskimo:

    <strong>Modern video drivers and games allow the frame rate to be rendered independently of the refresh rate. I'm not a video guru, so I don't know how it works, but I do know that you are no longer limited by monitor refresh rates as far as rendering to the screen.</strong><hr></blockquote>



    Yes, but the reason to sync to anything at all is to avoid tearing. The original quote mentioned "60 Hz" as if it were some magic threshold, but that is only true on a display with a 60 Hz refresh rate.
  • Reply 17 of 39
    programmer Posts: 3,409 member
    [quote]Originally posted by TigerWoods99:

    <strong>The graphics card industry sure doesn't slow down, it only heats up. I hope the R300 will be introduced on the Mac side at the same time; it would be a great option to have in one of the new PowerMacs. Matrox's new card looks really interesting; originally I thought it was just a bunch of BS, but the latest issue of Maximum PC (Mac bashers to the fullest) had a big article about the Parhelia and how it can power 3 displays. Any word on when this card will officially be out? I've heard that they will support Mac OS X.</strong><hr></blockquote>



    I don't think a ship date or Mac OS X drivers have been announced. I suspect that Matrox announced early to get some press before ATI & nVidia arrived with their superior next-gen boards, which would make Matrox look like an "also-ran".



    Don't expect graphics chips to slow down any time soon.
  • Reply 18 of 39
    eskimo Posts: 474 member
    [quote]Originally posted by Programmer:

    <strong>



    Yes, but the reason to sync to anything at all is to avoid tearing. The original quote mentioned "60 Hz" as if it was some magic threshhold, but that is only true on a display with a 60 Hz refresh rate.</strong><hr></blockquote>



    Well, my system has sync turned off and performs at 200 fps @ 75 Hz with no tearing that I can see. Not sure how it works, but it does.
  • Reply 19 of 39
    programmer Posts: 3,409 member
    [quote]Originally posted by Eskimo:

    <strong>Well, my system has sync turned off and performs at 200 fps @ 75 Hz with no tearing that I can see. Not sure how it works, but it does.</strong><hr></blockquote>



    That's because the frame rate is really high and the display's refresh rate is near the perceptual limit. On a 60 Hz display with a frame rate near or below 60, you would notice.



    Any game running at 200 fps isn't drawing enough.
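    The tearing trade-off Programmer describes can be illustrated with a toy model (the arithmetic is illustrative only, not a measurement): with vsync off, each screen refresh is stitched together from slices of several rendered frames, and a tear is more noticeable when the two slices it joins were rendered further apart in time.

```python
# Toy model of tearing visibility with vsync off: buffer swaps that land
# mid-scanout create "tears", and a tear is more noticeable when the two
# frame slices it joins were rendered further apart in time.
# Illustrative arithmetic only, not a measurement.

def tear_profile(fps: float, refresh_hz: float) -> tuple[float, float]:
    """Return (average tears per displayed refresh,
    milliseconds of scene motion across each tear)."""
    tears_per_refresh = fps / refresh_hz  # swaps landing during one scanout
    ms_across_tear = 1000.0 / fps         # age gap between adjacent slices
    return tears_per_refresh, ms_across_tear

# Eskimo's case: 200 fps on a 75 Hz display -> several tears per refresh,
# but each joins frames only 5 ms apart, so the seams are hard to spot.
print(tear_profile(200, 75))

# A 45 fps game on a 60 Hz display -> fewer tears, but each joins frames
# ~22 ms apart, so the discontinuity is much more visible.
print(tear_profile(45, 60))
```

    This matches both observations in the thread: tearing "disappears" at very high frame rates, and reappears when the frame rate drops near or below the refresh rate.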
  • Reply 20 of 39
    steves Posts: 108 member
    [quote]Originally posted by Programmer:

    <strong>



    Roughly speaking, if Doom3 is fill rate limited (likely), the nv30 should then be able to run 1280x960 @ 30 Hz. [email protected] Hz (60 fps is meaningless on a typical computer display unless it happens to be set to a 60Hz refresh rate). The gf4Ti has about double the gf3's memory bandwidth, and the nv30 will double that again. The vertex rate probably won't be an issue, but if it is, then the nv30 ought to be more than 4x faster than the gf3.



    It'll be interesting to see if id takes advantage of any of the advanced capabilities of the new hardware though -- that will make a bigger difference than just raising the resolution.</strong><hr></blockquote>



    Why do you expect Doom 3 to be fill-rate limited? While this has been true with older generations of games with relatively low polygon counts, etc., I wouldn't assume this to be the case with Doom 3.



    The bottom line is that game developers are writing games based on nVidia hardware as a baseline. They did the same thing a few years back with 3dfx, until nVidia came out with the GeForce and took the lead. ATI makes nice cards, but they have always been and likely always will be a generation behind nVidia (with the exception of mobile graphics chips).



    As for Doom 3, I suggest people actually read what Carmack has said as opposed to what people claim he said.



    Here are a few links:



    <a href="http://www.webdog.org/plans/1/" target="_blank">http://www.webdog.org/plans/1/</a>



    This is his .plan file. He specifically discusses ATI's 8500, nVidia's Geforce 3, and complains about the 4mx.



    <a href="http://slashdot.org/comments.pl?sid=33453&cid=3619007" target="_blank">http://slashdot.org/comments.pl?sid=33453&cid=3619007</a>



    Here he specifically notes the GeForce 4 is faster than the ATI 8500.



    <a href="http://slashdot.org/comments.pl?sid=33453&cid=3619372" target="_blank">http://slashdot.org/comments.pl?sid=33453&cid=3619372</a>



    Here he gives rationale for the requirement for high end video cards.



    <a href="http://www.beyond3d.com/interviews/carmackdoom3/" target="_blank">http://www.beyond3d.com/interviews/carmackdoom3/</a>



    Here is a very technical description of the Doom3 engine. This also discusses why some of ATI's proprietary technologies such as Truform won't work properly, etc.



    <a href="http://www.gamespy.com/e32002/pc/carmack/" target="_blank">http://www.gamespy.com/e32002/pc/carmack/</a>



    Again, he confirms the Geforce 4ti is the best "current" video card for this game.



    etc, etc... There are plenty more articles and comments from Carmack.



    Both ATI and nVidia are coming out with new cards this summer. I have no reason to believe ATI will all of a sudden take over the performance leadership.



    Steve