Hold off those video card purchases (details on GeForce 4 family/new ATI cards)

Posted in Future Apple Hardware; edited January 2014
Here's the goods, folks.



NV25 (GeForce 4 Ti4400 & 4600):



-63 Million Transistors

-0.15 micron.

-4 rendering pipelines, 2 TMUs each.

-300 MHz core frequency.

- NVidia declares a 4.9 Gpixels/s fillrate for the upcoming GPU. We already know that nowadays developers declare so-called effective fillrates for their processors (the raw figure here is 300 MHz x 4 pipelines = 1.2 Gpixels/s), so take these enormous numbers as "peak numbers" that do not have a lot to do with real games in real life.

-The graphics processor will be very complex and will run quite hot, so it will be equipped with an integrated heatsink as well as a special hardware monitoring function called the NVidia Thermal Control System.

-Memory frequency: 650 MHz (effective) DDR SDRAM.

-Four BGA memory chips on the front side of the board, just like on the FireGL 8700. However, the NVidia board also has memory heatsinks.

-Peak memory bandwidth: 10.4 GB/s (which checks out: 650 MHz effective on a 128-bit bus is 650M x 16 bytes = 10.4 GB/s).

-128MB Frame Buffer.

-Lightspeed Memory Architecture II, the new version now includes the following features:

Lossless 4:1 Z-compression: cuts the memory bandwidth consumed by the Z-buffer by a factor of four (a rough sketch of the idea follows this list)

Second generation Occlusion Culling. As we know from various reviews, the Hidden Surface Removal technique is more efficient in the RADEON 8500 and KYRO II than in the GeForce 3; in the GeForce 4, NVidia introduces a new algorithm that should be more effective than its predecessor.

Quadcache Architecture.
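
NVidia hasn't published how that 4:1 Z-compression actually works, so take the following as a rough illustration of the general principle only: depth values inside a small screen tile tend to sit close together, so a tile can be stored as one base value plus small per-pixel deltas, falling back to raw storage when the deltas don't fit. The 4x4 tile size and 8-bit deltas here are invented for the example:

[code]
#include <algorithm>
#include <cstdint>
#include <vector>

// One 4x4 tile of 32-bit depth values. If every value fits as
// (base + 8-bit delta), the tile shrinks from 64 bytes to roughly
// 4 + 16 = 20 bytes. "Lossless" means base + delta reconstructs
// every Z value exactly; there is no approximation involved.
struct ZTile {
    bool compressed;
    uint32_t base;                // minimum depth in the tile
    std::vector<uint8_t> deltas;  // per-pixel offsets from base
    std::vector<uint32_t> raw;    // fallback: uncompressed values
};

ZTile compressTile(const uint32_t z[16]) {
    ZTile t{true, *std::min_element(z, z + 16), {}, {}};
    for (int i = 0; i < 16; ++i) {
        uint32_t d = z[i] - t.base;
        if (d > 0xFF)             // deltas too spread out: store raw
            return ZTile{false, 0, {}, std::vector<uint32_t>(z, z + 16)};
        t.deltas.push_back(static_cast<uint8_t>(d));
    }
    return t;
}

uint32_t readZ(const ZTile& t, int i) {
    return t.compressed ? t.base + t.deltas[i] : t.raw[i];
}
[/code]

Most tiles take the compressed path in practice, which is where the bandwidth saving comes from; the 4:1 figure NVidia quotes would be the best case, not a guarantee.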

The speed of GeForce 4 Ti4600 based videocards is going to be extremely high compared to the GeForce 3 and RADEON 8500. Additionally, there are a couple of new features, so that those interested in capabilities also have a reason to buy a new videocard.

A new, patent-pending type of antialiasing will be introduced with the NV25. Its brand name is going to be Accuview AA, and it represents a new AA sample mask and a technique that saves one frame-buffer write cycle. Accuview AA is another multisampling type of antialiasing: ordinarily, multisampling needs to write the subpixel samples into memory and only then calculate the final pixel color values. Thanks to the new subpixel mask [and the improved algorithm and caching], an entire memory write cycle is saved, causing a smaller performance decrease than on the GeForce3.

Another interesting feature of the new GPU is the nFiniteFX II engine. Basically, it is the same as on the GeForce 3, but with an additional vertex shader pipeline. Between that and the increased clock frequency, software that can use the power of the second vertex-shader pipeline will run up to three times faster on the GeForce 4 than on its predecessor. For the same reasons, we can also expect a 50% performance gain from the new chips when calculating pixel shaders.

NVidia declares support for so-called Z-Correct bump mapping. We couldn't get any more-or-less precise information about this, and we don't want to speculate, so we'll just say nothing this time. The final important thing about the GeForce4 is support for the nView technology: a new, more feature-rich, and easier-to-use version of TwinView.
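
To make the multisampling point concrete: the chip shades one color per pixel per triangle and copies it to every covered subsample, where supersampling would run the shader once per subsample. Here is a toy software version of the two steps; the 4-sample count and packed-color format are assumptions for illustration, and the subsample write traffic in writeCovered is the kind of cost Accuview reportedly trims:

[code]
#include <cstdint>

// Four stored subsample colors per pixel, packed as 0x00RRGGBB.
struct Pixel4x {
    uint32_t sub[4];
};

// Rasterizer step: ONE shaded color per pixel, written only to the
// subsamples the triangle actually covers. This is what makes it
// multisampling; supersampling would shade each subsample separately.
void writeCovered(Pixel4x& p, uint32_t color, unsigned coverageMask) {
    for (int i = 0; i < 4; ++i)
        if (coverageMask & (1u << i))
            p.sub[i] = color;
}

// Resolve step: average the four subsamples down to the single
// displayed color, channel by channel.
uint32_t resolve(const Pixel4x& p) {
    uint32_t r = 0, g = 0, b = 0;
    for (int i = 0; i < 4; ++i) {
        r += (p.sub[i] >> 16) & 0xFF;
        g += (p.sub[i] >> 8) & 0xFF;
        b += p.sub[i] & 0xFF;
    }
    return ((r / 4) << 16) | ((g / 4) << 8) | (b / 4);
}
[/code]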



That's just a little teaser. I will post more, including NV17 (GeForce 4 MX) details, pictures of the new GeForce 4 cards, and info on the Hercules ATI cards, the Fire GL cards, the 7500DV, and the new Kyro. Stay tuned.



[ 01-25-2002: Message edited by: TigerWoods99 ]

Comments

  • Reply 1 of 97
    That is SICK. Where'd you get your info? [Skeptical]
  • Reply 2 of 97
    [quote]Originally posted by TigerWoods99:

    Here's the goods, folks.

    NV25 (GeForce 4 Ti4400 & 4600):

    *snip really impressive specs I don't understand*

    [ 01-25-2002: Message edited by: TigerWoods99 ][/quote]



    So, does that mean it's 'holy crap!' fast, or just 'smokin' fast? Those really are benchmark terms, by the way.



    tsukurite
  • Reply 3 of 97
    NV17 aka GeForce4 MX:





    • 0.15 micron.

    • 2 rendering pipelines, 2 TMUs each.

    • GeForce 4 MX420 - 250 MHz.

    • GeForce 4 MX440 - 275 MHz.

    • GeForce 4 MX460 - 300 MHz.

    • 128-, 64- or 32-bit memory interface.

    • GeForce 4 MX420 - 166 MHz SDR SDRAM.

    • GeForce 4 MX440 - 400 MHz DDR SDRAM.

    • GeForce 4 MX460 - 550 MHz DDR SDRAM.

    • The most powerful version of the GeForce 4 MX will be equipped with 64 MB of 550 MHz DDR SDRAM. The memory comes in a BGA package.

    • The GeForce 4 MX also supports Lightspeed Memory Architecture II, the new version that includes the following features:

    • Lossless 4:1 Z-compression: cuts the memory bandwidth consumed by the Z-buffer by a factor of four.

    • Second generation Occlusion Culling.

    • Quadcache Architecture.

    • Auto Pre-Charge: the ability to proactively tell the memory device to pre-charge areas of memory that are not in use but are likely to be used in the very near future.

    • Fast Z-Clear: faster clearing of old data from the Z-buffer.

    • Crossbar Memory Controller: two independent memory controllers, instead of the four on the GeForce3 and GeForce4 Ti.

    • Accuview AA



    The GeForce 4 MX will have an integrated TV encoder (maximum resolution 1024x768), will support nView technology (a dual-DVI version is also to be available), and will be HDTV-ready. The upcoming graphics processor will also have a video processing engine with an MPEG-2 decode algorithm, which includes Inverse Quantization (IQ), Inverse Discrete Cosine Transform (IDCT), Motion Compensation, and Color Space Conversion (CSC) functions; a toy version of the IDCT stage is sketched below.
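
    Of those stages, the IDCT is the classic number-cruncher, which is why doing it in hardware matters for smooth DVD playback. Purely to illustrate what the chip would be offloading, here is the textbook 8x8 inverse DCT in its slow, direct form (real decoders use fast factored versions of the same math):

    [code]
    #include <cmath>

    // Direct 8x8 inverse DCT: f(x,y) = 1/4 * sum over u,v of
    // C(u) C(v) F(u,v) cos((2x+1)u*pi/16) cos((2y+1)v*pi/16),
    // where C(0) = 1/sqrt(2) and C(k) = 1 for k > 0.
    void idct8x8(const double F[8][8], double f[8][8]) {
        const double kPi = 3.14159265358979323846;
        for (int x = 0; x < 8; ++x) {
            for (int y = 0; y < 8; ++y) {
                double sum = 0.0;
                for (int u = 0; u < 8; ++u)
                    for (int v = 0; v < 8; ++v) {
                        double cu = (u == 0) ? 1.0 / std::sqrt(2.0) : 1.0;
                        double cv = (v == 0) ? 1.0 / std::sqrt(2.0) : 1.0;
                        sum += cu * cv * F[u][v]
                             * std::cos((2 * x + 1) * u * kPi / 16.0)
                             * std::cos((2 * y + 1) * v * kPi / 16.0);
                    }
                f[x][y] = sum / 4.0;
            }
        }
    }
    [/code]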



    MORE TO COME!
  • Reply 4 of 97
    [quote]Originally posted by TigerWoods99:

    NV17 aka GeForce4 MX:

    [snip]

    MORE TO COME![/quote]



    All sarcasm aside: Jeezus! This thing's gonna rock! (That beats 'holy crap' as stated in my previous post.)



    tsukurite
  • Reply 5 of 97
    screed Posts: 1,077 (member)
    [quote]Originally posted by TigerWoods99:

    NV17 aka GeForce4 MX:

    [snip]

    MORE TO COME![/quote]



    Let the countdown begin to Kormac rearing his head and saying, "See!? I told you."



    Of course, John Carmack is right now designing games that already exceed these specs!



    Screed



    [ 01-25-2002: Message edited by: sCreeD ]
  • Reply 6 of 97
    There will also be a Quadro4, apparently.



    How do I post jpg pictures that I have on my computer? I want to post pics of some of the GeForce 4 products and also the Fire GL cards. Thanks.
  • Reply 7 of 97
    Okay TW99,



    If you've got info on these forthcoming nVidia cards, and you say there is also going to be a Quadro version...



    Is this going to show up on the Mac platform?



    It would only make sense to have an nVidia GeForce4 QuadroDCC to go with Maya on Mac OS X...!



    Let us know...
  • Reply 8 of 97
    Here are the names of the upcoming nVidia products:



    NVIDIA GeForce4 MX 460

    NVIDIA GeForce4 MX 440

    NVIDIA GeForce4 MX 420

    NVIDIA Quadro4 500XGL

    NVIDIA Quadro4 200/400NVS

    NVIDIA Quadro4 550XGL

    NVIDIA GeForce4 Ti 4600

    NVIDIA GeForce4 Ti 4400

    NVIDIA GeForce4 Ti 4200

    NVIDIA Quadro4 900XGL

    NVIDIA Quadro4 750XGL

    NVIDIA Quadro4 700XGL



    NV17

    NV17-1

    NV17M

    GeForce4 420 Go (32M)

    GeForce4 440 Go (64M)



    Looks like nVidia's next mobile GPU will be based on the GeForce4 MX.
  • Reply 9 of 97
    bodhi Posts: 1,424 (member)
    Percentage of those technologies to be implemented into the Mac? 20% TOPS.



    I had an 867 with a GeForce3. Loaded up Giants, and half of the graphics options were grayed out. Why? Cause even though that game was made to play like a dream on the GeForce3, Apple has yet to implement most of the advantages of that card into OSX. We never saw FSAA on the Radeon, never saw it on the GeForce3, and I doubt we will see half of the cool technologies that the PC users will enjoy on these cards, cause Apple just is not interested in implementing the technology.



    [Hmmm]
  • Reply 10 of 97
    g-news Posts: 1,107 (member)
    Nice, makes you want to wait for the G5 and DDR, together with one of those $500 GeForce4 Ti 4600s.



    G-News
  • Reply 11 of 97
    matsu Posts: 6,558 (member)
    There were already threads with links to c|net articles giving most of those details. Hardware fanboy gaming sites have been talking about them for weeks. When Kormac sticks his pin head back in here he won't even need to say "I told you so." The president of the Kormac ass lickers club ***cough, nostra... cough*** is going to do it for him. But it's okay, by now most of them are oblivious to the taste of shit.
  • Reply 12 of 97
    Yes, I posted those other details in another thread, but there are many more details.



    The high-end GeForce 4 is said to be $399.
  • Reply 13 of 97
    amorph Posts: 7,112 (member)
    [quote]Originally posted by TigerWoods99:

    There will also be a Quadro4, apparently.

    How do I post jpg pictures that I have on my computer? I want to post pics of some of the GeForce 4 products and also the Fire GL cards. Thanks.[/quote]



    Put them up on a webpage somewhere and link to them with a URL. If your 'net provider gives you web space, you can use that, or a free hosting service. Or you can ask someone else with web space to host them for you.



    If you use GeoCities, wrap them in a simple HTML page first.
  • Reply 14 of 97
    paul Posts: 5,278 (member)
    [quote]Originally posted by Amorph:

    Put them up on a webpage somewhere and link to them with a URL. If your 'net provider gives you web space, you can use that, or a free hosting service. Or you can ask someone else with web space to host them for you.

    If you use GeoCities, wrap them in a simple HTML page first.[/quote]



    What's wrong with, ummm, ANYONE's iDisk? (You can upload stuff to the Public folder of, I'm pretty sure, anyone's iDisk.) You should be able to link from there, right?
  • Reply 15 of 97
    I don't have any of those things.
  • Reply 16 of 97
    qazii Posts: 305 (member)
    [quote]Originally posted by TigerWoods99:

    -63 Million Transistors[/quote]



    Whoa! More transistors than the G5!
  • Reply 17 of 97
    programmer Posts: 3,458 (member)
    [quote]Originally posted by Bodhi:

    Apple has yet to implement most of the advantages of that card into OSX. We never saw FSAA on the Radeon, never saw it on the GeForce3, and I doubt we will see half of the cool technologies that the PC users will enjoy on these cards, cause Apple just is not interested in implementing the technology.[/quote]



    You are correct that these technologies haven't been implemented on the Mac yet, but they are coming. The latest OpenGL ARB minutes include a lengthy discussion of Apple-initiated extensions to OpenGL to support several of these key features. We ought to see these technologies available by mid-year (earlier to developers, so they can start taking advantage of them). The reason for the delay is that Apple wants it to work across all hardware, rather than each developer having to code for older hardware, ATI hardware, and nVidia hardware separately. This is similar to Microsoft's DirectX 8, but it needs buy-in from the OpenGL Architecture Review Board (ARB).
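
    For the curious: the way an application finds out whether one of those extensions made it into the driver is a plain string query against the current GL context. A minimal token-exact check, using the real GL_ARB_multisample (FSAA-related) extension as the example name:

    [code]
    #include <cstring>
    #include <OpenGL/gl.h>  // Mac OS X framework path; <GL/gl.h> elsewhere

    // Returns true if 'name' appears as a complete, space-delimited
    // token in the extension string. Requires a current GL context.
    bool hasExtension(const char* name) {
        const char* ext =
            reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        if (!ext) return false;
        const std::size_t len = std::strlen(name);
        for (const char* p = ext; (p = std::strstr(p, name)) != 0; p += len) {
            bool startOk = (p == ext) || (p[-1] == ' ');
            bool endOk = (p[len] == ' ') || (p[len] == '\0');
            if (startOk && endOk) return true;  // exact token match
        }
        return false;
    }

    // usage:  if (hasExtension("GL_ARB_multisample")) { /* enable FSAA */ }
    [/code]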
  • Reply 18 of 97
    g-news Posts: 1,107 (member)
    It has more gigaflops, too.

    In fact, graphics chips overtook CPUs in instructions per second and complexity/transistor count several years ago, roughly with the second generation of consumer 3D accelerators (Voodoo2).



    G-News
  • Reply 19 of 97
    ATI announced a new All In Wonder Radeon 7500: www.ati.com



    Also, there is a FireGL 8800 card with 128 MB RAM.



    Apparently Guillemot and ATI are coming out with a Radeon 8500XT card, with a 300/300 MHz clock and 128 MB of DDR RAM.



    Another rumor is that the GeForce4 MX cards will be PCI & AGP. Hmmm.



    I'm working on putting the pics of the boxes up. Apparently they are already taking pre-orders for these cards on the web.
  • Reply 20 of 97
    serrano Posts: 1,806 (member)
    pci! :eek: i thought i was gunna be stuck with this radeon forever! ...however it might not be wise to slap in a gpu that clocks higher than your cpu... can someone say bottleneck? [Hmmm]