NVIDIA Gelato, Quadro FX 4000, and Quadro FX 4000 SDI

Posted in Future Apple Hardware, edited January 2014
WOW! NVIDIA released all-new, super high-end graphics cards today.



Quote:

Originally from Joseph Tan, Monday, 19 April 2004 at CGNetworks

The NVIDIA Quadro FX 4000 is a single AGP 8x board device that will occupy two slot spaces. The Quadro FX 4000 will contain 256MB of 450MHz (900MHz effective) DDR3 memory running across a 256-bit bus. Video outputs are two DVI-I interfaces and a 3-pin mini-DIN for shutter-glass support.
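Those memory figures pin down peak bandwidth: 900 MHz effective across a 256-bit (32-byte) bus comes to 28.8 GB/s. A quick sketch of the arithmetic, using only the numbers quoted above:

```c
#include <stdio.h>

int main(void)
{
    /* Figures from the article: 450 MHz DDR3, i.e. 900 MHz effective
       (DDR transfers on both clock edges), across a 256-bit bus. */
    const double effective_rate_hz = 900e6;
    const double bus_bytes         = 256.0 / 8.0;

    /* Peak bandwidth = effective transfer rate x bus width in bytes. */
    printf("Peak memory bandwidth: %.1f GB/s\n",
           effective_rate_hz * bus_bytes / 1e9);   /* prints 28.8 */
    return 0;
}
```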



The Quadro FX 4000 looks similar to the GeForce 6800 Ultra; the bulky cooling apparatus is used to keep the GPU and RAM within thermal limits. Maximum power consumption for the Quadro FX 4000 is 130 Watts, and NVIDIA recommends a minimum 480 Watt power supply for workstations. As visible from the board shot, two auxiliary power connectors are present, as the AGP bus alone does not supply enough current to power the device. According to NVIDIA, future PCI Express iterations of the Quadro FX should dispense with the need for auxiliary power connectors, or at least minimize their use. NVIDIA was unwilling to divulge GPU clock speeds.
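That power figure also explains the two auxiliary connectors. A rough budget sketch; only the 130 Watt board maximum comes from the text, while the ~40 Watt AGP slot ceiling is an assumption (the exact limit varies by AGP spec revision):

```c
#include <stdio.h>

int main(void)
{
    const double board_max_w = 130.0;  /* from the article */
    const double agp_slot_w  = 40.0;   /* ASSUMPTION: rough AGP slot ceiling */

    /* Whatever the slot cannot deliver must come from the auxiliary
       connectors -- here roughly 90 W, split across the two plugs. */
    printf("Auxiliary power required: ~%.0f W (~%.0f W per connector)\n",
           board_max_w - agp_slot_w, (board_max_w - agp_slot_w) / 2.0);
    return 0;
}
```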





The Quadro FX 4000 SDI is designed for applications that require high-quality, real-time graphics output in Serial Digital Interface (SDI) format. The Quadro FX 4000 SDI also contains the NV40GL GPU and is likewise a dual-board device. The SDI output capability is what sets the Quadro FX 4000 SDI apart. The illustration shows a single DVI-I output, two SDI coaxial video outputs, digital sync, and analog sync. According to our briefing material, it can output at SD or HD resolutions.





NV40GL: New Quadro Pixel Power Plant

The new NV40GL GPU contains significantly more hardware than previous Quadro generations: 16 pixel pipelines, six vertex shader units, and two fragment/pixel shader units per pipeline. NVIDIA claims FP32 pixel shader performance 4-8 times that of the Quadro FX 3000, and twice its geometry and fill-rate performance. We'll look into these performance claims when the final board arrives for us to benchmark.
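Peak fill rate scales as pipelines times core clock, which is why the jump to 16 pipelines matters. A sketch of the arithmetic; the clock below is a purely hypothetical placeholder, since NVIDIA would not divulge the real one:

```c
#include <stdio.h>

int main(void)
{
    const int    pipelines = 16;     /* from the article          */
    const double core_mhz  = 400.0;  /* HYPOTHETICAL placeholder  */

    /* Peak pixel fill rate = pipelines x core clock. */
    printf("Peak fill rate at %.0f MHz: %.1f Gpixels/s\n",
           core_mhz, pipelines * core_mhz / 1000.0);  /* 6.4 here */
    return 0;
}
```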



Compared to the Quadro FX 3000, the FX 4000 will feature:





Support for Microsoft Vertex Shader 3.0 and Pixel Shader 3.0. FP32 format support is now a requirement.



Support for High Dynamic Range (HDR) rendering. NVIDIA is claiming full hardware support for the OpenEXR 16-bit floating-point standard (a bit-level sketch of the format follows this list).



Integrated programmable video processor (meant to support encoding and decoding of MPEG-2/4).



A new rotated-grid sampling full-screen anti-aliasing scheme (RG-FSAA). NVIDIA claims that 4x RG-FSAA is superior in quality to the 8x ordered-grid FSAA in the Quadro FX 3000 (see the sample-layout sketch after this list).
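On the OpenEXR item above: the 16-bit "half" format packs a sign bit, five exponent bits (bias 15), and ten mantissa bits. A minimal decoder sketch in C (infinity/NaN handling omitted for brevity):

```c
#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* Decode an OpenEXR-style 16-bit "half" float:
   1 sign bit, 5 exponent bits (bias 15), 10 mantissa bits.
   Handles normal and subnormal values; inf/NaN omitted for brevity. */
double half_to_double(uint16_t h)
{
    int    sign = (h >> 15) & 0x1;
    int    exp  = (h >> 10) & 0x1F;
    int    frac =  h        & 0x3FF;
    double v;

    if (exp == 0)                         /* subnormal: frac * 2^-24    */
        v = ldexp((double)frac, -24);
    else                                  /* normal: 1.frac * 2^(e-15)  */
        v = ldexp(1.0 + frac / 1024.0, exp - 15);
    return sign ? -v : v;
}

int main(void)
{
    printf("%g\n", half_to_double(0x3C00));  /* 1.0  */
    printf("%g\n", half_to_double(0xC000));  /* -2.0 */
    return 0;
}
```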
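And on the RG-FSAA item: rotating the 2x2 sample grid spreads the four samples across four distinct horizontal and vertical positions instead of two, which is what improves edge quality at the same sample count on near-horizontal and near-vertical edges. The offsets below are illustrative assumptions; NVIDIA does not publish its exact pattern:

```c
/* Sample layouts for 4x FSAA, in pixel units relative to the pixel
   centre. The ordered grid covers only two distinct x (and y)
   positions; the rotated grid covers four, so thin edges are resolved
   more finely. Positions are illustrative, not NVIDIA's actual ones. */
static const float ordered4[4][2] = {
    { -0.25f, -0.25f }, { 0.25f, -0.25f },
    { -0.25f,  0.25f }, { 0.25f,  0.25f },
};

static const float rotated4[4][2] = {   /* 2x2 grid rotated ~26.6 deg */
    { -0.375f, -0.125f }, { 0.125f, -0.375f },
    { -0.125f,  0.375f }, { 0.375f,  0.125f },
};
```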



Like previous performance-oriented Quadro products, the new Quadros are aimed at professional applications that frequently feature multiple rendering windows. The Quadro FX 4000 and Quadro FX 4000 SDI offer additional capabilities over and above the consumer-oriented GeForce 6800 (a short OpenGL sketch follows the list):





Antialiased points and lines for wireframe display.



OpenGL logic operations.



Up to eight clip regions (GeForce 6800 supports one).



Hardware accelerated clip planes.



Memory usage optimization for multiple simultaneous graphics windows.



Support for two-sided lighting.



Hardware overlay planes.



Support for quad-buffered stereo for shutter glasses.
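Several of these features map directly onto OpenGL 1.x state. A minimal sketch, assuming a current GL context and a quad-buffered stereo visual; it is illustrative and not tied to any particular driver:

```c
#include <GL/gl.h>

static const GLdouble clip_eqn[4] = { 0.0, 1.0, 0.0, 0.0 };  /* keep y >= 0 */

void enable_pro_features(void)
{
    glEnable(GL_LINE_SMOOTH);                         /* antialiased lines  */
    glEnable(GL_CLIP_PLANE0);                         /* hardware clip plane */
    glClipPlane(GL_CLIP_PLANE0, clip_eqn);
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);  /* two-sided lighting */
    glEnable(GL_COLOR_LOGIC_OP);                      /* OpenGL logic ops   */
    glLogicOp(GL_XOR);
}

void draw_stereo_frame(void (*draw_eye)(int eye))
{
    glDrawBuffer(GL_BACK_LEFT);     /* quad-buffered stereo: left eye  */
    draw_eye(0);
    glDrawBuffer(GL_BACK_RIGHT);    /* right eye, for shutter glasses  */
    draw_eye(1);
}
```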







The Quadro FX 4000 looks like a significant advance over the FX 3000. We look forward to giving it a spin in a few weeks. The Quadro FX 4000 SDI is not targeted at most DCC users but should interest people in a studio broadcast environment.












With the new slew of consumer-affordable high-end applications that came from Apple yesterday, mainly geared toward studio environments, don't you think Apple needs some of the finest high-end graphics cards to be available for the Power Mac G5?



Any thoughts?

Comments

  • Reply 1 of 12
    hmurchison Posts: 12,437 member
    <drool>
  • Reply 2 of 12
    kraig911 Posts: 912 member
    *crosses fingers* Please release for Mac, please please please. Who can I deliver my first born to? C'mon, I'll take it for the team? jk
  • Reply 3 of 12
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by kraig911

    C'mon, I'll take it for the team? jk



    Ohhhhho ho hoh, you're sick... lol.



    Did you see what it said here:



    "Up to eight clip regions (GeForce 6800 supports one)."



    Whatever that means, I know 8 is better than 1. lol



    And these go to 11.
  • Reply 4 of 12
    T'hain Esh Kelch
    Quote:

    Originally posted by onlooker

    Ohhhhho ho hoh, you're sick... lol.



    Did you see what it said here:



    "Up to eight clip regions (GeForce 6800 supports one)."



    Whatever that means, I know 8 is better than 1. lol



    And these go to 11.




    Quote:

    As visible from the board shot, two auxiliary power connectors are present, as the AGP bus alone does not supply enough current to power the device.



    But 3 aren't better than 1...
  • Reply 5 of 12
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by T'hain Esh Kelch

    But 3 aren't better than 1...



    We'll have to wait for PCIe for that, but I would easily take a Quadro FX 4000 the way it is. I don't think PowerMacs will see PCIe at WWDC. I don't even think we'll see 3GHz PowerMacs at WWDC anymore, and PCIe will probably hold off for "at least" one more revision, maybe more. It's hard to say right now. It would be nice to see Apple lead the way on this one. Not just to bring back old memories of getting there first and pushing the industry forward, but something as simple as adopting PCIe early could get a high-end 3D card manufacturer to take notice and give us a shot. That I would really appreciate. I know a lot of Mac 3D guys would praise whoever did that for us. It would be equivalent to witnessing a miracle. (Maybe a slight exaggeration, but you know what I mean.)
  • Reply 6 of 12
    airsluf Posts: 1,861 member
    Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that, but go ahead and shut down my posting privileges but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. Bullshit flag is flying, I won't abide by lying and coddling of liars who go off-site, create accounts differing in a single letter from my handle with the express purpose to deceive, and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.



    Now I guess they should have banned me rather than just shut off posting privileges, because Kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.



    Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
  • Reply 7 of 12
    programmer Posts: 3,467 member
    Quote:

    Originally posted by AirSluf

    As has been beaten to death in previous threads, the Mac high-end graphics cards are, for all intents and purposes, equivalent to the supposed pro cards, some exceedingly small differences conceded. PC gamer cards are also essentially the same, but with hobbled firmware and drivers. It hasn't been worth the effort for ATi and NVidia to write two versions of Mac drivers. The single biggest difference is a full certification QA on an application-by-application basis.



    Not anymore. Yesterday 3DLabs announced a smokin' new Pro card that impresses even me.
  • Reply 8 of 12
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by AirSluf

    As has been beaten to death in previous threads, the Mac high-end graphics cards are, for all intents and purposes, equivalent to the supposed pro cards, some exceedingly small differences conceded. PC gamer cards are also essentially the same, but with hobbled firmware and drivers. It hasn't been worth the effort for ATi and NVidia to write two versions of Mac drivers. The single biggest difference is a full certification QA on an application-by-application basis.



    That is such BS. There is no equivalent to a Quadro FX 3000, let alone a QFX 3000G, on the Mac side. There isn't even a GeForce FX 5900 available, let alone the Ultra, or the 5950/Ultra.



    So, for all intents and purposes: bullsh*t.



    Now NVIDIA releases the 6800 series, the Quadro FX 4000, and the Quadro FX 4000 SDI. So, for all intents and purposes, we've already missed a batch.



    So don't think for a second that there's no cause for concern for those in need just because there's an NVIDIA GeForce FX 5200 Ultra at the Apple Store and it's magically going to perform as well as a Quadro FX 3000 for me, because it isn't. These cards are tested by Alias on their hardware recommendation page, and it just won't cut it. There is no Mac card available that does.



    Quote:

    Originally posted by Programmer

    Not anymore. Yesterday 3DLabs announced a smokin' new Pro card that impresses even me.



    Comparatively, it's not that impressive. Not one CG site has even done a story on it, so it's funny that the first time I hear about it is on AI, and I read CG sites all day, usually while I'm rendering. It doesn't sound that great anyway; just having 1GB of RAM isn't going to make it that great.
  • Reply 9 of 12
    programmer Posts: 3,467 member
    Quote:

    Originally posted by onlooker

    Comparatively, it's not that impressive. Not one CG site has even done a story on it, so it's funny that the first time I hear about it is on AI, and I read CG sites all day, usually while I'm rendering. It doesn't sound that great anyway; just having 1GB of RAM isn't going to make it that great.



    It's interesting that this card seems to have in spades the "Pro" features that are clamored for, but you're not impressed? I guess you'll just have to hope that Apple uses the new nVidia boards in its next round of PowerMacs.
  • Reply 10 of 12
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by Programmer

    It's interesting that this card seems to have in spades the "Pro" features that are clamored for, but you're not impressed? I guess you'll just have to hope that Apple uses the new nVidia boards in its next round of PowerMacs.



    Quote:

    Original Article at site

    On Monday, graphics hardware developer 3Dlabs Inc. announced the name and technology behind its next-generation 3D card offering, the Wildcat Realizm.



    3Dlabs, which was purchased by Creative Technology Ltd., will not market the Realizm cards to PC users or even high-end gamers, according to Jeff Little, director of marketing at the company. Instead, the Realizm card will challenge Nvidia Corp.'s new Quadro FX4000 chip, as well as other solutions for professional 3D graphics used in special-effects houses and CAD design teams.





    According to Little, the Realizm card combines the "big iron" approach of the company's Intense 3D cards with the programmability of the Wildcat line. 3Dlabs has chosen not to release product-specific details of the cards themselves until the launch date, currently slated around the middle of the year. AGP 8X as well as PCI Express versions will be sold, he said.



    However, the Realizm cards themselves will be designed with CAD benchmarks such as the Viewperf metric in mind, and "we'll be delivering twice the performance of our competitors," Little said.







    Little acknowledged that the upcoming cards haven't been tested against the Nvidia Quadro FX4000 or other competing chips. "But I expect we'll still be head and shoulders above them," he said.



    Like other enterprise 3D chips, the Realizm architecture uses several discrete chips to render the complex scenes used by high-end graphics designers. The Realizm architecture is made up of two chips: the Video Processing Unit, or VPU, and the Vertex/Scalability Unit (VSU), a hybrid AGP bridge and vertex offload engine. Two VPUs can be paired with a single VSU for additional performance, Little said.



    The Realizm line can be used with Direct3D implementations as well as with the OpenGL language 3Dlabs has championed.



    By itself, the VPU is relatively compact, boasting 150 million transistors on a 0.13-micron process. By comparison, Nvidia's recently released GeForce 6800 family includes 222 million transistors on a similar 0.13-micron process.



    Each VPU can support up to 512 Mbytes of physical GDDR-3 memory and 16 Gbytes of virtual memory through a 256-bit interface, meaning that cards with a gigabyte of frame-buffer memory are likely, Little said. All of the 3Dlabs cards will use GDDR-3 memory, Little said.



    Internally, the Realizm graphics engine calculates information using a full 32 bits of precision. However, those results can also optionally be compressed using a scheme called "5s10" to store the 32-bit results with 16-bit precision using only 10 bits of data.



    3Dlabs doesn't describe its architecture in terms of pipelines or texturing units; instead, it calculates 3D data using an array of 16 vertex processing elements, 48 "fragment" or pixel shader engines, and back out through 16 pixel engines for anti-aliasing and compositing functions. Finally, the data is output using a pair of dual-channel DVI connections, to support the latest DVI flat-panel displays. The command processor, visibility processor (to determine which triangles can be "seen" by the user), fragment processor, and pixel processing element will each contain a small number of dedicated registers, or cache.



    The fragment array or effects processing engine can support up to 256,000 lines of individual fragment instructions. "Nvidia and ATI are claiming infinite numbers here," Little said. "That's true if you count looping. If you don't, it's closer to 56,000."



    When combined with the VSU, the two VPU chips automatically turn off their I/O and vertex engines. Inside the VSU, however, is a pair of vertex engines, which pick up the slack and allow the Video Processing Unit to concentrate purely on effects processing.



    Previous two-chip graphics solutions have used one chip to process each line of horizontal resolution, passing off the next line to the other chip and so on, alternating back and forth. Instead, the Realizm chipset divvies up the scene into a "checkerboard" of 64 blocks, and each VPU processes every other block, Little said.
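    A minimal sketch of that checkerboard split, assuming an 8x8 tiling to match the article's "64 blocks" (3Dlabs does not give the actual tile dimensions):

```c
#include <stdio.h>

int main(void)
{
    /* ASSUMPTION: 8 x 8 = 64 blocks; each VPU takes alternating tiles
       in a checkerboard, instead of alternating scanlines. */
    const int tiles = 8;
    for (int y = 0; y < tiles; y++) {
        for (int x = 0; x < tiles; x++)
            printf("%d", (x + y) % 2);  /* 0 = VPU A, 1 = VPU B */
        printf("\n");
    }
    return 0;
}
```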





    I'm not complaining, but 3Dlabs has never been on top, though they have claimed it a few times. If it's available for the Mac, that would be great, but I would still prefer Nvidia. I've never heard a 3D guy give 3Dlabs a positive review.
  • Reply 11 of 12
    airsluf Posts: 1,861 member
    Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that, but go ahead and shut down my posting privileges but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. Bullshit flag is flying, I won't abide by lying and coddling of liars who go off-site, create accounts differing in a single letter from my handle with the express purpose to deceive, and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.



    Now I guess they should have banned me rather than just shut off posting privileges, because Kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.



    Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
  • Reply 12 of 12
    matsu Posts: 6,558 member
    Well, I don't know about 3D, but anyone doing 2D work or video would be stoked to have a 1GB card that does full 32-bit precision. Can you imagine the extent to which that would free up system resources? Dropping it in would be like having a whole new system.