SLI to make a big comeback - should Apple adopt?


Comments

  • Reply 21 of 27
amorph Posts: 7,112 member
Apple has a couple of options here. If they're just using the GPU for its computational abilities, and you're using PCI Express instead of AGP, you can solder a chipset to the board that does nothing but crunch numbers and then leave a slot (or, since PCIe allows this, a port) for a full graphics card to drive the display. This approach scales down, too: long after an FX5200 is uninteresting as a graphics chipset, and once it's nice and dirt cheap, it could still sit on the board, perhaps even integrated into a controller, and spend all its time crunching data for the OS while some newer, shinier GPU powered the display.
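To make that concrete: the 2004-flavor way to "crunch numbers" on a GPU is to run a fragment shader over a full-screen quad and read the results back. Here's a minimal sketch of that trick, assuming GLEW and GLUT for setup; it's an illustration, not anyone's shipping implementation, and a serious version would render to an off-screen pbuffer or FBO.

```c
/* gpgpu_sketch.c: a minimal sketch of using a GPU purely as a number
 * cruncher, 2004-style: run a fragment shader over a full-screen quad,
 * then read the "answers" back with glReadPixels.  Build (Linux):
 *   cc gpgpu_sketch.c -lGLEW -lglut -lGL
 * On Mac OS X the header is <GLUT/glut.h> and you link OpenGL.framework. */
#include <stdio.h>
#include <GL/glew.h>
#include <GL/glut.h>

#define N 256  /* the "dataset" is an N x N grid, one value per pixel */

/* The kernel: each pixel derives a value from its own coordinates.
 * Real GPGPU code would fetch inputs from textures instead. */
static const char *frag_src =
    "void main() {\n"
    "    vec2 p = gl_FragCoord.xy / 256.0;\n"
    "    gl_FragColor = vec4(p.x * p.y, 0.0, 0.0, 1.0);\n"
    "}\n";

int main(int argc, char **argv)
{
    static float result[N * N * 4];

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGBA);
    glutInitWindowSize(N, N);
    glutCreateWindow("gpgpu");   /* we only need the GL context */
    glewInit();

    /* Compile and activate the fragment "kernel". */
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &frag_src, NULL);
    glCompileShader(fs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    glUseProgram(prog);

    /* One full-screen quad = one kernel invocation per pixel.  Drawing
     * straight to the window (rather than an off-screen buffer) keeps
     * the sketch short but is subject to pixel-ownership caveats. */
    glViewport(0, 0, N, N);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 1, 0, 1, -1, 1);
    glBegin(GL_QUADS);
    glVertex2f(0, 0); glVertex2f(1, 0); glVertex2f(1, 1); glVertex2f(0, 1);
    glEnd();
    glFinish();

    /* Pull the computed grid back to the CPU. */
    glReadPixels(0, 0, N, N, GL_RGBA, GL_FLOAT, result);
    printf("corner = %f, center = %f\n",
           result[0], result[((N / 2) * N + N / 2) * 4]);
    return 0;
}
```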



    This ignores SLI, which I think Apple can afford to do for the time being, while using multiple GPUs in the way that Apple seems eager to.



Long term, I think the problem of huge cards like the 6800 can be solved either by waiting for the tech to mature enough not to need all that cooling, or by externalizing the graphics card: PCI Express works just fine over cables, so you could connect the latest and greatest graphics card as if it were a FireWire device.
  • Reply 22 of 27
The next revision of the Power Macs needs PCIe to keep up. I really think SLI needs to be implemented in the very high-end Power Mac:



Dual 3GHz G5: two x16 PCIe slots

Dual 2.5GHz: one x16 PCIe slot

Dual 2.0GHz: one x16 PCIe slot

Dual 2.0GHz: one 8x AGP slot or one x12 PCIe slot



Video designers as well as software developers could use this power to take the burden completely off the main CPU. Of course, with Quartz and Core Image/Video, that's exactly what Apple is doing. Apple needs to stay high-end and at the top of the game.



We also need ATI to jump in and supply X800s!
  • Reply 23 of 27
hmurchison Posts: 12,419 member
I haven't read any information from Nvidia about SLI, but I do have a question.



Say I have two GPU cards installed with 256MB of memory each. Is OS X going to be able to utilize the full 512MB for different data, or are we duplicating the data in each card's memory?



    From nvidia.com



    GPU communication: Featuring an intelligent communication protocol embedded in the GPU, and a high-speed digital interface, NVIDIA SLI-based GPUs can easily communicate with one another without the overhead associated with a bus-only implementation. In addition, unique software algorithms efficiently share the workload to deliver unbelievable performance.





    So the cards are "unique" in that they are special SLI configurations with the bridge logic.
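That "share the workload" line is presumably split-frame rendering: each GPU takes a horizontal band of the screen, and the split point moves frame to frame based on load. Here's a conceptual sketch of the rebalancing idea; this is my own illustration, not NVIDIA's actual algorithm.

```c
/* Conceptual split-frame rendering balancer -- NOT NVIDIA's algorithm,
 * just the idea: resize each GPU's band of scanlines according to how
 * long it took on the previous frame. */
#include <stdio.h>

/* Return the scanline where GPU0's (top) band should end, given last
 * frame's render times.  The slower GPU gets a smaller band. */
static int rebalance(double gpu0_ms, double gpu1_ms, int frame_height)
{
    double share = gpu1_ms / (gpu0_ms + gpu1_ms); /* GPU0's fair share */
    int split = (int)(share * frame_height);
    if (split < 1) split = 1;
    if (split > frame_height - 1) split = frame_height - 1;
    return split;
}

int main(void)
{
    /* Suppose GPU0 took 12.5ms and GPU1 took 10.0ms on the last frame. */
    int split = rebalance(12.5, 10.0, 768);
    printf("GPU0 renders lines 0-%d, GPU1 renders lines %d-767\n",
           split - 1, split);
    return 0;
}
```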



    Will all board manufacturers support it (SLI connector)?

Every high-end PCI Express reference board from NVIDIA will support SLI. It will be up to the add-in-card partner to determine if they follow NVIDIA's reference design. The current products that support this are the GeForce 6800 Ultra, GeForce 6800 GT, and Quadro FX 3400.




Nice. Nvidia wants to spur adoption: every high-end reference board will be SLI-enabled as long as it's PCI Express, and the unified drivers will support SLI or standard cards.







    Is the frame-buffer memory shared, i.e., can I access a 512MB frame-buffer if I have two GPUs with 256MB each?



    No, each GPU maintains its own frame-buffer. Rendering data, such as texture and geometry information, is duplicated.




That answered my question; I found this in the Developer FAQ.
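So doubling the cards doesn't double the memory your data can live in. A quick back-of-envelope sketch; the per-card frame-buffer reservation below is a made-up figure for illustration.

```c
/* SLI memory arithmetic: textures and geometry are duplicated on both
 * cards, so usable data memory is bounded by ONE card, not the sum.
 * The 32MB frame-buffer reservation below is a made-up illustration. */
#include <stdio.h>

int main(void)
{
    const int per_card_mb = 256; /* memory on each card */
    const int cards       = 2;   /* SLI pair */
    const int framebuf_mb = 32;  /* hypothetical per-card frame-buffer share */

    printf("installed: %dMB\n", per_card_mb * cards);   /* 512MB bought... */
    printf("usable for unique data: ~%dMB\n",           /* ...~224MB seen  */
           per_card_mb - framebuf_mb);
    return 0;
}
```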





So, from my fledgling mind, I hope to see the future unfold like this:



1. SLI takes off because it's the easiest way to increase speed without spending gobs of money.



2. GPU process shrinks eventually allow dual GPUs on one card while SLI capability remains, giving us quad-GPU processing. Frame-buffer coherency is added so that the two GPUs can share, say, a 512MB frame buffer, and in an SLI configuration as well, albeit at a slight performance hit.



3. Power and heat problems create a radical shift in video processing. Soon you have a high-speed graphics port on your workstation that lets you hook up a 2U rack holding up to four cards, easing power and heat requirements.



This might sound crazy, but when you consider that something like an Art PURE PCI ray-tracing card costs $3,500, you can easily see the value in a bank of GPUs, with their easier programmability, replacing that card.



OK, I was skeptical at first, but this SLI idea is pretty damn cool. I don't think Apple is going to hop on board for at least another year, but I think when they go PCIe they will be looking at this technology. After all, they did go DVI, so the door is wide open.
  • Reply 24 of 27
    Quote:

    Originally posted by hmurchison

1. SLI takes off because it's the easiest way to increase speed without spending gobs of money.



Buying two graphics cards is still quite expensive!



    Quote:

3. Power and heat problems create a radical shift in video processing. Soon you have a high-speed graphics port on your workstation that lets you hook up a 2U rack holding up to four cards, easing power and heat requirements.



I definitely follow you on this one: progressively, graphics cards have grown in complexity, becoming a "mobo on the mobo", a computer within a computer if you prefer. Which is very neat conceptually, but a real pain where heat is concerned. With PCIe allowing the graphics "card" to be plugged in somewhere other than the mobo, I believe we'll go one step further in that whole "informatics recursivity" thing, which, in powerful graphics workstations, will definitely lead to what hmurchison told us about.
  • Reply 25 of 27
hmurchison Posts: 12,419 member
    The One to Rescue



Expensive, I know. My first reaction was to blow SLI off as a gamer's thing, but then I started to think about all those people who have seats of Maya or XSI or Max. They can justify spending $2,000 for a Quadro FX 4000, and sometimes augment that with render farms, which can easily run $10k-plus.



I'm sure Nvidia would love to tap into those dollar$. The question is: how many GPUs would it take to replace the high-end DSP render cards from Art and others?



Also, GDDR3 memory is expensive. Since the frame buffers merely duplicate the same data, will Nvidia be able to ship retail configurations where the GDDR3 memory is only on one card? Ahhh, I can't even find a page that'll give me an idea of how much 256MB of GDDR3 memory would cost. I'm thinking it's probably around $300 or so, but I could be way off.
  • Reply 26 of 27
    Quote:

    Originally posted by hmurchison

Ahhh, I can't even find a page that'll give me an idea of how much 256MB of GDDR3 memory would cost. I'm thinking it's probably around $300 or so, but I could be way off.



According to a teacher of mine who works in the semiconductor industry, memory is about 40-60% of the cost of a graphics card, so $300 must be a good guess.
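As a rough cross-check (the $499 below is my assumption, roughly what a 6800 Ultra listed for at launch, not a quoted price):

```c
/* Back-of-envelope check of the 40-60% rule of thumb above.  The $499
 * card price is an assumed figure for a high-end card, not a quote. */
#include <stdio.h>

int main(void)
{
    const double card_price = 499.0;
    printf("memory at 40%%: $%.0f\n", card_price * 0.40); /* ~$200 */
    printf("memory at 60%%: $%.0f\n", card_price * 0.60); /* ~$300 */
    return 0;
}
```

Which brackets the $300 guess nicely.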
  • Reply 27 of 27
onlooker Posts: 5,252 member
    Quote:

    Originally posted by zpapasmurf

    You figured wrong.



If you look at the benchmarks, the 6800 Ultra gets outperformed by the X800 on 90% of the tests run.







How do you figure? There is no DX9 on OS X, and ATI's OpenGL is tragic. Those statistics only apply to DX9, and we don't use it. I doubt we'll ever see iDX9, or OS-X-DX9. Why would we? I'm not as impressed with DX tech as with SGI's OpenGL. OpenGL is solely a graphics library, which is exactly what it should be. Apple already has Audio Core, or Core Audio (I forget which direction), and now they introduce Image Core, or Core Image (whatever). What Apple has in the Mac arsenal for games and graphics (except graphics cards) looks much better than what x86 and DX10 are going to have in the future.

Xcode delivers OpenGL technology, tools, and shader documentation to Mac users, and the fact that Core Image is reliant on OpenGL makes me think Apple will be sticking with OpenGL for quite some time. The Mac OS has always rallied around OpenGL.
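For what it's worth, checking which vendor's OpenGL driver you're actually benchmarking is a three-liner; a minimal sketch, assuming GLUT only for context creation:

```c
/* Print the OpenGL vendor/renderer/version strings -- worth doing before
 * quoting benchmark numbers.  GLUT is used only to get a GL context;
 * on Mac OS X include <GLUT/glut.h> instead. */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("glinfo");
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```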

Nvidia's mid-range graphics cards outperform ATI's high-end offerings in OpenGL benchmarks despite being lesser cards and technologies, and Nvidia's high-end cards kick the crap out of ATI's on 100% of the tests run.

So I wouldn't get that hyped about those ATI cards. Chances are you'll never see the type of performance out of them that you think you will. If you want performance on the Macintosh, stick with Nvidia. They used to be the soul of SGI's integrated graphics motherboard designs in their super-high-end workstations, and they come correct. WERD!



    -----------------------------------------------------------------



    Quote:

    Thread original title

    SLI to make a big comeback - should Apple adopt?






    YES!