The Return of SLI!

Posted in Future Apple Hardware, edited January 2014
I just stumbled across this on the net and my jaw nearly hit the floor.



Now this is the kind of innovation that Apple needs to adopt to generate some excitement in their hardware again. Ars posted a blurb on this here as well.



In short, the new Alienware ALX has two PCI Express graphics card slots (x16, I assume) where you can plug in *any* two video cards (they don't even have to be from the same manufacturer!!) and have them chug away in parallel. One card renders one half of the screen while the other handles the other half, and the system dynamically allocates the processing load between them. This is a little different from the old Voodoo SLI style of rendering alternate lines. Talk about a win-win SLI solution; they've already demonstrated an alpha version of this machine at E3 and expect to have it available Q3-Q4 this year.
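
For a rough sense of how that dynamic allocation could work, here's a small sketch (entirely my own illustration, nothing from Alienware's actual drivers): each frame you time both cards and nudge the split line toward whichever one is running behind.

    # Sketch of dynamic split-screen load balancing across two GPUs.
    # Illustrative only; nothing here reflects Alienware's real drivers.
    def rebalance(split, time_top, time_bottom, gain=0.05):
        """Nudge the split line (fraction of the screen given to the top card)
        toward the slower card, based on the last frame's render times in ms."""
        total = time_top + time_bottom
        if total == 0:
            return split
        imbalance = (time_top - time_bottom) / total
        # Shrink the top card's region if it took longer than the bottom card's.
        return min(0.9, max(0.1, split - gain * imbalance))

    # Example: the top card is slower (18 ms vs 12 ms), so its slice shrinks a bit.
    split = rebalance(0.5, time_top=18.0, time_bottom=12.0)   # ~0.49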



There are still some questions that need to be answered, however: what is the real-life performance boost, and what does display quality look like when one card has rendering features the other doesn't? That said, if the tech proves itself you can bet it will be adopted very quickly by the rest of the PC industry.
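
On the real-life performance question, here's a crude way to frame it (my own illustrative numbers, nothing official): only the GPU-bound portion of each frame gets split across the two cards, so the payoff depends on how GPU-limited the game actually is.

    # Amdahl-style estimate of the gain from splitting GPU work across two
    # cards. Illustrative assumptions only; not from any published benchmarks.
    def speedup(gpu_fraction):
        # Only the GPU-bound fraction of frame time is halved; the CPU,
        # driver and synchronisation overhead are not.
        return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / 2.0)

    for p in (0.5, 0.75, 0.9):
        print(f"GPU-bound {p:.0%}: {speedup(p):.2f}x")
    # GPU-bound 50%: 1.33x, 75%: 1.60x, 90%: 1.82x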



Apple!



Adopt this!



Adopt this now!!!



C.

Comments

  • Reply 1 of 18
    concord Posts: 312 member
    Here is an interview with Alienware which has more details on this new system. It's water cooled, has a 700+ watt PSU, and is dual processor to boot. Cha-ching $$$$!



    C.
  • Reply 2 of 18
    hmurchison Posts: 12,459 member
    Alienware, the new kings of the useless PR release.



    Voodoo cards in SLI configuration helped back in the days when cards were far more limited than they are today. There is no need to run two GeForce 6800s or X800 XTs.



    This is also a proprietary method which would mean games would have to be coded for it. I'll give you 3 to 1 odds this thing dies a quiet death in a year or two.



    Remember the Bitboys and their grandiose claims about 3D performance using embedded DRAM? We've all heard it before.



    Today's games are not limited by fill rate but rather by (high-quality) textures and shading. Attempting SLI for today's market is spinning your wheels.
  • Reply 3 of 18
    stoo Posts: 1,490 member
    Remember, two 6800s require four Molex hard-drive-style power connectors. Still, non-interleaved dual PCI Express sounds handy anyway.
  • Reply 4 of 18
    concord Posts: 312 member
    Quote:

    Originally posted by hmurchison:

    Voodoo cards in SLI configuration helped back in the days when cards were far more limited than they are today. There is no need to run two GeForce 6800s or X800 XTs.



    There are games on the near horizon that will tax even the best graphics cards: Doom III, HL2, Stalker and so on. Also, there could be some pro advantages as well. What I see as one of its benefits is that if you upgrade to a new vid card, you can keep your old one in the second slot and it will still provide some benefit.

    Quote:

    This is also a proprietary method which would mean games would have to be coded for it. I'll give you 3 to 1 odds this thing dies a quiet death in a year or two.



    I haven't heard anything suggesting that games will have to be specifically coded for it. If anything, the interview link I posted seems to suggest just the opposite.

    Quote:

    Remember the Bitboys and their grandiose claims about 3D performance using embedded DRAM? We've all heard it before.



    Bitboys never produced anything at all as far as I can tell. While I admit we still need to see this in action before making any concrete statements, it's not as though Alienware is just some two-bit company out of nowhere; they are *the* player in the PC enthusiast market. If they think it can be done, I'm willing to extend them a little slack.

    Quote:

    Today's games are not limited by fill rate but rather by (high-quality) textures and shading. Attempting SLI for today's market is spinning your wheels.



    We'll know soon enough if this boasting holds any water...



    "Kevin Wasielewski - 3dMark 2003, SPECviewperf, UT 2003, FarCry, HL2; Performance gains from Alienware's Video Array are projected at 50% or greater over present single-GPU video cards. Again, the solution scales as new graphics technology becomes available, so that performance gain should hold in future as well. Absolute performance gains will depend on the specific configuration of the system being tested and the benchmark used."



    Cheers,



    C.
  • Reply 5 of 18
    hmurchison Posts: 12,459 member
    Quote:

    What I see as one of its benefits is that if you upgrade to a new vid card, you can keep your old one in the second slot and it will still provide some benefit.



    If Alienware can make that work, fine. Voodoo's SLI process required identical cards.





    Yes, I'm assuming for the time being that games might have to be coded specifically for it. Alienware is applying for a patent on this, which means it's unlikely that this is a feature of the PCIe spec itself.



    I'll be interested in seeing just how this works out. I'm hoping for the best. Some well-heeled gamers will love it.
  • Reply 6 of 18
    mmmpie Posts: 628 member
    It's not like 3dfx would have released a card with a feature that let you use any other card you happened to own. They made cards; that's what they did, and they wanted you to buy another very expensive Voodoo II.



    Alienware has no such business need. They want you to buy their machine, and if you already have a video card you can leverage it. This is the enthusiast market; they will love it.
  • Reply 7 of 18
    concord Posts: 312 member
    Quote:

    I'll be interested in seeing just how this works out. I'm hoping for the best. Some well-heeled gamers will love it.



    Though really, my thoughts in posting this were more along the lines of, "Is this a direction Apple should consider adopting?" We don't see that much in the way of "bleeding edge" hardware design out of Apple these days. It would be nice if they would take the initiative and run with something to generate a little more excitement on the hardware side of things.



    Quieter PSUs?

    Silent, water-cooled Powermacs?

    More wireless peripheral options?

    Two-button mice?





    Cheers,



    C.
  • Reply 8 of 18
    hmurchison Posts: 12,459 member
    Concord, I wish. Apple has always had an aversion to games. They seem to have wanted to ensure that the Mac would never be perceived as a toy. All the while, the PC developed a great cottage industry in gaming that bloomed.



    Apple has made so many blunders it's amazing that they're still around. Gaming is the one area that Apple could enter and not piss off other developers. Gaming ensures that Mac users have forward momentum on tech like OpenGL and sound features.



    Apple could have made a difference here. I don't think the platform will ever be that good for games. But I do look with excitement at the games pushing the envelope on the PC.
  • Reply 9 of 18
    amorph Posts: 7,112 member
    This is not unlike IBM's (and others') technology for powering very high resolution screens, except that Alienware is taking advantage of PCI Express to offer a significantly more elegant implementation.



    The great advantage of this generally is that it gets around the resolution limitations of DVI, which we're rapidly approaching, without requiring a new standard or a new connector. There are at least some circumstances where this could be really advantageous for Mac users - we might not play games much, as a group, but we have been chewing up screen real estate for years and years.
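
    For a rough sense of that ceiling (approximate numbers, assuming the ~165 MHz single-link TMDS pixel-clock limit and a typical allowance for blanking), here's a quick check of which common modes still fit on a single link:

        # Rough check of where single-link DVI runs out of headroom.
        # Approximate: real timings depend on the blanking standard in use.
        SINGLE_LINK_MAX_MHZ = 165.0   # single-link TMDS pixel-clock ceiling
        BLANKING_OVERHEAD = 1.12      # rough allowance for blanking intervals

        def pixel_clock_mhz(width, height, refresh_hz):
            return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

        for w, h in ((1600, 1200), (1920, 1200), (2560, 1600)):
            clock = pixel_clock_mhz(w, h, 60)
            verdict = "fits" if clock <= SINGLE_LINK_MAX_MHZ else "exceeds a single link"
            print(f"{w}x{h}@60: ~{clock:.0f} MHz, {verdict}")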



    All in all an interesting technology, and it sounds like a robust implementation.



    Does anyone know where DVI 2, or whatever is intended to replace DVI, is right now? Those parts of the Digital Displays Working Group home page that are accessible say nothing, and VESA doesn't seem to have anything either.
  • Reply 10 of 18
    airsluf Posts: 1,861 member
    Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that, but go ahead and shut down my posting privileges but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. Bullshit flag is flying, I won't abide by lying and coddling of liars who go off-site, create accounts differing in a single letter from my handle with the express purpose to deceive and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.



    Now I guess they should have banned me rather than just shut off posting privileges, because kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.



    Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
  • Reply 11 of 18
    oldmacfan Posts: 501 member
    I can't see having two video cards powering one monitor without the monitor having multiple inputs.



    My hope is someone can take this technology and allow me to have a gamer card, and another type of card powering a multitude of different displays or projectors.
  • Reply 12 of 18
    airsluf Posts: 1,861 member
    Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that, but go ahead and shut down my posting privileges but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. Bullshit flag is flying, I won't abide by lying and coddling of liars who go off-site, create accounts differing in a single letter from my handle with the express purpose to deceive and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.



    Now I guess they should have banned me rather than just shut off posting privileges, because kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.



    Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
  • Reply 13 of 18
    mmmpie Posts: 628 member
    The 3dfx system combined two signals into one cable, and I expect that the Alienware system will have to do something similar, with some sort of built-in DVI combiner.



    This isn't a solution to the bandwidth problem for DVI. The two cards can each only use half of the DVI bandwidth (so that the output can be combined into one DVI cable).



    IBM has hi-res monitors which require two or more video inputs to drive. This has been problematic, but NVIDIA's latest chips allow the video outputs to be synchronised so that they don't have artifacts when used to drive the same display. However, you still need to plug in two or four cables to make it work.
  • Reply 14 of 18
    aquatic Posts: 5,602 member
    Quote:

    Voodoo cards in SLI configuration helped back in the days when cards were far more limited than they are today. There is no need to run two GeForce 6800s or X800 XTs.



    No one will ever need more than 640k of RAM!!



    And computers get bigger... It seems like mainstream desktops keep getting bigger even as subnotebooks and Mini/Micro-ATX get smaller. Ever noticed this? Classic Mac, IIsi. iMac, Power Mac. Power Mac G5, 20" iMac FP. Duo to 17" PB.
  • Reply 15 of 18
    talksense101 Posts: 1,738 member
    A PlayStation 2 or an Xbox is a better gaming rig.



    I personally think 750 watts is too much power consumption. Wouldn't it be better if graphics card manufacturers used two GPUs on their boards and took care of the details that Alienware is now pushing onto other devices and drivers?
  • Reply 16 of 18
    ti fighter Posts: 863 member
    Quote:

    to significantly enhance the performance of graphics intensive applications including extreme gaming





    haha extreme gaming oooooh
  • Reply 17 of 18
    stecs Posts: 43 member
    Multiple cards might allow 3D projection at reasonable resolution.
  • Reply 18 of 18
    msantti Posts: 1,377 member
    Alienware is just throwing more silicon at the problem.



    And it's really not that practical for anyone but the utmost in geekdom.