Comments
What are you talking about, two downgraded 16X PCI-E slots? Where are your facts?
I think you're referring to the original nForce4 PCI-E setup that Intel was using, which split one 16X slot.
If you look at the first AMD version of the nForce SLI, they have been using a dual chipset:
- NVIDIA nForce Professional 2200 &
- NVIDIA nForce Professional 2050 (I/O-4) - connected to CPU 2
That's two x16 PCI Express FULL SPEED slots:
- Slot1 PCI-E x16 from nForce Prof. 2200
- Slot3 PCI-E x16 from nForce Prof. 2050
So how is that getting downgraded to 8x?
Hi onlooker, I did more research. Yes, each of the two PCI Express slots is theoretically a full x16; SiSoft Sandra confirms this on my machine to some degree. I imagine if I had two video cards driving two monitors separately, that might work in the 2 x 16 PCI Express mode.
However, on most boards, for SLI mode (e.g. my Asus A8N-SLI), there is a hardware 'paddle' card (like a big jumper of sorts) that you flip around to convert the two x16 slots into two x8 slots when setting up SLI. Why they did this I am not sure, but I am sure NVIDIA had their reasons for setting it up this way on the initial SLI rollout.
Full 2 x 16 SLI will be supported in the next revisions of the nForce4 SLI 16x boards, as mentioned in this AnandTech article:
http://www.anandtech.com/cpuchipsets...oc.aspx?i=2493
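In case anyone wants to check the negotiated width without Sandra: on a Linux box the kernel exposes it in sysfs. A minimal sketch of reading it (my own illustration using the standard Linux attribute files, not something from this thread; the poster's Sandra check on Windows is the equivalent):

# Minimal sketch: print negotiated vs. maximum PCI Express link width for
# every PCIe device, straight from Linux sysfs. Useful for confirming
# whether a slot is really running x16 or has dropped to x8 in SLI mode.
from pathlib import Path

def pcie_link_widths():
    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        cur = dev / "current_link_width"
        cap = dev / "max_link_width"
        if cur.exists() and cap.exists():  # legacy PCI devices lack these files
            yield dev.name, cur.read_text().strip(), cap.read_text().strip()

for addr, current, maximum in pcie_link_widths():
    print("%s: running x%s of a possible x%s" % (addr, current, maximum))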
Originally posted by sunilraman
Originally posted by MacRonin
I would have to go with the somewhat hot twins...
ah... and therein lies the marketing appeal of SLI.
Are the twins in the market too?
Originally posted by onlooker
What are you talking about, two downgraded 16X PCI-E slots? Where are your facts?
I think you're referring to the original nForce4 PCI-E setup that Intel was using, which split one 16X slot.
If you look at the first AMD version of the nForce SLI, they have been using a dual chipset:
- NVIDIA nForce Professional 2200 &
- NVIDIA nForce Professional 2050 (I/O-4) - connected to CPU 2
That's two x16 PCI Express FULL SPEED slots:
- Slot1 PCI-E x16 from nForce Prof. 2200
- Slot3 PCI-E x16 from nForce Prof. 2050
So how is that getting downgraded to 8x?
When you DO SLI... the slots are downgraded from 16x to 8x.
Quote about the NVIDIA nForce4:
Last on the list is the top-end nForce 4 SLI, which includes all the Ultra edition's qualities, but adds dual graphics card support. The SLI's PCI Express 16x interface is flexible, allowing its user to mount a single graphics card under 16x mode, and will allow two PCI Express graphics cards running under 8x mode. As the 8x PCIe is still good for 4GB/s bi-directional bandwidth, there will be no performance bottleneck with today's graphics cards.
It was a cached quote on Google. Finding the original source...
The reason is that the nForce4 chipset only has 20 PCI Express lanes. Using SLI, the two cards must share these lanes off the north bridge, so the lanes must be divided in half, which in effect downgrades each slot's bandwidth to 8x. Now, some dual-Opteron boards can handle dual 16x buses because there are in fact two chipsets on those boards. But when it comes to regular Athlon 64 systems... they are dual 8x. So yes, you can get it with a dual-chipset board, like the ones you listed... but who will fork out the money for those?
Also, CrossFire will only support dual 8x.
Is it really worth it for a dual chipset? To rendering farms... yah... but consumers... not even close... and that translates into fewer developers working on SLI, in turn... less support.
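For what it's worth, the bandwidth numbers in that quote check out with quick arithmetic. A minimal sketch (my own back-of-the-envelope figures, assuming first-generation PCI Express at roughly 250 MB/s of payload per lane in each direction):

# Sanity check of the x16 vs. x8 bandwidth figures quoted above.
# Assumption: PCIe 1.x at 2.5 GT/s per lane with 8b/10b encoding,
# i.e. about 250 MB/s of usable bandwidth per lane, per direction.
PER_LANE_MB_S = 250

for lanes in (16, 8):
    one_way = lanes * PER_LANE_MB_S   # MB/s in one direction
    both_ways = 2 * one_way           # the "bi-directional" figure reviews quote
    print("x%d: %.0f GB/s each way, %.0f GB/s bi-directional"
          % (lanes, one_way / 1000.0, both_ways / 1000.0))

# x16: 4 GB/s each way, 8 GB/s bi-directional
# x8:  2 GB/s each way, 4 GB/s bi-directional

So the review's "4GB/s bi-directional" is simply the x8 figure counted in both directions.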
Originally posted by sunilraman
Hi onlooker, I did more research. Yes, each of the two PCI Express slots is theoretically a full x16; SiSoft Sandra confirms this on my machine to some degree. I imagine if I had two video cards driving two monitors separately, that might work in the 2 x 16 PCI Express mode.
However, on most boards, for SLI mode (e.g. my Asus A8N-SLI), there is a hardware 'paddle' card (like a big jumper of sorts) that you flip around to convert the two x16 slots into two x8 slots when setting up SLI. Why they did this I am not sure, but I am sure NVIDIA had their reasons for setting it up this way on the initial SLI rollout.
Full 2 x 16 SLI will be supported in the next revisions of the nForce4 SLI 16x boards, as mentioned in this AnandTech article:
http://www.anandtech.com/cpuchipsets...oc.aspx?i=2493
That's not the next board for anything other than Intel processors. That's been out for the Opteron forever.
It sounds like you're using one of the early boards, or an Athlon or Intel version. The board I am referring to was available as soon as the Opteron motherboard was available with SLI. It's been, like, over a year.
Even that AnandTech article only mentions Intel, and Dell not using AMD anytime soon. I think it's a shifty article that is trying to lump all the SLI boards in with the original Intel version's problem. I knew about this way before the first Intel SLI board was released, and I also knew it was done properly when the Opteron version came out.
Originally posted by emig647
Is it really worth it for a dual chipset? To rendering farms... yah... but consumers... not even close... and that translates into fewer developers working on SLI, in turn... less support.
Gonna nitpick a little here.
In a rendering farm you don't necessarily even need a video card in any of the boxes; you can SSH in and render, and the CPUs do the work... What you're thinking about is a graphics workstation.
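To make that concrete: a render node only needs CPU time and an SSH daemon, no GPU at all. A minimal sketch of farming frames out from a workstation (the hostnames, the shared scene path, and the choice of Blender's command-line renderer are all assumptions for the example, not details from this thread):

# Naive sketch of CPU-only network rendering: push one frame per job to a
# few render nodes over SSH and wait for them all to finish.
import subprocess

NODES = ["node01", "node02", "node03"]        # hypothetical farm hosts
SCENE = "/shared/projects/shot42.blend"       # hypothetical scene on shared storage
FRAMES = range(1, 31)

jobs = []
for i, frame in enumerate(FRAMES):
    host = NODES[i % len(NODES)]              # crude round-robin scheduling
    cmd = ["ssh", host, "blender", "-b", SCENE, "-f", str(frame)]
    jobs.append(subprocess.Popen(cmd))        # Blender renders the frame headless

for job in jobs:
    job.wait()                                # block until every frame is done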
Originally posted by Gon
Gonna nitpick a little here.
In a rendering farm you don't necessarily even need a video card in any of the boxes; you can SSH in and render, and the CPUs do the work... What you're thinking about is a graphics workstation.
That's correct. What SLI offers a workstation is real-time, on-screen rendering in your normal working, or creation, environment. Moving, manipulating, and generally forging millions of polygons is seriously taxing on a system's graphics, and that is even before shading.
Originally posted by Gon
I think SLI has potential value to offer in gaming - exactly when the cards can pull it off totally transparently. The only thing is, they are probably never going to be able to do that, due to the crazy optimization that goes down in games. And there's a network effect here: as long as only 1% of the market has SLI, you really can't put in a whole new level of eye candy for that crowd. They'll get more anti-aliasing, more resolution, and that's it. And to take advantage of the added resolution, you need a high-resolution screen. Which, again, few will afford.
The nicest thing you can do while tinkering with a system is to silence it, and then overclock if you still can after silencing. It doesn't cost much, either... of course you can go to extremes in this, like in everything else.
Right now gaming is the majority of the SLI market, but given where gaming stands on the Mac, it would be better suited, and touted, as a high-end 3D workstation solution than as a gaming machine. Games are typically pretty limited, and late, on Macs.
Originally posted by onlooker
Right now gaming is the majority of the SLI market, but given where gaming stands on the Mac, it would be better suited, and touted, as a high-end 3D workstation solution than as a gaming machine. Games are typically pretty limited, and late, on Macs.
I'm willing to bet that will all change soon enough.
I wouldn't get my hopes up too high, but I think there will be a few more. I really don't see Macs having an equal number of games any time soon. Even if Apple can acquire a 15% share of computer users when it goes Intel, that is still a limited amount, because what percentage of that 15% are gamers?
Although, in today's world, just about all kids are gamers. But with the next-gen consoles I see PC game playing and programming declining, and console programming being offered at more tech schools in the near future. Heck, I already have a down payment on an Xbox 360, and I doubt I'll be playing on anything other than that or my PS3 when it comes out.
Which, more or less, is why I think SLI will probably be pushed to mature more rapidly in the direction of a pro graphics oriented solution in the immediate future.
Nobody is going to switch to OS X for games, and here are 3 reasons why:
1) Not enough games, due to:
---A. Crap video cards included in most systems (so porting is pointless, because even most Mac users can't play many games)
---B. Mac OS X has a small market share
2) OS X is MUCH slower than Windows for gaming... don't blame me! It's been benchmarked!
3) An expensive box like a Mac can't really compete with one of these new consoles (which, btw, will arrive sooner than the Intel Macs)
Since Apple is being fairly single-minded (mind on the money and the money on the mind), there's no way in hell they're going to let other computer companies make boxes for them. That would take care of problems 1A and 3, or half the problem.
But Mac users and gamers never did like each other anyway... me being both, I feel like Jonathan Swift.
But Mac users and gamers never did like each other anyway... me being both, I feel like Jonathan Swift.
Well, I got an AMD/nVidia 6600GT (cybermonkey, where's my poem - I'll keep asking till I get it) and got this whole 'Macs suck for games, Macs don't have good GPUs' stuff out of my system and satisfied my *curiosity*... I feel like I've been unfaithful, but I've always gone both ways, in myriad senses of the word.
Originally posted by slughead
2) OS X is MUCH slower than Windows for gaming... don't blame me! It's been benchmarked!
I think that is because Apple needs a better understanding between OpenGL and the OS. I think there will be some big changes in the future with Intel Macs and graphics, which will lead to a new way the OS structures its OpenGL for gaming and app-specific graphics. It seems the OS has its OpenGL built deep within to complement the OS's own features, which is great, but they need both to satisfy the end user sufficiently and completely.
Originally posted by onlooker
I think that is because Apple needs a better understanding between OpenGL and the OS. I think there will be some big changes in the future with Intel Macs and graphics, which will lead to a new way the OS structures its OpenGL for gaming and app-specific graphics. It seems the OS has its OpenGL built deep within to complement the OS's own features, which is great, but they need both to satisfy the end user sufficiently and completely.
Perhaps the problems OS X on PPC was having with creating and releasing threads, which appear to be solved on Intel, will be a big step forward for Mac gaming - perhaps a huge step. I believe the folks who say OS X on Intel 'feels' quicker are seeing this, because when those same boxes are thrown a huge job they crumble; for those who like benches, check out how bad OS X on PPC is at threads, while everything else is fine. Games should do well if this problem has been solved on the Intel hardware.
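The sort of thread benchmark being alluded to is easy to sketch. A minimal version (my own illustration, not whatever benchmark the poster had in mind) that just times how quickly the OS can create and tear down short-lived threads:

# Minimal thread creation/teardown micro-benchmark.
import threading
import time

def noop():
    pass  # the thread does no work; we only measure create + start + join

N = 2000
start = time.perf_counter()
for _ in range(N):
    t = threading.Thread(target=noop)
    t.start()
    t.join()
elapsed = time.perf_counter() - start
print("%d create/join cycles in %.2fs (%.0f threads/sec)"
      % (N, elapsed, N / elapsed))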
Dual nVidia 6800GTs on one beastly card. YEAH.
http://www.3dvelocity.com/reviews/gigabyte3d1/gv3d1.htm
"...It IS big and it IS clever, but it's also heavy, occasionally loud and choosy about where it lives..."
[Images from the review: the card compared to a Crucial X850 XT Radeon; the chunky heatsink, as we predicted; and the front side of the card with chips and heatsink removed.]
Well, I am tending towards the other way. Here are the challenges with the above 6800GT dual and 6600GT dual from Gigabyte:
1. No distinct improvement over two separate SLI'ed cards in benchmarks (it benches the same as two separate SLI'ed cards).
2. You can only use a Gigabyte motherboard, AFAIK.
3. The dual card is big, heavy and noisy compared with, say, two MSI 6600GTs, which have the 'copper flower, big fan' type of heatsink design that IMHO is the best and quietest design around (e.g. my Zalman pure-copper flower big-fan CPU cooler).
Admittedly, the dual-GPU card is slightly cheaper per GPU than two cards...
Edit: the dual GPUs on one card are still only the equivalent of dual x8 through a single x16 PCI Express slot (which one would expect, of course, since there is no x32 PCI Express slot at the moment).