
Screw SLI, when do we get this in our Macs...?!? - Page 2

post #41 of 162
for sure the nvidia chip can be used on a single card, i saw it. nvidia itself says it's not their primary focus, but encourages third parties to explore that direction. they talk about more than 40-chip SLI, also on a single card. that could be really impressive
post #42 of 162
Quote:
Originally posted by MacRonin
....Damn it! I want both of my 16x slots running at 16x!...

here you go: 2 PCIexpress 16x FULL SPEED slots
http://www.tyan.com/products/html/thunderk8we.html

this will filter into the SLI mainstream; athlon64 (not opteron) and intel pentium boards have been announced for 2006, or maybe even christmas 2005

re: this stuff not being available on a mac: for US$1000 you can get all this out of your system, then go back to using a mac most of the time, knowing what you're (not) missing out on
post #43 of 162
Quote:
Originally posted by gelosilente
for sure the nvidia chip can be used on a single card, i saw it. nvidia itself says it's not their primary focus, but encourages third parties to explore that direction. they talk about more than 40-chip SLI, also on a single card. that could be really impressive

at the moment cooling might be a problem though. two cards allow for using existing shipping chips with two fans, one for each GPU. i could be stretching here, but a dual-core 7800GTX would require a bigass CPU-heatsink-fan type thing, which is hard to fit on a vga board.
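
(Rough back-of-envelope numbers, purely illustrative assumptions rather than any vendor spec: if one G70-class GPU plus its memory dissipates on the order of 120 W, a dual-GPU card has to dump well over 200 W through a single slot's worth of cooler, which is CPU-heatsink territory.)

Code:
# Back-of-envelope heat budget for a hypothetical dual-GPU card.
# Every wattage figure below is an assumption for illustration, not a vendor spec.

GPU_TDP_W = 100            # assumed draw of one G70-class GPU
MEM_TDP_W = 20             # assumed draw of that GPU's GDDR3 memory
SLOT_COOLER_LIMIT_W = 150  # rough ceiling for a typical dual-slot axial cooler (assumption)

def card_heat_watts(gpu_count: int) -> int:
    """Total heat a card with gpu_count GPUs has to shed, in watts."""
    return gpu_count * (GPU_TDP_W + MEM_TDP_W)

for n in (1, 2, 4):
    heat = card_heat_watts(n)
    verdict = "fits a stock cooler" if heat <= SLOT_COOLER_LIMIT_W else "needs CPU-class cooling"
    print(f"{n} GPU(s): ~{heat} W -> {verdict}")
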
post #44 of 162
Quote:
Originally posted by sunilraman
at the moment cooling might be a problem though. two cards allow for using existing shipping chips with two fans, one for each GPU. i could be stretching here, but a dual-core 7800GTX would require a bigass CPU-heatsink-fan type thing, which is hard to fit on a vga board.

for sure, but in theory....
post #45 of 162
i'm sure they'll figure out how to make something like this for a dual or quad-core 7800 GTX. at the moment its weight would rip existing vga boards in half

post #46 of 162
Quote:
Originally posted by sunilraman
i'm sure they'll figure out how to make something like this for a dual or quad-core 7800 GTX. at the moment its weight would rip existing vga boards in half


FYI I love that damn cpu cooler. I have it in a Coolermaster Centurion 5 case... the one with the vented, filtered front. It sucks air from the front to the back like the G5. It is the coolest air system I have built, and I have overclocked a 3.6GHz P4 to 4GHz with it. That's how all cpu coolers should be.
post #47 of 162
Originally posted by emig647
FYI I love that damn cpu cooler. I have it in a Coolermaster Centurion 5 case... the one with the vented, filtered front. It sucks air from the front to the back like the G5. It is the coolest air system I have built, and I have overclocked a 3.6GHz P4 to 4GHz with it. That's how all cpu coolers should be.


hey cool, thanks for the review. it definitely looks badass, albeit somewhat ridiculously so. i'll settle for the pure-copper flower Zalman thing because my MSI nVidia 6600GT (cybermonkey, where is my poem??) has a similar 'look'




edit: also, you cannot believe how dusty it is where i am living now (long story). that zalman is the max 'closeness' of fins i can handle; any tighter and dust would just clog it up. run an ordinary table fan for 5 days and you can see blackish dust all over it

yeah i know, i should look into filtered cases or use the aircon filters from my local Ace Hardware... (they have one in my local Malaysistan mall... they're everywhere!!!)
post #48 of 162
an old toothbrush is your fan*. as i have learnt quite quickly

*freudian slip. i meant 'friend'
post #49 of 162
msi nvidia 6600gt copper-flower fan:
post #50 of 162
I'm pretty sure this is the Opteron board that Alienware and BOXX use.
http://www.tyan.com/products/html/thunderk8we.html
Very fast. Pretty much the option for ultimate 3D performance at present.
You're only "not" missing out on it if your life doesn't require having the fastest graphics available at all times. SLI is going to keep doubling your 3D performance until they make something faster. No matter how fast your setup is with one card, SLI can double your performance in most cases.

Still curious to see what Nvidia has in mind for SLI v2.
post #51 of 162
Quote:
Originally posted by onlooker
I'm pretty sure this is the Opteron board that Alienware and BOXX use.
http://www.tyan.com/products/html/thunderk8we.html
Very fast. Pretty much the option for ultimate 3D performance at present.
You're only "not" missing out on it if your life doesn't require having the fastest graphics available at all times. SLI is going to keep doubling your 3D performance until they make something faster. No matter how fast your setup is with one card, SLI can double your performance in most cases.

Still curious to see what Nvidia has in mind for SLI v2.

People keep saying it doubles your performance. SLI doesn't double your performance, or come close to it. The PCI-E slots are downgraded from 16x to 8x. The latency of communication between the cards is another bottleneck. Just like dual processors don't double performance... but dual core is faster. Why? Less latency and an integrated memory controller.
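
(To make the "less than double" point concrete, here's a toy Amdahl-style model; the split and sync numbers are illustrative guesses, not benchmarks.)

Code:
# Toy model of SLI scaling. split = fraction of frame time that can be divided
# across the two GPUs; sync = extra per-frame cost of keeping the cards in step.
# All numbers are illustrative, not measurements.

def sli_speedup(split: float, sync: float) -> float:
    serial = 1.0 - split     # work that still runs on one GPU / the CPU
    parallel = split / 2.0   # divisible work, shared by the two GPUs
    return 1.0 / (serial + parallel + sync)

for split, sync in [(1.0, 0.0), (0.9, 0.05), (0.8, 0.10)]:
    print(f"split={split:.0%}, sync={sync:.0%} -> {sli_speedup(split, sync):.2f}x")

Only the ideal case (everything splits, zero sync cost) hits 2.00x; realistic-looking guesses land in the 1.4-1.7x range.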

post #52 of 162
Quote:
Originally posted by emig647
People keep saying it doubles your performance. SLI doesn't double your performance, or come close to it. The PCI-E slots are downgraded from 16x to 8x. The latency of communication between the cards is another bottleneck. Just like dual processors don't double performance... but dual core is faster. Why? Less latency and an integrated memory controller.

I would've said that but nobody likes a party pooper :P
post #53 of 162
Quote:
Originally posted by slughead
I would've said that but nobody likes a party pooper :P

post #54 of 162
No SLI system that's out yet has been about "value". It's a way to get more performance than the best card of the moment offers: if a 6800 Ultra wasn't enough, you could take 2x 6800GT or 2x 6800 Ultra. That's all. No one with a brain has bought two 6600GTs when the 6800GT has been the same price. Because nVidia's solution is only guaranteed to work on identical cards, say goodbye to finding obscenely good deals later on. Your needs are so specific that finding a huge deal on the right card later is looking for a needle in a haystack. And even if you find a deal, how far ahead are you compared to just buying a cost-effective new card (say, the 6800GT) and eBaying your old one?
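
(The arithmetic behind that last question, with made-up prices purely for illustration:)

Code:
# Hypothetical upgrade-path comparison. Every price below is invented for
# illustration; plug in whatever your local market actually charges.

second_6600gt_price = 150.0   # hypothetical: a second, identical 6600GT later on
new_6800gt_price    = 300.0   # hypothetical: one faster single card instead
old_6600gt_resale   = 100.0   # hypothetical: what the old card fetches on eBay

sli_path_cost    = second_6600gt_price
resell_path_cost = new_6800gt_price - old_6600gt_resale

print(f"Add a matching second card: ${sli_path_cost:.0f}")
print(f"eBay the old card, buy a 6800GT: ${resell_path_cost:.0f}")

With those guesses the two paths cost about the same, and the single faster card avoids SLI's requirement for identical hardware.
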
post #55 of 162
aw, you all are so mean. i love my SLI, i have one empty slot waiting to be filled with another 6600gt (MSI brand) when it comes down in price and when there is more support for it. and also so that my 3dmark can go from 4000 to 6000 or something... just because...

i'm going to have a little cry now
post #56 of 162
Originally posted by onlooker
I'm pretty sure this is the Opteron board that Alienware and BOXX use.
http://www.tyan.com/products/html/thunderk8we.html
Very fast. Pretty much the option for ultimate 3D performance at present.


hmm, not sure about BOXX, but Alienware has gone over to the Pentium Extreme Edition (dual core + hyperthreading) for their top-of-the-line SLI stuff. marketing- and economics-wise, this makes more sense than a dual opteron setup; that board is more for servers and ultra-elite SLI. AFAIK Alienware is currently not using SLI with two full 16x PCI Express channels....
post #57 of 162
Quote:
Originally posted by sunilraman
aw, you all are so mean. i love my SLI, i have one empty slot waiting to be filled with another 6600gt (MSI brand) when it comes down in price and when there is more support for it. and also so that my 3dmark can go from 4000 to 6000 or something... just because...

i'm going to have a little cry now

Heh, you love something you haven't even used yet?

SLI is a great idea. But IMO it needs to be greatly improved to be worth the money... to where two SLI'd 6600GTs will equal a 6800GT. I think a lot of people are in that position. They want the performance before they will buy it. Developers are going to ignore it (not all, but most) until it is mainstream among consumers. Sure, rendering farms would like it. But developers don't develop for such a small group of customers.

post #58 of 162
Quote:
Originally posted by emig647
Heh, you love something you haven't even used yet?

SLI is a great idea. But IMO it needs to be greatly improved to be worth the money... to where two SLI'd 6600GTs will equal a 6800GT. I think a lot of people are in that position. They want the performance before they will buy it. Developers are going to ignore it (not all, but most) until it is mainstream among consumers. Sure, rendering farms would like it. But developers don't develop for such a small group of customers.


heh ... overall it's the "feeling of upgradeability" that i have with my pc system. my mouse and keyboard are from like 10 years ago, i kid you not. hard disk from a few months ago. cpu, graphics, mobo and ram from a few weeks ago, dvd-rom drive from a month ago to boot mac os x 10.4.0 (long story).

eBay is in malaysistan but hasn't taken off yet, and the market for second-hand computer stuff is very small, or it's simply hard to complete a sale.

at the end of the day, i just like the idea of being able to pop in another video card (i'm happy with that specific MSI model) if i want to, or a dvd-writer, or drop in an Athlon 64 FX, or an Athlon X2, or more/faster ram, or a SATA II NCQ hard disk....

seeing that my other mac is my dad's iBook G4 and i've possibly voided the warranty modding in a 5400rpm notebook drive, this is a less risky way to tinker with stuff.

if i had a powermac G4 and good access to G4 parts resellers, or if i was in the USA at the moment, then yes, no worries, i would be fooling around with upgrading a powermac g4. but in terms of cost and availability, living here in south east asia at the moment, it's pc/windows/linux geektime for me when my dad's using his iBook.

you have a very salient point about value, so that's how i've hedged my bets; the price difference between an SLI and a non-SLI board is not that great. who knows, maybe by the middle of next year manufacturers will have a bundle of two 7800GTs (not GTX) for less than one 7900 or 8000 GTX or whatever.

okay okay here's my twisted analogy, since we're on the topic. depends what you're into. would you prefer to have sex with one really really hot person, or two somewhat hot twins
post #59 of 162
Quote:
Originally posted by emig647
People keep saying double your performance. SLI doesn't double your performance or come close to doubling the performance. The PCI-E buses are downgraded to 8x slots as opposed to 16x slots. The latency in time of communications between the cards is also another bottleneck. Just like dual processors... doesn't double performance... but dual core is faster... why? because of less latency and an inline memory controller.

What are you talking about, two downgraded 16x PCI-E slots? Where are your facts?

I think you're referring to the original nForce4 PCI-E that Intel was using, which split one 16x slot.
If you look at the first AMD version of the nForce SLI, they have been using a dual NVIDIA nForce Professional 2200 &
NVIDIA nForce Professional 2050 (I/O-4) - connected to CPU 2

That's two x16 PCI Express FULL SPEED slots
- Slot1 PCI-E x16 from nForce Prof. 2200
- Slot3 PCI-E x16 from nForce Prof. 2050

So how is that getting downgraded to 8x?
post #60 of 162
Thread Starter 
Quote:
Originally posted by sunilraman
...would you prefer to have sex with one really really hot person, or two somewhat hot twins...

I would have to go with the somewhat hot twins...
post #61 of 162
Originally posted by MacRonin
I would have to go with the somewhat hot twins...


ah... and therein lies the marketing appeal of SLI.
post #62 of 162
Originally posted by onlooker
What are you talking about, two downgraded 16x PCI-E slots? Where are your facts?

I think you're referring to the original nForce4 PCI-E that Intel was using, which split one 16x slot.
If you look at the first AMD version of the nForce SLI, they have been using a dual
NVIDIA nForce Professional 2200 &
NVIDIA nForce Professional 2050 (I/O-4) - connected to CPU 2
That's two x16 PCI Express FULL SPEED slots
- Slot1 PCI-E x16 from nForce Prof. 2200
- Slot3 PCI-E x16 from nForce Prof. 2050

So how is that getting downgraded to 8x?


hi onlooker, i did more research. yes, each of the two PCI Express slots theoretically is a full 16x. SiSoft Sandra confirms this on my machine to some degree. i imagine if i had two video cards driving two monitors separately, that might work in the 2 x 16x PCI Express mode.

however on most boards, for SLI mode, e.g. my asus a8n-sli, there is a hardware 'paddle' (a big jumper of sorts) that you flip around to convert the two 16x slots into two 8x slots when setting up SLI. why they did this i am not sure, but i am sure nvidia had their reasons for setting it up this way on the initial SLI rollout.

full 2 x 16x SLI will be supported in the next revs of nForce4 SLI X16 boards, as mentioned in this anandtech article:
http://www.anandtech.com/cpuchipsets...oc.aspx?i=2493
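
(If you ever want to verify what the paddle actually negotiated: on a Linux install, the kernel exposes each PCI device's link width in sysfs. A minimal sketch, assuming a reasonably recent kernel that provides these attributes; on Windows, SiSoft Sandra reports the same information.)

Code:
# Print negotiated vs. maximum PCI Express link width for every device that
# exposes it. Assumes Linux with sysfs; attribute availability varies by kernel.
import glob
import os

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    cur_path = os.path.join(dev, "current_link_width")
    max_path = os.path.join(dev, "max_link_width")
    if os.path.exists(cur_path) and os.path.exists(max_path):
        with open(cur_path) as cur, open(max_path) as mx:
            print(f"{os.path.basename(dev)}: x{cur.read().strip()} "
                  f"(max x{mx.read().strip()})")
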
post #63 of 162
I would choose the opteron board. Therefore: two really hot twins.
post #64 of 162
Quote:
Originally posted by sunilraman
Originally posted by MacRonin
I would have to go with the somewhat hot twins...


ah... and therein lies the marketing appeal of SLI.

Are the twins in the market too?
post #65 of 162
Quote:
Originally posted by onlooker
What are you talking about, two downgraded 16x PCI-E slots? Where are your facts?

I think you're referring to the original nForce4 PCI-E that Intel was using, which split one 16x slot.
If you look at the first AMD version of the nForce SLI, they have been using a dual NVIDIA nForce Professional 2200 &
NVIDIA nForce Professional 2050 (I/O-4) - connected to CPU 2

That's two x16 PCI Express FULL SPEED slots
- Slot1 PCI-E x16 from nForce Prof. 2200
- Slot3 PCI-E x16 from nForce Prof. 2050

So how is that getting downgraded to 8x?

When you DO SLI... the link width of the slots is downgraded from 16x to 8x.

Quote from NVidia NForce4
Quote:
Last on the list is the top-end nForce 4 SLI, which includes all the Ultra edition's qualities, but adds dual graphics card support. The SLI's PCI Express 16x interface is flexible, allowing its user to mount a single graphics card under 16x mode, and will allow two PCI Express graphics cards running under 8x mode. As the 8x PCIe is still good for 4GB/s bi-directional bandwidth, there will be no performance bottleneck with today's graphics cards.

It was a cached quote on google. Finding original source....

The reason is... the nForce4 chipset only has 20 PCI Express lanes. With SLI, the two cards have to share those lanes, so they get divided in half, which in effect downgrades each slot's bandwidth to 8x. Now, some dual-Opteron boards can handle dual 16x buses because there are in fact two chipsets on those boards. But when it comes to regular Athlon 64 procs... they run dual 8x. So yes, you can get full speed with a dual-chipset board, like the ones you listed... but who will fork out the money for those?
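
(The lane arithmetic behind the 4GB/s figure, using the standard PCI Express 1.x numbers: 2.5 GT/s per lane per direction, with 8b/10b encoding leaving 250 MB/s of payload per lane.)

Code:
# PCI Express 1.x bandwidth arithmetic (standard figures for the spec nForce4 uses).

LANE_RAW_GBPS = 2.5        # 2.5 GT/s signalling per lane, per direction
ENCODING_EFFICIENCY = 0.8  # 8b/10b encoding: 8 payload bits per 10 line bits

def lane_mb_per_s() -> float:
    """Usable one-direction bandwidth of a single PCIe 1.x lane, in MB/s."""
    return LANE_RAW_GBPS * ENCODING_EFFICIENCY * 1000.0 / 8.0   # = 250 MB/s

for width in (1, 8, 16):
    one_way_gb = width * lane_mb_per_s() / 1000.0
    print(f"x{width}: {one_way_gb:.2f} GB/s per direction, "
          f"{2 * one_way_gb:.2f} GB/s bi-directional")

So an x8 slot really is "4GB/s bi-directional" and x16 is 8GB/s; whether a 2005-era card can saturate even x8 is the separate question the quote is getting at.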

Also, Crossfire will only support dual 8x.

Is it really worth it for a dual chipset? To rendering farms... yah... but consumers... not even close... and that translates into fewer developers working on SLI, in turn... less support.

post #66 of 162
Quote:
Originally posted by sunilraman
Originally posted by onlooker
What are you talking about, two downgraded 16x PCI-E slots? Where are your facts?

I think you're referring to the original nForce4 PCI-E that Intel was using, which split one 16x slot.
If you look at the first AMD version of the nForce SLI, they have been using a dual
NVIDIA nForce Professional 2200 &
NVIDIA nForce Professional 2050 (I/O-4) - connected to CPU 2
That's two x16 PCI Express FULL SPEED slots
- Slot1 PCI-E x16 from nForce Prof. 2200
- Slot3 PCI-E x16 from nForce Prof. 2050

So how is that getting downgraded to 8x?


hi onlooker, i did more research. yes, each of the two PCI Express slots theoretically is a full 16x. SiSoft Sandra confirms this on my machine to some degree. i imagine if i had two video cards driving two monitors separately, that might work in the 2 x 16x PCI Express mode.

however on most boards, for SLI mode, e.g. my asus a8n-sli, there is a hardware 'paddle' (a big jumper of sorts) that you flip around to convert the two 16x slots into two 8x slots when setting up SLI. why they did this i am not sure, but i am sure nvidia had their reasons for setting it up this way on the initial SLI rollout.

full 2 x 16x SLI will be supported in the next revs of nForce4 SLI X16 boards, as mentioned in this anandtech article:
http://www.anandtech.com/cpuchipsets...oc.aspx?i=2493

That's not the next board for anything other than Intel processors. That's been out for the Opteron forever.
It sounds like you're using one of the early boards, or an Athlon or Intel version. The board I am referring to was available as soon as the Opteron motherboard was available with SLI. It's been out for over a year.

Even that anandtech article only mentions Intel, and Dell not using AMD anytime soon. I think it's a shifty article that tries to lump all the SLI boards in with the original Intel version's problem. I knew about this way before the first Intel SLI board was released, and I also knew it was done properly when the Opteron version came out.
post #67 of 162
Quote:
Originally posted by emig647
Is it really worth it for a dual chipset? To rendering farms... yah... but consumers... not even close... and that translates into fewer developers working on SLI, in turn... less support.

Gonna nitpick a little here.

In a rendering farm you don't necessarily even need a video card in any of the boxes; you can SSH in and render, the CPUs do the work... What you're thinking about is a graphics workstation.
post #68 of 162
Quote:
Originally posted by Gon
Gonna nitpick a little here.

In a rendering farm you don't necessarily even need a video card in any of the boxes; you can SSH in and render, the CPUs do the work... What you're thinking about is a graphics workstation.

That's correct. The advantage SLI offers a workstation is real-time on-screen rendering in your normal working or creation environment. Moving, manipulating, and generally forging millions of polygons is seriously taxing on a system's graphics, and that is even before shading.
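
(Some quick demand-side arithmetic; the scene sizes and the 30 fps target here are hypothetical, just to show the scale involved.)

Code:
# Triangles per second an interactive viewport asks for, before any shading.
# Scene sizes and the frame-rate target are illustrative assumptions.

TARGET_FPS = 30

def triangles_per_second(polys_in_scene: int, fps: int = TARGET_FPS) -> int:
    return polys_in_scene * fps

for polys in (1_000_000, 5_000_000, 20_000_000):
    rate = triangles_per_second(polys)
    print(f"{polys:>12,} polys @ {TARGET_FPS} fps -> {rate / 1e6:,.0f} M tris/sec")
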
post #69 of 162
I think SLI has potential value to offer in gaming - exactly when the cards can pull it off totally transparently. Only they are probably never going to be able to do that, due to the crazy optimization that goes on in games. And there's a network effect here: as long as 1% of the market has SLI, you really can't put in a whole new level of eye candy for that crowd. They'll get more anti-aliasing, more resolution, and that's it. To take advantage of the added resolution, you need a high-resolution screen. Which, again, few can afford.

The nicest thing you can do while tinkering with a system is to silence it, and then overclock if you still can after silencing. Doesn't cost much, either... of course you can go to extremes in this like in everything else.
post #70 of 162
edit: nevermind. i was not comfortable with what i posted
post #71 of 162
Quote:
Originally posted by Gon
I think SLI has potential value to offer in gaming - exactly when the cards can pull it off totally transparently. Only they are probably never going to be able to do that, due to the crazy optimization that goes on in games. And there's a network effect here: as long as 1% of the market has SLI, you really can't put in a whole new level of eye candy for that crowd. They'll get more anti-aliasing, more resolution, and that's it. To take advantage of the added resolution, you need a high-resolution screen. Which, again, few can afford.

The nicest thing you can do while tinkering with a system is to silence it, and then overclock if you still can after silencing. Doesn't cost much, either... of course you can go to extremes in this like in everything else.

Right now gaming is the majority of the SLI market, but given where gaming stands on the Mac, it would be better suited, and touted, as a high-end 3D workstation solution than as a gaming machine. Games are typically pretty limited, and late, on Macs.
post #72 of 162
Quote:
Originally posted by onlooker
Right now gaming is the majority of the SLI market, but given where gaming stands on the Mac, it would be better suited, and touted, as a high-end 3D workstation solution than as a gaming machine. Games are typically pretty limited, and late, on Macs.

I'm willing to bet that will all change soon enough

post #73 of 162
I wouldn't get my hopes up too high, but I think there will be a few more. I really don't see Macs having an equal number of games any time soon. When Apple goes Intel, even if they can acquire a 15% share of computer users, that is still a limited amount, because what percent of that 15% are gamers?
In today's world just about all kids are gamers, but with the next-gen consoles I see PC game playing and programming declining, and console programming being offered at more tech schools in the near future. Heck, I already have a down payment on an Xbox 360, and I doubt I'll be playing on anything other than that or my PS3 when it comes out.

Which, more or less, is why I think SLI will probably be pushed to mature more rapidly in the direction of a pro-graphics-oriented solution in the immediate future.
post #74 of 162
Nobody is going to switch to OS X for games, and here are 3 reasons why:

1) Not enough games due to
---A. Crap video cards included in most systems (so porting is pointless because even most mac users can't play many games)
---B. Mac OS X has a small market share
2) OS X is MUCH slower than Windows for gaming... don't blame me! it's been benchmarked!
3) An expensive box like a Mac can't really compete with one of these new consoles (which, btw, will arrive sooner than the intel macs)

Since Apple is being fairly single-minded (mind on the money and the money on the mind), there's no way in hell they're going to let other computer companies make their boxes for them. This would take care of problems 1A and 3, or half the problem.

But Mac users and gamers never did like each other anyway... me being both, I feel like Jonathan Swift.
post #75 of 162
Originally posted by slughead
But Mac users and gamers never did like each other anyway... me being both, I feel like Jonathan Swift.....


well i got an AMD + nVidia 6600GT (cybermonkey, where's my poem - i'll keep asking till i get it) and got this whole 'macs suck for games, macs don't have good gpus' stuff out of my system and satisfied my *curiosity* ... i feel like i've been unfaithful, but i've always gone both ways in myriad senses of the word
post #76 of 162
Quote:
Originally posted by slughead


2) OS X is MUCH slower than Windows for gaming... don't blame me! it's been benchmarked!

I think that is because Apple needs a better understanding between OpenGL and the OS. I think there will be some big changes in the future with Intel Macs and graphics, which will lead to a new way the OS structures its OpenGL for gaming and app-specific graphics. It seems the OS has its OpenGL built deep within to complement the OS's own features, which is great, but they need both to satisfy the end user sufficiently and completely.
post #77 of 162
Quote:
Originally posted by onlooker
I think that is because Apple needs a better understanding between OpenGL and the OS. I think there will be some big changes in the future with Intel Macs and graphics, which will lead to a new way the OS structures its OpenGL for gaming and app-specific graphics. It seems the OS has its OpenGL built deep within to complement the OS's own features, which is great, but they need both to satisfy the end user sufficiently and completely.

Perhaps the problems that OS X on PPC was having with creating and releasing threads, which appear to be solved on Intel, will be a big step forward for Mac gaming, perhaps a huge step. I believe the folks who say that OS X on Intel 'feels' quicker are seeing this, because when those same boxes are thrown a huge job to do, they crumble; and for those that like benches, check out how bad OS X on PPC is at threads, while everything else is fine. Games should do well if this problem has been solved on Intel hardware.
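
(If you want a feel for thread creation/teardown cost on your own box, a crude microbenchmark is enough to show the order of magnitude. This is just a sketch, not the PPC-vs-Intel benchmarks referred to above.)

Code:
# Crude create/join microbenchmark: how long does it take to spin up and tear
# down a do-nothing thread? Order-of-magnitude only.
import threading
import time

def spawn_and_join(count: int) -> float:
    """Return seconds taken to create and join `count` do-nothing threads."""
    start = time.perf_counter()
    for _ in range(count):
        t = threading.Thread(target=lambda: None)
        t.start()
        t.join()
    return time.perf_counter() - start

N = 2000
elapsed = spawn_and_join(N)
print(f"{N} create/join cycles in {elapsed:.3f}s "
      f"({elapsed / N * 1e6:.0f} microseconds per thread)")
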
post #78 of 162
bump! here is the sample dual-gpu-on-one-board by gigabyte.
enjoy.

dual nVidia 6800GT on one beastly card. YEAH.
http://www.3dvelocity.com/reviews/gigabyte3d1/gv3d1.htm

"...It IS big and it IS clever, but it's also heavy, occasionally loud and choosy about where it lives..."

compared to a crucial x850xt radeon


chunky heatsink as we predicted
post #79 of 162
more GPU pr0n here: the backside of the card with more heatsink stuff and one more fan


the frontside of the card with chips and heatsink removed
post #80 of 162
I still love the dual-GPU models. For consumers, going dual-GPU on one card is a lot more practical than SLI, IMO.
