Who needs friggin SLI when you got this? Yeah baby, OUTBOARD GPU multiplexed. Oh, and I see in one of the specs 4-core Quadro GPU power...!!! YEAHHHHHH. Frack SLI and Crossfire. Get one of these into ya. (Temporarily ignores any calls for "What API to use" and "What connection/bandwidth" etc.) How friggin cool is this? Like your GPUs are in this external box. Frack me, that's cool. http://www.nvidia.com/page/quadroplex.html
Not sure how to feel about the proprietary connector.
I'm not sure what else they could have really used to get that level of GPU power flowing in and out of the PC. I'm sure you have some suggestions for newer, more standardised tech that would handle it. PCIe 2?
PCI Express 2.0 takes care of a lot of things: double the per-lane bandwidth, and if they decide to use it, it would allow four chips on a board, if it didn't burn up. You know, they're already talking about a separate power supply for the graphics subsystem. A four-board, top-of-the-line SLI setup uses several hundred watts, and generates how much heat?
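For a rough sense of what the faster signaling buys, here's a back-of-the-envelope sketch (the function name and values are my own illustration; it assumes the 8b/10b line coding that both PCIe 1.x and 2.0 use):

```python
# Rough usable PCIe bandwidth per direction, assuming 8b/10b line coding
# (both PCIe 1.x and 2.0 use it, so 80% of the raw signaling rate is data).
def pcie_gb_per_s(lanes: int, gt_per_s: float) -> float:
    """lanes * transfer rate * (8/10 coding efficiency) / 8 bits per byte."""
    return lanes * gt_per_s * (8 / 10) / 8

# An x16 slot: PCIe 1.x signals at 2.5 GT/s per lane, PCIe 2.0 at 5.0 GT/s.
print(pcie_gb_per_s(16, 2.5))  # PCIe 1.x x16 -> 4.0 GB/s each way
print(pcie_gb_per_s(16, 5.0))  # PCIe 2.0 x16 -> 8.0 GB/s each way
```

Doubling the per-lane rate doubles the slot's usable bandwidth, which is where the headroom for stuffing more GPUs behind one slot would come from.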
And 1st-gen DX10 is supposed to be even hotter/more power-hungry. I hear the 8800 drinks the watt-hours faster than a freshman drinks beer at Welcome Week.
Hah! I thought you all were hardcore pros and stuff. Now it's all, wa wahh wahh, too much electricity, can't use it, it's too big and powerful and scary, boo hoo hoo...
They need some Intel skillz to get DX10 GPUs down to 65nm and have some dual-core-on-one-chip, high-performance-per-watt action happening. Video cards pushing 200W and more are not fun.
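To put the watt-hours jab in numbers, a quick sketch (the wattage, hours, and electricity rate are made-up illustrative values, not measurements of any real card):

```python
# Back-of-the-envelope running cost of a power-hungry video card.
def gpu_running_cost(watts: float, hours: float, dollars_per_kwh: float) -> float:
    """Energy drawn in kWh times an assumed electricity rate."""
    kwh = watts * hours / 1000
    return kwh * dollars_per_kwh

# A 200 W card gaming 4 hours a day for a 30-day month at $0.10/kWh:
monthly = gpu_running_cost(200, 4 * 30, 0.10)
print(round(monthly, 2))  # -> 2.4 (dollars per month)
```

A couple of dollars a month per card; multiply by four boards, remember every one of those watts comes back out as heat, and the separate-PSU talk starts making sense.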
Heh. Welcome Week. I was always back to skool too late to partake in that. In my 3rd year of college/uni, though, the floodgates opened and I found myself quite often fumbling my way back home; I think I once passed out at a friend's house party. Not good. And can you believe all that drinking without making out with anyone until 1999 (started the drinking and partying thing in, what, '98 or so)... Ah, the memories... Keep in mind that when I had my end-of-high-school prom/"formal dance" at the end of '95 I did get a date, and she was cute in a tomboyish kind of way, but not only was the school a Lutheran Christian school, so was she. As "Vice Captain" or something of our grade she actually made a speech that went something like, "You know, I know y'all gonna party hard and all that, but think of the consequences..." I'm not sure exactly what she said, but it was along the lines of don't get drunk and have sex and stuff. Save yourself from sin...!!!!! WTF. She was an inspiration to go to my Chemistry class, though. First day of class I sneakily got a seat next to her and was like, "Hey, I missed what the teacher said, do you know about this part..."
I don't know who is going to use that thing. Supercomputers, after all, use custom hardware.
Pixar springs to mind. Maybe some of the more flash-with-cash games companies with in-house 3D animators. I'd imagine it'd give you pretty much real-time, movie-grade rendering.
Also, medical imaging. My Mum used to work for SGI, and they had a demo room where you walked in and could pull organs from a patient in a 3D immersive space, or zoom around and view cell structure. That was cool back in 1995 or so, but they were using a couple of Crays to do it.
Full realtime photorealistic immersive virtual reality (surely the VR headsets and all that have improved by now). The GPU power is there, and the refresh response to headset motion is so much easier tech nowadays. Whack in a Conroe and we're ready to rock with a prosumer version of what required Crays 10 years ago. What the frack happened to VR???
Oh, and gamers with more money than sense.
What purpose could Pixar find for this? It's useless for rendering their work, and it isn't needed to view it either.
Movie-grade rendering is something GPUs aren't even close to doing.
MacRumors' source is expecting Conroe in the low-end Mac Pro and Woodcrest in the high end, plus a Xeon (Woodcrest) Xserve. They don't expect 3GHz in a 1U form factor, but Dell does offer a 3GHz 1U server.
Onlooker: get a fresh pair of underwear ready. You have been warned. 8)
This new format totally frigs with my unique Quoting style.
To the contrary, I'm happy it no longer inserts that boldness nonsense that I keep having to remove.
That thing is probably going to cost 4x as much as a Mac Pro. As cool as it is, I can't imagine I'd be able to afford it.
Afford it? I was trying to imagine what I'd use it for!
Keeping my feet warm in winter was what I came up with.
C'mon, we know deep inside you want 500+ fps in Half-Life 2...!!
SGI were doing it without headsets. :-)