<blockquote><strong>Wouldn't it be cool if the Cell machines turned out to be multi-core PowerPCs with VMX?</strong><hr></blockquote>
That was sort of where I was thinking. Anything that keeps the 'next Insanely Great thing' from being a major shift in direction programming-wise is a good thing. The prevalence of duals currently has everyone thinking 'spin off a separate thread' for anything possible, avoiding re-entrancy problems, etc.
A really well-written 2-CPU app probably wouldn't care much if it _was_ running on 256 CPUs.
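To illustrate that point with a minimal, hypothetical C++ sketch (nothing to do with Cell itself, and all names and the workload are made up): if the app divides its work by asking the machine how many processors it has, the same code runs unchanged on a dual or on a 256-way box.

[code]
// Hypothetical sketch: split work across however many CPUs the machine
// reports, rather than hard-coding "two threads".
#include <thread>
#include <vector>
#include <numeric>
#include <cstdio>

int main() {
    // 2 on a dual, 256 on a hypothetical 256-way box -- same code either way.
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 2;                       // fall back if the count is unknown

    std::vector<double> data(1 << 20, 1.0);  // some arbitrary work to divide up
    std::vector<double> partial(n, 0.0);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < n; ++i) {
        workers.emplace_back([&, i] {
            // Each worker sums its own slice; no shared mutable state,
            // so the code is indifferent to the actual CPU count.
            size_t chunk = data.size() / n;
            size_t begin = i * chunk;
            size_t end   = (i == n - 1) ? data.size() : begin + chunk;
            partial[i] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("summed %zu elements on %u threads: %f\n",
                data.size(), n, total);
    return 0;
}
[/code]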
That must be why we won't see commercial versions of the chip for two years even though it is almost "taped out" and they've been playing with components of the design for some time now?
With all due respect to you and Programmer, you seem to be ignoring the fact that what IBM is currently contracted to develop is the heart of the PS3. Despite the tremendous hype around this processor, they aren't touting it as the next generation in general computing; they're bragging about video streaming, network throughput, graphics rendering -- DSP functions, for crying out loud.
So OK, this will make for one great chipset or SoC, able to handle everything from 3D graphics to network I/O to video to joysticks and other input devices. But by the time it's available and they've developed APIs for it, etc., your basic Radeon or GeForce may be faster thanks to incremental improvements in their core graphics engines.
Nonetheless, I'm fairly certain it will NOT be a CPU (PPC derivative or not).
:shrug: I think the two-year "delay" can be explained simply by the need to figure out how to produce such a massive design at a low enough price that it can go into a console, and to work out the board-level details well enough to interconnect the chips the way they seem to be intending.
Sure IBM is contracted to do this for Toshiba & Sony, but this has been IBM's direction for some time now and I think that Sony went to IBM because of that. The cellular research at IBM will extend well beyond the PS3. The line between general computing and "DSP" functions has blurred, and more general-purpose processors are doing a better job of those workloads these days than most DSPs. Graphics, in particular, are a key target for the PS3, and that is not a traditional DSP application.
You could be right, but if this sort of power is going to appear in that timeframe then Apple had better be ready to jump on it or be left in the dust.
<blockquote><strong>Sure IBM is contracted to do this for Toshiba & Sony, but this has been IBM's direction for some time now and I think that Sony went to IBM because of that.</strong><hr></blockquote>
Well sure, and it's quite likely that whatever IBM learns in this venture will have future applications. (Assuming it doesn't turn out to be another MAJC or Crusoe, that is.) I just don't see any reason to suppose that Apple will adopt this technology as a viable alternative to the PPC -- at least not this particular implementation. But I could see it used in a co-processor/GPU scenario. (Assuming issues around software development, connectivity, and bandwidth don't prove insurmountable.)