Comments
Originally posted by mugwump
Super Cell article
First of all, down the road Apple could be in a fine position to take advantage of such tech advances. It's doubtful that Intel, MS, and Dell could be so nimble.
As an aside, I wonder if it would be the approach for TVs to render raw digital data on the fly. That would circumvent half of the current workflow of content creation, computer rendering, compression, and delivery.
I hate the wild claims and speculation these articles make. More computing power than I can imagine? Give me a break; I can imagine a heck of a lot, and it would make 16 teraflops quiver and run away with its tail between its legs. "Broadband on a chip"? What the heck is that supposed to mean and how is it going to bring everyone together in something that sounds like Utopia? C'mon, keep the old feet on the ground. Who are they quoting here, a maintenance engineer in one of Sony's offices?
Yeah, how'd they get hold of Kormac over at Samsung for a Sony article?
But Big P, what's your take on the specifics in the article, namely:
A single Cell chip is expected to surpass 250 billion floating point operations, or 250 gigaflops, per second, six times as fast as Nvidia's new graphics chip.
And that each chip will have 8 cores.
Goes into production by midyear at East Fishkill.
Already "on its way" to 65nm production.
Hmmmmmm??
Originally posted by mugwump
A single Cell chip is expected to surpass 250 billion floating point operations, or 250 gigaflops, per second, six times as fast as Nvidia's new graphics chip.
And that each chip will have 8 cores.
Well, my take on it is that it will be very hard to keep such a chip fed with information, that it will be difficult to write useful algorithms that exercise the chip's peak computational abilities, and that the number is just a marketing spec for peak performance; reality will be a bit more believable.
Of course I am a dull realist, but I have heard all this before from FPGAs, and how many machines have those built in? Yes, they can be incredibly powerful and useful, but nobody uses them outside of specialized applications (e.g. cruise missiles).
Cell will undoubtedly be nice, but I tend to think that it will not live up to all its hype. Of course, I'd like to be surprised.
Cell will likely make a wonderful addition to PowerMacs.
Originally posted by Yevgeny
Well, my take on it is that it will be very hard to keep such a chip fed with information, that it will be difficult to write useful algorithms that exercise the chip's peak computational abilities, and that the number is just a marketing spec for peak performance; reality will be a bit more believable.
Of course I am a dull realist, but I have heard all this before from FPGAs, and how many machines have those built in? Yes, they can be incredibly powerful and useful, but nobody uses them outside of specialized applications (e.g. cruise missiles).
Cell will undoubtedly be nice, but I tend to think that it will not live up to all its hype. Of course, I'd like to be surprised.
Cell will likely make a wonderful addition to PowerMacs.
Apple's dual 2.5 GHz G5 has a theoretical peak performance of:
2.5 GHz * 2 processors * 4-way SIMD * 2 ops/multiply-add = 40 GFLOPS
How much is really usable? Well, if the Virginia Tech Big Mac is any indication, only about 9 GFLOPS, or a little under 25% of the peak number... and that is only on a benchmark used for the purpose. Nonetheless, having a single chip that can apparently fit in a game console turn in more than six times the peak rate of the fastest current PowerMac is pretty impressive.
Ignore the hype, but keep an eye on this thing.
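As a rough illustration of where both numbers could come from, here is the same back-of-the-envelope arithmetic as a small C sketch. The G5 figures are the ones quoted above; the Cell breakdown (eight vector cores, 4-wide single-precision SIMD, one multiply-add per lane per cycle, a roughly 4 GHz clock) is an assumption chosen to land near the quoted 250 gigaflop figure, not a confirmed spec.

#include <stdio.h>

/* Back-of-the-envelope peak FLOPS, the same arithmetic as the 40 GFLOPS
 * figure above. The Cell parameters are assumptions picked to match the
 * numbers quoted in the article, not confirmed specs. */
static double peak_gflops(double ghz, int units, int simd_width, int ops_per_cycle)
{
    return ghz * units * simd_width * ops_per_cycle;
}

int main(void)
{
    /* Dual 2.5 GHz G5: 2.5 * 2 processors * 4-way SIMD * 2 ops = 40 GFLOPS */
    printf("PowerMac G5 (dual 2.5 GHz): %.0f GFLOPS peak\n",
           peak_gflops(2.5, 2, 4, 2));

    /* Hypothetical Cell: 4.0 GHz * 8 cores * 4-wide * 2 ops = 256 GFLOPS,
     * in the neighborhood of the "surpass 250 gigaflops" claim. */
    printf("Cell (8 cores at ~4 GHz, assumed): %.0f GFLOPS peak\n",
           peak_gflops(4.0, 8, 4, 2));
    return 0;
}

Either way, these are marketing-peak numbers; the usable fraction is the interesting part.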
Originally posted by Yevgeny
Of course I am a dull realist, but I have heard all this before from FPGAs, and how many machines have those built in? Yes, they can be incredibly powerful and useful, but nobody uses them outside of specialized applications (e.g. cruise missiles).
Quantel makes high end HD edit systems that use FPGAs to handle the video processing. Their eQ system can mix and match HD and SD video in one sequence without any proxies or other silliness. Very fast at any resolution.
I guess they are the cruise missile of editing equipment.
http://www.quantel.com/domisphere/in...sf/html/eQmain
Originally posted by onlooker
Anybody have any ideas why Apple is still absent from those on board with IBM's Power list?
Power.org is centered around Linux, not OS X, which might explain why Apple isn't a member.
Originally posted by Programmer
Apple's dual 2.5 GHz G5 has a theoretical peak performance of:
2.5 GHz * 2 processors * 4-way SIMD * 2 ops/multiply-add = 40 GFLOPS
How much is really usable? Well, if the Virginia Tech Big Mac is any indication, only about 9 GFLOPS, or a little under 25% of the peak number... and that is only on a benchmark used for the purpose. Nonetheless, having a single chip that can apparently fit in a game console turn in more than six times the peak rate of the fastest current PowerMac is pretty impressive.
Ignore the hype, but keep an eye on this thing.
Ain't heat transfer a bitch?
Originally posted by mdriftmeyer
Ain't heat transfer a bitch?
Actually, heat transfer is great; the real enemies are current leakage and power consumption in general. Without heat transfer you wouldn't be able to run your 3.8 GHz P4 for more than a millisecond before it cooked itself.
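For anyone wondering why leakage and overall power draw are the real enemy, the usual first-order CMOS model says dynamic power goes as switched capacitance times voltage squared times frequency, and static (leakage) power as supply voltage times leakage current. A minimal sketch with illustrative, made-up numbers (none of these are measured P4 figures):

#include <stdio.h>

/* First-order CMOS power model with illustrative, made-up numbers.
 * Dynamic power:  P_dyn = a * C * V^2 * f   (a = activity factor)
 * Static power:   P_sta = V * I_leak
 * Heat sinks only move the heat; the watts still have to be burned. */
int main(void)
{
    double a = 0.2;        /* activity factor (assumed) */
    double C = 50e-9;      /* switched capacitance in farads (assumed) */
    double V = 1.3;        /* supply voltage in volts (assumed) */
    double f = 3.8e9;      /* clock frequency in hertz */
    double I_leak = 20.0;  /* leakage current in amperes (assumed) */

    printf("dynamic: %.0f W\n", a * C * V * V * f);  /* about 64 W */
    printf("static:  %.0f W\n", V * I_leak);         /* about 26 W */
    return 0;
}

Note the V-squared term: it is why running wider and slower at lower voltage tends to beat cranking the clock, at least on the power bill.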
IBM, Sony, Toshiba to announce "Supercomputer on a Chip."
http://www.macsimumnews.com/index.php/archive/2619/
Originally posted by murk
Neo's at it again...
http://www.macsimumnews.com/index.php/archive/2619/
What's with these authors and their crazy hyperbole?
Originally posted by Programmer
Apple's dual 2.5 GHz G5 has a theoretical peak performance of:
2.5 GHz * 2 processors * 4-way SIMD * 2 ops/multiply-add = 40 GFLOPS
How much is really usable? Well, if the Virginia Tech Big Mac is any indication, only about 9 GFLOPS, or a little under 25% of the peak number... and that is only on a benchmark used for the purpose. Nonetheless, having a single chip that can apparently fit in a game console turn in more than six times the peak rate of the fastest current PowerMac is pretty impressive.
Ignore the hype, but keep an eye on this thing.
I agree. The hype is definitely out of hand, but I think that a Cell coprocessor would be a wonderful addition to an Apple pro machine. I would like to see how Apple would feed a Cell so that it isn't just spinning cycles (of course, theoretical peak performance is also the performance of a somewhat useless task, unless scaling all your data by some constant is what you actually need). I'd expect Cell to have the same memory-hogging characteristics that AltiVec has.
The reason this all works out is that Apple is going to do much of the hard work of making APIs that will use the Cell CPU (e.g. Core Video/Core Audio/Quartz). No software developer in his right mind would ignore such useful, hardware-accelerated APIs.
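The "scaling all your data by some constant" case mentioned above looks roughly like this minimal sketch: one multiply per element, but also one load and one store, so once the array falls out of cache the loop is paced by memory bandwidth rather than by the vector units, and the marketing-peak FLOPS never show up. That is essentially the "keep the chip fed" problem raised earlier in the thread.

#include <stddef.h>

/* Scale an array by a constant: the kind of trivially parallel kernel
 * that looks great in peak-FLOPS math. Each element costs one multiply
 * but also one load and one store, so for arrays bigger than the cache
 * the memory system, not the ALUs, sets the pace. */
void scale(float *x, size_t n, float k)
{
    for (size_t i = 0; i < n; i++)
        x[i] *= k;
}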
Originally posted by WelshDog
Quantel makes high end HD edit systems that use FPGAs to handle the video processing. Their eQ system can mix and match HD and SD video in one sequence without any proxies or other silliness. Very fast at any resolution.
I guess they are the cruise missile of editing equipment.
http://www.quantel.com/domisphere/in...sf/html/eQmain
Don't get me wrong, FPGAs are great, but despite all their power, they just don't find their way into as many hardware systems as they could.
Half of what would make Cell attractive is that Apple would be bundling it in their machines as a standard option (well, at least in their pro machines).
Anyhow, I have to go back to work (four weeks until we're gold!)
Originally posted by Programmer
What's with these authors and their crazy hyperbole?
And I thought they were being conservative.
My prediction for the Cell presentation: a Japanese engineer is introduced and walks on stage. The audience is stunned by the presentation. Then, towards the end, he says, "One more thing...", while ripping off a mask to reveal he is actually Steve Jobs. "All of the cool stuff you have seen today has been running on the new PowerMac Cell." He presses the clicker and a Keynote slide showing the machine appears with the words "Shipping worldwide starting today."
Originally posted by Programmer
"Broadband on a chip"? What the heck is that supposed to mean and how is it going to bring everyone together in something that sounds like Utopia?
I got a kick out of that too. What now? Are they going to get OC3 lines to everybody's house in a chip? "Broadband on a chip"? What a clown. This guy makes no sense.
Originally posted by onlooker
"Broadband on a chip" What a clown. This guy makes no sense.
Broadband on a chip is the VISION for the Cell shared by IBM, Sony and Toshiba.
Yep, those guys are real clowns about the broadband capabilities of the Cell.
http://www.nytimes.com/2005/02/07/te...gy/07chip.html
"One area of wide speculation is whether Apple might become a partner in the Cell alliance in the future. Apple is already the largest customer for the PowerPC chip, and it would be simple for the company to take advantage of the Cell design. Several people familiar with Apple's strategy, however, said that the computer maker had yet to be convinced that the Cell technology could provide a significant performance advantage."
Of course I am a dull realist, but I have heard all this before from FPGAs, and how many machines have those built in? Yes, they can be incredibly powerful and useful, but nobody uses them outside of specialized applications (e.g. cruise missiles).
That's funny - I thought FPGAs were very common. My company uses boatloads of them in our products, and we are not primarily a defense company.
Originally posted by Aphelion
Broadband on a chip is the VISION for the Cell shared by IBM, Sony and Toshiba.
Yep, those guys are real clowns about the broadband capabilities of the Cell.
I still think it's bad language, and more of a buzz term than anything. How are you supposed to get broadband on a chip if you have dial-up? What good is the chip then? That I'd like to know.
Originally posted by onlooker
I still think it's bad language, and more of a buzz term than anything. How are you supposed to get broadband on a chip if you have dial-up? What good is the chip then? That I'd like to know.
I'm no networking expert, but a little Googling around will show that IBM has been building on-chip hardware acceleration for the protocols that send data out the NICs, thus speeding up the broadband connection.
If you are on dial-up you don't get to play; it's just out of your league.