New Powermacs to use Cell Processor?


Comments

  • Reply 161 of 220
    Quote:

    Originally posted by murk

    eWeek on Apple & Cell



    Great article.
  • Reply 162 of 220
    shadow Posts: 373 member
    Quote:

    Originally posted by murk

    eWeek on Apple & Cell



    Great article indeed. The author (John Rizzo) has outlined the difficulties involved in a possible Cell adoption by Apple, both on the Apple side ("porting" the software) and the IBM side (customizing the chip). The point is:



    While the specs of the new chip are impressive, especially with its integrated support for virtualization and speedy video performance, analysts said differences between the Cell and the current PowerPC architectures will make any transition an unlikely prospect for the next few years.



    It is hard to disagree with the author's arguments. However, playing the optimist on this forum, I would like to note that there is one weak point.

    Let's assume that migration to the Cell requires 4-5 years. The author speculates that IBM may be trying to entice Apple to get involved. Reading the article leaves the impression that this "convincing game" started around the time of the public introduction of the first Cell implementation this month. It is more logical to assume that the first round of it started 2-3 years ago. We don't know the outcome, but it is safe to assume that Apple and IBM do have definite short- and medium-term plans regarding Apple's processors, and some long-term projections. We should also be aware that a processor's performance per clock cycle can be (or, to be more precise, IS) emulated long before the first silicon is ready. There is one BIG unknown at that point - the clock speed that will be achieved (well, that and the challenges of the production process, which determine the investment needed to get the processor out). So let's analyze the possibilities:



    1. Apple was NOT convinced (1.5 - 2 years ago). Steve decided to wait and see and stay with the G5 for the time being. Even if this is the case, Apple would do its best to play it safe (they had enough processor-related problems, didn't they?), keep an eye on the development of the project, and move the operating system in a way that will make it easier to adopt the Cell if needed (just in case). Even in this scenario it is not possible to answer the main question, "Will Apple use Cell in a future PowerMac?" However, it puts the timeframe for a possible adoption at least 3-4 years away. Based on the information we have on Cell this is the safest bet. Under this scenario Apple will use the Cell ONLY IF it gives a definite, measurable advantage over a classic design. But Steve knows more about Cell than we do... IF the things we do not know are promising, there are two more scenarios.



    2. Apple was convinced and is preparing for the Cell. However, it is very unlikely that Apple will use the version of the Cell which was presented this month. So let's move to the 3rd scenario.



    3. Apple was interested in the Cell's potential and discussed with IBM (1.5 - 2 years ago) the optimizations they needed. Both companies started to work in this direction. The Apple version could include a Power Processor Element which is much more sophisticated than the one shown this month. It could be a redesigned G5, with the redesign most notably affecting the bus and supplemental resources - the cache, registers, etc. - while the main core (with its instruction scheduling architecture) stays almost unaffected, with multithreading and some power-saving enhancements added. Note that the size and transistor count of the current G5 suggest that, to match the demonstrated version of Cell (in terms of size), you can add some 4 SPEs. This is a much more effective solution (cost and performance) than using the Cell as a co-processor. The arrival of this new architecture is only 2 years away (tied to the Longhorn release?).





    So, we don't have internal information, and we have to speculate based on what is publicly available.

    What we know:

    1. Steve likes the idea of distributing OS tasks between different subsystems, and it looks like he likes it very much: NeXT boxes used DSPs, and now we have Quartz, Quartz Extreme, Core Image, Core Audio and Core Video - all capable of offloading the main processor and using whatever other subsystems are available. One specific aspect of the Core Image implementation (maybe it is similar in Core Audio and Core Video as well): when editing an image in Photoshop, all filters are applied to the entire image one after another. In Core Image, different filters can be "chained" and applied to the image virtually "simultaneously" (a rough sketch of the idea follows this list). This could be a good candidate for Cell's streaming capability. More on Core Data later.

    2. Apple did NOTHING to move Tiger to a 64 bit architecture. A few low level patches, available in Panther as well, for those who need to use 64 bitness - but not available in Cocoa.

    3. Apple is aggressively fighting for the living room. Its main consumer applications: iPhoto, iMovie, iTunes, iDVD, iChat AV, GarageBand, Keynote - with increasing efforts on interoperability between all the applications and the OS, as well as media streaming capabilities.

    4. No need to talk about the Apple pro applications, just to name them: Logic Pro, Motion, Final Cut Pro HD, DVD Studio Pro, Shake - all media related. Plus Xsan.

    5. New OS features: OS-level support for camera RAW formats (Finder, Preview.app, and we already have it in iPhoto; supported by NSImage and CIImage. And who said iPhoto/Finder have to make previews of the images one by one?). Spotlight and Core Data - permanently running database engines. Core Data is based on EOF (the Enterprise Objects Framework) found in NeXT/WebObjects, with the notable difference that it does not communicate with an external database but uses an embedded database engine. A lot of the underlying techniques of Core Data (faulting, uniquing, conflict detection, snapshotting) would benefit a lot from a multi-core architecture (a dual-core, dual-threaded G5 or a "conventional" G6 as well, for that matter).
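
    To illustrate what I meant by "chaining" in point 1, here is a toy sketch in plain C - NOT how Core Image is actually implemented, and the filter names are invented. Applying filters one after another means a full pass over the image per filter, while the chained version touches each pixel once, which is exactly the streaming access pattern an SPE-style core is built for.

    ```c
    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    /* Two invented per-pixel "filters" standing in for real image kernels. */
    static uint8_t brighten(uint8_t p) { return (uint8_t)(p > 235 ? 255 : p + 20); }
    static uint8_t invert(uint8_t p)   { return (uint8_t)(255 - p); }

    /* Photoshop-style: one full pass over the whole buffer per filter. */
    static void apply_one_by_one(uint8_t *img, size_t n)
    {
        size_t i;
        for (i = 0; i < n; i++) img[i] = brighten(img[i]);
        for (i = 0; i < n; i++) img[i] = invert(img[i]);
    }

    /* Chained: the composed filter runs in a single streaming pass. */
    static void apply_chained(uint8_t *img, size_t n)
    {
        size_t i;
        for (i = 0; i < n; i++) img[i] = invert(brighten(img[i]));
    }

    int main(void)
    {
        uint8_t image[16] = {0};
        apply_one_by_one(image, 16);
        apply_chained(image, 16);
        printf("pixel 0 = %u\n", (unsigned)image[0]);
        return 0;
    }
    ```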



    Maybe there is room for the Cell after all!



    P.S. Sorry for the long post!
  • Reply 163 of 220
    jrg Posts: 58 member
    Quote:

    Originally posted by shadow



    2. Apple did NOTHING to move Tiger to a 64 bit architecture. A few low level patches, available in Panther as well, for those who need to use 64 bitness - but not available in Cocoa.





    Apple has added a significant amount of 64 bit support in Tiger.



    There is complete support for 64 bit binaries and 64 bit arithmetic. While it doesn't support Cocoa, so far as we are aware, it amounts to far more than a few low level patches.



    The support is there where 64 bit matters: in the Unix layers. I can't think of a reason for the Finder, the system applications (Mail etc) or any user visible application to require 64 bit goodness. Write the 64 bit portion as a daemon or server and call it from a 32 bit client.
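
    A rough sketch of the pattern I mean (the helper name and its one-line output protocol are invented for illustration): build the heavy lifting as a separate 64-bit executable (-arch ppc64 on Tiger, if memory serves) and have the ordinary 32-bit client drive it through a pipe.

    ```c
    #include <stdio.h>
    #include <inttypes.h>

    int main(void)
    {
        FILE *helper;
        uint64_t result = 0;

        /* "bighash64" is a hypothetical 64-bit helper that chews through a
           file too big for a 32-bit address space and prints one number. */
        helper = popen("./bighash64 /Volumes/Data/huge.file", "r");
        if (!helper) { perror("popen"); return 1; }

        if (fscanf(helper, "%" SCNu64, &result) == 1)
            printf("64-bit helper returned %" PRIu64 "\n", result);

        /* The 32-bit client never needs 64-bit goodness itself. */
        return pclose(helper) == 0 ? 0 : 1;
    }
    ```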
  • Reply 164 of 220
    shadow Posts: 373 member
    Quote:

    Originally posted by JRG

    Apple has added a significant amount of 64 bit support in Tiger.



    There is complete support for 64 bit binaries and 64 bit arithmetic. While it doesn't support Cocoa, so far as we are aware, it amounts to far more than a few low level patches.



    The support is there where 64 bit matters: in the Unix layers. I can't think of a reason for the Finder, the system applications (Mail etc) or any user visible application to require 64 bit goodness. Write the 64 bit portion as a daemon or server and call it from a 32 bit client.




    Ooops!

    OK, no 64 bit support in Cocoa. I am not sure that really matters - we all know that 64 bit is not going to provide the leap in performance for all applications that the 16 bit to 32 bit transition did.
  • Reply 165 of 220
    That article is poor, and just plain inaccurate in some ways. Everybody saying that the Cell's Power core is not adequate for Apple is ignoring the 2:1 clock rate differential, which is just silly. This core will, on average, be about the same speed as current 970 offerings, and we don't have much hope for faster 970s. As a result, the Cell's bandwidth, 8 vector units, and on-chip memory controller will grossly outmatch the 970-based Macs.
  • Reply 166 of 220
    Quote:

    Originally posted by Programmer

    I give up. Let's just wait until they publish more information.



    There's no technical reason preventing Apple from using Cell. IAW, you're right, so take it easy



    End of Line
  • Reply 167 of 220
    onlooker Posts: 5,252 member
    Quote:

    While the specs of the new chip are impressive, especially with its integrated support for virtualization and speedy video performance, analysts said differences between the Cell and the current PowerPC architectures will make any transition an unlikely prospect for the next few years.



    That sums up my view on Apple and Cell as well. It's definitely not something we'll be seeing in the next year.
  • Reply 168 of 220
    snoopy Posts: 1,901 member
    Quote:

    Originally posted by Programmer

    That article is poor, and just plain inaccurate in some ways. Everybody saying that the Cell's Power core is not adequate for Apple is ignoring the 2:1 clock rate differential, which is just silly. This core will, on average, be about the same speed as current 970 offerings, and we don't have much hope for faster 970s. As a result, the Cell's bandwidth, 8 vector units, and on-chip memory controller will grossly outmatch the 970-based Macs.





    Thanks. I was hoping you would reply to this. I disagreed with the article when I read it, and thought it was full of logical holes. Yet I didn't know enough about the topic to be confident of my conclusion.



    IMO, the dumbest thing the author says is that IBM hopes to interest Apple in the Cell processor. I believe there is a good chance Apple had knowledge of Cell for quite some time, and is preparing Tiger to run on it.
  • Reply 169 of 220
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by snoopy

    I believe there is a good chance Apple had knowledge of Cell for quite some time, and is preparing Tiger to run on it.



    That's hysterical.
  • Reply 170 of 220
    Quote:

    Originally posted by snoopy

    IMO, the dumbest thing the author says is that IBM hopes to interest Apple in the Cell processor. I believe there is a good chance Apple had knowledge of Cell for quite some time, and is preparing Tiger to run on it.



    Two things to be careful of:



    1) The Cell is an STI project that Apple has no doubt been aware of, but I doubt they've had terribly accurate knowledge of the details. The AIM partnership always suffered from an inability to share information with other companies because it had to get past 3 sets of management to allow it to go out. I expect STI is much the same, perhaps even worse because they are international and government restrictions on trade & export would come into play at some point. So Apple may just be finding out about this thing now (in detail), but may have had promises from IBM about how applicable it is and the general direction of the architecture.



    2) "Preparing Tiger to run on it" can be interpreted in two ways. Option 1, typical of the more literal minded, is that they are writing code that will use the Cell's vector cores and allow Tiger to boot on some mythical Cell-based Mac. I would be extremely surprised if this was the case -- see point 1 for more details. Option 2, for the planners among us, is to build things into Tiger which would allow Apple programmers to quickly take advantage of the vector units in a way that could immediately benefit applications that are taking advantage of current MacOS X technologies, plus those coming in Tiger. OpenGL, Quartz2D, CoreAudio, CoreImage, QuickTime, etc are all design to provide computationally intensive services to an application without that application knowing how the computation is actually being done. These systems can use AltiVec, multiple cores, programmable GPUs (and in the future, Cell vector units) all without explicit knowledge of the application. For all these applications that do this to leverage much of the Cell's power, all that needs to happen is for Apple to write the Cell vector code for each of these modules. This is the direction I would steer Apple if I was at the helm, and this is the direction they appear (to me) to be taking. And if they're not taking it then hopefully SJ is reading this and going "hey, this guy is pretty smart, maybe we'd better do what he says before Sony & IBM steamroller us".
  • Reply 171 of 220
    snoopy Posts: 1,901 member
    Quote:

    Originally posted by Programmer



    . . . The Cell is an STI project that Apple has no doubt been aware of, but I doubt they've had terribly accurate knowledge of the details. The AIM partnership always suffered from an inability to share information with other companies because it had to get past 3 sets of management to allow it to go out. I expect STI is much the same, perhaps even worse because they are international and government restrictions on trade & export would come into play at some point. So Apple may just be finding out about this thing now (in detail), but may have had promises from IBM about how applicable it is and the general direction of the architecture. . .







    This is certainly a logical way to look at the situation, and you may very well be right. On the other hand, STI may have actively sought the involvement of Apple and other potential customers for Cell. That's why I give it a good chance. Maybe not 50-50, but decent odds.



    I considered what STI's motivation might be from what I read. At one extreme, STI could wish to keep it proprietary, for their own advantage. At the other, STI may wish to get Cell into everybody's products and have it become the new world standard. I'm guessing that STI's position is much closer to the second statement. That's why I'm optimistic.



    I believe STI has been motivated to get Cell into products quickly. A gradual, slow adoption of Cell gives the competition too much time to respond, which will make Cell's adoption even slower. However, if Cell can hit the market and reach "critical mass" quickly, it is in a much better position to duke it out with the kingpin chip maker.



    So, how would STI achieve such a goal? By talking to key potential customers far enough ahead of time. First, potential customers got STI's sales pitch. Those who wanted to be one of the first to put Cell in a product would have entered into an agreement with STI and have been given early details, which allows product design to proceed.



    I think there is a good chance that STI took this aggressive approach.
  • Reply 172 of 220
    macronin Posts: 1,174 member
    I keep thinking back to the keynote address with the guy from Sony, and Steve saying 'someday maybe music and computers'...



    And then the interview with Steve in Fortune where he states that three big name PC manufacturers were courting Apple to license OS X...



    And now we have Cell, a 'revolutionary' compute unit from a three-headed consortium of, wait for it, PC manufacturers...



    And when they mention DCC workstations based on Cell, who do we know that makes a pretty nice package for DCC work...?!?



    Draw your own conclusions, but start out the Mac hardware lineup with a basic Cell-based Mac mini, and scale up from there...
  • Reply 173 of 220
    Just a thought: I have been thinking about the posters who say that by the time Apple would be able to use Cell, the 9xx series could have gotten much faster. They are both moving targets. But if IBM makes a dual-core 9xx with an on-die memory controller and Apple keeps two chips in the pro computers, things would get interesting. We would have 8 FPUs and 8 VMX units in that configuration.



    As I understand it, there are a few things holding back the performance of the 970. The compiler: IBM's is apparently much better, and Apple is working on GCC, with auto-vectorization now coming to it (a small example of the kind of loop it targets is below). I think memory bandwidth could also rear its ugly head when VMX is heavily used, and this could get much worse when we have two chips, four cores, 8 FPUs, and 8 VMX units. I hope Tiger is heavily multi-threaded, and I hope Apple has something up their sleeve to alleviate memory bandwidth issues.
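
    For what it's worth, this is the kind of loop the new vectorizer is supposed to handle on its own - something like "gcc-4.0 -O2 -ftree-vectorize -maltivec" (flags from memory, check the docs) should turn it into AltiVec code with no vec_* intrinsics in the source:

    ```c
    #include <stdio.h>

    #define N 1024

    float x[N], y[N];

    /* Independent iterations over plain global arrays: exactly the shape
       an auto-vectorizer looks for, so it can do 4 floats per vector op. */
    void saxpy(float a)
    {
        int i;
        for (i = 0; i < N; i++)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        int i;
        for (i = 0; i < N; i++) { x[i] = (float)i; y[i] = 1.0f; }
        saxpy(2.0f);
        printf("y[10] = %f\n", y[10]);
        return 0;
    }
    ```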
  • Reply 174 of 220
    Quote:

    Originally posted by snoopy

    I considered what STI's motivation might be from what I read. At one extreme, STI could wish to keep it proprietary, for their own advantage. At the other, STI may wish to get Cell into everybody's products and have it become the new world standard. I'm guessing that STI's position is much closer to the second statement. That's why I'm optimistic.



    Keeping it proprietary and involving other parties in the design process are two very different things. Successful product design projects usually do not directly involve potential customers in the design process. They consider (or ask) what the customer needs, but the project itself is focused and as streamlined as possible. Failure to do that usually results in project failure.



    Quote:

    I believe STI has been motivated to get Cell into products quickly. A gradual, slow adoption of Cell gives the competition too much time to respond, which will make Cell's adoption even slower. However, if Cell can hit the market and reach "critical mass" quickly, it is in a much better position to duke it out with the kingpin chip maker.



    Absolutely, but that just means making samples available to customers -- not having them take part in the design. "Too many cooks in the kitchen..."



    Besides, between Sony's PS3, Toshiba's TVs, and IBM's workstations and supercomputers I'm sure that their fabs will be busy. That is pretty much a given, otherwise Sony and Toshiba wouldn't be building additional fabs just for making Cell processors.



    Quote:

    So, how would STI achieve such a goal? By talking to key potential customers far enough ahead of time. First, potential customers got STI's sales pitch. Those who wanted to be one of the first to put Cell in a product would have entered into an agreement with STI and have been given early details, which allows product design to proceed.



    Yes, but far enough ahead of time is about now (or maybe a few months ago). Keep in mind that Cell-based products are probably still a year away. ISSCC talks about chips well before consumers can buy products built around them. What did they say? Production starting 2H '05?



    Quote:



    From Brendon:

    Just a thought: I have been thinking about the posters who say that by the time Apple would be able to use Cell, the 9xx series could have gotten much faster. They are both moving targets. But if IBM makes a dual-core 9xx with an on-die memory controller and Apple keeps two chips in the pro computers, things would get interesting. We would have 8 FPUs and 8 VMX units in that configuration.




    Last time I looked the 970 wasn't moving that much. A dual-core 970 is probably coming, but there are zero indications that an on-chip memory controller is. And a dual-core 970 is only 2 instruction streams, compared to the Cell's 10. Two 970s have 4 FPU execution units, 2 VMX math units and 2 VMX permute units; I don't know where you get 8/8 from. This compares to some unknown (but smaller) number of units in the PPE, and probably 2 per vector core (x8)... for something like 18-20 total. And they operate on different code sequences and are thus more flexible.



    Quote:

    As I understand it, there are a few things holding back the performance of the 970. The compiler: IBM's is apparently much better, and Apple is working on GCC, with auto-vectorization now coming to it. I think memory bandwidth could also rear its ugly head when VMX is heavily used, and this could get much worse when we have two chips, four cores, 8 FPUs, and 8 VMX units. I hope Tiger is heavily multi-threaded, and I hope Apple has something up their sleeve to alleviate memory bandwidth issues.



    All the compiler work on auto-vectorization will benefit Cell much more heavily, and I'd guess that the Cell was the real motivation for it. What is really holding back the 970 is its microarchitecture -- it simply isn't scaling up, most likely due to the complexity of the OoOE unit and its automated design. I say that because the Cell's PPE doesn't have one, is hand-tuned, and runs at >4 GHz.



    As for memory bandwidth, the Mac memory subsystem needs faster types of RAM to catch up with the 970's existing FSB. Once that is saturated then it becomes an issue of the 970's design, but that is a ways off yet.



    There are more than enough threads in MacOS X (even pre-Tiger) to keep 2 Power threads busy. That's how many both Cell and a 970MP have. A dual 970MP machine might be under-utilized unless you were running multi-threaded apps, but there is no reason to run all the cores at full speed unless you have work for them -- and things like CoreImage are designed to distribute work across many cores (whether they are PPC, GPU, or Cells).
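
    To be concrete about "distribute work across many cores": at bottom it is just splitting a buffer among worker threads, as in this toy pthreads sketch (the real frameworks pick the worker count, and the backend, for you).

    ```c
    #include <pthread.h>
    #include <stdio.h>

    #define N       (1 << 20)
    #define WORKERS 2              /* 2 for a dual 970; more on a Cell */

    static float data[N];

    static void *worker(void *arg)
    {
        size_t id = (size_t)arg;
        size_t chunk = N / WORKERS;
        size_t i;
        for (i = id * chunk; i < (id + 1) * chunk; i++)
            data[i] = data[i] * 0.5f + 1.0f;   /* stand-in for per-pixel work */
        return NULL;
    }

    int main(void)
    {
        pthread_t t[WORKERS];
        size_t i;
        for (i = 0; i < WORKERS; i++)
            pthread_create(&t[i], NULL, worker, (void *)i);
        for (i = 0; i < WORKERS; i++)
            pthread_join(t[i], NULL);
        printf("done, data[0] = %f\n", data[0]);
        return 0;
    }
    ```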
  • Reply 175 of 220
    Quote:

    Originally posted by Programmer

    Last time I looked the 970 wasn't moving that much. A dual-core 970 is probably coming, but there are zero indications that an on-chip memory controller is. And a dual-core 970 is only 2 instruction streams, compared to the Cell's 10. Two 970s have 4 FPU execution units, 2 VMX math units and 2 VMX permute units; I don't know where you get 8/8 from. This compares to some unknown (but smaller) number of units in the PPE, and probably 2 per vector core (x8)... for something like 18-20 total. And they operate on different code sequences and are thus more flexible.



    Clarification: if Apple continues to ship the pro computers with two chips and IBM puts two cores on each chip, that would be four processors, each with two FPUs and two VMX units, for a grand total of 8 FPUs and 8 VMX units.



    Quote:

    All the compiler work on auto-vectorization will benefit Cell much more heavily, and I'd guess that the Cell was the real motivation for it. What is really holding back the 970 is its microarchitecture -- it simply isn't scaling up, most likely due to the complexity of the OoOE unit and its automated design. I say that because the Cell's PPE doesn't have one, is hand-tuned, and runs at >4 GHz.



    No argument here.



    Quote:

    As for memory bandwidth, the Mac memory subsystem needs faster types of RAM to catch up with the 970's existing FSB. Once that is saturated then it becomes an issue of the 970's design, but that is a ways off yet.



    Agreed



    Quote:

    There are more than enough threads in MacOS X (even pre-Tiger) to keep 2 Power threads busy. That's how many both Cell and a 970MP have. A dual 970MP machine might be under-utilized unless you were running multi-threaded apps, but there is no reason to run all the cores at full speed unless you have work for them -- and things like CoreImage are designed to distribute work across many cores (whether they are PPC, GPU, or Cells).



    As you point out, the "Cores" (image, audio, video, data) of Tiger are designed to spread work across many cores, and Apple will have four cores to spread the work across, as well as the GPU. I'm not saying that Cell is not all that, just that the 970 ain't dead either. June will be an interesting month and we will know much more then.
  • Reply 176 of 220
    I find it kind of odd that John Rizzo is talking about Cell processors. If this is the John Rizzo of www.macwindows.com fame, I think he's a bit out of his element, and that would explain some of the inaccuracies in the article.



    I don't really think that we need OS X Tiger running on Cell; rather, I'd love to see Apple move to Cell for their multimedia - either as a coprocessor, or with QuickTime sitting on top of Cell for media-center-like devices.
  • Reply 177 of 220
    wizard69 Posts: 13,377 member
    Quote:

    Originally posted by snoopy

    This is certainly a logical way to look at the situation, and you may very well be right. On the other hand, STI may have actively sought the involvement of Apple and other potential customers for Cell. That's why I give it a good chance. Maybe not 50-50, but decent odds.



    There are the obvious public contacts between the companies to support your position. Further, I would imagine that there are people getting advance information years before it is made public. Now the question is: would the Cell people want Apple involved?

    Quote:



    I considered what STI's motivation might be from what I read. At one extreme, STI could wish to keep it proprietary, for their own advantage. At the other, STI may wish to get Cell into everybody's products and have it become the new world standard. I'm guessing that STI's position is much closer to the second statement. That's why I'm optimistic.



    There is little advantage to proprietary parts for high-volume products. The other thing that people seem to be failing to grasp is that STI has very much said publicly that Cell is going to go into a lot of products. So like you, I'm optimistic. I'm also optimistic that Apple is far along with a Cell-based product. Maybe it won't be released this half of the year, but late in the year is a possibility.

    Quote:



    I believe STI has been motivated to get Cell into products quickly. A gradual, slow adoption of Cell gives the competition too much time to respond, which will make Cell's adoption even slower. However, if Cell can hit the market and reach "critical mass" quickly, it is in a much better position to duke it out with the kingpin chip maker.



    Cell doesn't have anyone to duke it out with. I suspect STI's biggest problems will be production ramp and code development.

    Quote:



    So, how would STI achieve such a goal? By talking to key potential customers far enough ahead of time. First, potential customers got STI's sales pitch. Those who wanted to be one of the first to put Cell in a product would have entered into an agreement with STI and have been given early details, which allows product design to proceed.



    Exactly! Everyone involved with STI most likely has products far along in the development cycle. What those products are and how they use Cell is a mystery now, but I could see both Toshiba and Sony applying this technology to a bunch of stuff beyond game boxes. The question becomes: is Apple in the loop? I have to say yes at this point.

    Quote:



    I think there is a good chance that STI took this aggressive approach.



    Well, I'm going to say right now that I think Apple is in the loop in some fashion. Maybe not with the Cell chip we recently saw announced, but certainly using the technology. In a nutshell, that is what Cell is all about: a technology platform more than anything. Apple could very well have this technology ready for the portable platforms or even the mini, adapted of course for those thermal environments.



    Dave
  • Reply 178 of 220
    Quote:

    Originally posted by hmurchison

    I find it kind of odd that John Rizzo is talking about Cell processors. If this is the John Rizzo of www.macwindows.com fame, I think he's a bit out of his element, and that would explain some of the inaccuracies in the article.



    I don't really think that we need OS X Tiger running on Cell; rather, I'd love to see Apple move to Cell for their multimedia - either as a coprocessor, or with QuickTime sitting on top of Cell for media-center-like devices.




    I believe that about 5+ years ago there were questions about QuickTime not needing Windows to run on Intel computers; there was thought put into giving QuickTime a Unix core and that way bypassing Windows and MS. I don't know what happened to that, but I remember that Apple said QuickTime was designed to sit on different hardware and could in time be an operating system, as long as it could use Unix for the core functions. Note that this came out around the time Netscape said they could make an operating system that would let the user never know that Windows was running under the browser.



    So to start a thread: is it possible and economically viable for Apple to have a standalone version of QuickTime with a Unix core, so that QuickTime could be included or sold separately to PS3 owners, since Cell should be able to run multiple operating systems?



    And is it possible that an economic reason for Apple to be interested in Cell, other than speeding up Tiger's Cores, would be the ease of porting games from the PS3 and Xbox, since they will be written for Cell? If this is possible - from what I have heard, game porting itself is not difficult, but porting while keeping the speed is very difficult, requiring lots of processor-specific code to be ported to a different processor - then the Mac platform would no longer suffer this problem.
  • Reply 179 of 220
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by hmurchison

    I find it kind of odd that John Rizzo is talking about Cell processors. If this is the John Rizzo of www.macwindows.com fame, I think he's a bit out of his element, and that would explain some of the inaccuracies in the article.



    I don't really think that we need OS X Tiger running on Cell; rather, I'd love to see Apple move to Cell for their multimedia - either as a coprocessor, or with QuickTime sitting on top of Cell for media-center-like devices.




    I'm with you. If Apple is even thinking of using a Cell design yet, I think the first product we would see a Cell processor in would be a new one.
  • Reply 180 of 220
    wizard69 Posts: 13,377 member
    Quote:

    Originally posted by Programmer

    Keeping it proprietary and involving other parties in the design process are two very different things. Successful product design projects usually do not directly involve potential customers in the design process. They consider (or ask) what the customer needs, but the project itself is focused and as streamlined as possible. Failure to do that usually results in project failure.



    Doesn't this sort of fly in the face of what Cell accomplished, or what AIM accomplished with AltiVec? Apple was heavily involved in the development of AltiVec and could be heavily involved in the development of Cell.



    In the general business sense, though, your statement about customers not being involved in the design process is just wrong. It doesn't matter if it is a ball-bearing unit for a car, a wing for an airplane, or a PC chip: success needs customer involvement.

    Quote:





    Absolutely, but that just means making samples available to customers -- not having them take part in the design. "Too many cooks in the kitchen..."



    This simply isn't modern business practice.

    Quote:



    Besides, between Sony's PS3, Toshiba's TVs, and IBM's workstations and supercomputers I'm sure that their fabs will be busy. That is pretty much a given, otherwise Sony and Toshiba wouldn't be building additional fabs just for making Cell processors.



    Yes, all these fabs do indicate that a huge number of Cell-based processors will soon be on the market. This clearly indicates a wide take-up of the design.

    Quote:





    Yes, but far enough ahead of time is about now (or maybe a few months ago). Keep in mind that Cell-based products are probably still a year away. ISSCC talks about chips well before consumers can buy products built around them. What did they say? Production starting 2H '05?



    The evidence is clear that Apple knew about the 970 well before it was publicly announced at ISSCC and it was a year after that when we got good hardware. Apple certainly could have been in the loop before the ISSCC debut of Cell.



    While it may be a stretch, I think the clearest indication yet is the lack of a low-power 970 that could be used for Apple's other needs. A lower-power variant of Cell could very well be intended for Apple's low-power needs, where it seems to be the best offering to date. That is, the PPE is apparently the only low-power 64-bit core available at the moment.

    Quote:





    Last time I looked the 970 wasn't moving that much. A dual-core 970 is probably coming, but there are zero indications that an on-chip memory controller is. And a dual-core 970 is only 2 instruction streams, compared to the Cell's 10. Two 970s have 4 FPU execution units, 2 VMX math units and 2 VMX permute units; I don't know where you get 8/8 from. This compares to some unknown (but smaller) number of units in the PPE, and probably 2 per vector core (x8)... for something like 18-20 total. And they operate on different code sequences and are thus more flexible.



    Actually, the fact that they operate on different code sequences makes them less flexible. 970-based multi-core products would still have an advantage in some applications. Cell is at an advantage when the SPEs can be put to good use. It is not clear at all, at least to me, how effective the SPEs would be with scalar integer ops - that is, could one feed a regular PPC thread to them for execution? The indications are that this is impossible, so the flexibility is really not there.

    Quote:





    All the compiler work on auto-vectorization will benefit Cell much more heavily, and I'd guess that the Cell was the real motivation for it. What is really holding back the 970 is its microarchitecture -- it simply isn't scaling up, most likely due to the complexity of the OoOE unit and its automated design. I say that because the Cell's PPE doesn't have one, is hand-tuned, and runs at >4 GHz.



    Well, it remains to be seen how well the auto-vectorizing compilers work in the first place. However, I do not see them benefiting Cell if the code has to be dramatically different on Cell.

    Quote:



    As for memory bandwidth, the Mac memory subsystem needs faster types of RAM to catch up with the 970's existing FSB. Once that is saturated then it becomes an issue of the 970's design, but that is a ways off yet.



    An on-board memory controller does move that memory closer to the processor, even if it is the same old technology. So even today one could see a benefit on the 970 or a follow-on. There seems to be little in the rumor mill about a 970 with an on-board memory controller coming, indicating to me that 970 technology isn't long for this world. IBM has seen little uptake of the 970 and its I/O bus outside of Apple.

    Quote:



    There are more than enough threads in MacOS X (even pre-Tiger) to keep 2 Power threads busy. That's how many both Cell and a 970MP have.



    Yes, but let's not confuse the two: one is a second processor and the other is the result of SMT. The two shouldn't be compared until we get a handle on how well IBM's SMT works on Cell. Plus, we have the possibility that the cores on the MP might be threaded.

    Quote:



    A dual 970MP machine might be under-utilized unless you were running multi-threaded apps, but there is no reason to run all the cores at full speed unless you have work for them -- and things like CoreImage are designed to distribute work across many cores (whether they are PPC, GPU, or Cells).



    Even the single-processor Linux machine that I'm writing this on is under-utilized 95% of the time. But - and it is a big but - I can still load it down from time to time in ways that can be frustrating. Even more frustrating is that there are times when one solution (Cell) would be more effective than the other (an MP) and vice versa. For most of those times, though, Cell would seem to be the ideal solution for a better computing experience.



    Dave