The Reg on future G4 roadmap


  • Reply 61 of 93
    arty50 Posts: 201 member
    [quote]Originally posted by RazzFazz:

    <strong>



    But the point is, it's only an incredibly small part of the market that actually has any benefits from going to 64bit chips. And what's even worse, those who need it have been using it for a long time now, only on other platforms - what reason would they have to switch to Macs?



    And bragging rights alone surely wouldn't justify the costs.



    Bye,

    RazzFazz</strong><hr></blockquote>



    You have to ask yourself, "Why is it a small part of the market?"



    The number one overriding answer is COST. It has nothing to do with willingness. If a vastly larger number of people could suddenly afford to do CG animation, they would jump on the opportunity instantly. That's CCR65's point. If Apple could provide a 64-bit consumer machine, it would open up this type of work to a much larger portion of the population. The big boys could stick with SGI, Sun, etc., but independent filmmakers could now afford to put some tight graphics into their movies.



    The DV revolution is a perfect example of this. DV allowed Joe Schmoe to shoot, edit, and record a film to media all on a personal computer. I know people who have produced shorts, and damn good ones, on Pismos. Before, you basically HAD to use an Avid set-up ($$$) to even think about making a movie.



    [ 02-12-2002: Message edited by: Arty50 ]
  • Reply 62 of 93
    Consumer applications don't need 64-bit processors yet, and likely won't for at least a few years. Big servers benefit from it, and some truly massive computational tasks. DV and 3D work don't need it. The 64-bit processors that are currently available also happen to be the high end of the performance spectrum, and that is why they are used for render farms... not because of their 64-bit nature.
  • Reply 63 of 93
    macgregor Posts: 1,434 member
    Boy, a lot of people are getting pretty worked up over very little info.



    The G4 part of the article seems pretty reasonable and not at all something to feel bad about. The digital hub cannot be an optimized Linux box at Pixar or a Nintendo 64, either. It has to be flexible, and all of those things in the article point to a very flexible consumer path for the already well-appointed iMac2. This is GOOD news.



    So what are cowerd and CCR65 worrying about? The article and common sense say that there is the smell of a greater divergence in the upgrade paths of the consumer and pro product lines. The article sez this, and it sez it in an unusually optimistic way for the Register, which lives and breathes open source.



    We don't know the pro upgrade path. Apple has been buying a lot of high-end companies; that will make a big change in Apple corporate even by diffusion alone.



    For the last few years the difference between consumer and pro was G3 vs. G4, plus some MHz differences. That is nothing compared with what I think Apple has in mind. So take a breath and realize that the entire pro/creative community isn't going to change its hardware either way in the next 6 months. Businesses don't change that fast... only "insider" predictions do.
  • Reply 64 of 93
    telomar Posts: 1,804 member
    [quote]Originally posted by CCR65:

    <strong>If you saw a story about Pixar purchasing Intel boxes then post the link. I'm not saying you're lying, and if they are running Linux that at least makes some sense, although I doubt they would use them for their render farm. Renderman does work on Linux, but Pixar Studios is going to use the fastest boxes they can for modeling and rendering. The Linux version of Renderman is sold as a product. Those machines could easily be development machines or general network computers.



    They use Macs as well so I don't know what any of this proves.</strong><hr></blockquote>



    They are replacing their SGI workstations with 400 Pentium 4 IntelliStations from IBM.



    <a href="http://news.zdnet.co.uk/story/0,,t289-s2103536,00.html" target="_blank">General blurb</a>



    Or if you are too lazy to look at the link:



    [quote] Pixar was the lone company of the four that didn't have a mainframe story to tell. Instead, it was advocating Linux on the workstations its teams of animators use to create their digital characters and complex effects such as animated facial features. Pixar so far has installed about 175 of a planned 400 Intel-based IBM IntelliStation workstations running Linux, said Darwyn Peachey, Pixar's vice president of technology.



    "By the end of March, we'll have essentially phased out our SGI (workstations) and replaced them with IntelliStations," Peachey said. <hr></blockquote>



    In addition to that they use workstations from Sun. Any use of Macs would be minor.
  • Reply 65 of 93
    [quote]Originally posted by CCR65:

    <strong>Pixar's business does not depend on Pentium 4s. They use Sun SPARC and Macs. Oh, I'm sure there are Windows boxes sitting around doing something (like crashing) there, but do they depend on them for their livelihood? Not a chance.







    ???? I've been doing video and audio editing, graphics, some web design and software development for over a decade. I never heard of any "digital artist" wanting to use an RS6000. Photoshop works on Mac/Windows. SGI is used widely as well (although not widely enough to keep them afloat).</strong><hr></blockquote>



    You really need to update your info.



    Pixar has been replacing their SGI hardware with IBM Intellistations with Pentium 4 and Xeon processors.



    I never said IBM tries to sell RS6000 machines to digital artists, where would you get that? Oh, I guess you didn't know that IBM also sells Intel based workstations. Well, then it's not my fault that you didn't know that, is it?
  • Reply 66 of 93
    [quote]Originally posted by Arty50:

    <strong>You have to ask yourself, "Why is it a small part of the market?"

    </strong><hr></blockquote>



    My point was not that only a few can afford it, but rather that only a few actually see any benefit whatsoever from going to 64 bits.



    It seems to be a common misbelief that 64 bits are in some way inherently better. They are, if and only if you need huge address spaces (which could already be had with current G4s - see my post above - so if there's such a need, why doesn't Apple just use larger address spaces right now?) or 64-bit integer calculations. Neither of these is an issue for most of the things nearly anyone would ever do with their computer, and since the basic int is now suddenly twice as big, it actually even hurts performance where it is not needed.



    Bye,

    RazzFazz



    [ 02-13-2002: Message edited by: RazzFazz ]
  • Reply 67 of 93
    g-news Posts: 1,107 member
    maybe they liked that:



    512KB, Level 2 Advanced Transfer Cache

    The Level 2 Advanced Transfer Cache (ATC) is 512KB in size and delivers a much higher data

    throughput channel between the Level 2 cache and the processor core. The Advanced

    Transfer Cache consists of a 512-bit (32-byte) interface that transfers data on each core clock.

    As a result, the Pentium 4 processor can deliver a data transfer rate of core speed multiplied by

    32 bytes, reported in GB/s. This compares to a transfer rate of 16GB/s on the Pentium III

    processor at 1GHz. Features of the ATC include:



    Non-Blocking, full speed, on-die Level 2 cache

    8-way set associativity

    512-bit data bus to the Level 2 cache

    Data clocked into and out of the cache every clock cycle
  • Reply 68 of 93
    [quote]Originally posted by G-News:

    <strong>What is making me pessimistic is not the outlook for MWNY, but the outlook beyond that, the way it looks we're not going to hit 2GHz this year, and it even looks like no 64bit chip until 04 or even later.</strong><hr></blockquote>



    Geez, I thought we all knew about the MHz myth. Total performance depends on the clock and the work done per clock; a long pipeline means less work done per clock. A low-MHz processor with a short pipeline can keep up with a high-MHz processor with a long pipeline. Don't forget that the 1 GHz P3 is faster than a 1.5 GHz P4.



    You can compare it to the size of a wheel (the pipeline) and its number of rotations. A small wheel needs many more rotations to reach the same speed. A P4 is like a car with 1" wheels that turn very fast: it might look cool, but it will generate a lot of friction (heat). You are better off with larger wheels that turn slower.



    CPUs have a sweet spot that gives you the best bang for the buck (the best performance for the wattage). Intel (and AMD) chose to push MHz to suboptimal levels so as to sell to stupid people. Fortunately for them, that is a large market.



    And everyone who knows something about 64-bit says that we don't need it right now. It will get you more expensive processors with a slower clock that perform worse in 32-bit operations. In return you can use more than 4 GB per app, and a few operations will get faster (although you could speed many of these up with AltiVec as well). So I'd rather not have a 64-bit CPU (yet).



    What I care about is high performance and low wattage. A 1.5 GHz G4 with a 500 MHz bus using less than 20 watts should be awesome, certainly on par with a 2.5 GHz P4. It will produce much less heat, allowing Apple to build quiet computers (although they put a very noisy fan in the Power Mac G4, strangely enough) and fast PowerBooks. The people who are angry about not getting a G5 are just immature and ignorant; the point of having a computer is not to get a new buzzword-compliant CPU, but to get better performance or features.



    The rumored G4e certainly delivers; it has 75% of the stuff we wanted from the G5 anyway.
  • Reply 69 of 93
    rickag Posts: 1,626 member
    [quote]wfzelle

    "Don't forget that the 1 GHz P3 is faster than a 1.5 GHz P4."

    "A 1.5 GHz G4 with a 500 MHz bus using less than 20 watts should be awesome, certainly on par with a 2.5 GHz P4"<hr></blockquote>



    Just because a 1 GHz P3 is marginally faster than a 1.5 GHz P4 doesn't mean that a 1.5 GHz G4 will be comparable to a 2.5 GHz P4. Also, if - a mighty big if - you go by the Register, the MPC7500 G4 w/ RapidIO & 500 MHz bus will come along about this time next year. What will Intel/AMD be offering by then?



    Don't get me wrong. Me personally, I will be EXTREMELY HAPPY if - a mighty big if - Apple intros an MPC7470(?) @ 1.5 GHz using a 0.13µm process and a faster, 266 MHz-capable bus using DDR SDRAM in July.
  • Reply 70 of 93
    Okay, okay. They are using IntelliStations. None of this has anything to do with Apple save for the debate about the usefulness of 64-bit chips in video and 3D.



    I believe the "scientific research" everyone is referring to that fully uses a 64-bit chip's resources is things such as weather modeling and mathematical calculations. I'm aware that currently most creative software is coded for 32-bit machines. That does not, however, mean that it cannot be re-coded to take advantage of 64-bit processing. Video effects, compression, and 3D modeling do have a lot of complex math involved. Memory bandwidth is unquestionably a benefit, but once the data is in the processor it has to crunch some numbers. I'm willing to bet that the video and 3D apps coded especially for Irix/SGI do just that. If the G5 will be both 32- and 64-bit compatible, then there should be no problem in introducing 64-bit-optimized apps into the Mac market.



    From a marketing standpoint you are probably right: 64-bitness won't sell in the same way that it did for video cards that used bitness in marketing (Rage128, etc.). However, barring asking a software engineer who codes 3D apps to settle this (I don't know anybody offhand), I'm still convinced that if video and graphics apps were coded for a 64-bit G5, it would be a part (just a part, mind you) of what Apple could use to show better performance.



    I'm not going to get into the argument about processor speed potential for AMD and Intel, since I don't know enough about CPU design to go that deep, but I do know that AMD of late has made some attempts to dispel the MHz myth, as if they see a slowdown in speed increases coming. This could be due to their capacity, but if Intel starts doing it as well (Itanium), I'm hoping more people will talk about bus speed, width, and such.



    I know that RapidIO is what would create a high-bandwidth connection to the rest of the mobo. However, though I am not a chip engineer, it makes sense to me that widening the data path inside the chip has to be a good thing for dataflow inside and outside. I am somewhat confused by the seeming contradictions in talk about an on-die memory controller and the inclusion of RapidIO on the same chip.



    As I told you, I wasn't saying that you were lying. I crave information about gadgets and read as much as I can, but I must have missed the article. I meant no offense. I am passionate about what I do and about gadgets, and I count myself lucky to be a Mac user and to be doing what I enjoy. Considering the economy, I could easily be laid off and flipping burgers (sorry if I made anybody hungry).
  • Reply 71 of 93
    And yes, I know how to spell, but typing too fast is a problem. <img src="graemlins/hmmm.gif" border="0" alt="[Hmmm]" />
  • Reply 72 of 93
    [quote]Originally posted by CCR65:

    <strong>I'm aware that currently most creative software is coded for 32-bit machines. That does not, however, mean that it cannot be re-coded to take advantage of 64-bit processing. Video effects, compression, and 3D modeling do have a lot of complex math involved.

    </strong><hr></blockquote>



    Yes, but more often than not, "complex math" pretty much also means "floating point". And (scalar) FP operations are already 64 bits wide on all current x86 and G4 processors, just like on the Alpha, on SPARC, or on MIPS (actually, they are even 80 bits wide in x86 land). The FP units will not gain anything from going to a 64-bit chip that they couldn't get on a 32-bit chip.





    [quote]<strong>Memory bandwidth is unquestionably a benefit, but once the data is in the processor it has to crunch some numbers. I'm willing to bet that the video and 3D apps coded especially for Irix/SGI do just that. If the G5 will be both 32- and 64-bit compatible, then there should be no problem in introducing 64-bit-optimized apps into the Mac market.</strong><hr></blockquote>



    Once again, the whole number-crunching advantage provided by 64-bitness is that you can have very large address spaces (which could also be had with current G4s), and that you can do single-cycle 64-bit arithmetic (which is only useful if you handle numbers larger than 2^32-1).





    [quote]<strong>From a marketing standpoint you are probably right: 64-bitness won't sell in the same way that it did for video cards that used bitness in marketing (Rage128, etc.).

    </strong><hr></blockquote>



    Note that the bitness used in GFX card advertising isn't even comparable to the bitness in our context at all.





    [quote]<strong>I know that RapidIO is what would create a high-bandwidth connection to the rest of the mobo. However, though I am not a chip engineer, it makes sense to me that widening the data path inside the chip has to be a good thing for dataflow inside and outside.</strong><hr></blockquote>



    Keep in mind that RapidIO doesn't depend on a chip being 64 bit at all.



    Bye,

    RazzFazz



    [ 02-13-2002: Message edited by: RazzFazz ]
  • Reply 73 of 93
    [quote]Originally posted by G-News:

    <strong>maybe they liked that:

    The Advanced Transfer Cache consists of a 512-bit (32-byte) interface that transfers data on each core clock.

    </strong><hr></blockquote>



    Where are you getting this from? This is flat-out wrong. According to <a href="http://www.intel.com/design/Xeon/prodbref/#transfer" target="_blank">Intel</a>, the ATC is connected by a 256-bit-wide interface (which, by the way, happens to be exactly the same as 32 bytes wide: 32*8=256).



    Note that the on-die L2 cache on the G4 has a 256-bit-wide connection too (the Athlon's is only 64 bits wide), and is also 8-way set associative.



    Admittedly, right now it's only half as large as the Northwood's ATC, but we'll see what happens when the G4 moves to 130nm too.



    Bye,

    RazzFazz



    [ 02-13-2002: Message edited by: RazzFazz ]
  • Reply 74 of 93
    Points well taken. I suppose if the next chip is 64-bit, doesn't raise the cost, and provides at least some benefit, then fine. I am really only interested in performance, so if a knowledgeable person can set me straight on this, I'm grateful. Regardless of the accuracy of the Register article, I have a feeling what is coming next will be good. It's healthy to have optimism.
  • Reply 75 of 93
    [quote]Originally posted by CCR65:

    <strong>However, barring asking a software engineer who codes 3D apps to settle this (I don't know anybody offhand), I'm still convinced that if video and graphics apps were coded for a 64-bit G5, it would be a part (just a part, mind you) of what Apple could use to show better performance.</strong><hr></blockquote>



    Okay, you asked. The answer is that a 64-bit processor (in the G5 or Hammer sense of the term) only gives you native 64-bit integer math and a 64-bit address space. This is of little use to 3D or video production applications. There are some supercomputing applications that need these capabilities, and some server applications can benefit from huge address spaces... but in the consumer space it buys you nothing and costs you something.



    There are many pathways, register sizes, etc. in a processor, and most of them are independent of one another. The G4 has 32-bit, 64-bit, 128-bit, and 256-bit elements within it. The G5 and Hammer's "64-bit" nature refers specifically to the size of the machine's integer registers, which are also used to hold pointers (and that limits how big an application's address space can be). It doesn't say anything about whether a machine supports 64-bit floats (the G4 does) or 128-bit vectors (the G4 does).



    RazzFazz: the current G4 processors can address 36 bits of physical space, but a single application's address space can only be 32 bits' worth (due to pointer size). This means Apple could build machines that hold up to 64 GBytes (2^36), but no single process could use all of that directly.
  • Reply 76 of 93
    tjm Posts: 367 member
    From my perspective, IF this Reg article is right, then it is about 98% good news. The route to the 7500 looks reasonable and shows that Moto really does care about Apple and the desktop market. The end product is everything we were willing to call the "G5" except 64-bitness. I would be extremely happy if Apple were to sell a machine with one (or more) of these 7500 chips.



    At the moment, 64-bitness is something we can live without. The markets which really NEED it at the moment are quite small and ones Apple doesn't currently compete in anyway. So if Moto can get to the 7500 (which adds all the immediately useful stuff), then add 64-bitness afterward, it would seem reasonable.



    For a long time yet, 64-bit registers and such will not be terribly useful (from all that I have heard/read). However, I suspect that clever programmers will find all sorts of creative ways of using that space in ways we have not yet perceived. Eventually, 64-bit CPUs will be "de rigueur", but for at least the next few years it will be like AltiVec was at first: a really cool feature of which almost no one took advantage. I think jumping on Moto or Apple for not focusing on it is a bit premature.
  • Reply 77 of 93
    [quote]Originally posted by Leonis:

    <strong>G4 or G5 or G6 or whatever are just the name



    What I care the most is peformance. If the processor can blow P4 and Athlon I don't really care if it's called G1 </strong><hr></blockquote>



    I'd have to agree. I want my PC-using colleagues to have their doors blown off. Not just "gee, that's pretty fast," but rather a jaw-on-the-floor, Tex Avery bug-eyed "Holy sh*t!"



    tsukurite
  • Reply 78 of 93
    But what about this part?:



    "...but common sense and recent precedent suggest that these two processors will form the mainstay of Apple's iMac2 and low-end professional lines."



    Doesn't that leave room at the top for something different, a la G5?



    tsukurite
  • Reply 79 of 93
    [quote]Originally posted by tsukurite:

    <strong>But what about this part?:



    "...but common sense and recent precedent suggest that these two processors will form the mainstay of Apple's iMac2 and low-end professional lines."



    Doesn't that leave room at the top for something different, a la G5?



    tsukurite</strong><hr></blockquote>



    It would if it weren't all supposition. Apple makes no formal distinction between low- and high-end professional systems.
  • Reply 80 of 93
    [quote]Originally posted by Nonsuch:

    <strong>

    It would if it weren't all supposition. Apple makes no formal distinction between low- and high-end professional systems.</strong><hr></blockquote>





    Well, they don't make that distinction yet. The main thing is that we still don't have any proof (of anything). We can suppose that Moto and Apple are working on faster machines, but anything beyond that has no supporting evidence that can really be trusted. This is true of both the faster G4 and the G5.