The Reg on future G4 roadmap

135 Comments

  • Reply 41 of 93
    outsideroutsider Posts: 6,008member
Why is the concept of having development on 2, 3, 4 chips simultaneously so hard to grasp? Continued development on the G4 != stopping development on the G5.
  • Reply 42 of 93
I think 1.5 GHz + 266 MHz DDR at MWNY wouldn't be a bad machine. It'd still be too little, too late, though.



I find it very ironic that Apple's PowerMac G4 pages are so full of anti-Intel, and specifically anti-Pentium 4, nonsense, and yet Steve Jobs' very own Pixar chooses to replace their high-end SGI workstations with IBM Pentium 4 Xeon boxes.



Funny thing they wouldn't go with their very own PowerMacs that are [quote]"up to 72 percent faster than 2 GHz Pentium 4s."<hr></blockquote>



    Why wouldn't they use PowerMacs with 2 MB of high-speed DDR caches [quote]"unimpeded by bottlenecks caused by other data (unlike Pentium 4-based systems, which don’t have L3 cache — a disadvantage that leads to congestion between various data streams, and slowdowns in the overall rate of data transfer)" <hr></blockquote>?
  • Reply 43 of 93
    g-newsg-news Posts: 1,107member
    Maybe because Xeon chips have several megs of L2 cache though?



    G-News
  • Reply 44 of 93
    [quote]Originally posted by Arty50:

    <strong>Apple better do the same. Because that video market they covet so much will dry up damn quick if they don't. Apple's processor situation is slowly becoming a big joke.</strong><hr></blockquote>



    Mind telling me how exactly 64-bitness is important to video editing?



    Bye,

    RazzFazz



[ 02-12-2002: Message edited by: RazzFazz ]
  • Reply 45 of 93
    [quote]Originally posted by msp:

    <strong>

    I agree with Moogs on this one. The Register article was one of the most bizarre bits of rambling I've read in a long time, especially the bit about pipelines, branch mispredictions, and clock frequency.</strong><hr></blockquote>



    Agreed.



    (I apologize for crossposting from other thread, but I think this better fits here: )



    Wow, this is probably the most misinformed (or mis-written, dunno) article I have read in a long time.





    [quote]a crude summary is that a deep pipeline allows the processor to second guess the subsequent instructions, at the cost of clearing the pipeline when it gets the wrong answer.<hr></blockquote>



    So deep pipelines somehow magically do branch prediction? Um, don't think so...
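The point RazzFazz is making fits in one line of arithmetic: deep pipelines don't do the predicting, they just raise the price of each wrong prediction. A minimal sketch, with illustrative (assumed) pipeline depths and miss rate:

```python
# Branch prediction is a separate mechanism; pipeline depth sets the
# penalty when the predictor is wrong, because everything in flight
# behind the branch must be flushed. Numbers below are illustrative.

def cycles_lost_per_branch(pipeline_depth, mispredict_rate):
    """Average cycles wasted per branch, if a flush costs ~pipeline_depth cycles."""
    return pipeline_depth * mispredict_rate

# Same predictor accuracy, different pipeline depths:
short_pipe = cycles_lost_per_branch(7, 0.05)   # ~0.35 cycles per branch
long_pipe = cycles_lost_per_branch(20, 0.05)   # ~1.0 cycles per branch
```

The predictor is identical in both cases; only the flush cost differs, which is exactly the distinction the article collapses.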





    [quote]RapidIO is a switched fabric interconnect which mirrors the parallel Infiniband initiative: the former is endorsed by the embedded industry, the latter by big iron system vendors, so the two don't really overlap. <hr></blockquote>



Actually, they are right: the two don't really overlap. But that's not because they are endorsed by different manufacturers; it's because they target two completely different areas. RapidIO is for chip-to-chip connections, whereas InfiniBand is for machine-to-machine connections (see <a href="http://common.ziffdavisinternet.com/util_get_image/0/0,3363,s=1&i=8564,00.jpg" target="_blank">this diagram</a>, or better yet <a href="http://www.extremetech.com/article/0,3396,s=1005&a=21813,00.asp" target="_blank">this whole article</a>).





[quote]In practical terms it will permit the memory controller to be housed on the die, communicating at full clock speed. So potentially, there's no need for L3 on-die cache.<hr></blockquote>



    RapidIO allows for on-die memory-controllers?

    WTF?

This is just completely unrelated. RapidIO allows for (surprise, surprise) fast I/O access and chip-to-chip communication, but if your memory controller is already on-die, it won't need RapidIO anymore.



Also, how exactly is an on-die memory controller going to make L3 cache superfluous unless you also have 500 MHz memory modules, which aren't even on the horizon?
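Rough latency arithmetic behind that objection (all figures assumed, era-plausible rather than measured):

```python
# An on-die memory controller removes bus-crossing overhead, but DRAM
# itself is still far slower than an on-die L3 cache, so the cache
# keeps paying off. Figures are assumptions for illustration only.
cpu_mhz = 1000            # assumed 1 GHz core
l3_hit_cycles = 15        # assumed on-die L3 hit latency
dram_ns = 60              # assumed DRAM access time

dram_cycles = dram_ns * cpu_mhz / 1000   # 60 cycles: ~4x the L3 hit
```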



    Damn, where do those guys get their "inside knowledge" from?



    This sucks.



    Bye,

    RazzFazz



[ 02-12-2002: Message edited by: RazzFazz ]
  • Reply 46 of 93
    [quote]Funny thing they wouldn't go with their very own PowerMacs that are

    ------------------------------------------------------------------------

    "up to 72 percent fasterthan 2 Ghz Pentium 4s."

    ------------------------------------------------------------------------<hr></blockquote>

Their very own?? Apple != Pixar, no matter how much you wish-think it. Pixar == business, hence they buy the best hardware for the job. Do you think Apple runs their HR backend or website on Apple boxen? Get a clue: Apple doesn't make anything resembling a workstation box--and even worse, there is NO vendor support for RAID 5 cards or workstation-class OpenGL video cards.
  • Reply 47 of 93
    [quote] Mind telling me how exactly 64-bitness is important to video editing? <hr></blockquote>



Certainly. Total amount of addressable memory is a good start. For instance, Commotion and similar video compositing packages (Shake?) use RAM to store the video they are working on. Final Cut Pro, like any other video editing app, runs best with a lot of RAM (I use 512MB for FCP alone). Running certain things on a RAM disk when editing can be a huge timesaver and avoids dropped frames (yes, I know a RAID will help as well). Short video clip capture from an analog source works better if you capture to RAM and then stream to disk.



Secondly, video editing/FX/compositing is more CPU-intensive than almost anything else you might do with a workstation, besides some scientific research and high-quality 3D rendering (Toy Story etc.). There is a reason that Pixar and DreamWorks have a room full of 64-bit Sun machines all networked together in a cluster to render 3D and other visual effects. The wider the path, the faster it gets done in this particular case. This isn't true for everything you could do with a 64-bit computer, but in this case it is.



A robust chip such as the G5 (or whatever you want to call it) can handle higher-bandwidth connections to the rest of the computer (FireWire, PCI, memory and so on). These things are as important as processor speed. When you are creative on paper it's instant; when you are creative on a computer you have to wait. Sometimes that's not a big deal, but other times it makes for an uncomfortable disruption at the very least. This is why pro users want more and more. We have gotten very close to the realtime experience in some areas, but not close enough. I'm sure gamers have similar concerns.



Look, if we're going to call them pro desktops then they should be PRO desktops. There is nothing wrong with pushing Apple a little to adopt these technologies. I said it once and I'll say it again: Apple has repeatedly made what used to cost top dollar to accomplish available to the consumer and the not-so-top-dollar professional user. Bringing out the first affordable 64-bit desktop that can do what a few years ago took a large room and multiple computers is not a bad goal. If they come close to this I'll be happy. I know it can be done.



By the way, I know there is some dispute as to what counts as a 64-bit chip. I think, however, that most technically oriented people know what I'm talking about.
  • Reply 48 of 93
    powerdocpowerdoc Posts: 8,123member
    [quote]Originally posted by rickag:

    <strong>



I hope you're right. Where did you get 60-70%?</strong><hr></blockquote>

Just a calculation: 50% more speed due to the increase in MHz, and the 512K L2 cache plus DDR memory equal a 10 to 15% performance increase, ranging from 1.65 (1.5 + 0.15) to 1.725 (1.5 + 0.225). That means a 65% to 72.5% speed increase between a 1 GHz 7455 G4 and a 1.5 GHz 7470.

The result could be even better: a 10 to 15% increase from the improved memory, cache and DDR is not overestimated, I think.
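Powerdoc's estimate, redone as explicit arithmetic (the linear clock scaling and the 10 to 15% memory bonus are his assumptions, not measured figures):

```python
# A 1.5 GHz 7470 vs. a 1.0 GHz 7455, assuming performance scales
# linearly with clock, and DDR plus the larger L2 adds 10-15% on top.
clock_scaling = 1.5

low = clock_scaling * 1.10    # 1.65  -> 65% faster overall
high = clock_scaling * 1.15   # 1.725 -> 72.5% faster overall
```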
  • Reply 49 of 93
    matsumatsu Posts: 6,558member
    [quote]Originally posted by Eugene:

<strong>The last redesign of the <a href="http://e-www.motorola.com/" target="_blank">http://e-www.motorola.com/</a> site changed the structure of the 'PowerPC ISA' index to list the family of processors called the MPC7XXX. That's what led me to believe in the existence of the MPC7500, but now that The Register has jumped on the bandwagon, I'm skeptical... <img src="graemlins/bugeye.gif" border="0" alt="[Skeptical]" />



    The Register is NEVER right.</strong><hr></blockquote>



More than a few times in the last few months I've read supposed industry articles from The Register and c|net that just look like cleaned-up versions of the stuff that gets posted in here. Only the threads in here started before the articles were posted. This phenomenon used to be confined to MOSR's delusional ramblings, but it seems to have spread more widely now. The Register probably got it from a rumor forum. Oh well...



  • Reply 50 of 93
    serranoserrano Posts: 1,806member
    [quote]Originally posted by G-News:

    <strong>Yeah I agree on that. (It's funny in fact I dreamt of a 1.5GHz G4 with DDR about 2 days ago at MWNY, somehow found 1.5GHz a strange rate though)</strong><hr></blockquote>



    ew, you're creepy, dreaming about chips and mwny?
  • Reply 51 of 93
    [quote]Originally posted by cowerd:

    <strong>

    Their very own?? Apple != Pixar, no matter how much you wish-think it. Pixar == business, hence they buy the best Hardware for the job. Do you think Apple runs their HR backend or website on Apple boxen? Get a clue, Apple doesn't make anything resembling a workstation box--and even worse there is NO vendor support for RAID 5 cards or OpenGL workstation class video cards.</strong><hr></blockquote>



    I know Apple is not Pixar, you're the one who needs a clue!



    The fact is that Steve Jobs runs both companies. And one of these companies constantly badmouths Pentium 4s while the other one is now depending on them to make a profit.



Let me clue you in on one more thing: when a computer comes with dual processors, Gigabit Ethernet, built-to-order dual-channel U160 SCSI options, and a price tag upwards of $3000, it's competing with other workstations, not consumer products.



    It's Apple that calls PowerMacs "professional hardware", not me. What kind of professionals did you think they were talking about there? IT professionals? Professional accountants? Do you think it could be that they mean the same kind of professionals that IBM also makes their workstations for, like digital artists? Get a clue!



As for professional OpenGL hardware, it's none other than Apple's fault that they don't have any. Nvidia's Quadros are simply GeForce chips with different drivers. Apple already builds Nvidia cards; they could just as well offer Quadro cards as a built-to-order option. Same thing with ATI's latest FireGL cards, which are based on the R200 core.
  • Reply 52 of 93
    [quote]Originally posted by G-News:

    <strong>Maybe because Xeon chips have several megs of L2 cache though?



    G-News</strong><hr></blockquote>



No, current Xeons are exactly the same as the Pentium 4; the only difference is that they support dual-processor configurations.
  • Reply 53 of 93
Pixar's business does not depend on Pentium 4s. They use Sun SPARCs and Macs. Oh, I'm sure there are Windows boxes sitting around there doing something (like crashing), but do they depend on them for their livelihood? Not a chance.



    [quote] Do you think it could be that they mean the same kind of professionals that IBM also makes their workstations for, like digital artists? <hr></blockquote>



???? I've been doing video and audio editing, graphics, some web design and software development for over a decade. I never heard of any "digital artist" wanting to use an RS/6000. Photoshop works on Mac/Windows. SGI is used widely as well (although not widely enough to keep them afloat).



I'm not terribly unhappy with Apple hardware. I think that the desktop machines are indeed "pro" level, but I also don't see anything wrong with wanting more. Though I am forced like a slave to work on a peecee at work, and they don't like my PowerBook on the network. I have done a lot of work on Apple desktops and I find them more capable, even out of the box, than a custom-built Wintel machine. The volume of quality multimedia work (encompassing all print and electronic creative media) done on Macs is undeniable.



    This business about price should be something everyone on this board knows better than to believe. Anyone who works with both platforms on a regular basis knows the truth about total cost of ownership regarding Mac vs. PC.
  • Reply 54 of 93
    cowerdcowerd Posts: 579member
[quote]Pixar's business does not depend on Pentium 4s. They use Sun SPARCs and Macs. Oh, I'm sure there are Windows boxes sitting around there doing something (like crashing), but do they depend on them for their livelihood? Not a chance.<hr></blockquote>

Well, they just bought a boatload of them, so they must be using them for something. And I don't think they bought them to run Windows--try Linux--though it's going to be a lot harder to make Linux crashing jokes.



And the G4 already has 36-bit memory addressing--though it's kind of hard to max out with only three DIMM slots.
  • Reply 55 of 93
If you saw a story about Pixar purchasing Intel boxes, then post the link. I'm not saying you're lying, and if they are running Linux that at least makes some sense, although I doubt they would use them for their render farm. RenderMan does work on Linux, but Pixar's studio is going to use the fastest boxes they can for modeling and rendering. The Linux version of RenderMan is sold as a product. Those machines could easily be development machines or general network computers.



    They use Macs as well so I don't know what any of this proves.
  • Reply 56 of 93
    The only link I found was;



    <a href="http://www.salon.com/tech/feature/2001/11/01/linux_hollywood/print.html"; target="_blank">http://www.salon.com/tech/feature/2001/11/01/linux_hollywood/pr int.html</a>



I found a little blurb elsewhere that Industrial Light and Magic switched to Linux on IA-64 for rendering, ditching SGI. The version of Linux they used had to be hacked to death to do what they wanted it to do, and I'm sure each machine has a lot of processors.



This is irrelevant to the thread, however. My only point was that a 64-bit G5 in an Apple desktop would be a major breakthrough (especially an MP G5), and that though a lot of good work has been done with the current pro hardware, the G5 or something with its rumored specs needs to be introduced soon in order to bolster the pro market for Apple. Which is important to them, I'm sure.



Think Secret seems to think that the purchase of Nothing Real shows a move in this area, judging by some conversations with people at Discreet, who make a competing product. The software and the hardware both have to be there, however.
  • Reply 57 of 93
    moogsmoogs Posts: 4,296member
    [quote]Originally posted by Arty50:

    <strong>Unfortunately, this roadmap sounds reasonable. </strong><hr></blockquote>





    If such a road map EXISTED somewhere other than in the imagination of the Register, I'd agree with you.



    Guys...they MADE IT UP. Where's the roadmap they're talking about? Anyone find it yet? Just because some shmo emails the register and says



    "psst. I work for Motorola...listen, I have some news for your Mac readers...."



    doesn't mean said email is worth a damn. Maybe there will be a 7470 that scales up another 400MHz or so and is integrated with an improved MPX bus or something similar...but there's no more reason to believe that than to believe me when I tell you I think the next chip will be the 7465, because the latest one was the 7455.



    They're GUESSING. Not trying to be a smart ass, I just don't get why people put any stock into this report. Maybe it's because they do a better job of looking like "official news" than some POS site like MOSR. But you all must surely know that all the icing in the world won't make a shitty cake taste better....



    Right?



    <img src="graemlins/surprised.gif" border="0" alt="[Surprised]" />
  • Reply 58 of 93
    [quote]Originally posted by CCR65:

    <strong>

    Certainly. Total amount of addressable memory is a good start. For instance, Commotion and similar video compositing packages (Shake?) use RAM memory to store video in that it is working on. Final Cut Pro as well as any other video editing apps run best with a lot of RAM (I use 512MB for FCP alone). Running certain things on a RAM disk when editing can be a huge timesaver and avoids dropped frames (yes I know a RAID will help as well).</strong><hr></blockquote>



    So do you really think Apple is going to intro more than eight RAM slots on their new machines, so you can have more than 4GB of physical RAM?



    If so, what's stopping them from doing it right now? The current G4s can address 64GB of physical RAM already, and 2^52 bytes of virtual memory.
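The address-space figures above, worked out (36-bit physical and 52-bit virtual are the limits the post cites for the current G4):

```python
# 2^36 bytes of physical address space and 2^52 bytes of virtual.
GB = 2 ** 30
TB = 2 ** 40

physical_gb = 2 ** 36 // GB   # 64 GB of physical RAM
virtual_tb = 2 ** 52 // TB    # 4096 TB of virtual address space
```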





    [quote]<strong>Secondly, video editing/FX/Compositing is CPU intensive beyond any other operation you might do with a workstation besides some scientific research and high quality 3D rendering (Toy Story etc.). There is a reason that Pixar and Dreamworks have a room full of 64Bit Sun machines all networked together in a cluster to render 3D and other visual effects.

    </strong><hr></blockquote>



    Well, unless you actually have 64bit integer data to process, 64-bitness per se brings no speed gains at all (aside from the addressable memory issue - see above).





    [quote]<strong>

    The wider the path the faster it gets done in this particular case.

    </strong><hr></blockquote>



    The memory interface already is 64 bits wide on current G4s right now.





    [quote]<strong>A robust chip such as the G5 (or whatever you want to call it), can handle better bandwidth connections with the rest of the computer (firewire,PCI,memory and so on). As much as processor speed these things are important.

    </strong><hr></blockquote>



    Sure, but this isn't related to a chip being 64 bit at all. RapidIO, DDR-RAM, (insert favourite buzzword here), ... can all be had with a 32bit chip too.





    [quote]<strong>I said it once and I'll say it again. Apple has repeatedly made what used to cost top dollar to accomplish available to the consumer and the not so top dollar professional user. Bringing out the first affordable 64bit desktop that can do what a few years ago took a large room and multiple computers to do is not a bad goal.</strong><hr></blockquote>



But the point is, it's only an incredibly small part of the market that actually benefits from going to 64-bit chips. And what's even worse, those who need it have been using it for a long time now, only on other platforms: what reason would they have to switch to Macs? Worse still, those who don't need 64-bitness actually suffer from it, since all their integer data suddenly is twice as large.



    And bragging rights alone ("my Mac has more bits than your PC!") surely wouldn't justify the costs and tradeoffs.
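The size penalty described above can be illustrated with fixed-width types; a sketch (the doubling applies to whichever values actually widen to 64 bits, such as longs and pointers under an LP64 model):

```python
import struct

# "=I" is a standard 32-bit unsigned int, "=Q" a 64-bit one.
size32 = struct.calcsize("=I")   # 4 bytes
size64 = struct.calcsize("=Q")   # 8 bytes

# A million-entry array of such values doubles its footprint:
mib32 = 1_000_000 * size32 / 2 ** 20   # ~3.8 MiB
mib64 = 1_000_000 * size64 / 2 ** 20   # ~7.6 MiB
```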



    Bye,

    RazzFazz



[ 02-12-2002: Message edited by: RazzFazz ]
  • Reply 59 of 93
    arty50arty50 Posts: 201member
It sounds reasonable because the endless buildup of hope caused by Future Hardware has broken me, to the point where a pessimistic view of Moto's roadmap has become a sane reality.



    Yes, that sentence rambled on. It's a stream of consciousness/madness caused by this damn board. Ahhhhhhahahahhahaha.







  • Reply 60 of 93
    [quote]Originally posted by timortis:

    <strong>As for professional OpenGL hardware, it's none other than Apple's fault that they don't have any. Nvidia's Quadros are simply Geforce chips with different drivers. Apple already builds Nvidia cards, they could just as well offer Quadro cards as a built-to-order option. Same thing with ATI's latest FireGL cards, they are based on the R200 core.</strong><hr></blockquote>



If you use a Quadro with GeForce drivers (in case this works at all, dunno), you could just as well use a GeForce in the first place. The drivers do make a difference, and they have to be adapted to the pro versions. Also, it's not like Apple would be responsible for writing them, and I doubt anybody actually forces ATi and NVidia not to release Mac drivers for their pro hardware.



    Bye,

    RazzFazz



[ 02-12-2002: Message edited by: RazzFazz ]