A look at the July Power Macs now that we know the Xserve specs


Comments

  • Reply 181 of 238
xype Posts: 672 member
    [quote]Originally posted by timortis:

I don't know if you're saying this from experience, but Maya IS the 3D app.[/quote]



It's (or was, until recently when they dropped the price) the most expensive, but it surely doesn't get you as much bang for your buck as Lightwave, for example, does.



    [quote]Originally posted by timortis:

It is by far the most respected 3D application. Almost all 3D games are done on Maya and 3D Max. But Max isn't used much in feature film and FX production, Maya is. In fact it is the one that's most commonly used.[/quote]



Actually most game companies use 3DS Max, not Maya, and big studios use a lot of _different_ packages. Maya is just expandable, and so you have the Basic and Complete versions, whereas studios have to write most of the stuff they need themselves.



Of all 3D packages I tried (from Blender, to Cinema, to 3DS Max, to Lightwave, to Maya) Lightwave is the best option for most tasks because it has a lot of nice features, a good renderer, comes at an affordable price, and doesn't need a $10,000 computer to run (and run nicely). You'll see Lightwave used just as much.



The huge price tag it had didn't make it the best option, though, and it certainly didn't help a|w much (given the rumored bad economic state they are in right now). Sure, it sounds great to "have downloaded Maya just recently, woah it's great", but it doesn't change the fact that Maya is the only one of the "known" packages with an IRIX port, which forces many SGI-equipped shops to use it. Unless they write their own tools.
  • Reply 182 of 238
telomar Posts: 1,804 member
Oh my god, you wrote a lot in all those posts. Only one thing I really care to pick up on.



    [quote]Originally posted by Lemon Bon Bon:

Yeah. I guess you're right. After all, 'power'Mac sales have dropped from over half a million a quarter to? Wintel are making inroads into Apple's stronghold markets such as Education and Print![/quote]



I have done a great deal of work in an industry that has historically been dominated by Wintel machines (or Unix workstations, or a combination). OS X has been receiving a great deal of interest, though. It gives the best of Unix while maintaining a consumer-friendly face. Especially in companies that have maintained both Unix and Windows machines, OS X gives the option of integrating the two.



    Most of the computers are only PIIIs so speed isn't the issue for these companies. If they want speed they usually won't buy an "off-the-counter" computer.



    The major issue is most people still think Macs can't be used for regular computing or are difficult to integrate.



    You have no idea how much joy it gives me when people ask, "Why do you use a Mac?" to reply "Because I can do my job better, faster and more easily and have been doing so for quite some time." It's great to prove to people it is a decent system.



Even when I used to do process design and modelling I preferred the Mac, and that does take computing power.



All of that said, from everything I have read Apple has been maintaining or increasing unit sales while a lot of PC manufacturers are seeing shrinking numbers. Unit sales don't directly equal profits, but market share means more software development and better community knowledge, which equals inroads into PC markets.
  • Reply 183 of 238
davelee Posts: 245 member
    Hey Lemon Bon Bon,



    I did that Ray Dream thing (just Ray Dream 3D at the time) on a IIvx!!

    Wa-hey, those were the days (I was only a hobbyist though). It took me a whole summer (literally) to model and render a scene.



I now have C4D and it does kick *** (at least compared to my previous experiences), though it is still too slow in OS X (and I don't know what the OpenGL enhancement does... anyone?).



    I hope Apple has a computer that I can put some confidence in for purchasing this summer at NY.
  • Reply 184 of 238
rickag Posts: 1,626 member
    It is known that Apple computers have a weaker floating point unit, right??



    Could this help?




Architosh Staff

    22 May 2002

    Â*"Apple discusses Oct-level precision in AltiVec G4 Processor"



    <a href="http://www.architosh.com/news/2002-05/2002c-0522-oct-prec.phtml"; target="_blank">web page</a>



PDF can be found here: http://developer.apple.com/hardware/ve/pdf/oct3a.pdf



Would this require an inordinate amount of extra programming for software developers to utilize?? Would this only be used in a very narrow class of calculations??



    A 4X increase in speed seems fairly significant.
  • Reply 185 of 238
programmer Posts: 3,458 member
    [quote]Originally posted by rickag:

It is known that Apple computers have a weaker floating point unit, right??



    Could this help?




Architosh Staff

    22 May 2002

    Â*"Apple discusses Oct-level precision in AltiVec G4 Processor"



    <a href="http://www.architosh.com/news/2002-05/2002c-0522-oct-prec.phtml"; target="_blank">web page</a>



PDF can be found here: http://developer.apple.com/hardware/ve/pdf/oct3a.pdf



Would this require an inordinate amount of extra programming for software developers to utilize?? Would this only be used in a very narrow class of calculations??



A 4X increase in speed seems fairly significant.[/quote]





    Not really -- the complaint is usually that normal C/C++ floating point math code runs slower than on other platforms. This means single and double precision floats. The technique in that paper is for getting even higher precision using the AltiVec unit in some special ways that you have to code specifically. It doesn't speed up the standard float types and it doesn't work with ported code unless you rewrite that code.
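    If you're curious what that family of tricks looks like, the paper's basic idea is "extended precision as an unevaluated sum of machine numbers". Here's a rough scalar sketch of the standard building block (the textbook Knuth two-sum, NOT Apple's actual AltiVec code -- their paper vectorises steps like this and chains several of them):

    [code]
    #include <stdio.h>

    /* "float-float": one high-precision value stored as hi + lo.
       Assumes ordinary IEEE float arithmetic (no fast-math tricks). */
    typedef struct { float hi, lo; } ff;

    /* Exact sum: hi + lo == a + b. hi is the rounded float sum,
       lo recovers the rounding error algebraically (Knuth's two-sum). */
    static ff two_sum(float a, float b)
    {
        ff r;
        r.hi = a + b;
        float v = r.hi - a;
        r.lo = (a - (r.hi - v)) + (b - v);
        return r;
    }

    int main(void)
    {
        /* 1e-9 would vanish entirely in a plain float add to 1.0 */
        ff r = two_sum(1.0f, 1e-9f);
        printf("hi = %.9g  lo = %.9g\n", r.hi, r.lo);
        return 0;
    }
    [/code]

    Note that none of this comes for free -- each high-precision add costs several ordinary ones, which is why you'd only bother for the narrow class of calculations rickag asked about.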



    And the G4's floating point unit is not slow -- the problem is that there is only one of them, and the processor's clock rate is lower than the competition's. In use this amounts to the same thing, but it means that the speed disadvantage is not insurmountable and there is not some design flaw that limits what Motorola can build. They just have to decide to put more of them into the next processor.
  • Reply 186 of 238
lemon bon bon Posts: 2,383 member
    "I just realized than any time Lemon Bon Bon saves by using his PC versus his Mac he wastes posting incredibly long messages here "



Yeah. I guess. [Embarrassed]



    Still, I guess I must be compensating for a small...post count



    Lemon Bon Bon
  • Reply 187 of 238
lemon bon bon Posts: 2,383 member
    "I don't know if you're saying this from experience, but Maya IS the 3D app."



Said xype:



    "It's (or was, until recently when they dropped the price) the most expensive but it surely doesn't get you as much bang for your bucks as Lightwave, for example, does."



It doesn't offer anywhere near as much bang for buck as Lightwave...or as many plug-ins or as much support on the web.



    Originally posted by timortis:



    "It is by far the most respected 3D application."



    Well, there are 'names'. Aren't there? Sure, ILM used Maya for the Pod Race in Star Wars. But they mainly write their own tools...



Maya is the 'glamour' name. But I'd rather use Lightwave. A superb all-rounder and as feature-complete as you'd care to get for its price. Great 'hair' renderer with version 7-7.5. You'll pay big bucks for that 'add-on' with Maya with no noticeable improvement in 'fur' quality.



    Just like Xsi. But I found Xsi to be inferior in many aspects to both Maya and Lightwave in my own experience. Lightwave seems to work better with less resources than Maya or Xsi...



The renderer on Maya got a whole lot better with version 4. But I sez Lightwave owns it. And I guess Xsi has 'Mental Ray', the 'nice' but slow renderer... (well, Xsi has to have something to say for itself and its ridiculous price...)



    "Almost all 3D games are done on Maya and 3D Max. But Max isn't used much in feature film and FX production, Maya is. In fact it is the one that's most commonly used."



    Agreed. Spot on.



    "Actually most game companies use 3DS Max, not Maya, and big studios use a lot of _different_ packages. Maya is just expandable and so you have the basic and complete versions, whereas studios have to write most of stuff they need themselves."



Yep. Though Maya is 'up and coming'. I don't see it surpassing Max any time soon...



    "Of all 3D packages I tried (from Blender, to Cinema, to 3DS Max, to Lightwave to Maya) Lightwave is the best option for most tasks because it has a lot of nice features, a good renderer and comes at a affordable price and doesn't need a 10.000 $ computer to run (and run nicely). You'll see Lightwave used as much."



It's a great contender for its price and all-round feature set. Whereas you'll have to pawn your granny, your soul and an overpriced 'power'Mac to own either Max, Xsi or a 'complete' version of Maya. Newtek, on the other hand, have had some great offers on Lightwave over the last year. I bought version 6.5 for £995. Bargain. Then I've had several free upgrades since then, including the excellent 'Duo' license version! I'm now on 7.5 and haven't paid a bean since version 6.5!



    Beat that Discreet and Alias!



    "Just the huge price tag it had didn't make it the best option and certainly didn't help a|w much (given the rumored bad economic state they are in right now). Sure, it sounds great to "have downloaded Maya just recently, woah it's great", but it doesn't change the fact that Maya of all "known" packages is the only with a IRIX port, which forces many SGI-equipped shops to use it. Unless they write their own tools."



Lotta snobbery in the 3D world. There is some substance to Xsi's and Maya's and Max's toolsets. I looked at them...but couldn't see what they offered the 'artist' that Lightwave couldn't. If you're a real boffin/science-artist-cum-scripter (I hate you if you have that god-like capacity) then Maya is probably for you, so you can spend ages programming new shaders...



But at the price? Take Lightwave. And though I haven't used it much, the improving 'pretender' is Cinema 4D. Also, some excellent upgrade deals for Electric Image for those who bought the DV Garage 3D toolkit!



    Lemon Bon Bon
  • Reply 188 of 238
lemon bon bon Posts: 2,383 member
    "And the G4's floating point unit is not slow -- the problem is that there is only one of them,"



    Yup.



    "and the processor's clock rate is lower than the competition's. In use this amounts to the same thing, but it means that the speed disadvantage is not insurmountable"



    ...deliver us Saint Jobs from this cup of G4 suffering...bringeth on the most G5-ish of processors. Even though x86 land is doing a damn fine job of 'lapping' Moto, I still don't think it's insurmountable either.



    "and there is not some design flaw that limits what Motorola can build. They just have to decide to put more of them into the next processor."



Improvements and more FPUs, not design flaws, I hope you mean!







    Lemon Bon Bon
  • Reply 189 of 238
lemon bon bon Posts: 2,383 member
    "Hey Lemon Bon Bon,

    I did that Ray Dream thing (just Ray Dream 3D at the time) on a IIvx!!"



Good man! A fellow 'Ray Dreamer'. Cool.



You're a braver man than me, Davelee!



    "Wa-hey, those were the days (I was only a hobbyist though). "



Given how long it took to update, so was I!!!



    "It took me a whole summer (literally) to model and render a scene."



    Bring out a gold medal for this guy. You have my undying admiration. You've got more patience than I have...



    "I now have C4D and it does kick *** (at least from my previous experiences), it is still too slow in OSX though (and I don't know what the OpenGL enhancement does... anyone?)."



OpenGL enhancement? It's software. A common 3D standard. If you have a graphics card that supports it? Makes the 3D objects update faster on your display! (You know, when you go 'wahay' with a hundred-thousand-polygon model with 32-bit textures on...) If you have a big 3D standard like OpenGL, you can keep adding support for more 'features', e.g. fog and lens flares, in real time in your 3D viewport without having to render them to see what's going on...



    ...at least that's the theory...you may encounter severe inertia with any Ati Alien Face Sucker 8 or 16 meg card...
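    To give a flavour of what "the card does it in real time" means: in old fixed-function OpenGL, fog is literally a few state calls, and the hardware then shades every polygon with it as you tumble the view. A minimal sketch (values are made up; assumes GLUT for the window):

    [code]
    /* On OS X the header lives in <GLUT/glut.h>; on most other
       platforms it's <GL/glut.h>. */
    #include <GLUT/glut.h>

    static void display(void)
    {
        GLfloat fog_colour[4] = { 0.5f, 0.5f, 0.5f, 1.0f };

        glEnable(GL_FOG);                  /* switch fog on */
        glFogi(GL_FOG_MODE, GL_LINEAR);    /* fade linearly with depth */
        glFogfv(GL_FOG_COLOR, fog_colour); /* grey haze */
        glFogf(GL_FOG_START, 0.0f);        /* fog begins at the eye */
        glFogf(GL_FOG_END, 1.0f);          /* fully fogged one unit out */

        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);             /* era-appropriate immediate mode */
        glVertex3f(-0.5f, -0.5f, -0.1f);
        glVertex3f( 0.5f, -0.5f, -0.1f);
        glVertex3f( 0.0f,  0.5f, -0.9f);   /* the deep vertex picks up more fog */
        glEnd();
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutCreateWindow("fog sketch");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }
    [/code]

    No re-render needed: the card applies the fog as it draws, every frame.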



As for why 'X' and 'Cinema 4D' are slow? What are your specs? CPU? Card? RAM?



    "I hope Apple has a computer that I can put some confidence in for purchasing this summer at NY."



    Me too...,



    ...me too.



    Lemon Bon Bon.



[ 05-23-2002: Message edited by: Lemon Bon Bon ]
  • Reply 190 of 238
lemon bon bon Posts: 2,383 member
    "You have no idea how much joy it gives me when people ask, "Why do you use a Mac?" to reply "Because I can do my job better, faster and more easily and have been doing so for quite some time." It's great to prove to people it is a decent system."



    Nice one, fellow evangelist.



    "Even when I used do process design and modelling I prefered the Mac and that does take computing power."



A preference it is. And a preference it would be, for me...if Apple weren't far behind in CPU and mobo. It's as plain as vanilla for me. Others may disagree. Still, I'm not without some hope that as we begin to leave the G4 behind this summer, the next year will offer some hope.



In English? I'll probably crack by January and buy the 7500 if it's the 'G5' we've been waiting for...we'll see. It's getting harder to wait for 'it' despite my 'axe to grind'.



    "All of that said from everything I have read Apple has been maintaining or increasing unit sales while a lot of PC manufacturers are seeing shrinking numbers."



Well, only very lately. Apple are holding firm in the PC market, and that is chiefly down to the NEW iMac, shrewdly continuing the 'old' iMac, and good laptops. If it was down to towers alone, they'd be in serious trouble.



If they can get the G5 (whatever the next CPU is...) out of the door, keep the case-style 'gimmicks' coming, keep the 'free' iApps and the broadening of the brand (e.g. iPod-style digital devices), but fundamentally keep the OS 'X' factor's current momentum AND the increasingly visible presence of their Apple-branded stores (not only in the US...but over here in Europe!!!)



    ...then...they've got a shot at some growth.



Perhaps their most historically inspired move is putting the Mac on a Unix-flavour OS with deep Java support. This potentially gives them an iron-clad stronghold...and a base from which to begin a stealth assault on certain server markets...



    But if they sit on their specs like they have been doing they can jeopardize some of the quite superb work they've done with software over the last few years. I think most Mac users would settle for 'near' CPU/Mobo parity. But I'm not paying the premium for Moto's lazy ways and for Apple's poor management in not seeing this coming sooner.



In my reply to Steve's post, I'll obviously come over as very vexed and frustrated (time of the week, fellow AppleInsiders...). However, it's perhaps because, given that Apple has so much of its strategy 'nailed' down...and is now getting more developers than ever to see the Mac in a new light (not down to hardware...but the 'X' factor...), I guess I feel annoyed that the CPU and mobo should still be an issue several years on from the G4-stuck-at-500MHz debacle.



Get a good CPU in there. The G4 isn't it. It was fine for last year, maybe. The year before that? Great. But now? Dated. Outpowered by the competition. And if it is REALLY that cheap compared to Intel's overpriced chips, then why can't Apple use some of their 'fat' 30% margin to give us dual processors across the 'power'Mac line?



    "Unit sales don't directly equal profits but marketshare means more software development and a better community knowledge, which equal inroads into PC markets."



Yes. When Apple failed to clone at the point of almost reaching critical mass all those years ago, they blew it. Now? I'm not sure I'd have it any other way (bar the CPU issue...)



    I think even Apple are beginning to realise they're gonna have to dip below 30% margins to increase 'mind/unit' share. Especially if those retail outlets are going to help fuel 'growth.'



    Lemon Bon Bon
  • Reply 191 of 238
rickag Posts: 1,626 member
    [quote]Programmer

    "It doesn't speed up the standard float types and it doesn't work with ported code unless you rewrite that code."<hr></blockquote>



    drat
  • Reply 192 of 238
bodhi Posts: 1,424 member
    I do not write code or software so if this sounds like it makes no sense that is my excuse!



Didn't I hear that one of the big pieces of news to come out of WWDC was that coding or writing in C++ has now become much easier and more stable due to something that is integrated in Jaguar? Could this, along with better G4s, make the next Power Macs even faster?



[ 05-23-2002: Message edited by: Bodhi ]
  • Reply 193 of 238
steves Posts: 108 member
    Originally posted by Lemon Bon Bon:



    "I didn't say I was 'back' either. "



    You did use the word temporary in describing your switch, correct? What exactly are you implying? I don't think I was out of line by inferring you were planning on coming back based on the choice of words you used.



"For many years. 'power'Macs. Joke. Simple words. My view. Your point on the software. Well...okay, we had Lightwave. (A very wobbly version until relatively recently...go Rage ATi...)"



If we're talking about a platform, it's the combination of hardware and software which defines the overall rating. I think we can agree on that much. Anyway, I never said Macs were best in class for 3D. I will reiterate that I don't think they are the joke you seem to think they are. Macs do lack truly high-end video cards for 3D. Though the pace of performance increase in low-end gaming cards (take the GeForce 4 Ti, for example) means they are rapidly approaching the capabilities of the high-end cards. You have to agree the difference between the two has narrowed greatly.



    "SO that makes it alright to charge almost £3,000 for a Mac workstation that gets hammered by the equivalent PC workstation? For techology I, ahem, lets get this right now, 'perceive' to be out of date, ridiculously over priced as opposed to premiumly priced...?"



    I'm not going to argue that spec for spec Macs are a great deal hardware wise across the entire range. That's a losing battle. However, many people look beyond the raw specs and look at the entire package, and then determine if the value is worth it or not. Give me a dual Athlon running OS X and my favorite software, and I'll buy it. Until then, I choose to remain dual platform.



    "(...and the initial point of my argument is my 'low end' PC, a 1.6 gig Xp slaps the 'power'Mac dual silly. "



My P3 1GHz is low end. Your 1.6GHz Athlon is not low end. Further, it doesn't smack the PowerMac dual silly. Yes, there will be some benchmarks whereby a single-threaded task is faster on the Athlon. However, I maintain that overall system performance is better on the dual G4. Further, there are more than a few benchmarks where the dual G4 will smack the single Athlon silly.



    "Compared to a pc half the price that's comparitively crippled? Look again. If you think a few ports here and there makes up for the whopping extra Apple charges..."



A few ports, plus some very significant bundled software that lets a Mac be very productive right out of the box. Yes, this type of thing does add up. My point is simply that this value consideration isn't quite as black and white as you make it out to be.



    "RDF. Seriously? "



This is a common response for you when someone doesn't side with your opinion, huh? If I were in an RDF, why would I own a PC?



    "Well, PC schools can get a cheap, high performing tower that stuffs both the old imac and the 'e' mac."



    There are an infinite number of things a school can do. What's your point?



    "However, note the dip in sales with both the ibook and powerbook as Apple took ages to update them and when they did, it wasn't enough. Sales for both have dropped. Sales for them look okay combined. But the ibooks have stopped 'flying' off the shelves so quickly. This is when Apple's 'mean' spec list becomes exposed."



BS! You are completely ignoring the fact that every new product has a typical life cycle. When a new product is announced, sales are brisk. Eventually, over time as things don't change much, sales drop off. You note the tail end of a product cycle and try to conclude Apple's business is going down the tubes. Further, during the early part of a product's life cycle, you attribute the "anticipated" success to a "gimmick". Based on these observations, I have to question your sense of history and business acumen.



    "(...and don't tell me you haven't noticed Apple's generous supplements of ram over the years?)"



Gee, you don't think that could have anything to do with OS X's 128MB requirement, do you? Further, they've simply kept pace with PCs' amounts of memory in the process. When was the last time you saw a PC advertised with only 32MB of memory? I have to wonder where you get these arguments from. Perhaps there's an MSRDF in effect that I'm not aware of.



    "Then how do you know? I've read many news sites, data etc that suggests this 'erosion'. I guess I'll have to make notes from now on and post an avalanche of hyperlinks to keep me mate here happy. (Still, it's more fun to stick to our perceived arguments, right?)"



    I specifically said that I don't know and I likewise asked you for evidence. I clearly have not seen this trend. That doesn't mean it isn't true. This is a friendly discussion, I'm just asking you to enlighten me in this area.





    "That's why Apple's worldwide marketshare is 3-5%?



Where's your hard data to show this erosion ain't happening."




Considering Apple's market share has been fairly consistent over the past several years, that should be proof in and of itself.



    "So, I'm not 'more' intelligent (like you...of course...) because I bought a machine for a, let's see, a sixth of the price of the dual mac that clobbers it in any of the tests you fail to provide benchmarks for?"



    I didn't classify you one way or another, you did. I did suggest that if you purchase something just based on a spec sheet and not realize the value of the total package, that wouldn't be real bright. That's not to say that everyone who buys a PC isn't intelligent. I own a PC because I have a specific work related need to own one. I don't flip flop from platform to platform based on price and spec sheets though. I also don't disregard other investments in software, etc. that you're not including in your price. Also, for your statement to be true, that would mean you paid $500 for your PC. Either your math is very good, or you got one hell of a deal!



    &gt;"will judge Macs by how well they perform their job, how much support and aggrevation they require to maintain as compared to PCs, and possibly determine (as you mention) if they are still competitive in performance for what they do."



    Yadda, yadda."




    LOL! Yeah, none of the above is important. I love the way you dismiss that so easily. I'm glad that sort of thing isn't important to you. That doesn't really help your case for switching platforms based on a rational decision.



    "So all the people who decide they can get a dual Athlon XP with twice the mhz and performance for less money are 'ignorant'?"



    Not at all. It would depend entirely on why and how they made their decision. It would depend upon taking many factors into consideration such as how much money they'd be losing by buying all new software, the retraining involved, etc.



    &gt;"I recall seeing a comparison between Apple's new iMac and a similar setup from Gateway (flat panel, dvd burner, etc.)."



"Where, exactly, do you 'recall'? A link...?"




    I don't have the original link handy, but a quick search brings up many such comparisons. Here's one at random:



    <a href="http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2002/0 3%2F11%2FBU205972.DTL&type=business" target="_blank">SF Chronicle article</a>



    "Tell me, why my school, having £12,000 to spend are going PC? Gee, I wonder, but because I'm not intelligent I don't know."



    Oh please, that's anecdotal evidence at best. Hardly a trend. Do you want anecdotal evidence to the contrary? Fine. My wife's school district was mixed PC and Mac. Now, they're going all Mac, except in the offices. I can ask you the same silly question as to why that is happening...



    "Patronising, eh? Easily swayed...but you aren't. Good for you."



    No, I'm not. I see value in both platforms and continue to support both. There are very specific things I prefer a PC for, the majority of things I prefer a Mac for. Neither preference was based on price.



    "I think history shows, even in difficult and uncertain times for the platform, that the majority of Mac users are not as easily swayed as you."



    "Take away the imacs and the already waning ibook and Apple have some underlying problems to solve with their famous 'value added' approach."



According to your logic, Apple will go out of business very soon, because their iBook gimmick is wearing out, correct? It seems you still haven't grasped the concept of a product life cycle, have you?



    "The shrinking 'majority'. I guess if you want links to testify to that you'll have to get off your *** and look it or Apple's sales figures up yourself! "



I'm not the one making the assertion that the Mac market is shrinking and eroding like you are. Therefore, the burden is upon you to back up this claim, not me. In short, you shouldn't make claims that you cannot back up. If you do so, you should at least have the courtesy to mention that you cannot back up your claim with any actual evidence.



    "Macs are not without 'some' substance. "



    Okay, at least we agree on this much.



    "The 'imac' in a beige case. Dead on arrival. The 'gimmick' saved the company."



It's pure speculation on both of our parts how much of the original iMac's success was gimmick and how much was substance. I will say this: the iMac was the first consumer-based G3 machine, and it performed quite well for its time. It was a significant increase in performance for a consumer-level machine. It also came well equipped for the internet: a built-in modem, 10/100 Ethernet, and bundled internet-based software, all at the height of the internet frenzy. Also keep in mind just how the PC-based iMac clones, such as those from E-Machines and Compaq, failed miserably. Likewise, I seriously question just how much the "gimmick" factor came into play with the iMac's success. Additionally, the Performas (which the iMac replaced) were horrible machines that were not marketed well either. The iMac was designed for simplicity, and was marketed as such. Make all of the iMacs the same "snow" colour and I think they would have done almost as well.



    "The company bet on the imac and 'won'. But note how the 'gimmick' faded during those 'six years' when Apple failed to keep the spec list competitive, when they doggedly stuck with an antiquated cpu and graphic chip and 15 inch monitor. "



Yes, of course, how long do you think any product can succeed without a significant refresh? This all goes back to the product life cycle that I'm talking about. Why do you think the new iMac is doing so well? They sold 150,000 (or at least received orders for that many) in the first month. That beat any product in Apple's history. Of course, this is a brand new product that is sufficiently different from its predecessor. It's also in the early stage of its product life cycle.



    "Take away current 'new' imac sales and Apple look in trouble to me. That's my perception. You can read their sales figures differently if you like."



    You can say that with just about any company. Take away their best selling product and see how the company does. That's a silly argument to make.



    "You pay BMW prices, you expect a damn good engine. Fast. Not the fastest. But the G4 is a Ford engine in a BMW chassis."



    What do you think the difference between a Ford Mustang and a BMW 5-series is? I'll give you a clue, it's not just the engine. You can have a very fast engine in a Ford, but at the end of the day, it's still a Ford. The same goes for PCs. Generally speaking, they are not as smooth and well integrated as Macs. Hence the Mac experience.



    &gt;"gross oversimplification of what has changed since that time."



"No it isn't. I'm talking about hardware. I don't recall dropping on Apple's software in the 'recovery' period?"




Well, you should be comparing everything. Apple is not a software company. Apple would not survive selling software alone. Apple is a hardware company whose software adds value to its hardware. Likewise, when discussing the health of the company, you must compare the whole picture, not merely the small factor which you feel supports your case while ignoring all of the other factors.





    &gt;"few links which discuss this."



"Do we really have to? I'm having too much fun clobbering your Steve Jobs-inspired RDF."




Oh, I see. Again, if I don't agree with you, it must be due to an RDF, despite the fact that I'm also a PC user and despite the fact that you cannot back up your claims. Are you sure you don't live in Redmond, Washington?





    "Let's compare my 'low end' pc with the 'low end' Powermac?"



Okay, how about a G4 800 vs a P3 1GHz? These seem to represent the respective low ends. FYI, a 1.6GHz Athlon is not the low end for a PC.



    "Just how many task are 'velocity controlled'?"



In Apple's core markets, image manipulation, digital video editing / compression, etc., quite a few. More importantly, the major applications that are core to Apple's market are optimized for SIMD.



    "A dual 1 gig 'power'Mac G4 still gets thumped on Lightwave by a single Athlon Xp. A low end dual Athlon wipes the floor with it."



Like most applications, it depends on the specific task. I don't have Lightwave benches handy. However, what about Cinebench (a benchmark based on Cinema 4D)? There are some rendering functions which make use of dual processors. In those functions, the dual G4 would beat the Athlon. In other functions, the Athlon would beat the G4. Big deal. What about PS tests? What about MPEG-2 compression, what about scientific applications like BLAST, or encryption programs like RC5? You see, my point about across-the-board performance isn't as narrowly defined as your Lightwave tests. Further, what about total system performance, such as running multiple programs at the same time? What happens when you start a long render in Lightwave, then want to surf the net, or play an MP3? I think you'll find a different story regarding system performance with the dual G4 as compared to your single Athlon.
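    To make the dual-processor point concrete, here's a toy sketch (my own illustration, not any real renderer's code) of why those rendering functions scale: each CPU can shade its own half of the image without needing to talk to the other.

    [code]
    #include <pthread.h>
    #include <stdio.h>

    #define WIDTH  640
    #define HEIGHT 480

    static float image[HEIGHT][WIDTH];

    /* Stand-in for an expensive per-pixel shading function. */
    static float shade(int x, int y)
    {
        return (float)(x ^ y) / (float)WIDTH;
    }

    struct span { int first, last; };      /* the rows one thread owns */

    static void *render_rows(void *arg)
    {
        struct span *s = (struct span *)arg;
        for (int y = s->first; y < s->last; y++)
            for (int x = 0; x < WIDTH; x++)
                image[y][x] = shade(x, y); /* no sharing between threads */
        return NULL;
    }

    int main(void)
    {
        pthread_t top, bottom;
        struct span a = { 0, HEIGHT / 2 };      /* runs on one CPU */
        struct span b = { HEIGHT / 2, HEIGHT }; /* runs on the other */

        pthread_create(&top, NULL, render_rows, &a);
        pthread_create(&bottom, NULL, render_rows, &b);
        pthread_join(top, NULL);
        pthread_join(bottom, NULL);

        printf("rendered %d rows on two threads\n", HEIGHT);
        return 0;
    }
    [/code]

    Embarrassingly parallel work like this is exactly where a dual G4 pulls ahead of a faster single chip; serial tasks, of course, don't get that help.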



    "What, like those cheap Photoshop filters, the ones 'nobody' uses anymore? "



Do you care to elaborate on that a bit? Are you referring to SJ's PS bake-off against PCs where the PC gets smoked? It seems to me that it was based on the creation of movie posters like Monsters Inc. Let's see, I saw some transforms, RGB->CMYK conversions, Gaussian blurs, lighting effects... no, these filters never get used.



This conversation could go on and on forever. In the end, you seem upset because you're now a PC-only user who is basically a Mac user wannabe. Are Macs expensive? Compared to PCs, strictly speaking of the hardware, sure. Have Macs fallen behind in performance? Sure. Are they still competitive? I think so, at least in Apple's core markets. However, the bottom line is that you are now on the outside looking in. You're hoping that Apple will lower its prices so that you can become a Mac user again. Otherwise, you wouldn't be hanging out in Mac forums or even care about the Mac market in the first place. Oh well. At least I'm happy with my platform decisions. I hope one day you can resolve your issues and move on.



    Steve



    [edited excessively long URL to restore board formatting -Amorph]



[ 05-23-2002: Message edited by: Amorph ]
  • Reply 194 of 238
xype Posts: 672 member
    [quote]Originally posted by Bodhi:

Didn't I hear that one of the big pieces of news to come out of WWDC was that coding or writing in C++ has now become much easier and more stable due to something that is integrated in Jaguar? Could this, along with better G4s, make the next Power Macs even faster?[/quote]



It's true - apparently Apple will ship a much improved version of the C++ compiler for OS X, which in turn should make software faster after a simple "recompile". C++ is never easy if it's "done right", but maybe the compiler will also optimise the code in ways other than just speed. Anyway, it's nice to see the Mac front develop that nicely. I wonder if Metrowerks has an OS X CodeWarrior yet?
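    For what it's worth, the "free speedup" story is just: the same source, fed through a smarter optimiser. Here's a trivial example of the kind of loop that benefits (the compile line is illustrative only -- check your own compiler docs for the real flags):

    [code]
    /* mix.c -- a hot loop a better optimiser can unroll and schedule
     * without any source changes. Hypothetical recompile:
     *   cc -O3 -o mix mix.c
     */
    #include <stddef.h>

    /* Average two sample buffers into a third. */
    void mix(float *dst, const float *a, const float *b, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] = 0.5f * (a[i] + b[i]);
    }
    [/code]

    Whether a given app actually gets faster depends on whether its hot spots look like that, of course.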
  • Reply 195 of 238
    [lightwave has..]"a great 'hair'renderer with version 7-7.5. You'll pay big bucks for that 'add on' with Maya with no noticeable improvement in 'fur' quality."



I use Lightwave. You can't even compare SasLite with Maya Fur. Sasquatch is (a bit) closer, but nowhere near as versatile as Maya's, or as realistic.



I'd agree that Lightwave's renderer kicks Maya's. Thing is, most places use RenderMan, or sometimes EI (if they're in a rush), with Maya.



    /back on topic:

The Powermacs aren't quite as far behind as everyone seems to reckon. OK, they are behind, but they are still highly rated by professionals in many areas. If Macosrumors is right about the next specs [Laughing] they'll be right up there.



    This is not my first post - I just changed my name.
  • Reply 196 of 238
lemon bon bon Posts: 2,383 member
    Steve, haven't you got anything better to do?







    We obviously don't agree.



    Lemon Bon Bon
  • Reply 197 of 238
programmer Posts: 3,458 member
    I just read something which strongly implies that Motorola will not add DDR support to MPX, and it will not reach higher than 166 MHz (if it even gets there). Their next step will be to an on-chip memory controller with per-processor memory. No time frame was given. While not as factual as a press release from Motorola, this is a source I tend to believe (no I'm not going to publish a link), and yes it does shoot down The Register's 7460 / 7470 rumour.



    If we are lucky then the on-chip memory controller will arrive very soon. Otherwise I expect that the next PowerMac will be an XServe in a new case.
  • Reply 198 of 238
keyboardf12 Posts: 1,379 member
Speedwise, how would this compare, Programmer?
  • Reply 199 of 238
programmer Posts: 3,458 member
    [quote]Originally posted by keyboardf12:

Speedwise, how would this compare, Programmer?[/quote]



    How would which compare? The on-chip memory controller is a better approach, in my opinion, and "the way of the future". The trend since the invention of the transistor has been toward higher and higher levels of integration, and bringing the memory controller onto the chip is the next logical step. The MPX goes away and is replaced by an I/O and arbitration bus like RapidIO.



Simply adding DDR would have given a short-term performance increase, but it wouldn't be a long-term solution... it would have just been a stepping stone.



The XServe-style approach will probably not appear to be any better in benchmarks than current machines, nor will it run processor-intensive applications any faster. It should, however, make a big difference when combined with Quartz Extreme and lots of I/O. Consider that if you were running a memory-bound application while doing lots of GUI work, 50% of your performance is lost due to the time spent updating the GUI. I/O activity (i.e. hard disk, FireWire, or network access) would reduce this further. On an XServe the processor would run at full speed, as if nothing else was going on in the system... and yet your I/O and graphics will still be happening, and at a faster rate due to the improved controllers and QE's full use of the graphics chip and AGP. So while the machine might not be faster at specific things, it will probably be much more usable and may subjectively feel like a considerably faster machine. At least that's my theory. Time will tell if I'm right or not.



I'd still much rather see the on-chip memory controller... much of the stuff I do benefits from lots of processor <-> memory bandwidth.
  • Reply 200 of 238
keyboardf12 Posts: 1,379 member
    thanks.