Is Apple making way for the 970?


Comments

  • Reply 61 of 78
    [quote]Originally posted by Lemon Bon Bon:

    <strong>We need a pro-3D card.

    </strong><hr></blockquote>



    An ATI 9700 Pro 128MB w/ new OpenGL drivers should do just fine.
  • Reply 62 of 78
    [quote]Originally posted by DrBoar:

    <strong>Trying to get this meandering thread back into the mainstream of the topic!

    The single 1 GHz introduction in the

    1x1GHz-2x1.25-2x1.42 lineup, compared to the previous

    2x0.867-2x1.00-2x1.25, did look strange. Even with the price drop there are two obvious problems with the design. First, its CPU is weaker than its predecessor's; second, the midrange model offers 2.5 times the CPU power for not that much more. A dual 1 GHz would have looked more natural.</strong><hr></blockquote>



    I personally think this has less to do with preparing us for a single-970 introduction than with the marketing department getting wise to the fact that dual CPUs are a big seller. Cutting duals from the low end means many buyers will jump to the middle-priced machine. The low end drops in price, but overall you get a higher average selling price on your boxen, because you have to pay more for that dual setup you know you want (come on - you know you want a duallie!).



    I personally would rather have a dual 867 than a single 1000.
  • Reply 63 of 78
    cowerd Posts: 579 member
    [quote]We need a pro-3D card<hr></blockquote>Is there that much difference between a gamer's card and pro card--these days?
  • Reply 64 of 78
    [quote] An ATI 9700 Pro 128MB w/ new OpenGL drivers should do just fine.



    <hr></blockquote>



    It will do MORE than 'fine', I think.



    However, my point stands. I'd like to see a Wildcat or a Quadro on the Mac.



    If '68%' of workstation buyers want a Mac on the next budget merry-go-round? I'd say a 'high-end' card can be supported in the Mac market.



    Hell, 25% of Maya buyers are now Mac OS X folks. And that's only been out for a couple of years now.



    Looks like the Mac...once 'nowhere' in 3D...is a-coming home.



    That will please Lightwave fans. (Er. That's me.)



    Lemon Bon Bon
  • Reply 65 of 78
    airsluf Posts: 1,861 member
  • Reply 66 of 78
    macronin Posts: 1,174 member
    ATI, nVidia...



    Whichever has the best OpenGL/OS X drivers, and has the best integration with Maya...



    Now if Maya Unlimited would get released on the Mac platform...



    ;^p
  • Reply 67 of 78
    gar Posts: 1,201 member
    [quote]Originally posted by Lemon Bon Bon:

    <strong>In short, I kinda agree with the above post on what available evidence we have.</strong><hr></blockquote>



    dear LBB,

    what available evidence do we have? an FPU that, according to IBM, is AltiVec compatible? i don't believe there is much more than that, is there?



    (please, prove me wrong)
  • Reply 68 of 78
    nevyn Posts: 360 member
    [quote]Originally posted by gar:

    <strong>



    dear LBB,

    what available evidence do we have? an FPU that, according to IBM, is AltiVec compatible? i don't believe there is much more than that, is there?</strong><hr></blockquote>



    SpecFP for a 1.0 GHz G4 = 187.

    SpecFP for a 1.8 GHz PPC970 = 1000+.

    "Ready for SMP."



    I don't have SpecFP for the 1.4 GHz G4+; scaled linearly from the 1.0 GHz figure it should be about 261.

    The size and power consumption aren't out of line for a desktop. The cost is largely determined by die size, so there's no reason to expect it's too much for Apple.



    I can believe one more "speed bump" upgrade from Mot, but the timing is getting tighter and tighter. There's too much info from Mot that effectively says "G5 is dead", though: drop in morale, cuts, etc. The 7457 is slated for November -> way, way too late. At 1.3 GHz -> way, way too slow. (Search "Motorola Confidential Proprietary 7457" on Google.) Gotta love Google.
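    The 261 figure above is just linear clock scaling of the quoted 1.0 GHz number. A back-of-the-envelope sketch using only the figures quoted in this post (not official SPEC submissions):

    ```python
    # Back-of-the-envelope check of the SpecFP figures quoted above.
    # Assumes SpecFP scales roughly linearly with clock speed within a
    # CPU family, which ignores memory-bandwidth effects.

    g4_1ghz = 187         # quoted SpecFP for a 1.0 GHz G4
    ppc970_1p8ghz = 1000  # quoted "1000+" SpecFP for a 1.8 GHz PPC970

    # Linear estimate for a 1.4 GHz G4+
    g4_1p4_est = g4_1ghz * 1.4 / 1.0
    print(round(g4_1p4_est))  # ~262, in line with the 261 quoted

    # Even granting that estimate, the 970 is far ahead
    print(round(ppc970_1p8ghz / g4_1p4_est, 1))  # ~3.8x
    ```

    Even the most generous linear scaling of the G4 leaves the PPC970 figure nearly four times higher.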
  • Reply 69 of 78
    [quote]Originally posted by Lemon Bon Bon:

    <strong>

    It will do MORE than 'fine', I think.



    However, my point stands. I'd like to see a Wildcat or a Quadro on the Mac.

    </strong><hr></blockquote>



    But why?! The 9700 Pro w/ 128 MB will run circles around a Wildcat or a Quadro.
  • Reply 70 of 78
    amorph Posts: 7,112 member
    [quote]Originally posted by Programmer:

    <strong>



    But why?! The 9700 Pro w/ 128 MB will run circles around a Wildcat or a Quadro.</strong><hr></blockquote>



    LBB clearly likes spending money. You gonna stop him?



    In all seriousness, I was under the impression that the main things you bought with the high-end graphics cards were higher precision (mostly via unlocked firmware) and much more robust OpenGL support in the drivers, rather than what Carmack has dismissed as "Quake OpenGL". The latter advantage is mooted on the Mac, and apparently the former is too. So I guess LBB's getting a hole burned in his pocket.



    [ 02-21-2003: Message edited by: Amorph ]
  • Reply 71 of 78
    [quote]Originally posted by Amorph:

    <strong>



    LBB clearly likes spending money. You gonna stop him?



    In all seriousness, I was under the impression that the main things you bought with the high-end graphics cards were higher precision (mostly via unlocked firmware) and much more robust OpenGL support in the drivers, rather than what Carmack has dismissed as "Quake OpenGL". The latter advantage is mooted on the Mac, and apparently the former is too. So I guess LBB's getting a hole burned in his pocket.

    </strong><hr></blockquote>



    The only thing I'm aware of that the consumer GeForce4 Ti didn't have enabled was the high-speed anti-aliased line draw. And it was there, just disabled (there was an article floating around about how to convert a card to the Pro version by modifying some resistors on it). The new chips don't even have this restriction, I believe -- if the Mac versions ever did. The Mac drivers have never been "Quake OpenGL"; that was an odd PC-ism.



    The current boards have the fastest, widest, largest RAM possible paired with the GPUs, and the GPUs are running at top speed. The competition between nVidia and ATI has assured this. If there is going to be a new "high-end" workstation card, it will be a board with multiple R300s or NV30s (or the next versions of them). Can you imagine the cooling system that multiple NV30s would require? Egad.
  • Reply 72 of 78
    amorph Posts: 7,112 member
    [quote]Originally posted by Programmer:

    <strong> Can you imagine the cooling system that multiple nv30s would require? Egad.</strong><hr></blockquote>



    Say goodbye to all four PCI slots.



    Maybe it would require a dedicated 30 amp circuit, like our dual Alpha. <img src="graemlins/lol.gif" border="0" alt="[Laughing]" />
  • Reply 73 of 78
    <a href="http://www.spec.org/gpc/opc.data/vp7/summary.html" target="_blank">http://www.spec.org/gpc/opc.data/vp7/summary.html</a>



    <a href="http://www.nvidia.com/view.asp?PAGE=quadrofx" target="_blank">http://www.nvidia.com/view.asp?PAGE=quadrofx</a>



    My contention wasn't necessarily over performance, full stop. Choice is a Good Thing™.



    It's hotted up the graphics-card competition between ATI and Nvidia in the PowerMac lines. (No more pants ATI Rage 16 MB card in yer top tower. Shudder...)



    So, why Wildcat? It's a popular brand. Many pros use them. The Quadro 2000 (or whatever it's called...) seems to stomp all over the Wildcat, even more so in price-performance. So? Choice is good. It has a reputation. Prestige. Pros like their 'snobby' kit. Perception is important. Y'know, like having Maya on the software side. I'd like to see more graphics card vendors in the 'high end': a Quadro, a FireGL... a Wildcat. Better than having the 'sneaky' Radeon 9000 in yer top end.



    Why have IBM make CPUs? Why Intel? Broaden your appeal beyond Mac users who are used to, and seemingly satisfied with, less. Increase your options (Apple likes 'options'). Choice. Important to customers (though Apple sometimes seems to think it isn't...)



    Elsa shipped boosted GL drivers for things like Studio Max with their graphics cards. Some cards have optimisations for Lightwave, Max, etc.



    The anti-aliasing thing was an issue that appears to have been cracked with the recent VPUs. The bang for buck of the 9700 probably won't be beaten by anybody from its launch until the end of the first half of 2003, especially if you compare its value to a Wildcat's. I'd still like to see the GeForce FX, though.



    Y'know, Amorph, I bought the whole boodle five years ago. £7,000. $10,000 in yer dollar money, maybe. The Formac 80 cost £600. No OpenGL. (Just the pathetic 'not-invented-here' QuickDraw 3D.) But it had great Photoshop performance in PS4!



    I won't be doing that again. (Needless to say.) I like Apple, but not THAT much. You can buy a lot more five years later. No graphics card over £300, and no 970 over £2,000. (And I'll be expecting dual 970s for that.)



    It may be a 9700 I stick in my next Apple tower. Maybe its follow-up, the R350. (If available...)



    But some people do like burning a hole in their pocket (especially when the company's buying!) I've since learned that 7K is very hard earned. Even more so as you get older. (Unless you're a 'programmer', of course... which I'm not...)



    I look forward to spending my money when Apple finally deliver a tower that is worth it. (Y'know, something that makes my Athlon tower look lame...) But I won't be spending silly money. I've learned my lesson on that one.



    Lemon Bon Bon



    :eek:



    [ 02-22-2003: Message edited by: Lemon Bon Bon ]



  • Reply 74 of 78
    I hope Apple goes with the Nvidia GPU. ATI is more partial to Windows than Nvidia. I mean, the 9700 has had "coming soon" status for over a month, but Nvidia has all the 4X AGP chips it makes for Windows compatible with the Mac as well. Not to mention the fact that Nvidia has superior chip architecture. And, seeing as the newest Apple releases feature Nvidia chips, it seems that the 970, too, will have a GeForce inside.
  • Reply 75 of 78
    [quote]Not to mention the fact that Nvidia has superior chip architecture.<hr></blockquote>Haven't seen the 9700 vs. FX benchmarks, have you?
  • Reply 76 of 78
    Well, the 9700 isn't available for the Mac yet. It has, rather, been "coming soon" for about a month or two.
  • Reply 77 of 78
    [quote]Originally posted by os10geek:

    <strong>I hope Apple goes with the Nvidia GPU. ATI is more partial to Windows than Nvidia. I mean, the 9700 has had "coming soon" status for over a month, but Nvidia has all the 4X AGP chips it makes for Windows compatible with the Mac as well. Not to mention the fact that Nvidia has superior chip architecture. And, seeing as the newest Apple releases feature Nvidia chips, it seems that the 970, too, will have a GeForce inside.</strong><hr></blockquote>



    <a href="http://store.apple.com/1-800-MY-APPLE/WebObjects/AppleStore.woa/263/wo/Jp4Tid5Hr17f3k2yqNG1bfngI3L/0.3.0.3.27.25.0.0.1.0.3.1.1.0?126,45" target="_blank">Really?</a> And the 9700 is there as a coming-soon BTO option. I think Apple wants both ATI and nVidia to have all of their GPUs available on the Mac. The 9700 is late specifically because they are writing (and testing) new drivers from the ground up for it, and that takes a lot of time. There have also been a lot of OpenGL changes in the last 8 months, so they need to track that as well. Once the 9700 is here I think we'll see more rapid updates to newer versions (i.e. R350 and later).
  • Reply 78 of 78
    You're right, Programmer... Both would be better.