
The real GeForce 4

post #1 of 54
Thread Starter 
[quote]Originally posted by powerdoc:
Before complaining, you should wait for benchmarks. The only information available at the moment is that the GeForce 4 MX provides 1.1 billion pixels per second.
The GeForce 3 provides 800 million pixels per second, and the Radeon 8500 also provides 1.1 billion.
So I doubt you can say the GeForce 4 MX is a "GeForce 3 MX." We don't know how many millions of triangles per second the 4 MX can provide.

Remember the difference between a GeForce 2 and a GeForce 2 MX: the same number of triangles per second, but the GeForce 2 provided 800 million pixels per second and the GeForce 2 MX half that. The GeForce 3 provided the same number of pixels per second but was much faster in triangles per second.
So it's always hard to make comparisons before real benchmarks.

But once more, people are complaining about this video card even though it's the best-performing video card built into a Mac in years (is there anybody here who prefers the ATI Rage chip family...?).
With nVidia, even if we don't get the company's most powerful chip, we get their latest generation, even before the PC. That was never the case with ATI.[/quote]

What the hell. Do you live in a box or something? We've known that the GF3 outperforms the GF4 MX since the day after its announcement.

Go to www.xlr8yourmac.com and scroll down the main page a bit until you find the graph showing the same system with a GF3 and a GF4 MX.
-Moo to you too!
www.cowofwar.com
post #2 of 54
Thread Starter 
-Moo to you too!
www.cowofwar.com
post #3 of 54
As evidenced by the Quake III benchmarks on Apple's site, the GeForce 4 MX that was released with the latest round of towers is effectively a crippled "3 MX," not worthy of the 4 designation. However, the real GeForce 4 is just around the corner; common knowledge at this point places an nVidia release in February.
The Apple developer docs detailing the newest PowerMacs mention a 128MB video card. I can only guess that this means the real GeForce 4 is a 128MB card, dual-head (given the 4 MX), 64MB each way. Very reasonable, if you ask me.
There. We have advance knowledge of a future Apple release.

-DisgruntledQS733Owner

Edit: Just checked Apple's site; Radeon is single head. You get the idea. **** hell damn ass shit bitch.

[ 02-01-2002: Message edited by: DisgruntledQS733Owner ]
post #4 of 54
There's only 32 MB in the Radeon 7500.
I can change my sig again!
post #5 of 54
[quote]Originally posted by DisgruntledQS733Owner:
Edit: Just checked Apple's site; Radeon is single head. You get the idea. **** hell damn ass shit bitch.

[ 02-01-2002: Message edited by: DisgruntledQS733Owner ][/quote]

Both the Radeon and the 'GF4' cards with the new PowerMacs are dual-head.
post #6 of 54
[quote]Originally posted by Eugene:
There's only 32 MB in the Radeon 7500.[/quote]

I don't think the Radeon splits the RAM when you're only using a single monitor, though, like the GeForce cards do.
post #7 of 54
I noticed Apple doesn't even offer the GeForce 3 as a BTO option on the new PowerMacs; in fact, I can't find it anywhere on their site. Yet the GeForce 4 MX is less of a video card than the GeForce 3?

In other words, Apple lowered the top-end video card specs for the Mac. I fear that this means the GF3s didn't sell well enough to justify the cost of even offering them as a BTO option. Sad.

Personally I'd be happy with a GF4 MX; that looks like an awesome card for the price. But it's always best to keep the option of a totally bad-ass, money-is-no-object gaming video card on the Mac. It's good for the platform: even if Apple doesn't make much money on it (or loses a bit), it advertises to people that the Mac can rise to the demands of serious gamers if the need arises. Apple would do well to either get the GeForce 3 back as a BTO option, or get a real GF4 up there pronto.

The optimist in me says that if we wait a short while we'll see that Apple was just in transition from the GF3 as the high-end gaming card, to a true GF4 as the high end. Hope it's true.
post #8 of 54
[quote]Originally posted by Junkyard Dawg:
I noticed Apple doesn't even offer the GeForce 3 as a BTO option on the new PowerMacs; in fact, I can't find it anywhere on their site. Yet the GeForce 4 MX is less of a video card than the GeForce 3?[/quote]

When Mac OS 8 was released, a number of third-party application developers jumped to version 8 from wherever they had been because they were swamped with concerns about whether a v.4 product would run on a v.8 OS. It sounds silly to anyone familiar with software versioning, but then there aren't nearly enough of those people.

So, by the same logic, how many people would fork out $300 or so for a GeForce3 when a GeForce4 came as the default? Even if, name notwithstanding, the GF3 was the better performer?
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #9 of 54
My answer: Wait for nVidia's announcement.
I can change my sig again!
post #10 of 54
[quote]it advertises to people that the Mac can rise to the demands of serious gamers if the need arises.[/quote]

Not to sound like a PC bigot or anything (oh, I miss the Mac OS), but even with a GF3 a serious gamer isn't going to consider a Mac. To tell you the truth, I think the GF3 is kind of a waste on the Mac.
Those who dance the dance must look very foolish to those who can't hear the music
post #11 of 54
Well, it's not a waste to me (see http://homepage.mac.com/onrop1/.Pictures/FPS%2DGF3.jpg)! My GF3 rocks pretty well in a dual 533. I built a PC just for gaming, but my Mac "feels" better even though I get 20+ fewer fps in UT. I've tried adjusting the ballistics over and over again, but I just can't move as efficiently as I can on my Mac.

The GF4 MX is the bottom-of-the-line card. It's a great card for the money and is certainly a lot better than the GF2 MX that came with my machine. Just sell it on eBay, put that cash towards a PC GF4 (see http://www.xbitlabs.com/news/images/2002-01/geforce4-01.jpg), and flash it to work in your Mac. I think you'll be happy then. You just have to be patient until the GF4s are released.

The Asus GF4s look awesome:
[quote]ASUS V8460 Series GeForce4 Ti 4600 Graphics Card
ASUS V8460Ultra series is powered by the most advanced graphics processing unit on earth, the GeForce4 Ti 4600. A core clock speed of 330MHz and memory clock speed of 660MHz, combined with 128MB DDR SDRAM, provide the ultimate graphics experience with revolutionary technologies such as the nFiniteFX™ II engine for complex geometry and animation, Accuview Antialiasing™ for unbeatable visual quality and frame rate, as well as nView for multiple display flexibility and user control.

ASUS V8440 Series GeForce4 Ti 4400 Graphics Card
ASUS V8440 Series is powered by the GeForce4 Ti 4400 GPU. A core clock speed of 300MHz and memory clock speed of 550MHz, combined with 128MB DDR SDRAM, provide exceptional performance in all the latest games and applications. Also included are revolutionary technologies such as the nFiniteFX™ II engine for complex geometry and animation, Accuview Antialiasing™ for unbeatable visual quality and frame rate, as well as nView for multiple display flexibility and user control.

ASUS V8170Pro GeForce4 MX Pro Graphics Card
With the most integrated GPU, the GeForce4 MX Pro, the V8170 delivers the best performance and value for mainstream PCs. 64MB of DDR SDRAM, a 300MHz core clock speed, and a 550MHz memory clock speed provide ample performance for all multimedia applications. The V8170 also includes new technologies such as Lightspeed Memory Architecture™ II, nView, and Accuview Antialiasing™.

ASUS V8170DDR GeForce4 MX DDR Graphics Card
The V8170DDR leverages the GeForce4 MX DDR GPU with 64MB DDR SDRAM to provide competitive performance and value for mainstream PCs. Core clock speed runs at 270MHz and memory clock speed is 400MHz. Some GeForce4 MX technologies included are Lightspeed Memory Architecture™ II, nView, and Accuview Antialiasing™.

ASUS V8170SE GeForce4 MX SDR Graphics Card
The V8170SE provides reliable and cost-effective GeForce4 MX performance for mainstream PCs. Using the GeForce4 MX SDR GPU and 64MB of SDRAM, the V8170SE has a core clock speed of 250MHz and a memory clock speed of 166MHz. It also includes new GeForce4 MX technologies such as Lightspeed Memory Architecture™ II, nView, and Accuview Antialiasing™.[/quote]
post #12 of 54
Did any of you think about what impact a "128MB RAM GeForce 4" would have on PowerMac prices? Of course it's the 4 MX; it's cheap and it delivers good results. Not everyone needs Quake 3 at 500 fps.
oy!
post #13 of 54
Can anyone advise on which is the best card for 2D work? The Radeon or the GF4 MX?

Thanks.
--
move forward.
--
post #14 of 54
Bumping back to the top. This thread was pushed to the bottom as a reply was added to it when the AI clock was somehow set to April 15/16, 1990.
post #15 of 54
[quote]Originally posted by The Toolboi:
it advertises to people that the Mac can rise to the demands of serious gamers if the need arises.

Not to sound like a PC bigot or anything (oh, I miss the Mac OS), but even with a GF3 a serious gamer isn't going to consider a Mac. To tell you the truth, I think the GF3 is kind of a waste on the Mac.[/quote]

I question whether Apple is really interested in going after the hardcore gamer. Jobs is only interested in lip service - the MX cards make it appear to the much, much larger pool of casual gamers that Apple has hardware that is at least in the ballpark. When it comes to gaming, Apple is all about perception...

Caler
You can't get here from there.
post #16 of 54
I doubt very much that *any* of the cards being talked about have a real impact on 2D graphics performance. Perhaps they help in the usual "my documents scroll faster" way, but in terms of accelerating the speed at which your images render, they help not at all. All the rendering work is done by the CPU, hence the reason so many 2D and 3D graphics types are clamoring for *much* faster Power Macs... graphics cards don't help them.
Aldo is watching....
post #17 of 54
Can anyone explain why top-of-the-line PCs have a hard time dealing with 2D vector programs like Illustrator and Acrobat?
It must have something to do with graphics performance...
Bill Bradley to comedian Bill Cosby: "Bill, you are a comic, tell us a joke!"
- "Senator, you are a politician, first tell us a lie!"
post #18 of 54
[quote]Originally posted by xype:
Did any of you think about what impact a "128MB RAM GeForce 4" would have on PowerMac prices? Of course it's the 4 MX; it's cheap and it delivers good results. Not everyone needs Quake 3 at 500 fps.[/quote]

Spending $3000 on a professional "workstation," I would not expect to get a cheap, run-of-the-mill graphics card. These aren't iMacs; these are PowerMacs. The Radeon 8500 or GeForce 4 should be standard equipment on all PowerMacs except maybe the $1599 one, and the GeForce 4 MX should be in the iMac.
post #19 of 54
[quote]Originally posted by applenut:
Spending $3000 on a professional "workstation," I would not expect to get a cheap, run-of-the-mill graphics card. These aren't iMacs; these are PowerMacs. The Radeon 8500 or GeForce 4 should be standard equipment on all PowerMacs except maybe the $1599 one, and the GeForce 4 MX should be in the iMac.[/quote]

Your point? The $2800 Sony MX 1.7 GHz comes with a GeForce2 MX.
I can change my sig again!
post #20 of 54
[quote]Originally posted by super:
Can anyone advise on which is the best card for 2D work? The Radeon or the GF4 MX?

Thanks.[/quote]

The Radeon. Especially for video.
post #21 of 54
Before complaining, you should wait for benchmarks. The only information available at the moment is that the GeForce 4 MX provides 1.1 billion pixels per second.
The GeForce 3 provides 800 million pixels per second, and the Radeon 8500 also provides 1.1 billion.
So I doubt you can say the GeForce 4 MX is a "GeForce 3 MX." We don't know how many millions of triangles per second the 4 MX can provide.

Remember the difference between a GeForce 2 and a GeForce 2 MX: the same number of triangles per second, but the GeForce 2 provided 800 million pixels per second and the GeForce 2 MX half that. The GeForce 3 provided the same number of pixels per second but was much faster in triangles per second.
So it's always hard to make comparisons before real benchmarks.

But once more, people are complaining about this video card even though it's the best-performing video card built into a Mac in years (is there anybody here who prefers the ATI Rage chip family...?).
With nVidia, even if we don't get the company's most powerful chip, we get their latest generation, even before the PC. That was never the case with ATI.
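For what it's worth, those headline numbers can be reproduced with back-of-the-envelope math: theoretical fill rate is just core clock times pixel pipelines (times texture units per pipe for the texel rate). A minimal sketch, assuming the commonly published clocks and pipeline counts of the day (treat them as approximations, not measurements):

[code]
#include <stdio.h>

/* Back-of-the-envelope fill rates: Mpixels/s = core clock (MHz) x pixel
 * pipelines; Mtexels/s additionally multiplies by texture units per pipe.
 * Clock and pipeline figures below are the commonly published specs of
 * the day -- assumptions for illustration only. */
typedef struct {
    const char *name;
    double clock_mhz;
    int pipelines;
    int tmus_per_pipe;
} Gpu;

int main(void) {
    Gpu cards[] = {
        { "GeForce 3",        200.0, 4, 2 },
        { "GeForce 4 MX440",  270.0, 2, 2 },
        { "Radeon 8500",      275.0, 4, 2 },
    };
    int i;
    for (i = 0; i < 3; i++) {
        double mpix = cards[i].clock_mhz * cards[i].pipelines;
        printf("%-16s %6.0f Mpixel/s  %6.0f Mtexel/s\n",
               cards[i].name, mpix, mpix * cards[i].tmus_per_pipe);
    }
    return 0;
}
[/code]

Run that and the GeForce 3 comes out at 800 Mpixel/s and 1600 Mtexel/s, while the 4 MX lands at 540 Mpixel/s and roughly 1.1 Gtexel/s. That suggests the "1.1 billion" quoted for the 4 MX is a texel rate, whereas the 8500's 1.1 billion is a pixel rate - the headline numbers aren't measuring the same thing.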
post #22 of 54
There are already benchmarks... they're not impressive.

Some are saying the Radeon 7500 is probably a better choice as an "all-around" card.
post #23 of 54
[quote]Originally posted by applenut:
There are already benchmarks... they're not impressive.

Some are saying the Radeon 7500 is probably a better choice as an "all-around" card.[/quote]
Do you have any links?
post #24 of 54
[quote]Originally posted by powerdoc:
Do you have any links?[/quote]

Mike at www.xlr8yourmac.com has tested the 4 MX and 7500 in Q3. The 7500 won at 1600x1200/32, but the 4 MX took the lower resolutions. He's going to post more comprehensive benchmarks on Monday.

I know that the 8500 Mac Edition and full-fledged GeForce 4 aren't out yet, but once they are, Apple better damn well give us the BTO option for one of these cards. Face it, people, the 4 MX is a huge step down from the GF3. You see, www.anandtech.com ran benchmarks on the Unreal 2 engine. What were their conclusions? The 8500 and GF3 were the only cards capable of running it effectively, with the 8500 pretty much coming out on top. Why? Because they're both next-gen cards with pixel and vertex shaders. The 7500 is basically an overclocked original Radeon, and the 4 MX is a neutered GF3 (it's missing the GF3's pixel and vertex shaders). So Q3 doesn't tell the whole story, since it doesn't take advantage of these new capabilities. An 8500, GF3, or full GF4 gets you a future-capable card; a 7500 or 4 MX only helps older apps. The GF3 is a MUCH better card than the 7500 or 4 MX. So basically, Apple downgraded our options. Are you really happy about that?

I know I'm not.
post #25 of 54
[quote]The Radeon. Especially for video.[/quote]

Thanks. Care to explain the reasons behind that? I'm just trying to get some education.
--
move forward.
--
post #26 of 54
Damn, those fps are good. One of my friends has a 2GHz P4 with a GF3, 512MB RDRAM, etc., and his PC doesn't match those frames at any resolution against the dual 1GHz G4 with a GF3.

Tell me, are those fps tests for Quake 3 done at max settings? If so, then damn. [Chilling]
Abhor the Stereotype, respect the Individual.
1.33Ghz 15" Powerbook: 80GB HD, 1GB RAM, OSX.4.7, Soundsticks II, 320GB LaCie FW800 EXT HD, iPod 20GB 4G
post #27 of 54
Many workstations have cards that can't paint pixels as fast as a good gaming card.

What is Apple doing? They're getting some cards that will be ready for the next LONG OVERDUE update of the OpenGL spec. That's why certain ATI products are being left behind; there's no point supporting them, as the acceleration is going to be next to nothing (they don't even have T&L, and even the portable products still lack T&L). But a bunch of people who should've known better are even trying to sue Apple over this. Oh well. This will be the graphics boost people who work for a living are looking for. You can bet that OpenGL integration in OS X will, in the near future, allow your video system to help out **a lot**, and without the special drivers and expensive professional cards that we currently need to get any kind of meaningful acceleration of 3D, 2D, or video work.

People are bitching, but Apple is going to provide a very clean solution where your desktop and major apps all get hooks into any new *compliant* card. The way it should be. Give 'em a year, you'll see. There is no special Raycer chip or Apple graphics card (that's just Kormac-inspired stupidity). Those engineers will give you a graphical/system boost by providing a new level of API features and API/OS integration and functionality, both for the desktop and for devs to use in important software as well as in gaming-related frivolity.

Even on a GeForce2 MX there is a host of effects (transparency, blending, etc.) that could be brought to bear on a much wider scope than just games. That's what you'll see.
IBL!
post #28 of 54
[quote]Originally posted by Matsu:
Many workstations have cards that can't paint pixels as fast as a good gaming card.

What is Apple doing? They're getting some cards that will be ready for the next LONG OVERDUE update of the OpenGL spec. That's why certain ATI products are being left behind; there's no point supporting them, as the acceleration is going to be next to nothing (they don't even have T&L, and even the portable products still lack T&L). But a bunch of people who should've known better are even trying to sue Apple over this. Oh well. This will be the graphics boost people who work for a living are looking for. You can bet that OpenGL integration in OS X will, in the near future, allow your video system to help out **a lot**, and without the special drivers and expensive professional cards that we currently need to get any kind of meaningful acceleration of 3D, 2D, or video work.

People are bitching, but Apple is going to provide a very clean solution where your desktop and major apps all get hooks into any new *compliant* card. The way it should be. Give 'em a year, you'll see. There is no special Raycer chip or Apple graphics card (that's just Kormac-inspired stupidity). Those engineers will give you a graphical/system boost by providing a new level of API features and API/OS integration and functionality, both for the desktop and for devs to use in important software as well as in gaming-related frivolity.

Even on a GeForce2 MX there is a host of effects (transparency, blending, etc.) that could be brought to bear on a much wider scope than just games. That's what you'll see.[/quote]
There will be a revision of OpenGL: Programmer confirmed that in another thread. He says the revision of OpenGL will include new features that can be used by different graphics cards, not special extensions from nVidia or ATI.

Perhaps we will see this revision in Mac OS 10.2.
post #29 of 54
[quote]Originally posted by Gustav:
The Radeon. Especially for video.[/quote]

Yes. For instance, it's a known fact that ATI's DVD playback is vastly superior to nVidia's. Others claim that ATI's overall image quality is better too.

Matsu,

I understand what you're saying, and I think it's great that Apple's dropping older cards. They dropped the old Radeon a while back, and the GF2 MX is gone now too. Getting an upgrade to OpenGL will be a great thing also. But that's still no excuse for dropping the GF3. It's vastly more capable than the 4 MX. And the 7500 is nothing more than the old Radeon. You talk about future capability, and that's where the GF3 and 8500 shine. The 4 MX and 7500? Uh, well, they run Q3 well. But if they can't leverage the new pixel and vertex shader tech, then how are they cards for the future?
post #30 of 54
In comparing ATI and nVidia, I often think of ATI as being much like Apple and nVidia being much like the PC industry. ATI does it RIGHT. They produce chips with full feature sets; they always have. They produce chips that provide the best image quality of any in their industry segment. ATI has always had better image quality than 3dfx or nVidia.

nVidia goes for raw power and speed. I see them as taking somewhat of a quick-and-dirty approach. Don't get me wrong, they produce good drivers; it's just that I've never read an interview with them in which they stated image display quality as a top priority. Textures, yes; your desktop picture (which I spend quite a bit of time looking at), no. They are gradually developing the feature set that ATI has always given us, but their goal is raw power.

ATI's goal is to produce cards that give you the best all-around graphic quality. How many terapixels per second they can pump is somewhat secondary (although now they have to worry just to stay in the market), and that's the way it should be.
post #31 of 54
I agree that not having a high-end graphics card available in the line-up is just plain silly, but I'm hoping that is just an availability issue. The geForce3 was a really expensive card to produce, and the high-end geForce4 is expected to perform better but cost less. Hopefully Apple adds this card to its BTO lineup next week.

The choice of an NV17-based chipset is probably driven by cost -- the NV25 is very expensive, and its primary addition is the programmable vertex engine. The NV17's pixel engine is pretty much the same as the geForce3's. The next OpenGL release for OSX will likely include a standardized vertex shader program, and Apple will likely do what Microsoft did in DirectX8 -- provide a CPU-based vertex shader implementation. The great thing is that the G4's AltiVec unit will simply rock at doing this, and we ought to see decent performance out of the NV17 despite its lack of a programmable vertex unit. On a dual G4 800 or 1000, it's possible that the software implementation will outrun a geForce3's vertex shaders (I say dual because then you still get it happening in parallel, assuming the OpenGL implementation is multi-threaded). The NV25 has two vertex shader units running in parallel at a higher clock rate, so it'll do better than the current crop of G4s. The performance will be highly dependent on content, of course, since the GPU implementation benefits from being later in the pipeline.
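To make that concrete, the core of a software vertex path is just a matrix transform applied to every vertex, as in the minimal sketch below (illustrative hand-rolled C, not Apple's or Microsoft's actual implementation; a real one would be vectorized with AltiVec):

[code]
#include <stddef.h>

/* A vertex in homogeneous coordinates. */
typedef struct { float x, y, z, w; } Vec4;

/* Transform n vertices by a 4x4 column-major matrix (OpenGL layout).
 * This is the inner loop a CPU-side vertex path spends its time in;
 * AltiVec can do the four multiply-adds per component in parallel,
 * which is why a fast G4 can stand in for missing GPU vertex hardware. */
void transform_vertices(const float m[16], const Vec4 *in, Vec4 *out, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++) {
        out[i].x = m[0]*in[i].x + m[4]*in[i].y + m[8]*in[i].z  + m[12]*in[i].w;
        out[i].y = m[1]*in[i].x + m[5]*in[i].y + m[9]*in[i].z  + m[13]*in[i].w;
        out[i].z = m[2]*in[i].x + m[6]*in[i].y + m[10]*in[i].z + m[14]*in[i].w;
        out[i].w = m[3]*in[i].x + m[7]*in[i].y + m[11]*in[i].z + m[15]*in[i].w;
    }
}
[/code]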

Note that current Mac-based benchmarks don't include any use of shaders, because they are currently unavailable through Apple's OpenGL (no extensions yet, either). This might explain why the PC framerates appear lower in some cases -- the rendering might be simplified due to the lack of shader access. Shaders don't necessarily make things faster; they make them much more flexible and allow the GPU to do much cooler stuff... usually at the cost of some framerate.
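On the PC side, apps detect shader support by scanning the GL_EXTENSIONS string. A minimal sketch of the classic check, using the real nVidia vertex-shader extension name (which, as noted, Apple's OpenGL doesn't export yet):

[code]
#include <string.h>
#include <OpenGL/gl.h>  /* Apple's header path; <GL/gl.h> elsewhere */

/* Returns nonzero if the current GL context advertises the named
 * extension, e.g. "GL_NV_vertex_program" for nVidia vertex shaders.
 * A GL context must be current before calling glGetString. (A robust
 * check would match whole space-delimited tokens, not substrings.) */
int has_extension(const char *ext)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, ext) != NULL;
}
[/code]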
Providing grist for the rumour mill since 2001.
post #32 of 54
[quote]Originally posted by The Swan:
In comparing ATI and nVidia, I often think of ATI as being much like Apple and nVidia being much like the PC industry. ATI does it RIGHT.[/quote]

Oh, I wouldn't agree with that at all. ATI came from a 2D background, so their 2D feature set (including movie playback and crisp image output) tends to be better. Their 3D feature set, however, was laughable in the early Rage products, weak in the Rage128, and is only really catching up in the Radeon line. The current Radeon2 has them roughly on par with the pre-NV25 chips in terms of 3D.

nVidia, on the other hand, has been all about 3D from their inception. The 2D stuff to them has been almost an afterthought. As of geForce4, however, they've pretty much caught up with ATI as far as visual quality and video playback go.

The graphics chip market seems to have stabilized on these two vendors, at least for now, and both of them are producing comparable chips for the first time ever. You no longer have to choose between 2D and 3D -- you get top-notch GPUs that do both equally well. For at least the rest of this year it looks like they will have comparable products. If ATI meets its aggressive schedule, they should pull slightly ahead by fall before nVidia jumps past them again. Driver quality, price, and OEM deals will determine which of them succeeds better this year, but I'm hoping that they'll split the market 50/50 and both continue to develop such awesome GPUs.
Providing grist for the rumour mill since 2001.
post #33 of 54
Thanks Programmer,

I think some people misunderstood what I meant about an updated OpenGL API. I was referring to dropping support for older cards that clearly wouldn't be able to supply any meaningful acceleration of new features. As an API, **any** compliant card would work, natch. I'm pretty sure I said that. You can bet that Rage 128-based products won't be compliant, so they'll be dropped. The spec probably won't include anything that a Radeon or GF2 MX can't do, so they should be supported insofar as they should be compliant with the new spec.

What we really need is for Apple to provide very tight Quartz/OSX integration, so that capable machines can get a nice graphical speed boost. Should the API provide the right hooks for a new level of effects and features, then it is not inconceivable that software like Photoshop, Premiere, Final Cut, Maya, etc. could see meaningful boosts in performance.

Right now you only get that from professional graphics cards (on both the PC and Mac side); as far as work goes, it is still the CPU that does most of it. Even the fastest gaming card doesn't significantly speed up your 3D or video work.

I think the 4MX is a very nice upgrade to the 2MX. They didn't downgrade anything at all; they just took away an expensive option that few people went for anyway. Rather than a GF4 Ti or whatever gaming Radeon, Apple should provide a real professional option. Pro users don't really need ultra frame rates; they need number crunching (polys, curves, lighting effects, etc.). That's good for games too, but a card that can chomp 25-50% off our render times in any number of pro apps means more to the PowerMac market than a card that can pump out 350 fps in Quake.

Apple needs to provide upcoming versions of the Quadro and FireGL as options, not the GF4 Ti. The 4MX is more than enough for gaming. PowerMacs are for work, not games.
IBL!
post #34 of 54
[quote]Originally posted by Matsu:
I think some people misunderstood what I meant about an updated OpenGL API. I was referring to dropping support for older cards that clearly wouldn't be able to supply any meaningful acceleration of new features. As an API, **any** compliant card would work, natch. I'm pretty sure I said that. You can bet that Rage 128-based products won't be compliant, so they'll be dropped. The spec probably won't include anything that a Radeon or GF2 MX can't do, so they should be supported insofar as they should be compliant with the new spec.[/quote]

Actually, I think the Rage128 is OpenGL compliant and should be fully supported for quite a while going forward. It's not a real speed demon, and any vertex programs will need to run on the CPU (also true for the geForce 2MX and 4MX).

[quote]Originally posted by Matsu:
What we really need is for Apple to provide very tight Quartz/OSX integration, so that capable machines can get a nice graphical speed boost. Should the API provide the right hooks for a new level of effects and features, then it is not inconceivable that software like Photoshop, Premiere, Final Cut, Maya, etc. could see meaningful boosts in performance.

Right now you only get that from professional graphics cards (on both the PC and Mac side); as far as work goes, it is still the CPU that does most of it. Even the fastest gaming card doesn't significantly speed up your 3D or video work.

I think the 4MX is a very nice upgrade to the 2MX. They didn't downgrade anything at all; they just took away an expensive option that few people went for anyway. Rather than a GF4 Ti or whatever gaming Radeon, Apple should provide a real professional option. Pro users don't really need ultra frame rates; they need number crunching (polys, curves, lighting effects, etc.). That's good for games too, but a card that can chomp 25-50% off our render times in any number of pro apps means more to the PowerMac market than a card that can pump out 350 fps in Quake.

Apple needs to provide upcoming versions of the Quadro and FireGL as options, not the GF4 Ti. The 4MX is more than enough for gaming. PowerMacs are for work, not games.[/quote]

I'm never clear on exactly what the professional market wants in the way of 3D acceleration... I mean, the geForce3 & 4 are really fast GPUs and blow away professional boards from even just 4 years ago. Is it higher vertex processing speeds? More memory? More pixel rate?
Providing grist for the rumour mill since 2001.
post #35 of 54
[quote]Originally posted by Matsu:
Thanks Programmer,
Apple needs to provide upcoming versions of the Quadro and FireGL as options, not the GF4 Ti. The 4MX is more than enough for gaming. PowerMacs are for work, not games.[/quote]

Huh? So what you're saying is that after I spend a few thousand dollars on a midrange PowerMac, monitor, RAM, HDs, etc., I should have to go out and spend another thousand or so on a gaming PC instead of upgrading the graphics card for $200-300? When I'm spending that much money on a computer, I want to get the most out of it. And that means having a full-fledged GF4 or 8500. Sure, the Quadro and FireGL would be nice, but I don't need that much power.

If I haven't said the following before, I apologize: one should note that the 8500 and GF4 haven't been released yet. If Apple adds those as BTO options soon after their release, then I'll shut up. Somehow I have a feeling they won't. But who knows; it's just a prediction.

[ 02-04-2002: Message edited by: Arty50 ]
post #36 of 54
I think spending $2000-3000 on any computer just to play games is foolish. I realize you're not saying that you use your PowerMac only for games, or even firstly, but IMHO having games as even a top-three consideration on such an expensive machine is a waste. The machine should, as you point out, handle games well, but that can't be the primary consideration for Apple. And at the prices they charge, I sure hope it isn't the primary consideration for buyers either.

The majority of PowerMac buyers don't play on their computers. If there is going to be an option for a card, then it should be a card that lets you work better, not play better. Gamers who spend $2000-3000 on a machine are either spoiled rich kids, very lonely, or very dumb.

OTOH, I kinda agree that after spending a considerable sum on a PowerMac, you shouldn't need to spend another grand on a gaming rig. But games just aren't that important to the PowerMac market; the GeForce 3 sold poorly, and you can bet that the 4 Ti won't do much better. These high-end gaming cards more often than not go into the home-built rigs of the aforementioned lonely male teens -- so they can practice for the next Columbine or something. Most high-end PCs and workstations don't ship with GF3 Ti cards, since there's a limit to how much they help with productive work.

In PowerMacs, Apple would probably sell more pro cards as $500-600 options than they would GF4 Tis at half as much.
IBL!
post #37 of 54
We have a great discussion going on about the GeForce 4 and ATI's future offerings in "Hold off those video card purchases".
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #38 of 54
Well, these two topics are not the same. This one is more about gripes over NV17 vs. NV25. Yours is more about upcoming graphics options in general.

Ya know what UBB needs? Some kind of merge-thread feature so admins can (with one or two clicks) join two threads that they think are similar enough, rather than just close one. Hack, anyone?
IBL!
post #39 of 54
What I don't understand is how gamer and professional video cards differ. I know a lot about what games need in a 3D card, but I don't understand what a pro card needs that is different -- they are usually just faster. For people that don't want to pay for high-end 3D support, Apple should just provide a BTO option (either to add a fast 3D card or to remove it).
Providing grist for the rumour mill since 2001.
post #40 of 54
[quote]Originally posted by Matsu:
The majority of PowerMac buyers don't play on their computers. If there is going to be an option for a card, then it should be a card that lets you work better, not play better. Gamers who spend $2000-3000 on a machine are either spoiled rich kids, very lonely, or very dumb.[/quote]

That, or they're gamers. Simply because you wouldn't pay so much for a computer just to play games doesn't mean that those who do are either rich, lonely, or dumb.

[quote]These high-end gaming cards more often than not go into the home-built rigs of the aforementioned lonely male teens -- so they can practice for the next Columbine or something.[/quote]



I'm not even going to touch that.

However, I agree with the rest of your post.