The real GeForce 4


Comments

  • Reply 21 of 53
    applenut Posts: 5,768 member
    There are already benchmarks... it's not impressive.



    Some are saying the Radeon 7500 is probably a better choice as an "all-around" card.
  • Reply 22 of 53
    powerdoc Posts: 8,123 member
    [quote]Originally posted by applenut:

    <strong>There are already benchmarks... it's not impressive.



    Some are saying the Radeon 7500 is probably a better choice as an "all-around" card.</strong><hr></blockquote>

    Do you have any links?
  • Reply 23 of 53
    arty50 Posts: 201 member
    [quote]Originally posted by powerdoc:

    <strong>

    do you have any links ?</strong><hr></blockquote>



    Mike at <a href="http://www.xlr8yourmac.com" target="_blank">www.xlr8yourmac.com</a> has tested the 4MX and 7500 in Q3. The 7500 won at 1600x1200/32, but the 4MX took the lower resolutions. He's going to post more comprehensive benchmarks on Monday.



    I know that the 8500 Mac Edition and full-fledged GeForce 4 aren't out yet, but once they are, Apple better damn well give us the BTO option for one of these cards. Face it, people: the 4MX is a huge step down from the GF3. You see, <a href="http://www.anandtech.com" target="_blank">www.anandtech.com</a> ran benchmarks on the Unreal 2 engine. What were their conclusions? The 8500 and GF3 were the only cards capable of running it effectively, with the 8500 pretty much coming out on top. Why? Because they're both next-gen cards with pixel and vertex shaders. The 7500 is basically an overclocked original Radeon, and the 4MX is a neutered GF3 (it's missing a pixel or vertex shader). So Q3 doesn't tell the whole story, since it doesn't take advantage of these new capabilities. An 8500, GF3, or full GF4 gets you a future-capable card; a 7500 or 4MX only helps older apps. The GF3 is a MUCH better card than the 7500 or 4MX. So basically, Apple downgraded our options. Are you really happy about that?



    I know I'm not.
  • Reply 24 of 53
    super Posts: 82 member
    [quote] The Radeon. Especially for video. <hr></blockquote>



    Thanks. Care to explain the reasons behind that? I'm just trying to get some education. Thanks.
  • Reply 25 of 53
    mattyj Posts: 898 member
    Damn, those FPS are good. One of my friends has a 2GHz P4 with a GF3, 512MB RDRAM, etc., and his PC doesn't match those frames at any resolution against the dual 1GHz G4 with a GF3.



    Tell me, are those fps tests for Quake 3 done at max settings? If so, then damn. <img src="graemlins/smokin.gif" border="0" alt="[Chilling]" />
  • Reply 26 of 53
    matsu Posts: 6,558 member
    Many workstations have cards that can't paint pixels as fast as a good gaming card can.



    What is Apple doing? They're getting some cards that will be ready for the next LONG OVERDUE update of the OpenGL spec. That's why certain ATI products are being left behind; there's no point supporting them, as the acceleration is going to be next to nothing (they don't even have T&L, and even the portable products still lack T&L). But a bunch of people who should've known better are even trying to sue Apple over this. Oh well. This will be the graphics boost people who work for a living are looking for. You can bet that OpenGL integration in OSX will, in the near future, allow your video system to help out **a lot**, and without the special drivers/expensive professional cards that we currently need to get any kind of meaningful acceleration of 3-d, 2-d, or video work.



    People are bitching, but Apple is going to provide a very clean solution where your desktop and major apps all get hooks into any new *compliant* card. The way it should be. Give 'em a year, you'll see. There is no special Raycer chip or Apple graphics card (that's just Kormac-inspired stupidity). Those engineers will give you a graphical/system boost by providing a new level of API features and API/OS integration/functionality, both for the desktop and for devs to use in important software as well as in gaming-related frivolity.



    Even on a GeForce2MX there is a host of effects (transparency, blending, etc.) that could be brought to bear on a much wider scope than just games. That's what you'll see.
  • Reply 27 of 53
    powerdoc Posts: 8,123 member
    [quote]Originally posted by Matsu:

    <strong>Many workstations have cards that can't paint pixels as fast as a good gaming card can.



    What is Apple doing? They're getting some cards that will be ready for the next LONG OVERDUE update of the OpenGL spec. That's why certain ATI products are being left behind; there's no point supporting them, as the acceleration is going to be next to nothing (they don't even have T&L, and even the portable products still lack T&L). But a bunch of people who should've known better are even trying to sue Apple over this. Oh well. This will be the graphics boost people who work for a living are looking for. You can bet that OpenGL integration in OSX will, in the near future, allow your video system to help out **a lot**, and without the special drivers/expensive professional cards that we currently need to get any kind of meaningful acceleration of 3-d, 2-d, or video work.



    People are bitching, but Apple is going to provide a very clean solution where your desktop and major apps all get hooks into any new *compliant* card. The way it should be. Give 'em a year, you'll see. There is no special Raycer chip or Apple graphics card (that's just Kormac-inspired stupidity). Those engineers will give you a graphical/system boost by providing a new level of API features and API/OS integration/functionality, both for the desktop and for devs to use in important software as well as in gaming-related frivolity.



    Even on a GeForce2MX there is a host of effects (transparency, blending, etc.) that could be brought to bear on a much wider scope than just games. That's what you'll see.</strong><hr></blockquote>

    There will be a revision of OpenGL: Programmer confirmed that in another thread. He says that the revision of OpenGL will include new features that can be used by different graphics cards, and not special extensions from Nvidia or ATI.



    Perhaps we should see this revision in Mac OS X 10.2.
  • Reply 28 of 53
    arty50 Posts: 201 member
    [quote]Originally posted by Gustav:

    <strong>



    The Radeon. Especially for video.</strong><hr></blockquote>



    Yes. For instance, it's a known fact that ATI's DVD playback is vastly superior to Nvidia's. Others claim that ATI's overall image quality is better too.



    Matsu,



    I understand what you're saying, and I think it's great that Apple's dropping older cards. They dropped the old Radeon a while back, and the GF2MX is gone now too. Getting an upgrade to OpenGL will be a great thing also. But that's still no excuse for dropping the GF3. It's vastly more capable than the 4MX. And the 7500 is nothing more than the old Radeon. You talk about future capability, and that's where the GF3 and 8500 shine. The 4MX and 7500? Uh, well, they run Q3 well. But if they can't leverage the new pixel and vertex shader tech, then how are they cards for the future?
  • Reply 29 of 53
    In comparing ATI and nVidia, I often think of ATI as being much like Apple and nVidia as being much like the PC industry. ATI does it RIGHT. They produce chips with full feature sets; they always have. They produce chips that provide the best image quality of any in their industry segment; ATI has always had better image quality than 3dfx or nVidia. nVidia goes for raw power and speed. I see them as taking somewhat of a quick-and-dirty approach. Don't get me wrong, they produce good drivers; it's just that I've never read an interview with them in which they stated image display quality as a top priority. Textures, yes; your desktop picture (which I spend quite a bit of time looking at), no. They are gradually developing the feature set that ATI has always given us, but their goal is raw power. ATI's goal is to produce cards that give you the best all-around graphic quality. How many terapixels per second they can pump is somewhat secondary (although now they have to worry just to stay in the market), and that's the way it should be.
  • Reply 30 of 53
    I agree that not having a high-end graphics card available in the line-up is just plain silly, but I'm hoping that is just an availability issue. The geForce3 was a really expensive card to produce, and the high-end geForce4 is expected to perform better but cost less. Hopefully Apple adds this card to its BTO lineup next week.



    The choice of an NV17-based chipset is probably driven by cost -- the NV25 is very expensive, and its primary addition is the programmable vertex engine. The NV17's pixel engine is pretty much the same as the geForce3's. The next OpenGL release for OSX will likely include a standardized vertex shader program, and Apple will likely do what Microsoft did in DirectX8 -- provide a CPU-based vertex shader implementation. The great thing is that the G4's AltiVec unit will simply rock at doing this, and we ought to see decent performance out of the NV17 despite its lack of a programmable vertex unit. On a dual G4 800 or 1000, it's possible that the software implementation will outrun a geForce3's vertex shaders (I say dual because then you still get it happening in parallel, assuming the OpenGL implementation is multi-threaded). The NV25 has two vertex shader units running in parallel at a higher clock rate, so it'll do better than the current crop of G4s. The performance will be highly dependent on content, of course, since the GPU implementation benefits from being later in the pipeline.



    Note that current Mac-based benchmarks don't include any use of shaders because they are currently unavailable through Apple's OpenGL (no extensions yet either). This might explain why the PC framerates appear lower in some cases -- the rendering might be simplified due to the lack of shader access. Shaders don't make things faster necessarily, they make them much more flexible and allow the GPU to do much cooler stuff... usually at the cost of some framerate.
  • Reply 31 of 53
    [quote]Originally posted by The Swan:

    <strong>In comparing ATI and nVidia I often think of ATI as being much like Apple and nVidia being much like the PC industry. ATI does it RIGHT.</strong><hr></blockquote>



    Oh, I wouldn't agree with that at all. ATI came from a 2D background, so their 2D feature set (including movie playback and crisp image output) tends to be better. Their 3D feature set, however, was laughable in the early Rage products, weak in the Rage128, and is only really catching up in the Radeon line. The current Radeon2 has them roughly on par with pre-NV25 in terms of 3D.



    nVidia, on the other hand, has been all about 3D from their inception. The 2D stuff to them has been almost an afterthought. As of geForce4, however, they've pretty much caught up with ATI as far as visual quality and video playback goes.



    The graphics chip market seems to have stabilized on these two vendors, at least for now, and both of them are producing comparable chips for the first time ever. You no longer have to choose between 2D and 3D -- you get topnotch GPUs that do both equally well. For at least the rest of this year it looks like they will have comparable products. If ATI meets its aggressive schedule they should pull slightly ahead by fall before nVidia jumps past them again. Driver quality, price, and OEM deals will determine which of them succeeds better this year, but I'm hoping that they'll split the market 50/50 and both continue to develop such awesome GPUs.
  • Reply 32 of 53
    matsu Posts: 6,558 member
    Thanks Programmer,



    I think some people misunderstood what I meant about an updated OpenGL API. I referred to dropping support for older cards that clearly wouldn't be able to supply any meaningful acceleration of new features. As an API, **any** compliant card would work, natch. I'm pretty sure I said that. You can bet that Rage 128-based products won't be compliant, so they'll be dropped. The spec probably won't include anything that a Radeon or GF2MX can't do, so they should be supported insofar as they should be compliant with the new spec.



    What we really need is for Apple to provide very tight Quartz/OSX integration, so that capable machines can get a nice graphical speed boost. Should the API provide the right hooks for a new level of effects/features, then it is not inconceivable that software like Photoshop/Premiere/Final Cut/Maya, etc. could see meaningful boosts in performance.



    Right now you only get that from professional graphics cards (on both the PC and Mac side); as far as work goes, it is still the CPU that does most of the work. Even the fastest gaming card doesn't significantly speed up your 3-d/video work.



    I think 4MX is a very nice upgrade to 2MX. They didn't downgrade anything at all, they just took away an expensive option that few people went for anyway. Rather than GF4Ti or whatever gaming Radeon, Apple should provide a real professional option. Pro users don't really need ultra frame rates, they need number crunching (polys, curves, lighting effects, etc etc.) That's good for games too, but a card that can chomp 25-50% off our render times in any number of pro apps, means more to the Powermac market than a card that can pump out 350fps in Quake.



    Apple needs to provide upcoming versions of Quadro and FireGL as options, not GF4ti. 4MX is more than enough for gaming. PowerMacs are for work, not games.
  • Reply 33 of 53
    [quote]Originally posted by Matsu:

    <strong>I think some people misunderstood what I meant about an updated OpenGL API. I referred to dropping support for older cards that clearly wouldn't be able to supply any meaningful acceleration of new features. As an API, **any** compliant card would work, natch. I'm pretty sure I said that. You can bet that Rage 128-based products won't be compliant, so they'll be dropped. The spec probably won't include anything that a Radeon or GF2MX can't do, so they should be supported insofar as they should be compliant with the new spec.

    </strong><hr></blockquote>



    Actually, I think the Rage128 is OpenGL compliant and should be fully supported for quite a while going forward. It's not a real speed demon, and any vertex programs will need to run on the CPU (also true for the geForce 2MX and 4MX).



    <strong> [quote]

    What we really need is for Apple to provide very tight Quartz/OSX integration, so that capable machines can get a nice graphical speed boost. Should the API provide the right hooks for a new level of effects/features, then it is not inconceivable that software like Photoshop/Premiere/Final Cut/Maya, etc. could see meaningful boosts in performance.



    Right now you only get that from professional graphics cards (on both the PC and Mac side); as far as work goes, it is still the CPU that does most of the work. Even the fastest gaming card doesn't significantly speed up your 3-d/video work.



    I think 4MX is a very nice upgrade to 2MX. They didn't downgrade anything at all, they just took away an expensive option that few people went for anyway. Rather than GF4Ti or whatever gaming Radeon, Apple should provide a real professional option. Pro users don't really need ultra frame rates, they need number crunching (polys, curves, lighting effects, etc etc.) That's good for games too, but a card that can chomp 25-50% off our render times in any number of pro apps, means more to the Powermac market than a card that can pump out 350fps in Quake.



    Apple needs to provide upcoming versions of Quadro and FireGL as options, not GF4ti. 4MX is more than enough for gaming. PowerMacs are for work, not games.</strong><hr></blockquote>



    I'm never clear on exactly what the professional market wants in the way of 3D acceleration... I mean the geForce3 & 4 are really fast GPUs and blow away any professional boards from even just 4 years ago. Is it higher vertex processing speeds? More memory? More pixel rate?
  • Reply 34 of 53
    arty50 Posts: 201 member
    [quote]Originally posted by Matsu:

    <strong>Thanks Programmer,

    Apple needs to provide upcoming versions of Quadro and FireGL as options, not GF4ti. 4MX is more than enough for gaming. PowerMacs are for work, not games.</strong><hr></blockquote>



    Huh? So what you're saying is that after I spend a few thousand dollars on a midrange PowerMac/monitor/RAM/HDs/etc., I should have to go out and spend another thousand or so on a gaming PC instead of upgrading the graphics card for $200-300? When I'm spending that much money on a computer, I want to get the most out of it. And that means having a full-fledged GF4 or 8500. Sure, the Quadro and FireGL would be nice, but I don't need that much power.



    If I haven't said the following before I apologize. One should note that the 8500 and GF4 haven't been released yet. If Apple adds those as a BTO option soon after their release, then I'll shut up. Somehow I have a feeling they won't. But who knows, it's just a prediction.



    [ 02-04-2002: Message edited by: Arty50 ]
  • Reply 35 of 53
    matsu Posts: 6,558 member
    I think spending 2000-3000 on any computer just to play games is foolish. I realize you're not saying that you use your powermac for games only, or even firstly, but IMHO having games as even a top three consideration on such an expensive machine is a waste. The machine should, as you point out, handle games well but that can't be the primary consideration for Apple. And at the prices they charge I sure hope it isn't the primary consideration for buyers either.



    The majority of PowerMac buyers don't play on their computers. If there is going to be an option for a card then it should be a card that lets you work better, not play better. Gamers who spend 2000-3000 on a machine are either spoiled rich kids, very lonely, or very dumb.



    OTOH, I kinda agree that after spending a considerable sum on a PowerMac, you shouldn't need to spend another grand on a gaming rig. But games just aren't that important to the PowerMac market; the GeForce 3 sold poorly, and you can bet that the 4Ti won't do much better. These high-end gaming cards more often than not go into the home-built rigs of the aforementioned lonely male teens -- so they can practice for the next Columbine or something. Most high-end PCs and workstations don't ship with GF3Ti cards, since there's a limit to how much they help with productive work.



    In Powermacs Apple would probably sell more pro cards as 500-600 options than they would GF4Ti's at half as much.
  • Reply 36 of 53
    We have a great discussion going on about the GeForce 4 and ATI's future offerings in "Hold off those video card purchases".
  • Reply 37 of 53
    matsu Posts: 6,558 member
    Well, these two topics are not the same. This one is more about gripes over NV17 vs NV25. Yours is more about upcoming graphics options in general.



    Ya know what UBB needs? Some kinda merge-thread feature so admins can (with one or two clicks) join two threads that they think are similar enough, rather than just closing one. Hack, anyone?
  • Reply 38 of 53
    programmer Posts: 3,458 member
    What I don't understand is how gamer and professional video cards differ. I know a lot about what games need in a 3D card, but I don't understand what a pro card needs that is different -- they are usually just faster. For people that don't want to pay for high-end 3D support, Apple should just provide a BTO option (either to add a fast 3D card or to remove it).
  • Reply 39 of 53
    serrano Posts: 1,806 member
    [quote]Originally posted by Matsu:

    <strong>The majority of PowerMac buyers don't play on their computers. If there is going to be an option for a card then it should be a card that lets you work better, not play better. Gamers who spend 2000-3000 on a machine are either spoiled rich kids, very lonely, or very dumb.</strong><hr></blockquote>



    That, or they're gamers. Simply because you wouldn't pay so much for a computer just to play games doesn't mean that those who do are either rich, lonely, or dumb.



    [quote]<strong>

    These high-end gaming cards more often than not go into the home-built rigs of the aforementioned lonely male teens -- so they can practice for the next Columbine or something </strong><hr></blockquote>







    I'm not even going to touch that.



    However, I agree with the rest of your post.
  • Reply 40 of 53
    What nVidia is doing is transitioning their whole product line to the GeForce 4: the MX cards based on the NV17, and the Ti series based on the NV25. I remember reading a while back that the NV17 was supposed to have a lot of similarities with the current GeForce 3. I don't know that it is fair to say that the GF4MX sucks, because they will be at similar prices to the GF2MX and will be a lot better. I think the GF4MX will probably perform about as well as a GF3 Ti 200 in most cases, so you can basically say that we now get GF3 technology at GF2MX prices. That's definitely an advancement. I believe the GF4MX will support a few features the GF3 doesn't, such as dual-head. Also, all we really have to go by are early performance tests and some manufacturer samples that were tested on the PC side. Perhaps this card will be a bit more future-proof and perform better in a few months. On the other hand, I agree that there is no reason for Apple to use these cards in their PowerMacs. These machines are supposed to be pro machines, yet they ship with a budget card. The GF4MX ain't too shabby, but I think a Radeon 8500 should have been offered or something.