nVidia GeForce 4 (NV25)

post #1 of 36
Thread Starter 
News is surfacing from various sites on the upcoming GeForce 4. It seems as if these cards could be released/shown as early as January/February. Seems a little early to release a GeForce 4, since the GeForce3 Ti 500 isn't exactly old hat quite yet. If there is truth to this, perhaps nVidia will launch it or demo it on the new systems at MWSF? After all, the GeForce 3 was first released at MacWorld Tokyo last year. I think MWTY again could be possible if the GeForce 4 is coming out that fast.

Here are a few specs:

GeForce4 Ti 1000: This is the fastest graphics card built on the GeForce4 chip, working at about 300MHz. The card will have an AGP 8x interface and 128MB of DDR SDRAM memory working at 700MHz.

GeForce4 Ti 500: This is a slightly slower solution with a chip frequency of around 275MHz and a memory frequency of 600MHz. Although it will have an AGP 4x interface, the card will still come with 128MB of graphics memory.

GeForce4 MX 460: This is the top representative of the GeForce4 MX family. It will probably have 4 rendering pipelines and a DirectX 8-compliant T&L unit. The DDR graphics memory (on a 128-bit bus) will be cut down to 64MB, and its working frequency reduced to 550MHz. The core will work at 300MHz.

GeForce4 MX 440: These cards will come with 64MB of DDR SDRAM on a 128-bit bus. The memory will work at 400MHz and the core at 275MHz.

GeForce4 MX 420: According to the available data, this GeForce4 MX version will be targeted at the low-end market, which is why cards built on it will have 64MB of SDR SDRAM memory working at 166MHz. The core will work at 250MHz. It also looks as if there are only two rendering pipelines in this variant.

from nvnews.net
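
For rough context, peak memory bandwidth is just the effective memory clock times the bus width. A quick sketch in Python, assuming a 128-bit bus on every model (the rumor only states it for the MX parts) and taking the quoted memory speeds as effective DDR rates:

[code]
# Rough peak-bandwidth math for the rumored cards. Assumes a 128-bit memory
# bus for all of them and treats the quoted memory clocks as effective
# (post-DDR-doubling) rates.

BUS_WIDTH_BITS = 128  # assumption

rumored_cards = {
    "GeForce4 Ti 1000": 700,  # MHz, effective
    "GeForce4 Ti 500":  600,
    "GeForce4 MX 460":  550,
    "GeForce4 MX 440":  400,
    "GeForce4 MX 420":  166,  # SDR, so no doubling
}

for name, eff_mhz in rumored_cards.items():
    # bytes/s = transfers/s * bus width in bytes, reported in GB/s (10^9)
    gb_per_s = eff_mhz * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9
    print(f"{name}: ~{gb_per_s:.1f} GB/s peak memory bandwidth")
[/code]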

Imagine if Jobs demoed a G5 w/GeForce 4..............

ok I woke up.
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #2 of 36
Correction: GF3 was introduced at MWSF '01.
*Registered March 1, 1999*
Member #14
post #3 of 36
[quote]Originally posted by MacAddict:
Correction: GF3 was introduced at MWSF '01.[/quote]


Correction: GF3 was introduced at MW Tokyo 2001.

GeForce 2 products in the PowerMac line were introduced at MWSF 2001.
post #4 of 36
The next NVIDIA GPU coming out in Apple machines is... never mind, I do not want to lose my job
post #5 of 36
[quote]Originally posted by Jared:
The next NVIDIA GPU coming out in Apple machines is... never mind, I do not want to lose my job[/quote]

I'll say it for you.... nForce.
post #6 of 36
Yay!

Can you say dual channel RAM goodness....

lol! leave it to Apple to pick the highest-priced single-CPU mobo on the market!

hey jared and codename....at the very least, thanks for the entertaining posts!
post #7 of 36
Info on nForce?
I'm making plastics right now!
post #8 of 36
[quote]Originally posted by TigerWoods99:
[GeForce4 rumor and spec rundown quoted in full; see post #1 above][/quote]
Have we got an idea of the power of those cards, especially the number of triangles per second?
post #9 of 36
Jared would you be JaredS from the old AI boards?

If Apple standardises on nForce across the Desktop line, then I wonder what will happen to the price line...

powerdoc, as far as the GF4's fill rate goes, I think you need not worry about it

[ 12-27-2001: Message edited by: SYN ]
Soyons réalistes, Demandons l'impossible.
post #10 of 36
[quote]Originally posted by SYN:
If Apple standardises on nForce across the Desktop line, then I wonder what will happen to the price line...[/quote]

Prices will remain the same, margins will grow... it's Apple, not Dell
post #11 of 36
Are there any 8x AGP mobos out yet?
IBL!
post #12 of 36
nForce makes sense...

Rumors of HyperTransport?

If the nForce chipset does get into a Mac, let's hope that nVidia has gotten the kinks worked out.

This complicates things a bit:
if nForce goes into the PowerMac, what will be the graphics card? Surely, it cannot be the integrated video...a GeForce 3? A GF3 Ti?

an ATi Radeon 8500?

Actually, the more I think about it, the more it makes sense for the nForce to go into the new iMac. Now that would be fscking cool. Jobs should move all desktops to DDR....if the G5 comes out in the PowerMac, Jobs wouldn't have to keep SDRAM in the iMac to distinguish the line.
post #13 of 36
Maybe I'm just high, but I remember reading that it would be possible to deploy nForce mobos with an empty AGP slot alongside the integrated graphics. When you fill the AGP slot, it disables the integrated graphics. They could then use the nForce integrated graphics in both PowerMacs and iMacs. Same mobo for everything! PowerMacs just get more RAM slots, PCI slots, and a different PPC card/proc module (G4/3 for the iMac, G5 for the PowerMac).

Just give the PowerMacs an AGP slot to go along with the integrated graphics. If both the AGP slot and the integrated graphics could be in use, it'd make an easy peasy path for dual-monitor set-ups!
IBL!
post #14 of 36
[quote]Originally posted by SYN:
Jared would you be JaredS from the old AI boards?[/quote]

That would be me
post #15 of 36
Yes. nForce of course has an AGP slot.

The built-in graphics (comparable to a GF2 MX) are meant only for value customers like businesses.
post #16 of 36
We're forgetting one important detail: the nForce was not made for PowerPC processors! It supports buses for x86 machines but not the 60x or MPX bus. Unless it's a totally different part, there's no way it's going to happen. It's not even that impressive.
post #17 of 36
GeForce 4, at MWNY.

Man, that really sounds like I know something, but in truth it's another pathetic guess.
"Its a good thing theres no law against a company having a monopoly of good ideas. Otherwise Apple would be in deep yogurt..."
-Apple Press Release
Reply
"Its a good thing theres no law against a company having a monopoly of good ideas. Otherwise Apple would be in deep yogurt..."
-Apple Press Release
Reply
post #18 of 36
Yeah, I know that the southbridge on a PC is completely different from the PowerPC mobo implementation, but stranger things have happened.

Mmmmm... dual-channel DDR northbridge.
post #19 of 36
Not even that. The southbridge is just a PCI device, or in this case an HT device. The southbridge is the ONLY thing Apple could conceivably make use of. The northbridge is incompatible. It just won't work.
post #20 of 36
[quote]Originally posted by Jared:
That would be me[/quote]

nice to see ya back.

if anyone here would know anything it would be you.

come on, G4 or G5. PLEASE
post #21 of 36
[quote]Originally posted by MacAddict:
Correction: GF3 was introduced at MWSF '01.[/quote]

[quote]Originally posted by applenut:
Correction: GF3 was introduced at MW Tokyo 2001.

GeForce 2 products in the PowerMac line were introduced at MWSF 2001.[/quote]


:eek: SHUT DOWN!!! :eek: MacAddict, check your info before you descide to try and look like your smart again.
post #22 of 36
Thread Starter 
Haha.


Now there is info on the R300 coming out January/February too!

Can't complain about graphics card technology not progressing.
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #23 of 36
[quote]Originally posted by SameOldSht:
"before you descide".[/quote]

Yeah, learn to spell simple words first.

Coming to MacAddict's rescue... one post at a time.
post #24 of 36
*smack* *smack* *smack*

Ahh I can't keep up!

*Registered March 1, 1999*
Member #14
post #25 of 36
[quote]Originally posted by TigerWoods99:
Here are a few specs:

GeForce4 Ti 1000: This is the fastest graphics card built on the GeForce4 chip, working at about 300MHz. The card will have an AGP 8x interface and 128MB of DDR SDRAM memory working at 700MHz.[/quote]

AGP 8x? Wonder where you'd stick that one in...

(plus I guess it would be about as important a step speed-wise as ATA133 was)

Also, shouldn't that be "working at 350MHz"? (700MHz sure would be nice, though)

Bye,
RazzFazz
post #26 of 36
Wouldn't it be twice the bus speed because it's DDR?

[ 01-03-2002: Message edited by: dartblazer ]
dartblazer
http://www.openoffice.org
http://www.openbeos.org
Have a good-cold day
post #27 of 36
The NV25 is the chip in the Xbox, and from what I've been told it's not the GeForce4. More like a GeForce3.5 -- if it ever shows up as a PC/Mac product at all. No, the next-generation graphics chip is what to hold out for (NV30), and the leap forward there will be in functionality, not poly rate or pixel rate. These cards will do a lot more calculations per poly and per pixel, rather than doing significantly more polys and pixels. This was true of the GeForce3 as well, but on the Mac at least we haven't seen the functionality made available yet (or so I've been told).
Providing grist for the rumour mill since 2001.
post #28 of 36
[quote]Originally posted by dartblazer:
Wouldn't it be twice the bus speed because it's DDR?[/quote]

Nope, the clock speed stays the same; you just get to transmit data on both the rising and the falling edge of the clock. This effectively doubles your throughput (MB/s) but leaves your clock speed (MHz) untouched.

Bye,
RazzFazz
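
A tiny sketch of the same arithmetic, using the rumored Ti 1000 numbers (the 350MHz actual clock and the 128-bit bus are assumptions here, not confirmed specs):

[code]
# DDR moves data on both clock edges: the clock itself stays put,
# only the transfer rate (and hence throughput) doubles.

clock_mhz = 350          # actual memory clock (assumed from the "700MHz" rumor)
transfers_per_clock = 2  # DDR: rising + falling edge
bus_width_bytes = 128 // 8

effective_rate_mt_s = clock_mhz * transfers_per_clock  # the quoted "700MHz"
throughput_gb_s = effective_rate_mt_s * 1e6 * bus_width_bytes / 1e9

print(f"Clock: {clock_mhz} MHz, effective rate: {effective_rate_mt_s} MT/s, "
      f"peak throughput: ~{throughput_gb_s:.1f} GB/s")
[/code]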
post #29 of 36
[quote]Originally posted by Programmer:
[NV25/NV30 post quoted in full; see post #27 above][/quote]

I read a year ago that when a graphics card reaches 100 million polygons per second, the result will approach virtual reality. A GeForce3 Ti 500 actually reaches this limit, but we don't see virtual reality yet. Perhaps there is some work to do on the T&L engine to reach virtual reality.
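
For a sense of scale, a back-of-the-envelope sketch of what that headline figure buys per frame (a theoretical peak at an assumed 60 fps, not a claim about any particular card):

[code]
# Spread a headline polygon rate across frames at a typical refresh rate.
peak_polys_per_sec = 100_000_000  # the 100 million figure cited above
target_fps = 60

polys_per_frame = peak_polys_per_sec / target_fps
print(f"~{polys_per_frame:,.0f} polygons per frame at {target_fps} fps")
# About 1.7 million polys per frame, and that is before lighting, texturing,
# and other per-vertex work eat into the theoretical peak.
[/code]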
post #30 of 36
Thread Starter 
*bump*

http://users.pandora.be/threatlockz/geforce4.htm

Pics of the GeForce 4 logos.

Hmm... all over the web they are saying nVidia is set to launch in February. Could it be possible that a G5 & GeForce 4 combo are the reason the new PowerMacs didn't arrive?

Imagine.......Apple first with GeForce 4s on the new PowerMac G5

*drools*......
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #31 of 36
Thread Starter 
February 5th special event, look for new PowerMacs.
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #32 of 36
[quote]Originally posted by TigerWoods99:
February 5th special event, look for new PowerMacs.[/quote]

Whose event is on the 5th? Nvidia or Apple?
post #33 of 36
post #34 of 36
I wouldn't necessarily say a couple decades . . .

How long does it take to render a full frame from the Final Fantasy movie?

Granted, those numbers will stay the same as we create more complex software, but FF as it is was pretty damn close to almost-perfect VR (texture- and motion-wise; never mind that the characters were all shoddily modeled plasticky junk, that's human error).
post #35 of 36
[quote]Originally posted by cinder:
[Final Fantasy rendering post quoted in full; see post #34 above][/quote]

I hope your idea of virtual reality isn't as lame as that movie. Or any movie for that matter -- panning at 24 fps gives me a headache.

True virtual reality (think Star Trek Holodeck quality visuals and audio) is a long, long way off... but we'll get there one step at a time, and each step will be awe-inspiring.
Providing grist for the rumour mill since 2001.
post #36 of 36
I think that one of the coolest implications of a massive VR world (like the Star Trek holodeck) would be games like Rogue Spear, Ghost Recon, or Delta Force: Land Warrior, ya know? Realistic strategy first-person shooter games, taken to a new level by actually being in the environment. That would be nutty cool. I doubt I'll live to see the day though, and even if I did, I'd probably be too old to care.
orange you just glad?