
Briefly: Apple UK blunder hints at Mac Pro update - Page 4

post #121 of 156
Quote:
Originally Posted by nvidia2008 View Post

Perhaps then, if one buys a used Mac Pro that's out of its first-year warranty, there's not as much "pain" and worry about seriously violating the warranty by swapping in the Clovertowns.

Is the AnandTech link the main one out there on the 8-core Mac Pro DIY? Any other, more recent hacks/mods?
http://www.anandtech.com/mac/showdoc.aspx?i=2832&p=6

While googling I found http://www.clovertown.com/ ...Heh.

I don't know of any other useful ones either.
post #122 of 156
Quote:
Originally Posted by Marvin View Post

Sure but I'm not talking about serious gaming - even the X1600 can't really do serious gaming. I just mean that Apple's lowest end shouldn't have such a huge difference in performance compared to the next level up because that doesn't give people any choice. The X1300 will play Half-Life 2 at maximum settings with no AA. The GMA can't do that.

3dmark06
X1600 = 1800
X1400 = 900
X1300 = 560
GMA 950 = 170

So the jump from the Mac Mini to the iMac with X1600 gives you an order of magnitude jump in graphics performance. I would be content if they used x1400s because that's only a factor of 2. More than a factor of 10 is taking the piss.

According to this forum, the X3000 should definitely be a good solution:

http://forums.vr-zone.com/archive/in.../t-101174.html

But their benchmarks are coming out at around 750 in 3DMark05 for the X3000, and the X1300 gets 1000, so it's about double the GMA. When the drivers are better optimized and we get it in the Mac machines, I'm sure it will perform a bit better. Behold the future spec of the Mini:

http://www.newegg.com/Product/Produc...82E16883227007

As always, already available in a PC.

You'll notice it has a TV tuner and two optical drives so that would equal a Mini + elgato + external HD as I'm sure you can replace one of the optical drives with a HD. And they are nice quiet mini-cd compatible tray loading drives.

I think you have made some interesting points.

Could it be that even for not-so-serious gaming, with 2007 and 2008 titles, we're gonna need a GPU that can push at least 1500 in 3DMark06...?* I'm trying to keep a more open mind here, because I'd consider myself a mid-range-mainstream but very casual (1 hour per day?) gamer, yet "aesthetically biased" about the minimum "visual experience" one should have at 1024x768 or 1280x960-level gaming resolutions. Biased in that, for example, I love the Source ("HDR", HL2 + Episode 1 etc.), LithTech+ (FEAR) and NFS:MostWanted graphics engines... Doom3/Quake4, I don't likey.

*I've got to go back and check my 3DMark06 score for my nVidia 6600GT with 128MB VRAM.
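
As a rough sanity check on the "order of magnitude" point quoted above, here's a quick Python sketch using those 3DMark06 figures (Marvin's rough scores, not my own runs):

# 3DMark06 scores as quoted above (rough figures, not my own benchmarks)
scores = {"X1600": 1800, "X1400": 900, "X1300": 560, "GMA 950": 170}

x1600 = scores["X1600"]
for gpu, score in scores.items():
    print(f"X1600 is {x1600 / score:.1f}x the {gpu}")

# Prints roughly 1.0x, 2.0x, 3.2x and 10.6x, which is where the
# "factor of 10 vs factor of 2" point above comes from.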
post #123 of 156
Quote:
Originally Posted by Marvin View Post

...............
3dmark06
X1600 = 1800
X1400 = 900
X1300 = 560
GMA 950 = 170
..................

Yeah, I pulled about 1826 3DMark06 marks with a 6600GT 128MB VRAM,
AMD64 single-core 2.18GHz, 1GB RAM.


Edit:
So yeah... a 3DMark06 of 1800 would give you 1280x960-ish resolution, 2xAA, 8xAF or 16xAF, and "HDR" graphics on close-to-max settings for quite a number of games. HL2 Episode 1, yeah, you can max it out, but keep AA to 2x only; 4x is nice but sluggish. FEAR and FEAR:ExtractionPoint, 2xAA, 4xAF or 16xAF, can max out most settings, but textures on medium for 128MB VRAM cards, and all shadows turned off (some shadows are okay but don't look so nice without "soft shadows" on, which kills framerates). NeedForSpeed:MostWanted, 2x-ish AA, 8x or 16xAF, "overbright" "visual treatment", medium to max "model detail".

So that's my bias: 1800 in 3DMark06 is just nice, which dovetails smoothly into the X1600 [Mobility], and is nonetheless just right for 2005-2008 titles at just about what I would describe here as the "6600GT" standard of visual experience in 3D gaming. Looking briefly at your links on the X3000: yeah, kinda passable, but as they said in the linked thread, UnrealTournament2007 will eat it alive.

Give me at least 1500 3DMark06's or Give ME DEATH! muah haha ah hahhahah
post #124 of 156
I think Apple could put X3000 integrated graphics into the Mac Mini and MacBooks, and a number of people would complain that it "can't really play games" or that "teh graphix sucks" or "is sooo lagggy...".

If you'll excuse me now, I've got a 6600GT to run at 10% hard OC and redline my AMD64, 200+ miles an hour (NeedForSpeed:MW)... Aww yeah. Peace Out.
post #125 of 156
Quote:
Originally Posted by nvidia2008 View Post

I think Apple could put X3000 integrated graphics into the Mac Mini and MacBooks, and a number of people would complain that it "can't really play games" or that "teh graphix sucks" or "is sooo lagggy...".

If you'll excuse me now, I've got a 6600GT to run at 10% hard OC and redline my AMD64, 200+ miles an hour (NeedForSpeed:MW)... Aww yeah. Peace Out.

By relying on the 9xx chipsets, Apple is trying to avoid having people say, "Since they put a separate GPU in, why didn't they put a ---- in, instead of what they used?"

This just isn't a gaming machine.
post #126 of 156
Quote:
Originally Posted by melgross View Post

By relying on the 9xx chipsets, Apple is trying to avoid having people say, "Since they put a separate GPU in, why didn't they put a ---- in, instead of what they used?"

This just isn't a gaming machine.

I believe that the X3000 is being built into the next mobile chipset coming in the next few months, it's just a matter of when Apple updates their notebooks.
post #127 of 156
Quote:
Originally Posted by JeffDM View Post

I believe that the X3000 is being built into the next mobile chipset coming in the next few months, it's just a matter of when Apple updates their notebooks.

Whatever is being built into the chipset is fine. It's the same concept as the 950 or the new 965. So it's the same thing, not a separate part.
post #128 of 156
Quote:
Originally Posted by melgross View Post

By relying on the 9xx chipsets, Apple is trying to avoid having people say, "Since they put a separate GPU in, why didn't they put a ---- in, instead of what they used?"

This just isn't a gaming machine.

But then aging there is not an mid-range head less desktop
post #129 of 156
Quote:
Originally Posted by Joe_the_dragon View Post

But then aging there is not mid-range head less desktop

I'm sorry Joe, but I didn't quite understand that.
post #130 of 156
I'm sorry, but Apple has never truly been a gaming platform and probably never will be.

The X1600 in the iMac and MBP is GARBAGE. Sure, it can run Doom III, but that game is like, what, 3 years old? Cutting edge is FEAR and Oblivion.

The disappointing thing about the MacBook for me is that I use video spanning, and there is some lag with Exposé. This just shouldn't happen on a laptop with a rather fast CPU.

Bring on the X3000, but by the time that it's released and optimized, nVidia will roll out its new mid-range DX10 cards (ATi as well), and the X3000 will then be considered just as poor as the 950 was a year ago.
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
post #131 of 156
Quote:
Originally Posted by applebook View Post

The disappointing thing about the MacBook for me is that I use video spanning, and there is some lag with Exposé. This just shouldn't happen on a laptop with a rather fast CPU.

I don't think it is the chip - the ATI card in the Mac Pro can show some lag for Exposé too.
post #132 of 156
Quote:
Originally Posted by applebook View Post

The X1600 in the iMac and MBP is GARBAGE. Sure, it can run Doom III, but that game is like, what, 3 years old? Cutting edge is FEAR and Oblivion.

Get your hand out of the candy jar. What part of midrange don't you understand?

MacBook Pros aren't pizza ovens.
post #133 of 156
Quote:
Originally Posted by gregmightdothat View Post

Get your hand out of the candy jar. What part of midrange don't you understand?

MacBook Pros aren't pizza ovens.

Mid-range would be something like the $150 7900 GS. The X1600 always was a low-end card and has been discontinued for quite a while on the PC side.

The interesting thing is that Apple continues to use ATi when the 1000 Radeon series runs significantly slower and hotter than the 7000 GeForce cards. Good for Apple.

I don't have a huge problem with the MacBooks and Minis' 950 because almost all PC laptops in the $1000 range and virtually all small PCs use the same chip.

A $1500+ PC laptop with the X1600 is one year ago. Today, it's plain
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
post #134 of 156
Quote:
Originally Posted by applebook View Post

Mid-range would be something like the $150 7900 GS. The X1600 always was a low-end card and has been discontinued for quite a while on the PC side.

No, the 7900 is last generation's high-end card.

For both NVIDIA and ATI cards, __300 is low end, __600 is mid-range, and __800+ is high end.

The difference is important. Even though the 7900 is slower by today's high-end standards, it still runs incredibly hot—too hot to be used in a 1" laptop.

Quote:
The interesting thing is that Apple continues to use ATi when the 1000 Radeon series runs significantly slower and hotter than the 7000 GeForce cards. Good for Apple.

You seem to confuse series with models. A high-end card is never going to be cooler than a mid-range card. Besides, I don't know where you got that statistic from, but NVIDIA chips have always been way hotter and consumed more power than ATI.

Quote:
I don't have a huge problem with the MacBooks and Minis' 950 because almost all PC laptops in the $1000 range and virtually all small PCs use the same chip.

A $1500+ PC laptop with the X1600 is one year ago. Today, it's plain

Funny, looking at a $1600 Dell Inspiron, it only goes up to a NVIDIA Quadro 350 Mobile, which is less than half of the speed of an X1600.

An HP Pavilion can only go up to a GeForce Go 7600, which is still a dip below the X1600.

So, the fastest shipping graphics card of any mainstream laptop is a no, huh? Or were you holding out for a $4,000 Alienware monstrosity?
post #135 of 156
Quote:
Originally Posted by gregmightdothat View Post

No, the 7900 is last generation's high-end card.

For both NVIDIA and ATI cards, __300 is low end, __600 is mid-range, and __800+ is high end.

The difference is important. Even though the 7900 is slower by today's high-end standards, it still runs incredibly hot—too hot to be used in a 1" laptop.

You seem to confuse series with models. A high-end card is never going to be cooler than a mid-range card. Besides, I don't know where you got that statistic from, but NVIDIA chips have always been way hotter and consumed more power than ATI.



The pathetically inept X1650 XT runs hotter than the 7900 GS:



"A high-end card is never going to be cooler than a mid-range card."

False, refer to the chart: the high-end 1950GT is much cooler than the X1650, though I maintain that it's a low-end card.

If you want to look only at series as the indicator of range, then my point about the X1600 is even more relevant, seeing as it was a lower-mid (at best) mainstream card that has been discontinued. Why does Apple continue to supply this, and how can any Mac buyer justify this?

Quote:
Funny, looking at a $1600 Dell Inspiron, it only goes up to a NVIDIA Quadro 350 Mobile, which is less than half of the speed of an X1600.

An HP Pavilion can only go up to a GeForce Go 7600, which is still a dip below the X1600.

The 7600 is not slower than the X1600. A simple search would verify this: http://www.notebookcheck.net/Mobile-...ist.844.0.html

Quote:
So, the fastest shipping graphics card of any mainstream laptop is a no, huh? Or were you holding out for a $4,000 Alienware monstrosity?

Speaking of Alienware, check this out:

[1] Area-51® m5790
Processor: Intel® Core 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB
Display: 17" WideUXGA 1920 x 1200 LCD - Saucer Silver
Motherboard: Alienware® Intel® 945PM + ICH7 Chipset
Memory: 1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB
System Drive: 160GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache
8x Dual Layer CD-RW/DVD±RW w/ Nero Software
Video/Graphics Card: 256MB ATI Mobility Radeon® X1800
Sound Card: Intel® 7.1 High-Definition Audio

>>>>>>>>>>> Free Alienware® T-Shirt: Free Alienware® T-Shirt - Black!!!!!!!!!!!!!
SubTotal: $1,939.00


15.4" WideUXGA 1920 x 1200 LCD
Intel® Core 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB
1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB
120GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache
Intel High Definition 7.1 Audio
8x Dual Layer CD-RW/DVD±RW w/ Nero Software
256MB NVidia® GeForce Go 7600
$1,894.00

At the end of the day, these are Windoze laptops, so I wouldn't even touch them, but there's no doubt that Windoze PCs are generally a step ahead in terms of GPU, display, and other hardware.

Most $1000 notebooks come with 100GB+ hard drives and 1GB RAM as well.

post #136 of 156
Quote:
Originally Posted by applebook View Post



The pathetically inept X1650 XT runs hotter than the 7900 GS:



"A high-end card is never going to be cooler than a mid-range card."

False, refer to the chart: the high-end 1950GT is much cooler than the X1650, though I maintain that it's a low-end card.

If you want to look only at series as the indicator of range, then my point about the X1600 is even more relevant, seeing as it was a lower-mid (at best) mainstream card that has been discontinued. Why does Apple continue to supply this, and how can any Mac buyer justify this?



The 7600 is not slower than the X1600. A simple search would verify this: http://www.notebookcheck.net/Mobile-...ist.844.0.html



Speaking of Alienware, check this out:

[1] Area-51® m5790
Processor: Intel® Core 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB
Display: 17" WideUXGA 1920 x 1200 LCD - Saucer Silver
Motherboard: Alienware® Intel® 945PM + ICH7 Chipset
Memory: 1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB
System Drive: 160GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache
8x Dual Layer CD-RW/DVD±RW w/ Nero Software
Video/Graphics Card: 256MB ATI Mobility Radeon® X1800
Sound Card: Intel® 7.1 High-Definition Audio

>>>>>>>>>>> Free Alienware® T-Shirt: Free Alienware® T-Shirt - Black!!!!!!!!!!!!!
SubTotal: $1,939.00


15.4" WideUXGA 1920 x 1200 LCD
Intel® Core 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB
1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB
120GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache
Intel High Definition 7.1 Audio
8x Dual Layer CD-RW/DVD±RW w/ Nero Software
256MB NVidia® GeForce Go 7600
$1,894.00

At the end of the day, these are Windoze laptops, so I wouldn't even touch them, but there's no doubt that Windoze PCs are generally a step ahead in terms of GPU, display, and other hardware.

Most $1000 notebooks come with 100GB+ hard drives and 1GB RAM as well.


Unfortunately, Apple seems to have other concerns.

Actually, Apple has always been slow to move to the latest systems. They may start out with a great system, but they upgrade too slowly. Other companies may start out behind, but they end up ahead.

It's something we suffer with for the OS.

I'm hoping that will change, at least with the Mac Pro, and then, if we are lucky (very lucky), the concept will filter down to the other lines.
post #137 of 156
Quote:
Originally Posted by melgross View Post

Unfortunately, Apple seems to have other concerns.

They may start out with a great system, but they upgrade too slowly. Other companies may start out behind, but they end up ahead.

Absolutely true when it comes to CPU and GPU trends. The G4 was years ahead of Intel and AMD when first released, but IBM lost all of its advantage within a few years. The same could be said about the G5, though its performance wasn't as big a step forward as the G4 was many years earlier.

Notice now though that in terms of CPU, Apple is right there with PC vendors. The next step is not cheaping out on RAM and HD space for the lower and mid-range machines.

I think that the 7600GT (a very solid card) should be standard on the 20" and 24" iMacs and optional on the 17".
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
post #138 of 156
Quote:
Originally Posted by applebook View Post

Absolutely true when it comes to CPU and GPU trends. The G4 was years ahead of Intel and AMD when first released, but IBM lost all of its advantage within a few years. The same could be said about the G5, though its performance wasn't as big a step forward as the G4 was many years earlier.

Notice now though that in terms of CPU, Apple is right there with PC vendors. The next step is not cheaping out on RAM and HD space for the lower and mid-range machines.

I think that the 7600GT (a very solid card) should be standard on the 20" and 24" iMacs and optional on the 17".

One of the problems Apple has is that if sales of a product are making them happy, they see no need to quickly update it, and if sales are slow, they may not want to bother with it.

They try to get as much mileage as they can from a configuration.
post #139 of 156
And I thought *I* was getting all intense and PC-gamer-y on y'all asses talking about my poor old, now apparently "low end" 6600GT... ...I'm stepping off this hamster-treadmill discussion.

I will be trying out a bit of GhostRecon and ZOMFG FEAR:ExtractionPoint. Speaking of FEAR, surprisingly, with a 3DMark06 score almost similar to an X1600's, I seem to have enjoyed it... and HL2 and HL2:LostCoast and HL2:Episode 1.

I'm not wasting money at this stage on a nVidia 7-series, and for frack's sake nVidia's 8 series and ATI's next gen stuff needs to get down to 65nm. The heat and power are just ridiculous, and they have totally stalled IN ANY BLOODY CASE ON DELIVERING A POWERFUL, AFFORDABLE MOBILE DISCRETE GPU. ...The X1600 Mobility or a bit higher than that, that's it for the 1st half of 2007, peoples, let's accept that and move on.
post #140 of 156
Feel free to call me an Apple Apologist. ...I'd be happy to see a Go7600 or MobilityX1800 in iMacs and MacBookPros and MacBooks, perhaps, but meh. PC Gaming is a weird and terribly cruel world when it comes to graphics. And yes, Apple is "behind" because they are milking the profits.
post #141 of 156
Quote:
Originally Posted by nvidia2008 View Post

I'm not wasting money at this stage on a nVidia 7-series, and for frack's sake nVidia's 8 series and ATI's next gen stuff needs to get down to 65nm.

It's done. R600 will be 65nm. The source for the story is the not very reliable Inquirer but they're reporting from CeBIT in Hanover and they're also very close to AMD.

http://www.theinquirer.net/default.aspx?article=38292

Hopefully it's true.
post #142 of 156
Quote:
Originally Posted by kukito View Post

It's done. R600 will be 65nm. The source for the story is the not very reliable Inquirer but they're reporting from CeBIT in Hanover and they're also very close to AMD.
http://www.theinquirer.net/default.aspx?article=38292
Hopefully it's true.

Interesting. That was a very weirdly worded article... "Satan Clara?" WTF ...Anyways yeah, we need the mid-range 65nm ATI/nVidias to push 7900GTX levels of rendering capacity at around 120 Watts Load MAX.

Otherwise the whole PC-gaming-superfast-8seriesnVidia and Radeon X2000etc series will be on a powerhog runaway train.
post #143 of 156
Looks like I'll be hanging on to my nVidia 6600GT 128MB this year. It was struggling somewhat in FEAR:ExtractionPoint last night: now that I've got a good gaming-quality mouse (a cordless Logitech V200, believe it or not, it's real cool, no lag), spinning and moving fast is straining the GPU on the volumetric lighting at max, despite all shadows off and 0xAA...

<rambling: warning>Still hoping to fire up Ghost Recon and give that a shot. A bit of information/ inspiration/ Mac/ PC/ tech/ client/ internship overload at the moment, but managing reasonably okay. ...UGH TPSReports.... Well, actually it's just timesheets, and timesheets = ass covering for all involved. If each timesheet item is really scrutinised, well, I got my high-falutin' explanations ready that I *did* do that kinda stuff. Always has been tricky since I started working (at 21 in 2000-ish) but I got the bullsh1t skillz to justify the R&D, innovation, idea factory, which is what I bring to companies.</rambling>
post #144 of 156
Quote:
Originally Posted by applebook View Post

....
8x Dual Layer CD-RW/DVD±RW w/ Nero Software
Video/Graphics Card: 256MB ATI Mobility Radeon® X1800
......>>>>>>>>>>> Free Alienware® T-Shirt: Free Alienware® T-Shirt - Black!!!!!!!!!!!!!
SubTotal: $1,939.00
........................

Aw yeah, Nero Software and a free Alienware T-Shirt, BLACK !!!11!1!!!
ZOMFG I am sooo buying that laptop

Heh. Just mucking around, okay? Don't take too serious.
post #145 of 156
Since the topic of this thread has to do with the Mac Pro, I would be interested to know where the standard Mac Pro graphics card, NVIDIA GeForce 7300 GT graphics with 256MB memory, fits on this listing of graphics cards. When I buy a Mac Pro I'm interested in upgrading to the ATI Radeon X1900 XT, but not at the expense of having double the heat and more fan noise for cooling. Anyone have any knowledge they can share on my concern?



Quote:
Originally Posted by applebook View Post



The pathetically inept X1650 XT runs hotter than the 7900 GS:



post #146 of 156
Quote:
Originally Posted by applebook View Post



The pathetically inept X1650 XT runs hotter than the 7900 GS:



"A high-end card is never going to be cooler than a mid-range card."

False, refer to the chart: the high-end 1950GT is much cooler than the X1650, though I maintain that it's a low-end card.

You probably don't want to compare charts from two different sites that probably have different testing methodologies.

/shrug

Not everyone plays Oblivion or FEAR. The X1600 runs WoW just fine.

Vinea
post #147 of 156
Quote:
Originally Posted by vinea View Post

You probably don't want to compare charts from two different sites that probably have different testing methodologies.

/shrug

Not everyone plays Oblivion or FEAR. The X1600 runs WoW just fine.

Vinea

The charts are consistent though.

The reason Oblivion and FEAR are used as benchmarks is the demand and load they put on GPU processing power. Also, many games in the future will use variations of these engines.
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
post #148 of 156
Quote:
Originally Posted by Royboy View Post

Since the topic of this thread has to do with the Mac Pro, I would be interested to know where the standard Mac Pro graphics card, NVIDIA GeForce 7300 GT graphics with 256MB memory, fits on this listing of graphics cards. When I buy a Mac Pro I'm interested in upgrading to the ATI Radeon X1900 XT, but not at the expense of having double the heat and more fan noise for cooling. Anyone have any knowledge they can share on my concern?

The X1900 should be substantially hotter than the 7300 GT, but the performance difference is enormous:





The X1900 will allow you to play any game in the foreseeable future, probably even Crysis.

If you don't play many "cutting-edge," recent games, then you should definitely go with the less expensive and cooler 7300.
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
post #149 of 156
Quote:
Originally Posted by Royboy View Post

Since the topic of this thread has to do with the Mac Pro, I would be interested to know where the standard Mac Pro graphics card, NVIDIA GeForce 7300 GT graphics with 256MB memory, fits on this listing of graphics cards. When I buy a Mac Pro I'm interested in upgrading to the ATI Radeon X1900 XT, but not at the expense of having double the heat and more fan noise for cooling. Anyone have any knowledge they can share on my concern?

http://www.guru3d.com/article/Videocards/355/3/ says the following for their total system load:

GeForce 7300 GT (Galaxy): 188W
GeForce 7600 GS (Galaxy): 190W
GeForce 7600 GT: 198W
GeForce 7900 GT: 235W
GeForce 7900 GTX: 255W

.....There's a lot of figures being thrown around, the above and the previous two graphs.

....But yes, you are basically looking at the X1900XT consuming twice the power of the 7300GT.
However, you get up to 6x more 3D graphics processing power, particularly if you are running your screen res at 1600x1200 and up, that is, anything beyond 1280x1024 etc.

...For example http://www23.tomshardware.com/graphi...=548&chart=199

....If you are interested in playing 2006, 2007, 2008 3D games on Mac or PC, it would be worthwhile going for the X1900XT. The Mac Pro casing, in terms of heat flow and thermal design, etc., is as good as anything you'll find in the PC world. Far better, IMO. An X1900XT, yes, will be somewhat louder and hotter than the 7300GT, but for the graphics processing power, and when in the Mac Pro as an overall tower setup, sound and heat are really not a major concern.

...If you are after *very* casual gaming and older titles (2004 and previous), then the 7300GT is fine.
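
To make those total-system-load figures easier to compare, here's a rough Python sketch that turns them into card-to-card deltas. The watt values are the guru3d numbers listed above; treating the 7300 GT system as the baseline (i.e. assuming only the card changes between runs) is my assumption, and note the X1900XT isn't in this data set at all:

# Total system power draw under load, in watts, as listed above (guru3d)
total_load_w = {
    "GeForce 7300 GT (Galaxy)": 188,
    "GeForce 7600 GS (Galaxy)": 190,
    "GeForce 7600 GT": 198,
    "GeForce 7900 GT": 235,
    "GeForce 7900 GTX": 255,
}

baseline = total_load_w["GeForce 7300 GT (Galaxy)"]  # assumption: only the card changes between runs
for card, watts in total_load_w.items():
    # the delta is a rough estimate of the extra power the card itself pulls at load
    print(f"{card}: {watts} W total, about +{watts - baseline} W over the 7300 GT system")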
post #150 of 156
Quote:
Originally Posted by nvidia2008 View Post

http://www.guru3d.com/article/Videocards/355/3/ says the following for their total system load:

GeForce 7300 GT (Galaxy): 188W
GeForce 7600 GS (Galaxy): 190W
GeForce 7600 GT: 198W
GeForce 7900 GT: 235W
GeForce 7900 GTX: 255W

.....There's a lot of figures being thrown around, the above and the previous two graphs.

....But yes, you are basically looking at the X1900XT consuming twice the power of the 7300GT.
However, you get up to 6x more 3D graphics processing power, particularly if you are running your screen res at 1600x1200 and up, that is, anything beyond 1280x1024 etc.

...For example http://www23.tomshardware.com/graphi...=548&chart=199

....If you are interested in playing 2006, 2007, 2008 3D games on Mac or PC, it would be worthwhile going for the X1900XT. The Mac Pro casing, in terms of heat flow and thermal design, etc., is as good as anything you'll find in the PC world. Far better, IMO. An X1900XT, yes, will be somewhat louder and hotter than the 7300GT, but for the graphics processing power, and when in the Mac Pro as an overall tower setup, sound and heat are really not a major concern.

...If you are after *very* casual gaming and older titles (2004 and previous), then the 7300GT is fine.

Those numbers looked very wonky to me. The actual wattage is far lower than what you are showing here.

As the web site says, these are the TOTAL watts used by the entire system (excluding, I would assume, the monitor). So read them with some salt.

The Mac Pro has a 980 watt power supply. In systems using the highest-power cards, the Quadro, I haven't noticed much of a difference in noise. That includes 8 GB RAM. The system is designed to use high-power cards, unlike cheaper PCs, which may take them but then strain under the onslaught of power draw and heat. These machines just loaf along.
post #151 of 156
Quote:
Originally Posted by melgross View Post

Those numbers looked very wonky to me. The actual wattage is far lower than what you are showing here.

As the web site says, these are the TOTAL watts used by the entire system (excluding, I would assume, the monitor). So read them with some salt.

The Mac Pro has a 980 watt power supply. In systems using the highest-power cards, the Quadro, I haven't noticed much of a difference in noise. That includes 8 GB RAM. The system is designed to use high-power cards, unlike cheaper PCs, which may take them but then strain under the onslaught of power draw and heat. These machines just loaf along.

Yeah, like I said, a lot of numbers being thrown around. We've got 3 data sets, one of which is, as I noted, importantly, TOTAL system power draw.

But looking at the data and making some guesstimates (informed ones), we can see that clearly the 7300GT will be pulling half the power of the X1900XT, at a very rough approximation. And it kinda makes sense too.

Remember that the charts do not show exactly what *brand* of cards is being used; they're different OEMs, not pure reference cards from nVidia and ATI. In other words, different heatsinks and fans too.

But as I alluded to, and I think yeah, good point you mention, the Mac Pros are designed for like two cards, Quadros, so a *single* X1900XT, in terms of heat, power draw, PSU "strain", noise, cooling, is really not going to be a major issue over the 7300GT. But the 3D power you get for "720p" and higher-res gaming is around 6X greater than the 7300GT.

If I was getting a Mac Pro and wanted to play some Mac or PC games, even say 5 hours a week, any titles from 2005 to 2008/2009, X1900XT would be the way to go, most definitely.

Melgross, a point about integrated graphics: my experience, and some feedback I have noted, is that if you are driving a 2nd LCD screen, the GMA950 *does* get pushed to its limit, even for 2D/OS X Core Image 3D stuff. For screen spanning to a 20" or 23", I would definitely recommend the MacBook Pro with the dedicated X1600, 256MB VRAM.

If just MacBook on main screen, CoreImage, CoreAnimation, Dashboard "droplet" effect, Keynote OpenGL transitions, Aperture [possibly], iPhoto editing, CoverFlow, it's all good in da' hood.

Overall, the successor to the GMA950, and respectively an X1700 or X1800 Mobility or Go 7600, would be good options in the next laptop and iMac updates [a 7600GT with 256MB VRAM standard in Mac Pros would be nice]. I'm not holding my breath though, because GPU power is, as we have seen, a never-ending debate. But of course, that's why we keep coming back to these threads.
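
If you take the two rough factors above at face value (about 2x the power draw for about 6x the 3D throughput), the performance-per-watt comparison falls out in one line. A quick sketch, using those thread guesstimates rather than any measured data:

# Rough thread guesstimates from the posts above, not measurements
power_factor = 2.0  # X1900XT draws roughly 2x the power of the 7300GT
perf_factor = 6.0   # and delivers roughly 6x the 3D throughput at higher resolutions

print(f"X1900XT vs 7300GT, performance per watt: roughly {perf_factor / power_factor:.1f}x")
# i.e. about 3x better performance per watt, if those rough factors hold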
post #152 of 156
Quote:
Originally Posted by nvidia2008 View Post

Yeah, like I said, a lot of numbers being thrown around. We've got 3 data sets, one of which is, as I noted, importantly, TOTAL system power draw.

But looking at the data and making some guesstimates (informed ones), we can see that clearly the 7300GT will be pulling half the power of the X1900XT, at a very rough approximation. And it kinda makes sense too.

Remember that the charts do not show exactly what *brand* of cards is being used; they're different OEMs, not pure reference cards from nVidia and ATI. In other words, different heatsinks and fans too.

But as I alluded to, and I think yeah, good point you mention, the Mac Pros are designed for like two cards, Quadros, so a *single* X1900XT, in terms of heat, power draw, PSU "strain", noise, cooling, is really not going to be a major issue over the 7300GT. But the 3D power you get for "720p" and higher-res gaming is around 6X greater than the 7300GT.

If I was getting a Mac Pro and wanted to play some Mac or PC games, even say 5 hours a week, any titles from 2005 to 2008/2009, X1900XT would be the way to go, most definitely.

Melgross, a point about integrated graphics: my experience, and some feedback I have noted, is that if you are driving a 2nd LCD screen, the GMA950 *does* get pushed to its limit, even for 2D/OS X Core Image 3D stuff. For screen spanning to a 20" or 23", I would definitely recommend the MacBook Pro with the dedicated X1600, 256MB VRAM.

If just MacBook on main screen, CoreImage, CoreAnimation, Dashboard "droplet" effect, Keynote OpenGL transitions, Aperture [possibly], iPhoto editing, CoverFlow, it's all good in da' hood.

Overall, the successor to the GMA950, and respectively an X1700 or X1800 Mobility or Go 7600, would be good options in the next laptop and iMac updates [a 7600GT with 256MB VRAM standard in Mac Pros would be nice]. I'm not holding my breath though, because GPU power is, as we have seen, a never-ending debate. But of course, that's why we keep coming back to these threads.

I agree with everything you said here.

But, I'm wondering why everyone is talking about the 1900XT. The 1950XT was designed to use much less power than the older 1900XT. I haven't checked. Is the 1950XT available for the Mac Pro?

If not, then I can understand. I wonder if it can be flashed to work as have some other cards.
post #153 of 156
Quote:
Originally Posted by melgross View Post

Is the 1950XT available for the Mac Pro?

Nope.

In a better world, Mac gamers would get the 8800, which runs about as cool as the 1900XT, if I'm not mistaken, but Apple insists on ATi.
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
32" Sharp AQUOS (1080p) > 13" MacBook Pro 2.26GHz. 4Gb RAM . 32Gb Corsair Nova SSD >>> 500Gb HDD
Reply
post #154 of 156
Quote:
Originally Posted by melgross View Post

I agree with everything you said here.

But, I'm wondering why everyone is talking about the 1900XT. The 1950XT was designed to use much less power than the older 1900XT. I haven't checked. Is the 1950XT available for the Mac Pro?

If not, then I can understand. I wonder if it can be flashed to work as have some other cards.

Yeah, X1950XT vs X1900XT: about same-ish price, same-ish performance, or a slight performance gain compared with the X1900XT. Interesting update, but in any case not available for the Mac Pro at this stage. Flashing and all that is not worth the trouble, IMHO. The X1900XT with 256MB RAM is fine...

Even the X1950XT[X] gives you a bit more performance but is a few hundred more.

For single GPUs on the ATI Radeon side:

AFAIK, the X1900XT was updated to the X1950XT: very minor performance gains, about the same cost.
The X1900XTX was updated to the X1950XTX: some performance gains, flagship card, and a few hundred more when comparing the X1950XTX vs the X1950XT (on Froogle.com).

[[[I'm not really seeing any great difference between the load power of the X1900XT vs the X1950XT in:]]]
Edit: Actually, ignore the sentence above; it *is* impressive when looking at the power draw of the X1950XTX vs the X1900XT...

post #155 of 156
For the same price range, the ATI X1900XT and X1950XT stack up nicely against the nVidia 7950GT.

Edit: heat temps are a point, though [in the above post I should be referring to temps instead of "power draw" with regard to the blue and yellow graph image]: the nVidia 7950GT is a nicer choice than the now-older X1900XT.

So if Apple were to offer something else, I would say they should offer the newer, cooler-running nVidia 7950GT instead of the X1900XT.

But, meh............. whatevs.
post #156 of 156
Quote:
Originally Posted by applebook View Post

.....my point about the X1600 is even more relevant, seeing as it was a lower-mid (at best) mainstream card that has been discontinued. Why does Apple continue to supply this, and how can any Mac buyer justify this?....

Yes, in all honesty, for a mainstream, enthusiast PC gamer, buying a Mac for games is very hard to justify. In that case I would recommend a PC tower with an nVidia 7900GS or 7950GT for all their gaming: basically just upgrade even an AMD64 2GHz single-core or a Pentium 4 3GHz to 2GB of RAM and swap in a 7900GS or 7950GT.

Have that for gaming, and get a MacBook for everything else, the computer-life stuff. A good solution, IMHO.