
What will be the new specs for the next PM line? - Page 2

post #41 of 282
I think that before a dual-core G5/G6 PowerMac we will see the iMac v3 G5 in September 2004.
THEN the PowerBook G5 - if we are lucky - in January,
and then sometime between February and April, the G5 Rev C, Tiger and a new iBook.
I don't know what they can present after that at WWDC, apart of course from an SFF MiniMac and an Xserve Rev 2.
Then again in September 2005, iMac v3 Rev 2 with iLife 2005.
In January 2006, the iBook G5!
Sometime between Feb and April 2006: PowerBook Rev 2, MiniMac Rev 2,
and then !!!! -> at WWDC 2006: a dual dual-core G6!

That's it folks
"I like workin on my Mac to jazz. A pianist doesn't spend time peeking inside the piano." Neville Brody
post #42 of 282
Quote:
Originally posted by onlooker
AFAIK - they are rated as GeForce, not Quadros.

GeForce = gaming and standard graphics support.

Quadro = pro graphics.

If I knew for a fact that the 6800 Ultra DDL had all the Quadro GPU, firmware, and drivers, and was getting Quadro FX 4000 numbers, I would have bought the dual 2.5GHz PowerMac already.

Please excuse my ignorance, but can you explain the difference? The 6800 is the fastest card, I believe, so if that is true, what would a "Quadro" card do that is different?
Well, I have my G5 so I am off to get a life; apart from this post...
post #43 of 282
Quote:
Originally posted by Addison
Please excuse my ignorance, but can you explain the difference? The 6800 is the fastest card, I believe, so if that is true, what would a "Quadro" card do that is different?

Maybe I'm wrong, but from memory the only differences on the PC between GeForce/Quadro and Radeon/FireGL are two things: a dual digital interface and better drivers (for 3D/CAD). From what I read somewhere, the Mac OS X drivers are closer to Quadro/FireGL than to GeForce/Radeon.
So basically we have PRO cards.
"I like workin on my Mac to jazz. A pianist doesn't spend time peeking inside the piano." Neville Brody
post #44 of 282
At least on some of the later cards a simple firmware flash turned normal cards into Pro cards. GeForce to Quadro it was, I think.
Matyoroy!
post #45 of 282
Quote:
If I was a moderator

Stop dreaming.

Go eat some chocolate.

Lemon Bon Bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #46 of 282
After hearing rant after rant after rant after rant after rant after rant after rant after rant after rant after rant from onlooker..........

I gotta agree... Apple does need a Quadro GPU. Even though I still don't think the demand would be that high, and it could cost Apple $$, a few things make it pathetic that they don't.

1. The graphics cards in the PowerMacs are weak. I did upgrade mine to a 9600 XT (I bought a dual 2.0 Rev B). This in turn has a direct impact on the other computers Apple offers. The iMacs have to make do with the GeForce4 MX!!! That card is approaching three years of age.

It should be this:

Powermacs have: 6800 ultra, 9800xt, 9600xt
iMacs have: 9200, 5200, and 9600xt.
Powerbooks.. 9800.

2. Apple has some of the best software and no graphics card that can really harness it. The 6800 Ultra is awesome (I don't care what anyone says), but a Quadro would be nice. We have Maya Unlimited... isn't it time?

You happy now onlooker :P

You won.

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #47 of 282
Quote:
Originally posted by jeromba
Maybe I'm wrong, but from memory the only differences on the PC between GeForce/Quadro and Radeon/FireGL are two things: a dual digital interface and better drivers (for 3D/CAD). From what I read somewhere, the Mac OS X drivers are closer to Quadro/FireGL than to GeForce/Radeon.
So basically we have PRO cards.

They also have more rendering pipes.

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #48 of 282
Don't be hate'n.

Whoever said we have pro cards has been pulling your leg, and is probably joking. There are some important differences in screen rendering capabilities. If they are as fast as a Quadro, I would like to see a HUGE poly scene comparison next to an Alienware with similar processor specs and a Quadro FX 4000, just moving a 7+ million poly dragon model in a mountain environment around the screen in Maya like it was a poly plane.

I would rather see the comparison done by Tom's Hardware, and a lot more tests than just that, but if it's true then prove it. I have not seen Apple go there, or prove it.
onlooker
http://www.apple.com/feedback/macpro.html
post #49 of 282
I would like to see it too. I would like to see the difference between the current 6800 Ultra on a Mac... and a high-end PC GPU. Right now the Macs get their asses beat by PC cards of the same caliber in Cinema 4D for OpenGL... so explain that one?

If you don't believe me... go here http://www.3dfluff.com/mash/cbogl.php and sort by the OpenGL benches.

Shows how far behind the Mac hardware really is when it comes to hardcore multimedia.

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #50 of 282
Quote:
Originally posted by emig647
I would like to see it too. I would like to see the difference between the current 6800 Ultra on a Mac... and a high-end PC GPU. Right now the Macs get their asses beat by PC cards of the same caliber in Cinema 4D for OpenGL... so explain that one?

If you don't believe me... go here http://www.3dfluff.com/mash/cbogl.php and sort by the OpenGL benches.

Shows how far behind the Mac hardware really is when it comes to hardcore multimedia.

I'm not sure I understand... the Quadro is about 12-15 rankings below the Radeon 9800 Pro? Isn't this a gamer card? Shouldn't the Quadro be faster if indeed these 'Pro/Workstation' cards are the way to go?

Also, how in the Lord's name does an NVIDIA Ti 4200 in a PC do OpenGL faster than an NVIDIA 6800 Ultra in a dual G5????

-M
post #51 of 282
I don't care; as long as it comes with a 56k modem built in, you can count me in.
post #52 of 282
Quote:
Originally posted by moazam
I'm not sure I understand... the Quadro is about 12-15 rankings below the Radeon 9800 Pro? Isn't this a gamer card? Shouldn't the Quadro be faster if indeed these 'Pro/Workstation' cards are the way to go?

Wellllllllll, a few things on that. First off, I believe those 9800 Pros are overclocked. Also, that OpenGL score isn't 100% hardware... it's a combination of hardware and software. If you download Cinebench 2003 from www.cinebench.com you will see how it works. Also (I don't know how much difference this makes), the Quadro FX 3000 isn't the newest card... the Quadro FX 3400 is, which adds more pipes. The top card (the X800 XT) is PCI Express, which is a faster connection.

Quote:

Also, how in the Lord's name does an NVIDIA Ti 4200 in a PC do OpenGL faster than an NVIDIA 6800 Ultra in a dual G5????

Our drivers suck. That's why these cards toast ours. If you put a 6800 Ultra in a PC and a 6800 Ultra in a dual G5 Mac... the PC will beat the Mac in any program. For some reason the drivers just aren't as optimized on the Mac as on the PC... perhaps because they have third-world five-year-olds programming the Mac drivers and 30 American engineers doing the PC drivers. Compare fps in Quake 3 and UT2004... our 9800 XT does worse than the 9800 XT on PCs... *shrugs*.

One thing you have to remember is they can be overclocked on PCs by anyone. My 5900 XT in my PC is equivalent to a stock 9800 Pro because of how I OC'd it and added a GPU cooler to it. Damn programmers, they don't know how to program... including me :P

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #53 of 282
Quote:
Originally posted by emig647
Wellllllllll, a few things on that. First off, I believe those 9800 Pros are overclocked. Also, that OpenGL score isn't 100% hardware... it's a combination of hardware and software. If you download Cinebench 2003 from www.cinebench.com you will see how it works. Also (I don't know how much difference this makes), the Quadro FX 3000 isn't the newest card... the Quadro FX 3400 is, which adds more pipes. The top card (the X800 XT) is PCI Express, which is a faster connection.

Our drivers suck. That's why these cards toast ours. If you put a 6800 Ultra in a PC and a 6800 Ultra in a dual G5 Mac... the PC will beat the Mac in any program. For some reason the drivers just aren't as optimized on the Mac as on the PC... perhaps because they have third-world five-year-olds programming the Mac drivers and 30 American engineers doing the PC drivers. Compare fps in Quake 3 and UT2004... our 9800 XT does worse than the 9800 XT on PCs... *shrugs*.

One thing you have to remember is they can be overclocked on PCs by anyone. My 5900 XT in my PC is equivalent to a stock 9800 Pro because of how I OC'd it and added a GPU cooler to it. Damn programmers, they don't know how to program... including me :P

So basically... it's not that Macs lack good pro video cards... it's that they just have really shitty video card capability in general? So all the people yelling "I don't care about Doom, I'm running X Y Z pro app" are also full of crap 'cause other pro apps would run a ton better on a PC?

I still think there must be something very seriously wrong with the benchmarking utils. If the Mac 3D/OpenGL/graphics are really that bad, well...gawd damn.

-M
post #54 of 282
Ask people that work with those programs on a daily basis. I for one notice a huge difference between a 9800 Pro on a Mac and a 5900 XT on a PC (9800 Pros are normally better)... the PC is way faster in Cinema and Maya.

I'm not an expert... I just play around and notice it. It's not the rendering I notice, it's moving things around in real time in the OpenGL view. Someone back me up on this.

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #55 of 282
Yes. I don't have a PC, but this seems to be well established. Partly it is poor optimization of code during the conversion. Partly it is an (understandable) reluctance to take advantage of Mac-only technology. Partly it is slowdowns in the rest of the system (mainly in the G4 days). And partly it is in the drivers. I don't know if Apple is trying to stick closer to spec or what... It seems to me that Apple should be the flag bearer of OpenGL, but if you want 3D and you need performance, it is quite simply a mistake to go with Apple right now. Hopefully this will change. On the plus side, I have also heard that these programs (Maya, C4D, LightWave) are less crash-prone under Mac OS than Windows, so that is certainly a good sign, although maybe apocryphal.

It used to be that games were considered good benchmarks of overall system performance. It seems to me that should still be true, but now if you dare mention it, you are summarily invited to get an Xbox or the like.
post #56 of 282
Quote:
perhaps because they have third-world five-year-olds programming the Mac drivers and 30 American engineers doing the PC drivers.

I suspect it is vice versa, which is why the PC drivers are so fast. 30 American engineers are nothing against 500,000 Indian math talents.
Matyoroy!
post #57 of 282
Good (link) call my young Padawan.

Hmmmm. Interesting info'.

Note:

The G5 2.5 blows all away in rendering, bar Xeon machines that are overclocked, sometimes by as much as 700MHz-1.4GHz.

In that context? Your average AMD/Intel chip isn't going to make the dual G5 2.5 gig sweat.

When it comes to physics simulation, the G5 struts its CPU and bandwidth to whup ass on all but the most overclocked (cheating) machines.

On OpenGL? Puzzling.

The skewing of results, with less powerful cards whupping more powerful cards, could be a result of the CPU/graphics card ratio.

A 2.0 Pentium with a 6800 Ultra ain't going to beat a 3.8-gig-rated AMD 64 with a 6800 Ultra.

The results clearly show this.

1. Cinebench is beta.
2. Apple's OpenGL still has some way to go. A 'solid' but not brilliant implementation.
3. Expect this to change for the better with 'Tiger' and hopefully OpenGL 2 adoption.
4. Big-endian vs. little-endian? Issues?
5. PC optimisation vs. Mac optimisation.
6. In time, expect this to change as Core Image/Video/Quartz Extreme enter the developer pipeline via Xcode 2.
7. Apple's CPUs are still a 'little' behind Wintel's. 'Catch up'.
8. Expect this to change when Antares murders the opposition.
9. The rendering and physics benches are good. Antares should pull the 6800 Ultra scores closer to their Wintel counterparts.
10. Further optimisations may well come care of Cinebench, 'Tiger' and OpenGL 2.
11. AppleInsider has already carried some info' on OpenGL improvements.
12. 'Panther' is still early days for Apple. It can surely be further optimised. It's far away from being Mac OS 7, 8 or 9. It will GET there.

Add all that up and you could really close the gap.

But yes, it's really humiliating that an ATI X800 XT is at 4,000-plus while Apple's 6800 Ultra is puffing along at 1,700!

OUUUUUUUUUCH!

Those are my ideas. Programmer? Care to jump in? You've been far too quiet of late. AI is not the same without the voice of reason...

lemon bon bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #58 of 282
Quote:
4. Big-endian vs. little-endian? Issues?

Should that turn out to be the reason why Macs have consistently fallen behind the competition, then there is little hope left for the PPC platform.
Matyoroy!
post #59 of 282
Those Cinebench numbers don't make much sense. If it's based on Cinema 4D, that could be the problem right there. Maybe Cinema 4D doesn't handle OpenGL on the Mac very well? Other than that, it did note the numbers were all theoretical, from beta software (on the Mac only). What was beta? The OpenGL firmware and drivers for the 6800? Maybe it was a beta version of Cinebench? Who knows, but I think a lot of the scores were flubbed due to the beta issue.
onlooker
http://www.apple.com/feedback/macpro.html
post #60 of 282
I don't know much about 3D programs, but the scene that is used in Cinebench for OpenGL looks really, really shitty, yet runs like crap too. I just can't imagine that things have to be that slow for such a low-quality scene. I mean, every modern game looks better than that and runs at about 10x the speed.
Matyoroy!
post #61 of 282
Don't start attacking Cinema 4D and Cinebench just yet.

First off: You asked what was beta about the tests.

The beta part is Cinebench for G5s. Everything else is final. They have been working to optimize Cinema/Cinebench for G5s for about three months. Those builds have added about 90-150 points to the CPU scores, while the graphics card scores have kind of hovered. That is the only thing in beta. You can still run the 32-bit version and get slower results.

Also, let's keep in mind Cinema 4D first came out on the Mac before the PC. Therefore I would hope that code is more optimized.

And WTF are you saying, that Cinema 4D has horrible graphics? You obviously have NOT seen what Cinema 4D does. I can send you screenshots if you like. Not that this is saying much, but it's light-years ahead of LightWave.

If you don't believe me on the OpenGL scores, use a game as an example. UT2004. Compare the 9800 XT on the Mac to the 9800 XT on the PC. It's embarrassing! This is CLEARLY a driver or hardware issue.

Maybe it's RISC vs. CISC. I've always been taught that RISC is fast because it uses simpler instructions that each take fewer clock cycles, even if it needs more of them for the same result. So I'm not sure on that one.

Maybe it's drivers. This could clearly be the case since most code is ported from PCs... perhaps some optimizations have to be lost (can we get an expert in on here?).

Maybe it's big-endian vs. little-endian... but why? Why would it matter that your data sits in memory in the opposite byte order from an x86 proc? It just reverses the byte order within each word (see the little byte-swap sketch below).

Maybe it's Apple's OS. Big question mark on that one for me.

Maybe it's IBM's stubbornness about putting on-chip memory on G5s! If you notice, the only procs that beat the G5 have on-chip memory.

There are WAY too many factors that could be playing into this. Right now I'm voting for driver optimizations.
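
To put the endian point above in concrete terms, here is roughly the worst it could mean for a driver: an extra pass that swaps the bytes inside each 32-bit value before handing a buffer to the card. This is purely an illustration, not anything taken from Apple's or NVIDIA's drivers:

    /* Illustration only: endianness is the byte order within each word,
     * so mismatched data just needs a swap like this before upload. */
    #include <stddef.h>
    #include <stdint.h>

    static void swap_bytes_32(uint32_t *pixels, size_t count)
    {
        for (size_t i = 0; i < count; i++) {
            uint32_t v = pixels[i];
            pixels[i] = (v >> 24)
                      | ((v >> 8) & 0x0000FF00u)
                      | ((v << 8) & 0x00FF0000u)
                      | (v << 24);
        }
    }

That costs something per transfer, but it doesn't change how the GPU itself draws, so it's hard to pin a halved OpenGL score on it.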

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #62 of 282
BTW,

I love the feel of my cold aluminum power button on my Rev B PowerMac in the morning.

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #63 of 282
Quote:
Originally posted by onlooker
Don't be hate'n.

Whoever said we have pro cards has been pulling your leg, and is probably joking. There are some important differences in screen rendering capabilities. If they are as fast as a Quadro, I would like to see a HUGE poly scene comparison next to an Alienware with similar processor specs and a Quadro FX 4000, just moving a 7+ million poly dragon model in a mountain environment around the screen in Maya like it was a poly plane.

That actually would not contradict the argument that they're essentially the same card. That argument does acknowledge a difference in real use, but attributes it to two things:

1) A real, complete, precise OpenGL implementation as opposed to the speed-optimized "Quake subset" shipped with PC consumer cards;

2) A deal between the card vendors and the 3D software vendors where the 3D software only assumes Quadro/FireGL-level capability if the card identifies itself as a Quadro/FireGL. This is to discourage people from flashing high-end consumer cards. Unfortunately, the side effect is that even if the GeForces in PowerMacs are functionally equivalent to Quadros, and even if Apple's OpenGL implementation is functionally equivalent to a Quadro's, Maya and its ilk will assume that they're running on gelded hardware and disable a lot of their optimized code.

So, in other words, your test would show a huge difference in performance, but that doesn't change the argument. If you could get a version of your 3D app of choice that assumed it was running on pro hardware, and then logged a difference in performance, you might have a point. But the GPU vendors wouldn't want anyone getting ahold of that software, or running that test, for obvious reasons.

[edit:] As to benchmarking:

The main thing you're paying for with the "pro" 3D cards is not speed. That hasn't been true for some years now. What you're paying for is completeness of implementation of OpenGL, and precision. It's quite possible for gamer cards to bench faster because they take shortcuts. With polygons flying by, nobody playing games really cares about isolated rendering errors. But if you're rendering a high-resolution 3D scene, you absolutely do care about precision. A preview of a scene that doesn't actually show you what the scene will look like after a final render isn't worth much, is it?

Also, as I mention above, gamer cards tend to support what John Carmack has derided as the "Quake subset" of OpenGL. The functions commonly used in games appear, and they're optimized for speed, but anything that isn't used isn't included. Since games have absolutely fixed requirements, this is OK (although Carmack chafes against it, because it restricts him to the set of functions that everyone else is using, rather than letting him use the entire OpenGL API). Since 3D apps are far more open-ended in what demands they will make of OpenGL, it's not acceptable there. And this is one major complaint with any benchmark that compares consumer cards (and drivers) to pro cards (and drivers): One of the two major reasons to go with a pro card is to get a driver that supports functions that consumer drivers don't support at all — and which, therefore, cannot be benchmarked. The subset of OpenGL that can be benchmarked will most likely be faster on the consumer cards, because their drivers are optimized for speed over precision. But, again, since you're buying a Quadro (and driver) for its rendering precision, this is a moot point. There are Camaros that go faster than some Ferraris, too.
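
To make that renderer-name gate concrete, here is a minimal sketch of the application-side check; this is my guess at its shape, not Alias's or NVIDIA's actual code. All it has to go on is the identification string the driver chooses to report, whatever the silicon underneath can do:

    /* Hypothetical sketch: turn on the "workstation" code paths only if the
     * driver names the card as a pro part. Requires a current OpenGL context. */
    #include <string.h>
    #ifdef __APPLE__
    #include <OpenGL/gl.h>
    #else
    #include <GL/gl.h>
    #endif

    static int enable_pro_paths(void)
    {
        const char *renderer = (const char *) glGetString(GL_RENDERER);
        if (renderer == NULL)
            return 0;
        return strstr(renderer, "Quadro") != NULL ||
               strstr(renderer, "FireGL") != NULL;
    }

On a PowerMac that string says "GeForce," so the pro paths stay off no matter how complete Apple's OpenGL implementation is, which is exactly the problem described above.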
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #64 of 282
I guess that makes complete sense morph...

*shrugs*

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #65 of 282
As I was saying, I still think a pro 3D card is needed regardless.

I had a dream last night that I woke up and this was available in PCIe on a Mac: PURE3D. Not that I'm interested in this card; it was just a nice change to see that someone thought of us as a worthy 3D environment.

Anyway, back to the next PowerMac, as per the title of the thread.

My list of Apple must-haves is as follows.
  • PCIe x2, PCI-X x2, or 3
  • 3+GHz (IBM has to prove itself soon, and they owe Apple one big time)
  • Pro 3D card availability.
  • And for once - Apple should announce availability that day on all models.

The 4th one would be really refreshing.
onlooker
http://www.apple.com/feedback/macpro.html
post #66 of 282
Quote:
Originally posted by onlooker
As I was saying, I still think a pro 3D card is needed regardless.

It would be more disruptive (in a good way) for Apple to convince the software vendors to enable the "pro card" code on their platform unconditionally. But if you really want to spend $3K for the same card and an OpenGL driver that you won't use (because Apple ships OpenGL already) that's an acceptable second best solution.

Quote:
My list of Apple must-haves is as follows.
  • PCIe x2, PCI-X x2, or 3
  • 3+GHz (IBM has to prove itself soon, and they owe Apple one big time)
  • Pro 3D card availability.
  • And for once - Apple should announce availability that day on all models.

The 4th one would be really refreshing.

PCIe will come when there are enough cards that support it, which, according to all the hype, will happen Real Soon Now(TM). Right now it's only appearing in high-end cards, and then only as an option.

3GHz G5s will probably appear when IBM sorts out the issues that are making the 970fx burn out at high voltages (meaning, 1.3v and up), when temperatures exceed 100°C.

Pro 3D card availability is desirable; the removal of the entirely artificial barrier preventing 3D apps from running as well as they could on Apple's current hardware is more desirable.

Same day availability of announced hardware? Now you're definitely smoking something. Sure would be nice, though...
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #67 of 282
Not that this is saying a lot... and kudos to Moto (they got no respect for this one).

PowerBooks, iBooks, and eMacs were available right away (until eMac machines got rare, which wasn't Moto's fault). I think we all know who is to blame for the G5 shortages.

I hope with the next revision of the 970, or the 975, the process will yield many more CPUs. From the sounds of it, they had better figure something out. And why does Apple keep announcing new machines with G5 procs when they damn well know they can't supply that many? It has happened with every G5 computer so far... Rev A PM, Xserve, Rev B PM, and soon the iMac!

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #68 of 282
Quote:
Originally posted by emig647
And why does Apple keep announcing new machines with G5 procs when they damn well know they can't supply that many? It has happened with every G5 computer so far... Rev A PM, Xserve, Rev B PM, and soon the iMac!

What else are they going to use? The G4 had no business in a PowerMac, and its presence in the Xserve confined that machine to specialized, embarrassingly-parallel compute-intensive tasks.

Meanwhile, all else being equal, yields improve over time.

This is actually one reason I'm pulling for Freescale: I would just love to see a two-supplier CPU market as lively as the two-supplier GPU market that Apple currently enjoys. I can't really remember when Apple had a choice of high-performance CPUs, but it would certainly be welcome...
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #69 of 282
Quote:
The G4 had no business in a PowerMac,

You feeling okay, Amorph? Not running a temperature or anything?



I am more inclined to side with your long-held belief that Apple should have two healthy CPU suppliers.

Hmmm. What's in a name? I hope Freescale can shake the shackles of neglect.

A dual-core G4 in a PowerBook would be most welcome over a 1.5GHz G4.

And it would run over a 1.6GHz G5 easily?

Time to bury the hatchet (I'll give you three guesses where...) and give Freescale a chance to prove themselves IF they can deliver.

Come on IBM, you owe us. 3 gig and over.

Antares...whither art thou?

Lemon Bon Bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #70 of 282
Amorph,

What I meant to say is: why does Apple keep announcing these machines too early, when they know they can't get enough supply in a timely fashion? Do what they did with the iBook, PB, and eMac and stock up on the chips before shipping. It would be the wise thing to do. It just hurts them more when they can't deliver.

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #71 of 282
Quote:
Originally posted by emig647
Amorph,

What I meant to say is: why does Apple keep announcing these machines too early, when they know they can't get enough supply in a timely fashion? Do what they did with the iBook, PB, and eMac and stock up on the chips before shipping. It would be the wise thing to do. It just hurts them more when they can't deliver.

It hurts them more when they have nothing new to show and people decide to buy a Windows computer.
The usual buying cycle for computers is 2-4 years. That's something you have to deal with.
alles sal reg kom (everything will come right)
post #72 of 282
Quote:
Originally posted by Lemon Bon Bon
...Time to bury the hatchet (I'll give you three guesses where...) and give Freescale a chance to prove themselves IF they can deliver...

I would rather see IBM and Freescale come back to the joint development table with the PowerPC, like they were back in the 603/604/G3 days. We had fast development without constrained supplies of chips. Of course there was also more incentive for Moto and IBM to develop faster chips at a pace that kept up with Intel, since there were more (large) customers than just Apple. I think that two suppliers is the answer for Apple, but ideally IBM and Motorola would be making compatible chips with a similar feature set so that Apple can adequately optimize the OS for the chips with minimal development time.
post #73 of 282
Quote:
Originally posted by Amorph
It would be more disruptive (in a good way) for Apple to convince the software vendors to enable the "pro card" code on their platform unconditionally. But if you really want to spend $3K for the same card and an OpenGL driver that you won't use (because Apple ships OpenGL already) that's an acceptable second best solution.

Apple's OpenGL driver isn't necessarily the Quadro firmware either. Who is saying the Quadro firmware is in the 6800 anyway? Apple? I've never read that.

But back to what you were saying about 3D software vendors enabling the code - what are the chances of Apple getting NVIDIA and Alias to enable the Quadro drivers and firmware for the Mac 6800 Ultra DDL? I'm thinking they're slim.

Also, to mention the ATI X800 XT thing (in reference to UT2K4 performance with 9800 XTs): from what information I've gathered about ATI and OpenGL, most PC Maya users contend that ATI's OpenGL sucks (at least with Maya). The majority of Maya pros highly recommend the Quadro.

Last mention: the fastest Quadro isn't the 3400, it's the 4400. The 3400 is the PCIe version of the 3000, and it's still using the previous GPU. Pretty good, given that NVIDIA doubled performance with the new NV40 GPU for the 4000 and the 4400, which is the PCIe version of the 4000 (AGP).

But according to the Tom's Hardware article I'm about to link to (not OpenGL specific), the ATI pro cards are becoming contenders. But the FireGL V7100 still will not come close to matching the 4000 or 4400 with the latest NV40 GPU. It's about 2x as fast as the 3400.
http://www20.graphics.tomshardware.c...816/index.html

I still think the Cinebench thing is odd at best.
onlooker
http://www.apple.com/feedback/macpro.html
post #74 of 282
Quote:
Originally posted by Amorph
This is actually one reason I'm pulling for Freescale: I would just love to see a two-supplier CPU market as lively as the two-supplier GPU market that Apple currently enjoys.

Ditto, but I wonder how feasible this is when they're selling less than a million units each quarter with no real growth. It seems like it would require tremendous R&D for such a small market.
post #75 of 282
Quote:
Originally posted by onlooker
Apple's OpenGL driver isn't necessarily the Quadro firmware either. Who is saying the Quadro firmware is in the 6800 anyway? Apple? I've never read that.

Um? Firmware and drivers are two completely different things, so I don't really understand this. Firmware is low-level programmable logic that runs on the card itself. The driver is OS-specific software that runs on the PC. Firmware would control whether the card is little- or big-endian (on NVIDIA parts at least), which parts of the GPU are installed or enabled (they're configurable for OEM needs), and what the name and part number of the card are. The driver is the interface between the card and the operating system.

On PCs, the Quadro driver consists of the same basic driver that ships with the GeForce, a correct and complete (within reason) implementation of OpenGL, and the guarantee that the card and driver have been tested with various expensive pieces of hardware and software, and have been found to be compatible with them.

On Macs, the GeForce driver consists of the basic PC driver, which has been ported to OS X by Apple engineers. OpenGL is built into the OS, so part of the port involves hooking the back of OS X's OpenGL engine to the driver itself. The only significant differences between this arrangement and a Quadro on the PC are that the PC driver is certified by NVIDIA to work with sundry pieces of nice kit, and that, if asked, the Mac driver reports the card as a "GeForce," not a "Quadro". The all-important OpenGL support is, if not identical, functionally equivalent.
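
For what it's worth, that "if asked" identity is nothing more than a few strings an application can read back through OpenGL. A trivial, purely illustrative way to see what your own card and driver claim to be (it needs a current OpenGL context):

    /* Illustration only: print the identity the driver reports to applications. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenGL/gl.h>
    #else
    #include <GL/gl.h>
    #endif

    static void print_gl_identity(void)
    {
        const char *vendor   = (const char *) glGetString(GL_VENDOR);
        const char *renderer = (const char *) glGetString(GL_RENDERER);
        const char *version  = (const char *) glGetString(GL_VERSION);
        printf("Vendor:   %s\n", vendor   ? vendor   : "(no context)");
        printf("Renderer: %s\n", renderer ? renderer : "(no context)");
        printf("Version:  %s\n", version  ? version  : "(no context)");
    }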

Quote:
But back to what you were saying about 3D software vendors enabling the code - what are the chances of Apple getting Nvidia, and Alias to enable the Quadro Drivers, and firmware for the Mac 6800 ULTRA DDL? I'm thinking it's slim.

Basically, what it comes down to is, how badly do the 3D app vendors want the Mac market, what are the odds of NVIDIA bringing the Quadro to the Mac market, and just how persuasive can Steve be? The middle question is particularly thorny, because NVIDIA (say) has to answer for themselves whether they want to give that all-important certification of compatibility to an implementation that they don't entirely control (because the OpenGL implementation is Apple's, and so is the port of the driver), or whether they want to take on the considerable work and expense involved in porting their driver and OpenGL implementation over themselves — and whether Apple would consider balkanization of Mac OpenGL support acceptable. Frankly, if I were in their shoes (and unless I'm missing something really big), I wouldn't bring the Quadro to the Mac either, at least not until demand reached enough of a pitch to make the considerable risk and investment worthwhile.

That leaves the 3D vendors. I think, given the reluctance of NVIDIA and ATI to commit their high-end solutions to the Mac market, and given a proven and significant revenue stream from Mac sales, that they would consider this course of action. They still risk pissing off the GPU vendors if the result is a significant chunk taken out of PC 3D workstation sales, but that adjustment just might have to be made. Also, 3D customers might have to swallow a lack of guaranteed compatibility (unless Apple or the 3D vendors are willing to take up the role of certifier) which some may be reluctant or unwilling to do.

Quote:
Also to mention about the ATI 800xt thing. (in refrence to UT2K4 Performance with 9800xt's) From what information I've gathered about ATI, and OpenGL is that most PC Maya users contend that ATI openGL sucks. (at least with Maya) The Majority of Maya pro's have highly recommend the quadro.

I'll ask someone else with more low-level familiarity with the GPUs to chime in here. I've heard that's actually a lack of hardware support in the GPU, not a driver problem. But ATI does their own Mac drivers, and their drivers for the Mac are as solid as their drivers for the PC are iffy, so I don't know how much any of this would carry over if I'm wrong and it is in fact a software problem.

If it's specifically a problem with ATI's OpenGL implementation, well, any software running on OS X will use Apple's OpenGL implementation, so that would become a non-issue on the Mac (unless Apple's implementation comes up short, in which case it would just be a pervasive problem instead of something specific to ATI).
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #76 of 282
Quote:
On PCs, the Quadro driver consists of the same basic driver that ships with the GeForce, a correct and complete (within reason) implementation of OpenGL, and the guarantee that the card and driver have been tested with various expensive pieces of hardware and software, and have been found to be compatible with them.

On Macs, the GeForce driver consists of the basic PC driver, which has been ported to OS X by Apple engineers. OpenGL is built into the OS, so part of the port involves hooking the back of OS X's OpenGL engine to the driver itself. The only significant differences between this arrangement and a Quadro on the PC are that the PC driver is certified by NVIDIA to work with sundry pieces of nice kit, and that, if asked, the Mac driver reports the card as a "GeForce," not a "Quadro". The all-important OpenGL support is, if not identical, functionally equivalent.

Where do you get this info anyway? You can PM me if you want.
onlooker
http://www.apple.com/feedback/macpro.html
post #77 of 282
2,000(!) OpenGL points and more behind an X800 XT card on the PC.

The Mac's best card is projected at 1,700 on a DUAL 2.5 gig?

WHAT THE HELLL IS GOING ON!?!?!?

It MUST be a beta driver problem in Cinebench. Only bad software can cripple a machine like that.

The dual 2.5 does fine in render and physics.

WHY do the Mac's graphics cards fall over 100% behind their PC counterparts? THAT'S OUTRAGEOUS!!!!!!!!!!!!!!!

Scratches head. IF it was 10% I could grumble and take it.

Same card. Dual CPUs powering behind it. Apple can hardly be crap at writing GL drivers. They worked with NVIDIA on the 6800 Ultra!!! There's former SGI GL experience right there.

WHAT THE f**************** is going on!?

Lemon Bon Bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #78 of 282
Take your pills more often, plz.
Matyoroy!
post #79 of 282
Quote:
Originally posted by Lemon Bon Bon
2,000(!) OpenGL points and more behind an X800 XT card on the PC.

The Mac's best card is projected at 1,700 on a DUAL 2.5 gig?

WHAT THE HELLL IS GOING ON!?!?!?

They don't call it "benchmarketing" for nothing.
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #80 of 282
Quote:
Originally posted by Amorph
They don't call it "benchmarketing" for nothing.

I can understand that, and why Lemon Insano is freakin', but wasn't Apple the first to use OpenGL outside of SGI? I think they should have some serious experience with OpenGL. We're also forgetting the guys from Raycer Graphics. Those guys were supposed to be the OpenGL devs from heaven. Didn't they design the Quartz engine?

I think that ATI thing is an ATI issue with Macs, but the Cinebench beta is an awful test.
onlooker
http://www.apple.com/feedback/macpro.html