Quote from Amorph:
"Not really. You assume that gcc for the Mac is as optimized for the hardware as ICC is for x86. And you'd be horribly wrong. All that the benchmarks showing "normality" demonstrate is that a wildly well-optimized compiler will beat a poorly optimized one. But most applications are compiled with Microsoft's compiler, not Intel's, so it's not necessarily a real-world test to use ICC."
You know, some of these posts can be attributed to not fully reading what others say. You missed the part about GCC where I corrected myself, based on what the author of the original GCC test said:
"(According to the SPEC FAQ, SPECfp2000 contains 10 Fortran programs and 4 C programs. In other words, SPECfp is mostly Fortran, and NAGWare is the Fortran compiler, so it is most likely NAGWare that is the bad compiler for Intel, not GCC.)"
Although the GCC portion was what I remembered from articles about the two-year-old tests, none of that really pertains to the heart of the matter, which is that most of the tests were unscrupulously altered in Apple's favor, that Apple knew about it and was responsible for it, and that they made false statements about why they did it.
Quote from Amorph:
If you want to use ICC for the Intel side, you'd have to use something comparably optimized for the PPC on the Mac side to have a fair comparison.
But then Apple went and said they did it because the Mac came out too fast by comparison, and that they did it to make the PC look faster. Again, that's just not truthful. Actually, according to you, the opposite is the truth: the PC would have been faster than they stated. Why would they do that? If it's not true, don't say it.
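To make the compiler argument concrete, here's a toy sketch (my own illustration; the file name, loop, and flags are just examples, not anything from the disputed tests) of how the same C source can score very differently depending on which compiler builds it:

```c
/* flopbench.c -- a toy FP kernel showing how much the compiler matters.
 * Illustration only: SPECfp is a far larger suite, and these flags are
 * examples, not the ones used in the disputed G5-vs-P4 tests. */
#include <stdio.h>
#include <time.h>

#define N 1000000

static double a[N], b[N], c[N];

int main(void)
{
    int i, rep;
    clock_t t0, t1;

    for (i = 0; i < N; i++) {
        a[i] = i * 0.5;
        b[i] = i * 0.25;
    }

    t0 = clock();
    for (rep = 0; rep < 100; rep++)
        for (i = 0; i < N; i++)
            c[i] += a[i] * b[i];  /* the kind of loop a vectorizing compiler eats alive */
    t1 = clock();

    printf("%.2f s (checksum %f)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC, c[N / 2]);
    return 0;
}

/* Build the SAME source two ways and compare wall time, e.g.:
 *   gcc -O3 -funroll-loops flopbench.c -o flop_gcc
 *   icc -O3 flopbench.c -o flop_icc
 * If the Intel-tuned build wins big on a P4, that is the whole objection:
 * benchmark one platform with a weaker compiler and the "hardware" loses. */
```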
Quote from Fallen: "My personal favorite would be a new audio interface card taking full advantage of 64-bit technology, enabling Apple to leave ProTools and their friggin $7000 HD requirements in the dust."
Perhaps a bit OT, but Fallen can't help but ask himself how many Cells it would take to smoke ProTools' $3000-each Accel cards.
Actually, I can't wait to see the next PowerMac update. I think Apple has had so much time to work on it, and actually think about it, that they've probably designed something really extraordinary. It's all going to revolve around the motherboard.
That aside, I think the iPod's success has affected the other groups. It only makes them strive to create something more excellent, and maybe Apple will use some of that new success to go back to their roots and put something back into the computing business model that got them there to begin with.
Quote from Fallen: "My personal favorite would be a new audio interface card taking full advantage of 64-bit technology, enabling Apple to leave ProTools and their friggin $7000 HD requirements in the dust. Perhaps a bit OT, but Fallen can't help but ask himself how many Cells it would take to smoke ProTools' $3000-each Accel cards."
Er.... 1/2?
The whole reason I'm getting a new tower is so I can finally attempt a transition to digital audio recording without sacrificing too much quality. It's only my opinion, but charging $7995 for an HD1 PCI card is like Apple still charging us $3995 for a DVD burner and then telling you it can't be used unless you buy the proprietary $4000 license and software to go with it. All this before you even consider the cost of professional-quality interfaces, sound libraries and pro audio plug-ins.
It's no wonder that Digidesign's parent company Avid bought out M-Audio. But ProTools won't work with M-Audio hardware. Go figure.
With a growing user base of GarageBand and Logic 7, I hope that Apple will soon give Digidesign a run for their money with truly professional capabilities in Logic software and supporting hardware. I could be wrong, but I think I'll have more quality and flexibility with Logic Pro 7 and third-party hardware than downgrading to ProTools LE. If Apple takes on pro audio as seriously as they have video, then it only seems to be a matter of time.
Hi all... new here... but like some of you I am looking and waiting for the next upgrade to the G5. It seems to have taken longer than normal. When I bought my dual G4 I was under the impression it would be a while until the G5 came out, and it was only about two months before my Mac was outdated. I have been with Mac since the inception and have watched this happen more than once, and I totally understand how it goes, but this time I am waiting for at least a dual G5 at 3GHz or better, and Tiger. Anyone have any current thoughts on this?
Yah, I think the difference between a dual 3.0 and a dual 2.8 will be nothing. If that is the next machine that comes out, I wouldn't see anything wrong with getting it over a dual 3.0. Waiting for Tiger is fine if you want it to ship with the machine, but the current machines will run Tiger just as efficiently as the new ones will. I don't think there will be any major changes between the new ones and the current ones, just minor things like PCI-E, etc.
Quote from FallenFromTheTree:
"I could be wrong, but I think I'll have more quality and flexibility with Logic Pro 7 and third-party hardware than downgrading to ProTools LE. If Apple takes on pro audio as seriously as they have video, then it only seems to be a matter of time."
I agree. I fell into the PT trap a while back, and while they do have the quality and the name rep, the prices are off-putting. Meanwhile, Logic gets better and better (a definite Key Buy from Keyboard), with plug-ins almost the equal of standalones, plus bundled virtual synths. A 'powertower' with a decent interface (Emu's?) should almost equal the sound at a fraction of the cost.
Quote from emig647:
"Yah, I think the difference between a dual 3.0 and a dual 2.8 will be nothing. If that is the next machine that comes out, I wouldn't see anything wrong with getting it over a dual 3.0. Waiting for Tiger is fine if you want it to ship with the machine, but the current machines will run Tiger just as efficiently as the new ones will. I don't think there will be any major changes between the new ones and the current ones, just minor things like PCI-E, etc."
PCI-E doesn't seem minor to me. That's a new motherboard design. Why design a new motherboard just for PCI-E? If they are going to add PCI-E, there will be more. What, I don't know, but there will be more.
It's not that PCI-E is so critical to audio recording by any means, but it would involve a major motherboard alteration, replacing AGP. At least that's the way I understand it for now.
Quote from onlooker:
"PCI-E doesn't seem minor to me. That's a new motherboard design. Why design a new motherboard just for PCI-E? If they are going to add PCI-E, there will be more."
Just a side note to onlooker: have you seen the dual-GPU 6600 GT? It smokes dual 6800 Ultras in SLI... it actually has two GPUs on the board. They are saying this is the future; the only problem is it requires a special motherboard. SLI is less than a year old and it is already outdated.
Gigabyte makes the motherboard and the graphics card.
The reason I brought this up is that if Apple immediately invested in SLI, someone would already be complaining that there is a new technology out and Apple doesn't have it. I'd much rather see dual-GPU 6600 GT cards than SLI.
They will probably add DDR2 RAM... I'm guessing PC4200. Other than that, I really don't know what else they would add except dual memory controllers, but I don't see that happening any time soon. Maybe HyperTransport 2? Your guess is as good as mine.
Point being, especially for audio, the current machines would suit Fallen just fine. There is nothing wrong with waiting if he can. I have been checking up because, after selling my dual 2GHz, I was going to get a PowerBook, but if the PowerMacs are going to be significantly upgraded I'll just get a new PowerMac instead.
Dual GPUs on a card just make more sense. The next thing that needs to happen is a true dual-core GPU with a memory controller, on-die or off-die, that allows both cores to access the same pool of memory efficiently.
We're going to look back in a few years and laugh about the days of wanting to stuff two GPU cards into our computers, just as I laugh about the days when you had to toss two 3dfx cards in a computer *and* a 2D card!
In dealing with hardware, I always try to remember that I am the slowest component in the chain. I can effect a bigger change in efficiency through my own actions than a faster GPU card can.
Quote from emig647:
"Just a side note to onlooker: have you seen the dual-GPU 6600 GT? It smokes dual 6800 Ultras in SLI... it actually has two GPUs on the board. They are saying this is the future; the only problem is it requires a special motherboard. SLI is less than a year old and it is already outdated. Gigabyte makes the motherboard and the graphics card. The reason I brought this up is that if Apple immediately invested in SLI, someone would already be complaining that there is a new technology out and Apple doesn't have it. I'd much rather see dual-GPU 6600 GT cards than SLI."
Yes, I saw it, but I didn't see it smoke 6800 Ultras in SLI, and it doesn't. I saw their test where it does beat 6600 GTs in an SLI configuration in some tests, but not all. Usually a single 6800 GT will beat it in a non-SLI configuration.
The difference between their card and the usual GT is that theirs has dual 6600 GT GPUs with 256MB of DDR3 on a 256-bit bus (the same width as the 6800 Ultra) clocked at 600MHz (overclocked), with a core clock of 500MHz. The regular GT has 128MB of DDR3 on a 128-bit bus clocked at 500MHz, with a core clock of 500MHz.
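To put those bus figures in perspective, here's some back-of-the-envelope math (a sketch using the clocks and widths quoted above, which may not match the shipping card's exact specs; DDR transfers data on both clock edges):

```c
/* membw.c -- peak memory bandwidth implied by the specs quoted above.
 * Theoretical spec math from the poster's numbers, not measurements. */
#include <stdio.h>

/* DDR memory transfers on both clock edges, so effective
 * bytes/sec = (bus width in bytes) * clock * 2. */
static double peak_gb_s(int bus_bits, int clock_mhz)
{
    return (bus_bits / 8.0) * (clock_mhz * 1e6) * 2.0 / 1e9;
}

int main(void)
{
    printf("Stock 6600 GT: 128-bit @ 500MHz -> %4.1f GB/s\n", peak_gb_s(128, 500));
    printf("Gigabyte 3D1:  256-bit @ 600MHz -> %4.1f GB/s\n", peak_gb_s(256, 600));
    return 0;
}
```

By that arithmetic the stock GT peaks around 16 GB/s and the 3D1 around 38 GB/s, which is where most of its advantage comes from.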
Tom's Hardware has already looked at this card, and I wasn't as impressed with it as you are. Dual GPUs on one card isn't that stellar, IMO; I am far more impressed with SLI. Usually a single 6800 Ultra whooped this card. Tom's Hardware didn't bother testing it against SLI setups because it wasn't really comparable; the 6800 Ultra is faster in most cases as a single card, and so is the 6800 GT. All in all it performs OK for an overclock, but it's nothing to freak out about.
Some comparisons from Tom's Hardware (benchmark charts for Unreal Tournament, DOOM 3, FarCry and 3DMark 2005), and their conclusion:
Without a doubt, Gigabyte has created a fascinating piece of hardware with the 3D1. The concept of creating an SLI setup on a single card deserves the highest respect. Why Gigabyte chose to use the GeForce 6600 GT processor instead of the faster 6800 model is unclear. Possibly, the NV45's HSI bridge chip caused some problems, or such a card would have become too complex to produce. After all, the 6800s use a 256 bit memory interface. Such a dual-core circuit board would quickly become very complex and consequently expensive.
Going only by the numbers, we see that the 3D1 definitely has its pros and cons. Bundled with the motherboard, the card will be slightly cheaper than a comparable GeForce 6600 GT SLI setup, while offering better performance. Also, it will be much less expensive than a single 6800 GT or Ultra card.
However, the downside is that the buyer basically loses the second x16 PCIe slot when using this card. This removes the option of upgrading to SLI at a later time, diminishing the overall flexibility. Therefore, whether or not the 3D1 is a good choice compared to a more flexible two-card 6600 GT SLI setup or even a single 6800 GT/Ultra card mostly depends on the buyer's plans for future upgrades. Whether or not SLI pays off at all depends on the resolutions and quality settings the user prefers to play at. Lastly, the choice of games is important as well: In modern games, a GeForce 6600 GT SLI setup can really shine, offering a tangible performance boost. However, in older titles a single GeForce 6800 GT may be the better choice.
Considering the history of failed attempts at bringing dual-core graphics cards to the market, Gigabyte's 3D1 will probably have some difficulties establishing itself in the marketplace - especially since the card does have some technological limitations. Of course Gigabyte is well aware of this and plans to offer the card - bundled with the K8NXP-SLI motherboard - as a limited edition only.
In the end, the 3D1 showcases Gigabyte's technological expertise and its willingness to innovate. Perhaps the card can be compared to the design prototypes with which car makers try to impress their potential customers at automobile shows. Whatever the case may be, we're definitely hoping to see more of this kind of thing.
Keep in mind the tests they ran do NOT take advantage of SLI or dual GPUs, so those benches are bunk no matter how you slice it. It's like comparing a stock 6600 GT to a stock 6800 Ultra (overclocking considered). It's like having a dual-processor system in Windows with no true multithreading or multitasking between processors: the second processor just sits there waiting to be used, and in this case the second GPU just sits there in Unreal Tournament and other games, not being used. Maximum PC tested the card in this month's issue (if you care to pick it up and look). They DID use applications that compare SLI and the dual-GPU cards, and YES, the dual-GPU 6600 GT was faster than the 6800 Ultra SLI. You can't believe every bench you see, and while I have great respect for Tom's Hardware, this test doesn't prove anything about dual GPU vs. SLI.
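The dual-processor analogy holds up: software that issues only one stream of work leaves the second unit idle no matter how fast it is. A toy sketch of the idea (plain pthreads on the CPU, purely illustrative, nothing to do with real GPU drivers):

```c
/* idle_cpu.c -- why a second processor (or GPU) is useless to software
 * that only hands it one stream of work. Toy illustration only. */
#include <pthread.h>
#include <stdio.h>

#define WORK 200000000UL

static void *crunch(void *arg)
{
    volatile unsigned long sum = 0;
    unsigned long i;
    (void)arg;
    for (i = 0; i < WORK; i++)
        sum += i;  /* busy work; volatile keeps it from being optimized away */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;

    /* "Single-threaded game": the work runs back to back on one core
     * while the second core just sits there. */
    crunch(NULL);
    crunch(NULL);

    /* "SLI-aware app": the same total work split across both cores,
     * finishing in roughly half the wall time on a dual. */
    pthread_create(&t1, NULL, crunch, NULL);
    pthread_create(&t2, NULL, crunch, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    puts("run under `time` to compare the two halves");
    return 0;
}
```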
And this proves my point... Apple not moving to PCI Express ASAP doesn't make a big difference. Yes, there is 533MHz vs. whatever PCI-E runs at (1066MHz?), i.e. AGP 8x vs. PCI-E 16x.
Long story short, Apple not moving to PCI Express 16x doesn't carry any BIG performance penalty until applications take advantage of the bidirectional communication that makes PCI-E so much better. A year from now, I'll bite; I'm sure major apps will take advantage of it. As it is now, nothing really does.
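For reference, the theoretical peak numbers look roughly like this (spec-sheet arithmetic from memory, so treat it as ballpark):

```c
/* buswidth.c -- theoretical peak transfer rates, AGP 8x vs PCIe x16.
 * Spec-sheet arithmetic, not a benchmark. */
#include <stdio.h>

int main(void)
{
    /* AGP 8x: 32-bit bus, 66MHz base clock, 8 transfers per clock,
     * effectively one direction at a time. */
    double agp8x = (32.0 / 8.0) * 66e6 * 8.0 / 1e9;          /* ~2.1 GB/s */

    /* PCIe x16: 16 lanes at 2.5Gbit/s each, minus 8b/10b encoding
     * overhead, at full speed in BOTH directions simultaneously. */
    double pcie16 = 16.0 * 2.5e9 * (8.0 / 10.0) / 8.0 / 1e9; /* ~4.0 GB/s */

    printf("AGP 8x   : %.1f GB/s, half duplex\n", agp8x);
    printf("PCIe x16 : %.1f GB/s each way (%.1f GB/s aggregate)\n",
           pcie16, 2.0 * pcie16);
    return 0;
}
```

The raw headroom roughly doubles (and quadruples counting both directions), but as said above, nothing exploits the bidirectional link yet.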
Another web site asked how a G5 might compare to a modern automobile. I chose Volvo, because of the G5's overall stability, safety, reliability and performance.
Hopefully we'll see some real head-turning benchmarks with the first dual-core TigerMac. Then it's pedal to the metal.
Wow, if you chose a Volvo for stability and reliability then I have to feel sorry for you. I have two Volvos and I can't wait to get rid of them. They have good pickup and safety, but when it comes to maintenance and reliability they look more like PCs that keep giving you headaches.
The only thing that stopped our old Volvo 240 wagon was the tree it hit when my wife forgot to set the emergency brake and it rolled down the mountain.
Not to get too far into the car discussion: I agree that the really old Volvos are great and very reliable, but anything built after 1995 is horrible. I had three completely different models ('95, '98 and 2000) and was unhappy with them all. I know two people who purchased new ones in the last year and they are not crazy about them either.
I would put it this way: the G5 is a Volvo for safety, a BMW or Audi for performance, and a Toyota or Honda for reliability.