Originally posted by melgross
I disagree. The FireGL cards are very good cards as well.
For video work, NVidia really sucks. ATI's video quality is SO much better, it's not even close. And Nvidia has the nerve to charge extra for their second rate software.
Hmmm... and where have you been? I'm guessing you have never heard of the NVIDIA QuadroFX SDI series? ATI doesn't have a card that matches the 4000, which is a few years old, let alone one that comes close to performing like a QFX 5500 SDI.
Nvidia QuadroFX 5500 SDI
On another note, I'm totally excited that the Quadros have been updated (1GB RAM). The QuadroFX 5500 (regular version) in a Quad Woodcrest Mac Pro is going to kick some serious tail.
Originally posted by onlooker
Hmmm... and where have you been? I'm guessing you have never heard of the NVIDIA QuadroFX SDI series? ATI doesn't have a card that matches the 4000, which is a few years old, let alone one that comes close to performing like a QFX 5500 SDI.
Nvidia QuadroFX 5500 SDI
That aside, I'm excited that the Quadros have been updated. The QuadroFX 5500 (regular version) in a Quad Woodcrest Mac Pro is going to kick some serious tail.
If you want to talk about one card, fine. But I know a lot of companies that won't touch Nvidia, and for video editing, they are terrible. Not everything is 3D.
Any time a card gets updated, that's good. But Apple should offer more than a mid-range, not-very-good performer and the highest card that Nvidia makes. They need to offer a better gaming card, lower-range 3D cards, and ATI as well.
Let us make the choice. I don't know about you, but I don't like the fact that a $3,300 machine has a choice of a low-end card, a low-medium card, and the top 3D card, and all from just one company. What kind of choice is that?
Talking about a Woodcrest Quad: I've been saying that I didn't believe the Conroe would be a good choice, except maybe for the low-end machine, at best. The Woodcrest is the ONLY chip that competes with the Opterons, and otherwise it only competes with itself in other Intel workstations.
Well, it may be coming true!
http://arstechnica.com/journals/appl...2006/4/22/3712
As far as the Mac Pro goes, I thank you for the link, but that's not confirming anything for me. Just because some clown at Ars finally says it doesn't give it any more credibility. I already put that together in here months ago. AFAIAC, that guy is a bit late to the party.
Originally posted by onlooker
Look. All I was saying was that it's obvious Nvidia is making the better video editing card, contrary to what you were aware of. So I put it up, and there it is. Like it or not, there is no denying it. The Nvidia SDI series is the cream of the crop between the two in all areas. Sure, Apple could offer both ATI and Nvidia cards, but if they are only going to offer one, I would have to go with Nvidia. Because ATI comparatively sucks. In ALL areas.
As far as the Mac Pro goes, I thank you for the link, but that's not confirming anything for me. Just because some clown at Ars finally says it doesn't give it any more credibility. I already put that together in here months ago. AFAIAC, that guy is a bit late to the party.
I still don't agree. It's well known that Nvidia's video is truly bad. Most post houses don't use it.
And, of course, the link doesn't prove it, but it's just another bit of information that it's likely.
Originally posted by onlooker
Look. All I was saying was it's obvious Nvidia is making the better video editing card contrary to what you were aware of.
You fail to address melgross's correct point that ATi's video output quality is far, far better than nVidia's. Always has been. Of course, Matrox has a slight edge there even over ATi, but that's another matter.
Performance is what you keep bringing up, but performance isn't everything.
Originally posted by Chucker
You fail to address melgross's correct point that ATi's video output quality is far, far better than nVidia's. Always has been.
Performance is what you keep bringing up, but performance isn't everything.
Actually, no, it hasn't. Why don't you back that up with a link to a side-by-side benchmark of an ATI card vs. an Nvidia SDI series card to prove it?
Also, what is the biggest video editing system in movies? Avid? I know they used it on Star Wars, so I imagine it's used at ILM. For some reason Adobe recommends the SDI series for use with their video products. Go figure.
If performance means so little, why are people complaining that the ATI cards used in the iMac and the MBP are just not enough for them, and that they would prefer a better graphics card? Of course, since you have no need for performance, I'm sure it's enough for you and your email and forum browsing, so what would it matter.
Originally posted by dh87
I don't know what the rationale is for switching a solid performer & seller, the iMac G5, precipitously.
Seems pretty obvious to me -- the iMac is a low-power environment (compared to the PowerMac), so the G5 they could put in it was constrained. Putting a Core Duo in there allows them to get better performance in most software (especially at the consumer level), it accelerates their Intel migration timetable, it's probably cheaper, and it puts a perceptual stake in the ground for their target market.
The PowerMac will be the last to go because peak performance on an unlimited power/heat budget is where the G5 excels, especially in media-oriented apps, and that is what the pro desktop machine needs. Intel's new generation of top-performance chips isn't available until closer to year end, and then they'll be able to demonstrate a clear improvement over a 2.5 GHz dual dual-core G5. Plus, the Universal Binaries for the big pro apps weren't going to arrive until later this year (some have already, others are pending).
Originally posted by Programmer
Seems pretty obvious to me -- the iMac is a low-power environment (compared to the PowerMac), so the G5 they could put in it was constrained. Putting a Core Duo in there allows them to get better performance in most software (especially at the consumer level), it accelerates their Intel migration timetable, it's probably cheaper, and it puts a perceptual stake in the ground for their target market.
The PowerMac will be the last to go because peak performance on an unlimited power/heat budget is where the G5 excels, especially in media-oriented apps, and that is what the pro desktop machine needs. Intel's new generation of top-performance chips isn't available until closer to year end, and then they'll be able to demonstrate a clear improvement over a 2.5 GHz dual dual-core G5. Plus, the Universal Binaries for the big pro apps weren't going to arrive until later this year (some have already, others are pending).
I sat through a presentation last week where an Apple employee said that he ran a test on his dual G5 that took two hours (Handbrake re-encoding a video file from DVD), and the 20" Intel iMac did the same task in 30 minutes. That's right, one fourth of the time of a dual G5!
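For what it's worth, the "one fourth" figure in the post above is easy to sanity-check. A minimal Python sketch, using only the two encode times quoted in the anecdote (they are secondhand numbers, not independent measurements):

```python
# Sanity check of the quoted Handbrake comparison: a dual G5 taking
# two hours vs. the 20" Intel iMac taking 30 minutes on the same encode.
g5_minutes = 2 * 60     # dual G5 encode time, in minutes
imac_minutes = 30       # 20" Intel iMac encode time, in minutes

speedup = g5_minutes / imac_minutes
print(speedup)  # 4.0, i.e. the iMac finishes in one fourth of the G5's time
```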
I believe that the reason for the "Mac Pro" (tower) timing is simply that Intel will not be releasing the chip intended for it until shortly before the expected release date of the "Mac Pro".
Apple may be able to cut the lead time between the release of the CPUs and product shipping by building the units without CPUs and installing the CPUs at the last minute (when they become available). If Apple chooses to do things this way, it would indicate a high degree of confidence, based upon testing of preproduction chips, that Intel will be able to ship the CPU "as advertised". Otherwise the risk of having to rework machines already produced because of a change in the final production CPU would be too great.
It will be interesting to see how this develops as it will indicate the state of the emerging relationship with Intel.
Originally posted by onlooker
Actually, no, it hasn't. Why don't you back that up with a link to a side-by-side benchmark of an ATI card vs. an Nvidia SDI series card to prove it?
Also, what is the biggest video editing system in movies? Avid? I know they used it on Star Wars, so I imagine it's used at ILM. For some reason Adobe recommends the SDI series for use with their video products. Go figure.
If performance means so little, why are people complaining that the ATI cards used in the iMac and the MBP are just not enough for them, and that they would prefer a better graphics card? Of course, since you have no need for performance, I'm sure it's enough for you and your email and forum browsing, so what would it matter.
People's complaints mean nothing. If Apple got a better deal on an Nvidia, they would have used that, and people would still be complaining.
Go to any of the tech sites yourself that have tested these cards with video, and you will see that all of them agree that ATI's video output is far superior.
Originally posted by RBR
I sat through a presentation last week where an Apple employee said that he ran a test on his dual G5 that took two hours (Handbrake re-encoding a video file from DVD), and the 20" Intel iMac did the same task in 30 minutes. That's right, one fourth of the time of a dual G5!
I believe that the reason for the "Mac Pro" (tower) timing is simply that Intel will not be releasing the chip intended for it until shortly before the expected release date of the "Mac Pro".
Apple may be able to cut the lead time between the release of the CPUs and product shipping by building the units without CPUs and installing the CPUs at the last minute (when they become available). If Apple chooses to do things this way, it would indicate a high degree of confidence, based upon testing of preproduction chips, that Intel will be able to ship the CPU "as advertised". Otherwise the risk of having to rework machines already produced because of a change in the final production CPU would be too great.
It will be interesting to see how this develops as it will indicate the state of the emerging relationship with Intel.
Just remember this:
Apple was the first company to get the Yonah in shipping quantities. Several PC manufacturers complained about that.
Intel is pushing the Conroe, Merom, and Woodcrest forward to the 3rd quarter. Apple pushed its developer conference back to August, in the 3rd quarter.
Why did Apple do that?
With the knowledge we now have about the delivery schedule of these chips, it's a reasonable assumption that it is not a coincidence.
Also remember that, compared to the rest of the industry, Apple's sales are minuscule (hopefully not forever, but that's a different argument). That means that as the production ramp-up for these chips proceeds, there would be enough chips for Apple to sell, even while the big vendors might pass.
Going back to the days of the clones, Apple was hit with this very same phenomenon. The smaller vendors always got the fastest chips a couple of months ahead of Apple, because of the ramp-up.
So, we might see all of those lines of chips being used.
AND, if Apple does choose to come out with some mid-price machine, it might be with a Conroe. That machine would have better cooling capacity than the iMac, but would cost less than a tower. It would have to compete against all of the other mid-price machines using Conroes. I'm not saying that they will do it, of course, but if they do...
Originally posted by dh87
If this argument about G5 improvements is correct--and it seems quite plausible to me--why did Apple switch the G5 iMac to Intel so soon? Switching the G4 computers first seems like a much better strategy (even though Apple said at their quarterly financial call that they're "thrilled" (or some such) with their current iBooks), followed by a synchronized transition of G5's to Intel later in the year. It seems to me that this would have maximized Mac sales.
Apple also said that it was seeing a pause in sales of PowerPC Macs... The iMac and the PowerBook were the two best-selling Macs out there, so if Apple hadn't changed their processors first (and people had suspended buying in anticipation of an Intel PowerBook), then the pause in Mac sales would have been even more pronounced. Also, take into account the fact that the iBook is targeted more toward the education and low-end consumer markets, and Apple indicated it doesn't see a whole bunch of education buying until around July.
My bet: the Intel iBooks are waiting on price cuts on Core Duos, coming in at some point during May.
Originally posted by Mr. Dirk
Apple also said that it was seeing a pause in sales of PowerPC Macs... The iMac and the PowerBook were the two best-selling Macs out there, so if Apple hadn't changed their processors first (and people had suspended buying in anticipation of an Intel PowerBook), then the pause in Mac sales would have been even more pronounced. Also, take into account the fact that the iBook is targeted more toward the education and low-end consumer markets, and Apple indicated it doesn't see a whole bunch of education buying until around July.
My bet: the Intel iBooks are waiting on price cuts on Core Duos, coming in at some point during May.
That's a good assumption. And, May is almost here.
Originally posted by melgross
AND, if Apple does choose to come out with some mid-price machine, it might be with a Conroe. That machine would have better cooling capacity than the iMac, but would cost less than a tower. It would have to compete against all of the other mid-price machines using Conroes. I'm not saying that they will do it, of course, but if they do...
I could be wrong, but I assume a Conroe should work fine in an iMac too.
[edit]
stupid me: said that already earlier in this thread... anyway
[/edit]
Originally posted by gar
I could be wrong, but I assume a Conroe should work fine in an iMac too.
[edit]
stupid me: said that already earlier in this thread... anyway
[/edit]
That's ok, I also said it earlier in this thread.
It's just that if Apple did come out with some middle machine, they would want to differentiate it more. And what better way than to use a different chip?
With IBM, there was no choice. Same with Freescale. You could either go fairly low with the G4, or high with the G5. Nothing in between. It really limited Apple's options. Many things are now open to them that weren't in the past.
Originally posted by melgross
People's complaints mean nothing. If Apple got a better deal on an Nvidia, they would have used that, and people would still be complaining.
Go to any of the tech sites yourself that have tested these cards with video, and you will see that all of them agree that ATI's video output is far superior.
You keep blathering on about this, but you cannot show me one shred of evidence.
Originally posted by Placebo
YOU PEOPLE ARE HAVING A PISSING CONTEST OVER "FACTS" THAT COULD BE CHANGED AT ANY MOMENT BY EITHER COMPANY RELEASING A NEW CARD. GIVE IT A REST.
Your PC keyboard's caps lock is broken.
Originally posted by kim kap sol
Your PC keyboard's caps lock is broken.
Oh. It's so funny.
Originally posted by onlooker
You keep blathering on about this, but you cannot show me one shred of evidence.
I was curious too, so I used Google.
http://www.extremetech.com/article2/...1916966,00.asp
Mmmm, Google.
Originally posted by Gene Clean
Oh. It's so funny.
And you're not.