Comments
Originally posted by Placebo
This was just on a single 1.8GHz G5 with a 9800 Pro and 1.5GB of RAM. The point is, it was being done quite a lot quicker with GPU shaders than by the CPU, even for the 2056 x 1600 images I used.
It's properly designed 2D apps that can benefit heftily from these cards. Have you ever played around in Core Image Fun House with a 9800 or better graphics card? Compared to performing the same blurs and distortions in Photoshop or Fireworks, it's ridiculously fast - literally as fast as you can drag the blur slider.
My understanding is that it's good for previews, but it's not good enough to commit for inclusion into the final output. I think that's why Final Cut and iMovie use it to preview but still require rendering through the CPU.
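For anyone curious what that split looks like in code, here is a minimal sketch in Swift using the public Core Image API, with a made-up input path. The same blur graph can be executed by a GPU-backed CIContext (the Fun House live-preview case) or forced through Core Image's software renderer (the "render through the CPU" case). This is not Apple's actual Fun House or Final Cut code, just an illustration of the two paths.

```swift
import CoreImage
import Foundation

// Hypothetical input image for the sketch.
let url = URL(fileURLWithPath: "/tmp/photo.png")
guard let input = CIImage(contentsOf: url) else { fatalError("no image") }

// Build the blur filter; dragging Fun House's slider just changes inputRadius.
let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(10.0, forKey: kCIInputRadiusKey)
let blurred = blur.outputImage!.cropped(to: input.extent)

// GPU-backed context: the fast path used for live previews.
let gpuContext = CIContext()

// Software renderer: Core Image's "render through the CPU" option.
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])

// Same image graph, two different executors.
let preview = gpuContext.createCGImage(blurred, from: input.extent)
let final   = cpuContext.createCGImage(blurred, from: input.extent)
```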
Originally posted by JeffDM
My understanding is that it's good for previews, but it's not good enough to commit for inclusion into the final output. I think that's why Final Cut and iMovie use it to preview but still require rendering through the CPU.
Wrong.
Originally posted by kim kap sol
Wrong.
If I am wrong, please do a better job explaining why I am wrong.
Originally posted by JeffDM
If I am wrong, please do a better job explaining why I am wrong.
You're the one that's claiming "it's not good enough to commit for inclusion into the final output"...the burden of proof is on you.
Originally posted by kim kap sol
You're the one that's claiming "it's not good enough to commit for inclusion into the final output"...the burden of proof is on you.
I'm not trying to prove anything; if I'm wrong, I'd like to know why. All I have is an audio interview with a Final Cut plug-in developer who said that the output is sometimes slightly unpredictable and not as good from a professional quality standpoint. I don't even know how I could dig that up. I think it makes sense given how long it takes to render video when the preview is done in real time.
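The "slightly unpredictable" part is at least easy to probe with the same public API: render one filter through a GPU context and through the CPU software renderer, then count the bytes that differ. GPUs round floating-point math differently from the CPU path, so small per-pixel deltas are plausible. A hedged sketch with a made-up input frame, and no claim that this matches Final Cut's real pipeline:

```swift
import CoreImage
import Foundation

// Render an image into an RGBA8 byte buffer with the given context.
func renderBytes(_ image: CIImage, using context: CIContext) -> [UInt8] {
    let r = image.extent
    var buf = [UInt8](repeating: 0, count: Int(r.width) * Int(r.height) * 4)
    context.render(image, toBitmap: &buf, rowBytes: Int(r.width) * 4,
                   bounds: r, format: .RGBA8, colorSpace: nil)
    return buf
}

let frame = CIImage(contentsOf: URL(fileURLWithPath: "/tmp/frame.png"))!  // hypothetical
let sepia = CIFilter(name: "CISepiaTone")!
sepia.setValue(frame, forKey: kCIInputImageKey)
let out = sepia.outputImage!.cropped(to: frame.extent)

let gpuBytes = renderBytes(out, using: CIContext())
let cpuBytes = renderBytes(out, using: CIContext(options: [.useSoftwareRenderer: true]))

// Any nonzero count is the GPU and CPU disagreeing about the "same" pixels.
let mismatches = zip(gpuBytes, cpuBytes).filter { $0 != $1 }.count
print("bytes differing between GPU and CPU renders: \(mismatches)")
```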
Originally posted by doh123
I wish people would stop using rumored names for these machines. The name Mac Pro is stupid as hell, so let's stop calling it that unless and until it's official. There is no proof they will call it that yet. And for those who claim the trademark filing on "Mac Pro" is proof, well, that's just one of hundreds of names Apple has filed and never used.
Ever think that they don't want other people using "Mac Pro" for reasons other than a computer name?
I vote for MacTower and MacTower Pro...
I just get annoyed when people keep using "Mac Pro" so much.
Mac Pro may not be the best name, but it's much better than your ideas, I'm sorry to say.
Originally posted by burningwheel
Mac Pro may not be the best name, but it's much better than your ideas, I'm sorry to say.
I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!
Originally posted by kim kap sol
Wrong.
He's actually right; some effects (represented by orange bars in the Final Cut Pro timeline) can be previewed, but they require final processing for full quality before output.
Originally posted by JeffDM
If I am wrong, please do a better job explaining why I am wrong.
Hush, it's just that he hasn't deemed you worthy of a full response.
I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!
They have (well, think about) alien sex, and fantasise more about 3D-generated attractive women than real women.
I wish people would stop using rumored names for these machines. The name Mac Pro is stupid as hell, so let's stop calling it that unless and until it's official... I vote for MacTower and MacTower Pro...
Dude, unfortunately MacTower and MacTower Pro are the most ridiculous names I have heard suggested. Highly un-Apple-like. I say let's stick to Mac Pro until something better/official comes out, and let's stick to Conroe and Woodcrest until Intel's naming makes sense [Core 2 Duo/Extreme, Core Xeon, WTF, Core this, Core that].
edit:
Well, at least I will stick to Yonah, Merom, Conroe and Woodcrest. All the Intel Core designations make my brain hurt. And to me, Xeon is an ugly, dirty word. A Woodcrest Xserve sounds delicious, but "Xserve - now driven by Xeon" makes me sad.
...All I have is an audio interview with a Final Cut plug-in developer who said that the output is sometimes slightly unpredictable and not as good from a professional quality standpoint. I don't even know how I could dig that up. I think it makes sense given how long it takes to render video when the preview is done in real time.
Good enough for most of us. If KimKapSol wants more proof, he can call up Apple or kidnap an Apple developer to interrogate*.
*Imagine KimKapSol with Apple developer in dark room, single table lamp shining at Apple developer strapped up to lie detector and electro-shock device.
"Are you an Apple developer?"
YES
[Ding!]
"Do you work on Final Cut and iMovie?"
YES
[Ding!]
"Do you own an Alienware PC"
OF COURSE NOT
BZZZTZTTTTTTT!!!
ARGHH! OKAY OKAY I DO OWN AN ALIENWARE
[Ding!]
"Am I a cool guy?"
UMMM... YES?
BZZZTZTTTTTTT!!!
Originally posted by a_greer
I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!
Hey, I own an Alienware, dork, and it's an awesome machine. The reason I got it is that it offered all the stuff I needed when Apple didn't, and Apple still doesn't offer a lot of its features - yet.
BTW, I'm still hoping they call it PimpZilla!!!!
Originally posted by kim kap sol
Wrong.
No, he was perfectly correct.
Hmm... I'm playing around with a 7500x7500-pixel image in Photoshop on a PC. I'm taking screenshots of Google Maps tiles and making one big map. Zooming in and out and moving certain tiles around (each tile is just under 1280x1024), things are pretty fluid. Saving and loading is what takes time, as Photoshop writes the whole 24-bit-color file, sometimes with layers, to disk.
I'm wondering how much of the fluidity is the Athlon 64 Venice at 2.15GHz and how much is the nVidia 6600GT with 128MB of DDR3 VRAM.
We seem to have only touched the surface in looking at Photoshop operations* and the engineering on PC and Mac, on the weighting of CPU vs. GPU.
*Obviously operations and previews are CPU-driven; GPU roles are more in the display of the image and panning around a high-res file.
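The stitching being described comes down to very ordinary CPU-side drawing. A sketch of the same tile-into-big-map composite using CoreGraphics, with invented filenames and a 6x6 grid of 1250-pixel tiles so the math lands on 7500x7500; none of this touches the GPU, which is consistent with the replies below.

```swift
import CoreGraphics
import ImageIO
import Foundation

// Build one big 7500x7500 CPU-side bitmap out of a 6x6 grid of tiles.
// Tile size and filenames are invented for the sketch.
let tiles = 6
let tileSide = 1250                       // 6 * 1250 = 7500
let canvas = CGContext(data: nil,
                       width: tiles * tileSide, height: tiles * tileSide,
                       bitsPerComponent: 8, bytesPerRow: 0,
                       space: CGColorSpaceCreateDeviceRGB(),
                       bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!

for row in 0..<tiles {
    for col in 0..<tiles {
        let url = URL(fileURLWithPath: "/tmp/tile_\(row)_\(col).png") as CFURL
        guard let src = CGImageSourceCreateWithURL(url, nil),
              let img = CGImageSourceCreateImageAtIndex(src, 0, nil) else { continue }
        canvas.draw(img, in: CGRect(x: col * tileSide, y: row * tileSide,
                                    width: tileSide, height: tileSide))
    }
}
let bigMap = canvas.makeImage()           // the stitched 7500x7500 map
```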
Originally posted by sunilraman
I'm wondering how much of the fluidity is the Athlon 64 Venice at 2.15GHz and how much is the nVidia 6600GT with 128MB of DDR3 VRAM.
The graphics card has virtually no effect.
*Obviously operations and previews are CPU-driven; GPU roles are more in the display of the image and panning around a high-res file.
No, not even that. The GPU just gets a bunch of lines and blits them on the screen. It doesn't do any of the handling.
Originally posted by sunilraman
Hmm... I'm playing around with a 7500x7500-pixel image in Photoshop on a PC. I'm taking screenshots of Google Maps tiles and making one big map. Zooming in and out and moving certain tiles around (each tile is just under 1280x1024), things are pretty fluid. Saving and loading is what takes time, as Photoshop writes the whole 24-bit-color file, sometimes with layers, to disk.
I'm wondering how much of the fluidity is the Athlon 64 Venice at 2.15GHz and how much is the nVidia 6600GT with 128MB of DDR3 VRAM.
We seem to have only touched the surface in looking at Photoshop operations* and the engineering on PC and Mac, on the weighting of CPU vs. GPU.
*Obviously operations and previews are CPU-driven; GPU roles are more in the display of the image and panning around a high-res file.
All your computations are in 2 dimensions, maybe even one. I don't think much of it calls on the GPU, if any of it does.
Originally posted by onlooker
All your computations are in 2 dimensions, maybe even one
hmmmmm...
A PSD with zero height?
That's kind of like "if a tree falls in the forest and nobody hears it, did it really fall?"... "if a Photoshop file is made with zero height, does it really exist?" - remember, zero height, not 1 pixel high. Actually, gah - a 1D PSD is 1 pixel high and x pixels wide.
Originally posted by onlooker
All your computations are in 2 dimensions, maybe even one.
That refers to computations applied to image data, not to the images themselves.