
Intel unleashes Mac-bound "Woodcrest" server chip - Page 5

post #161 of 566
Quote:
Originally posted by Placebo
It's properly designed 2D apps that can benefit heftily from these cards. Have you ever played around in Core Image Funhouse with a 9800 or better graphics card? Compared to performing the same blurs and distortions in Photoshop or Fireworks, it's ridiculously fast. Literally as fast as you can drag the blur slider.

Ummm... PS blurs are as fast as I can drag the slider when the system has a gig of RAM or more, unless you are working on a super hi-res image (say 4MP+). What res were you using for your comparison? Same image at the same res in both?

I know CI is faster, but you are talking about a blur. I haven't seen PS choke on a blur since v6 if the image is less than about 1600x1200.
You can't quantify how much I don't care -- Bob Kevoian of the Bob and Tom Show.
post #162 of 566
This was just on a single 1.8GHz G5 with a 9800 Pro and 1.5GB of RAM. The point is, it was being done quite a lot quicker with GPU shaders than by CPU, even for the 2056 x 1600 images I used.
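For anyone curious what "as fast as you can drag the slider" looks like in code, here is a minimal sketch of a GPU-accelerated Gaussian blur through Core Image. It uses today's Swift API rather than the Objective-C of the Tiger era, and the input path is just a placeholder, but the idea is the same: the filter compiles down to a fragment program and runs on the graphics card instead of the CPU.

Code:
import Foundation
import CoreImage

// Minimal sketch: apply a Gaussian blur on the GPU via Core Image.
// (Modern Swift API shown purely for illustration; the file path is hypothetical.)
let url = URL(fileURLWithPath: "/tmp/sample.png")
guard let input = CIImage(contentsOf: url) else { fatalError("couldn't load image") }

let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(20.0, forKey: kCIInputRadiusKey)   // blur radius in pixels

let context = CIContext()                        // GPU-backed by default
let output = blur.outputImage!
let cgImage = context.createCGImage(output, from: input.extent)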
post #163 of 566
Quote:
Originally posted by Placebo
It's properly designed 2D apps that can benefit heftily from these cards. Have you ever played around in Core Image Funhouse with a 9800 or better graphics card? Compared to performing the same blurs and distortions in Photoshop or Fireworks, it's ridiculously fast. Literally as fast as you can drag the blur slider.

My understanding is that it's good for previews, but it's not good enough to commit for inclusion into the final output. I think that's why Final Cut and iMovie use it to preview but still require rendering through the CPU.
post #164 of 566
Quote:
Originally posted by JeffDM
My understanding is that it's good for previews, but it's not good enough to commit for inclusion into the final output. I think that's why Final Cut and iMovie use it to preview but still require rendering through the CPU.

Wrong.
post #165 of 566
Quote:
Originally posted by kim kap sol
Wrong.

If I am wrong, please do a better job explaining why I am wrong.
post #166 of 566
Quote:
Originally posted by JeffDM
If I am wrong, please do a better job explaining why I am wrong.

You're the one that's claiming "it's not good enough to commit for inclusion into the final output"...the burden of proof is on you.
post #167 of 566
Quote:
Originally posted by kim kap sol
You're the one that's claiming "it's not good enough to commit for inclusion into the final output"...the burden of proof is on you.

I'm not trying to prove anything; if I'm wrong, I'd like to know why. All I have is an audio interview with a Final Cut plug-in developer who said that the output is sometimes slightly unpredictable and not as good from a professional-quality standpoint. I don't even know how I can dig that up. I think it makes sense given how long it takes to render video when the preview is done in real time.
post #168 of 566
Quote:
Originally posted by doh123
I wish people would stop using rumored names to call machines. The name Mac Pro is stupid as hell, so let's stop calling it that until if/when it's official. There is no proof they will call it that yet. And for those that claim it's proven because of the trademark filing on "Mac Pro", well, that's just one of hundreds of names that have been filed by Apple and never used yet.

Ever think that they don't want other people using "Mac Pro" for reasons other than a computer name?


I vote for MacTower and MacTower Pro...


I just get annoyed when people keep using "Mac Pro" so much.

Mac Pro may not be the best name, but it's much better than your ideas, I'm sorry to say.
post #169 of 566
Quote:
Originally posted by burningwheel
Mac Pro may not be the best name but is much better than your ideas, i'm sorry to say

I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!
You can't quantify how much I don't care -- Bob Kevoian of the Bob and Tom Show.
post #170 of 566
Quote:
Originally posted by kim kap sol
Wrong.

He's actually right; some effects (represented by orange bars in the Final Cut Pro timeline) can be previewed but require final processing for full quality before output.
Quote:
Originally posted by JeffDM
If I am wrong, please do a better job explaining why I am wrong.

Hush, it's just that he hasn't deemed you worthy of a full response.
post #171 of 566
Quote:
Originally posted by a_greer
I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!



They have (well, think about) alien sex and fantasise more about 3D-generated attractive women than real women.
post #172 of 566
Quote:
Originally posted by doh123
I wish people would stop using rumored names to call machines. The name Mac Pro is stupid as hell, so let's stop calling it that until if/when it's official...I vote for MacTower and MacTower Pro...



Dude, unfortunately MacTower and MacTower Pro are the most ridiculous names I have heard suggested. Highly un-Apple-like. I say let's stick to Mac Pro until something better/official comes out, and let's stick to Conroe and Woodcrest until Intel's naming makes sense [Core 2 Duo/Extreme, Core Xeon, WTF, Core this, Core that].

edit:
Well, at least I will stick to Yonah, Merom, Conroe and Woodcrest. All the Intel Core designations make my brain hurt. And to me, Xeon is an ugly, dirty word. A Woodcrest Xserve sounds delicious, but "Xserve - now driven by Xeon" makes me sad.
post #173 of 566
Quote:
Originally posted by JeffDM
...All I have is an audio interview with a Final Cut plug-in developer who said that the output is sometimes slightly unpredictable and not as good from a professional-quality standpoint. I don't even know how I can dig that up. I think it makes sense given how long it takes to render video when the preview is done in real time.



Good enough for most of us. If KimKapSol wants more proof he can call up Apple or kidnap an Apple developer to interrogate*.

*Imagine KimKapSol with Apple developer in dark room, single table lamp shining at Apple developer strapped up to lie detector and electro-shock device.

"Are you an Apple developer?"
YES
[Ding!]

"Do you work on Final Cut and iMovie?"
YES
[Ding!]

"Do you own an Alienware PC"
OF COURSE NOT
BZZZTZTTTTTTT!!!
ARGHH! OKAY OKAY I DO OWN AN ALIENWARE
[Ding!]

"Am I a cool guy?"
UMMM... YES?
BZZZTZTTTTTTT!!!

post #174 of 566
Thread Starter 
Quote:
Originally posted by a_greer
I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!

Hey, I own an Alienware, dork, and it's an awesome machine. The reason I got it is that they offered all the stuff I needed when Apple didn't, and Apple still doesn't offer a lot of its features - yet.

BTW, I'm still hoping they call it PimpZilla!!!!
post #175 of 566
Quote:
Originally posted by kim kap sol
Wrong.

No, he was perfectly correct.
post #176 of 566
Hmm... I'm playing around with a 7500x7500-pixel image in Photoshop on PC. I'm taking screenshots of Google Maps tiles and making one big map. Zooming in and out and moving certain tiles around (each tile is just under 1280x1024), things are pretty fluid. Saving and loading is what takes time, as it writes the whole 24-bit-color file, sometimes with layers, to disk.

I'm wondering how much of the fluidity is the Athlon64 Venice 2.15ghz and how much is the nVidia 6600GT 128mb DDR3 VRAM.

We seem to have only scratched the surface in looking at Photoshop operations* and how the engineering on PC and Mac weights CPU vs. GPU.

*Obviously operations and previews are CPU-driven; the GPU's role is more in displaying the image and panning around a high-res file.
post #177 of 566
Quote:
Originally posted by sunilraman
I'm wondering how much of the fluidity is the Athlon64 Venice 2.15ghz and how much is the nVidia 6600GT 128mb DDR3 VRAM.

The graphics card has virtually no effect.

Quote:
*Obviously operations and previews are CPU-driven, GPU roles are more on the display of the image and panning around a high-res file.

No, not even that. The GPU just gets a bunch of lines and blits them on the screen. It doesn't do any of the handling.
post #178 of 566
Thread Starter 
Quote:
Originally posted by sunilraman
Hmm... I'm playing around with a 7500x7500-pixel image in Photoshop on PC. I'm taking screenshots of Google Maps tiles and making one big map. Zooming in and out and moving certain tiles around (each tile is just under 1280x1024), things are pretty fluid. Saving and loading is what takes time, as it writes the whole 24-bit-color file, sometimes with layers, to disk.

I'm wondering how much of the fluidity is the Athlon64 Venice 2.15ghz and how much is the nVidia 6600GT 128mb DDR3 VRAM.

We seem to have only scratched the surface in looking at Photoshop operations* and how the engineering on PC and Mac weights CPU vs. GPU.

*Obviously operations and previews are CPU-driven; the GPU's role is more in displaying the image and panning around a high-res file.

All your computations are in 2 dimensions, maybe even one. I don't think much of it calls the GPU, if any of it does.
post #179 of 566
Quote:
Originally posted by onlooker
All your computations are in 2 dimensions, maybe even one

hmmmmm...
A PSD with zero height?
alles sal reg kom
post #180 of 566
That's kind of like "if a tree falls in the forest and nobody hears it, did it really fall?"... "if a Photoshop file is made with zero height, does it really exist?" - remember, zero height, not 1 pixel high. Actually, gar - a 1D PSD is 1 pixel high and x pixels wide.
post #181 of 566
gar and sunilraman may want to read more closely:

Quote:
Originally posted by onlooker
All your computations are in 2 dimensions, maybe even one.

That refers to computations applied to image data, not to the images themselves.
post #182 of 566
You still haven't answered my question: "if a photoshop file is made with zero height does it really exist?"
post #183 of 566
Quote:
Originally posted by Placebo
It's properly designed 2D apps that can benefit heftily from these cards. Have you ever played around in Core Image Funhouse with a 9800 or better graphics card? Compared to performing the same blurs and distortions in Photoshop or Fireworks, it's ridiculously fast. Literally as fast as you can drag the blur slider.

Sadly, Adobe don't use Core Image so your point is moot for most graphic design users. I suspect Adobe still won't use Core Image in CS3 either.

I'm also not sure if Core Image is accurate enough for actual print or high res design work too but I've not dug about in the developer docs to see if it supports the colourspaces and sizes used in traditional design.

As far as I've been aware of it, it's a neat trick for previews and effects on screen or even for just flinging pixels about when you really aren't that bothered about accuracy. How you mix in that creative freedom with the often constrained structures of a design grid and pantone swatches, I don't know.
post #184 of 566
Quote:
Originally posted by aegisdesign
...As far as I've been aware of it, [Core Image is] a neat trick for previews and effects on screen or even for just flinging pixels about when you really aren't that bothered about accuracy. How you mix in that creative freedom with the often constrained structures of a design grid and pantone swatches, I don't know...



I think in terms of contribution to computer graphics in general Core Image is a worthwhile piece of R&D out of Cupertino.

I'm not into 3D at the moment but 3D graphics faces a similar situation: How much of the 3D done in-GPU is actually used in the final render passes? Possibly not much, but as 3D cards approach photorealism levels the "hand-off" of 3D renders to the CPU will become more streamlined.

Maybe one day we'll see a similar thing for 2D graphics. Suffice to say that if Core Image-type abilities become more accurate, moving towards a full non-destructive, GPU-accelerated, node-based workflow in a Photoshop-like application would be a massive step for graphic artists.
post #185 of 566
Quote:
Originally posted by JeffDM
I'm not trying to prove anything; if I'm wrong, I'd like to know why. All I have is an audio interview with a Final Cut plug-in developer who said that the output is sometimes slightly unpredictable and not as good from a professional-quality standpoint. I don't even know how I can dig that up. I think it makes sense given how long it takes to render video when the preview is done in real time.

This isn't about the CPU rendering the final output...I wanna know who told you the 'output' from the GPU is 'unpredictable'.
post #186 of 566
Quote:
Originally posted by kim kap sol
This isn't about the CPU rendering the final output...I wanna know who told you the 'output' from the GPU is 'unpredictable'.

The GPU doesn't use high precision.
post #187 of 566
Quote:
Originally posted by Chucker
The GPU doesn't use high precision.

It doesn't? What makes you say that?


Could you imagine if it were WYSINWYG in Aperture?
post #188 of 566
Quote:
Originally posted by kim kap sol
Could you imagine if it were WYSINWYG in Aperture?

The quality is more than good enough for screen display.
post #189 of 566
Quote:
Originally posted by Chucker
The quality is more than good enough for screen display.

What's high precision to you?
post #190 of 566
24-bit? 32-bit? 64-bit?

Almost all modern GPUs are 32-bit. Is that not good enough for print? It's definitely good enough for video.
post #191 of 566
Actually, I think I'm totally off base with Core Image and its suitability for use in Photoshop.

http://developer.apple.com/macosx/coreimage.html makes its intended purpose and accuracy pretty clear. How Core Image renders the image on screen, however, might be an issue if the GPU isn't as accurate as doing it longhand with the CPU, but that wouldn't stop Adobe, say, from using CI for doing the hard work; then it's just up to Apple to get its CI code right.

I wonder, though, if Adobe would junk all its filter and effect code, not to mention its layers code, on the Mac to use CI when there's nothing like that on Windows (at least not before Vista). It would make Photoshop an absolute screamer of a product on the Mac.
post #192 of 566
Quote:
Originally posted by aegisdesign
Actually, I think I'm totally off base with Core Image and its suitability for use in Photoshop.

http://developer.apple.com/macosx/coreimage.html makes its intended purpose and accuracy pretty clear. How Core Image renders the image on screen, however, might be an issue if the GPU isn't as accurate as doing it longhand with the CPU ...

...but it is as accurate. I dunno if that was what you were implying.

Quote:
Core Image changes the game. Developers can now easily create real-time capable image processing solutions that automatically take full advantage of the latest hardware without worrying about future architectural changes. Even better, Core Image can perform its processing using 32-bit floating point math. This means that you can work with high bit-depth images and perform multiple image processing steps with no loss of accuracy.

Quote:
Because Core Image uses 32-bit floating point numbers instead of fixed scalars, it can handle over 10^25 colors. Each pixel is specified by a set of four floating point numbers, one each for the red, green, blue, and alpha components of a pixel. This color space is far greater than the human eye can perceive. This level of precision in specifying color components allows image fidelity to be preserved even through a large number of processing steps without truncation.

Quote:
As you have seen, Core Image changes the game of image processing. It gives application developers the ability to create applications that can fully utilize the performance and capabilities of modern graphics hardware. It allows for manipulation of deep bit images with incredible accuracy and color fidelity. And finally, Image Units defines a new way to share image processing capabilities between applications and paves the road for a marketplace of plug-ins that can be used by any image processing application on the system that supports Core Image.
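For what it's worth, the figures in those quotes are easy to sanity-check. A rough sketch follows (assumption: simply counting raw 32-bit patterns per channel, ignoring NaNs and denormals), along with the memory cost that kind of per-pixel precision implies:

Code:
import Foundation

// Rough arithmetic behind "over 10^25 colors": three 32-bit float channels
// give on the order of 2^96 distinct RGB triples (~7.9e28), well past 10^25.
let distinctColors = pow(2.0, 32.0 * 3.0)
print(String(format: "%.1e distinct RGB triples", distinctColors))

// The cost: 4 float32 components per pixel = 16 bytes, so a 2056 x 1600
// working image is roughly 50 MB before you even start stacking filters.
let bytes = 2056 * 1600 * 16
print("\(bytes / 1_048_576) MB for one 2056x1600 RGBA-float buffer")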
post #193 of 566
Thread Starter 
Quote:
Originally posted by aegisdesign
Sadly, Adobe don't use Core Image so your point is moot for most graphic design users. I suspect Adobe still won't use Core Image in CS3 either.

I'm also not sure if Core Image is accurate enough for actual print or high res design work too but I've not dug about in the developer docs to see if it supports the colourspaces and sizes used in traditional design.

As far as I've been aware of it, it's a neat trick for previews and effects on screen or even for just flinging pixels about when you really aren't that bothered about accuracy. How you mix in that creative freedom with the often constrained structures of a design grid and pantone swatches, I don't know.


Sadly, Adobe didn't use AltiVec until they announced it. It could happen with Core Image too. I think it could be beneficial to their users, which is why Adobe used AltiVec in the first place.
post #194 of 566
Quote:
Originally posted by kim kap sol
...but it is as accurate. I dunno if that was what you were implying.

I was stating that Core Image is accurate enough. The GPU isn't necessarily accurate enough.

At the moment, the responsibility for accuracy lies entirely with Adobe, and presumably they like it that way, as it's their reputation on the line. If they used CI, the responsibility would lie with Apple and the card manufacturer.

Quote:
Originally posted by onlooker
Sadly Adobe didn't use Altivec until they announced it. It could happen. I think it could be beneficial to their users. Which is why Adobe used Altivec.

It could absolutely ROCK if Adobe used it, as it would totally SCREAM by comparison with CS2 or the Windows version, at least pre-Vista, which IIRC has something similar to Core Image. It's not as simple a change as dropping in a new rendering engine, though, like they did with AltiVec and the G5. They'd have to code up Image Units for each of their layer/filter/effect tools (see the sketch after this post). Most designers also rely on an army of add-on effects like Alien Skin too.

Maybe that's why it's taking so long for a Universal Binary. Maybe Adobe just decided that missing out on CI or whatever they have on Windows Vista would be stupid.
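To give a feel for the "code up Image Units" work mentioned above, here is a minimal, purely illustrative Core Image kernel in Swift: a trivial colour invert, not anything Adobe actually ships, wrapped as a CIFilter subclass. Every Photoshop filter and blend mode would need something along these lines.

Code:
import CoreImage

// Hypothetical example filter: a colour invert written in Core Image Kernel
// Language and exposed as a CIFilter. Real Image Units bundle kernels like
// this (plus metadata) so any Core Image host app can load them.
class InvertFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    private static let kernel = CIColorKernel(source:
        "kernel vec4 invertColor(__sample s) { return vec4(vec3(1.0) - s.rgb, s.a); }"
    )!

    override var outputImage: CIImage? {
        guard let input = inputImage else { return nil }
        return InvertFilter.kernel.apply(extent: input.extent, arguments: [input])
    }
}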
post #195 of 566
Quote:
Originally posted by aegisdesign
I was stating that Core Image is accurate enough. The GPU isn't necessarily accurate enough.

OK... that's a given. Limitations come from the hardware. I haven't seen anything other than a 32-bit-precision GPU in three years, though.
post #196 of 566
Quote:
Originally posted by aegisdesign
Maybe that's why it's taking so long for a Universal Binary. Maybe Adobe just decided that missing out on CI or whatever they have on Windows Vista would be stupid.

"Taking long"? It's not taking long. It's a perfectly normal Adobe CS release cycle. It just so happened that, because they're money-seeking pricks, they refuse to release the current(!) version as Universal Binary, unlike, oh, pretty much anyone else.
post #197 of 566
Thread Starter 
Quote:
Originally posted by Chucker
"Taking long"? It's not taking long. It's a perfectly normal Adobe CS release cycle. It just so happened that, because they're money-seeking pricks, they refuse to release the current(!) version as Universal Binary, unlike, oh, pretty much anyone else.

I don't think it's even been ported to Mach-O, and I think it's taking so long because they didn't move it to Mach-O when Apple advised it over a year (almost two) ago.
post #198 of 566
Quote:
Originally posted by aegisdesign
I was stating that Core Image is accurate enough. The GPU isn't necessarily accurate enough.



We seem to be confused on this point. How is Core Image implemented? How is the GPU "not" accurate? Looks like we need to explore this more.

The fact that iMovie and Final Cut still require CPU renders may mean the Core Image filters are coded to be fast but less accurate.

Does this mean that Core Image can still produce accurate-enough on-the-fly GPU-driven renders if coded to do so???

Just because some Core Image filters are coded to be not-so-accurate does not mean Core Image does not have the potential to be accurate, I think.
post #199 of 566
Well, they do use bundles, which is usually a good indication of Mach-O (I'm not sure CFM in a bundle is even possible). However, IIRC, they still use CodeWarrior, despite it having been clear for years now that CodeWarrior on Mac OS isn't going to see much of a revival. They should have listened to Apple about moving to Xcode. They decided not to. They keep claiming Xcode isn't mature enough, but you can easily tell how much bullshit that is by the fact that Apple, Omni and many other software companies have been able to ship insanely complex software with it quite well. I have no respect for that excuse.
post #200 of 566
Quote:
Originally posted by sunilraman
We seem to be confused by this point. How is Core Image implemented? How is the GPU "not" being accurate. Looks like we need to explore this point more.

The basic point is that 3D apps usually use the GPU for fast preview rendering, using OpenGL acceleration, but then use the CPU for precise final renders.

Core Image uses OpenGL (relying on ARB_fragment_program, for instance). Thus, it seems reasonable to assume that Core Image suffers the same precision problem.
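If it helps make the preview-versus-final-render distinction concrete, Core Image does let you pick the backend per context. A small sketch (current Swift API, shown only to illustrate the split being discussed): the default context renders through the GPU, while the software-renderer option forces Core Image's CPU path, which is the kind of fallback a final pass could use.

Code:
import CoreImage

// Same filter chain, two backends: GPU for interactive previews,
// Core Image's software renderer (CPU) for a final pass.
let gpuContext = CIContext()                                        // hardware path
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])   // CPU path

func render(_ image: CIImage, with context: CIContext) -> CGImage? {
    return context.createCGImage(image, from: image.extent)
}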