
Apple's Intel Aperture 1.1 Update pushed back - Page 3

post #81 of 112
Quote:
Originally posted by bikertwin
I suppose AppleScript dictionary complexity might give a hint as to relative complexity, but Apple's pro apps--other than Aperture--aren't scriptable. (See, melgross, I'm no Apple apologist.)

post #82 of 112
Quote:
Originally posted by ecking
How come people always say stuff like that? For some reason everyone seems to be OK with a pro user using a MBP but not an iMac.

They're the same machine.
Same RAM lock-in.
Same video card lock-in.

An iMac is actually better than a MBP because of the 3.5" HD.

Plenty of pros do their rough cuts on a PowerBook and finish later on something better; plenty of pros found PowerBooks good enough for Aperture live capture on shoots. Plenty of pros used PowerBooks to run flexible Logic or Pro Tools setups.

Right, but a notebook is an acceptable compromise. You're accepting the lock-in because (a) it's portable, and (b) it's not your primary machine.

Do you typically see rooms full of iMacs in professional settings?
post #83 of 112
Quote:
Originally posted by melgross
The problem with your assertions is that you make the mistake of "blaming", rather than thinking about what it takes to produce the software packages.

Again, you disregard what I say, and just go on as though I've said nothing.
It can get frustrating to constantly reply, when you pretend that I've said things that I didn't say, and that I didn't say things that I did.

I will repeat this for the last time.

I NEVER said that Shake should be out now. I never said that I EXPECTED Shake to be out now.

What you said is, "I just think that it's hypercritical (yes hyper) to chastise Adobe, considering all of the work that involves, while giving Apple a free ride."

Please correct me if I'm wrong. You're saying that Adobe has spent as much effort as Apple in producing UBs. What I'm saying is that, since FCSuite is easily as complex as Creative Suite, and since FCSuite is already out as a UB while Creative Suite will never be a UB in its current version, Apple has clearly expended a lot more effort for its customers.

If I understand you correctly, you've used the absence of Shake as an example of Apple not working any harder than Adobe. I and others have said that the lack of Shake is simply a prioritization by Apple, since there's no high-powered hardware to run it, and thus Apple has expended all its UB effort on its consumer (iLife, iWork) and other Pro apps (Aperture, FCSuite, etc.).

Do I understand you correctly?
post #84 of 112
Quote:
Originally posted by bikertwin
What you said is, "I just think that it's hypercritical (yes hyper) to chastise Adobe, considering all of the work that involves, while giving Apple a free ride."

Please correct me if I'm wrong. You're saying that Adobe has spent as much effort as Apple in producing UBs. What I'm saying is that, since FCSuite is easily as complex as Creative Suite, and since FCSuite is already out as a UB while Creative Suite will never be a UB in its current version, Apple has clearly expended a lot more effort for its customers.

If I understand you correctly, you've used the absence of Shake as an example of Apple not working any harder than Adobe. I and others have said that the lack of Shake is simply a prioritization by Apple, since there's no high-powered hardware to run it, and thus Apple has expended all its UB effort on its consumer (iLife, iWork) and other Pro apps (Aperture, FCSuite, etc.).

Do I understand you correctly?

Not quite. But closer.

I do think that Adobe has spent as much time as is possible on CS3 and Universalization. I don't know what "as much effort" could be, since they are different companies. But I do believe that Adobe is doing their damnedest to get this out. Don't forget that if they hadn't bought Macromedia when it came up for sale, then no doubt we would have had the suite sooner.

I don't agree that FCP Suite is as complex as CS3 will be. Apple surely hasn't expended any more effort for its customers than the owner of the OS, and of the machines it and its software run on, should. Apple MUST be in the lead here. If not, then other companies will point to Apple and say: if they aren't rushing, maybe they know something. Therefore, why should we?

If you want to point to the fact that video editing is as difficult as, or more difficult than, working with PS, as some have said (I really don't remember if you said that as well, so bear with me here), then perhaps Apple should also not have wasted time on the older programs, and done what Adobe is doing: skipped straight to getting the new ones out the door sooner. After all, it's the same team. If we don't see FCP Studio 6 by NAB this month, we will know why! As I use it, I would rather wait longer for a new version than get the old one universalized, and then have to wait even LONGER to get the new version. That goes for FCP as well as Shake.

Do you really believe that when Apple said, in June, "Gentlemen (and ladies), start your engines!", they hadn't already been working on their own programs since at least the previous January? Perhaps they were working on them from a year before. I wouldn't be surprised, would you?

Don't forget that Apple said that the first machines would be out "by June". And most people, including most here, thought that the professional machines would not be among the first. In fact, the MBP wasn't expected until the Merom, in the third quarter! Remember how many people said that Apple would NEVER release it with the Yonah? The "Power Macs" weren't expected until the end of 2007, well after CS3 would have come out. Do you have any evidence that Apple told Adobe that they were moving the schedule up by a full year?

People were complaining that FCP Suite wasn't ready during the January Macworld. We saw that here. Do you think that's fair? I don't.

Again, I don't think that either of these companies is doing anything other than the very best that they can do to get their works out the door.

I think we should leave it at that.
post #85 of 112
Originally posted by melgross
Again, I don't think that either of these companies is doing anything other than the very best that they can do to get their works out the door....I think we should leave it at that.



The next question that naturally follows is: how will sales and profits of [hardware and software] for Apple and [software] for Adobe be affected? One would imagine Apple has the iPod + related stuff + consumer-side stuff to stay on course (although its stock may be unfavoured for a while), while Adobe has a huge PC market to keep it chugging along.

I think through this very heated debate we have come to a better understanding of what is going on. It is probably true that some (most?) of us didn't read the blog links of the Adobe developers, and only when AppleInsider posted excerpts from them did we become aware of that side of things.

I'm not saying this just to please Melgross, but my initial vehemence toward Adobe has subsided, now that I better understand the issue. Like most of us said, there's not much point in "assigning blame", but clearly there were some areas both Apple and Adobe could have addressed better, in a perfect world, so that by now we'd have all apps as Universal Binaries.

I bring up the sales and profit issue for Apple (I assume Adobe|Macromedia is well covered by PC software sales) because it seems like this might kill a lot of the momentum that has been zooming along. Do you all think so? The flip side, as Melgross and others have discussed, is that the pro market operates on a slightly different buying pattern, so Apple may still be well covered sales- and profit-wise through 2006, and Rosetta is a strong enough proposition for pros to buy new machines through 2006.

Certainly, all in all, it's a complex situation, and I would not envy Apple management navigating this Intel transition. AAPL stock may hover at $60 or so until analysts see a bit more "positive" (whatever that may mean) or "stronger" pushing forward of the Intel transition.

Phew. Thanks for your time
post #86 of 112
Quote:
Originally posted by bikertwin
Right, but a notebook is an acceptable compromise. You're accepting the lock-in because (a) it's portable, and (b) it's not your primary machine.

Do you typically see rooms full of iMacs in professional settings?

No I don't, because professional settings don't follow the latest hardware trends and dump their machines multiple times a year to stay up to date.

They buy what they need and keep it as long as it's useful, usually years. So any pros that need stuff now can and will buy the stuff available now.

That's why Apple still has PPC stuff. If I were a post house that needed stuff now, I'd get Quads etc., not hold out and complain until all the stuff I want is ported.

Why?

Because I won't get new computers for like 3 years anyway, and my PPC won't be obsolete in 3 years, not by a long shot.

Even then, everything coming out is Universal, and with a machine as strong as a Quad I can easily run anything they throw at me in the next 3 years.

Obviously it matters more to people that aren't pros because it's a much more serious investment to them.

All the pros I know shrug at the stuff they buy for their jobs because as they say:
"It'll pay for itself in 1-2 months easy."
And it does.

Anyone that does real editing has their stuff given to them by their company or the stuff pays itself off.

Anyone needing and buying stuff now isn't going to be left in the cold anytime soon.

And if you're weird and must wait, an iMac as a "compromise" isn't the worst thing to go through, because like I said in the post you quoted, it can do the vast majority of work out there. And after whatever it is you're waiting for drops, you have a nice home computer. Or do your work on a MBP and screen-span it, and then eventually you've got a nice notebook to put with your setup. That should be enough.

If it's not, then like I said, you've already got stuff because you're currently working anyway.
post #87 of 112
Here's what I used to do at my company.

We had, I seem to remember 24 Macs. We also had, I think 15 PC's.

The Mac's were used for the publishing, graphics, video, and photo departments. The PC's were used for the front counter, and for accounting, except for two units we had for file translation purposes, and for when we had to do something that really needed absolute PC compatibility that wasn't available when doing the work on a Mac.

For the front counter and accounting, we didn't replace the PC's until we really had to, because those uses weren't dependent so much on power. They would last 6 years.

But for the production areas, it was different.

I had a schedule that was fairly consistent.

I would buy new machines every year. But only to replace about one third of those we had. The highest end users would get a new machine. Those machines would be moved down the ladder, as would the machines at that level. The machines that were entirely bumped (now usually 3 years old), were offered to the employees, or offered to several schools.

This was repeated every year.

Prior to the G5 models, we could upgrade the CPUs, and get another year's worth out of them. We almost never upgraded the video cards. That was a waste of time and money. We always bought as much RAM as needed when first buying the machine. Same with the internal HDs. External drive towers were used for most files, as well as backup tape drives and DVDs.

Most businesses in the same business as we were, do about the same thing, though every company has some unique needs.

As Ecking says, few companies are going to rush out and buy new Intel machines. Even when CS2 is available, and the Intel PM's, most companies will buy, at most, one for evaluation purposes. After perhaps three months, or when the new buying cycle comes about, if ALL of the supporting software, from ALL other third party companies is available, and shown to be PRODUCTION READY, then companies will start buying in numbers.
post #88 of 112
Quote:
Originally posted by melgross
Here's what I used to do at my company.

We had, I seem to remember 24 Macs. We also had, I think 15 PC's.

The Mac's were used for the publishing, graphics, video, and photo departments. The PC's were used for the front counter, and for accounting, except for two units we had for file translation purposes, and for when we had to do something that really needed absolute PC compatibility that wasn't available when doing the work on a Mac.

For the front counter and accounting, we didn't replace the PC's until we really had to, because those uses weren't dependent so much on power. They would last 6 years.

But for the production areas, it was different.

I had a schedule that was fairly consistent.

I would buy new machines every year. But only to replace about one third of those we had. The highest end users would get a new machine. Those machines would be moved down the ladder, as would the machines at that level. The machines that were entirely bumped (now usually 3 years old), were offered to the employees, or offered to several schools.

This was repeated every year.

Prior to the G5 models, we could upgrade the CPUs, and get another year's worth out of them. We almost never upgraded the video cards. That was a waste of time and money. We always bought as much RAM as needed when first buying the machine. Same with the internal HDs. External drive towers were used for most files, as well as backup tape drives and DVDs.

Most businesses in the same business as we were, do about the same thing, though every company has some unique needs.

As Ecking says, few companies are going to rush out and buy new Intel machines. Even when CS2 is available, and the Intel PM's, most companies will buy, at most, one for evaluation purposes. After perhaps three months, or when the new buying cycle comes about, if ALL of the supporting software, from ALL other third party companies is available, and shown to be PRODUCTION READY, then companies will start buying in numbers.

Word.
post #89 of 112
Originally posted by melgross
Here's what I used to do at my company.....But for the production areas, it was different....As Ecking says, few companies are going to rush out and buy new Intel machines. Even when CS2 is available, and the Intel PM's, most companies will buy, at most, one for evaluation purposes. After perhaps three months, or when the new buying cycle comes about, if ALL of the supporting software, from ALL other third party companies is available, and shown to be PRODUCTION READY, then companies will start buying in numbers.



Remember that this would mean that Apple and Adobe revenue would continue to derive from PowerMac G5s, Final Cut Studio etc, and CS2 Suite respectively.

It would suggest that any sales and profit growth via Conroe/Woodcrest and CS3 for the pro market would be off by a year. Meaning, it's status quo on the pro scene, so drivers for growth of profits and Apple stock price would be Leopard, and strong consumer side offerings (Macbook, new(?) iPods)...
post #90 of 112
Quote:
Originally posted by sunilraman
Originally posted by melgross
Here's what I used to do at my company.....But for the production areas, it was different....As Ecking says, few companies are going to rush out and buy new Intel machines. Even when CS2 is available, and the Intel PM's, most companies will buy, at most, one for evaluation purposes. After perhaps three months, or when the new buying cycle comes about, if ALL of the supporting software, from ALL other third party companies is available, and shown to be PRODUCTION READY, then companies will start buying in numbers.



Remember that this would mean that Apple and Adobe revenue would continue to derive from PowerMac G5s, Final Cut Studio etc, and CS2 Suite respectively.

It would suggest that any sales and profit growth via Conroe/Woodcrest and CS3 for the pro market would be off by a year. Meaning, it's status quo on the pro scene, so drivers for growth of profits and Apple stock price would be Leopard, and strong consumer side offerings (Macbook, new(?) iPods)...

Aha! And this is precisely what the analysts have been saying. It's the December quarter when Apple will BEGIN to see the fruits of the switch.

They expect 2007 to be a very good year, as Frank used to say.
post #91 of 112
Just read the Aperture 1.1 review on Ars.

What this review has taught me is that RAW conversion is pretty subjective, because RAW conversion itself is just a software interpretation without any concrete standards for how it needs to render the final image.

The reviewer just looks at the image and says "I like that one and I don't like that one." That type of analysis has no relationship to the actual picture the camera took.

In some situations Aperture 1.0 could have been showing the literal picture the camera took, with its noise, digital artifacts, and all, while Aperture 1.1 and Adobe Lightroom are masking and smoothing noise and digital artifacts.
post #92 of 112
Quote:
Originally posted by TenoBell
Just read the Aperture 1.1 review on Ars.

What this review has taught me is that RAW conversion is pretty subjective, because RAW conversion itself is just a software interpretation without any concrete standards for how it needs to render the final image.

The reviewer just looks at the image and says "I like that one and I don't like that one." That type of analysis has no relationship to the actual picture the camera took.

In some situations Aperture 1.0 could have been showing the literal picture the camera took, with its noise, digital artifacts, and all, while Aperture 1.1 and Adobe Lightroom are masking and smoothing noise and digital artifacts.

BEIGE is ghey. So is Ars. Just leave BEIGE and Ars be.
post #93 of 112
Nice. Calling things "gay" is so last century, dude. WTF
post #94 of 112
Quote:
Originally posted by TenoBell
Just read the Aperture 1.1 review on Ars.

What this review has taught me is that RAW conversion is pretty subjective, because RAW conversion itself is just a software interpretation without any concrete standards for how it needs to render the final image.

The reviewer just looks at the image and says "I like that one and I don't like that one." That type of analysis has no relationship to the actual picture the camera took.

In some situations Aperture 1.0 could have been showing the literal picture the camera took, with its noise, digital artifacts, and all, while Aperture 1.1 and Adobe Lightroom are masking and smoothing noise and digital artifacts.

To a certain extent, that's correct. But, after a while of doing this work, either your own or for others, you do get the experience required to "know" what is correct.

One of the reasons why I would rather hire photographers, and teach them to use PS, rather than the other way around, is that I can always teach someone to use a program well, but I can't teach them to have a good "eye".
post #95 of 112
Originally posted by melgross
Aha! And this is precisely what the analysts have been saying. It's the December quarter when Apple will BEGIN to see the fruits of the switch. They expect 2007 to be a very good year, as Frank used to say.



Who the F*** is Frank? Well... anyway yeah, Apple still has a few quarters to deliver outstanding profits and revenues before the fruits ripen. On a scale of 1-10, this year Steve will be turning the RDF up to 12.
post #96 of 112
Originally posted by TenoBell
Just read the Aperture 1.1 review on Ars. What this review has taught me is that RAW conversion is pretty subjective, because RAW conversion itself is just a software interpretation without any concrete standards for how it needs to render the final image...In some situations Aperture 1.0 could have been showing the literal picture the camera took, with its noise, digital artifacts, and all, while Aperture 1.1 and Adobe Lightroom are masking and smoothing noise and digital artifacts.



That is interesting.
post #97 of 112
Quote:
Originally posted by sunilraman
Originally posted by melgross
Aha! And this is precisely what the analysts have been saying. It's the December quarter when Apple will BEGIN to see the fruits of the switch. They expect 2007 to be a very good year, as Frank used to say.



Who the F*** is Frank? Well... anyway yeah, Apple still has a few quarters to deliver outstanding profits and revenues before the fruits ripen. On a scale of 1-10, this year Steve will be turning the RDF up to 12.

Frank Sinatra? The Song? The line?

Hey, I know where you live, but, that's no excuse!
post #98 of 112
Heh. Not a big Sinatra fan myself. Too old skool. You're showing your age, Mel. You should be listening to stuff on this website: www.di.fm (the top several channels, not the channels at the bottom)

DUF DUF DUF DUF DUF DUF DUF w00t !!!!!!!!!!!!!
post #99 of 112
Quote:
Originally posted by sunilraman
Heh. Not a big Sinatra fan myself. Too old skool. You're showing your age, Mel. You should be listening to stuff on this website: www.di.fm (the top several channels, not the channels at the bottom)

DUF DUF DUF DUF DUF DUF DUF w00t !!!!!!!!!!!!!

I'm not a big Sinatra fan myself. I have one of his albums. But, a famous song is still a famous song.

And, I think I look good for my age!
post #100 of 112
Quote:
To a certain extent, that's correct. But, after a while of doing this work, either your own or for others, you do get the experience required to "know" what is correct.

That's extremely subjective and leaves a wide margin for personal taste. This also means that any real, useful evaluation of RAW converters is based strictly on opinion.

Admittedly it's a foreign concept to me, because motion film color software does not work this way at all.
post #101 of 112
Originally posted by TenoBell
That's extremely subjective and leaves a wide margin for personal taste...


Well, that can play to the photographer's strengths. If the RAW files as interpreted by the software have enough resolution and minimal noise and a wide enough latitude of colors and textures, and the software is able to provide good tweaking and organising of the imagery without always having to open Photoshop, then the software is useful to the photographer and their workflow.

That's my understanding of it. The RAW file as seen by the software is almost never the final print(s) that the photographer will deliver. In the analog darkrooms there were always tons of "tweaks" you could do in the developing process. I suppose Aperture seeks to do the same, be a complete "digital darkroom" to organise and tweak the imagery -- but with the choosing of photos and adjusting of the photos -- that's all done "by eye" as melgross mentions:

AFAIK no pro photographer will just take that RAW file and go, okay, it's "done". The idea of RAW is that it has the maximum digital information possible that will "survive" the tweaking process, and that RAW retains the most color information possible... A pro photographer makes a living by developing (heh... pun unintended) a style and workflow that produces distinct results that clients desire. The photographer will work out a balance of how much of that style is realised "in camera" (during the shooting process) and how much the style is realised in post production. Again, if the software fits into their workflow and process of adjusting/finalising the image to their taste, then the software will be useful to them.

It IS interesting that you mention that the software shows an image that "has no relationship to the actual picture the camera took" and that different software seems to be processing RAW differently -- that's why photography is an art. Even if we are talking about taking photos of cells, MRIs, etc., in *science*, the process of observation itself affects the results.

It IS also interesting that gone are the days of choosing film stock and choosing ISO and all that -- with digital it's all down to the way a particular digital SLR's CCD/CMOS handles various shots and light conditions -- that's your "film stock and ISO settings" there. Which I still find weird, because digital shots have such a "harshness" to them, and sometimes old skool film stock can still produce some beautiful results you can't replicate in digital.
post #102 of 112
How does motion film color software work?
post #103 of 112
Quote:
Originally posted by sunilraman
Nice. Calling things "gay" is so last century, dude. WTF

Correction...'ghey'...it's this century's way of calling things.
post #104 of 112
Quote:
Originally posted by sunilraman
Heh. Not a big Sinatra fan myself. Too old skool. You're showing your age, Mel. You should be listening to stuff on this website: www.di.fm (the top several channels, not the channels at the bottom)

DUF DUF DUF DUF DUF DUF DUF w00t !!!!!!!!!!!!!


That's a pretty sweet site! Thanks!!!!!

You should still know who Frank Sinatra is though.

Otherwise you are just ghey!




post #105 of 112
Quote:
If the RAW files as interpreted by the software have enough resolution and minimal noise and a wide enough latitude of colors and textures, and the software is able to provide good tweaking and organising

I'm not talking about the ability to manipulate the image. I'm talking about taking a basic RAW picture, rendering it through different RAW converters from different manufacturers, and objectively evaluating the results.

Each RAW converter has to place the image inside of a color space container and some type of limited color gamut. The only way to have a fully objective evaluation of each rendering is to have a starting point of neutrality in respect to color and contrast.

There needs to be some point for neutral skin tone reproduction, neutral color, and neutral gray scale. The only way to do this would be for all RAW converters to work under the same color space.

From what I recently learned, these RAW converters have three different color space gamuts they could work under: Adobe RGB, sRGB, and ProPhoto RGB. The fact that they may not use the same color space, to me, completely devalues any real evaluation between the converters. The histograms between all of the renders will be completely different.

My next question would be: if all of the converters did use the same color space, such as Adobe RGB, would the luminance and chrominance values mean the same thing between all of them?

On top of that, each converter is applying sharpening and contrast to hide noise and digital artifacts in the picture. From what I can see, this is primarily what most of the evaluation is based upon: which converter most effectively hides defects.

Under this type of evaluation Aperture 1.0 did the worst job at that. But it was probably the one showing the most honest truth about the way the picture really looked.
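
To make the working-space point a bit more concrete, here is a minimal Python sketch (a rough illustration, not anything from the Ars review, and it ignores the primaries/gamut differences entirely): the same linear-light pixel value encodes to different 8-bit code values under the sRGB and Adobe RGB (1998) transfer curves, so histograms built from those code values will not line up even for identical scene data.

def srgb_encode(linear):
    """sRGB transfer curve (IEC 61966-2-1): linear 0..1 -> encoded 0..1."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def adobe_rgb_encode(linear):
    """Adobe RGB (1998) transfer curve: a pure gamma of 563/256 (~2.2)."""
    return linear ** (1 / 2.19921875)

# Same scene values, different code values -- the shadows diverge the most.
for linear in (0.02, 0.18, 0.50, 0.90):
    srgb8 = round(255 * srgb_encode(linear))
    argb8 = round(255 * adobe_rgb_encode(linear))
    print(f"linear {linear:.2f} -> sRGB {srgb8:3d}, Adobe RGB {argb8:3d}")

On those four sample values the two encodings disagree by a few code values in the shadows and converge near the highlights, which is exactly the kind of shift that moves a histogram around before any converter-specific processing even starts.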
post #106 of 112
Quote:
How does motion film color software work?

Film contains dyes layered over a transparent plastic base. Its color reproduction is bound by real physical rules.

Motion picture film is evaluated on its ability to record accurate skin tone, color reproduction, and gray scale. This process has its own standard number system, called printer lights, that lets the DP know at what exposure and film density a particular film stock records proper color.

To gain an understanding of a particular film stock's ability to do this, you shoot what's called a lab density test. You have a model, a black/white chart, and a color chart. You shoot this scene underexposed, properly exposed, and overexposed. You develop this test and print it for projection.

After development the DP will be given a lab report with printer lights on it. There are three sets of numbers for RGB that range from 1-60. 35-45 is in the middle and considered normal exposure. Within normal exposure you should see accurate skin tone, color, and gray scale reproduction.

Quote:
It IS also interesting that gone are the days of choosing film stock and choosing ISO and all that -- with digital it's all down to the way a particular digital SLR's CCD/CMOS handles various shots and light conditions -- that's your "film stock and ISO settings"

This has been done with film for years. It's called pushing or pulling film to change its ISO. To increase the film ISO, the lab will leave the film in its developing chemicals a bit longer; to decrease the film ISO, the lab will take the film out of its developing chemicals sooner than normal.

The DP can also shift printer lights above 45 or below 35, and that will increase or decrease print density, which has the same effect as changing its ISO.


Once film is scanned into digital files it moves from photochemical color space to digital video color space. Digital color space is not limited by physical rules but is limited by the amount of data that can be transported, stored, and rendered. This limitation directly affects video recording formats as well as video presentation formats.

Standard definition has a color standard called ITU 601 color; the HD color standard is ITU 709 color; 2K data and 4K data have their own standard color gamuts.

Currently there are no exact numerical values for digital RGB color grading the same way there are for photochemical film manipulation. That system is being ratified right now and should be in place soon.
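
As a small illustration of how the SD and HD standards mentioned above differ in practice (a sketch of my own, not part of the post): BT.601 and BT.709 weight R, G, and B differently when forming luma, so the same RGB pixel gets a different Y' value under each standard.

def luma_601(r, g, b):
    """Y' per ITU-R BT.601 (standard definition)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r, g, b):
    """Y' per ITU-R BT.709 (high definition)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A saturated red patch reads noticeably brighter under 601 than under 709.
print(luma_601(1.0, 0.1, 0.1))   # ~0.369
print(luma_709(1.0, 0.1, 0.1))   # ~0.291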
post #107 of 112
Quote:
Originally posted by TenoBell
I'm not talking about the ability to manipulate the image. I'm talking about taking a basic RAW picture, rendering it through different RAW converters from different manufacturers, and objectively evaluating the results.

Each RAW converter has to place the image inside of a color space container and some type of limited color gamut. The only way to have a fully objective evaluation of each rendering is to have a starting point of neutrality in respect to color and contrast.

There needs to be some point for neutral skin tone reproduction, neutral color, and neutral gray scale. The only way to do this would be for all RAW converters to work under the same color space.

From what I recently learned, these RAW converters have three different color space gamuts they could work under: Adobe RGB, sRGB, and ProPhoto RGB. The fact that they may not use the same color space, to me, completely devalues any real evaluation between the converters. The histograms between all of the renders will be completely different.

My next question would be: if all of the converters did use the same color space, such as Adobe RGB, would the luminance and chrominance values mean the same thing between all of them?

On top of that, each converter is applying sharpening and contrast to hide noise and digital artifacts in the picture. From what I can see, this is primarily what most of the evaluation is based upon: which converter most effectively hides defects.

Under this type of evaluation Aperture 1.0 did the worst job at that. But it was probably the one showing the most honest truth about the way the picture really looked.

I tried to reply to your earlier post before, but after I wrote a fair amount, it crashed, so I gave up for then.

Here's the point about the colorspaces. I'm sure that you understand the problem with limited colorspace. Each device has its own limitations. This is true for film of all types as well, of course.

What colorspaces do is allow matching between various output devices and the original material.

If the colorspace is wrong, then the result can be oversaturated colors, truncated colors, banding, or all three. At other times it can result in excessively muted colors. It also results in incorrect contrast, incorrect white and black points, etc.

Adobe RGB has been a high-end standard for a while. MS and HP invented the sRGB standard to match the cheap 14" color monitors that were being used at the time on most PCs.

As most people use PCs, that became a standard for many files that were to be used for reproduction mainly on them, such as web output.

Unfortunately, many companies began to use it inappropriately, simply because it was easy to do.

When you resolve a RAW image file (or any other image file), you have to know what the output will be. For web use, or for any use that will be expected to have a small colorspace, such as a laser printer, sRGB is fine. But for print, or inkjet, or dye-sub, or Fuji Pictography, or any other high-quality output, you would use Adobe RGB, though there are some other colorspaces that are sometimes used. Then there is CMYK, of course, or Hexachrome.

Work on digital files, or scans, is just as precise as that for film. In fact, I've always thought that film was distinctly less precise. I can often see changes between cameras in a scene, or scene to scene. A great deal of film work also depends on a trained eye. The vagary of film and processing ensures that no two film prints will ever be the same.

Kodak specs the film as ±20CC (CMY) color and ±1/3 stop, run to run. Even with filtering in the printers, that can't be reduced to below about ±5CC and ±1/5 stop. Processing adds to that uncertainty.

We monitored our machines very closely, running control strips every two hours, but, some inconsistency always crept in.

This is a VERY complex area, far too complex to receive a fair hearing here.

All I can say is that I also thought that Aperture (ver 1) was giving a less processed file. But when I took my own pictures, and processed them in all three converters, Apple's were the least close to what I had shot, here at home, in controlled conditions.

When I brought the darker areas of the pics done in the other converters up in brightness to match Aperture's results, they also gained noise, but not the artifacting. Apple is processing to bring out details in shadows that other converters are not doing. In doing that, it is bringing the noise level up as well. But, it also brought artifacts into the file.

This is not unusual for processing. An example is Digital ICE, used mostly in scanners. This does remove scratches and dust, but it also changes parts of the file. What Aperture 1 was doing, it seemed to me, was giving the file an adjustment that was similar to PS's Highlight/Shadow control, but without the finesse.
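
For what it's worth, here is a very rough sketch of the effect described above (a generic shadow lift of my own, not Aperture's actual algorithm): boosting dark values multiplies whatever noise rides on them, which is why shadow detail and shadow noise come up together.

import random

def shadow_lift(value, strength=0.5):
    """Toy shadow lift: bigger gain for darker pixels, highlights mostly untouched."""
    gain = 1.0 + strength * (1.0 - value)
    return min(1.0, value * gain)

random.seed(0)
dark_signal = 0.05                                   # a deep-shadow pixel, 0..1 scale
noisy = [dark_signal + random.gauss(0, 0.01) for _ in range(5)]
lifted = [shadow_lift(v) for v in noisy]

print("spread before lift:", max(noisy) - min(noisy))
print("spread after  lift:", max(lifted) - min(lifted))

The spread of the noisy samples (a stand-in for noise) comes out roughly 1.4x larger after the lift, with no extra real detail to show for it.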
post #108 of 112
Originally posted by TednDi
You should still know who Frank Sinatra is though... Otherwise you are just ghey!


I know who Frank Sinatra is, just not that particular reference by Melgross. Ah, you're all ghey !!
post #109 of 112
Originally posted by TenoBell
There needs to be some point for neutral skin tone reproduction, neutral color, and neutral gray scale. The only way to do this would be for all RAW converters to work under the same color space...My next question would be if all of the converters did use the same color space such as Adobe RGB, would the luminance and chrominance values mean the same thing between all of them.


Even if all the digital SLRs moved to Adobe RGB RAW, I think the way each camera "interprets" Adobe RGB would just be different. I don't think it would have the precision of film or print that you and Melgross talk about. I just don't see camera manufacturers having that level of standardisation yet.
post #110 of 112
Originally posted by melgross
...What colorspaces do is to allow matching between various output devices, and the original material...Adobe RGB has been a high end standard for a while. MS and hp invented the sRGB standard to match the cheap 14" color monitors that were being used at the time on most PC's...As most people use PC's that became a standard for many files that were to be used for reproduction mainly on them, such as web output... Unfortunately, many companies began to use it inappropriately, simply because it was easy to do...



That's why in web design color calibration is almost meaningless. Unlike film and photography in which a defined physical output is produced, web design depends on how the end-user's machine is calibrated and even how good their monitor is. Within the web design studio one could standardise on Adobe RGB for all work and calibrate all the screens so that work (PSDs or JPGs, etc) passed around within the studio would always have some level of consistency.

I used to want to pull my hair out when we designed something nice, then walked over to, like, the coding or HR department and saw all our work completely mangled by old, shitty, or uncalibrated monitors, or just monitors with way different contrast and brightness settings.

Heh. In the old days when you did a web page in 1024x768 at 16bit color, you'd want to cry when you saw it on some end-user's machine running at 800x600 at 256colours.

In the past few years I've been totally slack with colorspaces and conversions and calibrations. I go for things "by eye" and then just make a quick check on my co-workers' machines. And then just learn to forgive things if the orange isn't quite the orange I thought I was working with. I just make a decision if it looks "good enough" on their screens, then so be it. I've only done ad agency work for brief periods but even then all internal review was off the creatives' screens and the in-house HDTV. A few years back when doing websites for conferences, it is interesting, now that I reflect on it, people like Oracle didn't come back and say "the corporate colours are all messed up". Not even once, strangely. IIRC I'd use the Pantone colours of their corporate identity and then adjust by eye. Sometimes their corporate colors had CMYK and RGB but again, the RGB didn't always work out to what you wanted to show the client.
post #111 of 112
Quote:
Originally posted by sunilraman
Originally posted by TenoBell
There needs to be some point for neutral skin tone reproduction, neutral color, and neutral gray scale. The only way to do this would be for all RAW converters to work under the same color space...My next question would be if all of the converters did use the same color space such as Adobe RGB, would the luminance and chrominance values mean the same thing between all of them.


Even if all the digital SLRs moved to Adobe RGB RAW I think the way each camera "interprets" Adobe RGB would be just different. I don't think it would have the precision of film or print as you and Melgross talk about. I just don't see camera manufacturers having that level of standardisation yet.

You can't interpret a colorspace. That's the entire point of defining it in the first place.

But, you always have to keep in mind the fact that ALL photography, motion picture, tv, still, is an art, as well as a science. This will be true as long as the photographer, or filmmaker has a personal preference as to what (s)he wants to get across.

There are known numbers that we can deal with for grayscale, skin tone, sky color, etc., and we do use them. But, not all skin tone is exactly the same, even among limited population groups. But, it's easier to recognise ruddy complexions, different Asian complexions, and those of Afro-Americans, than you might think. It's the process of judging the relative differences between different parts of the face, whites of the eyes, teeth, etc., that enables a skilled operator to adjust, or tweak, this after the generalized numbers are applied.

Using white points and black points, most colors snap to where they should be, or very close to where they should be. Some additional gray point checking can adjust this further. Knowing the lighting will help to fill in some of the last questions.

It's only after that decision that the scientific methods of standardization come into play. Then, it's a matter of making sure that the vision is maintained across all manners of reproduction.

That's why the eye of the person working with the files is so important. I knew the preferences of all my clients that I personally worked with. We worked together. After we made sure that their vision was complete, I then made certain that it stayed that way.
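
To put the white point / black point idea in the crudest possible terms, here is a sketch of the generic eyedropper-style levels adjustment (not melgross's actual working method, and the sample numbers are invented): once you pick a pixel that should be black and one that should be white, remapping each channel between those two measurements pulls the cast out and snaps near-neutral colors toward neutral.

def apply_levels(value, black, white):
    """Linearly remap one 8-bit channel so that black -> 0 and white -> 255."""
    scaled = (value - black) / (white - black) * 255
    return max(0, min(255, round(scaled)))

# Invented measurements: the shot has a warm cast.
black_point = (22, 18, 8)        # sampled from something that should be black
white_point = (245, 238, 215)    # sampled from something that should be white
gray_card   = (140, 128, 110)    # should be neutral, but isn't

corrected = tuple(apply_levels(v, b, w)
                  for v, b, w in zip(gray_card, black_point, white_point))
print(corrected)                 # roughly equal R, G, and B after the remap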
post #112 of 112
Quote:
Originally posted by sunilraman
Originally posted by melgross
...What colorspaces do is to allow matching between various output devices, and the original material...Adobe RGB has been a high end standard for a while. MS and hp invented the sRGB standard to match the cheap 14" color monitors that were being used at the time on most PC's...As most people use PC's that became a standard for many files that were to be used for reproduction mainly on them, such as web output... Unfortunately, many companies began to use it inappropriately, simply because it was easy to do...



That's why in web design color calibration is almost meaningless. Unlike film and photography in which a defined physical output is produced, web design depends on how the end-user's machine is calibrated and even how good their monitor is. Within the web design studio one could standardise on Adobe RGB for all work and calibrate all the screens so that work (PSDs or JPGs, etc) passed around within the studio would always have some level of consistency.

I used to want to pull my hair out when we designed something nice, then walked over to like the coding or HR department and see all our work completely mangled by old, shitty, or uncalibrated monitors, or just monitors with way different contrast and brightness settings.

Heh. In the old days when you did a web page in 1024x768 at 16bit color, you'd want to cry when you saw it on some end-user's machine running at 800x600 at 256colours.

In the past few years I've been totally slack with colorspaces and conversions and calibrations. I go for things "by eye" and then just make a quick check on my co-workers' machines. And then just learn to forgive things if the orange isn't quite the orange I thought I was working with. I just make a decision if it looks "good enough" on their screens, then so be it. I've only done ad agency work for brief periods but even then all internal review was off the creatives' screens and the in-house HDTV. A few years back when doing websites for conferences, it is interesting, now that I reflect on it, people like Oracle didn't come back and say "the corporate colours are all messed up". Not even once, strangely. IIRC I'd use the Pantone colours of their corporate identity and then adjust by eye. Sometimes their corporate colors had CMYK and RGB but again, the RGB didn't always work out to what you wanted to show the client.

The entire concept of sRGB, as I pointed out, was to compensate for a lack of calibration on the part of cheap home monitors, which perhaps 95% of the population uses.

Very few monitors are ever taken off the 9,300 K white point balance their manufacturer has set them to. Knowing which phosphors are being used then completes an accurate enough picture (sic) for most web design.

Being that the eye/brain combination translates relative color for us, it's more than good enough for that.

Of course, if you are trying to decide on a sweater, or a paint shade, this isn't the ideal method. But, even there, most people can come close enough, in their mind, as to what it will look like, as long as they are told, in advance, that the color isn't accurate.

That's why there are "web-safe" colors. I've found that those colors do reproduce pretty closely between most cheap monitors, though the use of LCD panels has changed that somewhat.
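
As a footnote on the "web-safe" point (my own sketch, not from the post): the classic 216-color web-safe palette uses only channel values that are multiples of 51 (hex 0x33), so snapping an arbitrary color to the palette is just rounding each channel to the nearest multiple of 51.

def to_web_safe(r, g, b):
    """Snap each 8-bit channel to the nearest multiple of 51 (the 216-color palette)."""
    def snap(v):
        return min(255, round(v / 51) * 51)
    return (snap(r), snap(g), snap(b))

print(to_web_safe(200, 30, 90))   # -> (204, 51, 102)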