Why is this such a hard picture for a digital camera?
I was playing around with a new Canon S2 IS camera, sitting on the couch reading through the manual, and taking pictures of things around the room, trying out this and that.
The camera manages to do pretty well taking pictures in ordinary room light without a flash. While trying to test out various exposure and light-metering settings, however, I turned the camera towards one side of the room with a big window...
I either got a completely washed-out, over-exposed view of the window if I wanted things inside the house to look good, or, if I got a properly exposed view of the window, showing details of light passing through the window shade, I'd get a very dark, under-exposed view of everything inside the house.
Now, I realize that the human eye has a wider range than any digital camera, but even taking that into account, I was surprised by this problem. To my eye, the contrast between the brightness of the window and the room lighting did not seem all that great. It was cloudy and gray outside, and there was some tree cover for the window, so it's not like I was contrasting room light to full sunlight landing on the window shade.
I quickly, and without great effort at seamlessness, composited two images at different exposures to show what, more or less, the scene looked like to my own eyes:
Even later in the day, when the sun was going down and the light outside was looking pretty damned dim and gray, what the camera would keep giving me was the very washed-out overexposed view of the window. I tried a different digital camera (a Panasonic DMC-FZ5), and while it seemed marginally better, the results weren't that great either, and certainly far off from how my human eyes saw the scene.
Do these cameras really have such a narrow functional range for differences in lighting? Is there perhaps artificially pumped-up contrast being applied? Is it a matter of spectral differences between sunlight and indoor lighting (halogen lamp in this case)? If so, is there some sort of filter that might reduce this kind of problem? Do more expensive digital cameras handle this kind of lighting better?
Comments
CMOS sensors have the potential for per-pixel brightness control, but I don't think any do this currently (I could be wrong). That would basically allow a user to control the dynamic range of the pixels... either a full range giving you what you see, or an isolated range producing the washout/underexposure you describe...
It's the technology... our eyes work similarly to CMOS sensors...
I don't know about compact cameras like yours, shetline, but for DSLR cameras I have the IL range (exposure-value range) of three cameras.
The D70 has 7 IL (read: 7 stops of difference, or 7 speed differences if you prefer).
The 20D has 8.5 IL.
The S3 has 11 IL in extended light mode 2, and is the absolute reference in this area.
That said, the S3's images need post-processing: you must selectively adjust the highlights of the window, otherwise the whole image will seem to lack contrast. The S3 simply allows you to keep information in the highlight areas, contrary to the other digital cameras, which will only give you blown highlights with no detail in them.
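To put those numbers in perspective: each stop (EV, or IL in French usage) doubles the luminance range a sensor can record, so the usable contrast ratio grows as a power of two. A quick sketch in Python (the stop figures are just the ones quoted above, not manufacturer specs):

```python
# Each stop (EV/IL) doubles the recordable luminance range, so the
# highlight-to-shadow contrast ratio a sensor covers is 2**stops.
def contrast_ratio(stops: float) -> float:
    """Contrast ratio covered by the given number of stops of range."""
    return 2.0 ** stops

# Dynamic-range figures as quoted in this thread:
for name, stops in [("Nikon D70", 7.0), ("Canon 20D", 8.5), ("Fuji S3", 11.0)]:
    print(f"{name}: {stops} stops = {contrast_ratio(stops):,.0f}:1")
```

Eleven stops (about 2000:1) versus seven (128:1) is roughly the difference between holding some window detail and blowing it out entirely.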
great idea with the photoshop to blend the two images
IMHO the main issue is difference in brightness and color temperature between inside and outside.
you would need better lighting inside the house for things to balance out and look good in camera: providing a more pleasing aesthetic, matching the color temperature of the light coming from outside, and bumping up overall lux indoors so the CCDs register less noise and cleaner highlight and shadow signal.
most pros shooting with digital slr for indoor shots use a big ass light thingy, you can see it in some issues of the uber-trendy "surface" magazine. in some cases, you see, the photography is so ultra cool that they figure it's slick to leave the special indoor lighting kit IN THE PHOTO itself. go figure heh
Originally posted by BuonRotto
In the old days, you did it in a dark room.
Card stock, knives, and patience. It's enough to make you love Photoshop, even the earlier versions without layering.
Originally posted by BuonRotto
That's too much for any camera, let alone a digital one without the indoor lighting kit. The exposure range is just too great. I've never come across a film or digital camera that could expose everything in one shot the way you want without one, and even then it's less than ideal. A common way to solve the problem is to bracket your exposures, one photo exposed for the room, one exposed for the window, and one in between. These days, you use Photoshop to layer them to get the right combination of exposures. In the old days, you did it in a dark room.
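The layering step can also be sketched programmatically. Below is a minimal, assumption-laden version in Python/NumPy: it keeps the frame exposed for the room and feathers in the darker (window-friendly) exposure wherever the first frame approaches clipping. The 0.8 threshold and the Rec. 601 luma weights are illustrative choices, not anything a real tool mandates:

```python
import numpy as np

def blend_exposures(dark: np.ndarray, bright: np.ndarray) -> np.ndarray:
    """Naive two-shot exposure blend (float RGB arrays in [0, 1], same shape).

    Uses the brightly exposed frame everywhere except where it is close
    to blowing out, and feathers in the darker exposure there -- a crude
    version of what the Photoshop layering described above does by hand.
    """
    # Per-pixel luminance of the bright frame (Rec. 601 weights).
    luma = (bright[..., 0] * 0.299
            + bright[..., 1] * 0.587
            + bright[..., 2] * 0.114)
    # Weight ramps from 0 (keep bright frame) to 1 (use dark frame)
    # as luminance climbs from 0.8 toward clipping at 1.0.
    w = np.clip((luma - 0.8) / 0.2, 0.0, 1.0)[..., np.newaxis]
    return bright * (1.0 - w) + dark * w
```

A real blend would align the frames first and feather over larger regions, but the idea is the same: a mask decides, per pixel, which exposure to trust.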
random idea that just popped into my head: for digital cinematography, what do you think about, within the next 5-10 years, 720p or 1080p acquisition (24-60 fps) that captures, say, 3 bracketed exposures simultaneously?
so then what happens is you take this footage into Final Cut Pro, Shake, or Adobe After Effects, and you have three different exposures to mix and match to get a wide range of color and tone detail. whoa...! essentially like shooting 1920x1080p at 24-60 fps while capturing (24 bits x 3) of color depth.
edit: those 3 exposures would be different presets: say one main, one darker and cooler in color temperature, one brighter and warmer. then it would be hella fun to mix and match in post, though you would need to do quite a bit of travelling-matte work to blend stuff.
The only other solution would be to light the foreground brighter to bring it into the same range as the window. A lighting meter can also help you identify these types of situations.
Originally posted by BuonRotto
That's too much for any camera, let alone a digital one without the indoor lighting kit. . .
That was my thought too, and I would expect similar pictures with my SLR film camera. Maybe the effect is a little worse with digital; I don't know. If it is, it is likely due to the difference in gamma curves between the two media. With film, gamma falls off at the extreme ends of the range, giving some compression and the ability to see a little detail.
For readers unfamiliar with gamma, it is often referred to as contrast. There can be high- and low-contrast films. It would be nice to have a digital camera with adjustable gamma. If a scene has very flat lighting and there is little difference between lights and darks, the picture will be flat, with nothing very bright or dark. It would be dull. Increasing gamma would fix this and give a greater difference between the bright and dark spots. The window picture has the opposite requirement. Film actually changes gamma depending on what part of the curve you are on: the middle exposure range has the highest gamma, and gamma is much lower at the high and low exposure ends.
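The idea of adjustable gamma is easy to sketch. In the simple power-law convention below (an illustrative convention; photographic gamma is strictly the slope of the film's characteristic curve), an exponent above 1 stretches the ratio between tones and thus raises contrast:

```python
import numpy as np

def apply_gamma(linear: np.ndarray, gamma: float) -> np.ndarray:
    """Map linear intensities in [0, 1] through a power-law curve.

    For two tones a and b, the output ratio is (a/b)**gamma, so
    gamma > 1 increases contrast and gamma < 1 flattens it.
    """
    return np.clip(linear, 0.0, 1.0) ** gamma

# Two tones one stop apart (2:1) end up two stops apart (4:1) at gamma 2.
stretched = apply_gamma(np.array([0.25, 0.5]), 2.0)
```

A film-like curve would additionally roll gamma off toward the toe and shoulder of the range, which is exactly the highlight compression described above.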
I do concede that the Photoshop solution is best. I am very impressed, almost enough to buy a digital camera and a copy of CS2.
Originally posted by ShawnJ
Nice "unique photos," Jerry! I especially like "Selective Coloring" and "Letting Go."
Thank you! Glad you like them!
Obviously the human eye has an iris to control the total amount of light striking the retina -- an aperture control. There's also rhodopsin (visual purple), a pigment in the retina's rod cells that improves light sensitivity in dark conditions.
Since I can take in this mixed lighting scene all at once, however, with my iris remaining at one single "aperture setting" for the whole scene, and since my eyes adjust to the scene much more quickly than the effects of rhodopsin would allow, it seems that not only is the exposure latitude, or dynamic range, of my retina much greater than a CCD imager's, but also that a strong optical illusion is telling me the light from the window isn't that much brighter than the light inside the room. (The familiar illusion comes to mind where the same gray square is placed on a white background, then on a black background, and looks brighter against the black even though its real intensity doesn't change.)
I have to wonder if the individual rods each, at any given moment, have a very broad exposure latitude, or if these retinal cells simply, and very quickly as my eye casually moves back and forth across such a scene, adjust their gain in small clusters so as to shift the bounds of a narrower exposure latitude into a more useful, detail-providing range.
Actually, sunilraman, I remember TIFFany having a bracket-photo action back in the day that made compositing these things so easy, it was worth the price of the app for just one use of it. It even included automation tools for pros. The TIFFany guys are the ones who created CoreImage, and the effects in CoreImage are very similar to the ones in TIFFany. Come to think of it, this is easy as pie in iMaginator. You can use CoreImage compositing effects and save preset combinations of the effects for overlaying the images. Once you have the effects in place, you can save them and drop in whatever bracketed images you want.**
Just goes to show how amazing the human eye is in any case. It's not that the human eye is any different in how it exposes a scene, just that it can re-expose an image so quickly. By the time you change your focus from inside the room to the window, your eye adjusts to see outside. Even the highest end pro cameras are slow to adapt relative to that.
**Full disclosure: I've helped with iMaginator's development a bit. I think I'll recommend this bracketing thing as a preset task or a preset effects chain to Andrew, though.
Originally posted by snoopy
I don't know the eye's physiology, but I suspect it has a greater dynamic range than either film or digital cameras. In addition, I understand there is a lot of image processing that goes on in the brain.
That's right, the eye is much better in the area of dynamic range than either film or digital cameras.
Dr. Land, the founder of Polaroid, proposed the interesting theory of the Retinex, which explains a lot about human eye physiology, especially for color. It explains why, contrary to digital cameras or film, our eyes perform so well at automatic white balance.
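For the curious, the core Retinex intuition is small enough to sketch: perceived lightness tracks the ratio of each point to its local surround rather than its absolute intensity, and taking logs turns that ratio into a simple difference. This is a toy single-scale version in Python/NumPy with a box-blur surround; a real implementation would use Gaussian surrounds at several scales:

```python
import numpy as np

def box_blur(img: np.ndarray, size: int) -> np.ndarray:
    """Crude separable box blur (zero-padded edges), standing in for the
    Gaussian surround a real Retinex implementation would use."""
    kernel = np.ones(size) / size
    rows = np.apply_along_axis(np.convolve, 1, img, kernel, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, kernel, mode="same")

def single_scale_retinex(image: np.ndarray, size: int = 15) -> np.ndarray:
    """Land-style Retinex sketch on a 2-D grayscale array with values > 0.

    Output is the log of each pixel's ratio to its local surround:
    zero where a pixel matches its neighborhood, positive where it is
    locally bright, negative where it is locally dark.
    """
    eps = 1e-6  # avoid log(0)
    return np.log(image + eps) - np.log(box_blur(image, size) + eps)
```

Because the output depends on local ratios, a uniform change in illumination across the scene largely cancels out, which is one way to think about why the eye's "white balance" is so robust.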
I'm sure most here are already familiar with these filters, but maybe some are not.
You have the same problem when trying to take a picture of a black Lab indoors on a white carpet.
I assume that a camera has only one exposure setting for any given scene, so you get more contrast problems.
But the trade-off is that your eye can only focus on one small part of the visual field with any clarity. Even looking 10 feet away or so, as in this picture, you're only able to see a disc a few inches wide with any sharpness. With a camera picture, on the other hand, the whole thing is clear, as if cones filled the entire "retina" of the camera.
Originally posted by BRussell
shetline - I don't think rhodopsin is the issue here. Rhodopsin takes several minutes to take full effect...
If you look above, you'll see I brought up rhodopsin only for the purpose of discounting it as relevant here.
Just trying to be thorough.