"It (the iPod touch) won't be able to exploit a new photo feature included in the iOS 4.1 software upgrade coming this week that will let the iPhone 4 camera automatically combine multiple exposures into a single spruced-up image."
Comments
If it's "inaccurate" then it's a lie; if it's a lie then it's illegal and potentially very costly for the company. Marketing typically doesn't risk lying; what they do is use ambiguous terminology and/or terms that evoke a positive response.
solipsism: thinking that your distinction between "inaccurate" and "ambiguous" is important to anyone else
I think you're not quite there. An HDR image is composed of a series of pictures of the same subject taken at varying aperture settings.
http://en.wikipedia.org/wiki/High_dynamic_range_imaging
No, you didn't read very well. Go try again.
Perfect example of TONE MAPPING, not HDR.
If the image was a true HDR photo, then the various levels and contrast would remain adjustable. I think Apple's tone mapped image is basically comprised of 3 shots taken with high/medium/low aperture settings, then composited.... thus, it's a tone map.
I see we have morons that don't understand the difference between an image type and one specific technique for generating that image type. I've followed the research for years, and never has there been any attempt to divorce any of the techniques from being part of the overall HDR technique set.
Fredo Durand and his students at MIT have driven most of this field recently (the modern digital version; the old dodge-and-burn in printing has been around for a LOONG time) and there are many ways to generate HDR images. Their first was exposure blending, and the rest came after, but they are all HDR simply because the resultant image has a "High(er) Dynamic Range" than any of the "Low Dynamic Range" source images.

What Apple is doing is certainly NOT "tone mapping," because tone mapping is what creates surreal, painting-like images, and there is no sign of that in the iPhone photos we have seen. I think certain people here, rather than Apple, are the ones who misuse the term.
===================
There is no reason why HDR requires more than one image in its construction. Imagine an image sensor that had 9 photoreceptors per pixel: three tiny ones for RGB of the highlights, three middle-sized ones, and three large high-sensitivity ones for the shadows. Such a sensor would need 18 megapixels to produce a 2 megapixel image (the word "megapixel" is used in two different ways), but it would be able to capture HDR of even high-speed subjects and could do HDR video.
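That single-shot idea can be sketched numerically. The sensitivities, clipping level, and scene values below are made-up illustration numbers, not any real sensor's specs; the point is just that three linear readouts with different "well sizes" can jointly cover a range no single readout can.

```python
import numpy as np

# Toy model of the "three receptor sizes per pixel" idea: three linear
# readouts of the same scene with different sensitivities, each clipping
# at a full-well value of 1.0. Gains are made-up illustration values.
scene = np.array([0.001, 0.02, 0.4, 3.0, 40.0])  # true radiances, wide range
gains = np.array([10.0, 1.0, 0.02])              # shadow / mid / highlight receptors

readouts = np.clip(scene[None, :] * gains[:, None], 0.0, 1.0)

# For each pixel, estimate radiance from every unsaturated, non-zero
# readout (readout / gain), then average the usable estimates.
valid = (readouts > 0.0) & (readouts < 1.0)
estimates = np.where(valid, readouts / gains[:, None], 0.0)
recovered = estimates.sum(axis=0) / np.maximum(valid.sum(axis=0), 1)

print(recovered)  # close to `scene` across several orders of magnitude
```

Every pixel gets reconstructed from whichever receptors weren't saturated, which is exactly why such a sensor could do single-exposure HDR, and hence HDR video.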
If the image was a true HDR photo, then the various levels and contrast would remain adjustable. I think Apple's tone mapped image is basically comprised of 3 shots taken with high/medium/low aperture settings, then composited.... thus, it's a tone map.
Read it again.
Quote:
In image processing, computer graphics, and photography, high dynamic range imaging (HDRI or just HDR) is a set of techniques that allow a greater dynamic range of luminances between the lightest and darkest areas of an image than standard digital imaging techniques or photographic methods. This wider dynamic range allows HDR images to more accurately represent the wide range of intensity levels found in real scenes, ranging from direct sunlight to faint starlight[1]
The two main sources of HDR imagery are computer renderings and merging of multiple photographs, the latter of which in turn are individually referred to as low dynamic range (LDR)[2] or standard dynamic range (SDR)[3] photographs.
Tone mapping techniques, which reduce overall contrast to facilitate display of HDR images on devices with lower dynamic range, can be applied to produce images with preserved or exaggerated local contrast for artistic effect.
Where does it say that you need three separately exposed source pictures for it to be called "HDR"? Want to distort reality much? Do you understand what tone mapping is? It's written perfectly clearly in front of you and yet you don't understand it, or simply don't want to.
Do you understand photography?
Again...
Quote:
high dynamic range imaging (HDRI or just HDR) is a set of techniques that allow a greater dynamic range of luminances between the lightest and darkest areas of an image than standard digital imaging techniques or photographic methods.
Does Apple's method achieve this?
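The two operations this thread keeps arguing over can be separated in a short sketch: first merging bracketed LDR exposures into one HDR radiance map, then tone-mapping that map down for display. The exposure times, hat weighting, and Reinhard-style operator below are illustrative textbook choices, not a claim about Apple's actual pipeline.

```python
import numpy as np

# Step 1 - HDR assembly: merge bracketed LDR shots (illustrative exposure
# times, linear sensor model, values clipped to [0, 1]) into a radiance map.
radiance_true = np.array([0.05, 0.5, 2.0, 8.0])  # scene radiances (made up)
times = np.array([1.0, 0.25, 0.0625])            # bracketed exposure times

ldr = np.clip(radiance_true[None, :] * times[:, None], 0.0, 1.0)

# Trust mid-range pixels most: simple hat weighting, ignore clipped values.
w = 1.0 - np.abs(2.0 * ldr - 1.0)
w[(ldr <= 0.0) | (ldr >= 1.0)] = 0.0
hdr = (w * ldr / times[:, None]).sum(axis=0) / np.maximum(w.sum(axis=0), 1e-9)

# Step 2 - tone mapping: compress the HDR radiances into [0, 1) for display
# with a global Reinhard-style operator, L / (1 + L).
display = hdr / (1.0 + hdr)

print(hdr)      # ~radiance_true: range beyond any single clipped shot
print(display)  # everything now fits a standard display range
```

Step 1 is what produces the "greater dynamic range" from the definition above; step 2 is what gets it back onto an ordinary screen, and it is the aggressive local variants of step 2 that produce the surreal painterly look people associate with "HDR photos."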
HDR or not, you can't put light where there is none. You won't see the waves in the background until we have far more sensitive image sensors. No amount or type of HDR processing can address that fact. Your beach at night scenario is inherently, quite simply, a poor photo situation. Next time pick a better shot.
There's no need to be so terse.
Your assumption about my photo is wrong. My photo did not include the ocean. It was of people standing on the beach with the lit boardwalk in the background, spilling some light into the foreground. Obviously, the scenario you were thinking of would result in a poor photo.
And anyway, I took the photo to capture a special fleeting moment with friends. Not to win any photo contests. So forgive me if I didn't make the sun rise so I could create a Kodak moment.
http://www.ankleskater.com/iphotography
Jobs mentioned HDR was coming to the iPad with iOS 4.2. What's up with that?
http://www.ankleskater.com/iphotography
Yes, it's a bit perplexing. Manipulation of photos in concert with an iPhone perhaps?... Maybe direct connection and uploads with a DSLR?
I ain't losing my jailbreak over that.
Nor am I. But it does leave me looking forward to a jailbroken 4.1.
This, quote for absolute truth.
Yes, it's a bit perplexing. Manipulation of photos in concert with an iPhone perhaps?... Maybe direct connection and uploads with a DSLR?
Maybe he completely misspoke, which is the only thing that makes sense, so... yeah.
Does the HDR feature even work in the new touch? It is a much poorer camera and the feature may not appear there. Yes the touch gets 4.1, but does it get HDR?
Of course it does.
The higher contrast in the cars looks MUCH better in the non-HDR shot. The HDR one looks awful.
You're joking obviously?
Indeed. My point is simply that I've already seen the sentiment that HDR could make things better but couldn't make them worse. So far in my experiments, I'd say it makes the photos worse much more often than it makes them better. Some images it certainly helps dramatically: outdoors in bright direct sunlight, and I'm sure a stained-glass window shot with the sun behind it, or something like that. But it takes a couple seconds per shot and often makes things look flat, washed out, or just plain spooky. I'm just warning people not to leave it on all the time and assume it will always help.
I'm pretty sure Jobs stated that the HDR mode saves all three exposures-- and given that the middle exposure is presumably what you've posted, I mean..... there it is. Take a look at what HDR gives you, and discard the high and low, if that seems better to you.
More generally, it seems to me that this is like complaining about depth of field: an improved lens gives you the option to shoot a very shallow focus plane, but then you don't like the results and worry that it's a problem. It's not, any more than any option in photography is a problem if you don't like the results.
I'm pretty sure Jobs stated that the HDR mode saved all three exposures-- and given that the middle exposure is presumably what you've posted, I mean..... there it is. Take a look at what HDR gives you, and discard the high and low, if that seems better to you.
It appears you get three total options for photos.
Normal image. HDR off.
HDR image, with normal image saved, too.
HDR image. Normal image not saved.
So you pretty much have to go out of your way to end up with an HDR only exposure and no backup. If people start using this and don't like the results, they'll likely remember where they toggled the setting, so I don't think we have to worry about a lot of folks getting blown out skies all willy-nilly.
So you pretty much have to go out of your way to end up with an HDR only exposure and no backup. If people start using this and don't like the results, they'll likely remember where they toggled the setting, so I don't think we have to worry about a lot of folks getting blown out skies all willy-nilly.
I'm not sure if this setting is on by default or not, but anyone who complains about this feature is just being an ass. The only slight would be the 2 extra seconds it takes to process the HDR. I've had a couple photos look better without HDR, but having both makes it easy to compare. BTW, I tried HDR on a landscape where I used the digital zoom about halfway in, and I got an unexpected result. It appears less grainy than the normal image, as well as pulling out the sky, mountains, and closer objects better than the normal image.
BTW, I tried HDR on a landscape where I used the digital zoom about halfway in, and I got an unexpected result. It appears less grainy than the normal image, as well as pulling out the sky, mountains, and closer objects better than the normal image.
That's interesting. I imagine it has something to do with the blending algorithms softening the block edges, but I wonder if it would be possible to do some kind of pixel shifting in software to produce a virtual higher resolution?
Hmmmm, thinking about it a bit more, I guess probably not, since you're limited by the pixel-size grid, and moving additional data points around within that doesn't really change anything. You'd have to be able to do some fraction-of-a-pixel averaging, and although that might give you a smoother look (akin to what you describe) it wouldn't actually add any information.
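The "less grainy" observation has a simpler likely explanation than virtual resolution: blending several exposures of the same scene averages out random sensor noise. A minimal sketch, assuming a simple additive Gaussian noise model (the noise level and frame count are illustrative, not measurements of the iPhone sensor):

```python
import numpy as np

# Why a blended multi-shot image can look less grainy: averaging N noisy
# frames of the same scene shrinks random noise by roughly sqrt(N), while
# adding no detail beyond the pixel grid. Noise level is illustrative.
rng = np.random.default_rng(0)
scene = np.linspace(0.0, 1.0, 10_000)                   # "true" image, flattened
frames = scene + rng.normal(0.0, 0.1, (3, scene.size))  # 3 noisy exposures

single_noise = np.std(frames[0] - scene)
blended_noise = np.std(frames.mean(axis=0) - scene)

print(single_noise, blended_noise)  # blended noise ~ single / sqrt(3)
```

The averaged frame is smoother, but the underlying sample grid is unchanged, which matches the conclusion above: less grain, no extra information.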
Of course it does.
Baig doesn't think so!
"It (the iPod touch) won't be able to exploit a new photo feature included in the iOS 4.1 software upgrade coming this week that will let the iPhone 4 camera automatically combine multiple exposures into a single spruced-up image."
http://www.usatoday.com/tech/columni...aig08_ST_N.htm