I suspect they will just use the same photo at 3 different exposures with an extra-long exposure 'session' of sorts. This isn't like an old camera where they had a physical shutter; they can easily save 3 images from the same 'exposure'.
I've been asking a photographer friend about HDR, since he's been using it actively for a couple of years now for his work. He tells me it can be done hand-held, but nothing in the scene should be moving; you can combine pretty much as many images as you want, but normally three -- one under, one over, one at the metered exposure.
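The under/over/on bracket described above could be merged along these lines. This is a toy sketch, not any particular app's pipeline: it models an ideal linear sensor and a hypothetical "hat" weighting that trusts mid-gray pixels most.

```python
import numpy as np

def merge_bracket(frames, times):
    """Toy HDR merge: estimate scene radiance as a weighted average of
    (pixel / exposure_time), trusting mid-range pixels most.
    Assumes aligned frames -- hence 'nothing in the scene should be moving'."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    radiance = np.zeros_like(frames[0])
    wsum = np.zeros_like(frames[0])
    for f, t in zip(frames, times):
        w = 1.0 - 2.0 * np.abs(f - 0.5) + 1e-6   # hat weight: 1 at mid-gray, ~0 at the extremes
        radiance += w * (f / t)
        wsum += w
    return radiance / wsum

# Ideal linear sensor (pixel = clip(radiance * time)), true radiance 0.4:
times = (0.5, 1.0, 2.0)                          # under, on, over
frames = [np.clip(np.full((2, 2), 0.4) * t, 0.0, 1.0) for t in times]
estimate = merge_bracket(frames, times)          # recovers ~0.4 everywhere
```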
Comments
sorry for being obtuse (and i admit i didn't watch the entire presentation) but isn't "camera" an app that needs to be launched? or is all i have to do now is think about it to make it happen?
Yes. The crappy little built-in app has a new feature. But otherwise, it remains vastly inferior to many other camera apps.
Why not? It doesn't take away any functionality, and it's not like it takes all that long to do.
Because it will screw up the phone royally. And there is no compelling reason to mess around.
The three shots are taken at the same time. Surely that was obvious?
How do they do that? Do they split the image with a prism and shine it onto 3 image sensors? Do you really know anything about this topic? Or did you just make up some sort of answer?
At the same time? HOW?
Imagine a picture being taken, but instead of 3 distinct click-and-save events, they simply keep collecting photons (much like a video) while saving images from the stream at 3 points during the session (sorry if this is hard to visualize).
The first save is the 'fast shutter speed' photo; the sensor keeps collecting data and saves again for the "mid" exposure, then continues collecting and finally saves the last (overexposed) file, with the longest amount of 'shutter' time, as the 3rd file.
Then stack the images. This kind of stacking has been done for years by amateur astronomy buffs to collect more photons and improve the signal-to-noise ratio. They could tweak image settings from the 1st to the 2nd to the 3rd save and accomplish it all in the length of the longest exposure with some clever manipulation. My only question here is whether they can stabilize the image, even over that fraction of a second. In astronomy you can stack hundreds of photos, so you have much more flexibility.
I'm under the impression that the three photos are taken so quickly together as to appear that they are taken simultaneously.
The phone cameras already have lots of problems in low-light conditions. ISTM that some of those problems, like camera-shake blur, will triple in magnitude.
Appears simultaneous? To whom?
How difficult is it to hold the phone and point it for a fraction of a second? I'm not seeing the point of your comment.
Ask Aunt Millie how hard it is. Her pics suck.
As do most people's. Especially when using phones that require long exposures compared to a device with a good lens.
And HDR photos take MORE than 3 times as long to expose.
I'm prepared to be amazed. But I expect that this will work about as well (HA!) as Apple's Voice Command feature.
Ask Aunt Millie how hard it is. Her pics suck.
As do most people's. Especially when using phones that require long exposures compared to a device with a good lens.
And HDR photos take MORE than 3 times as long to expose.
I'm prepared to be amazed. But I expect that this will work about as well (HA!) as Apple's Voice Command feature.
Depends. If they use stacking, they can just align the photos assuming there isn't a lot of motion going on. Depending on the number of samples, they can choose the 'best 3' to get a good pic, even with a bit of shaking going on.
http://webpages.charter.net/darksky2.../stacking.html
http://www.djcash.demon.co.uk/astro/...HowItsDone.htm
I don't know how fast the sensor is, and I suspect a dark shot will take a lot longer and be much more prone to blur.
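The noise-reduction payoff of stacking that the astronomy links describe is easy to demonstrate: averaging N aligned frames cuts random noise by roughly the square root of N. A minimal simulation (a made-up flat scene with Gaussian sensor noise):

```python
import numpy as np

rng = np.random.default_rng(1)
true_image = np.full((8, 8), 100.0)           # hypothetical flat scene
frames = [true_image + rng.normal(0, 10, true_image.shape) for _ in range(100)]

single_error = np.abs(frames[0] - true_image).mean()
stacked = np.mean(frames, axis=0)             # stacking aligned frames = averaging
stacked_error = np.abs(stacked - true_image).mean()

# Averaging 100 frames should cut the random noise by about 10x:
assert stacked_error < single_error / 3
```

This is why astronomers stack hundreds of exposures; with only three frames the gain is modest (roughly 1.7x), but it's free signal.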
Ask Aunt Millie how hard it is. Her pics suck.
As do most people's. Especially when using phones that require long exposures compared to a device with a good lens.
Plenty of dedicated cameras have smaller apertures.
And HDR photos take MORE than 3 times as long to expose.
Mostly correct. Depending on how it is done, it could be twice the time or almost four times the time: if it stacks "grabs" of the same exposure, expect twice the time; if it stacks separate, sequential exposures, it's nearly four times longer in total.
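The "twice vs. almost four times" figures above follow from simple arithmetic, assuming a hypothetical half-stop/full-stop bracket of t/2, t, 2t around a normal exposure t:

```python
# Hypothetical bracket around a normal exposure t: t/2 (under), t (normal), 2t (over).
t = 1.0

# Three separate, sequential exposures: 0.5t + t + 2t = 3.5t ("almost four times")
sequential_total = 0.5 * t + 1.0 * t + 2.0 * t

# One continuous session saved at three points: bounded by the longest
# exposure alone, i.e. 2t ("twice the time")
single_session_total = 2.0 * t

print(sequential_total, single_session_total)   # 3.5 vs 2.0
```

A wider bracket (say t/4 to 4t) stretches both numbers, but the single-session approach always costs only the longest exposure.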
I'm prepared to be amazed. But I expect that this will work about as well (HA!) as Apple's Voice Command feature.
I don't use voice command but I don't see where it has hurt the rest of the phone's functions.
Mostly correct. Depending on how it is done, it could be twice the time or almost four times the time: if it stacks "grabs" of the same exposure, expect twice the time; if it stacks separate, sequential exposures, it's nearly four times longer in total.
I wonder if they will just tweak the image settings for the same image for 3 different images and then stack them, or actually take 3 distinct frames?
I wonder if they will just tweak the image settings for the same image for 3 different images and then stack them, or actually take 3 distinct frames?
My understanding of HDR is that it requires three separate exposures.
I wonder if they will just tweak the image settings for the same image for 3 different images and then stack them, or actually take 3 distinct frames?
I don't know; all my comments are conjecture. I think there are a couple of other workable ways to do it too. I'd like to hear an iOS developer's comments on it. Whatever method Apple uses might be better than what a third-party program can achieve; I seem to recall developers don't get direct access to the sensor.
Nice of Jobs NOT to say that HDR is an iPhone 4 only feature.
Their implementation of it may be, but not the idea in general. There are tons of ways to carve up a turkey, so to speak. I'm psyched about this, as I've seen what stacking can do firsthand. All of the stacking software I've used has been community offerings without paid developers behind them, so I'm very curious what Apple does with their implementation.
I'm just wondering if they will shoot a short video while tweaking the exposure live. Most pictures are taken with a set exposure, as are videos (generally speaking). If they tweaked the exposure 'live' so to speak while capturing three frames they could still get 3 pics in a very short amount of time.
If they use a single pic and just tweak the exposure and then stack it, they avoid the time-blurring effect altogether.
In any case... can't wait to see it.
Overall, I'm really impressed with everything. I can't wait until 4.2 hits my iPad in November.
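The single-capture idea a couple of posts up -- tweak the exposure of one frame rather than shoot three -- can be sketched like this. It's a toy model (a hypothetical linear capture and made-up gain values), and the comment notes why real HDR usually still wants separate exposures:

```python
import numpy as np

rng = np.random.default_rng(2)
capture = rng.uniform(0.0, 1.5, size=(6, 6))   # one linear capture, before clipping

def fake_bracket(capture, gains=(0.5, 1.0, 2.0)):
    """Derive 'under', 'normal', and 'over' frames from a single capture by
    re-scaling -- no second exposure, so nothing can move between frames."""
    return [np.clip(capture * g, 0.0, 1.0) for g in gains]

under, normal, over = fake_bracket(capture)
assert np.all(under <= normal) and np.all(normal <= over)

# The catch: highlights the sensor already clipped stay clipped -- re-scaling
# cannot recover dynamic range that a real longer/shorter exposure would add.
```

So this route trades dynamic range for zero inter-frame motion, which is exactly the tension the thread is circling around.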
Depends. If they use stacking, they can just align the photos assuming there isn't a lot of motion going on. Depending on the number of samples, they can choose the 'best 3' to get a good pic, even with a bit of shaking going on.
http://webpages.charter.net/darksky2.../stacking.html
http://www.djcash.demon.co.uk/astro/...HowItsDone.htm
Very cool stuff. Thanks.
I don't use voice command but I don't see where it has hurt the rest of the phone's functions.
I don't think it has hurt the rest of the phone's functions.
But it is a badly implemented mess. I had better voice command on my ancient Motorola.
As the sensor doesn't physically change, I think they might have the camera take the basic shot and interpret it at the three settings, compare, mix and present.
Dunno, just a thought.
I think you might be right
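The "interpret it at the three settings, compare, mix and present" step could look something like a naive exposure fusion. This is a toy sketch with an arbitrary mid-gray weighting, not Apple's actual pipeline:

```python
import numpy as np

def fuse(frames, sigma=0.2):
    """Toy exposure fusion: weight each frame's pixels by closeness to
    mid-gray, then take the weighted average ('compare, mix and present')."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2)) + 1e-6 for f in frames]
    total = np.sum(weights, axis=0)
    return sum(w * f for w, f in zip(weights, frames)) / total

# Three made-up interpretations of the same tiny scene:
under  = np.array([[0.1, 0.05], [0.2, 0.0]])
normal = np.array([[0.5, 0.30], [0.9, 0.1]])
over   = np.array([[0.9, 0.60], [1.0, 0.4]])

fused = fuse([under, normal, over])
# Each output pixel is a convex mix of the inputs, so it stays in [0, 1]:
assert np.all((fused >= 0.0) & (fused <= 1.0))
```

Well-exposed pixels dominate the mix in each region, which is the "compare and mix" intuition made concrete.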
If they use a single pic and just tweak the exposure and then stack it, they avoid the time-blurring effect altogether.
In any case... can't wait to see it.
I hope it works well. It's hard to believe that they would feature it at the keynote if it were a POS.
I wonder how it would really work in real life, because you need a tripod to shoot the 3 photos --- otherwise they don't line up.
The iPhone now has accelerometers and gyros. Think about it.
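Gyro data could seed the alignment, but frames can also be lined up in software alone. A common image-based technique (my suggestion here, not anything Apple has described) is phase correlation, sketched for pure integer translation:

```python
import numpy as np

def estimate_shift(ref, moved):
    """Estimate the integer (dy, dx) translation between two frames via
    phase correlation -- software alignment in place of a tripod."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(moved)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak positions back to signed shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(3)
ref = rng.uniform(size=(32, 32))
moved = np.roll(ref, shift=(3, -2), axis=(0, 1))   # simulated hand shake
assert estimate_shift(ref, moved) == (3, -2)
```

Once the shift is known, each frame can be un-shifted before stacking, which is how hand-held bracketing can survive a small amount of shake.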