iPhone 6s photos get sharper, better color with less noise via A9, 12 megapixel sensor
Apple's iPhone 6 was already a great mobile camera, a fact the company touted globally on billboards featuring shots actually taken by the phone. With iPhone 6s, a faster, smarter processor and a new 12MP image sensor deliver a sharper, more vibrant photography upgrade.
In testing the new camera features of iPhone 6s and 6s Plus against last year's 6 and 6 Plus, there wasn't always an obvious, dramatic jump in image quality, as the "50 percent increase in pixels" of the new models' 12MP sensor over the previous 8MP shooter might suggest.
That's because sensor pixel density by itself doesn't necessarily result in sharper, more accurate, better looking photos. In fact, as seen on other cameraphones that beat Apple to market with 12MP or even higher resolution sensors, additional pixel density can make images noisy, thanks to the crosstalk that occurs between individual sensor pixels as they get smaller and are packed more tightly onto the sensor chip.
That explains why, last year, DxOMark's image quality benchmarks ranked the 8MP iPhone 6 and 6 Plus higher than competing phones with much higher sensor specs, including the 13MP LG G2, the 16MP Samsung Galaxy S5, the 20.7MP Sony Xperia Z3, and even the 41MP sensors used by the Nokia 808 and Lumia 1020.
Source: DxOMark
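Simple arithmetic helps explain why more megapixels don't automatically mean visibly sharper shots: pixel count grows with the square of linear resolution, so a 50 percent jump in pixels only buys about 24 percent more resolution along each axis. The quick Swift sketch below works this out; the 3264x2448 and 4032x3024 output dimensions are the commonly reported figures for the 8MP and 12MP cameras, used here as an assumption for illustration.

```swift
import Foundation

// Back-of-the-envelope math behind the "50 percent more pixels" figure.
// The output dimensions below are commonly reported values, used here
// purely for illustration.
let iPhone6Pixels  = 3264.0 * 2448.0   // ~8.0 million pixels
let iPhone6sPixels = 4032.0 * 3024.0   // ~12.2 million pixels

let pixelRatio  = iPhone6sPixels / iPhone6Pixels   // ~1.53x total pixels
let linearRatio = pixelRatio.squareRoot()          // ~1.24x resolution per axis

print(String(format: "Pixels: %.2fx, linear resolution: %.2fx", pixelRatio, linearRatio))
// Prints roughly "Pixels: 1.53x, linear resolution: 1.24x" -- half again as many
// pixels, but only about a quarter more resolving power in each dimension.
```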
For iPhone 6s, Apple is moving to a new 12MP sensor, and not just to maintain specification parity with Android and Windows Phone models that boast higher pixel counts without delivering better photos.
The new sensor in iPhone 6s and 6s Plus retains the "Focus Pixels" of iPhone 6 (phase detection pixels on the sensor chip that enable fast autofocus) while adding "deep trench isolation" between sensor pixels to minimize crosstalk and the image noise and static-like pixelation that result from it.
Additionally, Apple's proprietary Image Signal Processor (built into the A9 Application Processor) now has improved noise reduction logic that smooths out noise while retaining detail in edges and textures, detail that is often blurred away by the simpler imaging pipelines used in competing cameraphones.
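Apple doesn't publish how the A9's ISP distinguishes noise from real detail, but the general principle of edge-preserving smoothing can be sketched with a simple bilateral-style filter: each pixel is averaged with its neighbors only to the extent that they are both spatially close and similar in brightness, so flat areas get smoothed while hard edges are left largely alone. The Swift sketch below illustrates that idea on a hypothetical grayscale buffer; it is not Apple's algorithm, and the parameters are placeholders.

```swift
import Foundation

/// Illustrative bilateral-style smoothing on a grayscale buffer (values 0...1).
/// Flat regions (small intensity differences) are averaged heavily,
/// while strong edges (large differences) receive almost no smoothing.
func bilateralSmooth(_ image: [[Double]],
                     radius: Int = 2,
                     spatialSigma: Double = 1.5,
                     rangeSigma: Double = 0.1) -> [[Double]] {
    let height = image.count
    let width = image.first?.count ?? 0
    var output = image

    for y in 0..<height {
        for x in 0..<width {
            var weightedSum = 0.0
            var weightTotal = 0.0
            for dy in -radius...radius {
                for dx in -radius...radius {
                    let ny = y + dy, nx = x + dx
                    guard ny >= 0, ny < height, nx >= 0, nx < width else { continue }
                    // Spatial weight: nearby pixels count more.
                    let spatial = exp(-Double(dx * dx + dy * dy) / (2 * spatialSigma * spatialSigma))
                    // Range weight: similar intensities count more, preserving edges.
                    let diff = image[ny][nx] - image[y][x]
                    let range = exp(-(diff * diff) / (2 * rangeSigma * rangeSigma))
                    weightedSum += image[ny][nx] * spatial * range
                    weightTotal += spatial * range
                }
            }
            output[y][x] = weightedSum / weightTotal
        }
    }
    return output
}
```

The real pipeline runs in dedicated silicon at capture time and is far more sophisticated, but the distinction between flat areas and edges is the same basic idea.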
Colors that pop with increased accuracy
In some cases, photo comparisons I took between a 6 Plus and 6s Plus made the new phone's images look brighter, which might be interpreted as looking washed out. However, the color reproduction appears more accurate, and it looks like there is more detail being recorded.

These images (below) of Apple's Infinite Loop campus indicate a significant increase in image detail with less noise from iPhone 6s (top images taken with the already decent mobile camera on iPhone 6 Plus, followed by Apple's enhanced iPhone 6s Plus).
The difference in sharpness and noise grain becomes more evident as you examine fine detail, even here, where the original images have been scaled down and compressed for publication, losing much of their detail.
Increased color and image detail gives iPhone 6s users more latitude in editing their photos afterward, such as increasing saturation or adjusting light and dark details. That's in contrast to having the camera itself automatically "goose" every photo with canned effects, producing phony, oversaturated pictures that are intended to dazzle while throwing out real information in the process.
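As a concrete example of the kind of after-the-fact adjustment that benefits from a cleaner, more detailed capture, here is a minimal Core Image sketch that applies a modest saturation and brightness tweak to a saved photo. The file path and adjustment values are placeholders, not recommendations.

```swift
import CoreImage
import Foundation

// Minimal sketch: apply a gentle saturation and brightness tweak to a saved photo.
// The input path and adjustment amounts are placeholders for illustration.
let inputURL = URL(fileURLWithPath: "/path/to/photo.jpg")

if let image = CIImage(contentsOf: inputURL),
   let filter = CIFilter(name: "CIColorControls") {
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(1.15, forKey: kCIInputSaturationKey)  // gentle saturation boost
    filter.setValue(0.05, forKey: kCIInputBrightnessKey)  // slight overall brightness lift
    filter.setValue(1.0,  forKey: kCIInputContrastKey)    // leave contrast alone

    if let adjusted = filter.outputImage {
        // Render or save `adjusted` with a CIContext as needed.
        print("Adjusted extent: \(adjusted.extent)")
    }
}
```

Because the underlying file retains more real edge and color information, gentle adjustments like these hold up better than they would on an image the camera had already over-processed.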
Apple noted that its goal for the iPhone's camera system is to capture accurate images. That stands in contrast to rival cameraphone makers, who often promote features that, for example, composite several shots into a "photo" where everyone is smiling, even though that image is actually fictional.
Better color and image detail are also evident in these shots driving through Grants Pass, Oregon; the iPhone 6s image is on the bottom. In comparison, the top iPhone 6 Plus photo almost looks like it has a blurred layer of fog over it, even here where both images have been scaled down for publication.
Images compressed for publication
The image detail and color accuracy of these flowers is also apparent, and when zoomed in, you can see a broader range of detail and color information, giving you more room for editing afterward in a sharing app like Instagram.
Another example of added detail without pixelated noise or blurring can be seen in the edges of this brick building and its fire escapes and ornamentation, as well as its more accurate color.
Images compressed for publication
The increased detail and color range of photos taken by iPhone 6s can consume more storage space. In the photos above, the Apple Campus image is 1.6MB from the 6 Plus, but 2.1MB from the 6s Plus. The highway photo jumped from 2.1MB to 3MB on the 6s Plus. The flowers image was only slightly larger, growing from 1.7MB to 1.9MB, while the brick building jumped from 2MB to 2.7MB.
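For readers keeping score, the increases quoted above work out to roughly 12 to 43 percent per photo; the quick Swift calculation below uses the same file sizes.

```swift
// Percentage increase in file size for the sample photos quoted above (sizes in MB).
let samples: [(name: String, iPhone6Plus: Double, iPhone6sPlus: Double)] = [
    ("Apple campus",   1.6, 2.1),
    ("Highway",        2.1, 3.0),
    ("Flowers",        1.7, 1.9),
    ("Brick building", 2.0, 2.7),
]

for sample in samples {
    let increase = (sample.iPhone6sPlus - sample.iPhone6Plus) / sample.iPhone6Plus * 100
    print("\(sample.name): +\(Int(increase.rounded()))%")
}
// Prints: Apple campus: +31%, Highway: +43%, Flowers: +12%, Brick building: +35%
```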
However, the significantly improved images make the difference in file size worth it, particularly as the price of iCloud storage and other backup options drops to the point where a few extra megabytes of data are essentially inconsequential.
One aspect that hasn't changed in the jump from 6 to 6s is the plastic lens assembly. While Apple has put a lot of engineering work into developing a compact, multiple-lens system for iPhones, its small size and materials result in some common, annoying artifacts.
This is particularly an issue when taking photos with bright light sources in the subject area, which can result in extraneous blue dots as seen below (a problem we noted last year in our review of iPhone 6 models). When the sun or other strong lighting hits the lens assembly from the side, it can also introduce lens flares or cloud the entire image with light refraction.
Until Apple gives iPhone lenses a major update, the only way to deal with these problems is to change the angle of the camera; in some cases it also helps to use your hand to block the offending light from hitting the lens assembly from the side.
In addition to 12MP photos, iPhone 6s models now enable the capture of 12MP Live Photos, 1080p slo-mo and 4K video, all new moving formats we will explore in detail in a follow-up segment.
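For developers curious what the new video side looks like from the API, moving a capture session to 4K is essentially a preset change in AVFoundation. The sketch below is a minimal, hypothetical setup using current API names, with permissions, preview and error handling omitted; it is not a complete camera app.

```swift
import AVFoundation

// Minimal sketch of configuring 4K video capture on a device that supports it.
// Permissions, preview, threading and error handling are omitted for brevity.
let session = AVCaptureSession()

if session.canSetSessionPreset(.hd4K3840x2160) {
    session.sessionPreset = .hd4K3840x2160   // 3840x2160 video
} else {
    session.sessionPreset = .hd1920x1080     // fall back to 1080p
}

if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
}
session.startRunning()
```

Actual recording then goes through the movie output's startRecording(to:recordingDelegate:) call, which is beyond the scope of this sketch.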
Comments
My sense of the 6s Plus vs. the 6 Plus is that focusing is slower and less accurate with the new model in moderate to low light. My 6s Plus camera does a lot of "hunting" and often takes a fuzzy photo under these conditions. (Not likely a "defective" unit either, since the same behavior was seen on two different 6s Pluses.)
I do have Live Photos turned off. Yes, hopefully what I'm seeing is addressed in a sw update.
I've got a 6. I can't say I won't eventually buy a 6S or 6S Plus, but to my eyes the camera in the 6 actually looks better in many cases - particularly with video where the contrast appears better. Perhaps the image processing just needs some tweaking.
My sense of the 6s Plus vs. the 6 Plus is that focusing is slower and less accurate with the new model in moderate to low light. My 6s Plus camera does a lot of "hunting" and often takes a fuzzy photo under these conditions. (Not likely a "defective" unit either, since the same behavior was seen on two different 6s Pluses.) Since they're not mentioned in the technical specs, I'd guess the "focus pixels" of the 6 aren't present in the 6s 12MP sensor.
No need to guess: http://www.apple.com/iphone/compare/
iPhone 6s does use Focus Pixels, and actually has more than the previous generation.
Whether there is an issue with focusing in low light is a separate issue, and could be something addressable in software.
Here's a tip: in Settings --> Photos & Camera --> Record Video, set your video to 1080p at 60 fps, unless of course you want to shoot 4K. This will give you great video, especially when taking action shots.
The clouds in the photo of the car on the freeway are burnt out in places with the 6S camera, while with the older 6 Plus the clouds look more natural. And with the exception of the cropped photo of the people outside Infinite Loop, which seems to be a major improvement over the older cameras, the differences in all the others are marginal. It seems to me, from these examples anyway, that the new cameras generally overexpose shots slightly, which if so is not a good thing. According to DxO it's the latest Sony Xperia Z5 that's the camera to beat at the moment; I wonder how the 6S will fare in comparison?
Don't all cameras have lens flare?
They do, so I don't think it's fair to say it's a problem, as mentioned in the article. All cameras have the same issue. Unfortunately, you can't buy a lens hood for a smartphone camera.
One would think the iPhone 6 would be No. 1, but in reality it's ranked #8, behind last-generation models like the Note 4 and Xperia Z3.
We'll need to wait to see how they rate the 6S.
What you're seeing is a different crop. The 6 photo has less of the top of the cloud, which is much brighter. The rest, further down, doesn't look too much different. But a question is highlight recovery. Which one is better? It's hard to tell.
The only real problem with what Apple is doing here is that they still don't offer RAW files. Because of that we won't get the most out of the images. Why they aren't offering this, I don't know. I hope it's not because of the 16GB base model.
Some are better than others, of course. But lens flare in many cameras can be mitigated by using a lens hood. I tell people that they should always use a lens hood whenever it's offered.
But even some of the most expensive lenses have flare. Actually, the more complex the lens is, the more likely it will have flare. Leica's $12,000 50mm f1.2 lens is known for its flare.
From what I've seen, the new camera is better, except in very low light (the iPhone 6 is better there), where the processing that reduces noise can't compensate for the fact that you're simply getting less light per pixel, so the image will be darker. You can selectively lighten up those regions in post processing in many cases, but then you'll see the noise too. It's an edge case.
Most smartphone photos don't occur in those situations, which would be akin to shooting in a dimly lit club.
Don't all cameras have lens flare?
They do, and one reason is that many smartphone manufacturers use a sapphire glass window in front of the camera: it is that window that reflects bright light sources (initially reflected back by the lens itself) into the lens again. There is also a reason we know it is the sapphire glass doing it: that glass is the only piece that can produce flat, undistorted reflections, exactly like the ones we observe.
This would be much more exciting news than the Live Photos gimmick. That said, I bought the 6S and I am excited for the improvement over the 4 and 5C that I'm currently using.
...also referred to by far too many people as "UFOs or spirit orbs".
"The clouds in the photo of the car on the freeway are burnt out in places with the 6S camera while the older 6 Plus the clouds look more natural. And with the exception"?
Don't sweat it - the shots were not taken simultaneously, and as a result the cameras' metering systems estimated the exposures differently.
RAW files fix some of that. You have more ability to correct for noise. Most of the worst noise is color noise; reducing that goes a long way. Reducing luminance noise lowers sharpness as well. But working with a JPEG is iffy.
The sapphire cover is like any lens filter. We buy filters to protect the expensive glass behind them. Every filter adds reflections. The best are very thin and multi-coated. They can be very expensive. The sapphire is good because it's hard to scratch. I don't understand your last two sentences.