iPhone 14 will get 48MP camera sensor, folding lens arriving with iPhone 15
Apple's "iPhone 14" will use a 48-megapixel camera, and the "iPhone 15" will get better optical zoom with a periscope lens, believes Ming-Chi Kuo.
The camera in the iPhone is a major marketing point for Apple, but while the focus has been on computational photography and low-light improvements, the resolution has, for its part, stayed static at 12 megapixels for several generations. If the rumor is true, Apple could make a considerable jump in resolution for the next model.
According to a TF Securities note seen by AppleInsider, analyst Ming-Chi Kuo claims the iPhone 14 camera will have a 48-megapixel sensor. The change, Kuo forecasts, will help improve the finances of Taiwan-based camera component supplier Largan Precision.
This is not Kuo's first prediction of a 48-megapixel sensor, as a note from April said one would be on the way in 2022. The large 1/1.3-inch sensor, used for wide-angle imagery, would potentially provide a hybrid operating mode, ushering in 8K video recording. An 8K frame is 7680 by 4320 pixels, roughly 33 megapixels, so it requires more resolution than the current 12-megapixel sensors can supply.
This doesn't necessarily mean that all of the pictures taken by an iPhone will be 48 megapixels, though. The camera is also expected to use pixel binning to improve color accuracy and low-light performance, with four discrete pixels combined into each generated pixel of a low-light photo.

The end result, if the prediction holds, will still be 12-megapixel photos in low light, and up to 48 megapixels for brightly lit subjects.
Samsung's Galaxy S21 camera already uses this technique.
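To make the binning idea concrete, here is a minimal sketch in Python. It assumes a plain 8000 x 6000 luminance array and simple averaging; an actual quad-Bayer sensor bins like-colored photosites within the color mosaic, so treat this as an illustration of the principle rather than any vendor's pipeline.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Combine each 2x2 block of sensor pixels into one output pixel.

    A 48MP (8000x6000) frame becomes a 12MP (4000x3000) frame, with each
    output pixel averaging four photosites for a better low-light
    signal-to-noise ratio.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Simulated 48MP sensor readout, binned down to 12MP for low light.
frame_48mp = np.random.randint(0, 1024, size=(6000, 8000), dtype=np.uint16)
frame_12mp = bin_2x2(frame_48mp)
print(frame_48mp.shape, "->", frame_12mp.shape)  # (6000, 8000) -> (4000, 3000)
```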
Periscope camera in iPhone 15
While resolution is one improvement, another could be on the way that affects zooming. Kuo predicts that a periscope, or folded, lens arrangement could be used in an iPhone debuting in 2023.

Folded lenses use a prism and mirror arrangement to create a longer path for light to pass through the lens components, enabling optical zooms with greater range than conventional arrangements allow. Apple holds multiple patents on related lens systems, so the company is known to be examining the field.
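To see why folding the light path helps, consider a back-of-the-envelope sketch. It treats the available optical track length as a stand-in for achievable focal length, which is a simplification, and every figure in it (phone depth, module length, base focal length) is an illustrative assumption rather than an Apple specification.

```python
# Rough geometry of why a folded ("periscope") module enables longer zoom.
# All numbers below are illustrative assumptions, not Apple specifications.

phone_depth_mm = 8.0     # depth available to a conventional, upright lens stack
module_length_mm = 20.0  # lateral space available once light is folded 90 degrees
wide_focal_mm = 5.7      # focal length of a typical main camera (assumed)

# A conventional telephoto must fit its optical track within the phone's
# depth; a folded design can use the phone's width or height instead.
conventional_zoom = phone_depth_mm / wide_focal_mm
folded_zoom = module_length_mm / wide_focal_mm

print(f"conventional stack: ~{conventional_zoom:.1f}x optical zoom")
print(f"folded stack:       ~{folded_zoom:.1f}x optical zoom")
```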
Again, this is something Kuo has mentioned before: he first raised the idea in March 2020, and has brought it up several times since, including in July 2020 and March 2021.
Comments
The challenge with 8K capture will also be file size. If you take a lot of pictures or video, the library will get huge.
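For a rough sense of scale, here is a back-of-the-envelope estimate; the bitrates are assumptions based on typical HEVC encodes, not published Apple figures.

```python
# Back-of-the-envelope storage estimate for 8K video capture.
# Bitrates are assumptions based on typical HEVC encodes, not Apple specs.

bitrate_4k_mbps = 50    # rough HEVC 4K/30 rate on current iPhones (assumed)
bitrate_8k_mbps = 150   # assumed: 4x the pixels of 4K, tempered by compression

def gb_per_minute(mbps: float) -> float:
    """Convert a video bitrate in megabits per second to gigabytes per minute."""
    return mbps * 60 / 8 / 1000

print(f"4K: ~{gb_per_minute(bitrate_4k_mbps):.2f} GB/min")  # ~0.38 GB/min
print(f"8K: ~{gb_per_minute(bitrate_8k_mbps):.2f} GB/min")  # ~1.12 GB/min
```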
According to the proposed specs, the individual pixels are larger than a quarter of the current pixel size, which makes a binned group of four pixels (a 2x2 bin) larger than the current single pixel. That in turn should work out to equal or perhaps even better sensitivity.
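Sketching that arithmetic with assumed dimensions: a 1/1.3-inch type sensor is about 9.8 mm wide, and the ~1.9 micron figure is the advertised pitch of the iPhone 13 Pro wide camera; both numbers are approximations used only for illustration.

```python
# Rough pixel-pitch arithmetic for the rumored 48MP, 1/1.3-inch sensor.
# Sensor width and the current pixel pitch are approximations.

sensor_width_mm = 9.8   # ~1/1.3-inch type sensor at 4:3 aspect (assumed)
pixels_across = 8000    # 48MP at 4:3 is roughly 8000 x 6000

pitch_um = sensor_width_mm * 1000 / pixels_across  # native pitch, about 1.2 um
binned_pitch_um = pitch_um * 2                     # 2x2 binning doubles the pitch

current_pitch_um = 1.9  # approximate pitch of the iPhone 13 Pro wide camera

print(f"native pitch:  {pitch_um:.2f} um")
print(f"binned pitch:  {binned_pitch_um:.2f} um")
print(f"current pitch: {current_pitch_um:.2f} um")
# The ~2.4 um binned pixel gathers more light than today's ~1.9 um pixel,
# which is why binned 12MP output could match or beat current sensitivity.
```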
A while ago I did a very rough estimate of lens usage by examining photos on Flickr, and despite scrolling through quite a number of pages, I found just a single wide-angle iPhone 11 photo. At least 15% of iPhone X photos, however, were using the 2x zoom. So going by this, wide angle isn't too popular. And honestly, as a reasonably experienced photographer myself, I can't think of too many situations where I've thought "oh no, I can't take that photo because the lens is too long and I can't step back," but I use zoom constantly. Of course my experience isn't everyone's, but judging by the quick Flickr stats above, I'd say I'm probably in the majority.
Digging into the James Webb Space Telescope's instruments, it looks like the detectors come to a total of at least 54 megapixels (but not all used at once).
I guess it's sadly similar to the mid-'90s, when people would bring in a Polaroid with a tiny subject surrounded by a sea of nothing and ask if I could make it big for print. Whipping out my handy pica pole and proportion wheel, I tried to explain that the person in the photo would be a ghostly smudge at 357 percent. Obviously, that didn't register with Joe Dokes.
https://www.androidauthority.com/what-is-pixel-binning-966179/
The short breakdown of the smartphone camera industry is this: companies like Google and Apple excel at computational photography but typically ship lesser-quality hardware, while many Android OEMs aren't great at computational photography but are willing to spend more on camera hardware. In the end, both approaches let vendors stay competitive with one another. Pretty much all of the other vendors, including Google with its latest Pixel 6, have moved to cameras with larger pixel-binning sensors. Apple is late to this game, and it is essential that it adopt similar techniques to remain competitive.
The JWST is one of the most complex deployments, in terms of folding and unfolding, that's ever been attempted. I'm looking forward to seeing what the JWST tells us about the origins of the universe, but I'd bet there are a lot of scientists who are more than a bit nervous about whether the thing, i.e., the mirrors, will deploy perfectly. Hopefully, Apple's forays into folded optics will be less complex than what the JWST team had to deal with.