iPhone 14 will get 48MP camera sensor, folding lens arriving with iPhone 15

Apple's "iPhone 14" will use a 48-megapixel camera, and the "iPhone 15" will get better optical zoom with a periscope lens, believes Ming-Chi Kuo.




The camera in the iPhone is a major marketing point for Apple, but while the focus has been on computational photography and low-light improvements, the resolution has, for its part, remained static. If the report is accurate, Apple could make a considerable jump in resolution with the next model.

According to a TF Securities note seen by AppleInsider, analyst Ming-Chi Kuo claims that the iPhone 14 camera will have a 48-megapixel sensor. The change, Kuo forecasts, will help improve the finances of Taiwan-based camera component supplier Largan Precision.

This is not Kuo's first prediction of a 48-megapixel sensor, as a note from April said one would be on the way in 2022. The large 1/1.3-inch sensor, used for wide-angle imagery, would potentially provide a hybrid operating mode, ushering in 8K video recording.

This doesn't necessarily mean that all of the pictures taken by the iPhone will be 48 megapixels, though. The camera is also expected to use pixel binning to improve color accuracy and low-light performance, with four discrete sensor pixels combined into each generated pixel of a low-light photo.

The end result, if the report holds, would still be 12-megapixel photos in low light, and up to 48 megapixels for brightly lit subjects.
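For illustration, here is a minimal sketch of what 2x2 pixel binning does in principle, averaging each square of four sensor pixels into one output pixel. The exact merge Apple or its sensor supplier would use is not public, and the frame dimensions below are scaled-down stand-ins, not real specs.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of raw sensor values into one output pixel."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Illustrative scaled-down "low light" frame: dim, noisy photon counts.
rng = np.random.default_rng(0)
full_res = rng.poisson(lam=4.0, size=(756, 1008)).astype(float)  # stands in for a 48MP frame

binned = bin_2x2(full_res)
print(full_res.shape, "->", binned.shape)                      # one quarter as many pixels
print(round(full_res.std(), 2), "->", round(binned.std(), 2))  # per-pixel noise roughly halves
```

Because each output pixel averages four samples, the per-pixel noise drops, which is the low-light benefit binning is after.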

Samsung's Galaxy S21 camera already uses this technique.

Periscope camera in iPhone 15

While resolution is one improvement, another that affects zooming could also be on the way. Kuo predicts that a periscope or folded lens arrangement could be used in an iPhone debuting in 2023.

Folded lenses use a complicated prism and mirror arrangement to create a longer path for light to pass through the lens components, enabling optical zooms with larger ranges than conventional arrangements. Apple holds multiple patents on related lens systems, so it is known that the company is examining the field.
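As a rough illustration of why the longer light path matters (all focal lengths below are hypothetical, not Apple specs): the optical zoom range is simply the ratio of the longest focal length to the shortest, and folding the path sideways lets that longest focal length exceed the depth of the camera module.

```python
# Back-of-the-envelope numbers, purely illustrative; not Apple specifications.
wide_focal_mm = 26.0        # 35mm-equivalent of a typical main camera
tele_focal_mm = 77.0        # 35mm-equivalent of a conventional telephoto module
periscope_focal_mm = 130.0  # hypothetical folded-lens telephoto

def zoom_factor(tele_mm: float, wide_mm: float) -> float:
    """Optical zoom is the ratio of the two focal lengths."""
    return tele_mm / wide_mm

print(f"conventional telephoto: {zoom_factor(tele_focal_mm, wide_focal_mm):.1f}x")
print(f"folded telephoto:       {zoom_factor(periscope_focal_mm, wide_focal_mm):.1f}x")
# A prism turns the light 90 degrees so the lens stack lies along the phone's
# body rather than its thickness, which is what lets the longer focal length fit.
```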

Again, this is something Kuo has previously mentioned. He first raised the idea in March 2020 and has brought it up a few more times since, including in July 2020 and March 2021.




Comments

  • Reply 1 of 24
Another victim of deceptive marketing. Exactly why does the camera need to be 48 megapixels? More megapixels equals reduced low-light sensitivity. This is the same roadmap that was followed by camera manufacturers. Oh well.
  • Reply 2 of 24
    Skeptical said:
Another victim of deceptive marketing. Exactly why does the camera need to be 48 megapixels? More megapixels equals reduced low-light sensitivity. This is the same roadmap that was followed by camera manufacturers. Oh well.
I guess it will be done to enable 8K video recording. Android phones have had this feature for two years already, with not-so-great results (but Apple can surely do it better). iPhones usually have the best video capture experience among smartphones. Adding 8K video capture would be the next logical step in the evolution.
  • Reply 3 of 24
    Bring it on as long as it doesn't make the camera bump any bigger.  I find it odd that even with the Apple cover the bump still protrudes out so the phone never lies absolutely flat.  It's the first iPhone to do that (with the case on) and it seems very un-Apple to me.

The challenge with 8K capture will also be file size. If you take a lot of pictures and video, the library will get huge.
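To put a rough number on it (the bitrates below are assumptions for illustration; Apple hasn't published any 8K figures):

```python
def gb_per_minute(bitrate_mbps: float) -> float:
    """Convert a video bitrate in megabits per second to gigabytes per minute."""
    return bitrate_mbps * 60 / 8 / 1000

# Assumed bitrates, for illustration only.
for label, mbps in [("4K/30 HEVC (roughly today's iPhone)", 50),
                    ("8K/30 HEVC (hypothetical)", 200)]:
    print(f"{label}: ~{gb_per_minute(mbps):.2f} GB per minute")
# Under these assumptions, 8K works out to roughly 1.5 GB per minute of footage.
```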
  • Reply 4 of 24
    Skeptical said:
Another victim of deceptive marketing. Exactly why does the camera need to be 48 megapixels? More megapixels equals reduced low-light sensitivity. This is the same roadmap that was followed by camera manufacturers. Oh well.
    You needed to preface that with, "All things being the same..."

According to the proposed specs, the pixel size for the smaller pixels is larger than just a quarter of the current one, which makes a 2x2 bin of four pixels larger than the current single pixel. That in turn should work out to equal or perhaps even better sensitivity.
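Putting hypothetical numbers on that reasoning (neither pixel pitch is confirmed; the "current" value is roughly that of recent iPhone wide cameras, and the new one is only an assumption for a 48MP sensor of the rumored size):

```python
# Hypothetical pixel pitches in micrometres; the actual sensor is unannounced.
current_pixel_um = 1.9    # roughly a recent iPhone wide-camera pixel
new_pixel_um = 1.22       # assumed pitch for a 48MP sensor of the rumored size

current_area = current_pixel_um ** 2
binned_area = (2 * new_pixel_um) ** 2   # a 2x2 bin collects light over four pixels

print(f"current single pixel: {current_area:.2f} um^2")
print(f"2x2 binned pixel:     {binned_area:.2f} um^2")
print(f"binned / current:     {binned_area / current_area:.2f}x")
# With these assumed numbers, the binned pixel gathers more light than today's
# single pixel, which is the sensitivity argument made above.
```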
  • Reply 5 of 24
And in the process, making pictures even harder to text/mail/send/airdrop to people. Not sure which consumers benefit from the ever-increasing pixel density and file sizes. I have a Nikon D800 (40 megapixels) and have always reduced file size. I guess it will impel flash drive development....
  • Reply 6 of 24
sdw2001
    hmlongco said:
    Skeptical said:
Another victim of deceptive marketing. Exactly why does the camera need to be 48 megapixels? More megapixels equals reduced low-light sensitivity. This is the same roadmap that was followed by camera manufacturers. Oh well.
    You needed to preface that with, "All things being the same..."

According to the proposed specs, the pixel size for the smaller pixels is larger than just a quarter of the current one, which makes a 2x2 bin of four pixels larger than the current single pixel. That in turn should work out to equal or perhaps even better sensitivity.
My understanding is that there are many other variables, too, including the size and construction/type of the sensor. For example, the camera on my drone has a 1-inch CMOS sensor. Its predecessor also records in 4K and is 20MP (I believe), but has a 1/2" sensor. The result is vastly improved low-light performance. I sincerely doubt Apple would release an iPhone with low-light performance that was worse than its predecessor's.
  • Reply 7 of 24
    Skeptical said:
Another victim of deceptive marketing. Exactly why does the camera need to be 48 megapixels? More megapixels equals reduced low-light sensitivity. This is the same roadmap that was followed by camera manufacturers. Oh well.
I doubt that. Apple's Schiller has gotten on stage to explain exactly why more megapixels aren't better by themselves, and why their camera used a lower number. The entire optical engineering team hasn't flip-flopped its position "because marketing". We will have to see what is in store, but even this article suggests the full resolution is only used in some scenarios.
  • Reply 8 of 24
    doggone said:
    Bring it on as long as it doesn't make the camera bump any bigger.  I find it odd that even with the Apple cover the bump still protrudes out so the phone never lies absolutely flat.  It's the first iPhone to do that (with the case on) and it seems very un-Apple to me.
Not lying 100% flat doesn't matter. That's an idealistic stance rather than a pragmatic one. In my pocket, it doesn't matter. In my hands, it doesn't matter. Looking at my photos of loved ones, it doesn't matter. Use cases for operating the phone as it lies on a table with no hands are not very common in my life. I just tried it and it rocks slightly, but not enough to care.
  • Reply 9 of 24
    Mondain said:
And in the process, making pictures even harder to text/mail/send/airdrop to people. Not sure which consumers benefit from the ever-increasing pixel density and file sizes. I have a Nikon D800 (40 megapixels) and have always reduced file size. I guess it will impel flash drive development....
    Gosh, if only our devices could resize, scale, and compress images for transmission. Oh, wait, they already do! Yup, the images you text are not the same as out of the camera. And iOS email asks what size to send. All of which are fine for friends & family use cases. Going to do professional printing? You won't be texting those over, so it's a moot point.
  • Reply 10 of 24
AppleZulu
    Mondain said:
And in the process, making pictures even harder to text/mail/send/airdrop to people. Not sure which consumers benefit from the ever-increasing pixel density and file sizes. I have a Nikon D800 (40 megapixels) and have always reduced file size. I guess it will impel flash drive development....
It provides more headroom for editing photos and video. The end product is significantly better if editing and adjustments are made at high resolution before reducing to a lower resolution for showing, sharing, or whatever. If you reduce to the final resolution first and then apply the exact same edits and adjustments, the final product will be inferior to the same thing edited at higher resolution and then reduced. So if you want to produce a 4K video, you'll get better results if you shoot and edit in 8K and then downsample the final edit to 4K for distribution.
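A tiny numerical sketch of why the order of operations matters, using a gamma tone adjustment as a stand-in for an edit and a simple box downsample (both are placeholders for illustration, not how any real editor works internally):

```python
import numpy as np

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block into one pixel (a simple box downsample)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def edit(img: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """Stand-in for an edit: a non-linear tone (gamma) adjustment."""
    return img ** gamma

rng = np.random.default_rng(1)
high_res = rng.random((8, 8))  # stands in for a high-resolution frame

edited_then_reduced = downsample_2x(edit(high_res))
reduced_then_edited = edit(downsample_2x(high_res))

# The two orders do not yield the same pixels: a non-linear edit applied after
# resolution has been thrown away is working from less information, which is
# the point about editing in 8K before delivering in 4K.
print(np.abs(edited_then_reduced - reduced_then_edited).max())
```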
  • Reply 11 of 24
    So if the resolution and file sizes go up, the Lightning port will feel even more outdated and slow. Will Apple be kind and introduce a better port, at least on Pro models, for 8K video footage? USB-C is not likely, according to everyone, but what other options are there? Faster Lightning? Some talk about a portless iPhone, but is there a viable wireless option for this scenario? Or an updated Smart Connector standard?
  • Reply 12 of 24
crowley
    Skeptical said:
Another victim of deceptive marketing. Exactly why does the camera need to be 48 megapixels? More megapixels equals reduced low-light sensitivity. This is the same roadmap that was followed by camera manufacturers. Oh well.
    No one has marketed anything yet.  It's a rumour.
  • Reply 13 of 24
elijahg
    doggone said:
    Bring it on as long as it doesn't make the camera bump any bigger.  I find it odd that even with the Apple cover the bump still protrudes out so the phone never lies absolutely flat.  It's the first iPhone to do that (with the case on) and it seems very un-Apple to me.
Not lying 100% flat doesn't matter to me. That's an idealistic stance rather than a pragmatic one. In my pocket, it doesn't matter. In my hands, it doesn't matter. Looking at my photos of loved ones, it doesn't matter. Use cases for operating the phone as it lies on a table with no hands are not very common in my life. I just tried it and it rocks slightly, but not enough for me to care. But others may not like it.
    Fixed that for you. Your experience isn't a fact, it's an opinion. There are people who - as amazing as it may seem - don't feel the same way you do about something.
  • Reply 14 of 24
elijahg
Apple really needs to up the optical zoom capability of the cameras. One of the things keeping me on my X is the lack of optical zoom on the newer iPhones; wide angle is useless to me (and apparently to most people, since very few phone cameras have wide angle vs zoom).

A while ago I did a very rough estimate of lens usage by examining photos on Flickr, and despite scrolling through quite a number of pages I found just a single wide-angle iPhone 11 photo. At least 15% of iPhone X photos, however, were using the 2x zoom. So going by this, wide angle isn't too popular. And honestly, as a reasonably experienced photographer myself, I can't think of too many situations where I've thought "oh no, I can't take that photo because the lens is too long and I can't step back", but I use zoom constantly. Of course my experience isn't everyone's, but judging by the quick Flickr stats above I can say I'm probably in the majority.
  • Reply 15 of 24
48MP would be the logical step. It doesn't mean every image would be 48MP, but there would be the option of one for a wall-mural-sized print. Pixel combining would most likely be employed most of the time for 12MP or smaller images and for 4K or HD video, with a combination of greater dynamic range, lower noise, better low-light performance, etc. Then there is the option for greater digital zoom, producing images and video of sufficient size without any upscaling. The ridiculously high pixel count merely creates many more options for computational photography and videography. Bring it on!
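For the digital-zoom point, a quick worked example (the frame dimensions are assumptions, roughly what a 48MP 4:3 sensor would output):

```python
# Assumed output dimensions for an approximately 48MP, 4:3 frame.
full_w, full_h = 8064, 6048

# A 2x "digital zoom" is a centre crop of half the width and half the height.
crop_w, crop_h = full_w // 2, full_h // 2
megapixels = crop_w * crop_h / 1e6

print(f"full frame: {full_w} x {full_h} = {full_w * full_h / 1e6:.1f} MP")
print(f"2x crop:    {crop_w} x {crop_h} = {megapixels:.1f} MP, no upscaling needed")
```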
  • Reply 16 of 24
    elijahg said:
Apple really needs to up the optical zoom capability of the cameras. One of the things keeping me on my X is the lack of optical zoom on the newer iPhones; wide angle is useless to me (and apparently to most people, since very few phone cameras have wide angle vs zoom).

A while ago I did a very rough estimate of lens usage by examining photos on Flickr, and despite scrolling through quite a number of pages I found just a single wide-angle iPhone 11 photo. At least 15% of iPhone X photos, however, were using the 2x zoom. So going by this, wide angle isn't too popular. And honestly, as a reasonably experienced photographer myself, I can't think of too many situations where I've thought "oh no, I can't take that photo because the lens is too long and I can't step back", but I use zoom constantly. Of course my experience isn't everyone's, but judging by the quick Flickr stats above I can say I'm probably in the majority.
Barely anyone uses Flickr. Go on Instagram: ultra-wide pictures are very popular for any kind of scenario, including video.
  • Reply 17 of 24
lkrupp
Apple is yet another victim of the ‘more is better’ marketing scam. The JWST (James Webb Space Telescope) should be able to see back to the beginning of the universe, and none of its sensors are more than 40 megapixels. Unfortunately, the techie wannabe public has had this hammered into their brains by marketing blather. Faster CPU, more DRAM, more GB of storage, more pixels, faster GPU, better benchmark results: all of it always means better performance, an allegation we know to be false (as in the case of the M1 integrated RAM kerfuffle, when ’techies’ said 8 or 16 GB of RAM wasn’t enough and the M1 platform was a failure).

Digging into the James Webb Space Telescope's instruments, it looks like:

• NIRCam has 40 megapixels across ten 4 Mpx sensors (2 & 8 for different wavelengths)
• NIRSpec has another two sensors, each 4 megapixels = 8 Mpx
    • MIRI imager has 1 Mpx, and MIRI spectrometer has 2 Mpx
    • NIRISS has a 4 megapixel sensor
    • FGS (Fine Guidance System) has another sensor

    so a total of at least 54 megapixels (but not used all at once).


  • Reply 18 of 24
    Mondain said:
And in the process, making pictures even harder to text/mail/send/airdrop to people. Not sure which consumers benefit from the ever-increasing pixel density and file sizes. I have a Nikon D800 (40 megapixels) and have always reduced file size. I guess it will impel flash drive development....
    Gosh, if only our devices could resize, scale, and compress images for transmission. Oh, wait, they already do! Yup, the images you text are not the same as out of the camera. And iOS email asks what size to send. All of which are fine for friends & family use cases. Going to do professional printing? You won't be texting those over, so it's a moot point.
Testify! In fact, compression is so commonplace it's hard to reverse. People email photos for my weekly newspaper as thumbnails, then are dumbfounded when I explain as simply as possible to please kindly select the largest size offered. In many cases, they received the photo at a smaller size or swiped it from Farcebook, and it's hopeless.

I guess it's sadly similar to the mid-'90s, when people would bring in a Polaroid with tiny subject matter surrounded by a sea of nothing and ask if I could make it big for print. Whipping out my handy pica pole and proportion wheel, I tried to explain that the person in the photo would be a ghostly smudge at 357 percent. Obviously, that didn't register with Joe Dokes.
  • Reply 19 of 24
    There are pros and cons to this technique, but overall, with the proper computational photography, this can produce better quality images.  For those interested in learning more about how this works, there are plenty of articles that cover it well.  Below is an example.

    https://www.androidauthority.com/what-is-pixel-binning-966179/

The short breakdown of the smartphone camera industry is this. Companies like Google and Apple excel in computational photography but typically ship with lesser-quality hardware. Many of the Android OEMs aren't great at computational photography, but they are willing to spend more on camera hardware. In the end, both approaches allow vendors to be competitive with one another. Pretty much all of the other vendors, including Google with their latest Pixel 6, have moved to cameras with larger pixel-binning sensors. Apple is late to this game, and it is essential that they adopt similar techniques in order to remain competitive.
  • Reply 20 of 24
dewme
    lkrupp said:
Apple is yet another victim of the ‘more is better’ marketing scam. The JWST (James Webb Space Telescope) should be able to see back to the beginning of the universe, and none of its sensors are more than 40 megapixels. Unfortunately, the techie wannabe public has had this hammered into their brains by marketing blather. Faster CPU, more DRAM, more GB of storage, more pixels, faster GPU, better benchmark results: all of it always means better performance, an allegation we know to be false (as in the case of the M1 integrated RAM kerfuffle, when ’techies’ said 8 or 16 GB of RAM wasn’t enough and the M1 platform was a failure).

Digging into the James Webb Space Telescope's instruments, it looks like:

• NIRCam has 40 megapixels across ten 4 Mpx sensors (2 & 8 for different wavelengths)
• NIRSpec has another two sensors, each 4 megapixels = 8 Mpx
    • MIRI imager has 1 Mpx, and MIRI spectrometer has 2 Mpx
    • NIRISS has a 4 megapixel sensor
    • FGS (Fine Guidance System) has another sensor

    so a total of at least 54 megapixels (but not used all at once).


    I totally agree that the sensor design should match the functional requirements of the product. Not sure this is a valid pixels-to-pixels comparison since the JWST is an infrared telescope that’s not operating in the (human) visible spectrum. Being designed for longer wavelength detection means larger individual pixels. I’d imagine there was a practical and cost limitation on how large the sensor arrays could be made to meet mission requirements and still fit into the total package needed for deployment. It’s safe to assume that the designers of the JWST incorporated everything they needed and could afford into the design of the sensors. 

    The JWST is one of the most complex deployments in terms of folding-unfolding that’s ever been attempted. I’m looking forward to seeing what the JWST tells us about the origins of the universe but I’d bet that there are a lot of scientists who are more than a bit nervous about whether the thing, i.e., mirrors, will deploy perfectly. Hopefully, Apple’s forays into folding sensor arrays will be less complex than what the JWST team had to deal with.