iPhone 16 Pro again rumored to get Tetraprism zoom lens


The Tetraprism lens introduced in the iPhone 15 Pro Max will spread to more models in the iPhone 16 generation, according to a report that appears to confirm increased orders for the component.

Render of the iPhone 16 Pro



The iPhone 15 Pro Max is the only one in the generation with a Tetraprism lens. The periscope lens system provides it with a 5x optical zoom, compared to the 3x zoom of the iPhone 15 Pro.

There have been numerous rumors that Apple will bring the lens system to more models in the iPhone 16 collection. A Friday report suggests that is now happening.

According to industry sources of DigiTimes Asia, Apple is preparing to expand the use of periscope lenses. As part of this, Apple is also expected to expand its supplier range, adding Genius Electronic Optical alongside existing supplier Largan Precision for the iPhone 16.

Repeated rumors point to the iPhone 16 Pro gaining the lens, with the Pro Max continuing to use it.

This would in theory make the extra zoom level a benefit available only to the Pro lineup, not the standard edition models.

Other rumors have claimed the Pro models will also gain an extra 48-megapixel camera sensor on the rear, replacing the 12-megapixel sensor on the ultra-wide angle camera. New coatings may also reduce lens flare, while thinner lenses could help reduce weight and make the bump smaller.

Rumor Score: Likely


Comments

  • Reply 1 of 15
    DAalseth Posts: 2,850 member
    I really hope this is true. I don’t need the extra screen, bulk, or cost of the Max, but I would LOVE to have the 5x lens. 
  • Reply 2 of 15
    dix99 Posts: 15 member
    How about they fix the scanner/bar code issue before adding other stuff? When scanning things like lottery tickets, I have to go use my old iPhone 7, as it’s clear and reads them instantly. On my 14 it’s just a blur, unless you pull the camera back, but then it’s too far to read. My next upgrade will be a low-end version, as the camera stinks. 
  • Reply 3 of 15
    mike1 Posts: 3,328 member
    dix99 said:
    How about they fix the scanner/bar code issue before adding other stuff? When scanning things like lottery tickets, I have to go use my old iPhone 7, as it’s clear and reads them instantly. On my 14 it’s just a blur, unless you pull the camera back, but then it’s too far to read. My next upgrade will be a low-end version, as the camera stinks. 

    Sounds like the problem lies somewhere between the floor and the phone.
    Never once had a problem scanning anything: bar codes, QR codes, checks, docs, etc.
  • Reply 4 of 15
    DAalseth Posts: 2,850 member
    dix99 said:
    How about they fix the scanner/bar code issue before adding other stuff? When scanning things like lottery tickets, I have to go use my old iPhone 7, as it’s clear and reads them instantly. On my 14 it’s just a blur, unless you pull the camera back, but then it’s too far to read. My next upgrade will be a low-end version, as the camera stinks. 
    That’s a problem with that particular phone, not the model. It sounds like that handset has a defect. 
  • Reply 5 of 15
    dix99 said:
    How about they fix the scanner/bar code issue before adding other stuff? When scanning things like lottery tickets, I have to go use my old iPhone 7, as it’s clear and reads them instantly. On my 14 it’s just a blur, unless you pull the camera back, but then it’s too far to read. My next upgrade will be a low-end version, as the camera stinks. 
    Nope.  I want the 5x zoom.  I don't need the lotto scanner feature.  And this is about ME.
  • Reply 6 of 15
    charlesn Posts: 930 member
    Well, at least AI is consistent in being consistently wrong. This is NOT a "5x optical zoom" lens. There are, in fact, no optical zoom lenses to be found in any iPhone ever. An optical zoom lens has optical elements (hence the name!) that move within the lens to change focal length. Why does Apple Insider find it impossible to accurately describe the lens for what it is: a fixed focal length 120mm lens which would replace the fixed focal length 77mm lens currently in the 15 Pro? iPhone lenses achieve zooming via sensor cropping (a/k/a "digital zoom") and computational tricks. The further you "zoom" with this method from the actual fixed focal length of the lens, the more you compromise quality. Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in. Photo quality vs the 15 Pro will be demonstrably inferior in the 77mm-119mm range, which is a sweet spot range for portraiture and short-medium telephoto photos in general. 
    edited July 5
  • Reply 7 of 15
    AniMill Posts: 174 member
    Just in time. My iPhone 12 Mini is showing its age, with the Camera app shutting down when trying to snap a pic, and general slowness. So I’m good with a regular sized Pro w/ the 5x. That’ll be great.
  • Reply 8 of 15
    nubus Posts: 472 member
    charlesn said:
    Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in.
    You're right that iPhones don't have optical zoom. They have what one could describe as "optical range zoom" where multiple lenses shoot the same photo, allowing an iPhone to create non-native focal length captures based on data. The 48 MP sensor seems to allow a wider distance between lenses while still delivering good photos.
  • Reply 9 of 15
    melgross Posts: 33,580 member
    nubus said:
    charlesn said:
    Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in.
    You're right that iPhones don't have optical zoom. They have what one could describe as "optical range zoom" where multiple lenses shoot the same photo, allowing an iPhone to create non-native focal length captures based on data. The 48 MP sensor seems to allow a wider distance between lenses while still delivering good photos.
    I just don’t like the word “zoom” to mean digital cropping or interpolated pixels. I become uncomfortable with that, even if the results are “ok”.
  • Reply 10 of 15
    oscarg Posts: 24 member
    48 MP on a puny-ass sensor behind a puny-ass lens. What a pathetic joke.

    Apple: Stop dicking around with gimmicks and bring back Touch ID and the headphone jack. You've had your laugh; now suck less.
  • Reply 11 of 15
    charlesn Posts: 930 member
    nubus said:
    charlesn said:
    Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in.
    You're right that iPhones don't have optical zoom. They have what one could describe as "optical range zoom" where multiple lenses shoot the same photo, allowing an iPhone to create non-native focal length captures based on data. The 48 MP sensor seems to allow a wider distance between lenses while still delivering good photos.
    Nubus, can you link to an article where this capability of shooting with multiple lenses to create non-native focal lengths is covered? This would certainly be news to me, and I read an awful lot about the iPhone Pro camera system. I do know that the iPhone is capable of shooting multiple captures of the same image with the same lens at the same time and then processing a "best" single photo out of all that data, but I've never read anything about multiple lenses capturing the same image at the same time and then interpolating a non-native focal length from that data. 
  • Reply 12 of 15
    nubus Posts: 472 member
    charlesn said:
    nubus said:
    charlesn said:
    Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in.
    You're right that iPhones don't have optical zoom. They have what one could describe as "optical range zoom" where multiple lenses shoot the same photo, allowing an iPhone to create non-native focal length captures based on data. The 48 MP sensor seems to allow a wider distance between lenses while still delivering good photos.
    Nubus, can you link to an article where this capability of shooting with multiple lenses to create non-native focal lengths is covered? This would certainly be news to me, and I read an awful lot about the iPhone Pro camera system. 
    @charlesn - it seems the Deep Fusion and now Photonic Engine tech from Apple does use multiple lenses and exposures. The 12-layer pipeline then combines them and uses AI to deliver a photo: https://cocologicshelp.zendesk.com/hc/en-us/articles/360012215577-Single-and-Multi-Lens-Mode-Ultra-Wide-Wide-Tele. I was not correct in saying it affected the "7 lenses" that the iPhone 15 Pro Max is marketed as offering. It might use multiple lenses and it might not: https://www.dpreview.com/articles/2668153890/apple-s-iphone-15-and-15-pro-imaging-tech-examined
  • Reply 13 of 15
    charlesn Posts: 930 member
    nubus said:
    charlesn said:
    nubus said:
    charlesn said:
    Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in.
    You're right that iPhones don't have optical zoom. They have what one could describe as "optical range zoom" where multiple lenses shoot the same photo, allowing an iPhone to create non-native focal length captures based on data. The 48 MP sensor seems to allow a wider distance between lenses while still delivering good photos.
    Nubus, can you link to an article where this capability of shooting with multiple lenses to create non-native focal lengths is covered? This would certainly be news to me, and I read an awful lot about the iPhone Pro camera system. 
    @charlesn - it seems the Deep Fusion and now Photonic Engine tech from Apple does use multiple lenses and exposures. The 12-layer pipeline then combines them and uses AI to deliver a photo: https://cocologicshelp.zendesk.com/hc/en-us/articles/360012215577-Single-and-Multi-Lens-Mode-Ultra-Wide-Wide-Tele. I was not correct in saying it affected the "7 lenses" that the iPhone 15 Pro Max is marketed as offering. It might use multiple lenses and it might not: https://www.dpreview.com/articles/2668153890/apple-s-iphone-15-and-15-pro-imaging-tech-examined
    Hey! Thank you so much for posting these links, nubus! Much appreciated! Although--and I think you'd probably agree with this--the more you read about how iPhone is capturing and processing photos, the more confusing it becomes to understand what is actually going on! As you point out, the dpreview article--and they're an excellent source for delving into the technical details of a camera--seems inconclusive about whether iPhone is combining shots from its multiple lenses or using advanced sensor cropping techniques to achieve non-native focal lengths. 

    That said, for all of the highly advanced computational photography that Apple claims, with machine-learning based on analyzing hundreds of thousands of photos, blah, blah, blah, my 15 Pro proved totally inept while on vacation a couple of weeks ago at capturing that most basic of "tricky" photos: a backlit portrait shot. My wife and I were on a restaurant terrace, perched on a cliff, with stunning scenery behind us, but in significantly brighter light than us. Using all the computational trickery available to the 15 Pro, producing a photo that balanced the light between the foreground human subjects and the scenic backdrop should have been a piece of cake. But no! Shot after shot--and we shot a bunch--had us well-focused but nearly invisible in dark shadow. The scenic background, however--which the camera should "know" is not the subject--was perfectly exposed. Okay, I thought... let's use fill-flash, so I set the camera's flash on auto and shot another set of pics. The flash never fired once--again, based on the point of focus, the camera should know it has human subjects in dimmer light compared to the background and compensate with flash. But all we got were more photos of us in deep shadow. Okay, fine--I'll set the flash to "On" instead of "Auto" so it will definitely fire--and it did--but it produced only marginally better results. (And yes, we were shooting from a close enough range that the flash should have had enough power to light us.) We came away with a couple of dozen photos in all, not a single one of them usable. 

    I would also say this about all the supposed "advances" in computational processing of iPhone photos: I've been a VERY avid photographer for decades, and even had my own home color darkroom back in the film days, and I would say my overall satisfaction with iPhone photos peaked with the 13 Pro and has slid ever since. There are just too many times when I'm looking at the lighting and/or color balance of the scene in front of my eyes, and the processed photo of it produced by iPhone has failed to capture it accurately. It's not egregiously wrong--it's not a "bad photo," per se--it's just "off" from what I saw, and not in a good way. It seems to me--and this is admittedly entirely anecdotal--that the more "advanced" Apple tries to get with its processing, the more it uses computational trickery to deliver the "perfect" photo, the further away it gets from delivering a photo that captures the scene as I saw it. 
    edited July 6
  • Reply 14 of 15
    nubus Posts: 472 member
    charlesn said:
    nubus said:
     it seems the Deep Fusion and now Photonic Engine tech from Apple does use multiple lenses and exposures. The 12-layer pipeline then combines them and uses AI to deliver a photo: https://cocologicshelp.zendesk.com/hc/en-us/articles/360012215577-Single-and-Multi-Lens-Mode-Ultra-Wide-Wide-Tele. I was not correct in saying it affected the "7 lenses" that the iPhone 15 Pro Max is marketed as offering. It might use multiple lenses and it might not: https://www.dpreview.com/articles/2668153890/apple-s-iphone-15-and-15-pro-imaging-tech-examined
    Hey! Thank you so much for posting these links, nubus! Much appreciated! Although--and I think you'd probably agree with this--the more you read about how iPhone is capturing and processing photos, the more confusing it becomes to understand what is actually going on! 
    Fully agree. I guess some here know people working with this or other technologies at Apple, but we would never ask, and obviously they wouldn't be able to answer. It is computational magic delivered with a deep respect for people and photography. I really want to praise the photo team at Apple for doing this and for enabling all generations to capture and share moments.
  • Reply 15 of 15
    charlesn said:
    The further you "zoom" with this method from the actual fixed focal length of the lens, the more you compromise quality. Whereas the current main lens in the 15 Pro handles the 24mm-76mm range, the main lens in the 16 Pro would have to handle the 24mm-119mm range, after which the 120mm lens would kick in. Photo quality vs the 15 Pro will be demonstrably inferior in the 77mm-119mm range, which is a sweet spot range for portraiture and short-mediium telephoto photos in general. 
    This part, right here.
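For readers curious about the numbers behind charlesn's crop-based "zoom" argument quoted above, the trade-off can be sketched with simple arithmetic. This is an illustrative sketch only: the 48 MP sensor and the 24mm/120mm focal lengths are assumptions drawn from the thread, not confirmed Apple specifications, and real iPhones layer computational upscaling on top of the raw crop.

```python
def cropped_megapixels(sensor_mp: float, native_mm: float, target_mm: float) -> float:
    """Effective megapixels left after cropping a fixed lens's sensor
    readout to simulate a longer focal length.

    The linear crop factor is target_mm / native_mm; the retained pixel
    count falls with the square of that factor, since both the width and
    the height of the kept area shrink.
    """
    crop = target_mm / native_mm
    if crop < 1:
        raise ValueError("cannot crop to a focal length wider than the lens")
    return sensor_mp / crop ** 2

# Assumed 48 MP main sensor behind a 24mm-equivalent lens:
print(cropped_megapixels(48, 24, 48))   # 2x "zoom" -> 12.0 MP
print(cropped_megapixels(48, 24, 77))   # ~3.2x "zoom" -> ~4.7 MP
print(cropped_megapixels(48, 24, 120))  # 5x "zoom" -> 1.92 MP
```

By this arithmetic, the stretch between the main lens's native focal length and the point where the tetraprism lens takes over is exactly where crop-based zoom gives up the most resolution, which is the 77mm-119mm gap the comment highlights.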