Apple has big camera upgrades lined up through iPhone 19 Pro
Analyst Ming-Chi Kuo claims to have details about 2024's iPhone 16 Pro range, and how optical zoom will keep improving up to 2027's iPhone 19 Pro.
Render of the iPhone 16 Pro and its camera system
Currently, only the iPhone 15 Pro Max features a tetraprism lens, which gives it a greatly increased optical zoom. As previously reported, the forthcoming iPhone 16 Pro is expected to get this improved camera system.
Kuo says this is the sole camera change: both the iPhone 16 Pro and iPhone 16 Pro Max will feature the tetraprism lens, and it will be the same module introduced with 2023's iPhone 15 Pro Max.
However, in a blog post, Kuo says more significant updates will come with 2025's iPhone 17 Pro Max.
iPhone / Crystal-Optech as a leading beneficiary in iPhone tetraprism upgrade roadmap https://t.co/bveXq7UJ5D
-- Ming-Chi Kuo (@mingchikuo)
He says that model, specifically the iPhone 17 Pro Max, will feature an upgraded tetraprism lens to improve both zoom functionality and photo quality. Where the current tetraprism camera has a 1/3.1" 12MP CMOS image sensor (CIS), the top 2025 iPhone will have a 1/2.6" 48MP CIS.
While that will be in the iPhone 17 Pro Max, it's not known whether it will also be in the iPhone 17 Pro. However, Kuo says that whether it is or not, the new tetraprism camera will be in 2026's iPhone 18 Pro.
For 2027's iPhone 19 Pro Max, Kuo says that the tetraprism lens will again be improved. While he has no specific details, he claims the upgrade will be more substantial than the ones before it.
Kuo says the iPhone 19 Pro Max will feature a still further improved optical zoom. Apple may not continue to call it a tetraprism lens, though, as one possibility is a design that uses more, smaller prisms.
Separately, recent rumors have pointed to the iPhone 16 Pro -- and also the regular iPhone 16 -- gaining bigger and brighter screens.
Rumor Score: Likely
Comments
For my part, I would like a camera that's useful for taking the most mundane pictures without the constant frustrations I always experience. I am standing in front of my shiny car attempting to take a pic of a panel that needs scratch repair. What do I see? My reflection. So I try to lean away, and what do I see? My hands hanging goofy-like trying to shoot that pic. How bloody annoying.
Then, when you are trying to flog something online, same deal. A stainless steel kettle, and there you are, like some skulking creep in the reflection.
These are my bugbears about all this talk about cameras. As an average camera user, I don't care how many pixels and how many lenses there are when I can't solve the simple, straightforward problem of reflection. Of course, you are going to jump in and say, hey, get a tripod. Why didn't I think of that? Try lining up that shot, Sherlock.
The other issue with phones, setting aside the all-pervasive issue with cameras, is the utterly stupid inadvertent touching of the screen: suddenly, when you look at your phone screen, you are facing some alien in outer space trying to flog you a bunch of stellar dust. Huh? How the hell did I get there?
And finally, there is this dot-com, Dutch Tulip Mania-style frenzy about AI. Every few years the IT industry sinks into the doldrums and then needs a spark; now it's AI. Well, there was a company called Borland, run by a bloke called Philippe Kahn, which released a piece of software called Turbo AI back in the last century.
Guess what the challenge was? Data. The data that the IT industry is going to scrape to give you intelligent anything is your data manipulated by algorithms, in case you haven't figured that out.
In other words, it's not organic AI; it's old, crap data being scraped from humongous warehouses filled to the rafters with servers housing giga-mounds of data. And the more we use our phones and computers to search and do anything, the more the data grows exponentially. But have you noticed this? As soon as you search for a warm toilet seat cover, on your next search there are ten vendors that want to flog you warm toilet seat covers. That's not generative or predictive. That's just plain old stupid AI mimicking: you searched for this, so I am going to give you the same.
Anyone who's ever traded stocks will have noticed the disclaimer: past winnings are no guarantee of future earnings. And that disclaimer ought to be slapped on any AI product in the future: past data is being used to give you your answers, but it is no guarantee of anything useful. It's the old saying: garbage in, garbage out.
Nvidia is riding a storm of success toward mega-trillions; watch how they plummet back to earth, same as the Dutch Tulip Mania and the dot-com bust, when ordinary folks work out that there is no magic bullet in AI. Just the same-o, same-o.
The day that someone delivers organic AI is the day I will sit up and take notice. Till then, one big, fat yawn.
Come to think of it, I believe that is what Humane AI was trying to deliver. Real-time AI. See how well they did?
“I predict…. cameras will keep getting better.”
Am I an analyst now?
Sometimes, a person has to compromise...
Myself, I really just want under screen Touch ID, and true optical 5x zoom, rumored at one time to arrive in the iPhone 18...
As for people taking pictures of birds or whatever, it's not up to you to decide what they need in a camera. You also have no idea what they're thinking. If an iPhone doesn't do it for you, buy a flip phone and spend a few thousand on a "real" camera and lenses to see if you can do better.
I am looking forward to all the improvements Apple can make to the cameras, the computational area, and even some AI assist. I'm not ashamed to say that after a career of doing commercial work, running a commercial lab, and consulting with Kodak and Apple about color standards many years ago, and despite having a Canon R5 with a number of lenses, I use my iPhone Pro Max for most of my photography, as for a lot of work it's actually good enough.
If you're interested, read below for some TL;DR thoughts about how these lenses work.
For a true optical zoom as used on 'real' cameras, there are many stacked lens elements inside the long lens you see mounted on the camera. The zoom happens when the distance between some of those lens elements is physically changed, altering the magnification of the light projected onto the sensor (or film) at the back of the camera. The critical thing to understand here is that the full area of the sensor receives the enlarged projected image. When "zoomed in," that small, distant object is optically "blown up" by the lenses, filling the full area of the sensor. What this means is that the resolution of the sensor's image does not change. If the lenses are of high quality, the zoomed-in image will be sharp and undistorted.
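If you want to put numbers on that, here's a quick back-of-the-envelope sketch (the 36 mm full-frame sensor width is purely an illustrative choice, not any phone's spec). Notice that focal length alone determines how much of the scene lands on the sensor; the pixel count never enters the formula, which is exactly the point above:

```python
import math

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    # Horizontal angle of view for a simple thin-lens model:
    # AOV = 2 * atan(sensor_width / (2 * focal_length))
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SENSOR_WIDTH = 36.0  # mm; full-frame width, chosen purely for illustration
for f_mm in (24, 70, 120, 240):
    print(f"{f_mm:>3} mm lens -> {angle_of_view_deg(SENSOR_WIDTH, f_mm):5.1f} deg of view")
```

Zooming from 24 mm to 240 mm narrows the view about tenfold, yet the sensor still records its full pixel count at every step.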
Another thing useful to note about "real" cameras is the use of interchangeable lenses. Most are "prime" or fixed focal length lenses, meaning the amount they're zoomed in (or out to wide angles) is static. The benefit of prime lenses is that, because there are no moving parts, there are no compromises: they can be optimized perfectly for their focal length, and thus be sharper and less prone to distortions than a movable zoom lens that must accommodate the physics of multiple focal lengths. For professional photographers, zoom lenses are used for convenience and speed in changing focal lengths. If a professional photographer is in a controlled environment and has foreknowledge of the focal length needed for what they're shooting, they'll swap out and use the exact prime lens for the purpose.
This is important to consider, because in reality, as best as I can tell, all of the lenses on an iPhone, including the tetraprism lens, are actually fixed, prime lenses. The only moving parts in them are for focusing and image stabilization*.
For smartphone cameras, a severe limitation is the front-to-back depth of the phone itself. Zoom and telephoto lenses on "real" cameras are long for a reason. The physical distance between lenses and the sensor is important to how much an image can be magnified before it's projected on the sensor.
The tetraprism lens in an iPhone offsets the sensor, and effectively uses multiple prisms (in one glass element) that allow the light to enter the front of the lens and then the prism, and then reflect back and forth four times before reaching the sensor. To achieve the same result without the tetraprism, the camera lens would have to stick out the back of the phone significantly further, which is something nobody wants. In reality, this lens is not a movable zoom lens, but a fixed focal length telephoto lens.
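To see why the folding matters, here's some back-of-the-envelope arithmetic with made-up numbers (not Apple's actual specs): each reflection lets the light traverse the module again, so a thin module can contain a much longer optical path, which is what makes a longer effective focal length possible without a protruding lens:

```python
# Illustrative arithmetic only -- these numbers are assumptions, not Apple's specs.
segment_mm = 6.0          # hypothetical length of one pass through the prism block
reflections = 4           # the "tetra": light folds four times inside the prism
passes = reflections + 1  # four folds give roughly five passes through the block

effective_path_mm = passes * segment_mm
print(f"~{effective_path_mm:.0f} mm of optical path folded into a ~{segment_mm:.0f} mm deep module")
```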
Here's where the loosey-goosey part comes in. The proper definition of optical zoom means the lens elements physically move and change in order to magnify smaller objects to be larger and larger on the camera sensor, while the resolution of the image remains fixed at the sensor's full megapixel resolution.
"Digital zoom" involves an unmagnified image reaching the sensor, and then the camera software crops a portion of that image and "blows it up" on the display. This quickly yields inferior results, because you are just magnifying a smaller number of pixels from the middle of the sensor, thus lowering the resolution of the final image.
What the so-called optical zoom on the iPhone is really doing is two things. First, it's selecting which of the lenses on the phone will be used. This is like when you swap out for a more powerful telephoto lens on a "real" camera. Then it's doing the digital cropping thing, but defining a certain minimum megapixel resolution as the "standard" resolution and maintaining that minimum by starting with a higher-resolution sensor before cropping out that acceptable minimum resolution portion of the image, e.g., cropping a 12 megapixel portion out of a 48 megapixel image. So calling this process optical zoom is, pun intended, a bit of a stretch.
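The crop arithmetic is simple if you remember that megapixels scale with the square of linear dimensions. A quick sketch of that 48-to-12 example:

```python
import math

def crop_zoom_factor(sensor_mp, output_mp):
    # Megapixels scale with the square of linear dimensions, so the linear
    # magnification gained by a centered crop is the square root of the ratio.
    return math.sqrt(sensor_mp / output_mp)

print(crop_zoom_factor(48, 12))  # 2.0 -- a 12MP crop from 48MP doubles the reach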
*Image stabilization involves suspending the camera lens and/or the sensor in a magnetic field, and using gyroscopes to measure small movements (your hands shaking), and moving the lens and/or sensor in the opposite direction to compensate for that shake. This prevents blurring of the image during exposure so that you get sharper photos as a result. So image stabilization involves moving parts in the lens, but doesn't involve changing the focal length of the lens.
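To give a feel for the scale of the corrections involved, here's a rough sketch using the small-angle relationship for image shift (the focal lengths are hypothetical, just to contrast a wide lens with a folded telephoto). It also shows why telephoto shots are more shake-sensitive: the same tremor moves the image further on the sensor.

```python
import math

def image_shift_um(focal_length_mm, shake_deg):
    # Image-plane displacement from a small angular shake: x = f * tan(theta).
    # The stabilizer must move the lens and/or sensor to cancel this shift.
    return focal_length_mm * math.tan(math.radians(shake_deg)) * 1000.0  # micrometres

for f_mm in (6.0, 15.0):  # hypothetical wide vs. folded-telephoto focal lengths
    print(f"f = {f_mm:4.1f} mm: 0.1 deg of shake moves the image {image_shift_um(f_mm, 0.1):4.1f} um")
```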
That said, some phones have had this (ever-improving) reflection removal technology for a while now.
This is from four years ago:
Similar technologies are used for casual underwater photography by smartphones.
Thanks for the Huawei ad. "Reflection removal" done after the fact, whether via an AI app or meticulously by hand in Photoshop, is an exercise of retouching an image to paint a preferred but fictional representation. Changing lighting, angles and/or using a polarizing filter when capturing the photo is about removing unwanted light, but still capturing the desired actual light reflected from the subject.
But in terms of actual help, here are two suggestions, this first one having already been made: the classic OG camera solution to reflections is a polarizing filter. But hey, maybe you don't want to mess around with filters for your iPhone. A second strategy, and one I often use if reflections are a problem, is to step further away from my subject and use the telephoto lens. Not always, but often, further distance will eliminate or greatly dissipate the reflection I see, and being at a greater distance gives you more room to maneuver out of the camera frame while still getting your shot, without the "dangling hands" problem you mention. I suspect, if it isn't already available, there will be some kind of AI polarizing filter you can use after you've taken the shot that will eliminate reflections.
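For what it's worth, the reason the polarizer trick works is standard textbook optics: reflections off non-metallic surfaces (glass, water, a car's clear coat) are strongly polarized near Brewster's angle, so rotating the filter to cross that polarization suppresses them. A quick sketch, with typical refractive indices as assumptions:

```python
import math

def brewster_angle_deg(n_air, n_surface):
    # Reflections off a dielectric are fully polarized at Brewster's angle:
    # tan(theta_B) = n_surface / n_air.
    return math.degrees(math.atan(n_surface / n_air))

print(f"glass (n=1.5):  {brewster_angle_deg(1.0, 1.5):.1f} deg")   # ~56.3
print(f"water (n=1.33): {brewster_angle_deg(1.0, 1.33):.1f} deg")  # ~53.1

# Malus's law: a polarizer rotated theta degrees from the light's polarization
# axis transmits I = I0 * cos^2(theta); crossed (90 deg) blocks it entirely.
for theta in (0, 45, 90):
    print(f"polarizer at {theta:2d} deg -> {math.cos(math.radians(theta))**2:.2f} of the glare")
```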
I don't recall exactly what iOS version first added these tools to Crop in Photos, but it seems like it was around 2021.
iPhone 12 Pro screenshot displaying original image with building 'leaning' away, cropped image with perspective controls applied, and the tilt/shift controls displayed below.
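For the curious, those perspective controls amount to applying a homography, a 3x3 projective warp, to the image. Here's a minimal sketch of the idea using OpenCV; the corner coordinates and filenames are made up for illustration, assuming the goal is to straighten a leaning facade:

```python
import cv2
import numpy as np

# Hypothetical pixel coordinates of a leaning facade's corners (src), and
# where they should land once the verticals are straightened (dst).
src = np.float32([[110, 40], [530, 60], [20, 900], [620, 880]])
dst = np.float32([[60, 40], [580, 40], [60, 900], [580, 900]])

img = cv2.imread("leaning_building.jpg")   # placeholder filename
H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography from 4 point pairs
out = cv2.warpPerspective(img, H, (img.shape[1], img.shape[0]))
cv2.imwrite("straightened.jpg", out)
```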