charlesn said:Even after a lifetime of photography as a prosumer hobby, to the point of having a home color darkroom, I'm finding it hard to understand what's going on with the lenses in Apple's latest Pro cameras. I assumed, for example, that the new Pro Max tetraprism lens was handling the optical zoom range not available in the regular Pro: 3.1x to 5x. Nope! A professional camera review of the lens that I read this weekend noted that the new lens only does 5x. Everything from 3.1x to 4.9x is handled on the main camera sensor. Which means what, exactly? A crop? Apple isn't helping things by claiming that its three-lens setup is actually seven lenses. A true zoom lens covers all of the focal lengths within its range optically and uses the full sensor regardless of the focal length you're using. In theory, the three lenses in the new Pro Max setup can cover all focal lengths from 13mm to 120mm. But how much of that range is achieved through true zoom optics, and how much through cropping the full image on the sensor? To go back to my earlier example: how is the main camera handling 3.1x to 4.9x if not by cropping, since the main camera does not have that optical range?
When shooting FHD (1920x1080, about 2 MP) video on a 48 MP sensor, the lossless digital zoom actually yields more information than the output frame needs, up to twice the focal length. Sensor zoom is used from 3.1x to 5x, again from 5.1x to 10x or 12x, and digital zoom beyond that.
Of course, the range for sensor zoom is smaller for UHD video, and there is none for 12 MP photos shot on the 120 mm camera's 12 MP sensor. But shooting 24 MP photos on the wide camera's 48 MP sensor yields lossless sensor zooming up to 4x. I think the weak spot is shooting photos between 4.1x and 4.9x, since it's just cropping and upscaling the 4x image until 5x takes over. Even in that range, with sufficient light, the upscaling is slight enough that pixelation will be minimal in the final result.
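The "lossless" range can be sanity-checked with simple pixel arithmetic: a crop stays lossless as long as the cropped region still contains at least as many sensor pixels as the output frame. Here's a rough sketch of that idea (the function name and megapixel figures are my own, and the model deliberately ignores Bayer demosaicing and pixel binning, which change the real numbers):

```python
import math

def max_lossless_crop(sensor_mp: float, output_mp: float) -> float:
    """Largest crop factor at which the cropped sensor region still
    holds at least as many pixels as the output frame.
    Simplified model: ignores Bayer demosaicing and pixel binning."""
    return math.sqrt(sensor_mp / output_mp)

# Illustrative figures for a 48 MP main sensor:
print(round(max_lossless_crop(48, 2.1), 1))   # 1080p video -> 4.8
print(round(max_lossless_crop(48, 8.3), 1))   # 4K UHD video -> 2.4
print(round(max_lossless_crop(48, 12), 1))    # 12 MP photo -> 2.0
```

By this crude measure, FHD video really can be cropped a long way before any upscaling is needed, while higher-resolution outputs run out of pixels much sooner.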
Feel free to correct anything I said.
AppleInsider said:The cameras largely remain the same 12MP wide and 12MP ultra-wide cameras as the iPhone 12.
A minor but beneficial change is that Apple has upgraded image stabilization: it has gone from optical image stabilization to sensor-shift image stabilization, which moves the sensor itself rather than the lens and results in sharper images. Most people won't notice this, though; it's an incremental step up.
So even the cute little iPhone 13 mini has the best smartphone camera on the planet, by a wide margin.
cloudguy said:"Now the RAM is what Apple calls unified memory ..."
Unified memory was invented and named by Nvidia back in 2013, and it is a widely known and used technology. So their options for calling it something else were a bit constrained.
I hope Apple doesn't transition Macs to ARM chips. The benefit of having a POSIX *n*x running on the same hardware as the rest of the world is hard to overstate. The thinking behind the transition is that since ARM chips are so powerful while sipping so little energy on iOS devices, imagine the workhorses they'd be on desktops. Sure? Maybe? But this would only be a short-lived advantage, lasting until the same physical obstacles now affecting Intel come up. The reason to transition is that progress on the Intel architecture has decelerated. But this is universal and will affect the ARM architecture as well. The laws of physics won't give Apple's ARM engineers any advantages over Intel engineers. ARM may have a head start, but it WILL hit the same limits at the 4nm process node, with yield problems and so on.