Apple's 5G 'iPhone 12' may use sensor-shifting technology to stabilize images

Posted in General Discussion, edited December 2019
A new rumor claims that Apple's forthcoming 5G "iPhone 12" will come with sensor-shift stabilization, meaning that the actual camera sensor and not just the lens will move within the phone case to keep an image steady.

According to an unconfirmed rumor, some models of Apple's 2020 "iPhone 12" lineup may feature sensor-shift stabilization as well as 5G. When excessive motion is detected, a sensor-shift system moves the iPhone's camera sensor to compensate.

According to DigiTimes, industry sources say this sensor-shift stabilization will be included in some models of the "iPhone 12," which is expected to be released in September 2020.

Sensor stabilization has the benefit that it works with any of the lenses on the iPhone -- including any external lenses that you add.

An alternative is lens stabilization, where similar motion detection makes the lens adjust to match. This is generally more effective than sensor stabilization, but it can require special lenses. Since the iPhone 8 and iPhone X, iPhones have included a form of optical image stabilization that moves the lens in response to movement.
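Both approaches react to the same signal: detected motion is integrated into a camera rotation, and the sensor (or lens) is driven an equal amount in the opposite direction. The following is a minimal toy sketch of that feedback idea -- the focal length, pixel pitch, function names, and sample values are all illustrative assumptions, not Apple's implementation:

```python
import math

# Illustrative optics for a small-sensor phone camera (assumed values).
FOCAL_LENGTH_MM = 4.25   # lens focal length
PIXEL_PITCH_MM = 0.0014  # 1.4-micron pixels

def required_shift_px(angle_rad: float) -> float:
    """Image displacement (in pixels) caused by a small camera rotation."""
    return FOCAL_LENGTH_MM * math.tan(angle_rad) / PIXEL_PITCH_MM

def stabilize(gyro_samples, dt):
    """Integrate gyro angular velocity (rad/s) and emit opposing sensor shifts."""
    angle = 0.0
    shifts = []
    for omega in gyro_samples:
        angle += omega * dt                       # accumulate rotation
        shifts.append(-required_shift_px(angle))  # move sensor the opposite way
    return shifts

# A brief shake sampled at 1 kHz: two samples one way, one larger sample back.
print(stabilize([0.01, 0.01, -0.02], 0.001))
```

The last gyro sample cancels the accumulated rotation, so the final commanded shift returns to zero -- the sensor ends up centered again once the shake stops.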

DigiTimes has a good track record when it comes to moves inside the supply chain. Its accuracy when predicting specific features, like Friday's report on sensor-shift stabilization, is nowhere near as good.

Apple's 2020 "iPhone 12" range is now expected to include a total of four different models. According to analyst Ming-Chi Kuo, they are likely to vary chiefly by screen, from 4.7-inch LCD to 6.7-inch OLED, and by number of cameras, from a single- to a triple-lens system.

Kuo reports that all four will feature 5G, using Qualcomm modems, and expects these models to prompt a "supercycle" of upgrades for users.

Comments

  • Reply 1 of 4
    hodar Posts: 359 member
    Not sure if Adobe patented this or not - but their idea was very clever.  They analyze the picture as a whole, use an algorithm to determine any movement that occurred during the exposure, and then remove that movement shift from the picture.  The result was a remarkably clearer picture.

    I would think that any smartphone that added this algorithm into its post-processing camera application - whether that movement came from sensors, or from analysis of the photo data - would be a HUGE value add.  To the best of my knowledge, no one does this - I may be wrong (and I hope I am).
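    The approach described above -- determine the motion that occurred during the exposure, then remove its effect -- can be illustrated with a toy one-dimensional example. The two-tap blur model and all names here are illustrative assumptions, not Adobe's method:

```python
# Toy deblurring sketch: if the motion during exposure is known, its
# smearing effect can be inverted exactly. Here the "motion" is a simple
# two-tap horizontal blur over a 1D row of pixel values.

def motion_blur(signal):
    """Each output sample averages the current and previous input sample."""
    out = [signal[0]]                  # first sample has no predecessor
    for i in range(1, len(signal)):
        out.append((signal[i] + signal[i - 1]) / 2)
    return out

def deblur(blurred):
    """Invert the blur above by back-substitution: s[i] = 2*b[i] - s[i-1]."""
    sharp = [blurred[0]]
    for i in range(1, len(blurred)):
        sharp.append(2 * blurred[i] - sharp[-1])
    return sharp

edges = [0, 0, 100, 100, 0, 0]         # a sharp edge in the scene
smeared = motion_blur(edges)           # the edge is smeared by motion
print(deblur(smeared))                 # recovers the original sharp edge
```

    Real photos make this much harder -- the motion must first be estimated from the image itself, and inversion amplifies noise -- which is why the idea is clever rather than trivial.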
  • Reply 2 of 4
    gatorguy Posts: 24,390 member
    hodar said:
    Not sure if Adobe patented this or not - but their idea was very clever.  They analyze the picture as a whole, use an algorithm to determine any movement that occurred during the exposure, and then remove that movement shift from the picture.  The result was a remarkably clearer picture.

    I would think that any smartphone that added this algorithm into its post-processing camera application - whether that movement came from sensors, or from analysis of the photo data - would be a HUGE value add.  To the best of my knowledge, no one does this - I may be wrong (and I hope I am).
    I think a slight shift at the pixel level is what enables the Google Pixel's Super Res Zoom feature, isn't it? I'm thinking it is also what makes the astrophotography mode on this year's Pixel 4 possible.
    edited December 2019
  • Reply 3 of 4
    tmay Posts: 6,441 member
    gatorguy said:
    hodar said:
    Not sure if Adobe patented this or not - but their idea was very clever.  They analyze the picture as a whole, use an algorithm to determine any movement that occurred during the exposure, and then remove that movement shift from the picture.  The result was a remarkably clearer picture.

    I would think that any smartphone that added this algorithm into its post-processing camera application - whether that movement came from sensors, or from analysis of the photo data - would be a HUGE value add.  To the best of my knowledge, no one does this - I may be wrong (and I hope I am).
    I think a slight shift at the pixel level is what enables the Google Pixel's Super Res Zoom feature, isn't it? I'm thinking it is also what makes the astrophotography mode on this year's Pixel 4 possible.
    https://www.dpreview.com/articles/7921074499/five-ways-google-pixel-3-pushes-the-boundaries-of-computational-photography

    Google is using OIS to shift the lens, whereas Apple's rumored technique is to shift the sensor. Though there is no mention of Apple using a pixel-shift technique to enhance resolution, it would appear that such a technique would work with both the rumored sensor shift and existing OIS.

    The advantage of sensor shift would come down to working well with any attached lens accessory, which is why high-end mirrorless cameras typically have sensor shift and also offer compatibility with lens OIS.
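    The pixel-shift idea mentioned here can be illustrated with a toy one-dimensional example: frames captured a fraction of a pixel apart are interleaved into one higher-resolution result. The functions, the half-pixel offsets, and the sample values below are all illustrative assumptions, not Google's or Apple's pipeline:

```python
# Toy pixel-shift super-resolution in 1D: two frames sampled half a pixel
# apart are interleaved into one signal at twice the sensor resolution.

def downsample(signal, offset, factor=2):
    """Sample every `factor`-th value, starting at `offset` (the sub-pixel shift)."""
    return signal[offset::factor]

def merge(frames):
    """Interleave frames captured at successive sub-pixel offsets."""
    out = []
    for values in zip(*frames):
        out.extend(values)
    return out

scene = [10, 20, 30, 40, 50, 60]    # "true" high-resolution scene
frame_a = downsample(scene, 0)      # sensor at rest:           [10, 30, 50]
frame_b = downsample(scene, 1)      # sensor shifted half a px: [20, 40, 60]
print(merge([frame_a, frame_b]))    # recovers [10, 20, 30, 40, 50, 60]
```

    In a real camera the offsets come from hand shake or deliberate sensor/lens movement and the frames must be aligned before merging, but the payoff is the same: detail finer than the sensor's native pixel grid.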
    edited December 2019
  • Reply 4 of 4
    The first of many 5G rumors concerning the next Apple phone. But I think Apple has shown they have bigger fish to fry.
    But in the meantime, 5G succeeds as click-bait.
    edited December 2019