Future iPhones may use optical image stabilization to create 'super-resolution' images

Posted in General Discussion, edited May 2014
Documents discovered on Thursday show Apple is hard at work on tech to take iPhone picture quality to new levels, specifically a "super-resolution" imaging engine that uses optical image stabilization to capture multiple samples that are then stitched together to form an incredibly high-density photo.



With the iPhone already one of the world's most popular digital consumer cameras, Apple is looking to build on its lead in the sector with new technology that significantly boosts picture resolution without the need for more megapixels.

According to a patent application published by the U.S. Patent and Trademark Office on Thursday, titled "Super-resolution based on optical image stabilization," Apple is exploring alternative uses for existing OIS tech, a markedly different path from the one taken by rival smartphone makers.


Source: USPTO


In very basic terms, the invention uses an optical image stabilization (OIS) system to take a batch of photos in rapid succession, each at a slightly offset angle. The resulting samples are fed into an image processing engine that creates a patchwork super-resolution image.

Traditional OIS systems use inertial or positioning sensors to detect camera movement, like shaking from an unsteady hand. Actuators attached to a camera's imaging module (CCD, CMOS or equivalent sensor), or in some cases its lens elements, then shift the component along an equal and opposite vector to compensate for the unwanted motion.
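
For illustration only, here is a minimal sketch of that compensation step, assuming a gyroscope-style inertial sensor and a small-angle approximation; the function and parameter names are hypothetical, not drawn from the filing:

```python
import numpy as np

def ois_compensation(gyro_rate_dps, focal_length_px, dt):
    """Convert an angular shake rate from the inertial sensor into an
    equal-and-opposite (pitch, yaw) shift for the actuator. Hypothetical
    sketch; not Apple's implementation."""
    # Angular displacement accumulated over the control interval, in radians.
    angle_rad = np.radians(np.asarray(gyro_rate_dps, dtype=float)) * dt
    # Small-angle approximation: image shift ~ focal length (in pixels) * angle.
    image_shift_px = focal_length_px * angle_rad
    # Drive the sensor or lens element along the opposite vector to cancel it.
    return -image_shift_px

# Example: (pitch, yaw) shake of (2, -1) deg/s over a 5 ms control interval.
print(ois_compensation((2.0, -1.0), focal_length_px=2800, dt=0.005))
```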

Physical modes of stabilization usually produce higher quality images compared to software-based solutions. Whereas digital stabilization techniques compensate for shake by pulling from pixels outside of an image's border or running a photo through complex matching algorithms, OIS physically moves camera components.
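
A crude sketch of the crop-based digital approach described above, assuming the shake has already been measured in pixels and a fixed border of spare pixels is reserved around the output (names and values are illustrative, not from any shipping implementation):

```python
import numpy as np

def digital_stabilize(frame, image_shift_px, margin=64):
    """Crop-based electronic stabilization sketch: reserve `margin` spare
    pixels on every side, then slide the crop window to follow the measured
    image shift so the framed content stays put. Illustrative only."""
    dy = int(np.clip(round(image_shift_px[0]), -margin, margin))
    dx = int(np.clip(round(image_shift_px[1]), -margin, margin))
    h, w = frame.shape[:2]
    top, left = margin + dy, margin + dx
    return frame[top:top + h - 2 * margin, left:left + w - 2 * margin]

# Example: content drifted 3 px down and 5 px left between frames.
stabilized = digital_stabilize(np.zeros((480, 640)), image_shift_px=(3, -5))
print(stabilized.shape)  # (352, 512)
```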



Apple's filing takes the traditional OIS system and combines it with advanced image processing techniques to create what it calls "super-resolution" imaging.

According to one embodiment, the system comprises a camera, an actuator for OIS positioning, a positioning sensor, an inertial sensor, an OIS processor and a super-resolution engine. A central processor, such as the iPhone 5s' A7 system-on-chip, governs the mechanism and ferries data between components.
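
As a rough, hypothetical illustration of how those pieces might fit together (none of the names or interfaces below come from the filing), the capture loop could be organized along these lines:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, List, Tuple

@dataclass
class SuperResolutionPipeline:
    """Illustrative grouping of the components the application lists;
    the interfaces are assumptions, not Apple's design."""
    expose: Callable[[], Any]                              # imaging module: returns one raw frame
    move_actuator: Callable[[Tuple[float, float]], None]   # commands a sub-pixel tilt
    read_position: Callable[[], Tuple[float, float]]       # position sensor: measured tilt
    samples: List[Tuple[Any, Tuple[float, float]]] = field(default_factory=list)

    def capture_burst(self, offsets):
        """Step the actuator through known sub-pixel offsets and record one
        low-resolution sample, plus the measured shift, at each position."""
        self.samples.clear()
        for offset in offsets:
            self.move_actuator(offset)
            self.samples.append((self.expose(), self.read_position()))
        return self.samples
```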

In practice, a user may be presented with a super-resolution option in a basic camera app. When the shutter is activated, either by a physical or on-screen button, the system fires off a burst of shots, much like the burst mode iOS 7 supports on the iPhone 5s.



While taking the successive image samples, a highly precise actuator tilts the camera module in sub-pixel shifts along the optical path -- across the horizon plane or picture plane. In some embodiments, multiple actuators can be dedicated to shifting pitch and yaw simultaneously. Alternatively, the OIS system can translate lens elements above the imaging module.
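
As an example, a uniform shift schedule for such a burst might look like the sketch below; the pattern is an assumption made for illustration, as the filing does not mandate any particular sequence:

```python
import numpy as np

def subpixel_offsets(scale=2):
    """Uniform grid of sub-pixel shifts for a burst of scale * scale samples;
    scale=2 yields steps of 0 and 0.5 pixel in each axis. Illustrative only."""
    steps = np.arange(scale) / scale
    return [(float(dy), float(dx)) for dy in steps for dx in steps]

print(subpixel_offsets(2))  # [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
```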

Because the OIS processor is calibrated to control the actuator in known sub-pixel shifts, the resulting samples can be interpolated and remapped to a high resolution grid. The process is supported by a positioning sensor that can indicate tilt angle, further enhancing accuracy.

Each successive shot, treated as a low-resolution sample, is transferred to a super-resolution engine that combines all of the photo data to create a densely sampled image. Certain embodiments project the lower resolution samples onto a high-resolution grid, while others call for interpolation onto a sub-pixel grid.
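
A minimal shift-and-add sketch of that projection step, assuming the measured shifts are exact multiples of 1/scale of a pixel and the frames are otherwise aligned (a real engine would interpolate arbitrary, measured offsets):

```python
import numpy as np

def shift_and_add(samples, offsets, scale=2):
    """Project H x W low-resolution frames onto an (H*scale) x (W*scale)
    grid using their known sub-pixel offsets, then average any overlaps.
    Hypothetical sketch of the combining step, not Apple's algorithm."""
    h, w = samples[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    for frame, (dy, dx) in zip(samples, offsets):
        # Convert the sub-pixel offset into a phase on the high-resolution grid.
        iy = int(round(dy * scale)) % scale
        ix = int(round(dx * scale)) % scale
        acc[iy::scale, ix::scale] += frame
        weight[iy::scale, ix::scale] += 1
    return acc / np.maximum(weight, 1)

# Example: four half-pixel-shifted samples become one 2x-resolution image.
frames = [np.random.rand(8, 8) for _ in range(4)]
offsets = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
print(shift_and_add(frames, offsets).shape)  # (16, 16)
```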

Finally, the super-resolution engine can apply additional techniques like gamma correction, anti-aliasing and other color processing methods to form a final image. As an added bonus, the OIS system can also be tasked with actual stabilization duties while the super-resolution mechanism is operating.
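
By way of illustration, the simplest of those final-pass adjustments might look like this; the 2.2 encoding gamma is an assumption, not a value taken from the patent:

```python
import numpy as np

def gamma_correct(linear_image, gamma=2.2):
    """Map a linear-light image in [0, 1] to gamma-encoded output values;
    the 2.2 exponent is an illustrative assumption."""
    return np.clip(linear_image, 0.0, 1.0) ** (1.0 / gamma)
```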



The remainder of Apple's filing offers greater detail on system calibration, alternative methods of final image construction and key optical thresholds required for accurate operation. Also discussed are filters, sensor structures like microlenses, and specifications related to light and color handling.

It is unknown if Apple will choose to implement its super-resolution system in a near-future iPhone, but recent rumors claim the company will forego physical stabilization on the next-gen handset in favor of a digital solution. As OIS systems require additional hardware, the resulting camera arrays are bulky compared to a regular sensor module with software-based stabilization.

Apple's OIS-based super-resolution patent application was first filed in 2012 and credits Richard L. Baer and Damien J. Thivent as its inventors.

Comments

  • Reply 1 of 88
    hill60 Posts: 6,992 member
    Wow, that's the biggest link I've ever seen.

    So it extends on the 28 megapixel panorama shots.
  • Reply 2 of 88
    philboogie Posts: 7,675 member
    Yes please, implement this, Apple! That, together with an f/1.8.
    hill60 wrote: »
    Wow, that's the biggest link I've ever seen.

    That happens every now and then while Huddle converts the article to the forum. Here's the actual link:
    http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=2&p=1&f=G&l=50&d=PG01&S1=(348%2F208.5.CCLS.+AND+20140508.PD.)&OS=ccl/348/208.5+and+pd/5/8/2014&RS=(CCL/348/208.5+AND+PD/20140508)
    So it extends on the 28 megapixel panorama shots.

    Can you elaborate on that? As I understand it, the patent is on OIS.
  • Reply 3 of 88
    It's funny/fascinating/strange/ironic how the term embodiment as used by the US Patent office is a forerunner of the concept of use cases.
  • Reply 4 of 88

    OIS is always a welcome addition 

  • Reply 5 of 88

    Seeing is believing...

  • Reply 6 of 88
    blastdoor Posts: 3,307 member

    My understanding is that our brains do essentially the same thing. This is extremely cool. I hope they can implement this relatively soon. 

  • Reply 7 of 88
    gilly33 Posts: 434 member
    What is our incessant need to know this information. Especially stuff that apparently puts Apple ahead of the game. Let's just get it out there so Samsung can start working on it too and call it something else. Gee freaking whiz.
  • Reply 8 of 88
    cnocbui Posts: 3,613 member

    The trouble with this approach, as I see it, is that you most need stabilisation in low light, when the camera is going to have to resort to a slow shutter speed.  A slow shutter speed makes it rather difficult to take multiple exposures in a reasonable time frame.

     

    It will be very interesting to see if this makes it into production and how effective it will be.

     

    Olympus make cameras with OIS that is bordering on magic.

  • Reply 9 of 88
    Another gimmick with all of the rubbish filters and apps they slap on will convince people they are actually photographers. Just another way to dumb down consumers.
  • Reply 10 of 88
    netmage Posts: 314 member
    Hardly a gimmick, and the potential has always been there with IS systems with moving parts. Instead of increasing resolution, however, consider moving the sensor so every resulting pixel has three samples (RGB), producing a standard-resolution image without interpolation.
  • Reply 11 of 88
    Quote:

    Originally Posted by gilly33 View Post



    What is our incessant need to know this information. Especially stuff that apparently puts Apple ahead of the game. Let's just get it out there so Samsung can start working on it too and call it something else. Gee freaking whiz.

    ...he says on an Apple rumour site

  • Reply 12 of 88
    lawrance Posts: 86 member
    Well my iPhone 5 camera rattles. So I've always just considered that to be optical stabilization!
  • Reply 13 of 88
    MacPro Posts: 19,728 member
    Another gimmick with all of the rubbish filters and apps they slap on will convince people they are actually photographers. Just another way to dumb down consumers.

    Have you any idea how much more we photographers are willing to pay for IS on a Canon pro lens? Do you think they dumb down the DSLRs?
  • Reply 14 of 88
    iqatedo Posts: 1,824 member
    Quote:
    Originally Posted by cnocbui View Post

     

    ...Olympus make cameras with OIS that is bordering on magic.


     

    Canon has employed optical image stabilisation for close to a decade, if not more.

     

    When a scene is captured on two or more frames and the camera knows exactly what the intervening movement was (shake or otherwise - from data supplied by the actuators), the processor can perform sub-pixel interpolation (almost mentioned in one phrase in the article), which produces a physically higher resolution image. This is because each camera pixel samples a little bit of the neighbouring pixel's image field, essentially sampling the image at a higher resolution than the imaging system itself is capable of. Super resolution is a fair description. Apple's implementation might be novel, and Canon and others use different means, but producing a higher resolution image this way has been understood for years.

     

    So, no gimmick, physically valid outcome.

     

    Rumour has Apple implementing both a zoom moveable lens, allowing a faster imaging system and sub-pixel interpolation through optical image stabilisation all in a tiny package. Once the iPhone 5 has been on the market for two years, mine will be for the new phone. All possible because some boffin working in a lab discovered that it was possible to produce gain in one of those new-fangled semiconductor materials under the control of another. :) 

     

    Edit - moveable lens for focussing, not a zoom lens, duh.

  • Reply 15 of 88
    MacPro Posts: 19,728 member
    cnocbui wrote: »
    The trouble with this approach, as I see it, is that you most need stabilisation in low light, when the camera is going to have to resort to a slow shutter speed.  A slow shutter speed makes it rather difficult to take multiple exposures in a reasonable time frame.

    It will be very interesting to see if this makes it into production and how effective it will be.

    Olympus make cameras with OIS that is bordering on magic.

    My read is that the misalignment issue is being taken care of by the AI that stitches. The IS is making sure the low light images are as stable as possible, of course. Good IS adds two or three f-stops in a pro lens; not sure what Apple will be able to gain in speed, but I suspect they may be in the same ballpark. I have been predicting Apple will take on the high end camera makers for years. Sapphire lenses, AI, IS and so on. If they could start using massive or multiple sensors they'd solve the issue of digital zoom too, since there is no pixel doubling if the data is there to crop from; light stays the same on the larger or multiple sensors if the lens is up to the task. Exciting times indeed! Now we need 4K from Apple in some device (which may not be an iPhone, of course) and given the Mac Pro and Final Cut Pro X are geared for that, I suspect it is coming. I'm not hanging up my Canon DSLRs or Sony 4K video camera any time soon, but one day ... maybe.
  • Reply 16 of 88
    MacPro Posts: 19,728 member
    iqatedo wrote: »
    Canon has employed optical image stabilisation for over or close to, a decade.

    When a scene is captured on two or more frames and the camera knows exactly what the intervening movement was (shake or otherwise - from data supplied by the actuators), the processor can perform sub-pixel interpolation (almost mentioned in one phrase in the article), which produces a physically higher resolution image. This is because each camera pixel samples a little bit of the neighbouring pixel's image field, essentially sampling the image at a higher resolution than the imaging system itself is capable of. Super resolution is a fair description. Apple's implementation might be novel, and Canon and others use different means, but producing a higher resolution image this way has been understood for years.

    So, no gimmick, physically valid outcome.

    Rumour has Apple implementing both a zoom moveable lens, allowing a faster imaging system and sub-pixel interpolation through optical image stabilisation all in a tiny package. Once the iPhone 5 has been on the market for two years, mine will be for the new phone. All possible because some boffin working in a lab discovered that it was possible to produce gain in one of those new-fangled semiconductor materials under the control of another. :)  

    Edit - moveable lens for focussing, not a zoom lens, duh.

    I wouldn't be shocked if Apple come out with a lens that focuses not by traditional movement but by shape shifting like the human eye lens. Then again, there is the light field technology I hope they pursue, with multiple focal planes embedded in one RAW image. That would be fun!
  • Reply 17 of 88
    flaneur Posts: 4,526 member
    Another gimmick with all of the rubbish filters and apps they slap on will convince people they are actually photographers. Just another way to dumb down consumers.

    I take it you're still using a pinhole camera and wet plates? Those guys were real photographers, but they were accused of dumbing down painting.
  • Reply 18 of 88
    MacPro Posts: 19,728 member
    flaneur wrote: »
    I take it you're still using a pinhole camera and wet plates? Those guys were real photographers, but they were accused of dumbing down painting.

    LOL

    Then there was Johannes Vermeer who seems to have had the best of both worlds. :D
  • Reply 19 of 88
    cnocbui Posts: 3,613 member
    Quote:

    Originally Posted by digitalclips View Post





    My read is that the misalignment issue is being taken care of by the AI that stitches. The IS is making sure the low light images are as stable as possible, of course. Good IS adds two or three f-stops in a pro lens; not sure what Apple will be able to gain in speed, but I suspect they may be in the same ballpark. I have been predicting Apple will take on the high end camera makers for years. Sapphire lenses, AI, IS and so on. If they could start using massive or multiple sensors they'd solve the issue of digital zoom too, since there is no pixel doubling if the data is there to crop from; light stays the same on the larger or multiple sensors if the lens is up to the task. Exciting times indeed! Now we need 4K from Apple in some device (which may not be an iPhone, of course) and given the Mac Pro and Final Cut Pro X are geared for that, I suspect it is coming. I'm not hanging up my Canon DSLRs or Sony 4K video camera any time soon, but one day ... maybe.



    Massive sensors, image stabilisation by shifting the sensor, using huge pixel counts to allow for uncompromised digital zoom - you realise these are all things pioneered by Nokia very effectively and available now in their high end phones?  But let's not give any credit to any company except Apple.  As is well known, they are the only company capable of innovation.

  • Reply 20 of 88
    bugsnw Posts: 717 member

    Always someone to poo-poo the cameras that consumers actually love and use. It's not a dumbing down. The vast majority of people can't tell the difference between a pro shot and a good pic with an iPhone 5S.

     

    I do understand the technical debate and that's something many of us enjoy. But I always stop short of saying iTunes sucks and will never be successful because it isn't some wildly huge, complicated file type. Or, like I've heard many photographers say, digital cameras will never overtake film because of x,y,z.

     

    It's the content, stupid. Interesting photos are interesting because of the content.
