Apple exploring light field image editing software, patent application shows

Posted in General Discussion, edited April 2015
A patent application filed with the U.S. Patent and Trademark Office on Thursday reveals Apple is continuing to explore light field camera applications, in this case imagining how users would manipulate image data to change focus or depth of field in a photo editing app.


Source: USPTO


Published under the wordy title "Method and UI for Z depth image segmentation," Apple's application covers the software side of light field imaging, describing an application that edits data produced by highly specialized imaging sensors. At its most basic level, the filing imagines steps a user could take to focus, and later refocus, an image captured by a light field camera.

Apple starts off by outlining hardware capable of producing the imaging data needed for advanced refocusing computations. A generic "light field camera" is described as one that captures not only light, but also information about the direction from which light rays enter the system. This rich data set can be used to produce a variety of images, each with a slightly different perspective.
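The filing doesn't publish an algorithm, but the classic way such directional data is used is "shift-and-add" synthetic refocusing: each sub-aperture view is translated in proportion to its offset from the lens center, then averaged. A minimal numpy sketch of that technique (the array layout and the `alpha` refocus parameter are illustrative assumptions, not Apple's data format):

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-add synthetic refocus of a 4D light field.

    light_field: array of shape (U, V, H, W) -- one sub-aperture
    image per (u, v) lens position.
    alpha: refocus parameter; 1.0 keeps the captured focal plane,
    other values pull focus nearer or push it farther.
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its offset from the
            # lens center, then accumulate.
            du = int(round((u - U // 2) * (1 - 1 / alpha)))
            dv = int(round((v - V // 2) * (1 - 1 / alpha)))
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)
```

At `alpha = 1.0` the shifts vanish and the result is just the average of all sub-aperture views, i.e. the focal plane as captured.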




Such specialized imaging sensors do exist in the real world, perhaps most visibly in equipment made by consumer camera maker Lytro. Alternatively, the document could be referring to a plenoptic camera that uses a microlens array to direct light onto a traditional CMOS chip, achieving similar results with less fuss. Apple coincidentally owns IP covering such implementations.

One of the main features of a light field camera is its ability to recalculate depth of field, the range of distances over which a scene appears acceptably sharp, which is governed by aperture size and subject distance.

A large aperture results in a shallow depth of field, meaning only objects that fall within a tight depth range are in focus, while small apertures offer a wider depth of field. In a highly simplified example, objects between two and four feet away from a camera system set to an aperture of f/2 at a specific focusing distance may come out pin-sharp in a photo. Conversely, the same camera system set with an aperture of f/16 or smaller would reproduce objects in sharp focus from two feet to infinity.
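The worked numbers above can be checked with the standard thin-lens depth-of-field formulas (hyperfocal distance, then near and far limits of acceptable sharpness). The 0.03 mm circle of confusion below is a common full-frame assumption, not a figure from the filing:

```python
def dof_limits(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near/far limits of acceptable focus, thin-lens approximation.

    coc_mm is the circle-of-confusion diameter; 0.03 mm is a common
    full-frame assumption.  Returns (near_mm, far_mm); far is infinite
    once the subject sits at or beyond the hyperfocal distance.
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * subject_mm / (hyperfocal + (subject_mm - focal_mm))
    if subject_mm >= hyperfocal:
        return near, float("inf")
    far = hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
    return near, far
```

Running it for a 50 mm lens focused at about three feet shows the effect in the article's example: f/16 yields a far wider in-focus range than f/2, and past the hyperfocal distance everything to infinity is sharp.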




In a light field system, data related to light ray direction is processed to determine the depth of an object or objects in frame. Apple's invention uses this depth information to parse an image into layers, which can then be altered or even removed by an editing application.

In one embodiment, Apple's system ingests light field image data, processes it and generates a histogram displaying depth versus number of imaging pixels. Analyzing the histogram, the system can determine peaks and valleys from which image layers are formed.
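As a rough sketch of that peaks-and-valleys segmentation (an illustrative reading of the filing, not Apple's implementation), one can histogram the per-pixel depths, cut at local minima, and emit one mask per depth band:

```python
import numpy as np

def depth_layers(depth_map, bins=64):
    """Split a per-pixel depth map into layers at histogram valleys.

    Histogram the depths, cut at local minima, and return one boolean
    mask per resulting depth band.
    """
    counts, edges = np.histogram(depth_map, bins=bins)
    # A valley bin is strictly below its left neighbor and no higher
    # than its right one (the <= keeps a flat zero run to one cut).
    cuts = [edges[i] for i in range(1, bins - 1)
            if counts[i] < counts[i - 1] and counts[i] <= counts[i + 1]]
    bounds = [depth_map.min()] + cuts + [depth_map.max() + 1e-9]
    return [(depth_map >= lo) & (depth_map < hi)
            for lo, hi in zip(bounds, bounds[1:])]
```

Two well-separated clusters of depths, for example a subject near the camera and a distant background, come back as two masks that together cover every pixel.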




Each layer corresponds to a certain depth in the light field image. Applying a UI tool, such as a slider bar, users can instruct the system to show foreground layers while obscuring background layers. Manipulating the depth slider causes certain layers to pop into and out of focus.
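A toy version of that slider behavior might blur each layer in proportion to its distance from the selected depth. Everything here, the crude box blur and the `blur_per_m` rate, is an assumed stand-in rather than Apple's rendering pipeline:

```python
import numpy as np

def render_with_focus(image, layers, depths, focus_depth, blur_per_m=2.0):
    """Composite layers, blurring each by its distance from focus_depth.

    layers: boolean masks (one per layer); depths: one depth per layer.
    Blur radius grows with |depth - focus_depth|; a layer at the chosen
    depth is left pin-sharp.
    """
    def box_blur(img, r):
        # Dependency-free box blur: average a (2r+1)^2 neighborhood.
        if r <= 0:
            return img
        k = 2 * r + 1
        padded = np.pad(img, r, mode="edge")
        out = np.zeros_like(img, dtype=float)
        for dy in range(k):
            for dx in range(k):
                out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return out / (k * k)

    result = np.zeros_like(image, dtype=float)
    for mask, depth in zip(layers, depths):
        radius = int(round(abs(depth - focus_depth) * blur_per_m))
        result[mask] = box_blur(image.astype(float), radius)[mask]
    return result
```

Dragging the slider amounts to re-rendering with a new `focus_depth`, which is what makes layers appear to pop into and out of focus.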

Alternative methods include selecting an object for focus within an image by clicking or tapping on it directly. With light field data, layers assigned to these objects can be reproduced in various states of focus, or calculated focusing distance, in relation to other layers (foreground and background). In some scenarios, layers can be deleted altogether.




Apple's image editing program also incorporates the usual assortment of photo adjustment settings like contrast, brightness and color correction. With everything bundled into one app, users can manipulate those properties alongside depth of field, focus, layer selection and other advanced features specific to light field imaging.

Put into the context of recent events, today's filing is especially interesting. On Tuesday, Apple confirmed -- albeit through its usual non-answer comment -- the purchase of Israel-based LinX Computational Imaging Ltd., a firm specializing in "multi-aperture imaging technology." Estimates put the deal at around $20 million.


LinX image sensor.


What Apple plans to do with LinX technology is unclear, but it wouldn't be a stretch to connect the IP with a light field camera system. In the near term, however, Apple is more likely to apply its newly acquired multi-lens camera solutions to capturing traditional photos at high resolutions using small form-factor components.

Apple's light field imaging software patent application was first filed in October 2013 and credits Andrew E. Bryant and Daniel Pettigrew as its inventors. According to what is believed to be Pettigrew's LinkedIn profile, he has since left his post as Image Processing Specialist at Apple for the same position at Google.

Comments

  • Reply 1 of 15
    xixoxixo Posts: 450member
    Very cool report. The Lytro technology has the potential to change electronic imaging forever.

tl;dr: you can focus a picture [B]after[/B] you take it.

    That would work well on a phone, tablet or watch (and other unrealized future products).
  • Reply 2 of 15
    MacProMacPro Posts: 19,718member
    Yes! This is what I was hoping for. This is the future of Photography IMHO. Post editing RAW images will be a whole new ball of wax. This is very big people!

    p.s. I hope Apple is granted this patent and has a slew more in the works. Google and Scamsung will be hard at work and willing to spend years in court to sell a rip off no doubt, all the while we will have years of Gatorguy posting prior art and spurious articles showing Google invented this already.
  • Reply 3 of 15
    waterrocketswaterrockets Posts: 1,231member
    Quote:
    Originally Posted by digitalclips View Post



    Yes! This is what I was hoping for. This is the future of Photography IMHO. Post editing RAW images will be a whole new ball of wax. This is very big people!



    p.s. I hope Apple is granted this patent and has a slew more in the works. Google and Scamsung will be hard at work and willing to spend years in court to sell a rip off no doubt, all the while we will have years of Gatorguy posting prior art and spurious articles showing Google invented this already.

     

    This is really huge.

     

    FWIW, Google could just buy Lytro, although I don't see how that would align with ad sales.

  • Reply 4 of 15
    gatorguygatorguy Posts: 24,176member
    "Someone" is trolling me pretty hard today. :rolleyes: At least the third attempt in the last several hours but I don't plan to play along so no need to continue diverting thread discussions with[B] egregious trolling.[/B]
    http://arstechnica.com/staff/2011/09/announcing-increased-moderation-of-trolls-in-discussion-threads/
  • Reply 5 of 15
    muppetrymuppetry Posts: 3,331member
    Interesting patent, but the connection with the LinX technology is not obvious, since LinX does not use light field techniques.
  • Reply 6 of 15
    hodarhodar Posts: 357member
But the Lytro technology seems to apply directly to the 2/3D nature of the LinX hardware. The applications of this are pretty neat, but the IP implications are equally thrilling. Now that a 3D object can be captured in a single photo, someone could take a picture of a room and then potentially extract a 3D replica of almost anything in that room.

    Now, it would be non-functional, obviously - but consider the security ramifications in industry, manufacturing, R&D across almost every single facet of technology.
  • Reply 7 of 15
    Yes! This is what I was hoping for. This is the future of Photography IMHO. Post editing RAW images will be a whole new ball of wax. This is very big people!

    p.s. I hope Apple is granted this patent and has a slew more in the works. Google and Scamsung will be hard at work and willing to spend years in court to sell a rip off no doubt, all the while we will have years of Gatorguy posting prior art and spurious articles showing Google invented this already.

    Yeah, there's a good article about this and FCPX at patentlyapple.com:

    http://www.patentlyapple.com/patently-apple/2015/04/apple-to-bring-z-depth-mapping-to-final-cut-pro-x.html


    To illustrate the possibilities, they show an Adobe video on z-depth processing:


    [VIDEO]


What's most interesting to me is the compositing of layers towards the end ...

    It appears that they are reconstructing a layer (the parachutist?) even though it contains missing parts occluded by a shape nearer to the camera.

    Is this possible with a single lens/camera or are multiple z-depth images composited together?


    Either way, @digi, it won't be too long before we see some of your shots of wildlife composited with completely different surroundings ...

    Say, an alligator or manatee swimming alongside Michael Phelps in the swimming pool at the 2016 Rio de Janeiro Summer Olympics ...


    Tho, I don't expect we'll be satisfied with the scoring of the East German Judge   :D
  • Reply 8 of 15
    muppetry wrote: »
    Interesting patent, but the connection with the LinX technology is not obvious, since LinX does not use light field techniques.

    Mas o menos ...

    The LinX tech may be different, but it produces similar results:
    The LinX array cameras revolutionizes mobile imaging as we know it today
    • Better color accuracy and uniformity
    • HDR - Higher dynamic range
    • UHDR - ultra high dynamic range on dedicated modules
    • Low noise levels
    • Higher resolution
    • Low module costs
    • No Autofocus for modules of up to 20 mpix
    • Zero shutter lag
    • Tiny package with up to 50% reduction in Z module height allowing slim devices and edge-to-edge display


    Depth maps can be used for various applications:
    • 3D scanning of objects
    • Sizing of objects
• Refocusing: knowing the depth at every pixel allows us to apply a synthetic blur to emulate a shallow depth of field
    • Background removal and replacement
    • Gesture recognition

    http://www.scribd.com/doc/261875793/LinX-Imaging-Presentation

    Some of the processing is done concurrently with the image processing -- making possible near real-time feedback.

    Great for instant-replay at events ...

    Outdoor and Indoor mapping ...

    A realtor walking through a property taking video -- and capturing room dimensions ...
     
  • Reply 9 of 15
    dasanman69dasanman69 Posts: 13,002member
    Mas o meños ...

    There's no ñ in the word menos.
  • Reply 10 of 15
    dasanman69 wrote: »
    Mas o meños ...

    There's no ñ in the word menos.

    Claro senor !
  • Reply 11 of 15
    gatorguygatorguy Posts: 24,176member


What's most interesting to me is the compositing of layers towards the end ...

    It appears that they are reconstructing a layer (the parachutist?) even though it contains missing parts occluded by a shape nearer to the camera.

    Is this possible with a single lens/camera or are multiple z-depth images composited together?

There are patents that claim you can.
    https://www.google.com/patents/WO2014165472A1?cl=en&dq="light+field"+inassignee:Google&hl=en&sa=X&ei=5fUvVcW5CbX9sASjuYHACQ&ved=0CF0Q6AEwCQ
    https://www.google.com/patents/US20140125810?dq=refocusing+light+field+"light+field"+inassignee:Google&hl=en&sa=X&ei=rcAvVauDMKixsATH4YHQAQ&ved=0CCAQ6AEwAA
  • Reply 12 of 15
    dasanman69dasanman69 Posts: 13,002member
    Claro senor !

    You forgot the ¡ in front of Claro. :lol:
  • Reply 13 of 15
    dasanman69 wrote: »
    Claro senor !

    You forgot the ¡ in front of Claro. :lol:

    Así Es La Vida ... muy peñoso ...

When I was a kid my heroes were the Lone Ranger and Tonto ... quite unflattering to the Native American!
     
  • Reply 14 of 15
    dasanman69dasanman69 Posts: 13,002member
    Así Es La Vida ... muy peñoso ...

When I was a kid my heroes were the Lone Ranger and Tonto ... quite unflattering to the Native American!
     

    Just because one Native American was Tonto doesn't mean they all were, but that's still better than Dorothy's dog Toto. That's slang for a woman's sexual organ in many Latin countries. :lol:
  • Reply 15 of 15

    "Such specialized imaging sensors do exist in the real world, perhaps most visibly in equipment made by consumer camera maker Lytro. Alternatively, the document could be referring to a plenoptic camera that uses a microlens array to direct light onto a traditional CMOS chip, achieving similar results with less fuss. Apple coincidentally owns IP covering such implementations. "

     

    I think they have it backward.  Lytro uses a microlens array in front of a traditional sensor.  Apple owns IP for a type of implementation but not the way that Lytro implements it.
