Apple invents 3-sensor iPhone camera with light-splitting cube for accurate colors, low-light performance

Posted in iPhone · edited March 2015
An Apple patent published on Tuesday details a miniaturized iPhone camera system that employs a light-splitting cube to parse incoming rays into three color components, each of which is captured by a separate sensor.


Source: USPTO


As granted by the U.S. Patent and Trademark Office, Apple's U.S. Patent No. 8,988,564 for a "Digital camera with light splitter" examines the possibility of embedding a three-sensor, prism-based camera module within the chassis of a thin wireless device, such as an iPhone. Light-splitting systems do not require color-channel processing or demosaicing, thereby maximizing pixel array resolution.

Commonly found in prosumer video cameras, and more recently in handheld camcorder models, three-sensor imaging technology splits incident light entering a camera into three wavelengths, or colors, using a prism or prisms. Usually identified as red, green and blue, the split images are picked up by dedicated imaging sensors normally arranged on or close to the prism face.




Older three-CCD cameras relied on the tech to more accurately capture light and negate the "wobble" effect seen with a single energy-efficient CMOS chip. Modern equipment employs global shutter CMOS modules that offer better low-light performance and comparable color accuracy, opening the door to entirely new shooting possibilities.

Apple's design uses light splitting techniques similar to those applied in current optics packages marketed by Canon, Panasonic, Philips and other big-name players in the camera space. For its splitter assembly, Apple uses a cube arrangement constructed using four identical polyhedrons that meet at dichroic interfaces.




Coating each interface with a dichroic optical coating allows particular wavelengths of incident light to be reflected, or transmitted through to the adjoining polyhedron. By tuning these dichroic filters, Apple can parse out red, green and blue wavelengths and send each to one of three sensors positioned around the cube. Aside from RGB, the patent also allows for other color sets, such as cyan, yellow, green and magenta (CYGM) and red, green, blue and emerald (RGBE), among others.
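The idea can be illustrated with a toy model of an idealized RGB splitter cube (the passband boundaries below are assumptions for illustration only, not figures from the patent):

```python
# Illustrative sketch (not from the patent): routing incident wavelengths
# through idealized dichroic interfaces to three separate sensors.
# The passband boundaries are assumed values for illustration.

def route_wavelength(nm):
    """Return which sensor an idealized RGB splitter cube sends light to."""
    if nm < 380 or nm > 700:
        return None              # outside the visible band
    if nm < 490:
        return "blue_sensor"     # short wavelengths reflect at the first interface
    if nm < 580:
        return "green_sensor"    # mid wavelengths reflect at the second interface
    return "red_sensor"          # long wavelengths transmit straight through

# Every visible ray lands on exactly one sensor -- no mosaic filter discards it.
for nm in (450, 530, 650):
    print(nm, "->", route_wavelength(nm))
```

The key contrast with a mosaic filter is visible in the model: each wavelength is redirected rather than absorbed, so nothing in the visible band is thrown away.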

Light splitters also enable other desirable effects like sum and difference polarization, which achieves the same results as polarization imaging without filtering out incident light. The process can be taken a step further to enhance image data for feature extraction, useful in computer vision applications.




Infrared imaging can also benefit from a multi-sensor setup, as the cube can be tuned to suppress visible wavelength components, or vice versa.

Tuesday's patent is an extension of a previously published Apple invention that uses mirrors and optics to achieve optical image stabilization without eating up valuable space. In such embodiments a foldable, or mobile, mirror redirects incoming light, bouncing it toward a rectangular tube that terminates with a cuboid light splitter. In between the mirror and sensor package are a series of lenses that can be operably moved for focus and zooming. Motors are applied to vary the mirror's angle to offset hand shake recorded by onboard sensors.




It is unclear if Apple intends to implement a light splitter into a future product, though the company is working hard to keep iPhone at the fore of ultra-mobile photography. For example, the latest iPhone 6 Plus includes an OIS system that produces sharp images even in low-light situations.

Apple's cube light splitter camera system patent was first filed in 2011 and credits Steven Webster and Ning Y. Chan as its inventors.

Comments

  • Reply 1 of 42
    wizard69wizard69 Posts: 13,377member
Just make the damn thing, Apple! Seriously, I'd be all over a new iPhone that is a seriously good photography tool. To that end, I would hope that they don't miniaturize it to the point that any optical benefit goes out the window. Hopefully they can hit 12 megapixels while upping light-gathering efficiency significantly. That is, 12 megapixels that are measurably higher quality than what we get now.
  • Reply 2 of 42
    singularitysingularity Posts: 1,328member
Very, very interesting. Now implement it in a product and take my money!
  • Reply 3 of 42
The problem with this system is that it would mean less light hits the sensor array, due to the mirror and additional glass in the way. That means noisier images and slower shutter speeds, let alone the fact that there are more mechanical parts to go wrong. I wouldn't hold your breath for this to be a realistic option for a while; sensor tech would need to improve substantially to offset the loss of light.
  • Reply 4 of 42
    smalmsmalm Posts: 677member

You forgot that they can put in three times the sensor area, and they no longer need the colour filters on the sensor pixels or to interpolate the colours.

  • Reply 5 of 42
    jumejume Posts: 209member
    The Dark Side of the Moon tech
  • Reply 6 of 42
    blastdoorblastdoor Posts: 3,558member

    I think camera improvements (both in the camera itself and the SOC) could be the primary thing pushing the iPhone upgrade cycle for the next few years. For me, CPU/GPU has been good enough since the iPhone 5. I bought the 6+ primarily for the camera and the screen. Now I'd say the screen needs no more attention, but the camera could still be better. 

  • Reply 7 of 42
    Basically a digital version of Technicolor.
  • Reply 8 of 42
    So, built in Instagram filters... Nice!
  • Reply 9 of 42
I am guessing this camera patent is about strengthening PrimeSense computer vision technology and preparing the way toward holographic technology. Apple has done a lot of research on computer vision that could dazzle us, like the Iron Man computer vision/holographic technology. Looking forward to seeing this stuff in the real world!
  • Reply 10 of 42
    foggyhillfoggyhill Posts: 4,767member
    Quote:
    Originally Posted by Marc Rogoff View Post



The problem with this system is that it would mean less light hits the sensor array, due to the mirror and additional glass in the way. That means noisier images and slower shutter speeds, let alone the fact that there are more mechanical parts to go wrong. I wouldn't hold your breath for this to be a realistic option for a while; sensor tech would need to improve substantially to offset the loss of light.

     

You gain sensor size but lose light in the optics and mirrors, and it's more fragile; not sure it can be a good trade-off. Maybe if it is coupled with some other tech they patented. Probably won't ever find its way into a phone.

If they need to do real-time processing of the 3 colors in good light, it could be interesting. Maybe it would be for a supplementary camera on the phone?

  • Reply 11 of 42
    rob53rob53 Posts: 3,299member

    If you read the actual patent filing, this system isn't exactly small. "x-length of the deflector, zoom lens system, light splitter and image sensors is in the range 18 mm-32 mm." (See first diagram, it's the total length of the optical system.) At a max of over 1" long, it would take up most of the top portion of an iPhone. The patent also talks a lot about the zoom capability, which conceivably could be the greatest enhancement with this system. We're talking optical zoom, not digital zoom. Give me a true optical zoom of 5-10X (as well as macro) and there wouldn't be any need for an external lens system.

  • Reply 12 of 42
    MarvinMarvin Posts: 15,474moderator
    Older three-CCD cameras relied on the tech to more accurately capture light and negate the "wobble" effect seen with a single energy-efficient CMOS chip. Modern equipment employs global shutter CMOS modules that offer better low-light performance and comparable color accuracy, opening the door to entirely new shooting possibilities.

    It's the global shutter that gets rid of the wobble or rolling shutter, not the prisms. CCDs usually have it but CMOS didn't. CCD wasn't suitable for phones or even cheaper cameras due to cost and power draw. Camera manufacturers likely found they could get 3 cheaper CCDs than a single large one and so split the colors onto 3 separate smaller CCDs to get similar performance to a large one. Apple should use global shutter CMOS sensors if they aren't already. FCPX corrects rolling shutter artifacts but it's better if they don't happen in the first place.
The problem with this system is that it would mean less light hits the sensor array, due to the mirror and additional glass in the way. That means noisier images and slower shutter speeds, let alone the fact that there are more mechanical parts to go wrong. I wouldn't hold your breath for this to be a realistic option for a while; sensor tech would need to improve substantially to offset the loss of light.

    You don't need a Bayer filter though. Right now, light shining on a sensor would get passed through a pattern of colored squares so each RGB component only gets about 1/3 of the sensor area (although it's not always split 3 ways) as well as losing light passing through the filter and the pixels are spread apart for each color. If you take away the filter and put in 3 sensors, you get 3x the light plus some saving from losing light through the filter. There's a test of this here:

    http://www.dpreview.com/forums/thread/2878305

    Quite a large loss in light. The mirror plus lenses will lose some light but these are intended to be highly transmissive materials and not absorbing anything so more efficient than a Bayer filter. I would expect at least a doubling in low light sensitivity over the current setup.

    Another benefit to the lens position shown is they can share the same sensor setup with the front-facing camera so both get high-resolution and with the mirror, they can put OIS in the iPhone 6, get rid of the camera bump and get optical zoom. I think this will be a setup that arrives in the iPhone in future, maybe not the 6S but the 7.
    I am guessing this camera patent for strengthening Prime Sense computer vision technology and preparing the way towards holographic technology.

It allows them to capture infrared light, which is used by the Kinect etc. for depth tracking, and it works for night vision. An iPhone or iPad could be used as a baby monitor at night, for badger watching and so on. Depth tracking opens up huge possibilities for apps. Developers can make apps that help you measure your body for clothes, e.g. bust size for a comfortable bra, leg length, waist size for the right jeans, head size for hats, and you can make 3D models of objects/people very easily.

    http://appleinsider.com/articles/14/07/11/apples-secret-plans-for-primesense-3d-tech-hinted-at-by-new-itseez3d-ipad-app


    [VIDEO]


    You can take 3D pictures of your food and put it on Instagram and people can look at it in 3D.
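The Bayer-versus-splitter arithmetic in the post above can be put into a rough back-of-envelope sketch (all the transmission and layout numbers here are illustrative assumptions, not measurements):

```python
# Back-of-envelope comparison (illustrative assumptions, not measured figures):
# a Bayer mosaic gives each color channel only a fraction of the pixel sites
# and loses some light in the dye filter itself, while a three-sensor splitter
# delivers most of the light in each band to a dedicated full-size sensor.

BAYER_SITE_FRACTION = {"red": 0.25, "green": 0.5, "blue": 0.25}  # RGGB pattern
FILTER_TRANSMISSION = 0.6   # assumed dye-filter transmission
SPLITTER_EFFICIENCY = 0.9   # assumed residual losses in mirror/prism/lenses

for color, frac in BAYER_SITE_FRACTION.items():
    bayer_light = frac * FILTER_TRANSMISSION   # fraction of band reaching Bayer pixels
    split_light = SPLITTER_EFFICIENCY          # whole sensor dedicated to this band
    print(f"{color}: ~{split_light / bayer_light:.1f}x more light per channel")
```

Under these assumed numbers the splitter collects several times more light per color channel, which is in the same ballpark as the "at least a doubling in low-light sensitivity" estimate above.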
  • Reply 13 of 42
    muppetrymuppetry Posts: 3,331member
    Quote:

    Originally Posted by Marc Rogoff View Post



The problem with this system is that it would mean less light hits the sensor array, due to the mirror and additional glass in the way. That means noisier images and slower shutter speeds, let alone the fact that there are more mechanical parts to go wrong. I wouldn't hold your breath for this to be a realistic option for a while; sensor tech would need to improve substantially to offset the loss of light.



No, it doesn't mean that. Although the light path is split, the splitter is very efficient, so approximately one third of the light hits each of the three individual color sensors, each of which is as large as a conventional 3-color sensor. In fact less light is discarded, because the dichroic splitter prevents the loss of information that occurs in a conventional system when red light hits a green pixel, etc. Additionally, the pixel density is effectively 3 (or 4) times higher with no decrease in pixel size.

  • Reply 14 of 42



You are correct that the prism will not impact things, but if you read what I am saying: the mirror (required for the redirection) and the lens elements for a zoom will certainly impact the light transmission that finally reaches the sensor.

  • Reply 15 of 42
    muppetrymuppetry Posts: 3,331member
    Quote:

    Originally Posted by Marc Rogoff View Post

     



You are correct that the prism will not impact things, but if you read what I am saying: the mirror (required for the redirection) and the lens elements for a zoom will certainly impact the light transmission that finally reaches the sensor.




    Disagree. Mirrors and dichroic splitters are nearly 100% efficient. Lenses are a different matter, but no different here to any other arrangement for similar optical properties.

  • Reply 16 of 42

The difference is that there is no zoom lens in the iPhone at present, so the cost will come as a loss of light compared with the fixed focal length of the current setup. Bottom line: with this system there will be a loss of light-gathering power, and that is my point.

  • Reply 17 of 42
    pujones1pujones1 Posts: 222member
    Marvin wrote: »
    It's the global shutter that gets rid of the wobble or rolling shutter, not the prisms. CCDs usually have it but CMOS didn't. CCD wasn't suitable for phones or even cheaper cameras due to cost and power draw. Camera manufacturers likely found they could get 3 cheaper CCDs than a single large one and so split the colors onto 3 separate smaller CCDs to get similar performance to a large one. Apple should use global shutter CMOS sensors if they aren't already. FCPX corrects rolling shutter artifacts but it's better if they don't happen in the first place.
    You don't need a Bayer filter though. Right now, light shining on a sensor would get passed through a pattern of colored squares so each RGB component only gets about 1/3 of the sensor area (although it's not always split 3 ways) as well as losing light passing through the filter and the pixels are spread apart for each color. If you take away the filter and put in 3 sensors, you get 3x the light plus some saving from losing light through the filter. There's a test of this here:

    http://www.dpreview.com/forums/thread/2878305

    Quite a large loss in light. The mirror plus lenses will lose some light but these are intended to be highly transmissive materials and not absorbing anything so more efficient than a Bayer filter. I would expect at least a doubling in low light sensitivity over the current setup.

    Another benefit to the lens position shown is they can share the same sensor setup with the front-facing camera so both get high-resolution and with the mirror, they can put OIS in the iPhone 6, get rid of the camera bump and get optical zoom. I think this will be a setup that arrives in the iPhone in future, maybe not the 6S but the 7.
    It allows them to capture infrared light, which is used by the Kinect etc for depth tracking and it works for night vision. An iPhone or iPad could be used as a baby monitor at night, badger watching and so on. Depth tracking opens up huge possibilities for apps. Developers can make apps that help you measure your body for clothes e.g bust size for a comfortable bra, leg length, waist size for the right jeans, head size for hats, you can make 3D models of objects/people very easily.

    http://appleinsider.com/articles/14/07/11/apples-secret-plans-for-primesense-3d-tech-hinted-at-by-new-itseez3d-ipad-app


    [VIDEO]


    You can take 3D pictures of your food and put it on Instagram and people can look at it in 3D.

    That is awesome!! I don't know when I missed reading that article but I'm glad you brought it up. Thanks.

    I'm ready for whatever camera improvement they are going to give us.

    Thanks to all the folks who are replying with knowledge about this subject.
  • Reply 18 of 42
    nolamacguynolamacguy Posts: 4,758member
    Quote:

    Originally Posted by Marc Rogoff View Post



The problem with this system is that it would mean less light hits the sensor array, due to the mirror and additional glass in the way. That means noisier images and slower shutter speeds, let alone the fact that there are more mechanical parts to go wrong. I wouldn't hold your breath for this to be a realistic option for a while; sensor tech would need to improve substantially to offset the loss of light.

     

You know you're in trouble when the only Like on your post is that of our resident naysayer, skeptic, and proponent of anything negative or critical of Apple. How insane.

  • Reply 19 of 42



I don't frequent the forum enough to recognise him, I'm afraid, but it is pretty amazing how many trolls there are - presumably paid by competitors...

  • Reply 20 of 42
    paxmanpaxman Posts: 4,729member
    Basically a digital version of Technicolor.
Exactly my thought. I have also owned two Panasonic camcorders with 3CCDs. They both had fantastic image quality.