Apple patents 'super resolution' multi-sensor cameras, display-integrated light sensor tech
Apple continues to win patents detailing advanced technology rumored to show up in next year's iPhone model, the most recent being a trio of inventions dealing with "super resolution" multi-sensor cameras and light sensors embedded into functional phone displays.

As awarded by the U.S. Patent and Trademark Office on Tuesday, Apple's U.S. Patent Nos. 9,467,666 and 9,466,653 detail a multi-sensor camera assembly designed to maximize image quality in a minimal amount of space. The system splits incoming light into at least three wavelength bands (red, green and blue) using a series of prisms, directs and captures the rays with independent light sensors, and combines the resulting data into a "super resolution" image via specialized software.
In one embodiment described in the '666 patent, covering "Miniature camera super resolution for plural image sensor arrangements," Apple notes that color splitters allow for enhanced image resolution compared to single-sensor solutions, since a smaller portion of the incoming light is absorbed by filters.
Conventional single-sensor cameras commonly apply a Bayer pattern filter to CCD or CMOS sensor data, deriving full color through a demosaicing, or software interpolation, process. By contrast, three-sensor cameras pass nearly all incoming light through the beam splitter to the final output image, in some cases capturing three times the light of a single-sensor array.
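To make the contrast concrete, here is a minimal Python sketch (using NumPy; the function name and layout are illustrative, not drawn from the patents) of what a Bayer filter does: each photosite keeps only one color channel, discarding the rest of the light, and the missing values must later be interpolated. A three-sensor design skips this step because every channel is fully sampled.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate an RGGB Bayer filter over a full-color image.

    Each photosite records a single channel; the other two are
    absorbed by the filter and must be reconstructed by demosaicing.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red photosites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green photosites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green photosites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue photosites
    return mosaic

# In a three-sensor camera, the beam splitter sends each band to its
# own sensor, so all three channels are sampled at every pixel and no
# demosaicing (or its interpolation artifacts) is required.
```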
Apple's IP refers to the beam splitter as either a Philips prism or a smaller stacked cube variation, the latter described in detail in the '653 patent for a "Digital camera with light splitter," itself an extension of a patent granted in 2015. In a cube configuration, sensors collecting red, green and blue light information are embedded in ceramic substrates positioned on three sides of the prism. The light splitter is designed to direct specific wavelengths to each sensor with minimal waste.
In addition to enhanced light gathering capabilities, three-sensor arrays offer better performance in situations calling for polarization imaging. Typically, polarizing filters shed 50 percent of incident light to increase visibility of specific targets, but the same results can be achieved with the splitting cube as it enables polarization sum and difference imaging. Infrared imaging is another area that would benefit from multiple wavelength-dedicated sensors.
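As a rough illustration of sum and difference imaging, assume two of the splitter's outputs carry orthogonal polarization states; the following Python sketch (hypothetical, not taken from the patent) shows how the two captures combine without the 50 percent loss imposed by a single polarizing filter.

```python
import numpy as np

def polarization_sum_difference(i_0, i_90):
    """Combine two orthogonally polarized captures.

    i_0, i_90: float arrays of intensity at 0- and 90-degree polarizations.
    The sum recovers total incident intensity (no filter loss), while
    the difference highlights polarized features such as glare.
    """
    total = i_0 + i_90   # full incident light, unlike a lone filter
    diff = i_0 - i_90    # polarization signal
    dolp = np.divide(diff, total, out=np.zeros_like(total),
                     where=total > 0)  # normalized contrast in [-1, 1]
    return total, diff, dolp
```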
As for deployment in smartphones, Apple illustrates a "folded" camera design in both of today's patents. To minimize system depth, a mirror mechanism bounces incoming light through lens elements positioned orthogonal to the phone's Z-axis. For example, the objective lens can be installed as it appears on current iPhones, while the imaging sensor sits elsewhere in the chassis at a right angle to that lens.
This design holds advantages far beyond space savings. The mirror, for example, can be mounted on a miniature motor to enable optical image stabilization, while the long connecting tube offers additional room for zoom capabilities.

Whether Apple plans to implement the inventions in a shipping device remains unclear, though the company already owns a number of patents covering three-sensor, three-lens systems, cube splitter designs and other imaging technology.
The '666 patent goes on to explain super resolution algorithms, which in part exploit natural misalignments in the relative positions of the sensor arrays to sample a given scene at higher spatial resolution. Additional super-resolution samples are gleaned from edges detected at sub-sampled sites, as well as from color information and artifacts.
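That description maps onto the textbook shift-and-add approach to super resolution: frames sampled at known sub-pixel offsets are registered onto a finer grid and merged. Apple's actual algorithm is not spelled out in the patent summary here, so the Python sketch below is only a naive illustration of how sub-pixel misalignments between sensors yield extra spatial information.

```python
import numpy as np

def shift_and_add(frames, offsets, scale=2):
    """Naive shift-and-add super resolution.

    frames:  list of (h, w) low-resolution arrays, one per sensor
    offsets: list of (dy, dx) sub-pixel misalignments, in LR pixel units
    scale:   upsampling factor of the high-resolution grid
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, offsets):
        # Map each LR sample to its nearest position on the HR grid.
        ys = np.clip(np.round((np.arange(h) + dy) * scale).astype(int),
                     0, h * scale - 1)
        xs = np.clip(np.round((np.arange(w) + dx) * scale).astype(int),
                     0, w * scale - 1)
        acc[np.ix_(ys, xs)] += frame
        weight[np.ix_(ys, xs)] += 1
    # Grid positions never hit by a sample remain zero; a real
    # pipeline would interpolate these and deblur the result.
    return np.divide(acc, weight, out=acc.copy(), where=weight > 0)
```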
Apple's super resolution camera patent was first filed for in July 2015 and credits Richard J. Topliss and Richard H. Tsai as its inventors. The cube splitter patent update was filed for in March 2015 and credits Steven Webster and Ning Y. Chan as its inventors.

Apple's third patent deals with embedded sensors, specifically light-sensing components like ambient light and proximity sensors.
As noted in Apple's patent for "Electronic devices with display-integrated light sensors," light-sensing components are commonly positioned apart from the device display. While advantageous from a production standpoint, such implementations waste space or, in some cases, force sleek designs to be compromised. Indeed, the iPhone's proximity sensor and ALS sit above the display near the handset's ear speaker.
Apple proposes forming sensors on display layers that already support conductive traces. Some embodiments provide for sensor positioning at the periphery of a device display, beyond the edge of the touch-sensitive traces.
For example, an ALS unit might be disposed at the extreme edge of an OLED display, then covered by a touch-sensitive layer and finally an encapsulation layer. Alternatively, a handset could include a dedicated TFT layer onto which a variety of sensors are embedded.
In any case, the sensor or sensors are disposed within the display itself, not above it as with current iPhone models. This design tweak alone would shave precious millimeters off the final design and could pave the way to a true full-screen display.
Apple is said to be working on an advanced iPhone design with a "full-screen face," meaning the rumored OLED display would stretch across the device's entire front. The company last week patented technology detailing a fingerprint sensor that works through portable device displays, though the ear speaker remains a problem.
Apple's embedded light sensor patent was first filed for in June 2015 and credits Erik G. de Jong, Anna-Katrina Shedletsky and Prashanth S. Holenarsipur as its inventors.
