Apple patents three-sensor, three-lens iPhone camera for enhanced color photos
The U.S. Patent and Trademark Office on Tuesday granted Apple a patent for a camera system that uses three separate sensors, one for luminance and two for chrominance, to generate images with both higher resolution and color accuracy.

Source: USPTO
Apple's U.S. Patent No. 8,497,897 for "Image capture using luminance and chrominance sensors" describes a unique multi-sensor camera system that can be used in portable devices like the iPhone.
The main thrust of the patent is to combine three separate images generated by one luminance sensor disposed between two chrominance sensors. Each sensor has a "lens train," or lens assembly, in front of it that directs light toward the sensor surface. The document notes that the sensors can be disposed on a single circuit board, or separated.
Important to the system's functionality is the sensor layout. In most embodiments, the luminance sensor is flanked on two sides by the chrominance sensors. This positioning allows the camera to compare information sourced from the three generated images. For example, an image processing module can take raw data from the three sensors, comprising luminance, color, and other data, to form a composite color picture. The resulting photo would be of higher quality than one from a system using a single unified sensor.
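The patent leaves the actual fusion math unspecified, but the core idea can be sketched with a standard full-range YCbCr-to-RGB conversion. The function name and coefficients below are illustrative assumptions, not drawn from the filing:

```python
def fuse_pixel(y, cb, cr):
    """Combine one luminance sample (Y) with two chrominance samples
    (Cb, Cr) into an RGB triple using the standard full-range YCbCr
    conversion. The patent's real pipeline is unspecified; this only
    illustrates how three sensor readings yield one color pixel."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Mid-gray luminance with neutral chroma stays gray:
print(fuse_pixel(128, 128, 128))  # -> (128, 128, 128)
```

With neutral chroma (Cb = Cr = 128) the chrominance terms vanish, so the luminance sensor alone sets the pixel brightness, which is why a dedicated high-resolution luminance sensor can sharpen the whole image.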
To execute an effective comparison of the two chrominance sensor images, a stereo map is created so that differences, or redundancies, can be measured. Depending on the situation and system setup (filters, pixel count, etc.), the stereo map is processed and combined with data from the luminance sensor to create an accurate scene representation.
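A toy illustration of building such a stereo map by block matching along a single scanline; this is a simplified stand-in for whatever method Apple would actually use, and the window size and search range are arbitrary assumptions:

```python
def disparity(left_row, right_row, window=1, max_d=4):
    """Per-pixel disparity between two scanlines via sum-of-absolute-
    differences block matching. A toy stand-in for the patent's
    'stereo map'; real systems match 2-D blocks with sub-pixel
    refinement and regularization."""
    n = len(left_row)
    out = []
    for x in range(n):
        best_d, best_cost = 0, float("inf")
        for d in range(max_d + 1):
            cost = 0
            for w in range(-window, window + 1):
                xl, xr = x + w, x + w - d
                if 0 <= xl < n and 0 <= xr < n:
                    cost += abs(left_row[xl] - right_row[xr])
                else:
                    cost += 255  # penalize samples falling off the edge
            if cost < best_cost:
                best_cost, best_d = cost, d
        out.append(best_d)
    return out

# A bright feature shifted by 2 pixels between the two views:
left  = [10, 10, 10, 200, 200, 200, 10, 10]
right = [10, 200, 200, 200, 10, 10, 10, 10]
print(disparity(left, right)[4])  # -> 2 at the feature's center
```

Larger disparities mean closer objects, which is exactly the information the image processor needs to align the two chrominance images before merging them with the luminance data.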

Source: USPTO
The stereo map also solves a "blind spot" issue that arises when using three sensors with three lens trains. The patent offers the example of an object in the foreground obscuring an object in the background (as seen in the first illustration). Depending on the scene, color information may be non-existent for one sensor, which would negatively affect a photo's resolution.
To overcome this inherent flaw, one embodiment proposes the two chrominance sensors be offset so that their blind regions do not overlap. If a nearby object creates a blind region for a first sensor, the offset will allow for the image processor to replace compromised image data with information from a second sensor.
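One way to picture that fallback, assuming per-pixel validity masks have already been derived from the occlusion analysis; the function and the masks themselves are hypothetical, not from the patent:

```python
def pick_chroma(chroma_a, chroma_b, valid_a, valid_b):
    """Choose a chroma sample per pixel: average where both sensors see
    the point, fall back to the other sensor where one view is occluded.
    The validity masks stand in for the disparity/occlusion analysis
    the patent describes; here they are supplied directly."""
    out = []
    for a, b, ok_a, ok_b in zip(chroma_a, chroma_b, valid_a, valid_b):
        if ok_a and ok_b:
            out.append((a + b) / 2)   # both sensors see the point
        elif ok_a:
            out.append(a)             # sensor B is blind here
        elif ok_b:
            out.append(b)             # sensor A is blind here
        else:
            out.append(None)          # neither sensor has data
    return out

print(pick_chroma([100, 100, 100], [120, 120, 120],
                  [True, False, True], [True, True, False]))
# -> [110.0, 120, 100]
```

Because the sensors are offset, the middle and last pixels each stay recoverable from whichever sensor still has a clear line of sight.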
Further, the image processor can use the stereo disparity map built from the two chrominance sensors' image data to compensate for distortion.
Other embodiments call for varied resolutions or lens configurations for the chrominance and luminance sensors, including larger apertures, different filters, or modified image data collection. These features could enhance low-light picture taking, for example, by compensating for lack of luminance with information provided by a modified chrominance sensor. Here, as with the above embodiments, the image processor is required to compile data from all three sensors.
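As a rough sketch of that low-light idea: if a modified chrominance sensor captures full RGB, a luma estimate can be derived from it (using Rec. 601 weights) and blended with the noisy reading from the dedicated luminance sensor. The blend weight and function names are illustrative assumptions:

```python
def blend_luma(y_direct, rgb_from_chroma, weight):
    """In low light the dedicated luminance sensor may be noisy, so
    blend its reading with a luma estimate derived from a chrominance
    sensor's RGB output (Rec. 601 weights). The fixed blend weight is
    a tuning assumption; the patent leaves the combination open."""
    r, g, b = rgb_from_chroma
    y_est = 0.299 * r + 0.587 * g + 0.114 * b
    return weight * y_direct + (1 - weight) * y_est

# Noisy direct reading of 90 pulled toward the chroma-derived estimate:
print(round(blend_luma(90, (100, 100, 100), 0.5), 2))  # -> 95.0
```

A real implementation would vary the weight per pixel with the measured noise level, but even this fixed blend shows why the image processor must compile data from all three sensors.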
While Apple is unlikely to implement the three-sensor camera tech anytime soon, a future iPhone could theoretically carry such a platform.
Apple's luminance and chrominance sensor patent was first filed in 2010 and credits David S. Gere as its inventor.

Comments
I doubt however that Apple will use 3 lenses. It would cost too much (even if it just cost 1 dollar, Apple sells 120 million iPhones per year; that is almost 300 million in extra cost).
- Samsung will have a phone using it before Apple;
- Apple will sue;
- Nokia will sue Apple because they're doing something "similar" in Lumia, first;
- Sony will jump at the chance to say they thought of it first;
- Engadget, Verge, Ars and countless other sites will claim it as "obvious";
- Fans of Android will claim to have seen the same sensor on Star-Trek in 1971, so "prior art";
- Any and all lawsuits will drag out for 5 years;
- USPTO will invalidate Apple's original patent;
Summary: not worth the time or ink to register the patent.
End. Of. Story.
/s
Best post of the day
Apple's patent filings are Samsung's R&D.
Quote:
Originally Posted by ThePixelDoc
@shompa - no need to worry:
- Samsung will have a phone using it before Apple;
- Apple will sue;
- Nokia will sue Apple because they're doing something "similar" in Lumia, first;
- Sony will jump at the chance to say they thought of it first;
- Engadget, Verge, Ars and countless other sites will claim it as "obvious";
- Fans of Android will claim to have seen the same sensor on Star-Trek in 1971, so "prior art";
- Any and all lawsuits will drag out for 5 years;
- USPTO will invalidate Apple's original patent;
Summary: not worth the time or ink to register the patent.
End. Of. Story.
/s
WELL said! Sad but TRUE!
Then again, every Android-based phone/tablet maker but Samsung has been losing $$. No wonder.
Psh. My Sprint Samsung Galaxy X Fire LTE HD with SprintCast has a camera with moar megapixels. That's the only important camera number.
/s
AnalogJack
Is this like Photoshop's LAB mode?
Only if iOS allows apps to capture more sensor data than the current default: 8-bit JPEG.
Quote:
Originally Posted by ThePixelDoc
@shompa - no need to worry:
- Samsung will have a phone using it before Apple;
- Apple will sue;
- Nokia will sue Apple because they're doing something "similar" in Lumia, first;
- Sony will jump at the chance to say they thought of it first;
- Engadget, Verge, Ars and countless other sites will claim it as "obvious";
- Fans of Android will claim to have seen the same sensor on Star-Trek in 1971, so "prior art";
- Any and all lawsuits will drag out for 5 years;
- USPTO will invalidate Apple's original patent;
Summary: not worth the time or ink to register the patent.
End. Of. Story.
/s
So Apple didn't think through all of this and is just being stupid? Damn you're so much smarter than Cook et al.
Hoping to see this as a landmark feature for the iPhone 6. It's amazing this patent is 3 years old and technology hasn't been able to support the idea until now.
Anyway, the point being that just because something is patented doesn't always make it a good idea. But who knows, it might lead to some interesting capabilities down the road.
Quote:
Originally Posted by Tallest Skil
Samsung needs a few scandals on the order of the IRS, et. al. Something internal that proves they've broken laws internationally.
Indeed. Samsung is kind of smarting that Congress is not after them for tax minimization. They so want to copy Apple on that front, too.
Quote:
Originally Posted by shompa
I doubt however that Apple will use 3 lenses. It would cost to much. (even if it just cost 1 dollar. Apple sells 120 million iphones per year. That is almost 300 million in extra cost)
Cost too much? You realize that amount is less than 0.2% of their 2012 revenues? Or less than 0.85% of their revenues from last quarter? It's a minuscule drop in the bucket.
Quote:
Originally Posted by shompa
Fun patent. It would be possible to have for example 12Mpix x3 setup for 36mpix camera.
I doubt however that Apple will use 3 lenses. It would cost to much. (even if it just cost 1 dollar. Apple sells 120 million iphones per year. That is almost 300 million in extra cost)
I don't think it will be anytime soon, but eventually Apple will use multiple sensors, as will everyone else. The more lenses, the higher the megapixel count, as you noted, but it also allows for interesting effects and capabilities. It's the basis of the Lytro camera, for example, and a device with a small cluster of lenses, like a compound eye, could easily take pictures that equal, and even *best*, SLR quality.
At the moment the chief limiting factor of small cameras is the lack of the SLR's giant optics. Small plastic or even glass lenses just can't compete. Cameras with compound or multiple lenses, however, will theoretically eliminate that factor in the near future, as well as bringing features through image processing that an SLR will never, and can never, have.
Quote:
Originally Posted by Gazoobee
I don't think it will be anytime soon, but eventually Apple will use multiple sensors, as will everyone else. The more lenses, the higher the megapixel count, as you noted, but it also allows for interesting effects and capabilities. It's the basis of the Lytro camera, for example, and a device with a small cluster of lenses, like a compound eye, could easily take pictures that equal, and even *best*, SLR quality.
At the moment the chief limiting factor of small cameras is the lack of the SLR's giant optics. Small plastic or even glass lenses just can't compete. Cameras with compound or multiple lenses, however, will theoretically eliminate that factor in the near future, as well as bringing features through image processing that an SLR will never, and can never, have.
This.
Apple wants the iPhone to be the best camera phone in the world. Theirs is a world of full range dSLRs for professionals, 4/3rds dSLRs for prosumers, iPhones, and crap cameras. As phone components become less expensive and ARM chips more computationally powerful, camera technology will be critical for 3 reasons.
1) replace HDR with instant technology
2) improve low light pictures
3) better recognition of objects, places and faces (visual Siri... "That picture you just took of your mother... she looks jaundiced... want me to call her doctor?")
When weighing an increase in cost, you need to compare it against potential increases in revenues attributed to the new additions.
This will never happen. Someone else may try it, but they'll fail. There are lots of theoretical issues to resolve, ranging from properly computing the location of reflections to reconstructing the color information lost to them, and the proposed blind-spot fix is not a real solution, because color information is irreversibly lost in this stereo setup.
Patents are always worth registering for big companies like Apple; they cost less than what those companies pay for toilet paper in a single week, and they can be used defensively.
Introducing Icamera: it has 48 single cameras to produce 16 single amazing photos, or a single panorama video of 2.3 gigapixels, with built-in Bluetooth to connect to your device and a 1-terabyte hard drive, and it's a full helmet at just under 5 pounds, showing every human viewing angle for hours at a time (6-hour battery life on full camera video). Can you see what I see?
/s