Sony says dual-lens camera tech launching with 'major smartphone players' in next year

Posted in iPhone
Apple's image sensor supplier, Sony, is expecting its dual-lens camera technology to be launched by "major smartphone players" within the next year, according to comments by the company's CFO.

"Well, for next year, our so-called dual-lens -- dual-camera platform will be launched by, we believe, from major smartphone players," said Kenichiro Yoshida during Sony's most recent results call, as reported by Xperia Blog.

Yoshida warned, however, that the high-end smartphone market is "slowing down," which could impact demand and production by smartphone makers.

"So, we believe the real start, the takeoff of smartphone with dual-lens camera will be in the year of 2017," he noted.

Apple is believed to be working on a dual-lens camera based on Sony technology, which could appear in an "iPhone 7 Plus", or at least some versions of it. The second lens might be used for optical zoom functions, overcoming one of the main limitations of most smartphone cameras.

Yoshida's comments suggest that Apple could potentially hold a dual-lens system back for an "iPhone 7s," although market constraints are less likely to affect the company given the popularity and profitability of its devices.

Comments

  • Reply 1 of 17
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
  • Reply 2 of 17
    msantti Posts: 1,377 member
    I doubt it's dual cameras just for the sake of it.

    I want this stuff on the 7, not the 7s.
  • Reply 3 of 17
    kkerst Posts: 330 member
    Think Lytro, but with two lenses. I've been waiting for light field cameras on the iPhone since it came out.
  • Reply 4 of 17
    Mr_Grey Posts: 118 member
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
    Well, once you have two (or more) cameras tied together in this fashion, you can do lots of different processing on the images, and one of those choices is actually depth-sensing. LIDAR seems unlikely in the extreme, given that it would require lasers embedded in the phone, which is neither thick enough to accommodate them nor able to supply the power to drive them.
    edited February 2016
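    (For what it's worth, the depth-from-two-cameras idea is plain stereo triangulation. As a minimal sketch, assuming two rectified cameras with focal length f in pixels, baseline B, and a measured pixel shift, or disparity, d:

        Z = \frac{f\,B}{d}, \qquad \Delta Z \approx \frac{Z^{2}}{f\,B}\,\Delta d

    The error term grows with the square of the distance, so a baseline a few millimetres wide only resolves depth at close range.)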
  • Reply 5 of 17
    Mr_Grey said:
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
    Well, once you have two (or more) cameras tied together in this fashion, you can do lots of different processing on the images, and one of those choices is actually depth-sensing. LIDAR seems unlikely in the extreme, given that it would require lasers embedded in the phone, which is neither thick enough to accommodate them nor able to supply the power to drive them.

    http://www.theverge.com/2015/4/5/8347735/this-new-camera-sensor-could-turn-your-phone-into-a-3d-scanner
  • Reply 6 of 17
    jfc1138 Posts: 3,090 member
    msantti said:
    I doubt it's dual cameras just for the sake of it.

    I want this stuff on the 7, not the 7s.
    It does feel more like a full-on 7 thing than an "s" bump to me...
  • Reply 7 of 17
    wizard69 Posts: 13,377 member
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
    The idea, from what I've heard, is that one lens assembly captures the color component while the other captures luminance. At least, that is the goal in one description I've seen. Supposedly this means optimization for each task, and thus a very wide range of light-level capability. It also means more pixels available at a lower noise level.

    It will be very interesting to see what actually ships with the next iPhone. I don't expect Apple to ship anything unless it can improve overall photographic performance. Since photographic quality is a huge selling point for the iPhone, there has to be a benefit or Apple won't do it. I'm also very interested to know what the trade-offs will be; there are always trade-offs with such tech transitions.

    As for the first iteration, I wouldn't expect a lot of new features beyond higher-quality pictures. Apple's first goal when introducing new tech is always to keep things simple. We might simply see a 20-megapixel camera advertised.
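    (To make the color-plus-luminance idea concrete: a minimal sketch in Python/NumPy, assuming a monochrome frame and a color frame that are already registered to the same viewpoint. The function name and the Rec. 709 luma weights are illustrative choices only, not anything Sony or Apple has confirmed.)

        import numpy as np

        def fuse_mono_color(mono, rgb):
            """Combine a clean monochrome frame with a noisier color frame.

            mono: HxW float array in [0, 1] from the unfiltered sensor (high-SNR luminance).
            rgb:  HxWx3 float array in [0, 1] from the color sensor.
            """
            # Luminance of the color frame (Rec. 709 weights).
            luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
            # Chrominance: each pixel's color relative to its own luminance.
            chroma = rgb / (luma[..., None] + 1e-6)
            # Recombine: low-noise luminance from the mono sensor, color from the RGB one.
            return np.clip(chroma * mono[..., None], 0.0, 1.0)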
  • Reply 8 of 17
    mattinoz Posts: 2,299 member
    I thought the two lenses were set for different depth ranges? Which allows for simpler, shorter optics and a thinner camera.

    Doesn't light field photography require a much bigger distance between the outer edges of the lenses?

  • Reply 9 of 17
    Two lenses don't provide depth sensing. You get, at most, higher quality, according to the white paper of that company Apple purchased...
  • Reply 10 of 17
    mjtomlin Posts: 2,673 member
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
    Two lenses don't provide depth sensing. You get, at most, higher quality, according to the white paper of that company Apple purchased...

    If the cameras were sensitive enough, you could measure shift and determine distance. This would allow you to adjust the focus of an image after it's taken. Of course, being THAT close together, I can only imagine the depth range is fairly limited. But I believe the real point here is using the second sensor to obtain more information and therefore produce a more accurate image.
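    (A back-of-the-envelope sketch of the shift-to-distance idea, in Python; the focal length and baseline below are made-up, phone-scale numbers, not Apple's or Sony's specs.)

        # Stereo triangulation: Z = f * B / d, where d is the pixel shift
        # (disparity) of the same feature between the two images.
        F_PIXELS = 3000.0   # hypothetical focal length, expressed in pixels
        BASELINE_M = 0.01   # hypothetical 10 mm gap between the two lenses

        def distance_from_shift(disparity_px):
            return F_PIXELS * BASELINE_M / disparity_px

        for d in (100.0, 10.0, 1.0):
            print(f"shift of {d:5.0f} px -> {distance_from_shift(d):5.1f} m")
        # shift of   100 px ->   0.3 m
        # shift of    10 px ->   3.0 m
        # shift of     1 px ->  30.0 m
        # Past ~30 m the shift falls below a pixel, which is why the usable depth
        # range of such a narrow baseline is indeed fairly limited.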
  • Reply 11 of 17
    foggyhill Posts: 4,767 member
    mjtomlin said:
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.

    If the cameras were sensitive enough, you could measure shift and determine distance. This would allow you to adjust the focus of an image after it's taken. Of course, being THAT close together, I can only imagine the depth range is fairly limited. But I believe the real point here is using the second sensor to obtain more information and therefore produce a more accurate image.
    I think being able to put a bigger sensor in a smaller body is the main advantage, though I doubt it doubles the sensor size, and it doesn't need to. It just needs to do better in low light. This complicates OIS and any future optical zooming, so there must indeed be some major advantage if Apple does it.
  • Reply 12 of 17
    http://www.scribd.com/mobile/doc/261875793/LinX-Imaging-Presentation

    Scroll to the end of the document, "The Optimal Array"

  • Reply 13 of 17
    cornchip Posts: 1,945 member
    Rest assured it won't look anything like that 
  • Reply 14 of 17
    1983 Posts: 1,225 member
    Apple lost its traditional lead in camera image quality with the 6 series (especially the 6s); hopefully this new approach will put them back on top again.
  • Reply 15 of 17
    hmm Posts: 3,405 member
    wizard69 said:
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
    The idea, from what I've heard, is that one lens assembly captures the color component while the other captures luminance. At least, that is the goal in one description I've seen. Supposedly this means optimization for each task, and thus a very wide range of light-level capability. It also means more pixels available at a lower noise level.

    It will be very interesting to see what actually ships with the next iPhone. I don't expect Apple to ship anything unless it can improve overall photographic performance. Since photographic quality is a huge selling point for the iPhone, there has to be a benefit or Apple won't do it. I'm also very interested to know what the trade-offs will be; there are always trade-offs with such tech transitions.

    As for the first iteration, I wouldn't expect a lot of new features beyond higher-quality pictures. Apple's first goal when introducing new tech is always to keep things simple. We might simply see a 20-megapixel camera advertised.
    I thought they were using it to reduce noise. I never understood the idea of splitting up color and luminance at the sensor level, as that seems like a poor representation of the underlying hardware. They're using filtered sites to transmit specific wavelength ranges to the sensor.
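    (To make the "filtered sites" point concrete: a toy NumPy sketch of the standard RGGB Bayer mosaic, where each photosite records only one wavelength band. A filterless monochrome sensor skips this step, which is where the separate luminance sensor idea comes from; the layout below is the textbook pattern, not any specific Sony part.)

        import numpy as np

        def bayer_mosaic(rgb):
            """Simulate an RGGB color filter array: each photosite keeps one channel."""
            h, w, _ = rgb.shape
            mosaic = np.zeros((h, w), dtype=rgb.dtype)
            mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
            mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
            mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
            mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
            # Full color has to be interpolated (demosaiced) back from this one plane.
            return mosaic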
  • Reply 16 of 17
    cnocbui Posts: 3,613 member
    Mr_Grey said:
    A two lens system makes no sense to me. If there were a depth-sensing LIDAR-like sensor instead of a second lens, I could see some use for that.
    Well, once you have two (or more) cameras tied together in this fashion, you can do lots of different processing on the images, and one of those choices is actually depth-sensing. LIDAR seems unlikely in the extreme, given that it would require lasers embedded in the phone, which is neither thick enough to accommodate them nor able to supply the power to drive them.
    If the two lenses have different focal lengths, no, you couldn't.
  • Reply 17 of 17
    cnocbui Posts: 3,613 member
    1983 said:
    Apple lost its traditional lead in camera image quality with the 6 series (especially the 6s); hopefully this new approach will put them back on top again.
    What traditional lead?  Apple were very slow to bring quality cameras to their phones.  The first one with a decent offering was the 4S.