2019 iPhone predicted to have triple-lens camera with super-wide lens, improved selfie cam...


Comments

  • Reply 21 of 63
    elijahg said:
    I'm almost disappointed to say photos from some relatively cheap Android phones are really quite impressive, especially in low-light or night shots. Apple needs to pull a blinder to surpass those. Photos from Huawei's P30 are stunning (even if the colours have to be fixed), making the iPhone's look very average, which is quite saddening really. 
The iPhone is agreed to be in the top 5 cameras of 2018; what else are you expecting?
  • Reply 22 of 63
cmka~+ Posts: 40 member
Does an improved selfie camera along with a super-wide front camera, taken together, sound to anyone else like Apple is laying the groundwork for a built-in 360 camera?
  • Reply 23 of 63
hentaiboy Posts: 1,252 member
    gatorguy said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
    LG is using the Huawei camera module? I did not know that. Seems like everyone uses Sony. 
    The P30 uses Sony sensors. 

    https://www.dpreview.com/news/7055536789/huawei-p30-pro-uses-sony-image-sensors-and-technology-from-corephotonics

  • Reply 24 of 63
    ...Kuo predicts that the ear cameras...
    Nice typo, guys.
    edited April 2019
  • Reply 25 of 63
gatorguy Posts: 24,176 member
    hentaiboy said:
    gatorguy said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
    LG is using the Huawei camera module? I did not know that. Seems like everyone uses Sony. 
    The P30 uses Sony sensors. 

    https://www.dpreview.com/news/7055536789/huawei-p30-pro-uses-sony-image-sensors-and-technology-from-corephotonics

    Thanks for that link. Tmay had indicated otherwise, but perhaps was confused about what was being discussed. 
  • Reply 26 of 63
tmay Posts: 6,309 member
    gatorguy said:
    hentaiboy said:
    gatorguy said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
    LG is using the Huawei camera module? I did not know that. Seems like everyone uses Sony. 
    The P30 uses Sony sensors. 

    https://www.dpreview.com/news/7055536789/huawei-p30-pro-uses-sony-image-sensors-and-technology-from-corephotonics

    Thanks for that link. Tmay had indicated otherwise, but perhaps was confused about what was being discussed. 
My link was only wrt the color profile issue of the RYYB imager, so I wasn't all that interested in the hardware specs.

    More interesting is that Samsung now owns Corephotonics. I wasn't previously aware of that.

    Sony does have the 48 MP IMX586 sensor designed to "bin" to 12 MP but I don't think those will be available until 2H 2019. I would think that this would be the sensor of choice for premium cameras in the next generation of smartphones.
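For readers curious what "binning" means here, below is a minimal sketch of 2x2 pixel binning under a simple averaging assumption; the frame sizes and noise model are illustrative only, and the real IMX586 pipeline (quad-Bayer color filter, remosaicing) is not modeled.

```python
import numpy as np

def bin_2x2(raw):
    """Average each 2x2 block of photosites into one output pixel."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "sensor dimensions must be even"
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Illustrative 48 MP frame (8000 x 6000) binned down to 12 MP (4000 x 3000).
# Combining four photosites per output pixel gathers roughly 4x the signal,
# which is why the binned mode behaves better in low light.
frame = np.random.poisson(lam=4.0, size=(6000, 8000)).astype(np.float32)
binned = bin_2x2(frame)
print(frame.shape, "->", binned.shape)   # (6000, 8000) -> (3000, 4000)
```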
  • Reply 27 of 63
    Thanks all, for the interesting comments. 

I’m still astonished at the quality we all get from smartphone cameras. It’s just stunning compared with what most of us could achieve with a good SLR 20 years ago. As a keen amateur I’d often be disappointed with poor exposure or blurred images on old SLRs, which are extremely rare with modern smartphones. (Of course modern SLRs are even more impressive.)

    I agree that what most of us crave is a good telephoto lens though. 2x isn’t enough and I’m not excited about wide angle when I can use Panorama mode for the rare times I’d use the wide camera. 

    The Oppo camera in the video above with 10x zoom does look very impressive. 
  • Reply 28 of 63
SpamSandwich Posts: 33,407 member
    melgross said:
I’d rather have a 1.4 µm sensing site size for the tele sensor instead of the super-wide camera, so that that camera would take pics as good as the wide camera does. Then I’d like to have Apple’s computational methods give us a night mode as Huawei and Google do. Both would be more useful than a super wide.
    Yes, Google’s night vision mode (based on the samples I’ve seen) is seriously impressive. Would love to see Apple improve on what’s out there now.

    It also makes me wonder why these software and hardware innovations haven’t yet found their way into security cameras. Night vision and grain reduction in consumer grade cameras would be a huge breakthrough for security companies and the public.
  • Reply 29 of 63
Tarbuckle Posts: 4 unconfirmed, member
Apple needs to up their game, so this is good news. Comparison photos on the web today, taken in near-total darkness, came out totally black on the iPhone but showed great detail on the P30 Pro, so I'm hoping they'll also catch up with Huawei's low-light software. Currently way behind the 8 ball.
  • Reply 30 of 63
    tmay said:
    gatorguy said:
    hentaiboy said:
    gatorguy said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
    LG is using the Huawei camera module? I did not know that. Seems like everyone uses Sony. 
    The P30 uses Sony sensors. 

    https://www.dpreview.com/news/7055536789/huawei-p30-pro-uses-sony-image-sensors-and-technology-from-corephotonics

    Thanks for that link. Tmay had indicated otherwise, but perhaps was confused about what was being discussed. 
    Sony does have the 48 MP IMX586 sensor designed to "bin" to 12 MP but I don't think those will be available until 2H 2019. I would think that this would be the sensor of choice for premium cameras in the next generation of smartphones.

    IMX586 is already present in at least 2 Android phones that I am aware of.

    1. Xiaomi Redmi Note 7 Pro at $200

    2. Honor View 20 at $500

  • Reply 31 of 63
tmay Posts: 6,309 member
    tmay said:
    gatorguy said:
    hentaiboy said:
    gatorguy said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
    LG is using the Huawei camera module? I did not know that. Seems like everyone uses Sony. 
    The P30 uses Sony sensors. 

    https://www.dpreview.com/news/7055536789/huawei-p30-pro-uses-sony-image-sensors-and-technology-from-corephotonics

    Thanks for that link. Tmay had indicated otherwise, but perhaps was confused about what was being discussed. 
    Sony does have the 48 MP IMX586 sensor designed to "bin" to 12 MP but I don't think those will be available until 2H 2019. I would think that this would be the sensor of choice for premium cameras in the next generation of smartphones.

    IMX586 is already present in at least 2 Android phones that I am aware of.

    1. Xiaomi Redmi Note 7 Pro at $200

    2. Honor View 20 at $500

    Thanks

    I only read one review, but it appears that the IMX586 at 48 MP isn't really delivering the IQ.
  • Reply 32 of 63
melgross Posts: 33,510 member
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
Yes. Good article. I’ve been talking about this for years. No Android phone camera is properly set up. Not a single one. The problem with most reviews is that the reviewers know little to nothing about photography. They can’t tell anything about their photos other than sharpness and exposure. They don’t understand color, for one thing. Or saturation. They take mediocre pictures that are often mis-shot for what they’re trying to illustrate. They don’t understand flare, or other shooting problems. 

    I’m now reading in a couple of places that Google has implemented color management in the Android OS. But they haven’t. That will require an entire overhaul of their display model. What they’ve done is a very modest ability to display (sometimes) two specific standards, but not at the same time. This is far from color management. I used to teach that.
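The "color model" being discussed amounts to characterizing how the imager's raw channels respond and mapping them into a standard color space. Below is a minimal sketch of that final mapping step; the 3x3 correction matrix is invented purely for illustration, whereas a real profile would be fit from reference charts shot on the specific sensor under known illuminants.

```python
import numpy as np

# Hypothetical 3x3 color correction matrix (CCM) mapping one sensor's
# white-balanced, linear raw RGB into linear sRGB. These numbers are made
# up for illustration; each row sums to 1 so neutral greys stay neutral.
CCM = np.array([
    [ 1.60, -0.45, -0.15],
    [-0.30,  1.50, -0.20],
    [-0.05, -0.55,  1.60],
])

def apply_ccm(raw_rgb, ccm):
    """Map an H x W x 3 linear raw image into linear sRGB and clip."""
    h, w, _ = raw_rgb.shape
    srgb = raw_rgb.reshape(-1, 3) @ ccm.T
    return np.clip(srgb, 0.0, 1.0).reshape(h, w, 3)

# A flat mid-grey patch stays neutral under this matrix.
patch = np.full((4, 4, 3), 0.5)
print(apply_ccm(patch, CCM)[0, 0])   # -> [0.5 0.5 0.5]
```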
  • Reply 33 of 63
melgross Posts: 33,510 member

    tmay said:
    gutengel said:
It seems that night mode on the camera is one feature that separates the Pixel from the rest of the market (despite being an unreliable phone). Let's see what Apple has up their sleeve this time.
    I believe Night Mode requires cloud processing.

    Apple would prefer to do it as an edge process, in real time, on any future iPhone. They will get there, but there isn't anything stopping a third party app today.
I wonder. Google and Huawei’s processing takes place after the picture is taken. Takes several seconds to process, and can’t be reversed. I don’t know whether the original unprocessed photo is retained. But Apple’s SoC is far more powerful in a number of ways. It’s possible it can be done in camera. I don’t know why Apple hasn’t done it yet. It’s not as though they can’t. Maybe this year. Rumor has it that their AI may be up to 4 times as powerful in the A13. That would put them well ahead of rivals, which they’re ahead of now.
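What that after-the-shot processing amounts to is merging a burst of short exposures: averaging N aligned frames cuts random noise by roughly the square root of N, and the alignment and motion rejection are where the seconds go. Below is a minimal sketch of only the merge step, assuming the frames are already aligned; the scene and noise figures are simulated for illustration.

```python
import numpy as np

def merge_burst(frames):
    """Average an already-aligned burst of short exposures.

    Random noise in each frame is independent, so averaging N frames
    improves the signal-to-noise ratio by roughly sqrt(N).
    """
    return np.stack(frames).astype(np.float32).mean(axis=0)

# Simulated dim scene photographed 8 times with heavy noise.
rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0.02, 0.10, 64), (64, 1))
burst = [scene + rng.normal(scale=0.05, size=scene.shape) for _ in range(8)]

merged = merge_burst(burst)
print("single-frame noise:", round(float(np.std(burst[0] - scene)), 3))
print("merged noise:      ", round(float(np.std(merged - scene)), 3))
```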
  • Reply 34 of 63
melgross Posts: 33,510 member

    gatorguy said:
    tmay said:
    gutengel said:
It seems that night mode on the camera is one feature that separates the Pixel from the rest of the market (despite being an unreliable phone). Let's see what Apple has up their sleeve this time.
    I believe Night Mode requires cloud processing.

    Apple would prefer to do it as an edge process, in real time, on any future iPhone. They will get there, but there isn't anything stopping a third party app today.
I don't believe so, sir. Where did you see it required cloud processing? FWIW Google is moving more of their features on-device rather than requiring cloud processing, e.g. Google Translate.
    Google is trying a mixed model. They won’t move it all to the device, because the SoCs available to them are too weak for that, and likely will be for years.
  • Reply 35 of 63
melgross Posts: 33,510 member

    gatorguy said:
    SNJOps said:
    Computational photography can only go so far at the moment. 
How much further? Have you read something about where the limitations will be? Personally I see a dim future for Sony's mirrorless line, no better than for DSLRs, and much of it will be due to computational photography.
    https://www.youtube.com/watch?v=WsAdG6wIAaM
It’s interesting reading Thom Hogan, the well-known photo expert. He knows the photo industry very well. He says that no camera company has a processing chip that even approaches the best SoC. Not even close. He says that they can’t do what companies like Apple do. That they don’t know where they’re going with this.

I believe that they’re mired in a competitive jungle. They only look to each other. In interviews with executives from the various camera companies, you can see this. Some have even said, though not always directly, that they can’t compete against smartphones, so they hope for full frame and the major resolution and noise advantages of that format.

But it’s more than picture quality that’s at stake here. While Thom, and others, complain that camera companies aren’t doing what smartphone companies can do with getting photos up to web sites and such, I think that’s a losing battle, and the most important one. Even more important than ultimate quality for most people. And why is that? It’s because a smartphone is exactly that—a phone. It has built-in LTE. Cameras don’t have that, and it’s not likely they will. It would be another $100, plus the monthly fee from the carriers. How many people buying a camera will pay for that? So the best they can do is to push a photo to your phone, where you do whatever you can there, and send it from there. Too complicated for most people to bother with, particularly since the resolution needs to be severely brought down to send it. Better to use the smartphone instead.
  • Reply 36 of 63
melgross Posts: 33,510 member
    tmay said:
    gatorguy said:
    hentaiboy said:
    gatorguy said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
    LG is using the Huawei camera module? I did not know that. Seems like everyone uses Sony. 
    The P30 uses Sony sensors. 

    https://www.dpreview.com/news/7055536789/huawei-p30-pro-uses-sony-image-sensors-and-technology-from-corephotonics

    Thanks for that link. Tmay had indicated otherwise, but perhaps was confused about what was being discussed. 
My link was only wrt the color profile issue of the RYYB imager, so I wasn't all that interested in the hardware specs.

    More interesting is that Samsung now owns Corephotonics. I wasn't previously aware of that.

    Sony does have the 48 MP IMX586 sensor designed to "bin" to 12 MP but I don't think those will be available until 2H 2019. I would think that this would be the sensor of choice for premium cameras in the next generation of smartphones.
There is at least one camera that has a sensor like that, or that particular sensor. The pictures can go from high resolution with very poor noise and dynamic range, or average resolution with very good noise and dynamic range. You pick. I don’t see that as a long-term solution. A lot of this is gimmicky. You have to make on-the-spot decisions as to how you’re going to use these cameras that most people aren’t going to understand, or want to make. That’s always been a problem with complicated choices: often, the wrong one is made, and you can’t go back from it.
    edited April 2019
  • Reply 37 of 63
melgross Posts: 33,510 member

    melgross said:
I’d rather have a 1.4 µm sensing site size for the tele sensor instead of the super-wide camera, so that that camera would take pics as good as the wide camera does. Then I’d like to have Apple’s computational methods give us a night mode as Huawei and Google do. Both would be more useful than a super wide.
    Yes, Google’s night vision mode (based on the samples I’ve seen) is seriously impressive. Would love to see Apple improve on what’s out there now.

    It also makes me wonder why these software and hardware innovations haven’t yet found their way into security cameras. Night vision and grain reduction in consumer grade cameras would be a huge breakthrough for security companies and the public.
I imagine it’s because of the heavy processing involved. These cameras rely on highly sensitive sensors and light amplification, which requires no real processing.
  • Reply 38 of 63
gatorguy Posts: 24,176 member
    melgross said:
    tmay said:
    melgross said:
    Not innovation. Already in LG V-series phones since 2018. In fact they have 5 lenses/cameras now.
    And they take mediocre pictures.
    https://frankdoorhof.com/web/2019/04/solving-the-p30-pro-color-problems/

    Long story short, Huawei didn't build a color model to work with their imager.
Yes. Good article. I’ve been talking about this for years. No Android phone camera is properly set up. Not a single one. The problem with most reviews is that the reviewers know little to nothing about photography. They can’t tell anything about their photos other than sharpness and exposure. They don’t understand color, for one thing. Or saturation. They take mediocre pictures that are often mis-shot for what they’re trying to illustrate. They don’t understand flare, or other shooting problems. 

    I’m now reading in a couple of places that Google has implemented color management in the Android OS. But they haven’t. That will require an entire overhaul of their display model. What they’ve done is a very modest ability to display (sometimes) two specific standards, but not at the same time. This is far from color management. I used to teach that.
    Color is not nearly as important to me personally, especially from a smartphone camera. Punched up a bit is fine.

    Heck I just spent around three hours this weekend post-processing a portrait shoot and MANY of the shots were boosted, and every one of those distributed was modified in some way or another. Sometimes minor and sometimes major. Why? Because the people who wanted the shots would not have been happy with "as is" for the most part. 

Mel, I'm certain as a photographer yourself you knew that and do the same thing. Heck, what's one of the first things you typically (not always) do when opening a photo in Photoshop: boost the vibrance and saturation a tad. If the client looks healthier, the hair more vibrant, the skin is clearer, and her green dress didn't become ocean blue, it's all good. Even commercial products get their colors boosted. The actual color of "things" is less important than what the client WANTS the color to be, right?


edited April 2019
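For what that "boost the vibrance and saturation a tad" step looks like in code, here is a minimal sketch using Pillow's ImageEnhance.Color. It is only a crude global stand-in for Photoshop's vibrance slider (which boosts muted colors more and protects skin tones), and the filename is illustrative.

```python
from PIL import Image, ImageEnhance

def boost_color(img, factor=1.15):
    """Scale overall color saturation by `factor` (1.0 = unchanged)."""
    return ImageEnhance.Color(img).enhance(factor)

# Hypothetical usage; "portrait.jpg" is an illustrative filename.
# boosted = boost_color(Image.open("portrait.jpg"), 1.1)
# boosted.save("portrait_boosted.jpg")
```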
  • Reply 39 of 63
CheeseFreeze Posts: 1,247 member
I wonder what the reasoning for an ultra-wide lens is. I have more need for more optical zoom; the phone goes plenty wide already.
One priority for Apple is to improve low-light performance, considering the innovations on Android (e.g. Google's Night Sight), and hopefully to add more dynamic range in video and photos.

  • Reply 40 of 63
CheeseFreeze Posts: 1,247 member

    gatorguy said:
    SNJOps said:
    Computational photography can only go so far at the moment. 
How much further? Have you read something about where the limitations will be? Personally I see a dim future for Sony's mirrorless line, no better than for DSLRs, and much of it will be due to computational photography.
    https://www.youtube.com/watch?v=WsAdG6wIAaM
There are limitations due to the lack of information the sensor/lens serves an ML algorithm. The more 'source data' (e.g. resolution, depth maps, larger sensor, dynamic range), the better the starting point for computational photography. I don't see a dim future for full-frame and medium-format cameras at all. Although the market is really shrinking, it serves a different end-user and use case. And don't forget that computational photography can also be applied to mirrorless cameras. But yes, the photo camera manufacturers had better start working on getting this on par with phones (AFAIK they are).