'iPhone 11' camera & new 'A13' chip will provide far better photography


Comments

  • Reply 21 of 28
    Gabygaby Posts: 194member
    I’m expecting Quantum Film to finally make an appearance this year, and more from patents obtained from LinX i.e the multiple sensors working more cohesively together at all times to increase image fidelity even further. Fingers crossed... 19 days and counting....
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 22 of 28
    tmaytmay Posts: 6,469member
    daven said:
    tmay said:
    daven said:
    I can see how having dedicated silicon for matrix operations will benefit photography and video production. Multiple cameras give you different samples of points in space. If two cameras read the same point differently, you have to choose which sample is correct, average the two data values, or have some algorithm determine an intermediate value for that point. With three cameras, if two of them agree, the true value is likely the value the two cameras agreed on. However, when you have multiple cameras you also have different viewpoints and have to calculate how the points correlate. When you do that you use matrices with sine and cosine values, and almost all of the time you don't have a direct correlation. A point in one camera almost always corresponds to a point between pixel positions in the second camera, so you have to sample the surrounding points and calculate what the value of the corresponding point in the second image would be if it were sampled. That can get computationally expensive, and having dedicated silicon may make it practical. Having three cameras adds to the complexity, but also adds to the amount of data you have to ensure the pixel value is correct.

    It really is amazing how far digital photography has come in twenty years.
    It might be the case that Apple builds an internal model of the three lens configuration, specific to an individual iPhone, processes it with machine learning, and that becomes the basis of subsequent calculation, until an optimization level is reached, perhaps giving the user realtime response.

    But, yeah, it is amazing.
    Yes. The phone can start out with a basic general calibration and fine-tune it with machine learning to get it even better. Because of manufacturing variations (the lenses will not be mounted with the same pixel precision between any two cameras, so there will always be differences) and because the world is 3D, you will have differences in close-up photos. Right now, many phones take multiple photos, often at slightly different exposures, align them (they need to be aligned because of camera shake), and average the pixels. That works OK for still photos, but if you have a fast-moving subject or are making a video, it isn't a great technique. Now consider three simultaneous photos: you can do pixel averaging for video along with the standard camera manufacturer calibration. Of course, I'm speculating on all of this; I have no inside information. From what I understand, the camera setup also allows for wider field-of-view photos, so you can fix mis-framed photos.

    That said, I upgraded my iPhone 5s with an Xs Max so I won't be buying a new phone for a while.
    I'm thinking that Apple starts with a factory calibration, and then the iPhone, over time, applies machine learning to the various real-world configurations of focus amongst the three lenses. The iPhone should be able to compute the image with the least amount of overhead using that process. For still images, it might be best to just do the full computation, and maybe that won't add too much latency to the process. Video is much more forgiving, so it may be a 24, 30, or 60 fps process. We'll just have to wait.

    I'm speculating as well.

    Depending on whether the new iPad Pro offers the triple lens, I'll be buying the triple lens iPhone, but I'm fine with using an iPad for a camera for some things anyway.
    edited August 2019
    watto_cobra
     1Like 0Dislikes 0Informatives
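The point-correspondence idea discussed in this thread can be sketched in code: map a pixel coordinate from one camera into another's frame with a sine/cosine rotation matrix, then, since the mapped point almost never lands exactly on a pixel, estimate its value by blending the four surrounding pixels. This is a minimal illustrative sketch with made-up function names, not Apple's actual pipeline:

```python
import numpy as np

def map_point(theta, tx, ty, x, y):
    """Rotate and translate a pixel coordinate from camera A into
    camera B's frame, using a 2x2 sine/cosine rotation matrix."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ np.array([x, y]) + np.array([tx, ty])

def bilinear_sample(img, x, y):
    """Estimate the image value at a fractional coordinate by blending
    the four surrounding pixels, weighted by proximity."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0]     + fx * img[y0, x0 + 1]
    bot = (1 - fx) * img[y0 + 1, x0] + fx * img[y0 + 1, x0 + 1]
    return (1 - fy) * top + (fy) * bot
```

A real multi-camera system would also handle lens distortion and parallax, but this is the core of why the math is per-pixel expensive: every mapped point triggers a neighborhood resample.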
  • Reply 23 of 28
    Gabygaby Posts: 194member
    tmay said:
    As their products mature, Apple really has to get away from this annual extravaganza: expectations for it build throughout the year and, at the end, people and the media are always disappointed -- and the stock drops.

    I see Apple increasingly going to more impromptu roll-outs throughout the year.   I suspect that a 5G phone somewhere in the first half of 2020 will be one of those.
    Highly unlikely that Apple would move away from a September release for any 2020 iPhone, even if they could incorporate 5G earlier. There is still a substantial feature and OS update that is best delivered after WWDC, giving developers time to introduce updated and new apps. That's why September has become the standard.
    Since when did infrastructure advancements and improvements become “features”? There’s been so much hype around it, and it’s ridiculous. As is the audacity to charge a premium for the modems. Technology is always advancing, and that cost should not fall on the customer outright, as they are already paying more in relayed costs. Typical of greedy corporations trying to externalise even more of their cost of business. 
     0Likes 0Dislikes 0Informatives
  • Reply 24 of 28
    tmaytmay Posts: 6,469member
    Gaby said:
    tmay said:
    As their products mature, Apple really has to get away from this annual extravaganza: expectations for it build throughout the year and, at the end, people and the media are always disappointed -- and the stock drops.

    I see Apple increasingly going to more impromptu roll-outs throughout the year.   I suspect that a 5G phone somewhere in the first half of 2020 will be one of those.
    Highly unlikely that Apple would move away from a September release for any 2020 iPhone, even if they could incorporate 5G earlier. There is still a substantial feature and OS update that is best delivered after WWDC, giving developers time to introduce updated and new apps. That's why September has become the standard.
    Since when did infrastructure advancements and improvements become “features”? There’s been so much hype around it, and it’s ridiculous. As is the audacity to charge a premium for the modems. Technology is always advancing, and that cost should not fall on the customer outright, as they are already paying more in relayed costs. Typical of greedy corporations trying to externalise even more of their cost of business. 
    LOL
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 25 of 28
    coolfactorcoolfactor Posts: 2,390member
    Rumours like this are exciting! Breakthrough, unique Apple hardware advances that the competition finds difficult to match. Let's bring it!

    lolliverwatto_cobra
     2Likes 0Dislikes 0Informatives
  • Reply 26 of 28
    coolfactorcoolfactor Posts: 2,390member
    Here I thought the bump was ugly as hell, then they gave us the notch. Now they're taking it to the next level: the hideous-looking multi-cam. I'm just watching Apple from the sidelines these days; my last remaining Apple product is my SE. Good thing Apple doesn't need my annual 3k spend.

    I can't see an SE-sized phone coming; sad days ahead when it dies. There seem to be no more small phones on the market.

    First, the notch is a non-issue, and you'd realize that if you actually used a device with one. I loved my SE, too, and was determined to stay at that size, but when my second SE started giving me charging problems, I finally switched to the XR, and I'm very thankful that I did. Yes, this phone is too big to use comfortably with one hand, given my small stature, but everything else about it is amazing! The camera, the speakers, the battery life. It's a complete transformation from how I used my phones in the past. It just keeps on ticking and doing whatever I need it to do.

    king editor the gratelolliverGeorgeBMacwatto_cobra
     4Likes 0Dislikes 0Informatives
  • Reply 27 of 28
    davendaven Posts: 768member
    tmay said:
    daven said:
    tmay said:
    daven said:
    I can see how having dedicated silicon for matrix operations will benefit photography and video production. Multiple cameras give you different samples of points in space. If two cameras read the same point differently, you have to choose which sample is correct, average the two data values, or have some algorithm determine an intermediate value for that point. With three cameras, if two of them agree, the true value is likely the value the two cameras agreed on. However, when you have multiple cameras you also have different viewpoints and have to calculate how the points correlate. When you do that you use matrices with sine and cosine values, and almost all of the time you don't have a direct correlation. A point in one camera almost always corresponds to a point between pixel positions in the second camera, so you have to sample the surrounding points and calculate what the value of the corresponding point in the second image would be if it were sampled. That can get computationally expensive, and having dedicated silicon may make it practical. Having three cameras adds to the complexity, but also adds to the amount of data you have to ensure the pixel value is correct.

    It really is amazing how far digital photography has come in twenty years.
    It might be the case that Apple builds an internal model of the three lens configuration, specific to an individual iPhone, processes it with machine learning, and that becomes the basis of subsequent calculation, until an optimization level is reached, perhaps giving the user realtime response.

    But, yeah, it is amazing.
    Yes. The phone can start out with a basic general calibration and fine-tune it with machine learning to get it even better. Because of manufacturing variations (the lenses will not be mounted with the same pixel precision between any two cameras, so there will always be differences) and because the world is 3D, you will have differences in close-up photos. Right now, many phones take multiple photos, often at slightly different exposures, align them (they need to be aligned because of camera shake), and average the pixels. That works OK for still photos, but if you have a fast-moving subject or are making a video, it isn't a great technique. Now consider three simultaneous photos: you can do pixel averaging for video along with the standard camera manufacturer calibration. Of course, I'm speculating on all of this; I have no inside information. From what I understand, the camera setup also allows for wider field-of-view photos, so you can fix mis-framed photos.

    That said, I upgraded my iPhone 5s with an Xs Max so I won't be buying a new phone for a while.
    I'm thinking that Apple starts with a factory calibration, and then the iPhone, over time, applies machine learning to the various real-world configurations of focus amongst the three lenses. The iPhone should be able to compute the image with the least amount of overhead using that process. For still images, it might be best to just do the full computation, and maybe that won't add too much latency to the process. Video is much more forgiving, so it may be a 24, 30, or 60 fps process. We'll just have to wait.

    I'm speculating as well.

    Depending on whether the new iPad Pro offers the triple lens, I'll be buying the triple lens iPhone, but I'm fine with using an iPad for a camera for some things anyway.
    Ha! My situation is flipped. As I mentioned, I have this year's iPhone, but my latest iPad is the first-generation iPad mini and it is showing its age, so maybe Santa will bring me an updated iPad?
    watto_cobra
     1Like 0Dislikes 0Informatives
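The burst-averaging technique described in the quoted discussion — take several frames, align them to cancel camera shake, then average pixels to cut noise — can be sketched like this. A minimal illustrative sketch with integer shifts and made-up names; a real pipeline would do sub-pixel alignment and exposure weighting:

```python
import numpy as np

def average_aligned(frames, offsets):
    """Average a burst of frames after undoing each frame's estimated
    (dx, dy) camera-shake offset. np.roll wraps at the edges; a real
    implementation would crop the borders instead."""
    h, w = frames[0].shape
    acc = np.zeros((h, w))
    for frame, (dx, dy) in zip(frames, offsets):
        aligned = np.roll(np.roll(frame, -dy, axis=0), -dx, axis=1)
        acc += aligned
    return acc / len(frames)
```

With three simultaneous cameras, the same averaging step could run on a single capture instant instead of a time series, which is why it would also work for video and moving subjects.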
  • Reply 28 of 28
    Here I thought the bump was ugly as hell, then they gave us the notch. Now they're taking it to the next level: the hideous-looking multi-cam. I'm just watching Apple from the sidelines these days; my last remaining Apple product is my SE. Good thing Apple doesn't need my annual 3k spend.

    I can't see an SE-sized phone coming; sad days ahead when it dies. There seem to be no more small phones on the market.
    I don't find the bump that ugly. What I find ugly is how, on these mockup renders, the Apple logo is positioned so close to it. It’s not balanced at all. If the logo were gone or placed further away from the bump, I wouldn’t mind so much. 

    The camera is one of the main reasons I upgrade - the processor of the X that I have now is fast enough - but it completely replaced any full-frame professional camera that I used before, because I always carry the phone. I would love to be able to shoot with a larger sensor, especially for video, even if that mode is done computationally. 
    Which I think will be Apple’s secret announcement: video portrait mode and enhanced portraits overall. Better portrait detection (photos can turn out ugly now) thanks to the third lens and the new chip helping out. Better low light (like some Android phones already offer).
    I also hope the dynamic range can be improved, plus support for BRAW or ProRes RAW, even if that means the ability to do so with third-party software. 

    muthuk_vanalingamwatto_cobra
     2Likes 0Dislikes 0Informatives