'iPhone XI' and 'iPhone XI Max' case manufacturing dummies pop up on Chinese social media

Posted in iPhone, edited April 20
A pair of images of an "iPhone XI" dummy for manufacturing purposes purports to show accurate dimensions of the 2019 iPhone lineup, including a square camera extrusion.




The images appear to be 3D prints or milled units from a CAD file. Those discussing the dummies suspect them to be iPhone "blanks" matching the dimensions of a future iPhone, used by third parties to engineer protective cases.

Little can be gleaned from the blanks that hasn't already been rumored. The camera extrusion is square, with three areas where a camera lens would be located. A fourth, smaller cutout in the extrusion suggests where the flash may end up on the final unit.

The second image shows that Apple may be planning to retain the notch. The notch shows four sensor penetrations, and a speaker hole.




The provenance of the images isn't clear. They may in fact be dummies generated from leaked specifications, in much the same way that accurate enclosure dummies were available for the iPhone X and iPhone XS families in late April of 2017 and 2018, respectively. Notably, at the corresponding times, the names for the products were not accurate. However, they may also be pure speculation based on previous rumors.

Previous predictions about the 2019 iPhone lineup suggest that the expected 6.5-inch OLED and 5.8-inch OLED models will move to triple rear cameras, while the 6.1-inch LCD model will gain a dual-camera array. More specifically, a Sony-provided super-wide camera is expected to be added. A new black coating will reportedly make the camera "inconspicuous," but what precisely that entails is not presently known.

Ming-Chi Kuo has also speculated that the 2019 iPhone lineup will retain a Lightning connector rather than adopt USB-C, as the iPad Pro range has. iPhones are also expected to keep Apple's TrueDepth camera and an associated display notch. All or part of the lineup is slated to get UWB (ultra-wide band) for indoor positioning and navigation, a frosted glass casing, and larger batteries. One interesting addition is so-called "bilateral" wireless charging, which would allow the phone to charge other devices wirelessly, acting as a charging pad of sorts.

TrueDepth may see an update with a higher-power flood illuminator for better Face ID recognition, Kuo said, while a new 6.1-inch LCD model might be upgraded to incorporate 4GB of RAM, up from the current 3GB in the iPhone XR.

The Slashleaks post on Saturday was sourced from social media venue Weibo.

Comments

  • Reply 1 of 82
avon b7 Posts: 4,216 member
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
  • Reply 2 of 82
StrangeDays Posts: 8,588 member
    Brace yourselves...Apple critics are activating their “concern” mode
  • Reply 3 of 82
avon b7 Posts: 4,216 member
    Brace yourselves...Apple critics are activating their “concern” mode
    Erm no. If there is a mode, it is 'opinion mode'.

    What's yours?

Let me guess. If it had a Huawei logo on the back you would be heaving into a bucket, but as it has an Apple logo on the back you love it!

    Did I guess right?

    As I implied, I'll wait to see if they manage to hide it somehow.


  • Reply 4 of 82
tmay Posts: 3,959 member
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
edited April 20
  • Reply 5 of 82
avon b7 Posts: 4,216 member
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    edited April 20
  • Reply 6 of 82
holyone Posts: 391 member
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
My question is why Jony doesn't just elongate the current setup to fit three cameras, then turn the round flash into a long strip and place it outside the bump, elegantly parallel. I just can't see this square thing looking good even if it were completely flush. I think there's a Huawei with a three-camera square setup that's quite objectionable.
  • Reply 7 of 82
tmay Posts: 3,959 member
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process. I would also guess that they are aligned in an equilateral triangle with high precision on the spacing. It's even possible that Apple will align the sensor surfaces in the same plane, but I think that is technically impossible with three different focal lengths and variation in the sensor package. If they could, the computations would be slightly easier, but I'm guessing that isn't even a problem.

The real question will be how the sensors are oriented, in width and height, relative to each other. That I will not speculate on, but Apple certainly will consider it.
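The depth-from-disparity reasoning above can be sketched numerically. For a pinhole camera with focal length f (in pixels), stereo disparity between two lenses is roughly f·B/Z along the baseline direction, so a layout only measures depth along an axis where it has a baseline component. A minimal Python sketch with invented numbers (none of these are Apple's actual specs):

```python
# Sketch: why a triangular lens layout measures depth in both axes.
# Disparity between two cameras separated by baseline B (metres),
# viewing a point at depth Z (metres), is about f * B / Z pixels.
# All figures below are illustrative, not real device parameters.

def disparity(f, baseline, depth):
    """Stereo disparity in pixels for a given baseline and depth."""
    return f * baseline / depth

f = 3000.0    # hypothetical focal length in pixels
depth = 2.0   # subject distance in metres

# Three lenses in a vertical line: no horizontal baseline at all.
inline = [(0.0, 0.0), (0.0, 0.012), (0.0, 0.024)]
# Triangular layout: one lens offset horizontally as well.
triangle = [(0.0, 0.0), (0.0, 0.012), (0.010, 0.006)]

for name, layout in [("in-line", inline), ("triangle", triangle)]:
    bx = max(p[0] for p in layout) - min(p[0] for p in layout)
    by = max(p[1] for p in layout) - min(p[1] for p in layout)
    print(name,
          "horizontal disparity:", disparity(f, bx, depth),
          "vertical disparity:", disparity(f, by, depth))
```

With the in-line layout the horizontal baseline is zero, so there is no horizontal disparity to triangulate from; the triangular layout yields disparity in both axes.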



edited April 20
  • Reply 8 of 82
holyone Posts: 391 member
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process.


Cool, but if that is true, wouldn't 4 sensors be more useful in that regard?
  • Reply 9 of 82
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process. I would also guess that they are aligned in an equilateral triangle with high precision on the spacing. It's even possible that Apple will align the sensor surfaces in the same plane, but I think that is technically impossible with three different focal lengths and variation in the sensor package. If they could, the computations would be slightly easier, but I'm guessing that isn't even a problem.

The real question will be how the sensors are oriented, in width and height, relative to each other. That I will not speculate on, but Apple certainly will consider it.



Then, it would have been even better if one of the cameras was on the other top corner of the phone. It would have provided better parallax for depth/spatial detection/information, plus it would have allowed them to do away with the hideous square cam area.
    edited April 20
  • Reply 10 of 82
tmay Posts: 3,959 member
    holyone said:
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process.


Cool, but if that is true, wouldn't 4 sensors be more useful in that regard?
Yep, but each added sensor gives you diminishing returns, i.e., less bang for the buck, increasing the device cost and reducing sales volume. That doesn't mean that I don't agree with you, just that Apple seems to be on a roadmap that makes sense for them by moving from two to three imagers for two of their upcoming devices.

    I'm on record as being an early buyer of the Max version of this when they arrive.
  • Reply 11 of 82
mdriftmeyer Posts: 7,304 member
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
Take an Engineering Physics course on optics, then come back and comment. You'll realize the reason for this array once you understand the laws of optics. They are attempting to create a DSLR-capable lens array inside a phone.
edited April 20
  • Reply 12 of 82
tmay Posts: 3,959 member
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process. I would also guess that they are aligned in an equilateral triangle with high precision on the spacing. It's even possible that Apple will align the sensor surfaces in the same plane, but I think that is technically impossible with three different focal lengths and variation in the sensor package. If they could, the computations would be slightly easier, but I'm guessing that isn't even a problem.

The real question will be how the sensors are oriented, in width and height, relative to each other. That I will not speculate on, but Apple certainly will consider it.



Then, it would have been even better if one of the cameras was on the other top corner of the phone. It would have provided better parallax for depth/spatial detection/information, plus it would have allowed them to do away with the hideous square cam area.
Only if you are primarily interested in stereo photography; otherwise, the parallax would rule out any closeup use. My point is that having them closely spaced in a line isn't as beneficial as having them closely spaced in a triangular configuration, from a computational photography standpoint.
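The closeup trade-off mentioned here is simple frustum geometry: two parallel cameras with horizontal field of view θ and baseline B only begin to overlap at distance B / (2·tan(θ/2)), so spreading lenses to opposite corners pushes the nearest distance at which stereo matching is possible outward. A rough sketch, using a hypothetical 65-degree field of view and made-up baselines:

```python
import math

def min_overlap_distance(baseline_m, hfov_deg):
    """Distance at which two parallel cameras' fields of view begin
    to overlap: each view's half-width grows as d * tan(hfov/2), so
    overlap starts once 2 * d * tan(hfov/2) >= baseline."""
    return baseline_m / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))

# Closely grouped lenses (~12 mm apart) vs. corner-to-corner (~70 mm):
for b in (0.012, 0.070):
    print(f"baseline {b * 1000:.0f} mm -> overlap beyond "
          f"{min_overlap_distance(b, 65.0) * 100:.1f} cm")
```

This is only where overlap begins; near-complete frame overlap, needed for depth across most of the image, requires several times that distance, which is why tightly grouped lenses suit closeups better.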
  • Reply 13 of 82
holyone Posts: 391 member
    tmay said:
    holyone said:
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process.


Cool, but if that is true, wouldn't 4 sensors be more useful in that regard?
Yep, but each added sensor gives you diminishing returns, i.e., less bang for the buck, increasing the device cost and reducing sales volume. That doesn't mean that I don't agree with you, just that Apple seems to be on a roadmap that makes sense for them by moving from two to three imagers for two of their upcoming devices.

    I'm on record as being an early buyer of the Max version of this when they arrive.
I see. But personally, don't you think it would be worth the cost to Apple, just to justify or compensate for the aesthetics? Most people won't appreciate the technicalities; all they'll see is an affront at the back of their iPhones. Four cameras just seem like a simpler idea to sell.
  • Reply 14 of 82
    tmay said:
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process. I would also guess that they are aligned in an equilateral triangle with high precision on the spacing. It's even possible that Apple will align the sensor surfaces in the same plane, but I think that is technically impossible with three different focal lengths and variation in the sensor package. If they could, the computations would be slightly easier, but I'm guessing that isn't even a problem.

The real question will be how the sensors are oriented, in width and height, relative to each other. That I will not speculate on, but Apple certainly will consider it.



Then, it would have been even better if one of the cameras was on the other top corner of the phone. It would have provided better parallax for depth/spatial detection/information, plus it would have allowed them to do away with the hideous square cam area.
Only if you are primarily interested in stereo photography; otherwise, the parallax would rule out any closeup use. My point is that having them closely spaced in a line isn't as beneficial as having them closely spaced in a triangular configuration, from a computational photography standpoint.
I don't see why it would be useless for closeups... especially with interpolation algorithms. I think it would offer even better precision in stereo/3D depth detection.
  • Reply 15 of 82
tmay Posts: 3,959 member
    holyone said:
    tmay said:
    holyone said:
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, ie, range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process.


Cool, but if that is true, wouldn't 4 sensors be more useful in that regard?
Yep, but each added sensor gives you diminishing returns, i.e., less bang for the buck, increasing the device cost and reducing sales volume. That doesn't mean that I don't agree with you, just that Apple seems to be on a roadmap that makes sense for them by moving from two to three imagers for two of their upcoming devices.

    I'm on record as being an early buyer of the Max version of this when they arrive.
I see. But personally, don't you think it would be worth the cost to Apple, just to justify or compensate for the aesthetics? Most people won't appreciate the technicalities; all they'll see is an affront at the back of their iPhones. Four cameras just seem like a simpler idea to sell.
Most people didn't initially appreciate the aesthetics of AirPods, the notch, or the removal of the headphone jack, but Apple actually came out better than fine on those. Perhaps best to wait and see the result in the flesh.
  • Reply 16 of 82
holyone Posts: 391 member
    tmay said:
    holyone said:
    tmay said:
    holyone said:
    tmay said:
    avon b7 said:
    tmay said:
    avon b7 said:
    At first, and depending on the render, I wasn't sure if I liked the camera placement setup.

    Over time I've reached the conclusion that something seems wrong. It seems lopsided.

    There was talk of making them less visible in the final product. I hope that's the case.
    You always make me laugh!

Seriously, Apple's configuration is likely not "cosmetic", but developed around an optimum placement of the three imaging sensors, for images, video and AR. We won't know until after it is delivered, but I'd bet that configuration will soon be copied by many of the Android OS device makers.

    Over time, I've come to the realization that you are really shallow.

    History would tell you that Apple doesn't spend a lot of time hiding function.
    Is that optimum in the sense of the hockey puck mouse?

    Please explain why this placement is more 'optimum' than the Mate 20 Pro placement? Even if it is only 'likely'.

    Or why not forget 'optimum' altogether and give your opinion on the cosmetic angle. You know, just in case 'optimum' doesn't eventually factor into anything.

    By the way, it should be clear that I am referring to the camera grouping and not the distribution within the grouping!
    Well, since you asked...

    Having 3 lenses in a line isn't going to give you much spatial information in the axis perpendicular to that line.

    Apple has one sensor that is off axis that will give very good spatial information. Actually, the primary imager could be any of the three, without issue.

    This would be the preferred configuration for obtaining depth information, i.e., range finding, even if you have a TOF sensor.

    The real question will be how well it enables computation of up to three overlapping images or videos.

    Seems pretty obvious. 

    I'd argue that three in a row is easier to package.

    My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process.


    Cool, but if that is true, wouldn't 4 sensors be more useful in that regard?
    Yep, but each added sensor gives diminishing returns, i.e., less bang for the buck, while increasing the device cost and reducing sales volume. That doesn't mean that I don't agree with you, just that Apple seems to be on a roadmap that makes sense for them, moving from two to three imagers for two of their upcoming devices. 

    I'm on record as being an early buyer of the Max version of this when they arrive.
    I see. But personally, don't you think it would be worth the cost to Apple, just to justify or compensate for the aesthetics? Most people won't appreciate the technicalities; all they'll see is an affront on the back of their iPhones. Four cameras just seem like a simpler idea to sell.
    Most people didn't initially appreciate the aesthetics of AirPods, the notch, or the removal of the headphone jack, but Apple actually came out better than fine on those. Perhaps best to wait and see the result in the flesh.
    Good point, let's see what Jony's got for us.
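[Editor's note: the parallax argument in the quoted exchange follows from standard pinhole stereo geometry. This is a generic textbook sketch, not anything sourced from Apple; the focal length, baseline, and disparity figures below are made up for illustration.]

```python
# Stereo depth from parallax: two cameras separated by baseline B (meters),
# with focal length f (pixels), see the same feature shifted by d pixels
# (the disparity). The feature's depth is Z = f * B / d. A baseline along
# only one axis yields disparity only for offsets along that axis, which
# is why an off-axis (triangular) third camera adds spatial information.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole-camera stereo depth estimate in meters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Two imagers 12 mm apart, ~2800 px focal length, 8 px of disparity:
print(depth_from_disparity(2800, 0.012, 8))  # prints 4.2 (meters)
```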
  • Reply 17 of 82
    tmay Posts: 3,959member
    holyone said:
    Good point, let's see what Jony's got for us.
    So here's the question that I would want to have answered.

    Three imagers, all aligned more or less with the width/length orientation of the iPhone, or two in that orientation and one rotated 90 degrees so that people can shoot wide format while still holding the iPhone vertically. Not sure of the positives/negatives of computational photography with that configuration. I'm sure that Apple has considered this.
  • Reply 18 of 82
    Stories like this one make me realize that there should be a filter that lets me hide stories below a certain threshold of likely truthfulness/accuracy. The editors should rate each story on a scale of likelihood. This story is probably low on that scale, perhaps 50%, so if my setting were 75%, I wouldn't even see it in the list of stories. I suspect many people would want their filter set to 100%, such as people who like product reviews but don't want rumors of upcoming products. Doing this would give AppleInsider a good reason to use a browser cookie; I think this is what cookies were invented for.
  • Reply 19 of 82
    tmay Posts: 3,959member

    tmay said:
    My guess is that Apple will be very particular about the alignment of those three imaging modules in the manufacturing process. I would also guess that they are aligned in an equilateral triangle with high precision on the spacing. It's even possible that Apple will align the sensor surfaces in the same plane, but I think that is technically impossible with three different focal lengths and variation in the sensor package. If they could, the computations would be slightly easier, but I'm guessing that isn't even a problem.

    The real question will be how the sensors' width and height axes are oriented relative to each other. That I will not speculate on, but Apple certainly will consider it.



    Then it would have been even better if one of the cameras were on the other top corner of the phone. It would have provided better parallax for depth detection/information, plus it would have allowed them to do away with the hideous square camera area.
    That would help if you are primarily interested in stereo photography, but otherwise the parallax would rule out any close-up use. My point is that having them closely spaced in a line isn't as beneficial as having them closely spaced in a triangular configuration, from a computational photography standpoint.
    I don't see why it would be useless for close-ups, especially with interpolation algorithms. I think it would offer even better precision in stereo/3D depth detection.
    It really depends on the fields of view of each of the imagers, but macro close-ups would almost certainly be more difficult. There would also be an issue with the difference in imager spacing between the two models, which would complicate manufacture.
    edited April 20 radarthekat
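[Editor's note: the close-up caveat in the exchange above can likewise be sketched with basic frustum geometry. Again, this is generic optics, not sourced from the thread or from Apple; the baselines and field of view below are invented for illustration.]

```python
import math

# For two parallel cameras with horizontal field of view fov_deg, the two
# views only begin to overlap at depth z = baseline / (2 * tan(fov/2)).
# A wide baseline (e.g., lenses at opposite corners of the phone) pushes
# that minimum overlap distance out, making close-up stereo impractical.

def min_overlap_depth(baseline_m: float, fov_deg: float) -> float:
    """Depth (meters) at which both cameras' view frustums start to overlap."""
    return baseline_m / (2 * math.tan(math.radians(fov_deg) / 2))

# Tightly clustered lenses, 15 mm apart, 65-degree field of view:
print(round(min_overlap_depth(0.015, 65), 4))  # ~0.0118 m
# Corner-to-corner, 70 mm apart: overlap starts several times farther out
print(round(min_overlap_depth(0.070, 65), 4))  # ~0.0549 m
```

The wider the baseline, the better the depth resolution at range but the farther out the nearest usable subject, which is the trade-off being argued.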
  • Reply 20 of 82
    chasm Posts: 1,673member
    Noting that the back plates don't have the Apple logo in the same place, and that someone has (badly) added the word iPhone in different fonts and sizes on the back, I'm a bit skeptical. But of course you never know.
    curtis hannah, cornchip, caladanian