Looking at Apple's new camera system on the iPhone XS and iPhone XS Max
The iPhone XS and iPhone XS Max have an externally similar camera assembly to the iPhone X, but behind the lenses is a new system. AppleInsider delves into the new camera and what it will do for you.

In terms of camera specifications, Apple's latest flagship iPhones are very much like past models, but are at the same time very different. Calling it a "new dual-camera system," Apple carries over the twin 12-megapixel shooter layout introduced with iPhone X -- a wide-angle lens and a 2X telephoto lens stacked one atop the other.
In fact, looking at the raw specs, the iPhone XS and the iPhone X appear to have largely unchanged cameras. They share identical megapixel counts, F numbers, and other key aspects, but that only paints part of the picture. Behind those lenses are larger sensors, a faster processor and an improved image signal processor (ISP). Those, combined with several other new features, make these the best cameras to ever grace an iPhone.

The main wide-angle camera has a new, larger sensor. The module boasts a 1.4-micrometer pixel pitch, up from the 1.22-micrometer pixel pitch found on the iPhone X. That roughly 15-percent increase in pixel pitch works out to about 32 percent more light-gathering area per pixel, which should greatly help with light sensitivity -- and indeed Apple SVP of Worldwide Marketing Phil Schiller said as much onstage on Wednesday.
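To put numbers on that -- light capture scales with pixel area, not pixel width, so the modest pitch bump compounds -- here is a quick back-of-the-envelope calculation in Swift, purely illustrative:

```swift
import Foundation

// Back-of-the-envelope math on the sensor change. Light capture scales
// with pixel area, so a small pitch increase compounds.
let oldPitch = 1.22  // iPhone X pixel pitch, in micrometers
let newPitch = 1.40  // iPhone XS pixel pitch, in micrometers

let linearGain = newPitch / oldPitch     // ~1.15x wider pixels
let areaGain = linearGain * linearGain   // ~1.32x the light per pixel

print(String(format: "pitch: +%.0f%%, area: +%.0f%%",
             (linearGain - 1) * 100, (areaGain - 1) * 100))
// pitch: +15%, area: +32%
```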

Apple has also tightly integrated the ISP with the newly upgraded Neural Engine found in its A12 Bionic processor. The A12 Bionic is an extremely powerful chip that incorporates a number of upgrades over its A11 Bionic predecessor, including faster and more efficient processing cores and a beefed-up GPU. More importantly, the now-8-core Neural Engine plays a larger role in capturing and processing images.
For example, the Neural Engine assists with facial recognition, facial landmarks and image segmentation. Image segmentation helps separate the subject from the background and is likely what is being used to simulate Portrait Mode photos on the iPhone XR, but with a single-lens setup.
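Apple has not published how its segmentation model works, but the general shape of on-device subject masking is visible in the public Vision framework. The sketch below uses VNGeneratePersonSegmentationRequest -- an API Apple shipped in a later iOS release -- purely to illustrate the kind of ML task the Neural Engine accelerates:

```swift
import Vision
import CoreGraphics
import CoreVideo

// Illustrative only: ask Vision for a person-segmentation matte, the same
// class of ML work the Neural Engine accelerates during Portrait capture.
func subjectMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced   // trade accuracy for speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The result is a grayscale matte: white = subject, black = background.
    return request.results?.first?.pixelBuffer
}
```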
The additional speed allows more information to be captured, which has enabled Apple to build in a new Smart HDR feature. High Dynamic Range photos have been available on iPhone for some time, but Smart HDR is more accurate and snappy thanks to the faster processor capturing even more data.
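Smart HDR itself is automatic and proprietary, but the underlying idea -- capturing several differently exposed frames and merging them -- can be approximated with AVFoundation's public bracketed-capture API. A minimal sketch, with illustrative bias values:

```swift
import AVFoundation

// Sketch: manually request an exposure bracket (under, normal, over), the
// raw material a multi-frame HDR merge works from. Values are illustrative.
func captureExposureBracket(from output: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-2.0, 0.0, 2.0]   // in EV stops
    let bracket = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,   // 0 = no RAW, processed output only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracket)

    output.capturePhoto(with: settings, delegate: delegate)
}
```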

Portrait Mode has quickly become one of the most popular features on iPhones, and it gets even better with the iPhone XS. A new depth control feature gives users much finer command over bokeh.
After shooting a Portrait Mode photo, users can edit the image and adjust the simulated aperture, increasing or decreasing the amount of background blur. The lower the on-screen F value, the wider the "aperture," presenting more blur and larger bokeh in the background.
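Apple hasn't detailed its rendering pipeline, but conceptually the slider re-renders blur from stored depth data rather than re-exposing anything. A simplified Core Image sketch of the idea -- the f-number-to-radius mapping here is a made-up placeholder, not Apple's model:

```swift
import CoreImage

// Conceptual sketch: blur the background through a depth-derived mask, with
// the radius driven by the chosen simulated f-number. Lower f-number =
// "wider aperture" = bigger blur. The mapping is purely illustrative.
func simulateAperture(image: CIImage,
                      backgroundMask: CIImage,   // white where blur applies
                      fNumber: Double) -> CIImage? {
    let radius = 20.0 * (1.4 / fNumber)          // placeholder mapping

    let blur = CIFilter(name: "CIMaskedVariableBlur")!
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(backgroundMask, forKey: "inputMask")
    blur.setValue(radius, forKey: kCIInputRadiusKey)
    return blur.outputImage
}
```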
Lastly, a new and improved True Tone flash helps brighten low-light photos.
Turning to video, there are fewer, but still important, improvements. When shooting at 30 frames per second rather than 60, the dynamic range of the captured video is extended. This lends itself to more detailed clips, which is especially noticeable on the HDR10-enabled displays. As demoed in Apple's keynote, low-light performance is also significantly improved, with noticeably less grain and noise.
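On the API side, extended-dynamic-range capture is exposed through AVFoundation's video HDR controls. A minimal sketch of opting in at 30 fps, assuming the device's active format supports it:

```swift
import AVFoundation

// Sketch: lock the camera to 30 fps and opt into video HDR when the active
// format supports it. Setting isVideoHDREnabled requires disabling the
// automatic adjustment first.
func enableVideoHDR(on device: AVCaptureDevice) throws {
    guard device.activeFormat.isVideoHDRSupported else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let thirtyFPS = CMTime(value: 1, timescale: 30)
    device.activeVideoMinFrameDuration = thirtyFPS
    device.activeVideoMaxFrameDuration = thirtyFPS

    device.automaticallyAdjustsVideoHDREnabled = false
    device.isVideoHDREnabled = true
}
```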
Cameras are always important to iPhones, and never more so than during the S-model years, when the phones tend to lack any large physical changes to the design. AppleInsider will be going hands-on with the latest round of iPhones soon to truly put the cameras to the test.

Comments
This is just simulating that by using depth mapping to dynamically change the blur of the background with post-processing, at least as I understand it. Portrait Mode has done this for a while (simulating bokeh), but this seems both better and more customizable.
This is not the same as looking through a camera viewfinder, changing the f/stop (closing down/opening up the lens) with the same shutter speed and noticing the change in exposure for the camera before a picture is taken.
- Apple’s Bokeh control simulates what it would look like if the f/stop had been changed with the shutter speed and light exposure being constant.
This is done with AI, where one variable can be changed while the others stay constant.
In photography, bokeh is the aesthetic quality of the blur produced in the out-of-focus parts of an image produced by a lens. Bokeh has been defined as "the way the lens renders out-of-focus points of light". Differences in lens aberrations and aperture shape cause some lens designs to blur the image in a way that is pleasing to the eye, while others produce blurring that is unpleasant or distracting ("good" and "bad" bokeh, respectively).
Source: Wikipedia
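To make the commenter's exposure point concrete: on a real lens, stopping down costs light, while the simulated slider only re-renders blur. A quick calculation with illustrative f-numbers:

```swift
import Foundation

// On a real lens, light gathered scales with aperture area, i.e. (1/N)^2,
// so changing the f-number changes exposure. The simulated slider only
// re-renders the blur; exposure is untouched.
func stopsLost(from f1: Double, to f2: Double) -> Double {
    return 2 * log2(f2 / f1)
}

print(stopsLost(from: 1.8, to: 4.5))   // ≈ 2.6 stops darker on a real lens
```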
I've got a few questions in my mind about the improvements to the camera system. Based on what Phil said during the presentation, it looks like the iPhone potentially stores 6-12 shots for each photo taken. That would add to the size of each photo.
What happens to these additional shots when you transfer the photos to the Mac? Will Photos on the Mac also be updated to allow for the new options, like setting the DoF for shots?
What happens when I choose to export the unmodified originals from Photos on the Mac? Will it export each of the shots as a separate photo?