New Apple video ad shows the effects of Portrait Lighting mode in the iPhone 8 Plus

Apple is promoting the photographic capabilities of the iPhone 8 Plus in a new advertisement released on Saturday morning, with the ad highlighting the handset's new Portrait Lighting mode by showing the effects users can produce with the feature.




Titled "Portraits of Her," the new ad on Apple's YouTube channel shows singer Shannon Wise of the group The Shacks as she walks along a sidewalk. As the commercial goes on, the lighting of both Wise and the background changes, matching the different lighting effects available within Portrait mode, before ending with another person holding an iPhone 8 Plus to take Wise's picture.

As well as appearing as the ad's main subject, Wise's band The Shacks recorded the track "This Strange Effect" used in the advertisement.





Introduced with the iPhone 8 Plus, Portrait Lighting is a mode that extends Portrait mode's depth-of-field effect with a number of lighting styles. Portrait Lighting can light the subject in five ways: Natural Light, Studio Light, and Contour Light largely affect the subject, while Stage Light and Stage Light Mono illuminate just the subject while darkening the background.

Apple performed extensive research into portraiture for the feature, studying a number of well-known painters and photographers in the process. Images from the dual cameras on the back of the iPhone 8 Plus are used by the Apple-designed image signal processor to create a depth map of the scene and separate the subject from the background, before machine learning analyzes the face's features to apply the appropriate lighting.
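The core idea of the pipeline described above — use a depth map to mask out the subject, then relight subject and background differently — can be illustrated with a minimal sketch. This is not Apple's implementation or API; the function name `stage_light_mono`, the depth threshold, and the darkening factor are all hypothetical, and the example only approximates what a Stage Light Mono-style effect does with a depth map:

```python
import numpy as np

def stage_light_mono(image, depth, threshold=0.5):
    """Crude Stage Light Mono-style sketch: keep pixels the depth map
    marks as near (the subject), darken the rest, convert to mono.
    `image` is HxWx3 float in [0, 1]; `depth` is HxW in [0, 1],
    where smaller values mean closer to the camera."""
    subject = depth < threshold                # foreground mask from depth
    gray = image.mean(axis=2, keepdims=True)   # mono conversion
    out = np.repeat(gray, 3, axis=2)
    out[~subject] *= 0.05                      # darken the background
    return out

# Toy example: a 4x4 frame whose left half is "near" the camera.
img = np.full((4, 4, 3), 0.8)
depth = np.ones((4, 4))
depth[:, :2] = 0.2                             # left columns are the subject
result = stage_light_mono(img, depth)
```

In the real feature the mask comes from stereo disparity between the two rear lenses rather than a hand-made array, and the relighting is far more sophisticated than a uniform darkening, but the separation step works on the same principle.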

Portrait Lighting will also be usable on the iPhone X, and because that device's TrueDepth camera performs 3D facial mapping for Face ID, the feature will work with the front-facing camera as well as the rear cameras.

Comments

  • Reply 1 of 29
    Soli Posts: 8,802 member
    I'm not a big smartphone camera user, but the advancements in this year's camera HW and SW keep making me rethink holding onto my iPhone 7 Plus for another year.
  • Reply 2 of 29
    Hmmm.  One would be forgiven for thinking Portrait Lighting is a video mode of the 8+ camera.  30 sec video. 21 sec showing PL "used" in moving video.  4 sec showing PL actual use.  Seems legit. :D
  • Reply 3 of 29
    Hmmm.  One would be forgiven for thinking Portrait Lighting is a video mode of the 8+ camera.  30 sec video. 21 sec showing PL "used" in moving video.  4 sec showing PL actual use.  Seems legit. :D
    It is almost certain that Apple has this kind of real time video effects in the pipeline and NOBODY has the ability to follow them if they ever go there (which I expect them to).
  • Reply 4 of 29
    foggyhill said:
    Hmmm.  One would be forgiven for thinking Portrait Lighting is a video mode of the 8+ camera.  30 sec video. 21 sec showing PL "used" in moving video.  4 sec showing PL actual use.  Seems legit. :D
    It is almost certain that Apple has this kind of real time video effects in the pipeline and NOBODY has the ability to follow them if they ever go there (which I expect them to).
    No doubt.
  • Reply 5 of 29
    foggyhill Posts: 4,767 member

    They don't even need to do the rendering in real time, really; they only need to be able to save the depth map in real time for every frame.

    Then you could do some fantastic things, either in real time or in post, that take a hell of a long time to do right now.

    Is there a standard right now that even exists for saving 3D scanning info (or just sensor info) as a stream synced to the camera (which is really just another sensor)? If not, I think there should be.
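    A depth stream synced to video, as described in the post above, could be modeled as nothing more than a list of timestamped depth maps with nearest-timestamp lookup. This is a hypothetical sketch — `DepthTrack` and its `nearest` method are invented names, not any existing standard or Apple API:

    ```python
    import numpy as np

    class DepthTrack:
        """Per-frame depth maps stored as just another timed sensor
        stream, so video could be relit or masked in post."""
        def __init__(self):
            self.samples = []  # list of (timestamp_s, depth_map)

        def append(self, timestamp_s, depth_map):
            self.samples.append((timestamp_s, depth_map))

        def nearest(self, timestamp_s):
            # Sync-by-nearest: return the depth map whose timestamp
            # is closest to the requested video frame time.
            return min(self.samples, key=lambda s: abs(s[0] - timestamp_s))[1]

    track = DepthTrack()
    for i in range(3):                        # 3 frames at 30 fps
        track.append(i / 30, np.full((2, 2), float(i)))

    synced = track.nearest(0.034)             # closest to frame 1
    ```

    A real container would interleave the depth samples with the video track and compress them, but the synchronization problem reduces to this kind of timestamp matching.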

  • Reply 6 of 29
    foggyhill said:
    Hmmm.  One would be forgiven for thinking Portrait Lighting is a video mode of the 8+ camera.  30 sec video. 21 sec showing PL "used" in moving video.  4 sec showing PL actual use.  Seems legit. :D
    It is almost certain that Apple has this kind of real time video effects in the pipeline and NOBODY has the ability to follow them if they ever go there (which I expect them to).
    So tell me if I have this right. You find it perfectly acceptable to create fictitious "secret" Apple technology as a justification for them producing an ad that is really misleading. The special effects displayed in that ad are bog-standard effects that can be achieved easily in post using off-the-shelf editing software. There was nothing real time about that ad. Anyone even remotely familiar with film will tell you this. Real-time video effects in the pipeline. Tee hee. You literally just made that up.
  • Reply 7 of 29
    foggyhill Posts: 4,767 member
    foggyhill said:
    Hmmm.  One would be forgiven for thinking Portrait Lighting is a video mode of the 8+ camera.  30 sec video. 21 sec showing PL "used" in moving video.  4 sec showing PL actual use.  Seems legit. :D
    It is almost certain that Apple has this kind of real time video effects in the pipeline and NOBODY has the ability to follow them if they ever go there (which I expect them to).
    So tell me if I have this right.  You find it perfectly acceptable to create fictitious "secret" Apple technology as a justification for them producing an ad that is really misleading.  The special effects displayed in that ad are bog standard effects that can be achieved easily in post using off the shelf editing software.  There was nothing real time about that ad.  Anyone even remotely familiar with film will tell you this.  Real time video effects in the pipeline.  tee hee.  You literally just made up something.
    Let's see, a 5-post person who takes a dump on Apple and my head... I'm surely going to take your "words" with grace... Or not.

    Tee hee yourself, you big-mouthed condescending sack of liquefied crap.

    The phone already saves the depth map for the photo, and it continuously tracks the depth map when creating Animojis and doing facial recognition on the X.

    Going from there to saving this depth map in parallel with the video stream is in fact not out there AT ALL, unless you're some slow-witted lunk. Are you?

    They're already capturing and processing the video streams and creating depth maps in those applications on the X.

    So talking about such a thing being in the pipeline is not some imaginary fantasy in some far-off future; they likely could do it with the tech they've got on the X if the processing is done in post (though they may wait until the next version if rendering the video/depth map processing in real time needs a lot of power).

    Being able to just manipulate the damn video with a slider in this way as it plays is something that would be exceptional, and yes, only Apple, which can design and dedicate whole areas of its huge SoC to video processing, could afford to do it.

    Also, anyone using all caps in their username like you do is by default a reject from some bad 1990s hacker movie... Try another persona.


  • Reply 8 of 29
    foggyhill said:
    Let's see, a 5-post person who takes a dump on Apple and my head... I'm surely going to take your "words" with grace... Or not.

    [...]

    Also, anyone using all caps in their username like you do is by default a reject from some bad 1990s hacker movie... Try another persona.


    5 posts or 5000... immaterial.  It won't change the fact that you're just making up stuff.  Last I checked, woulda-shoulda-coulda wasn't sufficient evidence for, well, for anything really.  Maybe you don't realize you're just making up stuff?  This here: "they likely could do it with the tech they got on the X if the processing is done in post (though they may wait till the next version if it needs a lot of power to do the rendering real time of the video/depth map processing)."  ← Yeah, that's all made up by you... with nary a fact to back it up.

    Your ad hominem attack really hurts my feelings.  Honestly, based on what you've posted thus far, it's probably the best way for you to go.  It's not as if you're going to present an argument based on facts so personal attacks might be your only avenue.
  • Reply 9 of 29
    Am I the only one who thinks the effects look terrible? If they look bad in a slick commercial, I can’t imagine the real thing with my non-model friends and family, and my non-perfectly-controlled situational lighting getting any better results. 
  • Reply 10 of 29
    So tell me if I have this right.  You find it perfectly acceptable to create fictitious "secret" Apple technology as a justification for them producing an ad that is really misleading.
    I take it you also expected AirPods to allow the wearer to defy gravity in real life?
  • Reply 11 of 29
    Rayz2016 Posts: 4,591 member
    So tell me if I have this right.  You find it perfectly acceptable to create fictitious "secret" Apple technology as a justification for them producing an ad that is really misleading.
    I take it you also expected AirPods to allow the wearer to defy gravity in real life?
    I was about to say the same thing. 

    By the time you reach the end of the ad it's perfectly clear that this is for altering photos and not videos (though before I got to the end I had no idea what it was all about). I think you'd need to be a bit of an idiot to think otherwise.


  • Reply 12 of 29
    Soli Posts: 8,802 member
    hallmansm said:
    Am I the only one who thinks the effects look terrible? If they look bad in a slick commercial, I can’t imagine the real thing with my non-model friends and family, and my non-perfectly-controlled situational lighting getting any better results. 
    In what way? Can you give me an example of the same effects being done on other devices with better results? I'd love to see a side-by-side of these different cameras taking snapshots of the same scene at the same time.
  • Reply 13 of 29
    tmay Posts: 3,737 member
    foggyhill said:

    They don't even need to do the rendering in real time, really; they only need to be able to save the depth map in real time for every frame.

    Then you could do some fantastic things, either in real time or in post, that take a hell of a long time to do right now.

    Is there a standard right now that even exists for saving 3D scanning info (or just sensor info) as a stream synced to the camera (which is really just another sensor)? If not, I think there should be.

    I agree, and since Apple can conveniently pass the depth and lighting metadata on in postprocessing to iMovie, or FCPX, or even ARKit for that matter, it does look like a roadmap to a future video feature. Real-time masking with a variety of effects for video is just awaiting a bit of processor evolution, and it seems obvious that this is where Apple is going.

    Not seeing the ad as misleading, as the new person does. My mileage varied.
  • Reply 14 of 29
    tmay Posts: 3,737 member

    "Portraits of the Presidents Cup

    The PGA TOUR hired Brad Mangin to shoot the Presidents Cup with his new iPhone 8 Plus. Using the 12MP dual cameras on the new iPhone while taking advantage of the intimacy of such a pocketable device, Brad is able to capture stunning professional photos that would be tough with larger, traditional cameras. At the 2017 Presidents Cup, Brad is taking advantage of the new Portrait Lighting feature on iPhone 8 Plus to put a sharp focus on faces around Liberty National. The photos provide a unique look at the PGA TOUR through the eyes of a photographer who has been covering sports for 30 years."


    https://www.pgatour.com/studio-18.html


  • Reply 15 of 29
    palegolas Posts: 1,277 member
    I'm sure the video mode is in the works, as many have stated, but as of right now this is pretty misleading, since it's a stills-only effect. (It was pretty, though.)
    edited October 2017
  • Reply 16 of 29
    lkrupp Posts: 7,060 member
    hallmansm said:
    Am I the only one who thinks the effects look terrible? If they look bad in a slick commercial, I can’t imagine the real thing with my non-model friends and family, and my non-perfectly-controlled situational lighting getting any better results. 
    Why yes, you are the only one. Are you lonely tonight?
  • Reply 17 of 29
    hallmansm said:
    Am I the only one who thinks the effects look terrible? If they look bad in a slick commercial, I can’t imagine the real thing with my non-model friends and family, and my non-perfectly-controlled situational lighting getting any better results. 
    I just upgraded from a 6+ to an 8+, and I've been incredibly impressed with the iPhone camera. I've become a really big fan of portrait mode... the photos often look spectacular!

    With that said, the studio lighting modes are about as close to a gimmick as I've ever seen Apple approach. I haven't been able to get a single picture using studio lighting to look anything short of horrendous. Yes, I know it's supposed to be beta software, but it really feels undercooked... Samsung levels of bad. I frankly find it hard to believe that they're advertising this as a feature at this point.

    Since I’ve just crapped all over studio lighting, let me just reiterate how good portrait mode is.  I received the iPhone 8+ before I left on a family trip to Disney World, and I couldn’t put the camera down, I was so impressed with the results.






  • Reply 18 of 29
    StrangeDays Posts: 7,562 member
    Computational photography is here to stay. 
  • Reply 19 of 29
    StrangeDays Posts: 7,562 member
    foggyhill said:
    Hmmm.  One would be forgiven for thinking Portrait Lighting is a video mode of the 8+ camera.  30 sec video. 21 sec showing PL "used" in moving video.  4 sec showing PL actual use.  Seems legit. :D
    It is almost certain that Apple has this kind of real time video effects in the pipeline and NOBODY has the ability to follow them if they ever go there (which I expect them to).
    Real time video effects in the pipeline.  tee hee.  You literally just made up something.
    Actually the X’s TrueDepth camera system does exactly this. There is a great video demo with virtual scenes here (at the 7:58 mark):

    https://youtu.be/9pOTNQk9Xb8
    edited October 2017
  • Reply 20 of 29
    StrangeDays Posts: 7,562 member
    hallmansm said:
    Am I the only one who thinks the effects look terrible? If they look bad in a slick commercial, I can’t imagine the real thing with my non-model friends and family, and my non-perfectly-controlled situational lighting getting any better results. 
    Yeah, I'm thinking it's you. Here's a serious photographer who was impressed by it and has great photos:

    https://daringfireball.net/linked/2017/09/22/austin-mann-iphone-8-camera
    edited October 2017