Apple didn't start with a goal for Cinematic Mode, but worked hard to get it right

Posted in iPhone
In a recent in-depth interview, a pair of Apple executives have shed more light on the goals and creation of the new Cinematic Mode on the iPhone 13 lineup.

Credit: Apple
Kaiann Drance, Apple's vice president of Worldwide Product Marketing, and Johnnie Manzari, a Human Interface Team designer, spoke with TechCrunch about Cinematic Mode -- and how the company ran with the idea despite not having a clear goal.

"We didn't have an idea [for the feature]," said Manzari. "We were just curious -- what is it about filmmaking that's been timeless? And that kind of leads down this interesting road and then we started to learn more and talk more with people across the company that can help us solve these problems."

The feature relies heavily on Apple's new A15 Bionic chip and the Neural Engine. According to Drance, bringing high-quality depth of field to video is much more difficult than doing so in photos.

"Unlike photos, video is designed to move as the person filming, including hand shake," said Drance. "And that meant we would need even higher quality depth data so Cinematic Mode could work across subjects, people, pets, and objects, and we needed that depth data continuously to keep up with every frame. Rendering these autofocus changes in real time is a heavy computational workload."

The Apple executives said that before starting work on Cinematic Mode, the team spent time researching cinematography techniques to learn more about focus transitions and other optical characteristics. Manzari said the team started with "a deep reverence and respect for image and filmmaking through history."

When it was developing Portrait Lighting, Apple's design team studied classic portrait artists like Andy Warhol and painters like Rembrandt.

The process was similar for Cinematic Mode, with the team consulting some of the best cinematographers and camera operators in the world, then continuing to work with directors of photography and other filmmaking professionals.

"It was also just really inspiring to be able to talk to cinematographers about why they use shallow depth of field," said Manzari. "And what purpose it serves in the storytelling. And the thing that we walked away with is, and this is actually a quite timeless insight: You need to guide the viewer's attention."

Of course, the Apple designers realized that these techniques require a high level of skill, and weren't something the average iPhone user could pull off easily.

That, Manzari said, is where Apple came in. The company worked through technical problems and addressed issues with fixes like gaze detection. Some of the problems were fixed with machine learning, which is why the mode leans heavily on the iPhone's baked-in Neural Engine.

Manzari said this type of feature development represents the best that Apple has to offer.

"We feel like this is the kind of thing that Apple tackles the best. To take something difficult and conventionally hard to learn, and then turn it into something automatic and simple."


Comments

  • Reply 1 of 5
    tmay Posts: 5,465 member
    I figure that the Apple imaging team can take a well deserved break once an iPhone can successfully mimic the candlelit scene in Stanley Kubrick's "Barry Lyndon", shot through a Zeiss Planar 50mm f/0.7 lens.

    That's a big ask.

    https://ascmag.com/articles/flashback-barry-lyndon
  • Reply 2 of 5
    tmay said:
    I figure that the Apple imaging team can take a well deserved break once an iPhone can successfully mimic the candlelit scene in Stanley Kubrick's "Barry Lyndon", shot through a Zeiss Planar 50mm f/0.7 lens.
    “We do the difficult in no time. The impossible takes a little longer. Just wait and see.” The Apple Camera Team.
    edited September 23
  • Reply 3 of 5
    lmasanti
    Steve Jobs used to say that Apple is in the intersection of ‘Technology and the Fine Arts.’
    This is a demonstration of that! They started looking to the ‘fine art of filming’!
  • Reply 4 of 5
    auxio Posts: 2,343 member
    lmasanti said:
    Steve Jobs used to say that Apple is in the intersection of ‘Technology and the Fine Arts.’
    This is a demonstration of that! They started looking to the ‘fine art of filming’!
    I believe it was the intersection of Technology and Liberal Arts, but I understand the L word gets some people fired up these days.
  • Reply 5 of 5
    tmay said:
    I figure that the Apple imaging team can take a well deserved break once an iPhone can successfully mimic the candlelit scene in Stanley Kubrick's "Barry Lyndon", shot through a Zeiss Planar 50mm f/0.7 lens.

    That's a big ask.

    https://ascmag.com/articles/flashback-barry-lyndon
    Not Stanley Kubrick’s best movie but it was his perfect movie.  Thanks for this post.