Sync can be achieved a few ways. Since it is possible to record two channels of audio on an iPhone, production audio could be laid down onto one side as a sync track. Additionally, timecode functions as an audio signal: it could be generated by an external source and recorded onto the second audio track. This would be plenty to sync multiple cameras in post.
Good points. Of course, doing all that means either tripling the size and weight of the iPhone with a pair of wireless receivers or tethering it with cables, either of which makes the experience less like what we associate with shooting on an iPhone and more like a traditional camera rig. It seems like that would defeat the benefit of using an iPhone in the first place.
Anything can be accomplished with enough will power, and I'm not saying Soderbergh shouldn't shoot movies with iPhones. I just don't understand why he would. That said, I also realize that the validity of his choice is not tied to whether or not I understand it. He has his reasons and preferences, and what I think doesn't matter.
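To make the timecode idea from the first comment concrete, here's a minimal sketch (my own illustration, not anything from Soderbergh's actual workflow) of how clips could be dropped onto a common timeline once the LTC recorded on each clip's second audio channel has been decoded to a start timecode; the `Clip` structure and the 24 fps rate are assumptions for the example:

```python
from dataclasses import dataclass

FPS = 24  # assumed project frame rate

@dataclass
class Clip:
    name: str
    start_tc: str  # timecode decoded from the clip's LTC audio track, "HH:MM:SS:FF"

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def timeline_offsets(clips):
    """Offset of each clip (in frames) from the earliest one, so every
    camera and the sound recorder land on a common timeline."""
    starts = {c.name: tc_to_frames(c.start_tc) for c in clips}
    origin = min(starts.values())
    return {name: frames - origin for name, frames in starts.items()}

# Example: two iPhone angles plus the sound recorder, all fed the same timecode.
clips = [
    Clip("iphone_A", "01:02:03:10"),
    Clip("iphone_B", "01:02:05:00"),
    Clip("sound",    "01:02:03:00"),
]
print(timeline_offsets(clips))  # {'iphone_A': 10, 'iphone_B': 48, 'sound': 0}
```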
So where is a good online write up that goes over what tools are available (and necessary) to shoot with an iPhone? Primarily concerned with audio and lenses.
BeastGrip, Moondog Labs anamorphic lens; audio is typically a Zoom F4 field recorder.
boltsfan17 said: Soderbergh is still using a camera dolly, gimbal, and very expensive lighting equipment to film with the iPhone.
I'll bet that there are many instances where using an iPhone doesn't need all of that and a simple handheld iPhone with a gimbal would suffice where a traditional camera could not.
Not being continuously stuck with all of that gives versatility and maneuverability that would absolutely be liberating for ambitious filmmakers. I saw similar effusive remarks when some were using Canon 5Ds for shooting a TV show or two.
Maybe Soderbergh will try an X next. Manufacturers won't be put out of business but I can see this becoming a bigger deal.
The final scene of “The Florida Project” was shot on an iPhone, surreptitiously. I believe the rest of the movie was shot somewhat traditionally, however.
And that last scene looked like complete crap. And it didn't need to be shot surreptitiously as far as the camera is concerned, since plenty of Disney World visitors bring all kinds of still and video cameras into the park. I'm sure they couldn't go in with a pro video rig or a 35mm Panavision film camera, but it could just as well have been shot with a modestly configured Red camera, a full-frame DSLR that also does video, or a higher-end consumer video rig. It only needed to be shot surreptitiously as far as crew and direction were concerned. It was just a shot of the two kids running into the park.
If nothing else, this post resulted in my $1.99 purchase of the 8mm Vintage Camera app. I set it to 70s, 18fps, and 480p resolution. The results are very convincing. Well worth a buck ninety-nine!
Absolutely idiotic. iPhone is not a very good camera for professional use. For example I shot some images of the lunar eclipse this morning on my iPhone. Terrible quality. I then used my DSLR Nikon 800 and a long lens. Perfect. Honestly I don’t understand why professional videographers advertise they shoot with iPhone unless they are paid spokespersons for Apple.
[...] For example I shot some images of the lunar eclipse this morning on my iPhone. Terrible quality. I then used my DSLR Nikon 800 and a long lens. Perfect. [...]
1) You tried shooting an object that's over 384,000 km away.
2) It doesn't sound like you used any additional lens on the iPhone, but you mention using a long lens for the Nikon.
Shooting a serious film is such a complex, expensive and overwhelmingly ambitious task that I don't know why someone would compromise on cameras (other than pretending to do so to serve some obscure commercial purpose).
[...] I don't know why someone would compromise on cameras (other than pretending to do so to serve some obscure commercial purpose).
The person in charge may feel that there's some benefit that justifies the compromise. I don't know what it might be because he didn't say (at least not in the article), but as long as the product looks good it doesn't really matter how it's acquired.
I just wonder if Soderbergh discusses choices like this with the crew in advance, because it affects them more than him. The Director of Photography has to figure out how to use it to frame and follow the required shots, colourists and effects editors have to work with chroma-restricted source files, and both sound and picture editors face additional sync steps. All certainly manageable, but it does make one wonder what the payoff is perceived to be to justify the extra hassle.
[...] iPhone is not a very good camera for professional use. [...]
[...] It doesn't sound like you used any additional lens on the iPhone, but you mention using a long lens for the Nikon.
That is the thing. You can get 90% with just about any camera. To get that last 10% you need to pay thousands of dollars.
[...] The Director of Photography has to figure out how to use it to frame and follow the required shots [...]
I thought the implication was that he himself is shooting the scenes, not an assistant director or a DP.
[...] For example I shot some images of the lunar eclipse this morning on my iPhone. Terrible quality. I then used my DSLR Nikon 800 and a long lens. Perfect. [...]
Shooting 4K video on an iPhone with proper studio lighting has absolutely nothing in common with shooting a bright object like the moon, hundreds of thousands of miles away in the dark, with a full-frame sensor and a long-ass telephoto lens.
[...] Since it is possible to record two channels of audio on an iPhone, production audio could be laid down onto one side as a sync track. Additionally, timecode functions as an audio signal. [...] This would be plenty to sync multiple cameras in post.
What? Audio was most likely properly recorded on separate hardware away from the iPhone/camera rig with boom mics over the subjects. You sync audio with video footage using the waveform data in common between the video source picking up its own audio and the actual audio track you intend to use. In Final Cut Pro X I think you just throw both clips on the timeline and hit command-G or something and it syncs it. No need for timecode or whatever, this isn't film.
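As a rough illustration of how that waveform matching works in general (this is just a sketch of the technique; I have no idea what software Soderbergh's editors actually used), the offset between the camera's scratch audio and the production recording can be estimated by finding the peak of a cross-correlation:

```python
import numpy as np

def estimate_offset(scratch, production, sample_rate=48_000):
    """Seconds by which the production track lags the camera's scratch audio,
    found at the peak of their cross-correlation. Both inputs are mono float
    arrays at the same sample rate."""
    # Remove DC and normalize so level differences don't dominate the match.
    a = (scratch - scratch.mean()) / (scratch.std() + 1e-9)
    b = (production - production.mean()) / (production.std() + 1e-9)
    corr = np.correlate(b, a, mode="full")
    lag = corr.argmax() - (len(a) - 1)  # samples to shift 'scratch' to line up
    return lag / sample_rate

# Toy check: the "production" track is the same noise delayed by half a second.
rng = np.random.default_rng(0)
sr = 8_000  # kept small so the brute-force correlation stays fast
scratch = rng.standard_normal(sr * 3)
production = np.concatenate([np.zeros(sr // 2), scratch])[: sr * 3]
print(round(estimate_offset(scratch, production, sample_rate=sr), 3))  # ~0.5
```

Real takes are minutes long, so production tools do this with FFT-based correlation rather than the brute-force version above, but the principle is the same.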
[...] iPhone is not a very good camera for professional use. [...]
The scenes in the trailer look a bit muddy, especially the interior scenes with lower lighting, and YouTube compression will be making it worse.
It looks like they tried to hide the noise in low light by clamping the light levels. When shooting something that could be iconic and will last for generations, it makes sense to capture it with the best tech available. Soderbergh said he found the phone liberating due to its size:
https://www.wired.com/story/steven-soderbergh-q-and-a/
"I’ve been shooting stuff on my phone with intention and purpose for a couple years. I started seriously thinking at the end of the year last year that I gotta find something that really works for that. And just by chance a writer friend of mine called me up out of the blue looking for work. I said, “I don’t have anything for you, but if you can write me a super low-budget thriller/horror type thing, I’ll shoot it June 1.” This was mid-January. Three weeks later, a script shows up, and I love it. I said, “Let’s go.”
It was so liberating. I’m going to do it again. … The ability to put the lens anywhere I wanted in a matter of seconds, if not minutes, was incredibly freeing. You want to put a camera above somebody’s head, you’ve got to lash a rope to it and tie it to something so it doesn’t kill them. This, you just stick it on a piece of velcro and shoot. If I literally want to lay it on the floor, I can. It’s a 4K capture. I’ve seen it on a giant screen; nobody, if they didn’t already know, would ever suspect. It looks like a normal movie."
It sounds like he just wants a more compact device than the heavy film cameras; even a point-and-shoot video camera can be quite bulky and heavy to strap onto something. RED may have seen this demand from filmmakers, and their Hydrogen One phone, releasing later this year, is their way of targeting it. It's certainly possible to get better sensors and lenses in that form factor more suited for filmmaking. The iPhones will keep getting better too.
It will be much like what happened with photography: iPhones are just with people all the time (and have built-in editing), so they inevitably get used for capturing when nothing else is available, and the more familiar people get with the process, the more they can be used for important roles. Product manufacturers will evolve to meet the demand and everyone's tech will improve as a result.
[...] You sync audio with video footage using the waveform data in common between the video source picking up its own audio and the actual audio track you intend to use. [...] No need for timecode or whatever, this isn't film.
Think about dragging the audio and video for thousands of takes onto a timeline and lining them all up. It's an improvement over what was possible even just a decade ago, but it's just too time-consuming (in other words, expensive) to be practical for most productions. It's also ripe for human error.
Note also that the camera will often be far enough away from the action that the audio it records will be delayed relative to the production track (sound moves really slowly, only about a foot per millisecond). Using waveforms to line them up will result in the sound being out of sync with the picture; with the camera only 25 feet away it would already be off by more than half a frame at 24 fps.
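Back-of-the-envelope, for anyone who wants to check the numbers (assuming roughly 1,125 ft/s for sound in air and a 24 fps project):

```python
SPEED_OF_SOUND_FT_PER_S = 1125.0  # approximate speed of sound in air
FPS = 24.0                        # typical cinema frame rate

def delay_in_frames(distance_ft: float) -> float:
    """Frames of delay in the audio a camera mic picks up at a given distance."""
    delay_s = distance_ft / SPEED_OF_SOUND_FT_PER_S
    return delay_s * FPS

for d in (10, 25, 50):
    ms = 1000 * d / SPEED_OF_SOUND_FT_PER_S
    print(f"{d} ft -> {ms:.1f} ms -> {delay_in_frames(d):.2f} frames")
# 10 ft -> 8.9 ms -> 0.21 frames
# 25 ft -> 22.2 ms -> 0.53 frames
# 50 ft -> 44.4 ms -> 1.07 frames
```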
[...] Think about dragging the audio and video for thousands of takes onto a timeline and lining them all up. [...] it's just too time-consuming (in other words, expensive) to be practical for most productions. It's also ripe for human error.
There are teams of people who are employed just to handle audio on film productions with a reasonable budget. They don't have to sync all the takes, just the edited takes, and it's someone's full-time job to do that. Picture lock, then sound edit:
http://www.raindance.org/the-13-steps-of-post-production/
They have to record sound separately for quality and location. They're definitely not using the internal mic on the phone. Here's a video showing some of the audio process in filmmaking (3:20):
http://www.media-match.com/usa/media/jobtypes/sound-editor-jobs-402783.php
"A Sound Editor creates the soundtrack by cutting and synchronizing to the picture, sound elements, such as production wild tracks, dialogue tracks, library material and foley in analog or digital form and presents these to the re-recording mixer for final sound balance. Excellent hearing and a good sense of timing are required, as are attention to detail and good communication skills."
Here's a video with a comparison of iPhone footage to Arri Alexa (most widely used camera in movies):
2:35 shows interior footage and the shadow areas are grainy. They'd have to drop the levels to hide all that noise. The Arri does HDR too so the color range shown at the end is higher. 3:20 shows setup time.
Maybe what a company should make is something like the Lytro with a decent lens and sensor that can wirelessly send its output in real time to an iPhone. Apple had a patent a while ago about this:
http://appleinsider.com/articles/13/11/26/apple-patents-lytro-like-refocusable-camera-for-iphone
They wouldn't use the iPhone itself for filming, but it could be attached if needed. It could even be an iPhone case with a retractable lens, and the iPhone could record movement data into a track to help with stabilization. Like the following one, but it would be better if it were compact enough to keep attached, perhaps using a multi-camera array, which would allow HDR:
http://blog.gsmarena.com/relonch-camera-iphone-case-aps-c-sensor/
It might be too low volume to be worth Apple doing but if they target photographers too, that would increase the potential market a bit. It would be crucial to keep the weight down and balanced so having the lens in the middle at the back would be a better location. Zoom isn't as important as quality so it's more about the sensors. It would have its own storage too so that saves running out of space on the phone.
[...] They have to record sound separately for quality and location. They're definitely not using the internal mic on the phone.
I understand that. You may have missed the context in which my comments were made:
fastasleep suggested the production audio (the sound that is recorded separately) could be sync'ed to picture without time code by using software that compares the waveform of the production track to the "wild" sound captured by the microphone on the iPhone. I was simply pointing out some limitations imposed by that approach.
There's another one I didn't mention before -- that the sound captured by the iPhone may be so dissimilar to the production track, either because of distance or crew noise captured by the iPhone, that the software may not even be able to match them. This could be overcome by sending a feed of the production audio to the iPhone Lightning connector as a reference track, but then one might as well just send time code instead.
[...] fastasleep suggested the production audio (the sound that is recorded separately) could be sync'ed to picture without time code by using software that compares the waveform of the production track to the "wild" sound captured by the microphone on the iPhone. I was simply pointing out some limitations imposed by that approach. [...]
Well, we don’t know how big his audio post crew is or what software they’re using, so speculation about the details is kind of pointless. I just meant that there are methods beyond timecode that work, and you made it sound like they’d be running microphones directly into the iPhone used to shoot in order to record production audio, which I’m sure they did not do.
[...] you made it sound like they’d be running microphones directly into the iPhone used to shoot in order to record production audio, which I’m sure they did not do.
I'm not sure where anyone would get the idea that I was talking about recording production audio with the phone, since my comment was specifically about how the wild sound captured by the iPhone would DIFFER from the production track. That was the whole point.