Since I’ve talked a lot about pipelines and workflow on this site, I’d like to post a diagram I put together for this year’s San Antonio Film Academy. I’ll be putting up a few more of my lecture materials later on, but keep watching the SAICFF website for CDs of all the lectures, not just mine. Anyhow, here’s a quick layout of the five main departments involved in post-production of a film:
The double lines represent image data; solid lines, audio; and the dotted lines, timecode, shot lists, or EDLs (edit decision lists). Assuming that we shoot on 35mm film, the camera and audio recorder are timecode-synced, and the film can be developed and telecine’d to get our dailies throughout production. As soon as post-production begins, we start our first offline edit using the dailies (with timecode embedded).
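Timecode sync works because an SMPTE timecode is just a frame address: once picture and sound carry the same running timecode, lining them up is arithmetic. Here is a minimal sketch of that conversion, assuming 24 fps non-drop-frame timecode; the function names are my own, not from any particular editing system.

```python
# Hypothetical sketch: converting SMPTE timecode (HH:MM:SS:FF) to an
# absolute frame count at 24 fps -- the arithmetic that lets an editing
# system line up picture and sound sharing the same timecode.
# Non-drop-frame timecode assumed throughout.

FPS = 24  # 35mm film runs at 24 frames per second

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert 'HH:MM:SS:FF' into a total frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int = FPS) -> str:
    """Inverse: an absolute frame count back to 'HH:MM:SS:FF'."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# The difference between two timecodes, in frames, is exactly how far
# the audio must be slipped to sync it against the picture.
offset = timecode_to_frames("01:00:05:12") - timecode_to_frames("01:00:00:00")
```

With 24 fps material, that offset comes out to whole frames, which is why synced dailies can be assembled automatically instead of eyeballing a slate.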
Ideally, our film has been heavily storyboarded before shooting began, so the editing itself has largely been worked out on paper. Once we get the first edit down, we know which takes we’re using and how much film we need to develop and scan for our full DI. We also have a rough cut to give our composer so he can begin working on the music.
By the time our second edit is finished, we’ve polished things a bit more, and we can give shot lengths to our VFX division. They might be adding in dinosaurs, or just doing simple sky replacements, but now they have the data scans and timecode that they need to start work. By now we can also see what lines will need to be looped, and we keep updating our composer.
As we continue to hone our edit over the next few iterations, we can add in rough music (enabling us to drop our temp tracks as soon as possible), initial audio mixes, and low-rez VFX renders as they become available. By the time we get to our final edit we have a pretty good idea of what everything is going to look and sound like, so we shouldn’t see too many surprises in the final compile. This edit is locked.
Ideally, our composer has been working on a computer that allows him to arrange and export synthesized orchestral music to the editors, and then print off notation for the real orchestra from that. That simplifies things, since the final edit will be based on the final, if synthetic, score, which then only has to be recorded by live musicians. In the meantime, the EDL from the final edit dictates the final compile of all our digitally enhanced and color-graded film scans.
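An EDL is, at heart, just a list of events: each line says which frames of which source reel land where on the final timeline. Here is a rough sketch of reading one CMX3600-style event line; the class and field names are my own for illustration, and a real EDL also carries headers, comments, and transition data this ignores.

```python
# Hypothetical sketch: parsing one event line from a CMX3600-style EDL,
# the list that drives the final conform of the color-graded scans.
# Field names below are assumptions for illustration, not a spec.

from dataclasses import dataclass

@dataclass
class EdlEvent:
    number: str      # event number, e.g. "001"
    reel: str        # source reel / tape name
    track: str       # "V" for video, "A" for audio
    transition: str  # "C" = cut, "D" = dissolve, etc.
    src_in: str      # source in timecode
    src_out: str     # source out timecode
    rec_in: str      # record (timeline) in timecode
    rec_out: str     # record (timeline) out timecode

def parse_edl_event(line: str) -> EdlEvent:
    """Split a whitespace-delimited EDL event line into its fields."""
    return EdlEvent(*line.split())

event = parse_edl_event(
    "001  CAMROLL7  V  C  01:02:10:00 01:02:14:12 00:00:00:00 00:00:04:12"
)
```

The conform system walks these events in record order, pulling the named frame ranges from the full-resolution scans, which is why only the takes in the locked cut ever need to be scanned at full quality.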
I’ve drawn this out in a far simpler fashion than anything that exists in real life. Few films are this organized in post, and none of these stages are laid out in set, identical blocks of time. Often, individual scenes of a film will be finalized and locked while editing continues on other parts of the film. And of course, a complex film with lots of effects will require a far more detailed pipeline. For a real-world example, have a look at this workflow diagram (PDF) from Superman Returns, and this Studio Daily article on how it was designed.
Obviously, a simpler film will need a simpler workflow, and there are a few steps that could theoretically be avoided on a smaller, independent production. However, if I were shooting anything other than regular SD or HDV, I would probably do as much as possible in an offline edit for expediency’s sake, even if it was just using downconverted versions of HD files I already had.