This article originally appeared in Vol.18 No.3 of TV Technology Magazine
Let's face it: the digital revolution is over. We've got enough digital toys to last us for a good long while, and now all we've got to do is figure out how to use them. For quite some time, video meant TV, or at least a nice corporate project, and film was for the movies. Yeah, there was the odd commercial project, maybe a television show or two that we'd shoot on film, but basically, film meant the big time. Big-time budgets, big-time crews, big-time everything. And now, well, video is darn near anything you want it to be, and film just inched up another notch on the Endangered Species list.
The culprit is HiDefinition video. Yeah, I know HD has been around since the seventies, but heck, none of us really took it all that seriously. It made for good conversation at the SMPTE meetings. Lots of cool new verbiage and what great cannon fodder for debate. Promotional 'research' funding wasn't all that hard to get but the real money maker was in consulting. Prior to 1999, more money was made in consulting on HD than producing with it.
Ahhhhh, those were the good old days ... and now ... well, it's as if all those people we've been consulting for just got it all at once.
Everywhere you look here in LA, productions are using HD, and all of us who were preaching the gospel of technological innovation are being asked to put up or shut up. We've got HD for film, HD for digital projection, HD for broadcast ... and then there are all of the postproduction applications. For years, 601 was the leading postproduction standard, and now, almost overnight, we're seeing a quantum shift to 1080/24P.
HD is rapidly becoming the digital production standard that we've all been waiting for. Sure, it still has a few drawbacks. Acquisition equipment is still rather pricey and, as usual, the manufacturers are still trying to play the standards shell game.
The truly amazing thing is that the industry seems to be driving the manufacturers for a change. Two years ago, you'd be hard pressed to even find a player that could pump out any flavor of 24P. Now, I run out of fingers and toes when I try to count all the television shows that are switching over.
HD quite simply pushes more resolution through the new and emerging production environments faster and easier than the previous myriad of formats. This sudden acceptance of HD isn't occurring just at the high-end post houses, either. At the high end of desktop production you've got the Intelligent Paradigm VideoExplorer2 and the new Pinnacle Systems CineWave. Both are relatively inexpensive, one-card systems that just about any PC or Macintosh user can slap into their home computer and start banging around HiDefinition video. Apple's new FinalCut ProHD offers up an editorial environment that is every bit as professionally endowed as its far, far more expensive kin.
You'd think that with all of this concurrent acceptance of HD there'd be more than a little confusion, and there is. The single most common mistake I see in HD production is poor calibration. I'm guessing that the cause is twofold. On one side you've got the film shooters, who are accustomed to an acquisition medium with enormous latitude. The vast bulk of film DPs shoot down the middle of the road and figure that they'll time and tweak in postproduction. On the flip side of that equation are the videographers, who assume that since they're now shooting on HD, the resolution will give them a wider FIIP (Fix-It-In-Post) margin.
Both are right to some extent but unlike film, where the inherent latitude can accommodate a multitude of sins, HD is a finite commodity. You shouldn't think of HD as digital film, but rather as very good DigiBeta. The HD project destined to be printed to film needs to have a flatter gamma curve than a project destined for digital projection. The HD project destined for broadcast generally uses a 'safe' factory pre-set that produces a lower density signal with greater latitude.
Whichever distribution mechanism you're shooting for, the secret to good HD is quite simply a waveform monitor and a good set of charts. Keep in mind that every time you recalibrate or re-time an HD shot in postproduction, you're decreasing your signal's inherent resolution. You're throwing away data that it desperately needs.
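The cost of re-timing in post is easy to demonstrate on paper. Here's a minimal Python sketch, assuming a single 8-bit channel and a purely illustrative gamma tweak (the 0.8 figure is invented for the example, not a recommended correction): once the adjusted values are re-quantized back to 8-bit integers, neighboring code values merge and distinct levels are gone for good.

```python
# All 256 possible 8-bit code values, standing in for one channel of HD video.
codes = list(range(256))

# A modest post-production "fix": a gamma tweak, re-quantized back to 8-bit.
GAMMA = 0.8  # hypothetical correction amount, for illustration only
corrected = [min(255, round(255 * (c / 255) ** GAMMA)) for c in codes]

# Where the curve's slope drops below 1, neighboring input codes collapse
# onto the same output code -- data the signal can never get back.
print(len(set(codes)))      # 256 distinct levels going in
print(len(set(corrected)))  # fewer distinct levels coming out
```

Run the correction a second time on `corrected` and the count drops further, which is exactly why getting it right on the chart at the shoot beats fixing it in post.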
For feature projects where there are multiple cameras in play, it is customary to use the 'Video Village' configuration, which takes the SDI and RGB outputs of the two cameras and feeds the images into an HD waveform monitor. The waveform monitor is then used to switch the 'A' and 'B' signals into the 'God' monitor.
For smaller shoots, pick-ups, remote locations and inserts, I use an AJA converter to get the HD SDI signal into my laptop, where I use the waveform and vectorscope functions in Apple's FinalCut Pro to adjust the camera's parameters to match the rest of the footage.
Every HD shoot I've been on has used the CamAlign chart from DSC Labs. These charts are constructed from aircraft aluminum and are color balanced for optimum colorimetry and density calibrations.
In a recent HD for Film project, we were shooting with two Sony F900 HDCAM cameras. One was equipped with the impressive Panavision HD CineAlta lens system and the other with a Canon HD lens. The inherent color temperature difference between the two lenses was significant, and had we not spent the time and energy to chart and calibrate the two cameras at every setup, we would have sacrificed upwards of 6% of our signal's already limited color space in postproduction.
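To see why matching mismatched cameras in post is so expensive, consider a hypothetical Python sketch of the simplest possible fix: per-channel gains applied to 8-bit quantized values. The gain figures below are invented for illustration, not measurements from that shoot. A gain below 1 merges neighboring code values; a gain above 1 clips the highlights. Either way, usable levels are lost.

```python
# Hypothetical per-channel gains to warm one camera toward the other.
RED_GAIN, BLUE_GAIN = 0.94, 1.06  # illustrative values only

red_in = list(range(256))   # all possible 8-bit red code values
blue_in = list(range(256))  # all possible 8-bit blue code values

red_out = [min(255, round(r * RED_GAIN)) for r in red_in]
blue_out = [min(255, round(b * BLUE_GAIN)) for b in blue_in]

# Gain < 1 merges neighboring codes; fewer distinct red levels remain.
print(len(set(red_out)))
# Gain > 1 pushes the top of the range past 255, so highlights clip.
print(len(set(blue_out)))
```

Charting and calibrating both cameras at the lens keeps those levels in the recording instead of throwing them away at the grading console.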
In my most humble opinion, careful adherence to a common set of calibration references can easily increase the output resolution of an HD project by as much as 15%. In a 'shot for broadcast' production this isn't necessarily as critical as with a project destined for projection, but every step you take away from the HD signal is one that can't be recaptured.
Scott Billups is a SMPTE member and award-winning Director/DP/VFX Supervisor.