Frozen Explosion - New twist on a familiar effect
The so-called "frozen time, virtual camera" effect has been around for years, and received its greatest attention for the unique fight sequences in The Matrix. Now, the upcoming Warner Bros. feature Swordfish, starring John Travolta and directed by Dominic Sena, will attempt to take the concept to a new level in its opening shot by depicting an explosion and its resulting chaos from the POV of an extreme slow-motion, moving camera.
On a drizzly, overcast day during the week-long, live action shoot conducted late last year in Ventura, California, Boyd Shermis, the film's visual effects supervisor, explained the enormous complexity of the endeavor. Shermis feels that, for several reasons, the work performed that week, combined with several months of post massaging on the sequence, will result in major steps forward for the frozen-time effect.
"There are numerous areas we have traveled into that have convinced me this is the most complex use of this effect ever attempted," says Shermis. "The point of the shot is to take the audience directly into the dramatic event, to make them feel like they are a part of the explosion, to really shock them. To do that, we had to shoot the various live components for the scene separately, at different frame rates, and we had to point the still camera array (a total of 134 cameras for the exterior portion of the shot, and 53 cameras for a continuing interior shot) at multiple targets, rather than centering them all around a single target. And those elements were all moving, requiring us to come up with a complicated computer previsualization to map the entire thing out and compute a targeting procedure for the still cameras. There were also far more layers to this shot than any similar sequence I've ever heard about."
Shermis commissioned Frantic Films, Winnipeg, Canada - the company that is currently posting the shot - to create a detailed CG previsualization to help him break down issues such as camera placement, speed, and object choreography. He then used that previz as a guide while shooting eight weeks' worth of high-speed video tests of small explosions to further calculate how many still cameras would be needed, and where to aim them.
Shermis turned to the effects team at Reel EFX, Inc., North Hollywood - one of the companies to pioneer still camera array technology in the mid-'90s - to shoot the scene using its proprietary Multicam system. For the exterior shot, Reel EFX built a winding rig to hold 134 computer-controlled still cameras, consisting of Canon bodies with specially rebuilt shutters and other interior parts, 35mm lenses, and tiny onboard computers to permit synchronization with the rest of the array.
"This was the only time we had ever tried matching our camera firing sequence to a computer previz all the way through," explains Tom Costan, head engineer at Reel EFX. "Usually, they design the final scene to match the shots we capture, or modify the original previsualization as we go along. Here, we spent a lot of time in set-up, figuring out aiming points on the set, to match the previz exactly. We actually hung a frame on pipes at various locations around the set, and measured up from a ground plane pipe to a horizontal component projected onto that frame, and that is how we figured out exact target spots."
The Reel EFX team also had to shoot dozens of different elements - a practical explosion, separate takes of cars, stuntmen, glass, extras, and various kinds of debris flying around the Ventura set, among other things - at widely varying frame rates. (Filmmakers used standard Fuji 250 daylight motion-picture film stock, specially spooled into still-camera canisters; after shooting, the film was processed and spliced together at Crest National, Hollywood.)
"The reason we had to shoot the different frame rates was because of the explosion itself," Shermis explains. "We had to photograph the explosion at 1,000fps in order to slow it down and see it unfold on film as it puffs out and expands. To photograph stunt guys, cars, and actors at that same frame rate, however, means they would be moving too slow and wouldn't look very interesting. So, during our test phase, we calculated what the best frame rates were for each element that would permit us to composite them all together, and make it all feel like the same time and place. We shot most of the stunt guys at around 250fps, the cars at 125fps, shattering glass at 350fps, and so on. When we combine it all in post, we'll have sort of a ballet of motion, where each component, because it has a separate mass and weight, will match the others, though they were shot at different frame rates."
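The relationship Shermis describes between capture rate and on-screen slow motion can be sketched in a few lines. Only the frame rates are from the article; the code itself, and the assumption of a standard 24fps projection rate, are illustrative.

```python
# Sketch of the slow-motion math behind the multi-rate shoot.
# Frame rates are those quoted in the article; 24fps projection is assumed.
PLAYBACK_FPS = 24  # standard theatrical projection rate (assumption)

ELEMENT_RATES = {
    "explosion": 1000,
    "stunt_performers": 250,
    "shattering_glass": 350,
    "cars": 125,
}

def slowdown_factor(capture_fps, playback_fps=PLAYBACK_FPS):
    """How many times slower than real time an element appears on screen
    when footage shot at capture_fps is projected at playback_fps."""
    return capture_fps / playback_fps

for name, fps in ELEMENT_RATES.items():
    print(f"{name}: {slowdown_factor(fps):.1f}x slower than real time")
```

At these numbers the explosion plays back roughly 41.7 times slower than real time while the cars play back only about 5.2 times slower, which is why each element had to be retimed in compositing to feel like one continuous moment.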
The frame rate issue was a big reason why Shermis turned to Reel EFX. The company's Multicam system includes proprietary timing control software that enabled company engineers to carefully program a host computer to instruct each camera unit when to fire, each within a microsecond of the other. This avoids the optical jitters that would occur without strict computer control, and allows programming changes to be made within minutes, rather than having to readjust individual cameras.
"Essentially, the timing control software tells each camera what is expected of it in terms of timing and exposure," explains Jim Gill, co-owner of Reel EFX. "We program in parameters that will permit them all to fire at the right time in order to synchronize everything. The software passes that information to the cameras prior to each shot. Pulses are sent out from our host computer, and each camera's computer waits for its specific pulse, and then fires when it receives that pulse. Here, by spacing and aiming the cameras correctly, we were able to have each camera fire at precisely the right time - within a thousandth of a second of each other- to pick up the shot with no frame jitter or distortion between cameras."
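The pulse-driven scheme Gill describes - a host computer broadcasting pulses, each camera firing on its own assigned pulse - can be illustrated with a simple schedule generator. The real Multicam protocol is proprietary; this sketch only shows the idea of pre-computed, per-camera fire times.

```python
# Illustrative sketch of a per-camera firing schedule (the actual
# Multicam timing-control software is proprietary; all names here
# are hypothetical).
def firing_schedule(n_cameras, inter_camera_delay_s, start_s=0.0):
    """Return (camera_id, fire_time_s) pairs.

    Camera k fires k * inter_camera_delay_s after the start pulse,
    so adjacent cameras in the array expose in strict sequence rather
    than relying on each unit's own clock.
    """
    return [(k, start_s + k * inter_camera_delay_s) for k in range(n_cameras)]

# 134 exterior cameras, 1 ms apart - roughly the "thousandth of a
# second" spacing Gill mentions.
schedule = firing_schedule(134, 0.001)
```

Because the schedule is computed centrally and downloaded before the take, changing the sweep speed means regenerating one table rather than readjusting 134 individual cameras, which matches the article's point about reprogramming in minutes.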
With even the greatest care, however, such a complicated shot requires significant post work, which is currently taking place at Frantic Films. In particular, there were two major areas that required delicate handling: the need to generate so-called "in-between" synthetic frames and color correction issues.
"The duration of the exterior shot we designed would have required 400 frames of photography captured at the same time," says Shermis. "To do that, even if we had 400 cameras, we would have had to space them apart from each other in ways that were not physically possible in the available space we had on set. So the way to balance that was to take keyframes, and then create in-between frames using elements from each of the keyframes. These elements were repositioned in the computer to represent the portion of the shot, for a microsecond, that we would have captured if we had additional cameras. That was also the big reason for shooting each element separately. We had to synthetically generate two frames for every film frame we captured in order to make it seamless. By shooting each element separately, we had individual components to work with, isolated, with nothing overlapping. Pieces of those discrete components were then used to generate the in-between frames in post. The only other synthetic aspect to the shot is a bunch of CG ball bearings fanning out 360 degrees, and some additional shattering glass. Those were done because it would have been too dangerous to shoot them practically."
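The in-betweening Shermis describes - repositioning isolated elements to stand in for cameras that were never there - can be sketched as interpolation between keyframes. The article says two synthetic frames were generated per captured frame; the linear blending below, and the idea of tracking each element as a single position value, are simplifying assumptions.

```python
# Illustrative sketch of generating in-between frames from two keyframes.
# Each keyframe maps an isolated element name to a (simplified) scalar
# position; real production work would reposition 2D/3D image elements.
def inbetween_frames(key_a, key_b, n_between=2):
    """Linearly interpolate element positions between two keyframes.

    Returns n_between synthetic frames - two per captured frame in the
    Swordfish shot, per the article - each a dict of element positions.
    """
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)  # fractional position between keyframes
        frames.append({
            name: (1 - t) * pos_a + t * key_b[name]
            for name, pos_a in key_a.items()
        })
    return frames

mids = inbetween_frames({"car": 0.0, "glass": 10.0},
                        {"car": 3.0, "glass": 16.0})
```

Shooting each element in isolation is what makes this tractable: with nothing overlapping, each component can be interpolated on its own path and the layers reassembled per synthetic frame.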
The color correction work is being performed at Frantic Films using Digital Fusion and related new coloring software written by Digital Fusion's creators at Eyeon Software, Toronto, specifically for use on Swordfish. That work was crucial because of overcast weather that lingered over the Ventura shoot during filming.
"The wide variance of the exposures and the color temperature of the photography were impacted by the weather," says Shermis. "These cameras are connected together, but they are still independent in terms of capturing separate images. Therefore, with overcast skies coming and going, we got wide variances. The only solution for that is very difficult frame-by-frame color correction - a long and slow process."