A Virtual Time Machine
Commercials and music videos share just as much style with the fashion industry as they do with the motion picture realm. Virtual showcases of the latest technological fads, new spots are packed by agencies and directors with whatever is current and fresh - whether that means glossy supermodels in form-fitting denim, or crushed blacks, desaturated colors and selective focus accomplished with swing-and-tilt lenses. Each time a new technique or piece of equipment is discovered, its use seems to be overexploited until it threatens to become as tired and outdated as a polyester leisure suit.
This recursive cycle has followed every technological advance from the zoom lens to the morph. However, once the initial novelty of a new technique or device wears off, its true value as a production tool can be more clearly assessed.
The latest victim of this vicious cycle has been the frozen-time "virtual camera" effect, which was extensively outlined by one of the pioneers in this field, Dayton Taylor, in AC Sept. '96. After popping up in dozens of commercials and music videos, the 3-D freeze gag was used to spectacular effect in the thriller The Matrix (see AC April '99), which features characters frozen in mid-stride, mid-punch, or mid-air as the rest of the world revolves around them. A play on the concept of stereophotogrammetry (see effects coverage of Batman and Robin in AC Dec. '97), the virtual-camera effect is most commonly achieved with a vast array of single-lens reflex (SLR) still cameras that are synchronized to fire simultaneously. One frame from each camera then becomes 1/24 of a second in an edited sequence. Since every camera is fired simultaneously, the action captured is exactly the same for each camera. However, since each camera is placed at a different position around the subject, the perspective on that action is slightly different. When sequentially edited together, the progressive positioning of each camera allows a "virtual" camera to dolly around a subject that remains perfectly frozen in a single moment of time.
The SLR array concept was employed by Taylor, who used a series of custom-built cameras linked together and fed by a single roll of 35mm film. A Mitchell magazine, cut in half, provided the feed and take-up.
Jim Gill, vice president of the North Hollywood-based mechanical effects boutique Reel EFX Inc., pioneered several variations of this technique (a gallery of the company's work is displayed on its website at www.reelefx.com). In 1996, commercial director Tony Kaye approached Gill and his partners with a technical conundrum: to dolly with a tennis ball served from - and then returned to - court superstar Andre Agassi. Completely opposed to any compositing effects whatsoever, Kaye insisted on a method that would allow him to capture the real thing. The Reel EFX team examined the requirements first by clocking the speed of a ball from Agassi's serve. "We found that the ball initially topped out at about 100 miles per hour," Gill says. "It then dropped to about 35 miles per hour by the time it bounced at the other end of the court. That same cycle would be repeated as the ball was returned by the opponent's racquet. We calculated the G-forces involved in instantly accelerating a camera to 100 miles per hour, and quickly realized that no camera system could withstand that." Refusing to be bested, Gill turned to a little piece of film history for the answer.
In 1872, English still photographer Eadweard Muybridge set out to determine whether or not all four of a horse's hooves left the ground during a gallop. He placed a series of 24 still cameras alongside a racetrack, and then independently triggered the units with trip-wires as a horse ran past them. The famous series of photographs he took would provide an essential link between still photography and the concept of motion pictures. Over a century later, Jim Gill would return to this concept to achieve Kaye's goal.
Gill recalls, "We thought that instead of trying to push a camera beyond practical speeds, we would take a series of successive pictures from where that camera would theoretically be each time the shutter opened. To do that, we lined up 100 Canon SLRs across the court - 50 to follow the ball out and 50 to track it back - and calculated how far the ball would move each fraction of a second, based on the speed of Agassi's serve. We then placed the cameras far apart at first, and progressively closer together as the ball's speed fell off." Employing a Pentium PC to individually trigger each camera in sequence, Gill and his team accomplished the impossible: a virtual dolly capable of instantly accelerating to 100 mph, decelerating to just over 30, switching directions and then accelerating back to 100 mph in a single smooth shot.
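The spacing calculation Gill describes can be sketched in a few lines of Python. This is a hypothetical illustration, not Reel EFX's actual math: it assumes a simple constant deceleration from 100 mph to 35 mph over the 50-camera run, where the real team worked from measured ball speeds.

```python
# Hypothetical sketch: place camera i where the ball will be i/24 of a
# second after the serve, assuming constant deceleration from 100 mph
# to 35 mph (the assumed numbers are from Gill's measurements above).

MPH_TO_FPS = 5280 / 3600  # miles per hour -> feet per second

def camera_positions(n_cameras=50, v0_mph=100.0, v1_mph=35.0, frame_dt=1/24):
    """Return each camera's distance (in feet) from the serve point."""
    v0 = v0_mph * MPH_TO_FPS
    v1 = v1_mph * MPH_TO_FPS
    decel = (v0 - v1) / (n_cameras * frame_dt)  # assumed constant deceleration
    positions = []
    for i in range(n_cameras):
        t = i * frame_dt
        positions.append(v0 * t - 0.5 * decel * t * t)
    return positions

positions = camera_positions()
spacings = [b - a for a, b in zip(positions, positions[1:])]
# As Gill describes: cameras start far apart and get progressively
# closer together as the ball's speed falls off.
```

Running this gives spacings of roughly six feet between the first pair of cameras, shrinking steadily down the line - exactly the "far apart at first, progressively closer" layout Gill describes.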
Over time, Gill and his team at Reel EFX have refined their Multicam system into a viable production tool, carefully adapting all 100 of their Canon EOS Rebel cameras to place them almost completely under computer control. "One of the problems that we noticed early on stemmed from the camera's own reaction time," notes Gill. "Basically, when you push the button on a standard SLR, the computer in the camera 'polls' that button 50 times a second to see if it's been pushed. In the interim, though, the computer is asleep. When you're talking about exposures of, say, 1/250 of a second, that 1/50 of a second is a long time to wait. When we 'told' each camera to take a picture (in tests), some of them would snap the shot immediately, while others would have to wake up first. The result was that not all of the cameras shot exactly the same moment in time, which would have created a bit of jitter in the commercial's frozen action. To get around that problem, we took control of the cameras' functions ourselves."
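The numbers Gill cites make the problem concrete. A rough back-of-the-envelope check (assumed values, not from the Multicam system itself):

```python
# Rough arithmetic behind the jitter problem Gill describes: a stock SLR
# polls its shutter button 50 times a second, so two cameras given the
# same command can fire up to one full polling interval apart.

poll_rate_hz = 50
exposure_s = 1 / 250

worst_case_skew_s = 1 / poll_rate_hz                 # 0.02 s between cameras
skew_in_exposures = worst_case_skew_s / exposure_s   # five exposure lengths

# A skew of up to 20 ms - five entire 1/250 s exposures - spread across
# 100 cameras shows up as visible jitter in the frozen action.
```

At five full exposure lengths of possible skew per camera, bypassing the stock firmware was the only way to get a truly simultaneous array.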
Dylan Hixon, head of mechanical engineering for Reel EFX, explains, "We've built a custom processor that sits in the base of each camera and takes complete control of the shutter, aperture, shutter speed and mirrors. When we get ready to take a shot, we arm the cameras to fire - all of the mirrors come up, the shutters are lifted into position and held by their electromagnets, and the shutter speed is already programmed in. When we're ready, all of the shutters drop instantly on our command.
"Each camera is identified by a serial number that's burned into the chip in the bottom of the camera," Hixon continues. "The computer will go out and find the cameras, and then come back and say, 'Camera one is this serial number.' Then it sends a program to each processor that basically says, 'Serial number X, you're the fourth camera in the series, so your shutter speed is going to be this. Look for the fourth pulse; that's when you're going to fire.' We can then send out a single pulse command to fire all of the cameras simultaneously, or a pulse stream to fire them sequentially. When the pulse stream comes down the line, the chip in the camera counts the pulses: one, two, three, fire!"
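The pulse-counting scheme Hixon describes can be sketched in Python. This is an illustrative model, not Reel EFX's actual firmware; the serial numbers and class names are invented for the example.

```python
# Illustrative sketch of Hixon's pulse-counting protocol: each camera's
# processor is programmed with its position in the series and fires when
# it has counted that many pulses on the shared trigger line.

class CameraProcessor:
    def __init__(self, serial, position, shutter_speed):
        self.serial = serial            # "burned into the chip"
        self.position = position        # "you're the fourth camera in the series"
        self.shutter_speed = shutter_speed
        self.count = 0
        self.fired_at_pulse = None

    def on_pulse(self, pulse_number):
        """Count pulses on the line; drop the shutter on our assigned count."""
        self.count += 1
        if self.count == self.position:
            self.fired_at_pulse = pulse_number

# 100 cameras, each assigned its place in the series (serials are invented).
cameras = [CameraProcessor(serial=1000 + i, position=i + 1,
                           shutter_speed=1/250) for i in range(100)]

# Sequential firing: send a pulse stream down the line.
for pulse in range(1, 101):
    for cam in cameras:
        cam.on_pulse(pulse)
# Camera N fires on the Nth pulse - "one, two, three, fire!"
```

Simultaneous firing is then just the degenerate case: every camera is assigned position one, so a single pulse drops all 100 shutters at once.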
Given this ability to fire each camera separately, the Multicam array can do much more than simply produce a virtual-camera freeze. "The effects one can achieve are pretty much unlimited," comments Gill. "Because each frame of film is a separate picture-taking entity, you can put it anywhere in space. If you place the cameras 100 feet apart, you can achieve a 3,000-feet-per-second dolly move - which translates into literally hundreds of miles an hour. If you place the cameras closer and closer, you could conceivably start from four miles away and zap right into somebody pouring beer or something. I can shoot at 24 cameras per second (for film finish) or 30 cameras per second (for video finish) and play that back.
"The options for manipulating this effect are infinite: you can do a filter gradation as you go from one camera to the next down the line; you can put different lenses on the cameras and go longer and longer, or shorter and shorter; you can ramp from any speed to any other speed; you can alter your depth of field by adding NDs and stopping down with each successive camera; or you can do the reverse, keeping the f-stop the same and changing the shutter speed to have a completely blurry subject become sharp in the frame. The only limitation with the Multicam array is that with 100 cameras, we're limited to a final screen time of a little over four seconds (at 24 fps). However, within that four seconds you can do anything from a time-lapse shot that lasts for months to a 3,000 fps shot of a bullet being fired from a gun."
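The arithmetic behind Gill's figures is simple enough to verify with a quick sketch (a hypothetical Python check, not anything from the Multicam system): camera spacing times firing rate gives the virtual dolly's speed, and camera count divided by playback rate gives the maximum screen time.

```python
# Back-of-the-envelope check of the figures Gill quotes.

def virtual_dolly_speed(spacing_ft, cameras_per_second):
    """Speed of the virtual dolly move, in feet per second."""
    return spacing_ft * cameras_per_second

# 100 ft between cameras, fired at 30 cameras/sec (video finish):
speed_fps = virtual_dolly_speed(spacing_ft=100, cameras_per_second=30)
speed_mph = speed_fps * 3600 / 5280   # 3,000 ft/s is roughly 2,045 mph

# With 100 cameras played back at 24 fps, the maximum screen time:
screen_time_s = 100 / 24              # a little over four seconds
```

The 3,000 ft/s figure actually works out to about 2,045 mph - well beyond the "hundreds of miles an hour" Gill modestly claims - and the four-second ceiling falls straight out of 100 cameras at 24 frames per second.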
Taking the concept one step further, music video director Martin Weisz, with support from Otto Nemenz and Reel EFX, decided to replace the SLRs with an array of motion-picture cameras. While using 15 Arriflex 35-IIIs for rap artist Dru Hill's video "You Are Everything," Weisz found that the cameras gave him a considerable level of creative control. "With the still cameras," the director explains, "you are locked into a freeze, and the trigger has to be pulled at exactly the right time. Each camera gives you only one frame with which to work. With the Arris, though, you get several thousand frames to choose from. I can start a move at any frame I want, and then have the choice to freeze the action or just do a really fast dolly. Also, in post, it's much easier to deal with motion-picture rolls than with individual frames - developing, telecine and editing are much more difficult to accomplish when you're working with a bunch of five-foot rolls (from the SLR cameras)."
The major task on the Dru Hill video was synchronizing the shutters of the 15 Arri 35-IIIs. Each camera was individually hooked into a phasing sync box and slaved to one master camera. All of the cameras were then run simultaneously and manually synched with a timing light. The process could take anywhere from 10 to 20 minutes, and all of the cameras had to be run for the entire time. If a battery went bad and needed to be replaced, the entire process would have to be repeated.
On Weisz's next project, the Puff Daddy video "Public Enemy No. 1," he doubled the ante by using 30 Arri 35-IIIs. The Reel EFX team built a power pack that could control the cameras from a single source and maintain consistent power to each, and Otto Nemenz designed a single sync box with 30 individual controls to streamline the shutter-synching process.
In Weisz's latest opus, the video for "You Don't Know Me Like You Used To" (featuring Brandy with guests Shaunta and Da Brat), he scaled back to just 12 Arri 35-IIIs, but once again incorporated Reel EFX's new battery supply and Otto Nemenz's sync box. "The video is a bit complex in that we're going to try to make (Brandy's) movements - her rhythm - dictate the rhythm of the world around her," explains Weisz. "Whenever she's standing still, people around her are moving, but whenever she moves, the rest of the world stops. I'm using the Arris for the dance break that serves as a climax; we can have incredible movement around her and manipulate her motion however I want during the move. I can skip frames, freeze her or allow her to move naturally. Also, since I have multiple frames to choose from, I can decide where to begin the effect, stop the move at any point - say, the sixth camera - let it roll for a few moments, and then freeze her and move back to camera one - whatever effect looks best. With multiple frames (as opposed to just one with an SLR array), I can edit exactly on the beat, which is important in music videos. This method really gives me the freedom to create all kinds of different looks."
"A Virtual Time Machine" by Jay Holben (December 1999)
Reprinted with permission of American Cinematographer