Photographing on an LED volume…on film

Magnopus details its unique virtual production approach for Fallout, which included shooting the LED wall scenes on 35mm film. An excerpt from the print magazine.

Prime Video’s post-apocalyptic series Fallout was shot on 35mm film with anamorphic lenses.

Although the format is no stranger to the series’ executive producers, Jonathan (Jonah) Nolan and Lisa Joy, who took the same approach on Westworld, it is not a common one for an episodic project that also relies heavily on shooting in an LED volume.

Getting there required close collaboration with Magnopus, a studio that had previously been part of Westworld’s R&D efforts and some (on-film) shooting in an LED volume for that series’ fourth season.

“That fourth season of Westworld is really where the development of this technology integrated into storytelling began,” advises AJ Sciutto, head of virtual production at Magnopus.

Sciutto oversaw Magnopus’ deployment of, and collaboration on, multiple virtual production services for Fallout, including the virtual art department, LED volume operation and in-camera VFX (ICVFX). Fallout’s visual effects supervisor was Jay Worth and its visual effects producer was Andrea Knoll. The series’ virtual production managers were Kathryn Brillhart and Kalan Ray, who each oversaw four episodes. Magnopus managed stage operations for the first four episodes, while All of it Now looked after the second block of four. You can find out more about how the film side of the production came into play below. First, however, the process began with the construction of an LED volume in New York, where the series would be filmed.

“At the time,” says Sciutto, “there was no LED volume in New York that could accommodate a show of that size. Led by production manager Margot Lulick and our partners Manhattan Beach Studios and Fuse Technical Group, Magnopus CEO Ben Grossmann designed, and the Magnopus team built, an LED volume at Gold Coast Studios on Long Island that met Jonah’s desired specifications. He likes walk-and-talks, he likes longer takes, almost single takes. He likes to be surrounded by immersive content. So the design of the volume was largely horseshoe-shaped. It was not cylindrical like you see in many volumes these days. It was a horseshoe to give us a big, long, flat section for walk-and-talks. The final size of the LED wall was 75 feet wide, 21 feet tall and nearly 100 feet long.”

The assets for the LED wall – including virtual sets for the underground vaults and the post-apocalyptic environments of Los Angeles – were designed to run in full real-time 3D using Epic Games’ Unreal Engine. “We were using the latest and greatest versions of Unreal at the time,” explains Sciutto. “For the first few episodes of the season it was Unreal 4.27, and then we took a couple of months off between the first four and last four episodes, and at that point we updated to Unreal 5.1. There were some benefits to using 5.1. Lumen was one of them, the real-time global illumination system, which we considered quite important to the needs of the sets we were working with. And so about a week before we shot the scenes with it, we updated the engine version to Unreal 5.1, which can be a frightening moment for anyone who has ever worked in this industry. According to Epic, we were probably the first major production to actually use 5.1 in practice, and it ended up working great for us.”

Making it work for film

With the LED wall stage set up and the virtual art department build underway, Magnopus still had to resolve the issues that came with shooting 35mm film on the volume. Sciutto notes that genlock was the most important factor. “You need to be able to sync the camera so you’re in sync with the LEDs updating. We had worked with Keslow Camera on Westworld to get sync boxes designed for the Arricam LT and Arricam ST to read a genlock signal and actually be able to phase-lock the camera. It took a few months just to design the leading and trailing edges of the genlock signal so the camera could read that and phase it.”
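
To give a sense of the arithmetic involved in phasing a camera against a wall, here is a minimal sketch in Python, assuming a 24fps camera with a rotary shutter. The function names and values are illustrative only and are not taken from the production’s actual sync boxes; the sketch simply relates a phase offset expressed in degrees of the frame cycle to time, and shows the exposure window that has to line up with the wall’s refresh.

    # Illustrative sketch (not production code): relating a camera's shutter phase
    # offset to time at a given frame rate, the kind of arithmetic involved when
    # phasing a genlocked film camera against an LED wall's refresh.

    def phase_offset_ms(phase_degrees: float, fps: float = 24.0) -> float:
        """Convert a phase offset in degrees of one frame cycle to milliseconds."""
        frame_period_ms = 1000.0 / fps            # 24 fps -> ~41.67 ms per frame
        return frame_period_ms * (phase_degrees / 360.0)

    def exposure_window_ms(shutter_angle: float = 180.0, fps: float = 24.0) -> float:
        """Exposure time for a rotary shutter: angle/360 of the frame period."""
        return (shutter_angle / 360.0) * (1000.0 / fps)   # 180 degrees at 24 fps -> ~20.83 ms

    print(f"{phase_offset_ms(45.0):.2f} ms shift for a 45-degree phase offset at 24 fps")
    print(f"{exposure_window_ms():.2f} ms exposure window at a 180-degree shutter, 24 fps")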

“After doing a few camera tests,” Sciutto continues, “we felt like we were in good shape, but then we had to do some wedge testing, because the actual latency from Unreal through the render nodes to the Brompton processors and onto the screen was slightly variable. We had to do that wedge testing to figure out what the latency offset was so we could then adjust the camera.”
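
As a rough illustration of where that offset comes from, the sketch below sums hypothetical per-stage delays through the chain Sciutto describes (engine, render node, LED processor) and expresses the fractional-frame remainder as a camera phase adjustment. Every number is a placeholder; on a real stage the final offset is confirmed empirically with wedge tests like the ones described above.

    # Illustrative sketch: estimating total chain latency from assumed per-stage
    # delays and expressing the fractional-frame part as a camera phase tweak.
    # All stage values are hypothetical placeholders.

    FPS = 24.0
    FRAME_MS = 1000.0 / FPS

    chain_latency_frames = {
        "engine_render": 1.0,        # Unreal rendering the frame
        "render_node_scanout": 0.5,  # render node output/scan-out
        "led_processor": 1.2,        # LED processor receiving and driving the panels
    }

    total_frames = sum(chain_latency_frames.values())
    total_ms = total_frames * FRAME_MS
    phase_degrees = (total_frames % 1.0) * 360.0

    print(f"Estimated chain latency: {total_frames:.1f} frames ({total_ms:.1f} ms)")
    print(f"Fractional-frame remainder as a phase adjustment: {phase_degrees:.0f} degrees")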

The next hurdle was the color workflow. “Normally,” says Sciutto, “you build a color profile for your camera, but since the HD tap on the film camera isn’t actually an HD tap, you wing it. Well, you don’t really wing it. There’s a lot of science behind it in terms of what you see in the dailies, how you dial in the wall and how you dial in a digital camera. You can’t really trust what you see from the HD tap. So we had a Sony Venice sitting on sticks right next to the film camera. We had a LUT applied to the digital camera that mimicked our film camera so we could do live color correction on the overall look of the wall.”
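
The idea of previewing the wall through a “film look” on the digital witness camera can be pictured as applying a 3D LUT to each monitored frame. The snippet below is a simplified stand-in rather than the show’s actual pipeline: it assumes a LUT already loaded as a NumPy array, uses trilinear interpolation via SciPy, and demonstrates with an identity LUT; a real workflow would load a calibrated .cube LUT and operate in a managed color space.

    # Illustrative sketch: apply a 3D RGB->RGB LUT to a frame using trilinear
    # interpolation. The LUT and frame below are placeholders (identity LUT,
    # random frame), used only to show the mechanics.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def apply_3d_lut(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
        """frame: HxWx3 floats in [0, 1]; lut: NxNxNx3 table mapping RGB -> RGB."""
        n = lut.shape[0]
        coords = frame.reshape(-1, 3).T * (n - 1)   # scale RGB to LUT grid coordinates
        out = np.empty((frame.shape[0] * frame.shape[1], 3))
        for c in range(3):                          # interpolate each output channel
            out[:, c] = map_coordinates(lut[..., c], coords, order=1, mode="nearest")
        return out.reshape(frame.shape)

    identity_lut = np.stack(np.meshgrid(*[np.linspace(0, 1, 17)] * 3, indexing="ij"), axis=-1)
    test_frame = np.random.rand(4, 4, 3)
    graded = apply_3d_lut(test_frame, identity_lut)
    print("max difference vs. identity LUT:", float(np.abs(graded - test_frame).max()))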

Sciutto adds that another challenge was understanding the variability of the film lab’s output on the dailies. “Depending on what day of the week we ran the dailies, they might change the chemical bath on Mondays, so at the end of the week it might be a little more magenta or more green. From the digital camera footage we could see that we were always in a very comfortable area.”
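
A crude way to picture the kind of check the digital reference makes possible is to compare the average green-versus-magenta balance of a scanned film frame against the matching frame from the digital camera. The snippet below is purely illustrative, with random placeholder arrays standing in for aligned, color-managed frames.

    # Illustrative sketch: flag a green/magenta drift in a film scan relative to
    # the matching digital reference frame. Frames here are random placeholders.

    import numpy as np

    def green_magenta_bias(frame: np.ndarray) -> float:
        """Positive leans green, negative leans magenta (green vs. mean of red and blue)."""
        r, g, b = frame[..., 0].mean(), frame[..., 1].mean(), frame[..., 2].mean()
        return float(g - (r + b) / 2.0)

    film_scan = np.random.rand(1080, 1920, 3)
    digital_ref = np.random.rand(1080, 1920, 3)

    drift = green_magenta_bias(film_scan) - green_magenta_bias(digital_ref)
    print(f"Green/magenta drift vs. digital reference: {drift:+.4f} "
          f"({'greener' if drift > 0 else 'more magenta'})")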

The daily process of shipping the film from New York to Burbank for developing and scanning also impacted the pre-lighting of the LED wall, Sciutto explains. “If you do a pre-light day with the film camera, you don’t really know what you shot until a day and a half later. So we did a series of pre-light shoots where we shot in one day, developed the film, and had a day to review our content and make any adjustments before doing another pre-light day. This created a schedule that allowed set dec to make a few adjustments to the scene, or our virtual art department to make lighting adjustments to the virtual content to make sure it matched the physical content, and for everything to be prepared for the light and color settings that we had to adjust to on the actual day of shooting.”

When asked about the final look of the LED wall footage, Sciutto responded, “When you see the finished results in the dailies, the transition between your foreground actors, your middleground stage sets and your virtual background content is definitely smoother. That transition was softened a little by the grain of the film [compared to digital] and that helped a lot. Film also allows you to capture much more detail and range, allowing for more dynamic color balancing in post-production than with digital. If you shoot digital and then push it too hard or crank it up too much, things can get ‘weird’ quickly. So shooting on film gave us more flexibility, at least during the DI.”

Read the full story in issue #21 of befores & afters magazine.
