How ‘The Mandalorian,’ ‘Watchmen,’ ‘Westworld,’ and Others Achieved Feature-Quality Visual Effects

Thanks to “Game of Thrones,” there’s no longer a gap in visual effects quality between high-end features and episodic TV, despite disparities in budgets and schedules. The landmark HBO fantasy drama set the standard early in its run, winning seven Emmys for bone-crunching battles and medieval world-building, and the industry has kept raising the bar ever since.

It’s an efficient, global workforce that keeps improving and adapting, especially during the lockdown, when post-production was forced to go even more remote. Just look at the stunning work among this season’s Emmy contenders: “The Mandalorian,” “Watchmen,” “Westworld,” “Stranger Things,” “The Dark Crystal: Age of Resistance,” “Altered Carbon,” “Lost in Space,” and “Cosmos: Possible Worlds.”

“One of the keys now in our industry is to keep pushing the limits of what we are doing,” said Martin Pelletier, VFX supervisor for RodeoFX, who oversaw the complex creature work on Season 3 of Netflix’s “Stranger Things,” and was promoted to overall supervisor for Season 4. “The planning of a feature next to episodic is almost the same, which doesn’t make any sense. One is going to shoot over the course of three to five months, and the other, six months to a year. But what is really impressive is that once the show wraps and we are fully in post-production mode, we don’t have much more time than a regular feature to come up with triple the amount of work in some cases.”

Intuiting the showrunner’s vision becomes vital, as does hiring visual effects companies that specialize in features, because episodic schedules leave little time for iteration. “And every bit of inspiration from the Duffer brothers comes from features, particularly the ’80s, with fully animatronic creatures,” Pelletier said. He took his cue in Season 3 from John Carpenter’s “The Thing” in crafting the Mindflayer in many forms, including the 9-foot hospital monster and the rebooted 22-foot spider.

“There’s understandable budget growth across the show based on needing to outdo the previous season, so there’s just this scope expectation, not only from a production standpoint but from a viewer standpoint,” said Sean Santiago, Netflix director of visual effects. “But we’re still trying to do it as efficiently as possible, and we’re able to continue leveraging tax credits and supporting the workforce in Canada.”

“When you get into episodic, you really understand your flaws in a pipeline,” Pelletier said. “What your needs are, what you’re struggling with, and how fast you can turn things around on a high-end level.”

With HBO’s “Watchmen,” Damon Lindelof’s present-day sequel reimagining of the graphic novel, the imperative was: What would the world look like today, after all of the events of that alternate history? Erik Henry, the Emmy-winning VFX supervisor of “Black Sails,” not only treated it like a long-form feature but also reached out exclusively to companies well-versed in features.

“We’re demanding high-end quality,” Henry said. “I would get as much information as I could well before scripts were done, and that enabled me to get a jump on R&D for how Doctor Manhattan [Yahya Abdul-Mateen II] would be blue, what the Millennium Clock and its centrifuge would look like, and how we would do the Looking Glass Mask [Tim Blake Nelson] in CG.”

Henry was on “Watchmen” for nearly two years, pulling every trick that he could think of to buy more time for important sequences while preserving the emotional integrity of the performances. “You can’t expect the Millennium Clock to be done in a month,” he added. “I asked Damon to include me in on the process of what the writers were writing, and sometimes Damon [didn’t have all the design answers]. But it would be enough to get started. I was able to use great artists who know that quality is a win-win for them.”

For “Cosmos,” the Nat Geo series from showrunners Ann Druyan and Brannon Braga about the evolution of life and consciousness, VFX supervisor Jeff Okun (“Stargate”) came in with a total feature mindset. “It’s essentially getting inside the showrunner’s head,” he said, “so, by extension, you become them so you can get closer to what they’re after. You have to find your stride quickly, and you have to remain flexible enough to be able to modify your approach based on the overarching evolution of the story, which impacts your budget.”

Although Okun was reunited with “Stargate” cinematographer Karl Walter Lindenlaub, the experience was often frightening. “He’d come up with some crazy thing that was cinematically wonderful, and at the same time, you’re not sure you can afford it,” he said. “You bring scope with a limited budget and time, and that all comes down to experience and trust [overseeing 15 companies scattered across the world].”

But wrapping his head around quantum physics was daunting. He tried following the script and then watched storyboard after storyboard get rejected by the team of advisors. “The science had advanced from the time I started to the time I delivered. I got the equations from scientists, plugged them into our computers, and had them run simulations. So that was exciting and frustrating,” he said.

Yet there’s also room for important innovation, as on Jon Favreau’s live-action “Star Wars” series “The Mandalorian,” with Industrial Light & Magic’s game-changing virtual production platform, StageCraft. It allowed the filmmakers to generate complex and exotic digital backdrops in real time (using Epic’s Unreal game engine) while shooting the eight-episode Disney+ bounty hunter series at Manhattan Beach Studios in L.A.

More than 50 percent of Season 1 was filmed using StageCraft, entirely eliminating the need for costly and time-consuming location shoots. Instead, actors performed inside a massive semicircular volume of LED video walls and ceiling, where practical set pieces were combined with digital extensions on the screens. Environments were lit and rendered from the perspective of the camera to provide parallax in real time, as though the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets.
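While ILM’s pipeline is proprietary, the geometric core of camera-perspective rendering is well documented: the tracked camera turns the wall into a window, and the scene behind it is drawn through an asymmetric (off-axis) frustum anchored at the lens. Here is a minimal Python/NumPy sketch of that projection, following Robert Kooima’s generalized perspective projection; the function name and inputs are illustrative, not ILM’s code.

```python
import numpy as np

def offaxis_projection(eye, pa, pb, pc, near, far):
    """Asymmetric-frustum projection for a flat LED panel (Kooima-style).

    eye        -- tracked camera position in world space (3-vector)
    pa, pb, pc -- panel corners: lower-left, lower-right, upper-left
    near, far  -- clip-plane distances
    Returns a 4x4 OpenGL-style projection matrix; in practice it is combined
    with a view matrix that rotates into the panel basis and moves the eye
    to the origin.
    """
    vr = pb - pa; vr /= np.linalg.norm(vr)            # panel right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # panel up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # panel normal (toward eye)

    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-panel distance

    # Frustum extents on the near plane, scaled down from the panel edges.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum matrix for those asymmetric extents.
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])
```

Re-evaluating this every frame with the latest tracked lens position is what keeps the parallax on the wall locked to the taking camera; anyone standing off-axis on set sees a skewed image, which is why only the camera’s viewpoint matters.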

StageCraft marked a paradigm shift that began in 2018 with designs from Doug Chiang, vice president of Lucasfilm, and led to four months of testing a prototype LED stage. Once greenlit, it was a race against time to make an October 1 shoot date. “It was frantic because everything we were discussing was theory,” said ILM’s VFX supervisor Richard Bluff. “Everybody knew that you could point a camera at the LED screens. What people didn’t know was which camera body and lens package combination would give you the best results, and with which LED panel and what processors.

“At the same time, Epic and the ILM R&D department combined forces to branch off the latest version of Unreal and start building out the various tools that would be required for the work on the stage: the camera tracking tech, machine syncing, sending the info to the LED screens, integrating the green screen, or getting the real-time camera tracking system working at a higher resolution while maintaining a lower-resolution reflection environment.”
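That last point, spending full resolution only where the lens is actually looking, is the inner/outer frustum split now common to LED-volume rendering. As a rough, hypothetical illustration (not ILM’s implementation), the high-resolution patch can be found by intersecting the tracked camera’s frustum rays with the wall plane; everything outside it needs only enough fidelity to light the set and fill reflections:

```python
import numpy as np

def inner_frustum_on_wall(eye, forward, up, fov_y_deg, aspect,
                          wall_point, wall_normal):
    """Project the tracked camera's view frustum onto the LED wall plane.

    Assumes 'forward' and 'up' are orthonormal camera axes and the wall is
    locally flat (a plane through 'wall_point' with unit 'wall_normal').
    Returns the four world-space corners of the patch that must be rendered
    at full resolution; the surrounding screen area can run at the lower
    reflection/lighting resolution.
    """
    right = np.cross(forward, up)
    tan_y = np.tan(np.radians(fov_y_deg) / 2.0)
    tan_x = tan_y * aspect
    corners = []
    for sx, sy in [(-1, -1), (1, -1), (1, 1), (-1, 1)]:
        ray = forward + sx * tan_x * right + sy * tan_y * up
        # Ray-plane intersection: find t so that eye + t * ray lies on the wall.
        t = np.dot(wall_point - eye, wall_normal) / np.dot(ray, wall_normal)
        corners.append(eye + t * ray)
    return corners

# Example: a lens about 4 m from a flat wall section, 40-degree vertical FOV.
patch = inner_frustum_on_wall(
    eye=np.array([0.0, 1.8, 4.0]),
    forward=np.array([0.0, 0.0, -1.0]), up=np.array([0.0, 1.0, 0.0]),
    fov_y_deg=40.0, aspect=16 / 9,
    wall_point=np.zeros(3), wall_normal=np.array([0.0, 0.0, 1.0]))
```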

In parallel with testing, Bluff worked on schedules, trying to understand how this new paradigm of moving post-production visual effects into pre-production would impact all of the departments. ILM needed around six weeks to turn around an environment: to get it on the screen, and to make it as photoreal and as flexible as needed. “Everybody has embraced the technology and brought something new to it, whether it’s coming up with unique ways to use the volume or expanding the scope visually,” Bluff said.

A similar shift in planning occurred on Season 2 of Netflix’s “Lost in Space,” after the VFX team convinced the filmmakers of the importance of planning the effects during prep instead of waiting until post-production. “No one really listened when we tried to say how much we needed to be heavily involved in a lot of the decisions that were being made during Season 1,” said VFX supervisor Jabbar Raisani. For example, the set where the Robinsons crash-land wasn’t built large enough to fit the digital ship made for it.

In Season 2, however, there was closer collaboration with the art department, which built only interiors, leaving more options for VFX. But the bulk of the work surrounded the robots. The scene-stealing Robot got an overhaul from Mackevision, and the climactic robot fight between the copper-looking Scarecrow and the silver baddie (Third Alien Robot) nearly got cut for budgetary reasons. “We worked it out financially with Image Engine, and then showrunner Zack Estrin was pitched a Robinson-centric fight to justify its importance,” Raisani said.

A hybrid approach was adopted for Netflix’s “The Dark Crystal” prequel series, further exploring the wondrous Gelfling culture established in Jim Henson’s 1982 movie. It combined the handmade Henson puppetry with CG enhancement from DNEG TV. “It’s a two-pronged approach, being in the process with director Louis [Leterrier] and everybody from Henson right from the beginning, so a lot of it has to do with the prep work,” said VFX supervisor Sean Matheisen, whose team delivered more than 4,000 shots ranging from facial augmentation and digi-doubles to CG environments for villages, forests, and caves.

The work was distributed around the globe: DNEG in Vancouver and Montreal; London, where the series was shot; L.A.; and Chennai and Mumbai, taking advantage of tax breaks and running 24 hours a day during full production. “Those were the best ways to approach feature-level quality but for a television budget,” added Matheisen. “The main things we dealt with were digital versions of the puppets, making an eyebrow go slightly larger if we couldn’t see it in the puppet work.”

“We went back and forth with DNEG breaking down the situations with scenes or character,” said Toby Froud, the Henson creature & costume designer. “Is this a puppet? Is this a digi-double? Are we going to do an effect here? Throwing out ideas for practical, enhancement, and full CG. It was a great way to work because nothing was off the table at that beginning point. We were just trying to figure out how to present the best characters on screen.”

Season 3 of HBO’s “Westworld” got more VFX-intensive, exploring the outside world for the first time and adopting LED wall tech to enhance the backgrounds. The first application was for the airpod flying sequences (shot on a gimbal), but the second was more complicated: a massive LED wall on a set in L.A. displayed, in real time via Epic’s Unreal engine, a virtual background that had been shot in Spain. To pull that off, showrunner Jonathan Nolan and VFX supervisor Jay Worth visited the set of “The Mandalorian.”

“The added challenge was that we shoot on film and most of the tech was developed for digital,” Worth said. “We had to find sync boxes and camera equipment that would work well with this digital tech of LED wall and real-time rendering. We didn’t have time to test it all, but the camera crew and the team from Unreal came through. It was amazing that the texture of the film grain really did make the background look better.”

Meanwhile, Worth was brought on to oversee Season 2 of Netflix’s sci-fi series “Altered Carbon,” working remotely from L.A. with DNEG and the Canadian crew. They were tasked with further exploring the host planet, Harlan’s World, taking their cues from DNEG’s work on Season 1. “We weren’t trying to reinvent anything; they laid such great groundwork … that we were able to focus on a tighter view of this one area in Harlan’s World,” Worth said. “There were emotional aspects of it visually, and, for me, that was the heart of the show. It’s our job to understand that it’s a genre show that’s character-driven.”
