How 'Avengers: Infinity War' VFX Teams Brought Josh Brolin's Thanos to Life

As the big bad of Disney/Marvel's Avengers: Infinity War, Thanos, the genocidally inclined supervillain played by Josh Brolin, required the Herculean efforts of not one but two visual effects houses, with an assist from a facial-scanning system offered by a third. That's because not only was the performance-capture CG character a massive presence, standing 8 feet tall, but he also commanded nearly a full hour of screen time.

"The script went through a lot of iterations," says the film's overall VFX supervisor, Dan Deleeuw, explaining that one such iteration completely changed the scope of the VFX work, which on Jan. 22 was nominated for the VFX Oscar. "Joe Russo [who with brother Anthony directed the film] came up with the idea of telling the story more from Thanos' point of view. Thanos went from supporting villain to one of the main characters driving the plot."

The process of bringing Thanos to life started at Marvel, which oversaw development for the character, a brawny humanoid with purple-tinted skin. While 14 VFX houses worked on Infinity War, Digital Domain and Weta Digital in tandem continued Thanos' development and shared responsibilities for keeping his look and performance consistent throughout the film.

Before filming began, Brolin's own facial expressions — with the actor expressing everything from a wide smile to a forbidding frown — were captured by Industrial Light & Magic's high-resolution facial-scanning system, Medusa. The technology, which was developed by Disney Research in Zurich and launched a few years ago, has been used to capture roughly 130 actors (like Andy Serkis when he played Supreme Leader Snoke in Star Wars: The Last Jedi), and it will be recognized by the Academy of Motion Picture Arts and Sciences on Feb. 9 at the annual Scientific and Technical Awards.

On the set, Brolin wore body- and facial-capture systems to record his performance. So his fellow actors would have a sense of Thanos' height and a proper eyeline, Brolin sometimes wore a backpack with a pole extending above his head. And sometimes, he simply stood on a platform above them.

The motion-capture data from the set was then combined with the data provided by Medusa. Kelly Port, VFX supervisor at Digital Domain, says the company fed the Medusa data into a new Digital Domain system dubbed Masquerade, which used machine learning to "learn" Brolin's expressions and create a higher-resolution version of what Brolin did on set. That was then retargeted onto the CG Thanos, and animators, as is customary, further enhanced the performance by hand, particularly where Thanos' anatomy didn't match Brolin's, such as Thanos' creased chin.
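To make the idea concrete, here is a minimal, hypothetical sketch of what "learning" an actor's expressions and upsampling an on-set capture could look like: fit a model that maps sparse facial markers from a head-mounted camera to a dense, scan-quality mesh, then run it over every frame of a shot. The marker and vertex counts, the synthetic training data and the choice of ridge regression are illustrative assumptions, not Digital Domain's actual Masquerade pipeline.

```python
# Hypothetical sketch of a Masquerade-style facial upsampler: learn a mapping
# from sparse on-set facial markers to a dense, scan-quality face mesh.
# All counts, data and the choice of ridge regression are illustrative
# assumptions, not Digital Domain's actual implementation.
import numpy as np
from sklearn.linear_model import Ridge

N_MARKERS = 150     # assumed number of sparse markers tracked on set
N_VERTICES = 5_000  # assumed (kept small) vertex count of the dense scan mesh

# Training pairs: frames where both sparse markers and dense scans exist for
# the same expression (random stand-ins here).
rng = np.random.default_rng(0)
train_markers = rng.normal(size=(500, N_MARKERS * 3))  # 500 expressions, xyz per marker
train_meshes = rng.normal(size=(500, N_VERTICES * 3))  # matching dense meshes

# "Learn" the actor's expressions: fit a regressor from sparse to dense.
model = Ridge(alpha=1.0)
model.fit(train_markers, train_meshes)

def upsample_performance(marker_frames: np.ndarray) -> np.ndarray:
    """Map each on-set marker frame to a dense mesh ready for retargeting."""
    return model.predict(marker_frames).reshape(-1, N_VERTICES, 3)

# One shot of on-set capture: 240 frames of sparse markers.
shot_markers = rng.normal(size=(240, N_MARKERS * 3))
dense_frames = upsample_performance(shot_markers)
print(dense_frames.shape)  # (240, 5000, 3): per-frame dense meshes for retargeting
```

The real system is far richer, but the shape of the problem is the same: sparse on-set capture in, dense scan-quality performance out.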

Weta used a different process, based on tools it developed to create Caesar for the Planet of the Apes films. Weta VFX supervisor Matt Aitken explains that the team fed the performance-capture information into its system, using the Medusa data as a check for accuracy, to re-create the performance on a digital version of Brolin; that performance was then transferred to the digital Thanos.
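A similarly hedged sketch of the transfer step Aitken describes: solve for expression (blendshape) weights on a digital double of the actor, then reuse those weights on a creature rig whose corresponding shapes are sculpted for a different anatomy. The shape counts and the least-squares solve are illustrative assumptions, not Weta's actual toolset.

```python
# Hypothetical sketch of expression transfer via a digital double: solve for
# blendshape weights on the actor's digital face, then drive the creature rig
# with the same weights. Counts and the least-squares solve are assumptions,
# not Weta Digital's actual tools.
import numpy as np

rng = np.random.default_rng(1)
N_VERTS_ACTOR, N_VERTS_CREATURE, N_SHAPES = 1_000, 1_500, 60

# Blendshape deltas: each column is one expression shape (smile, frown, ...).
actor_shapes = rng.normal(size=(N_VERTS_ACTOR * 3, N_SHAPES))
creature_shapes = rng.normal(size=(N_VERTS_CREATURE * 3, N_SHAPES))
actor_neutral = rng.normal(size=N_VERTS_ACTOR * 3)
creature_neutral = rng.normal(size=N_VERTS_CREATURE * 3)

def solve_weights(captured_face: np.ndarray) -> np.ndarray:
    """Least-squares fit of blendshape weights to one captured actor frame."""
    delta = captured_face - actor_neutral
    weights, *_ = np.linalg.lstsq(actor_shapes, delta, rcond=None)
    return np.clip(weights, 0.0, 1.0)  # keep weights in the usual 0-1 range

def retarget(weights: np.ndarray) -> np.ndarray:
    """Drive the creature rig with the weights solved on the actor's double."""
    return creature_neutral + creature_shapes @ weights

# A captured frame of the actor (synthesized here), pushed onto the creature.
captured_frame = actor_neutral + actor_shapes @ rng.uniform(0, 1, N_SHAPES)
creature_frame = retarget(solve_weights(captured_frame))
print(creature_frame.shape)  # (4500,): flattened xyz positions for the creature mesh
```

The intermediate digital double matters because the fit happens against geometry that actually matches the actor; only the solved weights cross over to the creature.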

Notes Port of the collaborative effort required: "The pressure was on from day one. Thanos needed to hold up alongside the live-action characters."

This story first appeared in the Jan. 24 issue of The Hollywood Reporter magazine.