“The Bourne Ultimatum” (Universal)
The third film in the franchise took viewers on a high-octane chase last summer, requiring an array of visual effects to supplement the action-packed stunt sequences.
“We ended up with around 500 shots in the film,” says visual effects supervisor Peter Chiang. “In the tradition of all the previous ‘Bourne’ films, the visual effects requirement was to blend in seamlessly with the principal photography and, in particular, emulate the gritty, handheld style of a Paul Greengrass film.”
Chiang says that in situations where safety was the primary concern, foregrounds of the actors were shot against green screen with plates of the background derived separately.
“CG glass and bullet hits were added to a number of shots as well as CG cars for the chase in New York. The sequence at Waterloo Station needed a wide shot revealing the concourse through a tri-panel. The physical gap between each panel and the difficult location called for a CG solution.”
The biggest challenge, he says, was in transitioning from CG shots to pure live-action shots.
“We were very conscious of the subtleties in the original photography and included a lot more CG layers of reflections and dirt passes and spent more time on the grade.”
One particularly difficult scene involved a sequence in which Bourne reverses a car over the edge of the Port Authority car park. “The real stunt shot by Dan Bradley needed to be shot at a different location for safety reasons,” Chiang relates. “We were then required to map the whole event back into a Port Authority environment. Green screen shots of Matt Damon were combined with energetic second unit plates of hits on vehicles. CG cars, rig removals and adjustments to backgrounds were needed throughout.”
“The Golden Compass” (New Line Cinema)
The adaptation of the first installment of Philip Pullman’s best-selling “His Dark Materials” trilogy, which opens Dec. 7, centers on two children who live in parallel worlds surrounded by a huge cast of shape-shifting creatures. Bringing the creatures to life required roughly 1,200 visual effects shots and offered a number of technical and aesthetic challenges for visual effects supervisor Michael Fink and his team.
“They all speak, they all have fur and one of them changes shape,” says Fink, noting that the primary challenge in creating the creatures was that they were intimately interacting with their human counterparts.
A second major challenge, he says, came in creating the white bears who play a central role in the film.
“We had to render photo-real white bears who wear armor, which means the armor has to interact with their fur, has to move with their bodies and have weight of its own — and the bears talk,” he adds. “Doing a white bear with armor — in a white snowy environment — that talks is about as tough a technical challenge as anyone has ever tried. The bears are basically a gray scale — their shades run from white where the sun hits them, to nearly black in the deepest shadows. Matching their colors into a background and keeping it consistent from one shot to another can be terribly difficult.”
“Harry Potter and the Order of the Phoenix” (Warner Bros.)
Visual effects of all sorts were once again a critical element to bringing the world of Harry Potter to the screen. This time around, the most significant achievements by visual effects supervisor Tim Burke and his team were in creating the film’s many digital creatures — specifically the Centaurs.
“We had Centaurs in the first film, but I think audiences will see that they have come a long way since then,” says Burke. “They aren’t a half-man, half-horse combination. They are beings unto themselves.”
Combining believable human and horse torsos was a greater challenge than doing either one separately, says Ben White, CG supervisor for Framestore CFC.
“When the centaurs move around, they have to move in a convincing, horse-like manner, and yet still have the characteristics of a human — which is very tricky stuff to do, both in terms of animation and the movement of muscle and skin. To really push the realism of the characters, we developed a new method for doing sliding skin over underlying volume.”
Those techniques, says White, proved helpful in creating another CG creature — Kreacher, an elderly and somewhat sinister house elf.
“We took an entirely muscle-based approach to Kreacher’s facial animation system, extending the functionality of the tools we designed for the centaurs and taking them even further, to give him skin that’s appropriately soft and stretchy for such an elderly character.”
“I Am Legend” (Warner Bros.)
In this sci-fi thriller, Will Smith plays the last human survivor of a virus that either turns people into mutants or kills them outright. The film offered two major challenges for visual effects supervisor Janek Sirrs and his team. The first was in creating a version of New York three years into the future, which required the team to shoot in Manhattan and then clean out signs of life, including electricity and people. They also aged cloth flags and added cracks in the streets with plants growing through. The second challenge was in the creation of digital main characters known as “the infected.”
“They are 100% CG,” says Jim Berney, visual effects supervisor at Sony Pictures Imageworks. “There’s Alpha Male, Alpha Female as well as 43 others. … They look like human beings. They are just a little sick in their appearance and odd in their behavior. They are definitely a leap or two beyond the digital double.”
Close-ups of digital creatures were especially challenging, says Berney, particularly one of the film’s final scenes in which the Alpha Male is fighting with Will Smith.
“We motion-captured the movement, but then would go in and key frame the infected type of behavior. It’s a balance between the technology of capturing motion and the incredible amount of talent and artistry brought to life by these animators. (The digital characters) are hairless, the skin is semitransparent, and they are a little jaundiced. You can see through to a slight layer of muscle.”
“Pirates of the Caribbean: At World’s End” (Disney)
Can Captain Jack Sparrow repeat last year’s win and claim Oscar gold for the arsenal of visual effects techniques used in the trilogy’s final installment? “World’s End” offered a number of new challenges — most notably the maelstrom in the final climactic battle — says Industrial Light & Magic’s John Knoll, who served as the film’s visual effects supervisor.
“CG water is extremely difficult to do, especially when you have to do large quantities of it,” Knoll says, explaining that the shot count was much smaller in CG-water-heavy films such as 2000’s “The Perfect Storm” and last year’s “Poseidon.”
“We had to scale up our efforts to do a much larger sequence,” he says. “And it was a daytime sequence, so you can’t rely on shading of nighttime to hide a lot of details. And (because of) the shape of the maelstrom, you saw water at many different scales simultaneously. Typically, when you are doing a flatter ocean, there are a bunch of cheats you can do by putting detail in the foreground … but that’s not really possible when it’s a constant curvature and you see all different scales at the same time.”
In all, the team completed nearly 2,000 visual effects shots, the bulk in only four and a half months, which Knoll says was “the single hardest part of the movie. … There is no time for plan B, and by its nature so much of the work has never been done before.”
“Spider-Man 3” (Sony Pictures)
The second installment in the franchise earned an Oscar for visual effects, and this time visual effects supervisor Scott Stokdyk and his team pushed the bounds of combining character animation and effects animation even further.
“There’s been a lot of work done in recent years in terms of CG characters and environments,” he says. “The only place you can really go from there is making very interesting virtual content within the frame. To get that, we combined effects work with an underlying emotional performance.”
“There is an emotional core to everything that we do in the visual effects, and in particular the character animation,” says animation supervisor Spencer Cook. “I was very conscious of the choices in the body language that we made in the character animation to convey the thought processes and emotions.”
Stokdyk explains that the project also yielded two key R&D developments. The first was Sandstorm, internally developed software that allowed the team to manipulate and control the behavior of sand, from individual grains to hundreds of millions of grains; the second was software to help the character animators animate the symbiotic goo. “This tool allowed the animators, on a shot-by-shot basis, to create their own character and animate it,” he relates.
As to the challenges of CG sand, Stokdyk explains: “Sometimes it is a solid, and sometimes it behaves like a liquid, sometimes it behaves like a gas, so it undergoes a bunch of different state changes. A lot of research has been done into it because it is such a unique phenomenon.”
“Transformers” (Paramount/DreamWorks)
While there is no real front-runner, there is certainly some strong admiration for the giant robots in Michael Bay’s blockbuster “Transformers.” But can they transform into Oscar statues?
VFX pros have praised not just the robots, but the way in which they are seamlessly placed into the live-action environment.
Industrial Light & Magic visual effects supervisor Scott Farrar attributes the success of the VFX to three key elements.
First was the lighting of the robots and environments. “These are highly reflective objects, just like a car body,” he explains. “We had to reflect the environments into these guys and then add the dramatic lighting. We worked really hard to get that as real as possible. So photo-real lighting and photo-real objects (were critical elements). We had to develop layers of texture on elements such as a body panel. An artist had to put those layers in to see the light the way a real piece of aluminum would see light.”
Second, the visual effects team aimed to create very athletic robots. “Michael wanted them to be athletic, Ninja-style warriors,” Farrar relates. The team drew references from movies, new sequences filmed with stunt people, and even the animators themselves.
Lastly, these robots had to give a performance. “There were over 100 (robot) speaking parts, so they had to act, they had to be characters,” Farrar says. “Therefore, the complexity of the faces was driven very high. There were also 100 pieces on Optimus’ face that move to portray emotion.” Farrar estimates that Optimus Prime has a total of 10,108 parts. “He was a very complicated character.”