In returning to J.J. Abrams’ Star Trek universe, Industrial Light & Magic visual effects supervisor Roger Guyett wanted to extend the ideas and detail of ILM’s work on the first film.
“We wanted to find that number 11 on the dial,” he said of the work on Star Trek Into Darkness. “It gave us a great opportunity to revisit these pieces — keeping them very familiar but to reinvent them ever so slightly and give them a new level of energy.”
The film, which incorporated some IMAX photography and was also released in 3D, was given an immersive quality.
“We added a dimensional element to warp,” Guyett cited as an example. “When the ship goes into warp, we do a camera movement, which is a play on the concept of the Hitchcockian dolly zoom. [Just before] the ship goes into warp you get this lens distortion; in 3D we were able to pull the ship off the screen and into the audience, and when it is released we created these trails left behind and [the audience] travels through it.”
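For readers curious about the underlying trick: a dolly zoom counteracts a physical camera move with a focal-length change so the subject stays the same size on screen while the background perspective warps. The sketch below is an illustrative calculation only (not ILM's actual pipeline); the function name and parameters are hypothetical, derived from the standard thin-lens projection where on-screen size is proportional to focal length divided by subject distance.

```python
# Illustrative sketch of the dolly-zoom relationship (not ILM's tooling).
# Projection model: on-screen subject size ∝ focal_length / distance,
# so keeping the subject the same size while the camera moves requires
# scaling the focal length in proportion to the distance.

def dolly_zoom_focal_length(f0: float, d0: float, d: float) -> float:
    """Focal length needed at camera distance d so a subject framed
    with focal length f0 at distance d0 keeps its on-screen size."""
    return f0 * d / d0

# Example: a subject framed at 50mm from 10m away. Pulling the camera
# back to 20m means the lens must zoom to 100mm to hold the framing,
# while everything behind the subject appears to compress.
print(dolly_zoom_focal_length(50.0, 10.0, 20.0))
```

The perspective distortion the audience perceives comes from the background, whose apparent size does not obey the compensation applied to the subject plane.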
Simulated destruction included a volcano that erupts — with Spock in harm’s way — in the film’s opening sequence. “The way the molten lava reacts to different temperatures was an amazingly detailed and controllable simulation that allowed us to create amazing texture for the lava,” Guyett said. “It becomes a real character and is extremely threatening.”
That sequence also involved quite a bit of water simulation, including for the hero shot of the Enterprise rising out of the sea. “Water technology has really jumped exponentially in the last few years, and we were able to take advantage of technologies that were used for [ILM’s work on] Battleship.”
Work also involved a substantial amount of digital doubles — including the use of motion capture to create “a library of the movements.”
Virtual environments or set extensions figure prominently, including in the creation of the red forest. “The budget for the forest got smaller as the jungle got more complicated,” Guyett admitted, adding: “J.J. is very good at using small sets and making them feel big.”
Guyett related that they shot live action on a set that was just 40 ft. wide, and the rest of the environment was created in the computer. “That is an efficient and effective way of using your [production dollars],” he said.
A futuristic, architectural version of San Francisco also was generated in the computer. “When you are creating a future version, you know you’ll have to do a lot of augmentation. You can’t just walk onto the street and photograph it. We basically rebuilt San Francisco,” he explained.
Set extensions were used to bring scale to the interior of the Enterprise. “This time, we decided to build a much more complex series of corridors, which allowed [the audience] to travel with the crew around the ship a little bit more,” the VFX supervisor said. “There’s a large atrium that allows you to see down to the multiple levels of the ship, and it gives you an understanding of the scale of the ship.”
As to the technical aspects of the project, Guyett noted that much of the work involved use of a new rendering pipeline based around the Arnold ray tracer. “The science of the way we are lighting the ships is probably a little more accurate than what we achieved in the first movie,” he explained. “[With Arnold] you get a lot of secondary lighting effects. For instance, if we shine a light on a surface, the light bounces off that surface and may illuminate other parts of a setting. In the first movie, we achieved that, but we were cheating a lot of those effects. In this movie, the ray tracer does a more accurate representation of the way light is reflected off surfaces.”
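The secondary lighting Guyett describes is what renderers call indirect or bounce illumination. As a rough illustration (this is a toy calculation, not Arnold's implementation; all function names and the simplified Lambertian model are assumptions for the example), consider a patch B that receives no direct light but is lit by energy bouncing off a nearby patch A:

```python
import math

# Toy model of a single diffuse "secondary bounce" (not Arnold's code).
# A point light strikes patch A; A re-emits part of that energy
# diffusely, and some of it reaches patch B, which the light cannot
# see directly. A ray tracer accumulates many such paths per pixel.

def direct_irradiance(intensity: float, distance: float, cos_theta: float) -> float:
    """Irradiance from a point light: I * cos(theta) / d^2."""
    return intensity * max(cos_theta, 0.0) / distance ** 2

def one_bounce_irradiance(intensity: float,
                          d_light_a: float, cos_a: float, albedo_a: float,
                          d_a_b: float, cos_b: float) -> float:
    """Energy reaching patch B only via a diffuse bounce off patch A."""
    e_a = direct_irradiance(intensity, d_light_a, cos_a)
    # A Lambertian surface re-radiates albedo * E / pi as outgoing radiance;
    # treat the lit patch as a small secondary emitter aimed at B.
    reflected = albedo_a * e_a / math.pi
    return reflected * max(cos_b, 0.0) / d_a_b ** 2

# A darker patch A (lower albedo) bounces less light onto B --
# the effect Guyett says was hand-faked on the first film.
print(one_bounce_irradiance(100.0, 2.0, 1.0, 0.5, 1.0, 1.0))
```

In a production path tracer this single bounce is generalized to many bounces and many sampled directions, which is why the lighting falls out of the simulation rather than being cheated in by hand.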