1:43pm PT by Carolyn Giardina
VFX Pros Team Up to Advance the Realism of Digital Humans
A group of Hollywood visual effects pros who have dubbed themselves the “digital human league” are trying to raise the bar for what we think of as synthetic humans, and they aim to get their work out in early 2015.
If you remember early efforts to create CG humans in which the characters looked quite real but were off just enough to feel a bit creepy, then you understand the perceptual zone known as the “uncanny valley.” The term came up in conversations about CG humans around the release of films such as The Polar Express, and it's exactly what this “digital human league” is looking to overcome.
Many artists will tell you that it's possible to get very close to this goal if you throw enough time and money at a project. More recently, Christopher Nichols, creative director at Chaos Group (which develops V-Ray rendering software), wanted to pursue synthetic human work using V-Ray; at the same time, similar ideas were being explored by Paul Debevec, chief visual officer at USC’s Institute for Creative Technologies (ICT).
They formed a group that includes additional participants from ICT; VFX pros including Academy Award winner Steve Preeg (The Curious Case of Benjamin Button); and Angela Tinwell, a researcher who authored a book titled The Uncanny Valley in Games and Animation.
“We want to take what we already know, apply this knowledge, and try to tackle the uncanny valley in a much more analytical way,” Nichols told The Hollywood Reporter, adding that this involves looking for all the subtleties that make a face feel alive. “With the uncanny valley, artists tend to overreach to overcorrect the problems. But if we take scientific data, there’s no reason not to get a correct [starting point], and then have artists do what they are good at.”
He calls this effort the “Wiki Human Project,” aimed at overcoming the uncanny valley by developing “a reference for how to make digital humans in a consistent way.”
Said Nichols: “We are going to open source the data and give it to the public to test it and see how to use it.” He expects the first version to be available to the public by next March.