VFX house Digital Domain used AI as part of its Masquerade facial capture system to superimpose Josh Brolin’s performance onto the villain Thanos in Avengers: Infinity War. “We’ve seen huge advancements in that area of research lately,” says Darren Hendler, director of the studio’s digital human group. “Real-time rendering together with our AI-driven, real-time facial capture system is changing the timelines for creating movie creatures. Soon we’ll be able to generate realistic Thanos-like performances live on set.” Hendler says a suite of deep-learning technologies on the horizon “can potentially allow us to quickly and realistically transform a stunt double into an actor or change an actor’s lines.”
AI is having a “drastic” effect in the world of digital humans, says Christopher Nichols, director of Chaos Group Labs and a key member of the Digital Human League, a research and development group: “The massive challenges that were put on artists and tools to find all those subtle nuances that make a human look real are actually much easier to solve through deep learning.” The improvements have also led to deepfakes, such as the viral faux PSA using FaceSwap AI in which Jordan Peele ventriloquizes former President Barack Obama calling President Trump a “complete dipshit.” Says Nichols: “A great number of people want to stop the technology and outright outlaw it. [But] the tools themselves have huge potentials that we should all continue to explore.” He concludes, “What I believe is that we need more education on the subject and better tools that help detect what is real and not real. We have come to accept that Photoshop can manipulate images very well and easily, and we are not banning that as a tool. But we have come to understand and doubt every image that is presented to us.”
Read more in The Hollywood Reporter.
Images © Marvel Studios
