Powered by Machine Learning, the Photorealistic Digital Human Can Answer Questions and Mimic Moods in Real Time; FMX Demo Running on Unreal Engine
Today at FMX, Digital Domain, the Oscar-winning VFX studio and a leader in digital human technology, announces “Zoey,” the world’s most advanced autonomous human. Powered by machine learning and created with an advanced version of the technology and process that helped bring Thanos to the big screen, the photorealistic Zoey can converse with multiple participants at once, remember people, access the internet to answer questions and more, paving the way for the next step in the evolution of AI.
“Autonomous humans are poised to become a common part of modern life, and as AI continues to expand what they can do, combined with the potential of the metaverse, the desire to interact with them face-to-face is becoming more and more important,” said Daniel Seah, Global CEO of Digital Domain. “For decades, Digital Domain has been pioneering digital human technology, so creating the most advanced autonomous humans possible was a natural step for us. Zoey takes the concepts of virtual assistants like Alexa and Siri several steps further, creating a digital helper you can truly interact with.”
Building on its proof-of-concept autonomous human, “Douglas,” Zoey is the result of years of research and development at the award-winning visual effects house, spearheaded by its internal Digital Humans Group. Zoey’s physical appearance is based on actress Zoey Moses, who worked with Digital Domain to capture a comprehensive set of facial movements and mannerisms, as well as a full range of emotive expressions. Using that data and its proprietary facial animation tool “Charlatan” — recently seen in blockbuster films, including the Oscar-nominated Free Guy — artists took the real footage of Moses and created a flexible digital face capable of reacting in real time.