Vicon Brings Real-Time, Full-Body Tracking Down to the Fingertips With Shōgun 1.3

Full-Body Performance Capture Data Streamed Directly Into Game Engines, New File Formats Added; Live Booth Demo at SIGGRAPH 2019.

Los Angeles, Calif., July 29, 2019

Vicon today announces the upcoming release of Shōgun 1.3, the latest version of its industry-leading performance capture platform for entertainment, including games and VFX. Shōgun 1.3 introduces the world’s most precise finger solving, enabling artists to create and see fully animated characters in real time, saving both time and money. Users will also be able to stream their own characters directly into game engines, and support for the Universal Scene Description (USD) format has been added, opening the door to mobile devices and augmented reality.


The addition of the new finger solving in Shōgun 1.3 means that users can record the entire body – from skeletal movements to the smallest hand gestures – for use in projects ranging from Hollywood blockbusters to AAA games. To create this highly sophisticated finger solving, Vicon partnered with Framestore, the Academy Award-winning VFX studio whose resume includes films like Gravity, Blade Runner 2049 and Spider-Man: Far From Home.

“For several years we’ve worked with Vicon on a host of projects, so the opportunity to work with their team to address something that artists have been waiting on for years was an easy decision,” said Richard Graham, Framestore’s virtual production supervisor. “We have already successfully deployed it on a number of projects, and given that it is part of the whole-body solve, it fits straight into our real-time and offline pipelines.”


For more than 18 months, Vicon and Framestore worked to perfect Shōgun’s finger tracking, based on a dense 58-marker set capable of tracking finger and knuckle movements. Users can choose a reduced marker set to animate the fingers; that data can then be combined in real time with a user’s digital rig, producing a fully animated digital character capable of intricate movements, from writing a letter to playing an instrument. A process that used to take weeks of painstaking animation can now be done instantly.


“Capturing full five-finger motion capture data has been at the top of customers’ wish lists,” said Tim Doubleday, Vicon product manager. “This is a milestone for Vicon and motion capture.”

New Retargeting Pipeline for Game Engines

Along with finger solving, Shōgun 1.3 lets users stream data directly into a game engine – including both Unity and Unreal Engine 4 – without the need for third-party packages. Users can retarget a performance onto any FBX skeleton in Shōgun, and within seconds see an animated version of their character, bringing full-performance digital avatars into the game engine. Through virtual production workflows, users can then change settings on the fly and alter the scene in real time as needed. This allows for a new form of film and game development, delivering intricate and realistic projects in a fraction of the time.


New File Formats

Shōgun 1.3 is now the only motion capture platform on the market that can export skeletal data using the new Universal Scene Description (USD) format, a file type currently used by major VFX companies around the world, including ILM, Framestore and Pixar. By using the USD format, Shōgun 1.3 can export data directly to iOS devices, giving users the ability to view Vicon’s data directly on an Apple device, while also supporting Apple’s new ARKit technology for use in augmented reality.


Shōgun Live & Post Highlights

Shōgun Live users will find several improvements to the overall functionality, including multi-machine support, which scales processing across multiple PCs and improves performance for large captures. The addition of camera mask painting also means users can block out sources of background noise, such as reflective objects and stray light sources.


Shōgun Post will feature a new gap list function, allowing users to identify individual performers and isolate a single performer’s data. From there, Shōgun will identify any gaps in the performer’s movements and automatically fill them in based on the expected movement. A finger marker missing from a waving hand will be reconstructed, and a gap in a moving fist will be filled, all without forcing artists to spend precious time filling in these gaps one by one.

Shōgun 1.3 Showcased at SIGGRAPH

At SIGGRAPH 2019, Vicon will demonstrate the Shōgun platform by letting attendees direct their own animated movie starring live actors, all in real time. The actors will work with the attendees while utilizing the latest virtual production tools, including real-time capture from Apple’s new ARKit technology.


All data will be captured in Shōgun 1.3 and streamed into Epic Games’ Unreal Engine. The actors will appear as animated characters, with digital assets added to the scene courtesy of the Epic Marketplace. Something that normally takes weeks or months can now be done instantaneously.


Vicon will host its interactive SIGGRAPH demo July 30 – August 1, with sessions taking place at Vicon’s booth (#741) throughout each day. Between sessions, Vicon’s Tim Doubleday will offer a closer look at the new tools, with scheduled demos on Tuesday and Wednesday at 4 p.m. and Thursday at 2:30 p.m.



Shōgun 1.3 is currently in closed beta, with a roster of users that includes Electronic Arts, Framestore, ILM, Pixar, Ubisoft and more. The beta will be available to the public in September, with the full version expected later this year.