Mocap-Driven, Real-Time Digital Fashion
Real-time cloth simulation has long been the holy grail for digital fashion experiences.
FIA collaborated with Digital Domain, a world-leading visual effects company that pushes the boundaries of digital storytelling, to show the potential of real-time cloth simulation within a virtual fashion environment.
Visual effects technology has developed to the point where digital humans are photorealistic and widely used, but the simulation of their clothing has often lagged behind. The ability to model cloth in real time at the same level as movie special effects opens up the possibility of a new era of fashion experiences, blending the physical and digital in seamless ways. This includes more compelling versions of virtual try-on, motion-capture-driven performances, immersive live events and other mixed reality experiences.
Working with Digital Domain’s proprietary machine learning algorithms, we trained a model to produce cloth deformations in real time from the movements of a live performer. Once the model was trained, skeletal data from a motion capture suit was streamed directly into Unreal Engine, driving a virtual character and generating realistic cloth simulation within a virtual environment.
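To make the data flow concrete: each streamed mocap frame is a skeletal pose, and the learned model maps that pose to per-vertex offsets on the garment mesh. The sketch below is a minimal, hypothetical illustration in Python with NumPy; the joint counts, vertex counts, and the single linear layer are stand-ins (Digital Domain's actual model and dimensions are proprietary and not described in detail here).

```python
import numpy as np

# Hypothetical dimensions: a production rig and garment would be far larger.
NUM_JOINTS = 20              # skeletal joints streamed from the mocap suit
POSE_DIM = NUM_JOINTS * 3    # one axis-angle rotation vector per joint
NUM_VERTICES = 500           # garment mesh vertices

rng = np.random.default_rng(0)

# Stand-in for a trained model: a single linear layer mapping the pose
# to per-vertex offsets from the garment's rest shape. This only
# illustrates the data flow, not the real (proprietary) network.
W = rng.normal(scale=0.01, size=(POSE_DIM, NUM_VERTICES * 3))
rest_shape = rng.normal(size=(NUM_VERTICES, 3))

def deform_cloth(pose: np.ndarray) -> np.ndarray:
    """Map one frame of skeletal pose data to deformed cloth vertices."""
    offsets = (pose @ W).reshape(NUM_VERTICES, 3)
    return rest_shape + offsets

# One streamed frame: in production this would arrive from the mocap
# suit over the network each tick and drive the character in Unreal Engine.
frame_pose = rng.normal(size=POSE_DIM)
vertices = deform_cloth(frame_pose)
print(vertices.shape)  # (500, 3)
```

Because inference is a single forward pass per frame rather than a full physics solve, this style of learned deformation is what makes film-quality cloth feasible at interactive frame rates.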
Explore new possibilities
Get in Touch