Creative Bloq

    How to animate a character with mocap

By Paul Hatton, 2 days ago
    (Image credit: Paul Hatton)

Live link motion capture is becoming an increasingly popular technology within virtual production workflows. High-end, expensive hardware and software are barely a requirement any more, with free or inexpensive options now widely available.

The applications for this tech are widespread, including film, TV, archviz, advertising and educational video. Some cases will still need higher-end solutions, but for much of the industry these emerging tools will begin to take centre stage. Projects that could have benefited from some form of custom animation but didn’t have the budget will now find their options wide open.

A number of AI tools for filmmaking are breaking through, and one example of this tech is Wonder Studio, an AI tool that transfers motion-capture data onto a CG character for use in a live-action scene. It does all the hard work of generating motion-tracking data, roto masks and clean plates, and finally replacing a live actor with a CG character. Such developments are helping to revolutionise the way directors and creators generate their content. Tools such as these are drastically reducing film and TV budgets, and making it possible to get live feedback on set.


Another example, this time with a game engine, is Live Link Face for Unreal Engine, which enables animators to capture facial movements to be used with MetaHuman Animator. This can all be achieved in real time, but is unfortunately limited to use on iPhones and iPads.

The raw video and depth data captured can be applied to any MetaHuman character, offering a quick workflow for creating virtual content from real human performances. It’s also possible, via Apple’s ARKit, to stream data live to Unreal Engine, removing unnecessary steps between the actor and the final product.
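To illustrate the data flow behind this kind of live streaming, here is a minimal Python sketch that packages per-frame face-tracking weights (ARKit-style blendshape coefficients in the 0.0–1.0 range) as JSON and sends them over UDP. The host, port and message format are all illustrative assumptions — Unreal’s Live Link uses its own wire protocol, not this one — but the shape of the pipeline (capture a frame, serialise, fire it at the engine) is the same.

```python
import json
import socket

# Placeholder destination for the receiving application (assumption,
# not a real Live Link endpoint).
ENGINE_ADDR = ("127.0.0.1", 54321)

def make_frame(frame_index, weights):
    """Package one frame of blendshape weights as a UTF-8 JSON payload.

    `weights` maps blendshape names (ARKit-style, e.g. "jawOpen")
    to coefficients in the 0.0-1.0 range.
    """
    return json.dumps({"frame": frame_index, "weights": weights}).encode("utf-8")

def send_frame(sock, frame_index, weights):
    """Fire one frame at the engine; UDP keeps per-frame latency low."""
    sock.sendto(make_frame(frame_index, weights), ENGINE_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # One frame of sample coefficients: jaw slightly open, faint smile.
    weights = {"jawOpen": 0.12, "mouthSmileLeft": 0.4, "mouthSmileRight": 0.38}
    send_frame(sock, 0, weights)
    sock.close()
```

UDP is the natural choice here: dropping an occasional frame of facial data is harmless, whereas the head-of-line blocking of a reliable stream would add visible lag to a live performance.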

RADiCAL makes use of similar technology, but applies it to a whole character rather than just the face. It’s also not limited to MetaHuman characters: it works with any character that’s properly rigged and set up. The motion-tracking data can be linked into Unity, Unreal Engine and a number of other applications. If a character model includes blendshapes or face morphs and is connected to RADiCAL’s Quick Rig retargeter, you can use face tracking in your scene without the need for any extra equipment.
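The retargeting step described above boils down to carrying captured weights across to a rig whose morphs use different names. The sketch below shows that idea in plain Python; the source and target names are invented for illustration and do not reflect RADiCAL’s actual Quick Rig conventions.

```python
# Hypothetical name map from capture-side blendshape names (ARKit-style)
# to a target character's morph names. The target names are made up
# for illustration.
NAME_MAP = {
    "jawOpen": "Jaw_Open",
    "mouthSmileLeft": "Smile_L",
    "mouthSmileRight": "Smile_R",
}

def retarget(source_weights, name_map, target_shapes):
    """Return weights keyed by the target rig's morph names.

    Morphs with no captured counterpart rest at 0.0, and every carried
    weight is clamped to the valid 0.0-1.0 range.
    """
    out = {shape: 0.0 for shape in target_shapes}
    for src_name, weight in source_weights.items():
        tgt = name_map.get(src_name)
        if tgt in out:
            out[tgt] = max(0.0, min(1.0, weight))
    return out
```

For example, `retarget({"jawOpen": 0.2, "mouthSmileLeft": 0.4}, NAME_MAP, ["Jaw_Open", "Smile_L", "Smile_R"])` drives `Jaw_Open` and `Smile_L` while leaving `Smile_R` at rest — which is why a well-defined mapping, rather than extra hardware, is all the face-tracking step needs.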
