Digital Puppets
It's been a while since I've journaled my interests and research... one big area I explore when I get the time is Digital Puppets. This has continued to grow as an area over the last few years, from lots of subvertable content like Spark AR or Snapchat filters through to the explosion of VTubers.
Anywho, the big moment in the summer of 2021 was the release of MetaHumans for Unreal Engine.
Some impressive visual quality, harnessing Unreal Engine's realtime rendering... and the tracking app on the iPad, Live Link Face, gives cheap realtime facial tracking that can be mapped onto the MetaHuman, or any appropriately rigged character in Unreal. The MetaHuman experience itself is reminiscent of Adobe Fuse and Autodesk Character Generator of old, just taking it to the next level with rigging, blend shapes and some excellent hair simulation. So this isn't a revelation in itself, but it is part of the evolution of creating a Digital Human / Clone. Quixel Bridge is the tool that links the cloud-based avatar editor for MetaHuman to the export for Unreal, or to a Maya-editable mesh.
When playing in Maya 2020 (2022 isn't supported yet), you need to enable DirectX shaders to see the content properly, though hair isn't exported. More play to be had, but I can see MetaHumans quickly becoming the Comic Sans of explainer avatars, so there needs to be more scope to create content away from the Uncanny Valley.
Which leads me onto my personal favourite development of recent weeks: the addition of Body Tracking (finally!) to the awesome Adobe Character Animator. This is currently in Beta, so it's a bit of a work in progress and glitchy. But if you work within its limitations, it opens up lots of new opportunities for quick-turnaround characters for explainer animations or web series.
Here's a quick tryout:
Something that will hopefully be worked on in the future, or achieved through some clever hacks of the layers/swapsets, is foreshortening of limbs. With body tracking currently, it does make you work within the limitations and plane of the puppet's design. There's loads of additional information here to get up and running, so check it out. It's in beta at the moment, so community feedback is important.