Solomon Jagwe has told us about creating realistic characters with MetaHuman and setting up facial animations.
Introduction
My name is Solomon W. Jagwe, a Film Director, 3D Artist, and Animator from Uganda, currently based in the USA. I am the Co-Founder of Sowl Studios, an Art and Animation Studio based in Alexandria, Virginia, USA, and the Creator and Director of the Adventures of Nkoza and Nankya.
My early education in primary and secondary school was in Uganda. I came to the USA on an art scholarship. I went to Ohio Valley University, Montgomery College, and George Mason University, where I majored in Art and Visual Technology. I worked in the defense industry for more than 15 years, creating content for video games, cinematics, real-time strategy simulations, and soldier training. This page has some more info on the kinds of games we developed for the military. The defense contractor studios I worked for include Rival Interactive, Cornerstone, SAIC, and Camber.
From 2016 to 2018, I worked as the Art Director for a virtual reality studio based in DC called Floreo Tech, where we made virtual reality content to help children on the autism spectrum develop social skills in a safe environment. I transitioned to working full-time on the Adventures of Nkoza and Nankya in 2018 and created an art and animation startup together with my wife to produce content for the series.
Starting the Speech Experiment
I had an idea for a short animated film featuring two kids and their father, portraying a father's dream of a better world for his children. I wanted the characters to be as realistic as possible, and fortunately, I was granted Early Access to Epic Games' MetaHuman Creator, which featured incredibly realistic 3D characters.
Setting Up the Character
I usually start with a sketch when creating 3D characters, but in this case, I simply jumped into MetaHuman Creator, waited in line for my turn, and planned at least 25 minutes of creation time for each character, since you can only do one-hour sessions per turn, so I had to move fast. I did my best to plan ahead in terms of what I wanted the characters to look like. I spent more time on the Father character because he was the first one I wanted to animate.
Here is a video of how I used MetaHuman Creator to create a character:
Making Facial Animations
When MetaHuman Creator first arrived on the scene, we were limited to using the Live Link Face plugin and app from Epic Games to create the facial animation. In those early days, there were limited options to fine-tune the animation produced by the iPhone. When iClone 7.91 was released with an updated Live Face app and Motion Live supporting ARKit natively in iClone, a world of possibilities was unleashed. The addition of AccuLips meant that users could now animate character faces more accurately using audio tracks and text. It was text-to-speech taken to another level.
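As a rough illustration of the data these capture apps work with: ARKit on an iPhone with a TrueDepth camera exposes a set of facial blendshape coefficients every frame, and it is these weights that apps like Live Link Face stream to the desktop tools. Here is a minimal Swift sketch of reading a couple of those coefficients, assuming a running face-tracking session; it is illustrative only, not part of the production pipeline described here:

```swift
import ARKit

// Minimal sketch: ARKit delivers facial blendshape coefficients
// (e.g. jawOpen, eyeBlinkLeft) as 0.0-1.0 weights per frame.
// Requires an iPhone with a TrueDepth (Face ID) camera.
class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let blink   = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blink)")
        }
    }
}

// Usage: keep a strong reference to the delegate (ARSession's is weak),
// then run a face-tracking configuration.
let captureDelegate = FaceCaptureDelegate()
let session = ARSession()
session.delegate = captureDelegate
session.run(ARFaceTrackingConfiguration())
```

Capture apps forward these per-frame weights over the network, where they are retargeted onto the character's facial rig.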
This new update of the iClone MetaHuman kit bridges that gap, making it possible to animate MetaHuman characters in Unreal Engine using all the new iClone animation tools like AccuLips, Face Key, and Face Puppet. Here is a step-by-step video showing how I used iClone to drive the facial animation of the MetaHuman character:
With the current Live Face plugin in iClone, especially the version for iClone 7.91 and above, the iPhone profile has been greatly improved, to the point where you get very good results even without much tweaking. What's even better is that after a sequence is recorded, you can still fine-tune the facial animation using the Face Key and Face Puppet tools in iClone.
Conclusion
I think this workflow is going to be extremely helpful to indie filmmakers and storytellers, especially those on a budget with small teams, in bringing our stories to life. It's going to be easier to use MetaHumans to bring concept ideas to life, as well as to create final animations for projects that animators wish to pitch to producers.
Currently, MetaHuman Creator is an online platform, which means you need a good internet connection to use it at full quality. Many artists around the world don't have access to good bandwidth, so I am hoping that an offline version will be made available by Epic Games.
Another limitation has to do with access to hardware like iPhones, which most people can't readily afford, since you need an iPhone X or higher with a depth-sensing camera. Fortunately, one can still animate MetaHumans with the built-in iClone animation tools even without an iPhone.
My hope is that an Android version that works with Motion Live and Live Face can be introduced. So many users have Android phones that they wish they could use to create facial animation.