Filmmaker Richard Mans shares his real-time workflow for creating captivating 3D vertical shorts on TikTok, Instagram, and YouTube.
Richard Mans/FUZZYREALMS
Richard Mans is a New Zealand-based artist and filmmaker renowned for his fantastical short-form videos. His 2011 animated short film “Abiogenesis” garnered widespread acclaim, winning awards and enjoying an extensive festival run worldwide. He has also created animations and user experiences for prototype display technologies in the automotive industry, where he continued to innovate.
Excited by the possibilities of high-quality real-time rendering, he is now dedicated to creating captivating short-form videos. Fuzzyrealms serves as his online portfolio, showcasing his latest works. Passionate about crafting brief glimpses into fantastical worlds, Mans finds inspiration in daily creativity. Follow him on Instagram, YouTube, Facebook, TikTok, or subscribe to his mailing list for updates.
Act Out Your Dreams
My initial idea is very loose. No storyboards; I just dream and edit shots in my head. Once I settle on an idea, I put on my motion tracking suit (I use a Perception Neuron 3 suit) and start acting out the characters in the scenes. This process helps me refine the idea further.
I record my motion capture directly in iClone using the Motion Live plugin. I find that seeing my motion capture in real time on a human character helps me troubleshoot issues with the performance that I might not spot on a plain Motion Dummy.
Refine Your Performance With iClone
Two tools in iClone that I always use for editing my motion capture are the Smooth Filter and the Edit Motion Layer tool. The Smooth Filter (found in the Curve Editor) eliminates the unwanted jittering noise that is usually present in raw motion capture data. Sometimes I don’t even notice the jitter until the final render, where it looks like the character is skipping frames, so I find it’s good practice to start with the Smooth Filter.
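As a rough illustration of what a smoothing pass does (this is not iClone’s actual filter, which is applied in the Curve Editor), here is a minimal Python sketch of a moving-average filter on a single mocap channel; the window size and sample values are made up.

```python
# Conceptual sketch only -- iClone's Smooth Filter is applied in the Curve Editor,
# not via code. This just shows how averaging neighboring frames removes jitter.
def smooth_curve(values, window=5):
    """Moving-average smoothing of one mocap channel (e.g. a joint rotation in degrees)."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

# Example: a slow arc with high-frequency jitter on top
raw = [10.0, 10.6, 9.8, 11.1, 10.4, 11.9, 11.0, 12.6, 11.8, 13.1]
print(smooth_curve(raw))  # the spikes flatten out while the overall arc remains
```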
I use the “Edit Motion Layer” tool to fix more obvious issues, like a hand passing through the character’s leg, and to correct the character’s posture.
Give It Soul
Once I’m happy with the edited motion capture, I blend various facial animation clips (from Reallusion’s “Digital Soul” pack) in the timeline. This way I can quickly create a natural-looking facial performance that matches my motion capture.
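Conceptually, blending clips on the timeline is a crossfade: over the overlap, the outgoing clip’s influence ramps down while the incoming clip’s ramps up. The sketch below (plain Python with made-up blendshape weights, not a Reallusion API) shows that idea.

```python
# Conceptual sketch -- the actual blending happens on the iClone timeline, not in code.
def crossfade(clip_a, clip_b, overlap):
    """Blend the tail of clip_a into the head of clip_b over `overlap` frames."""
    blended = list(clip_a[:-overlap])
    for i in range(overlap):
        w = (i + 1) / overlap                      # ramps 0 -> 1 across the overlap
        blended.append((1 - w) * clip_a[-overlap + i] + w * clip_b[i])
    blended.extend(clip_b[overlap:])
    return blended

# Example: one blendshape-weight curve fading into another (e.g. a smile into a talk loop)
smile = [0.0, 0.3, 0.7, 1.0, 1.0, 0.9]
talk  = [0.2, 0.4, 0.3, 0.5, 0.2, 0.1]
print(crossfade(smile, talk, overlap=3))
```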
Simulate a Crowd in iClone
Looking at reference videos (on YouTube) of busy streets in New York City reminds me that there’s often a flow of pedestrians walking down a street. For example, groups of people moving in one direction tend to stick to a certain side of the path. So I create two paths with characters flowing in opposite directions to achieve this effect, then add a few more people on a third path in the distance to give the scene more depth.
Creating crowds of characters can be taxing on system resources, so ideally every character should end up in the final camera shots rather than go to waste off-screen. For that reason, I restrict the paths to the areas where I imagine the camera shots will go.
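Here is a rough sketch of that layout in plain Python (the paths themselves are authored in iClone; the coordinates and pedestrian counts below are made up):

```python
# Conceptual layout sketch with made-up coordinates -- in practice the paths are
# authored in iClone and the crowd characters are assigned to follow them.
def place_pedestrians(count, start, end):
    """Evenly space `count` characters along a straight path from start to end (meters)."""
    (x0, y0), (x1, y1) = start, end
    step = 1 / (count - 1) if count > 1 else 0
    return [(x0 + (x1 - x0) * i * step, y0 + (y1 - y0) * i * step) for i in range(count)]

# Two opposing flows, one on each side of the sidewalk, kept inside the area the
# camera will actually see, plus a sparser third path in the distance for depth.
flow_right = place_pedestrians(8, start=(0, 1.0), end=(40, 1.0))     # walking toward +X
flow_left  = place_pedestrians(8, start=(40, -1.0), end=(0, -1.0))   # walking toward -X
background = place_pedestrians(4, start=(0, 15.0), end=(40, 15.0))   # distant third path

for group in (flow_right, flow_left, background):
    print(group)
```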
Send Everything to Unreal Engine
Transferring everything to Unreal Engine is very straightforward using iClone’s Unreal Live Link plugin. First, under “Transfer File Settings”, I deselect “Place Actors in Scene” and select “Bake all animations to sequencer”, then transfer my two main characters. Next, I change the “Transfer File Settings” to “Exclude Morph Targets” and transfer all the other characters (the crowd). This way everything is optimized and organized neatly into sequences in Unreal Engine.
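If you want to double-check what arrived, Unreal’s editor Python API can list the Level Sequences the transfer created. This is only a hedged sketch: the /Game/iClone content folder below is an assumption, so substitute whatever folder the Live Link transfer actually used.

```python
# Editor-only sketch (run from Unreal's Python console).
# The "/Game/iClone" folder is an assumption -- point it at the folder the
# Unreal Live Link transfer actually created in your project.
import unreal

for path in unreal.EditorAssetLibrary.list_assets("/Game/iClone", recursive=True):
    asset = unreal.load_asset(path)
    if isinstance(asset, unreal.LevelSequence):
        print("Level Sequence:", path)
```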
With everything transferred, I play with my cameras in Unreal Engine until I’m happy with the shots and the camera cuts in Sequencer.
Dress It Up With Cargo
With the performances and cameras in place, I use Cargo (a plugin by KitBash3D) to dress up the shots. Cargo gives me access to a huge library of assets that I can instantly import into Unreal Engine. I simply select the asset in the Cargo interface, click import, and after a few seconds the asset loads in Unreal Engine with all the materials applied.
Finishing Touches and Thoughts
I add color grading and VFX elements (like the dust particles at the end of this short) using Adobe After Effects, then add sound effects and music using Adobe Audition. I love playing around with the sound and music; it’s where the short really comes to life.
Thanks for joining me through this process, and I hope you’ve gained some valuable insights into creating short-form animations. With tools like iClone, Unreal Engine, and KitBash3D, the possibilities are endless. So, whether you’re an aspiring filmmaker or a seasoned pro, keep pushing the boundaries of creativity and never stop dreaming big! Please check out my social media channels to see more of my work.
If you are interested in generating amazing crowd scenes using creative tools, please check out the original post for more.