
Mortal Kombat 11: Blood Production in Houdini

Matt Battaglia from NetherRealm Studios shared the blood production workflow for Mortal Kombat 11 and discussed data optimization when working with meshes, the studio’s geometry cache system, and its advantages.

Introduction

Hi! I’m Matt Battaglia, Technical Art Lead at NetherRealm Studios. I’ve been around the industry for about 15 years now, starting out of college with my first job working on the cartoon “Jimmy Neutron.” I got my start as a modeler and environment artist but always had an interest in more technical disciplines. I joined NetherRealm Studios about five years ago and have worked on Mortal Kombat X, Injustice 2, and Mortal Kombat 11. I now lead a very talented Technical Art team here that covers tools, R&D, rigging and character setup, character cloth and simulation, in-engine setup, scripting, and some special-purpose VFX work.

Creating Realistic Blood Effects

We don’t actually use vertex animation textures. That technique did allow us to prototype and build a proof of concept, but we ultimately developed an Alembic-based import pipeline and store that geometry data in a proprietary optimized format we call a geometry cache. The big advantage of using geometry caches is that we’re not faking or approximating the look of fluids; we’re actually doing real fluid simulation and bringing that into the game engine. None of the work we did with more traditional VFX techniques resulted in truly believable fluid motion, especially when it’s close to the camera and plays in slow motion.

Using Alembic

At a high level, Alembic is an open-source geometry format that originated in the film industry. In our case, we cache out complex fluid simulations and convert those meshes into optimized, triangulated game geometry. Using Alembic as an import pipeline between external tools and our game engine gives us a much more optimized, easy-to-use import/export path, since many tools have built-in support for the format.
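As a rough illustration of what that conversion step involves, here is a minimal sketch of fan-triangulating the polygon faces of a cached frame into a flat triangle index buffer, the kind of data a runtime mesh needs. The function name is hypothetical, and the face_counts/face_indices layout (per-face vertex counts plus a flat index list, which is how Alembic stores polygon meshes) is assumed for illustration; the studio’s actual converter and its optimizations are proprietary.

```python
from typing import List, Sequence

def triangulate_faces(face_counts: Sequence[int],
                      face_indices: Sequence[int]) -> List[int]:
    """Fan-triangulate arbitrary polygon faces into a flat triangle index list."""
    triangles: List[int] = []
    cursor = 0
    for count in face_counts:
        # Indices of the current polygon face.
        face = face_indices[cursor:cursor + count]
        # Fan triangulation: (v0, v1, v2), (v0, v2, v3), ...
        for i in range(1, count - 1):
            triangles.extend((face[0], face[i], face[i + 1]))
        cursor += count
    return triangles

# Example: one quad (0, 1, 2, 3) becomes two triangles.
print(triangulate_faces([4], [0, 1, 2, 3]))  # [0, 1, 2, 0, 2, 3]
```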

Blood Library

About the Mesh Flipbook 

We use the term “Mesh Flipbook” to describe what we call heterogeneous mesh sequences. The easiest way to understand this is to consider the opposite type of mesh sequence first, which we call homogeneous. If you model a cloth flag and run a wind simulation through it, the topology of that flag stays the same between frames. The core structure of that flag geometry, like how each vertex is connected to the others, stays consistent frame to frame. Only the per-vertex attributes, like the position, normal, and tangent vectors, are updated every frame.

Now for a heterogeneous mesh, consider a fluid simulation instead of a flag, like an object splashing into a pond. That geometry could start out as a simple planar surface but then completely change, with splashes that tear off into separate droplets and then all merge back together into a calm surface. That topology, how each vertex is connected, can completely change every frame. Not only do we have to store all the same data as in the homogeneous case, but we have to store and rebuild the mesh connectivity every frame as well. The biggest challenge with heterogeneous meshes is the extra data that we must store, and we lose the ability to use traditional animation compression methods to reduce memory usage.

If you’d like to dive into the more technical aspects of these geometry caches, I presented a SIGGRAPH 2019 talk with a programming colleague that goes into the specifics, titled “Mortal Kombat 11: High Fidelity Cached Simulations in Real-Time.” The slides are published here.
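To make the distinction concrete, here is a minimal sketch, with hypothetical class and field names, of the data each kind of sequence has to store: a homogeneous sequence shares one index buffer across all frames, while a heterogeneous “mesh flipbook” carries its own connectivity in every frame.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HomogeneousFrame:
    # Topology is shared across the sequence, so a frame only stores
    # per-vertex attributes (positions, normals, tangents).
    positions: List[Vec3]
    normals: List[Vec3]
    tangents: List[Vec3]

@dataclass
class HomogeneousSequence:
    indices: List[int]              # connectivity stored once
    frames: List[HomogeneousFrame]  # only attributes animate

@dataclass
class HeterogeneousFrame:
    # A fluid mesh can change topology every frame, so each frame also
    # carries its own rebuilt index buffer.
    positions: List[Vec3]
    normals: List[Vec3]
    tangents: List[Vec3]
    indices: List[int]

@dataclass
class MeshFlipbook:
    # "Mesh flipbook": per-frame topology, no temporal compression.
    frames: List[HeterogeneousFrame]
```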

Blood Previs

As far as generating all of those meshes, we used pretty standard techniques in Houdini. The simulations themselves were done with FLIP, then converted to VDB volumes and then meshed. The trick to getting everything into a 155 MB budget is a combination of three things: optimization of the data, which we cover in our SIGGRAPH presentation; being very careful and efficient with poly reduction of our simulation meshes; and carefully designing a library of very flexible simulation shapes that we can re-use all over the game. At runtime, all of the geometry cache data is sampled at the vertex shader stage, which should be familiar to anybody who has done world position offset material work in Unreal Engine. That computation is extremely fast, which makes this solution viable for our 60 FPS fighting games.
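For context, a bare-bones version of that surfacing and export chain might look like the sketch below, built with Houdini’s Python API. It assumes the FLIP particles have already been cached to disk at a hypothetical path, and node and parameter names can differ between Houdini versions, so treat it as an outline rather than the studio’s actual setup.

```python
# Run inside Houdini; builds a simple particles -> VDB -> mesh -> Alembic chain.
import hou

geo = hou.node('/obj').createNode('geo', 'blood_surface')

# Load the cached FLIP particles (hypothetical cache path).
particles = geo.createNode('file')
particles.parm('file').set('$HIP/cache/blood_particles.bgeo.sc')

# Rasterize the particles into a VDB surface volume.
vdb = geo.createNode('vdbfromparticles')
vdb.setInput(0, particles)

# Convert the VDB surface back to polygons.
surface = geo.createNode('convertvdb')
surface.setInput(0, vdb)

# Aggressive poly reduction keeps the per-frame meshes within budget.
reduce_sop = geo.createNode('polyreduce')
reduce_sop.setInput(0, surface)

# Export the meshed sequence to Alembic for the engine import pipeline.
rop = geo.createNode('rop_alembic')
rop.setInput(0, reduce_sop)
rop.parm('filename').set('$HIP/export/blood_splash.abc')
rop.parm('execute').pressButton()
```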

Candle Placement

The Meshes Workflow

One of the coolest aspects of our geometry cache system is that we can use that data in a variety of ways. In the case of our cinematic blood effects for Fatalities, we spawn that blood as regular static meshes. That means we can very easily manipulate the shape, position, and rotation of these meshes and compose each blood impact specifically for the camera of each shot. In addition, we can add traditional keyframe animation on top of what the simulation is doing to layer in translation, rotation, and scale and give each shot more uniqueness. Essentially, we have built a modular library of blood shapes that we can drop into a cinematic, arrange into a very cool composition for the camera, and tweak the timing to really get the feel of a brutal impact relatively quickly.

The Benefits of the Geometry Cache System

 

We have a third type of geometry cache that stores just point data and can be read by our particle systems, so we can run really complex offline particle simulations, bring that data back into the editor, and apply sprite or mesh geometry on top. We’re looking into ways to encode complex rigid body simulations into these caches so that each rigid piece has all of its vertex transforms stored on a single centroid, which makes them super cheap. Another exciting R&D opportunity is attaching different types of data and attributes to point caches to drive other systems. For example, we could do a very complex rigid body destruction and generate a point at every impact location, then encode that point with a material type or velocity vectors, which in turn drive game systems that spawn the appropriate debris emitter oriented the correct way. Now that we have access to powerful offline simulation tools and have the groundwork laid for an import pipeline, our heads are swirling with ideas on how we can expand our capabilities to areas previously not possible in games.
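A minimal sketch of that idea, with hypothetical names and attribute values, might look like this: each cached point carries the attributes a runtime system would need to spawn and orient the right debris emitter.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ImpactPoint:
    # One cached point per rigid-body impact, with attributes a game
    # system could read back to drive a debris emitter.
    position: Vec3
    velocity: Vec3        # drives emitter orientation and speed
    material_type: str    # e.g. "stone" or "wood" (hypothetical values)
    frame: int

@dataclass
class PointCache:
    frames_per_second: float
    points: List[ImpactPoint] = field(default_factory=list)

def spawn_debris(cache: PointCache, current_frame: int) -> None:
    # Hypothetical runtime hook: emit debris for every cached impact
    # that lands on the current frame, oriented along its velocity.
    for point in cache.points:
        if point.frame == current_frame:
            print(f"spawn {point.material_type} debris at {point.position} "
                  f"facing {point.velocity}")

# Usage example with a single cached impact.
cache = PointCache(60.0, [ImpactPoint((0.0, 1.0, 0.0), (0.0, 3.0, -1.0), "stone", 12)])
spawn_debris(cache, 12)
```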

Matt Battaglia, Technical Art Lead at NetherRealm Studios

Interview conducted by Kirill Tokarev
