Michał Wawruch has walked us through the Durance of Hate Project, detailing his modeling, texturing, and lighting pipelines, explained how Unreal Engine's PCG helped recreate Diablo 2's environment, and shared some tips for aspiring artists.
Introduction
Hello, my name is Michał. I spent a couple of years making real-time visualizations of real estate for high-end PCs. Currently, I mostly work in the metaverse industry, building levels for all kinds of platforms, including Android and iOS. Optimizing complex scenes for these platforms can be tricky, but it's incredibly satisfying to see something that initially seemed impossible run smoothly on a pocket device at 60 FPS. I've also released a couple of projects on the Unity Asset Store. I love the technical aspects of game engines, and pushing their boundaries is something I'm always chasing.
I've been working in Unity since 2014. Over the years, I also started learning Unreal Engine 4, mostly by just throwing stuff around, nothing serious. When Unreal Engine 5 came out, I was skeptical about Nanite and Lumen at first, mostly due to their initial limitations. Over time, Epic expanded Nanite's capabilities and fixed Lumen's artifacts, which convinced me that those initial limitations were no longer an issue. Now, I believe these technologies are the future, especially Nanite and the mesh shaders introduced in Alan Wake II.
When PCG (Procedural Content Generation) came out in the Unreal Engine 5.2 preview, I saw it as an opportunity for artists to create tools and even whole worlds with it. I decided it was time to switch things up and create something bigger in Unreal Engine.
The Durance of Hate Project
I love Diablo 2, and I'm very thankful for Diablo 2: Resurrected; I spent hundreds of hours trying to obtain almost unobtainable items. I think Diablo has its own unique dark fantasy charm. Its story is very bleak and brutal, yet, surprisingly, its environments are full of color and contrast. The artists at Vicarious Visions did an awesome job recreating that.
When picking the project, I had a couple of goals in mind:
- Developing an efficient asset creation pipeline using Nanite and Lumen;
- Testing Virtual Shadow Maps performance with a large number of shadow-casting lights;
- Overusing PCG to explore additional possibilities, such as a splatter, an asset randomizer, and an asset customizer;
- Making the project work and look like a finished product in a standalone build on a low/mid-range PC; I had a GTX 1060/RX 580-class GPU in mind as the minimum specification.
I picked level 3 of the Durance of Hate because it's one of the more complicated dungeons in terms of assets. It also has a fixed layout, so fans of the game would immediately recognize the place. It has a lot of lights, so I would be able to put a heavy load on Virtual Shadow Maps. I also wanted to make something containing plenty of gore, which makes the Durance of Hate a perfect fit.
Stage 1: Gathering References
I had a couple of references.
Game environment – Diablo 2: Resurrected's environment was my main reference when it came to the tileset and overall scale. I made a lot of zoomed-in screenshots in high resolution to catch all the details.
Main concept – When it came to the overall mood of a scene, I was following a concept made by Gray Rogers for Diablo 2: Resurrected. I loved the contrasts and overall brightness of this environment in his concept. It's a bit brighter than the final game.
Cinematic – The Durance of Hate shown here was completely different from the game, but I loved the idea of the summoned pillar of light and the animation of the hell gates, so I decided to add them.
Stage 2: Blockout
I started by analyzing and separating all the unique assets that were part of the Durance of Hate. Whenever I wasn't sure whether assets should be separate, I separated them. I trusted that Lumen's indirect occlusion was good enough that the lack of an AO texture on merged elements wouldn't be visible.
Each color represents a different instance, and each instance already has the proper scale and rotation. I wanted each shape to approximate the final prop. I made the skulls and torches about 1.5x their original size because I wanted distant camera shots. I always remembered the Durance of Hate for its golden skulls, and I was worried that they might be invisible in the distance. I also saw the large distance between them as an opportunity to make them more visible.
Then, I exported the finished blockout as a USD file to Unreal Engine to ensure that each instance was properly represented in the engine. This way, I would be able to replace temporary objects with their finished versions without having to manually place them.
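For anyone scripting this step, a minimal Blender sketch of such a USD export could look like the following; the file path and options are placeholders and may differ between Blender versions.

```python
import bpy

# Minimal sketch: export the whole blockout scene to USD so Unreal can
# import it and keep every instance as a separately replaceable object.
# The path and option names are placeholders and may vary per Blender version.
bpy.ops.wm.usd_export(
    filepath="//durance_blockout.usd",   # saved next to the .blend file
    selected_objects_only=False,         # export the entire blockout
    export_materials=False,              # temporary shapes don't need materials
)
```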
Stage 3: Modeling
I used this project to develop a pipeline of efficient asset creation with Nanite and Lumen in mind. When it comes to the modeling pipeline, I set myself a couple of rules at the beginning:
- I want to use Nanite everywhere I can. I do not want traditional LODs.
- The modeling process needs to be efficient; I don't want to be slowed down by mesh loading. I'm prepared to still use a high-poly to low-poly workflow if it speeds up the process.
- No hard faces. My last project showed me that Nanite hates hard faces, so I try to avoid them.
- Only retopology by algorithms.
- UV seams should be created manually; I aim to ensure that no visible seams are present.
For modeling, I used ZBrush and Blender. Usually, the base shape was made in Blender and then exported to ZBrush, where all the details were added. I used a lot of alphas to save time.
After making the high-poly version, I decimate the mesh and unwrap it in Blender. Even though it's Nanite, I still use a high-to-low-poly workflow with a Normal Map, not only for speed but also for better-looking results. I still find that an asset with 150k triangles and a Normal Map has better shading than one with 2 million polygons and no Normal Map, and it's a nightmare to unwrap, load, and texture a 2-million-triangle mesh in any software.
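As a rough illustration of that algorithmic retopology step, a Decimate modifier pass in Blender might look like this; the ratio here is only an example, not the project's exact setting.

```python
import bpy

# Rough illustration: collapse the high-poly sculpt to a Nanite-friendly
# mid-poly mesh. The ratio below is an example value, not a project setting.
obj = bpy.context.active_object                  # the imported high-poly mesh
mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = 0.075                                # e.g. ~2M tris down to ~150k
bpy.ops.object.modifier_apply(modifier=mod.name)
```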
The whole set just shows how efficient, in terms of design, Diablo 2 was. It's a great example to break down if you are a part of a small team.
Nanite scales exceptionally well. It has a pretty large performance cost at the beginning, but beyond that base cost the scaling is excellent, and for more complicated scenes it's an obvious choice. I was able to achieve 350 FPS at 1080p on an RTX 3070 without all the additional features turned on.
Stage 4: Texturing
The texturing process was pretty straightforward. I prepared 2 smart materials in Substance 3D Painter as a base for everything. Those were bronze and stone.
After applying one of these smart materials and tinkering with it, I added a blood overlay and used an animation brush to paint it on. The mesh is dense enough that the blood spreads naturally all over the model.
That was pretty much it. I didn't use masks, trim sheets, or vertex painting. I wanted to keep everything simple and unified.
Stage 5: PCG's Use Cases
Floor
Unfortunately, version 5.2 of Unreal Engine didn't support Nanite tessellation, so I decided to use PCG as a splatter for floor tiles. I didn't want to use seamless materials, as I was trying to explore new ways of making large ground surfaces.
Building the floor with a PCG graph made it possible to create events for damaged tiles. I can just create an Actor, assign a specific tag to it, and then set a radius within which destroyed tiles will be spawned.
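The actual setup lives in a PCG graph rather than code, but the selection logic amounts to a simple distance check, roughly like this sketch (the mesh names, positions, and radius are hypothetical):

```python
import math

def pick_tile_mesh(tile_pos, damage_actors, default_mesh, damaged_mesh):
    """Return the damaged tile variant if the tile falls inside the radius
    of any tagged 'damage' Actor, otherwise the regular tile."""
    for actor_pos, radius in damage_actors:
        if math.dist(tile_pos, actor_pos) <= radius:
            return damaged_mesh
    return default_mesh

# Hypothetical example: one tagged Actor at the origin with a 300 cm radius.
damage_actors = [((0.0, 0.0, 0.0), 300.0)]
print(pick_tile_mesh((120.0, 0.0, 0.0), damage_actors, "SM_Tile", "SM_Tile_Broken"))
# -> "SM_Tile_Broken", because the tile sits within the damage radius
```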
Randomization
I also used PCG to randomize assets, mostly in very simple ways: just offsetting or rotating objects based on their position in the world.
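Conceptually, this kind of randomization is just a deterministic value derived from the world position, so the same object in the same spot always gets the same result. A small stand-alone sketch of the idea (not the actual PCG nodes):

```python
import hashlib

def position_value(x, y, z):
    """Map a world position to a stable 0..1 value."""
    key = f"{round(x, 1)}_{round(y, 1)}_{round(z, 1)}".encode()
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "little") / 0xFFFFFFFF

def randomize(x, y, z, max_offset=10.0):
    """Derive a yaw rotation and a small positional offset from the position."""
    t = position_value(x, y, z)
    yaw = t * 360.0                        # pseudo-random rotation in degrees
    offset = (t - 0.5) * 2.0 * max_offset  # jitter in the -10..10 range
    return yaw, offset

print(randomize(1200.0, -340.0, 0.0))
```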
Splatter
Here's another very simple tool, which was mostly used for bone piles in this project.
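In spirit, a splatter like this just generates random transforms inside a radius; a minimal sketch of that idea (the values are arbitrary, and the real tool is a PCG graph):

```python
import math
import random

def scatter_in_disc(center, radius, count, seed=0):
    """Generate simple scatter transforms (position + yaw) inside a disc,
    the way a basic bone-pile splatter might distribute instances."""
    rng = random.Random(seed)
    transforms = []
    for _ in range(count):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        dist = radius * math.sqrt(rng.random())   # sqrt keeps the density even
        x = center[0] + math.cos(angle) * dist
        y = center[1] + math.sin(angle) * dist
        yaw = rng.uniform(0.0, 360.0)
        transforms.append(((x, y, center[2]), yaw))
    return transforms

print(scatter_in_disc((0.0, 0.0, 0.0), 200.0, 5))
```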
Stage 6: Gore
The original Durance of Hate was filled with shrines, enemies, treasure chests, and armor/weapon racks, which effectively occupied the space. I wanted to fill the space even more without using the interactive elements present in Diablo 2.
The entirety of Act 3 in Diablo 2 draws inspiration from Mayan civilization. The Durance of Hate is renowned for its association with human sacrifices and blood baths, making gore the obvious choice to occupy the empty areas.
These assets were pretty much the most important part of building the horror of the Durance of Hate. As a base, I used textures available on Quixel Bridge.
I used the AO texture on an unwrapped plane as a base in ZBrush. Then, I sculpted and inflated details to make everything more "bubbly". The results were surprisingly good.
I saved this high-poly model and then retopologized it to a mid-poly version, which would become a Nanite mesh in Unreal Engine. The textures were baked in Substance 3D Painter and multiplied by the original albedo, which I just made a bit more vibrant. As a final step, I overlaid more blood all over the model. The effect was very satisfying.
Then, I scattered these bloody masses all over the environment. I was still looking at the original concepts and made sure that clean places were going to stay clean.
There are exactly 188 of these abominations scattered all over the environment.
Stage 7: Lighting
The lighting in Diablo 2 is well thought out. The light around the hero creates horror, a bit like the fog in Silent Hill. On the screen, we often see a space that is completely dark, and to see what's there, we have to enter and discover it ourselves, allowing opponents to ambush and surprise us. This builds tension because we know we cannot feel safe outside the city. The only safe space in the wild is about 5 meters around our protagonist.
When I started the project, I knew I had a problem: I didn't want to show my work without properly lit areas, and at the same time, I wanted to convey the atmosphere of Diablo. I decided on a compromise and lit the scene only with torches and the tornado in the middle, so first I turned off the Skylight and Directional Light completely, which removed the ambient light. Then I added torches with shadows turned on.
The first challenge was light complexity. Torches placed very close to each other could cause performance problems due to shadow casting, so the direct light radius was reduced to 5 meters.
After placing all the torches, the problem was immediately visible: there were a lot of completely dark spaces. That's more in line with the game, but the game has a hero light above the player's head, which always lights up the environment around the player. For this presentation, I had no access to the hero light because there's no hero, but there is Lumen.
So the decision was to increase the indirect lighting intensity of each torch, which is then multiplied by Lumen's indirect intensity. It worked pretty well. I also increased the volumetric scattering of these lights; I believe it adds depth to the environment.
Then it was all about adding a point light inside the tornado in the middle.
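For reference, here's a rough Unreal editor-scripting sketch of the torch adjustments described above; the "Torch" tag and the intensity values are placeholders, not the project's actual numbers.

```python
import unreal

# Rough sketch (assumed tag and example values): clamp the torches' direct
# radius to 5 m and boost their indirect and volumetric contribution.
actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
for actor in actor_subsystem.get_all_level_actors():
    if not actor.actor_has_tag("Torch"):                     # hypothetical tag
        continue
    for light in actor.get_components_by_class(unreal.PointLightComponent):
        light.set_editor_property("attenuation_radius", 500.0)          # 5 m
        light.set_editor_property("indirect_lighting_intensity", 4.0)   # example boost
        light.set_editor_property("volumetric_scattering_intensity", 2.0)
```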
I believe Lumen only supports objects rendered in the deferred pass, so keep that in mind when making transparent/translucent objects if you expect to receive indirect light and reflections from them. A simple workaround is to make a masked copy of the original object that is only visible to Lumen using RayTracingQualitySwitchReplace.
That was pretty much it. Lumen makes lighting easy. There are no UV2 channels, no lightmap groups, and no rebaking. It just works.
When it comes to Virtual Shadow Maps, they don't scale well with that number of lights. So, using tricks like light cookies is still a viable option if we want the best performance.
Conclusion
I had a lot of fun making this project. I'm a big Diablo 2 fan, and Diablo 2: Resurrected literally resurrected my love for this game, so recreating such an iconic place as the Durance of Hate was a blast. I consider the project a success, as I managed to achieve all the goals I set for myself. I also love the reaction of Diablo 2 fans; it's heartwarming to see so many people sharing their memories of this game.
A demo of the entire project is available here; it's the final build of the environment using a customized top-down framework.
I believe that if you want to test something out, you should try not to isolate the feature but build on it. Create something with the tech, because that's usually when the real problems show up, and solving them gives you the experience you'll need when it's time to create something big. You can eliminate problems in pre-production and possibly save the team thousands of work hours. Think outside the box and test things you wouldn't do in a production environment. That's how you discover new workflows and tools. I really recommend watching the GDC talk by Tor Frick titled Building the World of The Ascent. It's a great example of thinking outside the box.
Michał Wawruch, 3D Environment Artist
Interview conducted by Gloria Levine