Creating a Warhammer 40K-Inspired Scene with Blender & UE5's Substrate

Ben Close talked to us about the 40K Inquisition Hallway project, discussing hard-surface modeling in Blender, modular asset workflows, and material layering in Unreal Engine 5, including gradient-mapped metal and advanced decal setups for a polished final result.

Introduction

My name is Ben Close, and I originally got into 3D art through modding. I started by creating mods for Elder Scrolls games and custom maps for a Battlefield 2 tournament league. Although games were always a passion, I initially studied programming and general engineering. After being homeschooled for much of my early education and starting an Engineering associate degree, I transferred to Full Sail University for their Game Programming Bachelor's program. About a third of the way through, they launched a Game Art degree, and I made the switch. The technical skills I gained from programming have been invaluable, especially as tools and workflows continue to evolve.

After graduating, I landed my first role as an Environment Artist at 38 Studios, working on an ambitious MMO. Although the company went under due to political issues, it was an incredible learning experience and helped me build a strong professional network. I then joined Turn 10 Studios at Microsoft, where I worked on Forza Motorsport 5 and 6. I managed most of the track and environment shaders and created assets for several tracks, including our E3 showcases. On Forza Motorsport 6, I was heavily involved with developing the wetness system, which was an awesome learning experience in breaking down a complex visual target and figuring out how to actually plan and build the shaders and systems needed to hit it.

After that, I spent some time at Monolith working on Middle-earth: Shadow of War as the primary texture artist. I wasn't there that long, but I basically lived inside Substance Designer for almost a year, and learned a ton!

At this point, I was looking for something with a bit more work-life balance and I ended up joining Certain Affinity. Amazing studio, great people, and a lot of fun projects. Two standout highlights for me were Call of Duty: Modern Warfare Remastered and Halo Infinite. I led the centralized material team, developing tools, shaders, and materials across multiple projects, and also served as an Associate Art Director on Halo Infinite, helping to align teams with the project's artistic vision while collaborating with 343 Industries.

While at Certain Affinity, one of the things I'm most proud of starting was our monthly Material Challenge. They ranged from small things like building a pixel processor for a geometric pattern or trying out a new piece of software, to creating especially challenging full materials. Several pieces in my portfolio actually began as part of those challenges.

Most recently, I was the Lead Material Artist at 5by5 Interactive, working on an early alpha Unreal Engine 5 project. While there, I was responsible for creating textures and assets along with nearly all our materials/shaders and lighting/post-process work. I also worked on a number of more technical systems, including TOD, Wind, and PCG.

At this point, I have been in the industry for nearly fifteen years and have built a strong background in environment art, materials and shaders, technical art, and team leadership. Also, some shameless self-promotion: I am looking for a new opportunity, so please feel free to reach out!

Inspiration & References

The very beginning of this project traces back to around 2022, though at that time, it was more of a loose idea than a focused project. Originally, it was just a 40K-themed setting I used as an excuse to improve my hard surface modeling in Blender. I had used Maya and 3ds Max professionally at past jobs but still felt new to Blender. It wasn't until late 2024, though, that I'd say I really started the project and began to narrow the idea down into an actual scene.

The inspiration for the scene came from a beautiful piece of work from Syama Pedersen's Astartes project. There is a brief shot of Space Marines walking down a hallway, and those visuals stuck with me. The camera for the shot was high up over the Astartes' shoulders and sold their imposing scale. The other part that stuck was the prayer writs all over the walls. In 40K lore, these can be used to try and protect a person, item, or area from the corruption of Chaos. It's something that can ground the world, and the 40K universe is full of this type of worldbuilding!

On top of that, in 2024, I had been completely pulled into the Horus Heresy series. I was about 20 books deep when I started seriously working on this project, and between those books, the Eisenhorn trilogy, and the start of the Ravenor series, I had no shortage of inspiration. If you are not already familiar with the Warhammer universe, that might all sound like gibberish, but needless to say, the worldbuilding and atmosphere had a big impact on the direction of the scene.

I have always been drawn to grounded and gritty science fiction and fantasy worlds, which is a big part of what appealed to me about the Warhammer setting. It is similar in tone to what I enjoyed working on with Halo Infinite: worlds that feel tangible, high stakes, and filled with weight and consequence. Those kinds of settings are incredibly inspiring creatively because they pull you in and make the act of building believable spaces even more satisfying.

For gathering and organizing references, I used Miro instead of PureRef. I had started using Miro while working at Certain Affinity and got hooked on its flexibility, especially for collaborative work, though even for solo projects I find it hard to go back. I began by pulling from my personal library of saved art and images, then expanded into more specific 40K iconography and design references from recent games.

In terms of stylistic direction, I wanted to modernize the visual language slightly. While I can enjoy older 40K art, some of it can feel overly blocky or simplistic by today's standards. I looked at the hard surface design work from Wolfenstein: The New Order for inspiration and also referenced newer 40K titles like Space Marine 2 and Darktide, which showcase more contemporary, high fidelity interpretations of the universe.

Altogether, the goal was to stay true to the core identity of Warhammer while pushing the design a little toward a more detailed and grounded feel, without losing the larger-than-life sense of form that makes the universe so iconic.

Composition & Modeling

The main composition for the scene was inspired by a few seconds of footage from the Astartes video. Beyond that, I approached the project the same way I would build an actual game environment. I did not lock down specific camera shots early on because I wanted the scene to have the flexibility to be explored from multiple angles, similar to how a player would move through a real space. My primary reference shot is shown at the top of my reference sheet.

When I started building the scene, I looked through some of my older 40K modeling practice meshes made in Blender. I used a few elements to help block out the proportions and figure out how the modular pieces would fit together. Most of the hallway assets were designed to snap cleanly into 3-meter sections. The floor was a little different because of the hexagonal pattern, which did not easily match the 3-meter grid, so I built the floor out of two meshes that shared most of the same UVs, with differences only in areas like the embossed Aquila (the eagle) and Latin text.

My modeling approach usually starts by focusing on function before form. I enjoy figuring out how something could actually be built or used, which makes adding believable details much easier. That mindset probably comes from my engineering background, and I think it helps environments feel more grounded and authentic. I am also a firm believer that creativity benefits from having some constraints, and the rules of physics and functional logic can be great guides. Mike Hill has some excellent talks on designing with function in mind if you are interested in exploring this kind of workflow.

Of course, within the Warhammer 40K universe, form often takes priority, even though there is a functional logic behind the dogmatic design language. Spaceships covered in cathedral windows and spires definitely lean towards form as a priority. With that in mind, I made sure to establish the large form elements early during the blockout phase. After that, I could shift focus toward how the assets could be built more functionally. It was a fun challenge for this project, even in less obvious things like the railing connection, which looks like the top of the Inquisition "I" symbol.

For the modeling workflow itself, I relied heavily on booleans. I primarily used HardOps for most of the hard surface work, with some Boxcutter mixed in. I started with large-scale boolean passes to establish the main forms, and once those were in a good spot, I did a second boolean pass to add smaller secondary details, like recessed panels or stamped metal shapes. After that, I refined the models with bevels and moved into UVing. There is an early shot of a wall piece below where you can see the initial boolean work before cleanup, but even at that point, the major visual read was already established.

I also used some other techniques, like a cloth sim for the pinned ceiling fabric. I think back to having to do something like this in ZBrush a long time ago, and I love how quick and easy some of these newer tools are!

For some assets, like the incense burners and the wax seals, I switched to ZBrush. While I could have sculpted them in Blender, I knew I could create them much faster in ZBrush with familiar workflows. It was a practical choice, especially since by that point in the project, I was focused on wrapping things up efficiently.

Retopology & Unwrapping

I did not focus too heavily on retopology except in cases where I knew it could cause performance issues, such as areas with excessive quad overdraw. The main example was optimizing the bolts. They initially had some janky and dense meshes, but I went through and did some retopology work, partly for practice but also because, without it, the UVs on those elements would have turned into a million small UV islands. Cleaning them up made the unwrapping process much more manageable.

For most of the project, I used Nanite for the static meshes, and since the majority of the geometry was not overly dense, I did not need extensive retopology elsewhere. However, one challenge that came up with using a Boolean-heavy modeling workflow was dealing with internal faces. If you are not careful, boolean operations can leave behind a lot of unnecessary internal geometry that does not contribute to the final silhouette and wastes resources.

I stuck with Blender for my unwrapping, along with its Zen UV addon. Most assets were set up with two UV channels. The UV0 channel acted similarly to a lightmap UV, where I focused on maximizing UV space to support my Asset Blend Masks. The UV1 channel was used for tileable material layers and was standardized to maintain a consistent texel density of 2K pixels per 1 meter.
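As a hedged sketch of the texel density math behind that 2K-per-meter standard (the function name and parameters are my own, purely illustrative):

```python
def texel_density(texture_px: int, uv_tiles: float, mesh_size_m: float) -> float:
    """Texels per meter of world space: how many texture pixels cover
    one meter of geometry, given how many times the tile repeats in UV1."""
    return texture_px * uv_tiles / mesh_size_m

# A 3 m wall section that tiles a 2048px texture three times across
# its UV1 channel hits the 2048 px/m target described in the article.
print(texel_density(texture_px=2048, uv_tiles=3.0, mesh_size_m=3.0))  # 2048.0
```

Keeping this ratio constant across every mesh is what makes the tileable layers read at a consistent resolution throughout the hallway.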

Below is a shot of the UV0 layout for one of my wall meshes. In some cases, I split the mesh into a primary and a secondary detail mesh. This helped keep UV layouts clean and manageable, and in a real-time game environment, separating tertiary details could also be useful for performance, allowing the engine to cull or swap those elements at distance if needed.

Texturing

For the materials, I wanted to experiment with Substrate's horizontal and vertical blending and also improve on some of my past material layering workflows. One of the biggest pain points I have had with UE5's material layering system is the time-consuming and repetitive process of setting up material instances and assigning blend masks for every layer on each asset. It is also an easy place for mistakes, like accidentally assigning the wrong mask, which can become extremely tedious to debug. Even though my project was relatively small, it can still be slow and disruptive to the overall art flow.

To solve that, before I started texturing, I built a Blueprint Editor Utility (shown below) to automate part of this setup. The tool automatically created material instances for a selected asset, created and named folders, and assigned the instances along with their associated parents and asset blend masks, making the whole process much quicker and more efficient. While it only saved me about an hour or two on this project, I know this could save a huge amount of time in a full studio pipeline. Although there were some limitations due to how Blueprint interacts with material layer parameters, the tool still removed some of the most tedious steps. Finding ways to streamline boring or repetitive parts of the art process has always been a very rewarding part of bridging the tech and art sides of production.
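For illustration, the bookkeeping side of a tool like this can be sketched outside the engine. This is not the author's Blueprint logic or the Unreal Python API, just a plain-Python sketch of the folder and naming conventions such a utility automates; the `MI_` prefix and path layout are hypothetical:

```python
from pathlib import PurePosixPath

def plan_material_instances(asset_path: str, layer_names: list[str]) -> list[str]:
    """For a selected asset, generate the folder and material-instance
    names an editor tool would create (naming scheme is illustrative)."""
    asset = PurePosixPath(asset_path)
    folder = asset.parent / "Materials" / asset.stem
    return [str(folder / f"MI_{asset.stem}_{layer}") for layer in layer_names]

paths = plan_material_instances("/Game/Env/SM_WallPanel", ["Base", "Decal"])
print(paths)
# ['/Game/Env/Materials/SM_WallPanel/MI_SM_WallPanel_Base',
#  '/Game/Env/Materials/SM_WallPanel/MI_SM_WallPanel_Decal']
```

Centralizing the scheme in one function is exactly what makes mistakes like a wrongly assigned mask easier to avoid: every instance name and folder comes from the same rule.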

The first place I started for the material workflow was making a list of technical and visual features that I wanted to support. This was one of the first things I put on my Miro board, and it let me better plan the material layers that I would need to create as well.

I started by developing a set of standard functions for base layer parameters and consistent texture packing. For full color layers, I used BaseColor, Normal, and a packed ARMH texture (Ambient Occlusion, Roughness, Metallic, and Height mapped to RGBA channels, respectively). I then started creating my base layers inside Substance 3D Designer. I ended up with about 11 base layers that I later used along with variants for my parent layered materials.
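The ARMH packing described above is a fixed channel mapping; in practice it happens at texture export in Designer, but as a minimal sketch of the convention (pure Python on flat pixel lists, no image I/O):

```python
def pack_armh(ao, roughness, metallic, height):
    """Pack AO, Roughness, Metallic, and Height (each a flat list of
    0-1 floats, one entry per pixel) into per-pixel RGBA tuples."""
    return [tuple(px) for px in zip(ao, roughness, metallic, height)]

# One pixel: full AO in R, mid roughness in G, non-metal in B, height in A.
print(pack_armh([1.0], [0.5], [0.0], [0.25]))  # [(1.0, 0.5, 0.0, 0.25)]
```

For real textures, the same mapping would be applied with an image library, but the point is the agreed channel order that every layer function can rely on.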

You can also see my decal texture setup inside Substance 3D Designer below. For my scene, I used a mix of mesh decals and projected DBuffer decals. Most of the decals were floated meshes, but I also set up material parameters to isolate parts of the texture for projection. I used the projected decals mainly on rounded elements where creating a custom mesh wasn't ideal.

Later in production, I also built a gradient-mapped material setup, primarily for my base metal layers. These layers used normal maps along with a packed GRAH texture (containing Gradient, Roughness, AO, and Height). The goal was to minimize texture memory usage and refresh my experience with building and using gradient-mapped color materials. The setup supported three gradient colors by default, but I also built an option to use a color lookup texture if needed. For my purposes, though, that level of complexity wasn't necessary, so I stuck with the simpler three-color approach.
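Conceptually, a gradient-mapped material stores a single grayscale gradient and remaps it to color at shading time. A rough CPU-side sketch of the three-color lookup (the ramp values and function name are mine, not the actual shader's):

```python
def gradient_map(g, colors):
    """Remap grayscale value g in [0, 1] through a piecewise-linear
    ramp of RGB colors (three stops, matching the default setup)."""
    segments = len(colors) - 1
    pos = min(max(g, 0.0), 1.0) * segments
    i = min(int(pos), segments - 1)
    t = pos - i
    a, b = colors[i], colors[i + 1]
    return tuple(ca + (cb - ca) * t for ca, cb in zip(a, b))

# Illustrative dark-metal ramp: shadow, midtone, and highlight colors.
ramp = [(0.05, 0.05, 0.06), (0.35, 0.33, 0.30), (0.85, 0.80, 0.75)]
print(gradient_map(0.0, ramp))  # darkest stop
print(gradient_map(1.0, ramp))  # brightest stop
```

A color lookup texture generalizes this to arbitrarily many stops, which is the optional path mentioned above; the three-color version keeps the parameter count small.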

There were also a few unique material setups, such as the prayer writs, wax drips, and adhesive tape residue that needed special treatments like subsurface scattering. I also made glass and tube light materials. For the glass, I used Substrate's horizontal blend to accurately simulate both light scattering and refraction while layering a rough, reflective dirt surface on top.

After finishing the master material and some of those initial functions for the layers, I started on the Material Layer Blend. This is the critical piece of a good material layering workflow and something I wanted to push myself on for this project. I have a shot of it below, and that is even with a couple of functions used to keep the graph small. Although the full complexity was more than necessary for a smaller scene like this, it gave me the opportunity to refine my skills and make the workflow more scalable for future work.

Once the master material and layer systems were working, I created a smart material inside Substance 3D Painter to speed up and standardize the asset-specific blend masks. I experimented with baking curvature and AO into the mesh vertex colors, but since my meshes were relatively low-poly due to mostly being hard surface, those bakes did not provide much additional benefit unless I subdivided more heavily. In the end, I stuck with the Asset Mask Blend workflow and used my Painter smart material instead.

The smart material packed various masks into my User0, User1, and User2 channels, while the RGB colors were only for easy previewing. This approach got me to about 80 percent of the final blending quality quickly, and from there, I could go in manually to tweak and refine specific areas as needed inside Unreal.

Once the workflow was finalized and tested on a sample mesh, running the full set of assets through the pipeline was fast. It only took a little over a week to texture all the meshes and tune their material blends to a good level of finish.

The video below walks through the floor material specifically, showing a detailed breakdown of the layered material setup and key parameters I used to fine-tune the surface.

From then on, the rest of the texturing work mainly focused on refining layer blends and making small adjustments to the base layer textures, such as tweaking roughness or height information where needed.

Assembling the Final Scene

The scene assembly came together quickly, and I didn't change much with the overall layout or main composition. I started by importing my very rough blockout meshes into Unreal, which helped me plan out my modular pieces before refining the meshes further.

The biggest adjustment from blockout to final was increasing the hallway spacing. Early on, I had a column placed every 2.5 meters, but with the longer focal length I used for my primary camera shot, the hallway felt too compressed. I adjusted the spacing to 3 meters, and, while it was a relatively small change, I think it helped open up the scene and improve the visual read.

Most of the detailing work came through materials, decals, and the prayer writs and seals. Since much of the environment was dominated by clean, hard-surface elements, I knew I needed a strong contrasting element to break up the shape language and keep things visually interesting.

I originally had bigger plans for additional details, including a room beyond the doorway. I ended up scoping it down to shorten the project, but the blocked-out space is still there if I ever want to expand it later. The idea was to have a Warp/Xenos entity taking over that room, with chaotic tendrils creeping into the hallway. I was initially thinking of using PCG or some form of Blueprint splines to have the tendrils spread onto surfaces dynamically, which would have been a fun technical challenge! For now, though, that concept will stay in blockout form, but it's something I'd love to revisit down the line.

Lighting, Rendering & Post-Production

For lighting, my general approach is to stick as closely as possible to realistic lighting values and sources throughout most of the project. Toward the end, I sometimes make subtle adjustments, such as tweaking light intensities or volumetric effects, but I try to keep any changes minimal and save them for final polish.

Early in the process, I bring in lighting guides so I can balance my darkest, midtone, and brightest albedo values. This way, I don't have to artificially skew exposure settings to compensate for inaccurate material values, and I get an early read on texture values and which surfaces might be worth changing. Overall, I try to focus on maintaining solid PBR and PBL standards to ensure consistency and predictability, especially when it comes to how the camera lens and post-process settings behave.
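As a hedged sketch of the kind of albedo check those guides support, using the commonly cited linear-albedo rule of thumb (roughly 0.02 for the darkest charcoal up to about 0.9 for fresh snow; the exact bounds vary by source, and the function is my own illustration):

```python
def albedo_in_pbr_range(rgb, lo=0.02, hi=0.9):
    """Check a linear-space albedo against rule-of-thumb PBR bounds,
    using Rec. 709 luminance weights to reduce RGB to one value."""
    lum = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    return lo <= lum <= hi

print(albedo_in_pbr_range((0.5, 0.5, 0.5)))  # True: plausible midtone
print(albedo_in_pbr_range((0.0, 0.0, 0.0)))  # False: pure black is non-physical
```

Materials that pass a check like this respond predictably to exposure, which is what lets the lighting stay at realistic values instead of compensating for bad textures.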

For the actual setup, I used large rectangle lights for the main high wall floodlights. I was aiming for a look somewhere between incandescent and mercury vapor bulbs, creating a slightly greenish-blue tint. Later in the project, I added small point lights near the floodlights to introduce softer, scattered illumination. I also placed additional point lights to create a faint burning glow inside the incense burners, along with a red glow for the warning lights on the door.

Once I finalized my render cameras, I placed a few additional point lights specifically for those shots, keeping them subtle and complementary to the scene's overall lighting setup. I used cinematic cameras for all my renders, and most of my post-process work focused on fine-tuning the lens settings to match the mood I wanted.

My Post Process Volume settings were kept simple overall. I made minor adjustments to lock my exposure and added a slight color temperature tint through grading, but most of the scene's color and atmosphere came directly from the materials and light colors rather than relying too heavily on post-process effects.

Conclusion

It is a little tricky to track the exact amount of time since parts of this project go all the way back to when I was first learning Blender. A lot of those early pieces were completely remade, but it still feels like part of the same project timeline. From the blockout phase onward, the main production took about a month and a half. Most of the materials, UV work, lighting, and camera setup were completed in around three weeks, and quite a few of those days were spent adjusting the placement and detailing of the prayer writs and wax seals.

Weirdly, one of the most time-consuming aspects of this project was placing the prayer writs, wax seals, and adhesive tape. Part of the difficulty came from how I chose to place things in Blender. I tried to be efficient by creating some early clumps of assets to reuse and refine, but at that point, I had not yet created the wax seal meshes or adhesive tape elements, so I had to go back and revisit my placement as I added them. You can see some of the premade clumps in the image below, though without the wax seals and tape.

Once I settled on a placement I was happy with, I did automate some of the masking work through bakes, but I also spent additional time hand-painting wax drips and refining those masks to better visually connect the meshes to the environment. Compared to other, less iterative parts of the project, this ended up taking two to three full days. It was a good lesson in how quickly you can get to 80 percent quality, but how much extra effort it takes to push the last 20 percent and make a scene feel polished.

If I were to revisit this process, I would definitely look into using some level of procedural placement, either using Geometry Nodes or PCG. It would have saved a lot of time, allowed for quicker iterations, and made it much easier to create more variations across the scene.

Ben Close, Lead Material Artist

Interview conducted by Gloria Levine
