Cino Lai shared a breakdown of the Substance 3D texture design pipeline behind his Cell project, discussing how he translated a 2D concept into 3D and created Cell's cicada-like features, texturing the segmented carapace, skin, and eyes with custom Substance 3D materials.
Introduction
Hello, everyone! I'm Cino, an Artist at Adobe Substance 3D, and I'm passionate about everything related to material design. I also regularly create content around materials and material workflows. I'm very happy and honored to be back at 80 Level. Today, I will share the ideas and methods behind the texturing workflow for Cell.
It has been a long time since my last interview with 80 Level. During this period, I have been fortunate to meet many excellent designers from various industries. I've picked up knowledge from those industries and created work that combines it with the existing and continuously updated features of Substance 3D. That is why you will see a wide variety of subjects on my ArtStation: clothing, shoes, cars, cups, robots, and so on.
Choosing Substance 3D
As the Substance 3D ecosystem has matured, my creative practice has shifted from simply designing PBR textures to building customized texturing workflows. This is what I appreciate most about Substance 3D: tailoring a workflow to a project's specific requirements helps teams work more efficiently. To a large extent, it eliminates repetitive, cumbersome work, makes the design process more relaxed and interesting, and leaves designers more room for creativity.
For me, building customized workflows relies on several features of the Substance 3D ecosystem:
- The procedural digital asset library
- Non-destructiveness
- Powerful customization capabilities
By leveraging these features and combining them with the requirements of a given industry or the goals of a project, you can build a texturing workflow tailored to that context.
As you can see, although my work now covers a wide range of content, essentially all of it is developed around the same workflow ideas.
Inspiration & References
I have loved Dragon Ball since I was a child, and I'm a loyal fan. In past projects, whenever I had the chance, I would test new features or workflows by using elements from Dragon Ball. For example, here you can see parametric content creation and modeling in Modeler:
Some time ago, while discussing the Substance 3D workflow with a friend, we talked about design applications for organic subjects. Coincidentally, I lacked examples of this kind, so I wanted to create a case study demonstrating how much a Substance 3D workflow can improve efficiency for this type of content. Cell is one of my favorite Dragon Ball characters, and his characteristics suit the workflow I wanted to showcase. So, let's get started!
Modeling
I created the 3D model of Cell in Blender. After polygon modeling, I applied subdivision and exported a mid-poly model.
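As a side note, this step can also be scripted in Blender. Below is a minimal Python sketch, with placeholder subdivision levels and export path (not the values from my actual scene), that adds a Subdivision Surface modifier to the selected meshes and exports a mid-poly FBX:

```python
# Minimal Blender Python sketch: subdivide the selected meshes and
# export a mid-poly FBX. Levels and path are placeholder values.
import bpy

EXPORT_PATH = "//cell_midpoly.fbx"  # hypothetical output path

for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    mod = obj.modifiers.new(name="Subdivision", type='SUBSURF')
    mod.levels = 2          # viewport subdivision level
    mod.render_levels = 2   # keep the exported level consistent

# Apply modifiers on export so the FBX contains the subdivided geometry.
bpy.ops.export_scene.fbx(
    filepath=bpy.path.abspath(EXPORT_PATH),
    use_selection=True,
    use_mesh_modifiers=True,
)
```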
This kind of workflow saves a lot of the modeling-stage time that would otherwise go into creating high-poly models and making modifications and adjustments, which makes modeling easier and faster. However, it is also the most challenging part of the entire workflow: we need a sound plan for the content right from the start, determining the specific details and the division of labor between models, materials, and tools. For example, we need to decide which structures will be modeled manually and which will be generated by materials or tools. The split between these approaches varies with factors such as the content being created, the required asset precision, and the target application.
Such a workflow is really continuous project planning. The toolsets and material libraries you build can later be applied to other assets of a similar type, which is the main reason I can complete my work faster and faster. Over the years, I have gradually accumulated quite a number of customized materials and toolsets, and whenever I deal with content of the same nature, the tools in my asset library spare me a lot of repetitive work. In this mecha case, for instance, once the model standards were set and the corresponding asset library was in place, I could swap detail content or even generate entirely different styles very efficiently.
Let's get back to the modeling stage of Cell. Since the modeling is relatively low-poly, I built the character as individual components, one by one. The models of different colors are all independent, which also makes it easier to create UVs later. When creating the UVs, I split the head into four UV areas with close-up shots in mind: the face, the top of the head, the eyes, and the neck. The other parts were divided roughly by body part. The one special case is that I grouped all the joints and the connecting parts of the torso together: this is the area where I planned to generate the skin material procedurally later, and keeping those parts together makes generation and adjustment more efficient.
Retopology & Unwrapping
I used Blender's automatic UV unwrapping and ended up with 13 UV sets. I then imported the model into Painter to start texturing, with each UV set at a resolution of 4096.
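If you want to batch the unwrap, it can be driven from Blender's Python console as well; in this minimal sketch, the angle limit and island margin are placeholder values rather than the settings I used:

```python
# Minimal sketch: Smart UV Project on the selected meshes. Assumes a
# mesh object is active; with Blender's multi-object editing, one call
# unwraps every selected mesh at once.
import bpy
import math

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project(
    angle_limit=math.radians(66.0),  # placeholder angle threshold
    island_margin=0.02,              # placeholder spacing between islands
)
bpy.ops.object.mode_set(mode='OBJECT')
```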
In this process, the mid-poly model ensures the richness of the character's silhouette, while the generation of high-poly details is deferred to the texturing stage. This procedural approach not only generates rich content flexibly and quickly but also directly produces the normal details that would previously have been baked from a high-poly model, saving the time spent on retopology and baking.
I think this approach is even more valuable for scene assets. Most of them are static objects, so we can freely swap in all kinds of rich structures, and combined with Nanite in Unreal Engine 5, a lot of time can be saved. In many cases, I skip retopology entirely.
Texturing
The texturing work for Cell also needed planning up front. I usually divide it into two major groups. The first is Smart Materials, composed of basic materials, layers, and tools. When I switch between different models or UV sets, I can use this unified toolset directly and quickly generate content to a consistent standard.
The second group covers everything that can be accomplished simply by parameterizing a single custom material. Creating such custom materials, however, usually takes more time.
Let's conduct an in-depth breakdown, starting with the dark and light green carapace parts.
This part is handled by Smart Materials, which include the following layer groups and the effects each one generates. During texturing, I enabled displacement so I could see the final texture and high-poly details in real time, which made adjustments easier.
Next, let's take a closer look at the composition and function of each layer group. First is the Base Texture group, which contains materials downloaded from the asset library and a Fill Layer whose colors are controlled through Smart Masks. The texture content in this group is generated directly from the model, with no additional manual work.
Then there is the Structure group, which is responsible for generating the detailed structures of the high-poly model. I created a parameterized alpha in Designer and applied it with the Path tool in Painter, then added some randomized detail using Painter's filters.
I split the details across three Path layers: large, medium, and small, each with different parameters, layering the structures from large to small. I kept displacement enabled throughout. When this layer's content is applied to a new model, the path points need to be adjusted manually to match the new model's structure.
That leaves only the Detail layer and the Speckle layer. The Detail layer is likewise built by combining a brush alpha with the Path tool.
The speckle material I created in Designer uses a very simple node group. It mainly relies on the randomness of the Tile Sampler node and its Scale Map Multiplier to adjust the size of individual shapes locally.
I created a test graphic in Designer to check the effect of the random sizes. In the final version, it is replaced with an Input node so that a hand-painted mask can be brought in from Painter for custom control.
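To make the principle concrete outside of Designer, here is a rough NumPy analogue of that setup (an illustration of the idea, not the actual node graph): dots are scattered on a jittered grid, and a grayscale scale map modulates each dot's radius, the way the scale-map input modulates pattern size in the Tile Sampler:

```python
# NumPy analogue of the Tile Sampler speckle: jittered grid scatter,
# with a grayscale "scale map" driving the local dot size.
import numpy as np

def speckle(size=512, cells=16, seed=0, scale_map=None):
    rng = np.random.default_rng(seed)
    if scale_map is None:
        scale_map = np.ones((size, size))   # stand-in for a hand-painted mask
    img = np.zeros((size, size))
    yy, xx = np.mgrid[0:size, 0:size]
    step = size // cells
    for gy in range(cells):
        for gx in range(cells):
            # jitter the dot's center inside its grid cell
            cx = (gx + rng.uniform(0.2, 0.8)) * step
            cy = (gy + rng.uniform(0.2, 0.8)) * step
            # the local scale-map value modulates the dot's radius
            s = scale_map[int(cy), int(cx)]
            r = 0.35 * step * s * rng.uniform(0.5, 1.0)
            dot = ((xx - cx) ** 2 + (yy - cy) ** 2) <= r ** 2
            img = np.maximum(img, dot.astype(float))
    return img

mask = speckle()  # pass a painted grayscale array as scale_map instead
```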
I only output a parameterized, mask-like result rather than a complete speckle material with rich detail. Done this way, the various assets in the material library can be linked through masks in Painter, which provides rich, extensible effects while also improving efficiency.
Those are the functions of the layers in this Smart Material. When new components need content, I just adjust the corresponding colors or the paths in each layer.
Next is the skin, which is also the most efficient step in the Cell project. The skin material I built in Designer only requires parameter adjustments to generate the final texture and model details: it automatically generates texture content along the corresponding paths according to the model's topology.
This approach pays off across an entire project: as long as the project contains assets with this kind of organic texture, you can generate content for them in batches.
For heavily customized materials like this, I usually bake the model first and then build the customization on top of the baked texture information. Only this way can we ensure that models built to the same standard can be processed in batches.
The model's UV resolution also affects the generated content; for example, the sizes and intensities of node parameters all depend on it. For this kind of generative material built around a project's model assets, relatively strict standards must therefore be established for the models and UVs up front.
Following the plan, I extracted the model's UV mask, then imported the mask texture and the torso model into Designer to create the textures. I built the materials around the UV layout and imported the model so I could conveniently check the results.
The eye material was also created in Designer. I used the Tile Sampler to scatter small water-droplet shapes, then used the pupil's shape through the Scale Map Input and Vector Map Input to control, respectively, the size gradient from the center of the pupil outward and the orientation of the droplets.
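As an illustration of what those two inputs receive, here is a small NumPy sketch (an analogue of the idea, not the actual Designer graph) that builds a radial scale map growing away from a hypothetical pupil center and a direction map pointing outward from it; the 0.5-centered encoding at the end reflects how vector-map textures are typically stored:

```python
# NumPy sketch of the two driver maps for the droplet scatter.
import numpy as np

size = 512
cy = cx = size / 2.0                    # hypothetical pupil center
yy, xx = np.mgrid[0:size, 0:size]
dx, dy = xx - cx, yy - cy
dist = np.sqrt(dx ** 2 + dy ** 2)

# Scale map: droplet size grows from the pupil center to the periphery.
scale_map = dist / dist.max()

# Vector map: per-pixel outward direction, remapped to [0, 1] with 0.5
# as the neutral (zero-vector) value.
angle = np.arctan2(dy, dx)
vector_map = np.stack([np.cos(angle), np.sin(angle)], axis=-1) * 0.5 + 0.5
```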
Let me share a node that I particularly like and often use to add detail: Multi Directional Warp. It is very simple and fast, yet it generates rich detail. It has many switchable modes and brings a different surprise every time.
Those are the main creation processes for Cell. The black carapace, orange skin parts, and green speckle Smart Materials follow the same principles; they contain less content and are simpler, differing mainly in the base textures selected from the asset library. Once the tools are mostly in place, texturing the remaining parts is relatively easy.
Rendering Verification in Substance 3D Stager
I like using Stager to verify the results, because Painter can send the models and textures to Stager with one click at any time. Once the models are in Stager, ray tracing can be enabled with another click, so the final result is visible in the shortest possible time.
If your models use displacement, the displacement effect is enabled automatically in Stager as well, and you can manually adjust the number of subdivisions. In ray-traced rendering mode, the displacement also produces realistic shadows and reflections, which is the closest preview to what the asset will look like once imported into the engine.
Lighting, Rendering & Post-Production
I imported the model and textures into the Electric Dreams Environment in Unreal Engine 5. I didn't change the scene at all, because it was already perfect; the only thing I did was pick a good spot for Cell. For more detail, I added displacement to each component in UE5.
I created a camera and adjusted all the effects within it, such as exposure, depth of field, and vignetting. Finally, I adjusted the lighting angle to match the shot.
Afterwards, I felt that something was still missing, so I opened Designer again to create lightning. I used the Spline node so I could freely adjust the lightning's shape, and the Multi Directional Warp node to deform the spline. With the deformation amount fixed, lightning textures of all kinds of shapes can be obtained simply by moving the spline or randomizing the noise.
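For intuition, the same trick can be mimicked in a few lines of NumPy (a loose analogue, not my Designer graph): a straight polyline stands in for the spline, smoothed random noise stands in for the warp, and with the seed fixed, moving the endpoints produces new bolt shapes just as repositioning the spline does:

```python
# NumPy analogue of the spline + warp lightning: displace a straight
# polyline along its normal with smoothed random noise.
import numpy as np

def lightning(p0, p1, points=64, amplitude=20.0, seed=7):
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, points)
    base = np.outer(1.0 - t, p0) + np.outer(t, p1)      # the straight "spline"
    noise = rng.standard_normal(points)
    smooth = np.convolve(noise, np.ones(5) / 5.0, mode='same')  # soften jitter
    normal = np.array([-(p1[1] - p0[1]), p1[0] - p0[0]], dtype=float)
    normal /= np.linalg.norm(normal)
    envelope = np.sin(np.pi * t)                        # pin both endpoints
    return base + np.outer(smooth * envelope * amplitude, normal)

# Same seed, new endpoints -> a new bolt with the same "warp" character.
bolt = lightning(np.array([0.0, 0.0]), np.array([0.0, 300.0]))
```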
With the lightning added, I achieved an overall effect I was satisfied with.
Conclusion
The most time-consuming parts of this process were the planning at the start of the project and the preparation and testing of the tools. Since I worked on this project in fragments of spare time, I don't have a complete record of the hours spent, but I do clearly remember how long some steps took, especially the creation of the customized content.
For example, for the blue skin material, once I had the idea, I spent the first day in Designer verifying its feasibility and trying out node combinations. On the second day I made adjustments and refinements, and by the third day I was able to test it in Painter.
The Smart Material group for the green carapace was built the same way. Selecting base textures from the asset library, customizing the masks, setting up the Path tool, and verifying the results took about three to five days.
In a workflow like this, the early planning, the customization of materials and tools, and the verification of the process are the most time-consuming parts. Once they are settled, subsequent content production becomes very fast. This is what I like most about Substance 3D: its powerful customization capabilities can greatly improve efficiency, and it is especially well suited to team collaboration. I hope this case lets everyone see more of the possibilities within the Substance 3D ecosystem.
For artists who are new to material creation, I'd like to recommend an excellent Designer fabric tutorial made by one of the many great teachers on the Substance 3D team. The tutorial is very systematic, ranging from fundamentals to concrete project applications, and it can help artists build a solid understanding of how Designer works and how its workflows fit together.
Finally, I'd like to express my gratitude once again to 80 Level for the invitation. I will continue to explore material-related Substance 3D workflows, and I'm looking forward to more opportunities to learn from and communicate with everyone in the future.