
Hair & Fur: Procedural Hair Texture Generation Tool

Olivier Lau reviewed his Hair & Fur tool for making hair/fur textures in Substance Designer: structure, features, capabilities, advantages, and more. 

Introduction

I am a generalist developer who began to look into 3D and game development-related topics about 2 years ago. I initially focused on learning Unreal Engine 4 and photogrammetry techniques, and more recently Substance Designer. At the same time, I was also looking at modeling characters and creatures, which is why I got interested in hair and fur generation.

Approaching Hair Generation

My first introduction to real-time hair was Johan Lithval’s CGMA webinar Creating Hair for Games. I was blown away by the quality of what was done there and by the techniques used, and I finally understood how the different parts come together. However, not being a Maya user, I had to find another way to generate the hair textures. I looked at other tools but was not very successful in producing something easily and in a reasonable amount of time.

From what I could see, the most common way to make hair/fur for games is to use hair cards: low-poly geometry onto which hair clump textures are mapped. These are then handled by a specific shader that can drive various effects such as depth, transparency, anisotropic reflection, etc. The types of texture maps required and the way shaders use them are not common to all workflows; they depend on how the shader works and on performance requirements. The fact that some workflows require specific maps that others do not (sometimes only the map name changes, not the function) creates some confusion among artists, or at least it did for me at first. Hair & Fur can deliver 10 types of maps (including derived ones, not counting all the possible generation modes), which generally covers most usages.

In parallel to hair topics, I was looking at Substance Designer and was especially interested in its programmatic capabilities (functions). What really linked hair texturing to Substance Designer was watching Vincent Gault’s enlightening FX-Map introduction stream (in French). I understood then that most Substance Designer nodes are built upon a reduced set of atomic nodes, and that the FX-Map actually acts as an advanced screen device able to display patterns anywhere in a texture, with various options, most of which can be driven by functions. Adding the ability to iterate makes the FX-Map a freely programmable, self-contained texture generator. I started by implementing a Bézier curve, and gradually it started to look like a tool that could generate hair/fur textures.
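As a rough illustration of that idea (only an analogy: the actual implementation is built from FX-Map iterations and Substance Designer function nodes, not Python, and every name below is hypothetical), a curve-driven pattern placer boils down to sampling a cubic Bézier curve and stamping a small pattern at each sample, oriented along the curve tangent:

```python
# Illustrative sketch (Python, not Substance Designer functions) of a
# curve-driven pattern placer: sample a cubic Bezier curve and return the
# positions and orientations at which small hair patterns would be stamped.
# All names are hypothetical.

import math
from typing import List, Tuple

Point = Tuple[float, float]

def cubic_bezier(p0: Point, p1: Point, p2: Point, p3: Point, t: float) -> Point:
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

def bezier_tangent_angle(p0: Point, p1: Point, p2: Point, p3: Point, t: float) -> float:
    """Angle (radians) of the curve tangent at t, used to orient each stamped pattern."""
    u = 1.0 - t
    dx = 3 * u**2 * (p1[0] - p0[0]) + 6 * u * t * (p2[0] - p1[0]) + 3 * t**2 * (p3[0] - p2[0])
    dy = 3 * u**2 * (p1[1] - p0[1]) + 6 * u * t * (p2[1] - p1[1]) + 3 * t**2 * (p3[1] - p2[1])
    return math.atan2(dy, dx)

def place_patterns(p0: Point, p1: Point, p2: Point, p3: Point, count: int) -> List[Tuple[Point, float]]:
    """Sample the curve 'count' times; each sample is a (position, rotation) pair."""
    samples = []
    for i in range(count):
        t = i / max(count - 1, 1)
        samples.append((cubic_bezier(p0, p1, p2, p3, t),
                        bezier_tangent_angle(p0, p1, p2, p3, t)))
    return samples

# Example: a gently curved strand crossing the texture from bottom to top.
for pos, rot in place_patterns((0.5, 0.0), (0.4, 0.3), (0.6, 0.7), (0.5, 1.0), count=8):
    print(f"stamp at ({pos[0]:.3f}, {pos[1]:.3f}) rotated {math.degrees(rot):.1f} deg")
```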

I want to thank Nicolas Wirrmann from Allegorithmic who provided invaluable help regarding the understanding of Substance Designer programming specificities, advice, and optimization tips. Hair & Fur wouldn’t have been the same without his support.

Hair & Fur Features


Hair & Fur lets users shape hair clumps of various types and styles to be used as hair cards, and generate PBR texture maps plus others typically needed for hair shaders. It also provides tiling tools to assemble clumps into larger textures, so rendering engines have a smaller number of textures to handle. The tool may also be used for 2D works (samples are provided). It features two optional colorization modes; the Depth map, for instance, may also be colored outside of Hair & Fur, either in Substance Designer or in other applications. Samples (examples), Templates (start-up files), and Presets (base designs) are also provided so people can quickly get started with various hair designs.

Even though the tool runs in Substance Designer (SD), only basic knowledge of the latter is required, as most items are self-contained in the tool’s Substances. Substance Academy has great introduction videos covering the basics of the Substance Designer UI, and Hair & Fur also comes with tutorial videos, the first part of which contains an overview of the SD UI.

Hair Clump Shaping

A challenge with hair texture generation is that there are a lot of hairstyles (including coloring), yet the feature set and the user interface need to be limited to a few items only. So every time I look at supporting something to represent a given hairstyle, I need to abstract it from a specific example into something more general that works for this case but also for others. The approach I came up with is to view things at several levels and provide shaping functionalities for each.

The top level, called Parent, represents either the clump shape or a part of the clump. Up to ten Parent Strands can be generated this way, each usually representing some large-scale feature within the clump. They can be individually positioned, rotated, and shown/hidden along with their associated strands (children and subdivisions, as described next). Child Strands are spawned per parent; their count is not limited, and they are used to fill up the clump section associated with their Parent Strand. Then come Subdivision Strands, which are spawned per parent or child strand. They provide thickness, as strands are sometimes aggregated together along a common shape, like in curly hair. Finally, Related Strands share an organizational relationship; they are used to create braids but also cordage.

Strands can be shaped and organized using up to 10 control points, whose positions are defined either by various parameters or by moving them directly in Substance Designer’s 2D view.
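As a mental model of the hierarchy described above (purely illustrative: the tool stores these relationships as Substance parameters and FX-Map iterations, and all class and field names below are hypothetical), the levels could be sketched like this:

```python
# Purely illustrative data model of the strand hierarchy; the actual tool
# expresses these relationships through Substance parameters and FX-Map
# iterations. All class and field names are hypothetical.

from dataclasses import dataclass, field
from typing import List, Optional, Set, Tuple

Point = Tuple[float, float]

@dataclass
class Strand:
    control_points: List[Point]                                   # up to 10 shaping points
    children: List["Strand"] = field(default_factory=list)        # fill the clump section
    subdivisions: List["Strand"] = field(default_factory=list)    # provide thickness
    related: List["Strand"] = field(default_factory=list)         # braids / cordage

@dataclass
class Clump:
    parents: List[Strand] = field(default_factory=list)           # up to 10 parent strands

    def visible_strands(self, hidden_parents: Optional[Set[int]] = None) -> List[Strand]:
        """A hidden parent hides its children and subdivisions along with it."""
        hidden_parents = hidden_parents or set()
        out: List[Strand] = []
        for index, parent in enumerate(self.parents):
            if index in hidden_parents:
                continue
            out.append(parent)
            out.extend(parent.subdivisions)
            for child in parent.children:
                out.append(child)
                out.extend(child.subdivisions)
        return out
```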

Child hair strands can be combed, either along the whole clump length or partially.

Pattern Management

Strands are made of patterns that are either bitmaps or procedural textures generated by Substance Designer. Four default patterns are available, and users can also provide their own, grayscale or color. Pattern size and resolution can be chosen, patterns can be oriented to follow the clump flow or not, and they can be dynamically sized, which reduces the pattern count needed to produce a continuous line. Dynamic sizing can also be used to create a fluff effect, or even straws when pushed to an extreme.

Strand Thickness, Fading, and Length

Strand thickness can be controlled at root and tip along a defined length to which randomness factors can be added. This is mostly useful for thick strands which may be used for stylized hair.

A fade effect at root and tip can be applied to strands in several maps; it can be either uniform or more or less random based on user settings.

Strand length can be controlled as well as root grouping with a proximity factor.
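As a sketch of the kind of root/tip fade profile behind these controls (the exact formula Hair & Fur uses is not shown here, so this is an assumption about the general shape; all names are illustrative):

```python
# Sketch of a root/tip fade multiplier along a strand (t = 0 at root, 1 at tip).
# This is an assumption about the general shape: fade in over 'root_fade',
# fade out over 'tip_fade', uniform or slightly random per user settings.

import random

def fade_factor(t: float, root_fade: float, tip_fade: float,
                randomness: float = 0.0, rng: random.Random = None) -> float:
    """Opacity multiplier in [0, 1] at normalized strand position t."""
    rng = rng or random.Random(0)
    jitter = 1.0 + randomness * (rng.random() - 0.5)      # uniform when randomness == 0
    fade_in = min(t / root_fade, 1.0) if root_fade > 0 else 1.0
    fade_out = min((1.0 - t) / tip_fade, 1.0) if tip_fade > 0 else 1.0
    return max(0.0, min(1.0, fade_in * fade_out * jitter))

# Example: sample the fade profile along one strand.
print([round(fade_factor(i / 19, root_fade=0.15, tip_fade=0.35), 2) for i in range(20)])
```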

Subdivision Strands

Subdivision strands are spawned per parent or child strand (then called the referral strand), usually very close to their referral strand to give it thickness. Users define a spreading space into which they can drift; their dispersion can be controlled, and their depth can be more or less tied to that of their referral strand. Subdivision depth can also be attenuated on the sides of the spreading space (Side Depth); braids use this feature for roundness. Some subdivisions may stray and move away from their spreading space to create flyaway hair. Subdivision strands may also be randomly distributed among their referrals to create diversity.

Modulation

Modulation is used to create wavy/curly hair, as well as braids/cordage when associated with Related Strands. There are two modulation functions, sine (with variations) and triangle; the latter may be used for non-hair designs. Users define the frequency and amplitude of the waves. Amplitude can be more or less randomized, as can frequency through the use of frequency modulation. These settings help with realism, so not every curl is exactly the same.

Depth can also be affected by modulation, creating a 3D effect the placement of which is configurable and can be combined with the actual strand depth. Modulation amplitude can be faded at root and tip, so it is possible to have the root part less wavy than the tip, for instance. Modulation follows the strand flow so it can be used for strands in any direction such as round shapes (curved hair or hair buns).
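To make the frequency/amplitude randomization concrete, here is a simplified sketch (an assumption about the general approach, not the tool's internal functions; per-strand jitter stands in for the full frequency-modulation controls) of a sine or triangle offset along one strand:

```python
# Sketch of strand modulation: a sine or triangle wave offsets the strand
# perpendicular to its flow, with per-strand amplitude and frequency jitter so
# curls are not all identical. Parameter names are illustrative.

import math
import random

def strand_modulation(num_samples: int, frequency: float, amplitude: float,
                      wave: str = "sine", amp_jitter: float = 0.0,
                      freq_mod: float = 0.0, seed: int = 0) -> list:
    """Lateral offsets along one strand; per-strand randomness comes from 'seed'."""
    rng = random.Random(seed)
    amp = amplitude * (1.0 + amp_jitter * (rng.random() - 0.5))   # per-strand amplitude jitter
    freq = frequency * (1.0 + freq_mod * (rng.random() - 0.5))    # per-strand frequency jitter
    offsets = []
    for i in range(num_samples):
        t = i / max(num_samples - 1, 1)
        phase = freq * t
        if wave == "triangle":
            value = 4.0 * abs(phase - math.floor(phase + 0.5)) - 1.0   # triangle wave in [-1, 1]
        else:
            value = math.sin(2.0 * math.pi * phase)
        fade = max(min(t / 0.1, 1.0) * min((1.0 - t) / 0.1, 1.0), 0.0)  # weaker near root/tip
        offsets.append(amp * value * fade)
    return offsets

# Example: one curly strand with slight per-strand variation.
print([round(v, 3) for v in strand_modulation(16, frequency=6.0, amplitude=0.05,
                                              amp_jitter=0.4, freq_mod=0.2, seed=3)])
```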

Texture Map Generation

Hair & Fur outputs multiple map types; some are useful for shaders, others are more suited to 2D work.

  • Mask: binary mask distinguishing hair strands from the background.
  • Alpha (Opacity): this output retains the opacity of the pattern and supports fade effects at root and tip.
  • ID (grayscale/color): each hair strand is represented by a random gray level or color with configurable ranges. Shaders may use this to distinguish hair strands from each other and apply a distinct effect to each.
  • Depth (Height): this map contains height information driven by depth profiles that can be generated automatically or user-provided. It can be used as a pixel depth offset by shaders to provide a sense of depth to the hair. It can also be used for other purposes, such as a specular mask or intensity.
  • Gradient Ramp (Root map): similar to the Mask map but using a configurable gradient at the root. This is used by shaders to change the hair color (darkening it, for instance) towards the root.
  • Color (Diffuse/Albedo): the Color map may not always be required by shaders, as it can be derived in real time from other maps and user parameters. However, it can be useful for 2D works or shaders requiring such an input. Hair & Fur provides various ways of coloring hair: along the strand’s length (Length colorization), by hair groups (Group colorization), depth-driven, and by a combination of modes. This is detailed further below.
  • Flow (Direction): a vector map containing the directions of hair strands along the clump flow. This is used by shaders to drive the anisotropic reflection so it follows the hair flow (a minimal encoding sketch follows this list). The flow information can be generated per hair strand or more globally for the clump (or anything in between). It also generates a directional background (solid or using dilation).
  • Normal: the Normal map is derived from the Depth map using a Substance Designer node. This map may not be required by all shaders.
  • Ambient Occlusion: also derived from the Depth map, it may not be required by all shaders.
  • Utility: this map is generally not exported but used while working on the clump design. It can display any other map with an optional control point overlay as well as the Flow map using direction patterns so the user can verify the flow is as expected.
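Regarding the Flow map above, the common convention, assumed here and worth checking against the target shader, is to remap a unit 2D direction from [-1, 1] into the red/green channels:

```python
# Sketch of the common Flow/direction map convention: a unit 2D direction is
# remapped from [-1, 1] into the [0, 1] red/green channel range. Whether a given
# hair shader expects exactly this layout is an assumption to verify.

import math

def encode_direction(angle_radians: float) -> tuple:
    """Encode a strand direction angle as (R, G) channel values in [0, 1]."""
    dx, dy = math.cos(angle_radians), math.sin(angle_radians)
    return ((dx + 1.0) * 0.5, (dy + 1.0) * 0.5)

def decode_direction(r: float, g: float) -> tuple:
    """Inverse mapping, as a shader would perform it."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0)

# A strand flowing straight towards the tip of the card (90 degrees):
print(encode_direction(math.radians(90)))   # approximately (0.5, 1.0)
```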

 

Depth Management

Hair depth is managed through an independent depth profile generator, which produces a texture where each column represents a distinct depth profile that can be associated with a hair strand.

Depth profiles are generated automatically with general generation parameters such as frequency/amplitude of the depth variations. The first 10 columns of the depth profile texture are reserved for parent strands. These can either use general/specific generation parameters or even a user-provided external depth profile (which may be generated by SD nodes or other apps).
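As a rough illustration of what such a profile texture contains (the tool’s own generator is more elaborate; the parameter names and the sine-based formula below are assumptions), each column stores one depth curve along the strand length:

```python
# Sketch of a depth-profile texture: each column is one profile (depth along the
# strand, row 0 = root, last row = tip); a strand samples the column assigned to
# it. Frequency/amplitude mirror the kind of controls described above.
# This is an illustration, not the tool's actual generator.

import math
import random

def depth_profile_texture(width: int, height: int, frequency: float,
                          amplitude: float, seed: int = 0) -> list:
    """Return 'width' columns, each a list of 'height' depth values in [0, 1]."""
    rng = random.Random(seed)
    columns = []
    for _ in range(width):
        phase = rng.random() * 2.0 * math.pi          # each profile gets its own phase
        column = []
        for row in range(height):
            t = row / max(height - 1, 1)
            base = 0.5 + amplitude * math.sin(2.0 * math.pi * frequency * t + phase)
            column.append(max(0.0, min(1.0, base)))
        columns.append(column)
    return columns

# Example: 64 profiles, 256 samples each; the first 10 columns could be reserved
# for parent strands as described above.
profiles = depth_profile_texture(width=64, height=256, frequency=3.0, amplitude=0.25)
print(len(profiles), len(profiles[0]))
```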

Colorization

Hair & Fur has two colorization modes, Length and Group, which can be used independently or combined. Both operate using a user-provided color source texture. Length colorization colors hair strands along their length, while Group colorization picks color groups from the color source and applies them to groups of strands. The two modes can be combined using multiple blend modes such as Multiply, Soft Light, Overlay, etc.
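For reference, the standard per-channel formulas behind these blend modes look like the following; Multiply and Overlay are the usual definitions, while Soft Light exists in several variants, so the Pegtop formula shown here may differ slightly from the one Substance Designer implements:

```python
# Per-channel blend formulas commonly used to combine Length and Group
# colorization results (values in [0, 1]). Multiply and Overlay are the standard
# definitions; "Soft Light" has several published variants, the Pegtop formula
# is shown here and may differ from the exact one Substance Designer uses.

def multiply(base: float, blend: float) -> float:
    return base * blend

def overlay(base: float, blend: float) -> float:
    return 2.0 * base * blend if base < 0.5 else 1.0 - 2.0 * (1.0 - base) * (1.0 - blend)

def soft_light(base: float, blend: float) -> float:
    return (1.0 - 2.0 * blend) * base * base + 2.0 * blend * base

# Example: combine a Length-colorization value with a Group-colorization value.
length_value, group_value = 0.6, 0.35
print(multiply(length_value, group_value),
      overlay(length_value, group_value),
      soft_light(length_value, group_value))
```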

Both Group and Length colorization can be applied to groups of hair: Parent, Child, or Related Strands (braids). With Group colorization, each parent and its relatives (children, subdivisions) or each child strand and its relatives (subdivisions) can use a distinct color group. For braids, each braid component may use a different color group.

Color source textures can be organized in various ways; below are some examples as well as the results they can produce.

Length colorization usually uses a color source organized by rows, as each column represents the potential color variations of a hair strand along its length. Colors do not have to be uniform within each row, though, and below are examples of what can be achieved by modulating either the luminosity or the color along the texture width.

Regarding Group colorization, the illustration below shows how color groups are built as a function of the color source organization. With no specific color organization, a group is defined by a circle whose center is picked randomly and whose radius is configurable. For color sources organized horizontally or vertically, the group is made using a random center and an extent in the direction where colors vary. Once a group is defined in the color source, random colors are picked inside it and assigned to hair strands grouped by either Parent, Child, or Related Strands.
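A minimal sketch of that picking logic, as inferred from the description above (the actual sampling happens inside the Substance, and all names here are illustrative):

```python
# Sketch of Group colorization picking: a color group is a region of the color
# source (a disc for unorganized sources, a band for horizontally/vertically
# organized ones); each strand group then receives random colors picked inside
# that region. Illustrative only.

import math
import random

def clamp01(v: float) -> float:
    return max(0.0, min(1.0, v))

def pick_group_region(organization: str, extent: float, rng: random.Random) -> dict:
    """Choose the region of the color source that defines one color group."""
    if organization == "horizontal":   # colors vary along X: a band around a random X
        return {"kind": "band_x", "center": rng.random(), "extent": extent}
    if organization == "vertical":     # colors vary along Y
        return {"kind": "band_y", "center": rng.random(), "extent": extent}
    # No specific organization: a disc with a random center and configurable radius.
    return {"kind": "disc", "center": (rng.random(), rng.random()), "radius": extent}

def pick_color_uv(region: dict, rng: random.Random) -> tuple:
    """Pick a random UV inside the group region; the texel there colors one strand."""
    if region["kind"] == "disc":
        angle = rng.random() * 2.0 * math.pi
        dist = region["radius"] * math.sqrt(rng.random())   # uniform over the disc area
        cx, cy = region["center"]
        return (clamp01(cx + dist * math.cos(angle)), clamp01(cy + dist * math.sin(angle)))
    offset = (rng.random() - 0.5) * 2.0 * region["extent"]
    if region["kind"] == "band_x":
        return (clamp01(region["center"] + offset), rng.random())
    return (rng.random(), clamp01(region["center"] + offset))

# Example: one color group shared by a parent strand and its relatives.
rng = random.Random(42)
group = pick_group_region("none", extent=0.15, rng=rng)
print([pick_color_uv(group, rng) for _ in range(5)])
```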

It is also possible to get coloring without using the colorization features; here is an example where the Depth map has been colored using SD nodes.

Hairstyles

Hair & Fur can generate multiple hairstyles: straight, wavy, curly, large curls, braids (and cordage), as well as round shapes suitable for hair card buns. Note, however, that the tool currently only generates textures, so designs such as braids or large curls are generally used on hair cards for small features. For larger ones, 3D positioning, such as interlacing the cards for braids, is needed for more realism; the hair textures then do not need to be braids but straight hair, as the 3D model handles the braid design.

Braids may have different aspects determining the frequency and amplitude of modulations, the spacing between components, their overall shape, how much they are compressed when passing under other components, adding flyaway hair, etc.


Large curls can also be configured in various ways regarding their thickness, depth aspect and how organized they are.

Tiling

Within the context of hair cards, several hair clumps are generally assembled into larger textures in order to reduce the texture count for game engines. For this, tiling tools are provided:

  • Tiler (grayscale and color): tiles one type of texture, grayscale or color; for instance, it creates a tiled Depth map or a tiled Flow map.
  • Global Tiler: tiles several types of textures at once, i.e. all the hair clump texture types of all the hair generator instances are tiled at once. An oriented background can also be generated for the tiled Flow maps.
  • AutoCrop: crops clump textures based on an aspect ratio.

 

These tiling tools can operate on any texture, not only hair textures, so they may also be used to organize atlases, for instance.

Tiling tools let the user set the aspect ratios of both the inputs (the clump textures to tile, generally square) and the outputs (here a rectangular 2:1 texture). Clump spacing/sizing/offsetting can be applied to all clumps at once, so we obtain a tiled texture very quickly. We can then independently position/rotate/size clumps that require further adjustments. Note that during this processing, the input clumps are never upscaled: the various scaling adjustments they receive are combined and then applied to the original clumps, resulting in downscaling only, which preserves texture quality.
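To make the downscaling-only rule concrete, here is a sketch of the idea (not the tool’s actual code): scale adjustments are accumulated, applied once to the original clump, and capped so the clump is never upscaled:

```python
# Sketch of the "never upscale" rule described above: scale adjustments made
# while laying out the tiled texture are accumulated, then applied once to the
# original clump, and the combined factor is capped at 1.0 to keep quality.
# This illustrates the idea, not the tool's actual implementation.

def combined_scale(global_scale: float, per_clump_scale: float,
                   input_size: int, cell_size: int) -> float:
    """Final scale factor applied to the original clump texture."""
    fit_to_cell = cell_size / input_size          # scale needed to fit the layout cell
    requested = global_scale * per_clump_scale * fit_to_cell
    return min(requested, 1.0)                    # cap: downscale only, never upscale

# Example: a 1024px clump placed into a 512px cell, then enlarged 1.3x by the user.
scale = combined_scale(global_scale=1.0, per_clump_scale=1.3, input_size=1024, cell_size=512)
print(scale, "->", int(1024 * scale), "px")       # 0.65 -> 665 px, still a downscale
```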

Benefits of Procedural Texturing

Creating hair textures procedurally has several benefits:

  • Procedural generation enables the user to work independently on settings impacting certain aspects of the hair while combining them with other settings dealing with other hair properties non-destructively. We can shape the clump in a specific way and check how it would look with curly hair, or slight waves, or straight hair, all this without altering other clump settings.
  • Hair & Fur parameters control both fixed and random-based settings. The latter generate random values derived from a user-configurable random seed, so the random values change when the seed changes. By simply moving a slider, it is possible to generate thousands of variations of a given design, browse through them, and pick the ones we like. Moreover, restoring a given random seed value always restores the exact same design, which can be saved into a Preset and re-opened later (see the sketch after this list).
  • We can create a library of Presets representing various designs and use them as bases for new designs.
  • We are not constrained by resolution: it is possible to make an 8K hair texture if we want a lot of detail, and to use the same design at 1K or less when tiling it with other textures into a common texture map.
  • Designs may be visualized in 3D which helps in particular for depth calibration.
  • There is no need to bake, what you design is exactly what you get in the final texture.
  • Possibilities are augmented by combining the features of Hair & Fur with those of the host software, Substance Designer, where textures can be pre-processed or post-processed in many ways.
  • If for some reason one could not make a specific design with a single instance of the hair generator, it is possible to combine the outputs of several instances using dedicated SD blend nodes. 
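As a small illustration of the random-seed behavior mentioned in the list above (with hypothetical parameter names), seeded generation works roughly like this: the same seed always yields the same set of random-based values, and a new seed yields a new variation:

```python
# Illustration of the random-seed behavior described above: random-based
# settings are derived from a user-configurable seed, so changing the seed
# browses variations and restoring a seed restores the exact same design.
# The parameter names are hypothetical.

import random

def design_variation(seed: int) -> dict:
    """Derive a set of random-based settings from one seed."""
    rng = random.Random(seed)
    return {
        "strand_length_jitter": rng.uniform(0.0, 0.2),
        "curl_amplitude_jitter": rng.uniform(0.0, 0.1),
        "stray_hair_ratio": rng.uniform(0.0, 0.05),
    }

# Moving the seed slider browses variations; the same seed always gives the same values.
print(design_variation(7))
print(design_variation(8))
assert design_variation(7) == design_variation(7)
```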

Integration into Substance Designer

Substance Designer is a procedural texture generator, so Hair & Fur is in a well-suited environment there. The benefits of operating in this context are mainly modularity, the non-destructive aspect, and the additional functionalities which, combined with those of the hair generator, lead to even more possibilities. Here is a non-exhaustive list of features provided by the interactions with SD:

  • Users can provide their own patterns to draw hair strands, which can be generated using SD nodes or an external app.
  • SD nodes can generate color gradients to provide color sources for the colorization features. These color sources can be tweaked by any SD node for brightness/contrast/saturation/color variation/deformation, etc. before being passed to the hair generator.
  • Depth profiles can be processed with any SD node before being passed to Hair & Fur; many effects can be created this way, beyond what the hair generator can do itself for depth profile configuration.
  • Due to the procedural nature of SD, nodes being placed ahead of Hair & Fur can be tweaked while visualizing the hair outputs. This way we can change coloring or depth settings on an SD node while looking at the result directly on hair in the 2D or 3D views.
  • Output maps can be further processed and blended together after their generation. For instance, blending the Depth and Color maps using a Multiply operation generally produces good results for 2D works.
  • 2D artists can overlay hair on top of a character image using a Blend node and modify the hair design in real-time while seeing how it fits on the character.
  • Derived maps such as Normal and Ambient Occlusion can be generated using SD nodes with all the options these nodes provide (normal map algorithm selection, OpenGL or DirectX format, various AO properties, etc.).
  • Maps can be converted and packed into texture channels before exporting using appropriate SD nodes.
  • Hair textures can be visualized and rendered in 3D (using displacement) with OpenGL or iRay renderers. This helps to visualize and potentially fix the depth effects as this may not always be obvious in 2D (like for clipping).

Hair Textures Usage in Applications

The generated textures can be used in 3D applications and sometimes also in 2D ones with appropriate map blending, as mentioned above. To that end, the provided samples show various examples of what can be made for 2D.

The output textures can generally be used as-is in 3D applications such as game engines; they do not need further adjustments. If adjustments are desired, though, they can be made inside SD by further processing the outputs, or in an external application. The package comes with a Toolbag template, and the documentation contains usage instructions for the Unreal Engine 4 hair shader and Toolbag.

Finally, I tried to make it easy for users to get into the app by providing a detailed manual covering all parameters, various concepts, and tips on making specific hairstyles, a free try-out app, and a four-hour, five-part video tutorial series:

  • Part 1: Substance Designer intro, basic Hair & Fur concepts and output maps
  • Part 2: Hair & Fur parameters continued
  • Part 3: Colorization
  • Part 4: Tiling and map usage in Toolbag and UE4
  • Part 5: Seven hair design examples

 

I can’t wait to see how artists are going to use these tools! Feedback is greatly appreciated (for instance using the Polycount thread), it will help to enhance and create additional features along with those currently being planned. For further updates, you may check my Twitter and Artstation accounts. Have fun making hair!

Olivier Lau, Developer / Technical artist at Eyosido Software

Interview conducted by Kirill Tokarev

 
