Jonathan Caro shows how you can use Unity to build some very interesting lighting with cool little details and beautiful materials.
Introduction
Hey, 80 LEVEL readers, my name is Jonathan Caro and I’m a 3D supervisor working at a mid-sized media agency located in New Jersey, USA. Art, computers, and video games have been a part of my life for over two decades now; however, I’ve only just started diving heavily into real-time rendering and authoring 3D assets for game engines over the past 2 years.
My professional background stems from offline-rendered commercial, broadcast, and digital signage projects that primarily cater to major automotive manufacturers. I’ve been working with 3D Studio Max, Chaos Group’s V-Ray, and The Foundry’s Nuke for all offline-rendered projects. Offline rendering means primarily utilizing a CPU to fully ray-trace an image and output a single frame or a sequence of images, which then goes through a compositing process. Two professional projects of note that jump-started my interest in real-time rendering and authoring 3D content for game engines are the Mercedes-AMG Power Wall and Cadillac in Virtual Reality, both of which are powered by Unity. Cadillac in Virtual Reality inspired me to learn more about hard-surface modeling, texturing, scene layout, and lighting for game engines, as my team and I at the time were given a concrete deadline to develop a VR application, something we had never done before in our careers.
Lighting in Unity
The “Light & Color Study” series I’ve recently been working on has been a set of concentrated exercises in leveraging Unity’s built-in lighting systems to achieve as close a match as possible to a reference image. The main elements of Unity’s lighting systems that I’ve utilized are:
- Environment ambient lighting controlled by a single color
- Real-time spot, point, area, and tube lights (the area and tube lights are from the Adam demo)
- Baked emissive materials, associated with geometry
- A light probe network to light non-static objects (i.e., dynamic objects) with approximated indirect lighting. I use this on smaller props, or non-hero objects, to reduce the number of lightmaps needed at runtime and keep bake times down.
- Real-time, baked and custom reflection probes
- Post-processing stack v2
- Enlighten’s lightmap scene baking parameters and per-object lightmap settings
All of these elements work in conjunction with Enlighten’s global illumination to provide responsive direct lighting, clean indirect bounce lighting, efficient bake times, and small runtime lightmap file sizes.
The driving force behind achieving a film-like look within Unity, however, is the engine’s ability to work in linear color space with high-dynamic-range (HDR) tone mapping. These are the first two settings I enable when creating a new Unity project.
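To see why linear color space matters for lighting, here is a minimal Python sketch of the standard sRGB transfer functions (the function names are my own, not a Unity API): light only adds and multiplies correctly in the linear domain, so gamma-encoded texture values must be linearized before shading and re-encoded for display.

```python
def srgb_to_linear(c):
    """Convert one sRGB channel value in [0, 1] to linear space
    using the standard piecewise sRGB transfer function."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Inverse transform: a linear channel value back to sRGB."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
```

For example, a mid-gray sRGB value of 0.5 corresponds to only about 0.21 in linear space, which is why lighting computed directly on gamma-encoded values looks washed out.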
As I mentioned earlier, using light probe networks is a great way to speed up runtime performance and reduce the number of lightmaps required for baking.
A neat trick is to create empty game objects as children of your individual props, which in turn are used as light probe anchor points. Using these anchor points, you can tell the prop where it should receive the approximated indirect lighting controlled by the probe network.
Parameters
First and foremost, developing a strategy for which objects will be marked as lightmap static will help guide your scene layout and dictate your baking time. Marking an object as lightmap static means the mesh will be present in all lighting calculations and contribute its UVs to a lightmap. I almost always have a scene’s walls, ceiling, floor, and stationary large hero objects marked as static. I use the term “large” very loosely here; generally, any object as large as a table will be marked as static. Additionally, I keep each mesh’s lightmap parameters at scene default or low resolution.
Keeping lightmap settings low when first creating lighting is critical to working iteratively without being constrained by bake times. The progressive lightmapper’s default settings are a great starting point for light layout. Having the ability to prioritize baking of texels in the camera’s FOV is conducive to working at an efficient pace. When the light layout is at a favorable spot, I switch to Enlighten and begin increasing lightmap resolution and padding by powers of 2. This continues until I’ve reduced light leaks and GI splotching. Moreover, I may create specific lightmap parameters for static objects that are close to the camera, to showcase cleaner GI when they are in direct camera view.
The latest scene
I choose to work at a 1:1 scale when blocking out geometry, so I always set 3D Studio Max’s units to 1 Unit = 1 Inch and display units to Feet w/Decimal Inches. This helps establish scene scale and object believability when trying to match a single reference image.
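The display-unit conversion behind this setup is simple enough to sketch in a few lines of Python (a hypothetical helper, mirroring Max’s Feet w/Decimal Inches readout under the 1 unit = 1 inch assumption):

```python
def format_units(units):
    """Format a length in scene units (1 unit = 1 inch) as feet
    with decimal inches, e.g. 30 units -> 2'6.0"."""
    feet, inches = divmod(units, 12)
    return f"{int(feet)}'{inches:.1f}\""
```

Keeping the engine’s unit equal to a real-world inch means a 30-unit countertop reads as 2'6.0" at a glance, which makes it easy to sanity-check object sizes against the reference image.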
When blocking out larger objects such as the walls and stairs, I try not to deviate too far from their primitive beginnings, letting texture and light define the silhouette of the structural form. Additionally, introducing chamfers on sharp mesh edges helps reduce aliasing at runtime. These days, polycounts don’t need to be as strict as they once were for real-time rendering.
Another aspect of scene setup/blockout that often gets overlooked is object naming conventions. I adhere to a strict set of rules that guarantee objects will always have an associated material and texture maps in the engine, and any artist would be able to re-associate new maps/materials without having to go digging for assets. As an example:
- Object “int_wall_a_01” denotes an interior wall of style “a”. The enumeration suffix is for object replication. Object “int_wall_a_01” will be given a material named exactly the same.
- Material “int_wall_a_01” will be supported by textures that follow the exact same convention; however, the suffix appends the type of map, e.g., “_normal”, “_albedo”, “_ao”, etc.
- If a material for the aforementioned object needed a version with an orange tint in the albedo, I’d introduce a unique suffix after the enumeration, labeled “_orange”. That way, material “int_wall_a_01_orange” still has the correlated texture maps and still gets applied to the same object. The only unique attribute is the orange color shift of the material.
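One reading of the convention above can be sketched as a small Python helper (the function and map-type list are my own illustration, not part of any pipeline tool):

```python
def asset_names(obj_name, variant=None):
    """Derive the material name and texture-map names from an object
    name such as 'int_wall_a_01', following the convention described
    above. `variant` is an optional tint suffix like 'orange'."""
    material = obj_name if variant is None else f"{obj_name}_{variant}"
    # Each texture map appends its type to the material name.
    maps = {t: f"{material}_{t}" for t in ("albedo", "normal", "ao")}
    return material, maps
```

With this scheme, any artist can go from an object in the hierarchy straight to its material and texture files without digging through the asset folders.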
When I’ve reached a decent composition in Max relative to the reference image and have authored UVs for surfaces I’d like to mark static, I prep the objects for export and bring them into Unity for initial light setup.
Composition
The one conscious decision I made for this scene was a deadline of 3 days max, in order to keep myself focused and not get sidetracked. As I mentioned earlier, I aimed to keep geometry not too complex and close to its primitive state, which may very well be contributing to the sharp look. The intention was to pass Unity lightweight geometry so the application overhead could be spent on lighting.
That being said, I did deviate from the reference image in order to direct the audience’s eye along the left wall towards the signage. I achieved this by having the piping on the ceiling skewed to the left in unison with angled area lights to splash specular highlights along the left wall. This naturally guides the eye towards the center without introducing too many complications that would distract the viewer.
Bakes
All of the baked lighting is driven by one emissive material associated with six cylinder meshes. The image above illustrates the advantage of baked material emission. The bake is able to fill the geometry with smooth light that produces a wonderful luminance in conjunction with soft shadows and ambient occlusion.
The settings used in the image above are what produced the scene’s lightmap bake. Total baking time for this scene was approximately 2 minutes. Less is more with this approach; as you’ll see in the real-time lighting UV charts, the low indirect lightmap resolution and efficient UVs for lightmap-static meshes allow this scene to run at 4K at 100+ FPS on my i7 4770K, GTX 980 Ti, and 16 GB of DDR3 RAM.
Real-time lighting
The real-time lighting is being driven by both direct and indirect lighting. The area and tube lights from the Unity Adam Demo are primarily contributing to the direct lighting and specular highlights found in the scene. The area lights produce great soft shadows, however, the tube lights do not have a proper Percentage-Closer Soft Shadows (PCSS) implementation. Each tube light can have two shadow planes. Shadow planes cut off the light’s influence and have a controllable feather.
The image above displays the indirect contribution from three default Unity point lights. I used these three point lights with varying range sizes and indirect multipliers to better separate the foreground, midground, and background. The foreground in the subway exit scene has a lower luminosity in order to pronounce the midground. The background picks up a hint of orange when Unity composites the baked emissive light, real-time indirect light, and textures.
I’d recommend reading the Unity documentation on the different GI scene draw modes and what each represents. Understanding how to utilize the different GI scene draw modes to adjust lightmap parameters will aid in keeping bake times nominal and light maps clean and efficient.
Color
Firstly, as mentioned earlier, when starting a new Unity project I set the color space to linear and mark the main camera as HDR. This provides correct support when the color grading mode is set to High Definition Range. Secondly, I select the Academy Color Encoding System (ACES) tonemapper, as this drives the film-like look of the final grade. Beyond being the film industry’s standard tone mapping, it also produces a consistent and predictable image across a wide range of display devices.
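To give a feel for what an ACES-style tonemapper does, here is a Python sketch of Krzysztof Narkowicz’s widely used curve fit — an approximation of the ACES filmic response, not Unity’s exact implementation:

```python
def aces_film(x):
    """Approximate ACES filmic tonemapping (Narkowicz fit).
    Maps a linear HDR luminance x >= 0 into display range [0, 1],
    compressing highlights while preserving midtone contrast."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)  # clamp to displayable range
```

The curve rolls off smoothly toward white instead of clipping, which is a big part of why HDR light sources in the scene read as film-like rather than blown out.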
Unity’s post-processing systems, specifically the color grading effect, feel right at home when compared to a grade node in The Foundry’s Nuke. Obviously, the number of options pales in comparison to a full-featured post-production suite for offline rendering; however, the same logic and core tools apply: color wheels, lift, gamma, gain, channel mixing, curves, etc.
Specifically for the subway exit piece, I wanted the grade to do the heavy lifting on color and tone. With this in mind, I rarely deviated from the default albedo color of materials and stayed as close to true light color for the emissive material and lights as I could. During the grade, I started with a color filter [Hex #9B9E62 / R:0.6 G:0.6 B:0.38], upped the contrast, and went from there as my baseline adjustments. Since green was the predominant color applied by the filter, I used lift to introduce yellow/orange into the shadows. I also increased the gamma a tiny bit to introduce some blue/violet, and increased the gain slightly so my brightest sources of light lean slightly towards green/cyan. Have a look at my grade settings below.
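The hex and float forms of that color filter are just two encodings of the same value; a quick Python sketch (a hypothetical helper) shows the conversion — #9B9E62 works out to roughly (0.61, 0.62, 0.38), matching the rounded R:0.6 G:0.6 B:0.38 readout:

```python
def hex_to_rgb01(hex_code):
    """Convert a hex color like '#9B9E62' into normalized 0-1 floats,
    the form a color filter value takes in a grading UI."""
    h = hex_code.lstrip('#')
    return tuple(int(h[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
```

Seeing the channels as floats makes the grade’s bias obvious: red and green sit roughly equal and well above blue, which is exactly the green-dominant cast the filter pushes into the image.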
Is this setup viable for a game?
I suppose my answer to this is quite subjective, as the scene was built with the end result being an executable that highlights the graphical capabilities and lighting leveraged in Unity 2017.3.1f1. Since this isn’t a game in itself, I’m able to use whatever bandwidth is needed to output this at 4K at 100+ FPS.
With that being said, the majority of my normal and metalness maps are limited to 512×512 in Unity. Many materials share the same detail normal map, and the light fixtures and some pipe geometry are driven by a single default Unity material with no texture maps. The ceiling and the ground below the stairs use custom lightmap parameters set to a resolution lower than the default “very low resolution” setting. So there are some efficiencies in this scene; however, I imagine I’d have to cut back on albedo texture resolution, the number of area and tube lights, scene lightmap settings, and other things if this were used for an actual production.
In closing, matching film-like lighting is a fun and challenging exercise, and the current tools in Unity do a great job of giving an artist the control he or she needs. I’d like to thank the Archillect Twitter feed for its constant bombardment of imagery that has made me want to jump into Unity and create! Moreover, Neill Blomkamp and his team at Oats Studios have jump-started my interest in leveraging Unity for high-quality VFX work. As someone who hails from a background in traditional offline-render production pipelines, the speed at which I’m able to iterate and create using Unity is, in my opinion, poised to shake up VFX as we know it. I’m looking forward to what improvements Unity 2018 and the HD render pipeline bring to the table. Thanks for this opportunity, 80 Level!