Destiny 2: VFX Production Tips

Mike Stavrides, Senior VFX Artist at Bungie, talked about VFX for games and the way he worked on incredible effects for Destiny 2.

Introduction

80lv: Could you introduce yourself to us? Where do you come from, what do you do, and what projects have you worked on? How did you get into VFX in general and end up at Bungie?

My name is Mike Stavrides, and I am a Senior VFX artist at Bungie. I have been with Bungie for almost 6 years now, and have worked on every Destiny release.

I have wanted to work in games since I was about 13 years old, and it took me a long time to reach that goal. My desire to create games mainly stems from my love of these games: Meridian 59, EverQuest, Final Fantasy Tactics, Command & Conquer: Red Alert, and the Doom series.

My career actually started over on the East Coast, and I found my way into contract work for quite a few years. I worked on cell phone content, commercials, educational cartoons for children, and medical animations. For most of these jobs, I was a modeler or a generalist.

I would try to learn as much as I could to keep myself relevant and employed for as long as possible. These jobs were all contract work, and I wanted each one to last as long as it could. I would start on modeling for a project, then usually help with rigging, lighting, and rendering. This let me stay for the entire length of a project and also taught me many skill sets.

It was when I was working remotely on children's cartoons for a company in Japan that my wife and I moved to the West Coast. We had the safety net of already knowing some friends near Seattle, and I was able to take my job with me since I was working remotely from home. I was also eager to move to Seattle because it is a video game hub. If I was going to find my way into games, I had to be close to where the jobs are. I eventually found my way to a Microsoft contract, and this is where I started to get my first real-time experience. I was hired to model, texture, and rig a character that would be used with Kinect for some demos that they brought to colleges. They kept me on for about a year. We made various projects, some real-time, some pre-rendered, but this was my start in real-time work.

Luckily for me, a friend I had met in college was already working at Bungie. This was 2012. Bungie had figured out what they wanted to make for Destiny, and they were starting to hire more people for it. My friend kept talking me up, and I was able to get an interview for a new team being created. The team was internally called Spec Ops, and the idea was that this would be a small multi-disciplinary team that would handle the spectacle moments in the game. We needed to be as self-sufficient as possible so that we could take an idea from start to finish. I first started out doing technical setup, rigging, and creating doors. Tasks quickly grew more and more complex, and, before I knew it, we were creating large story moments for the game.

As time went on, I kept noticing that the bottleneck was always FX. Everything in Destiny has little glow bits on it or space magic of some kind, and there were just never enough people to create the amount of VFX content that was wanted. Seeing this, I started asking questions about shader VFX to anyone who would listen and help me learn. I would also look through files that VFX artists were creating and try to deconstruct them. I used that knowledge to start making very simple glow cards that I would use on my tasks. From there, I kept asking more and more questions, took those learnings to my next task, and tried to push what I was able to do a little more each time. Before I knew it, I was working on the original Ghost shader VFX and creating the crystal shader for the Vault of Glass.

The great thing about VFX is that it is a conglomerate of many skill sets that make an effect sing. All of my learning from those previous jobs started clicking together. I was able to model and UV my shells for shader VFX, add lights to support the FX, rig and animate patterns, and I knew the engine inside and out well enough to make it do what I wanted. I think I officially became an FX artist somewhere during Rise of Iron, but by then the line was so blurry because of all of the FX work I was already doing. I always liked working in CG, but it wasn't until I started making VFX that I realized that this was truly where my passion lay.

One of the things that I love about VFX is that hardly anyone goes to school for it. It is like an old-style apprenticeship where you find someone and, for some reason, they decide to hand down all of their learnings to you. Then you become a combination of that teacher and your own unique style. It's beautiful.

VFX in Games

80lv: Could you share your general philosophy in terms of VFX? What are the main elements that you feel are most important for every VFX production?

The most important role of VFX in games is communication, especially in Destiny. We make many assets that players need to interact with. Many of them have different states, and each of those needs to be readable during the chaos of Destiny's moment-to-moment combat. Players need to immediately understand whether something is dangerous, whether it has a timer, and where the bounds are for a capture location or an area-denial hazard. At the end of the day, gameplay and the mission's design win out. We want it to look as beautiful as we can, but if it isn't working for gameplay, then we need to stop, ask why it isn't working, and make corrections. I am a huge believer in playtesting the game and seeing how the assets are being used in the space and how players are able to understand them.

VFX is all about being deceptive, especially real-time VFX. Most assets are just a series of simple particles or mesh shaders all layered together to look very complex. The Destiny VFX tools are quite powerful, and we are empowered to go outside the realm of what most companies have a VFX artist do. We currently don't use flip books to make Destiny VFX; instead, we have a library of textures that we mix and match to make all of the different visuals. I am always trying to mix and match in a way that someone can never catch what the true textures looked like. Obfuscation is a VFX artist's friend. We often hide motion by making it omnidirectional, and our shader networks sometimes get very complex.

Many of the assets I work on incorporate lighting techniques, mesh shader FX, particle FX, post process, camera shake, and rumble. It is the combination of all of these elements that can make a great-looking visual effect. I often find a little camera shake goes a long way in selling an effect. If something feels flat but should feel larger, ask yourself if a camera shake is appropriate.

Gambit Motes VFX

80lv: We'd really like to concentrate on some of the pieces you've done. You don't have to go fully in depth here, but it would be awesome if you could at least give a general idea of how these are created, so people could try to do something similar in their personal projects. The Gambit Mote looks amazing! Could you explain how you've done this beautiful lighting effect where everything is sort of reflected and cracked and shines in a different order? How was this illusion of the glass endlessly reflecting inside created?

The Gambit Motes were one of the most fun, and difficult, assets I worked on for Destiny 2: Forsaken. The asset had to work in a wide variety of lighting conditions, draw the player's attention, and give players information about where it is and how much time is left before it vanishes.

Most of the Gambit Mote VFX are done with mesh shaders. I chose this route for a few reasons. Combat, in general, consumes a large amount of the particle budget for the engine. We have a priority system that helps determine the importance of VFX: what should be on screen and what can be omitted from an effect. Using meshes lets me get volume more easily, and it also lets me use different shader techniques and render blend modes. The Motes are a combination of additive, alpha blend, and opaque shaders all working together. Also, because the asset is mostly mesh-based, it gets around the run-time limits that could cause the priority system to kick in and start removing other VFX.

The reflected dancing lighting effect is the shader that we are going to focus on for this write-up. For this particular part of the mote, I decided to use an opaque shader with emissive properties. I had a few early versions that didn't have any opaque element, and I was getting consistent playtest feedback that players were not seeing the motes, especially on one of our more brightly lit maps. I created a geo shell and flipped the normals so that the faces always render toward the center of the mote. The shader itself is fairly simple. I have a tiling texture with triangular shapes on it, each with a different greyscale value. Ideally, this texture would have a better greyscale range, but because it came for free from a normal map and was already in the library, I just used it as is.

I then used a technique that is sometimes called gradient mapping, or Palette Lookup, as it's called in the Destiny engine. I used game time to tell the Palette Lookup to constantly scroll through this varying greyscale triangle texture, and as it hits a new value, that entire triangle illuminates. We also have some built-in PBR iridescence settings; I used those to give it an extra glint that changes depending on the camera angle.

The last element of that specific shader was a normal map of the triangle texture. This gave it some more depth and made it feel more like faceted glass than a flat geo surface. I used the normal map twice: one version of it had a parallax offset, and then I overlaid the same normal texture on top of it at a different tile rate. All of these layers try to give the illusion of as much depth as possible without adding much cost. The same normal texture was also used for the greyscale mask; I just chose to use the green channel only, which let me save on some texture memory. So, at the end of the day, the shader is one normal map used three times in different ways, and a very small gradient texture scrolling through all of it. Because the greyscale values of the triangles are scattered around, it looks like the facets are dancing from light refracting around inside the mote. Also, because this was an opaque shader, I now had a consistently lit backdrop for other shader and particle elements to play off of, regardless of the lighting condition. This was just one of many elements of the mote shaders. The image below shows each of the mesh shells, the math behind the Palette Lookup, the textures used, and what the elements look like separated out.
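To make the idea concrete, here is a minimal sketch of a gradient-map / palette-lookup step in Python with NumPy. This is not Bungie's shader graph; the function name, scroll speed, and toy palette are all assumptions made for illustration. The core trick is the same, though: offset a scattered greyscale mask by game time and use the result as a coordinate into a thin gradient strip, so each facet lights up at a different moment.

```python
import numpy as np

def palette_lookup(greyscale_mask, palette, game_time, scroll_speed=0.25):
    """Illustrative gradient-map / Palette Lookup step (names and values are assumptions).

    greyscale_mask: HxW array of per-facet grey values in [0, 1]
                    (e.g. the green channel of the triangle texture).
    palette:        Nx3 array of RGB colors forming a thin 1D gradient strip.
    game_time:      scalar time used to scroll the lookup coordinate.
    """
    # Offset every facet's grey value by time, wrapping in [0, 1), so each
    # triangle reaches the bright part of the palette at a different moment.
    lookup_u = (greyscale_mask + game_time * scroll_speed) % 1.0

    # Index into the 1D palette; the gradient decides which values "light up".
    indices = (lookup_u * (len(palette) - 1)).astype(int)
    return palette[indices]  # HxW x 3 emissive color

# Toy usage: four facets with scattered grey values and a mostly dark palette
# with one bright green band, so only some facets glow at this moment.
mask = np.array([[0.10, 0.45],
                 [0.70, 0.95]])
palette = np.zeros((16, 3))
palette[10:14] = [0.2, 1.0, 0.6]
print(palette_lookup(mask, palette, game_time=1.0))
```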

Raid Orbs VFX

80lv: You’re working a lot on these amazing complex effects, which have a ton of different steps and elements like in Raid Orbs. Could you talk a little bit about the structure of these things? I believe that in essence, the approach to this is very similar to the effects we’ve had in games for a while, but with your craft and detailed approach you managed to elevate it to some incredible levels!

You are correct with the assumption that the approach isn't anything new. It is all about fitting the type of technique to the type of asset being created, and then, within that asset, using the correct technique for each of the states it has.

The funny thing about this asset, which I didn't really stop to think about until answering this question, is that it uses every type of VFX hook that we have available to us in the engine.

The warm-up part in the sky is all camera-facing cards and some particle meshes for the black cylindrical stripes that shoot down. I went with more traditional particles for this part because they make the animation easier: you can scale them up over time and add some random properties to them. That type of approach works well for an intro-type effect. I should also mention that the warm-up VFX started with some particles that were made for the Taken public event intro. I wasn't the original creator of some of these VFX, but I did retune every item that started from something else. A good VFX team shares!

The impact is a mix of a dynamic light (helps ground the VFX), a post-process screen effect that does a radial bloom blur (fills the screen in an interesting way), and some Taken FX kit that I worked on that is used all over Forsaken (help your team out and think modular where you can!). You can see similar burst-type FX on the Gambit portal and also the portal that takes players to the Ascendant realm. I also added a little bit of camera shake and controller rumble on the impact.

The idle column and bottom shell are all static 3ds Max-generated geo. The column is a series of cross quads for the green glow and a cylinder for the black ink. My personal approach to VFX leans toward mesh shader visuals instead of all particle-based ones, and the use of mesh shader VFX is one aspect that I feel Bungie pushes more than other game VFX teams. Approaches like that help with this type of dynamic asset that needs to idle for a long, unknown amount of time. I am able to reduce the overdraw that particles would have if they were doing something similar, crossfading out over and over with themselves. We also have a limit to how many particles can be on screen, so I get around this by using meshes to ensure the VFX remain. Bungie also doesn't use flip books for VFX. We have an approach that mixes multiple textures together and offsets them in ways so that you can't tell which masks are being used, but that also allows VFX to remain on screen for a long time without visible repetition.

The bottom part is also a series of 3ds Max shells for the green and black geyser-looking part. The stringy Taken goop that extends away from the base is a dynamic decal. This lets the designers more easily place the pattern in the world without needing a 100% flat surface. The trick was to have the edge of the opaque geometry use some alpha test so that the edge would break up and blend in with where the decal begins. This approach for the base let me create an interesting Taken shape on the ground, let it work on surfaces that aren't perfectly flat, and let it idle for an unknown amount of time without repetition. Particles were also used for the secondary elements at the very base: a Taken fire and a small Taken mote that swims around. Again, particles were chosen for that part of the look because you can more easily spread them out in random ways, and the motes could use GPU vector animation for a more interesting motion.

The last part is the orb in the center. This was mainly particles (additive, alpha blend, and distortion). It also used a dynamic light and a lens flare. Particles were chosen for this part because it was easier to layer all of those render types and give them a nice animation to break up the silhouette.

So, in the end, this asset used camera-facing particles (additive, alpha blend, distortion), mesh particles for the column going down, post-process FX for the impact, dynamic lights for the impact and idle, mesh FX for the idle column and the Taken base, a lens flare in the center of the orb, a dynamic decal where the pattern intersects the world, and, lastly, camera shake and controller rumble for the impact.

Optimization & Testing

80lv: How do you test your VFX and make sure that your effects are not excessively expensive? What's the trick here? Could you discuss the main problems that VFX can cause for devs, and the general rules of thumb to make it all work under the existing restrictions?

VFX is one of the arts that can absolutely destroy frame rate when implemented poorly. This is actually an area where we could use more tools than our current setup provides. Most of the time, we work on an effect and just dream big, trying to make it look and feel the way we want it to. We have some debug help with checking overall run-time memory for CPU and GPU particles, but currently, we don't have an easy way to see a frame's total cost from the effects we are creating until a performance team sweeps in and finds areas with bad frame rate. They flag areas where frame rate drops happen and provide a PIX performance capture that we use to help identify the offending VFX. From there, we optimize where needed. The hope is to never hear from that team. For the most part, we rely on prior experience and try to be sensible when making the VFX. We do have some debugging tools that show wireframes, the sizes of light and decal bounds, and run-time memory reporting.

We have distance settings that will remove the particle or mesh from the screen and fade it out as you get closer to it. These are all user-defined, so you can tune them for the specific asset being worked on. We try to never have VFX pop on or off for any reason. For particles, we also have priority settings that determine the importance of the particle effect: priority 1 is super important for the look, priority 2 is nice-to-have flavor elements, and priority 3 is things like environment FX. When under a heavy tax, the system starts to remove the lower-priority systems so that, hopefully, the important gameplay elements remain and the frame rate stays steady.
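As a rough illustration of how a priority scheme like this can behave, here is a small Python sketch. The class, the budget number, and the particle counts are all made up, and the Destiny implementation is certainly different, but the idea of dropping the least important systems first under load is the same.

```python
from dataclasses import dataclass

@dataclass
class ParticleSystem:
    name: str
    priority: int        # 1 = critical for the look, 2 = flavor, 3 = environment FX
    particle_count: int

def cull_by_priority(systems, particle_budget):
    """Illustrative priority cull (budget and counts are made up, not engine values).

    Removes the least important systems first until the total particle count
    fits in the budget, so gameplay-critical effects survive the longest.
    """
    active = sorted(systems, key=lambda s: s.priority)   # most important first
    total = sum(s.particle_count for s in active)
    while total > particle_budget and active:
        victim = active.pop()                            # drop the least important system
        total -= victim.particle_count
    return active

# Toy usage: under a tight budget, only the ambient environment FX get dropped.
systems = [
    ParticleSystem("capture-point ring", 1, 400),
    ParticleSystem("ambient embers", 3, 900),
    ParticleSystem("muzzle-flash sparks", 2, 300),
]
print([s.name for s in cull_by_priority(systems, particle_budget=800)])
```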

Another nice feature we have is a quarter-resolution buffer that we can add elements to. Rendering there costs only 1/16th the price of a full-resolution VFX, so choosing when to use this buffer is very important. Things like full-screen environment dust are always in this buffer, as well as particles that act like a generic volume fill and any other VFX that can get away with a lower resolution.
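To make the 1/16th figure concrete: a quarter-resolution buffer is a quarter of the width and a quarter of the height, so it shades one sixteenth as many pixels. A tiny sketch (the resolution numbers are just an example):

```python
def pixels_shaded(width, height, scale=1.0):
    """Pixel count of a render target at a given resolution scale."""
    return int(width * scale) * int(height * scale)

full = pixels_shaded(1920, 1080)             # 2,073,600 pixels
quarter = pixels_shaded(1920, 1080, 0.25)    # 480 * 270 = 129,600 pixels
print(quarter / full)                        # 0.0625, i.e. 1/16 of the fill cost
```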

Shaders

80lv: You are also doing some incredible work with shaders. What's your approach to shaders in general? How do you suggest using them? What's a good way to experiment here, and how can they help you achieve a better and more interesting way to build this outstanding stuff?

As I mentioned earlier, I lean more toward shader-style work than traditional particles. For the most part, you can actually get away with very cheap math. I mainly use multiplication and addition operations to mix low-resolution textures together. Most of our textures are 512 x 512. We have a library of textures that we try to rely on to help with memory concerns, and it's all about using those textures over and over in ways that don't let players realize they are seeing the same few textures. Obfuscation of the masks is key. Try to never let the player see the actual masks being used in the VFX. A quick tip to help obfuscate is to simply cross pan two copies of the same texture, multiplied together, with different tile rates and different UV offset speeds.
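Here is a minimal sketch of that cross-panning tip in Python; the tile rates, scroll speeds, and the procedural stand-in for a texture sample are arbitrary, but it shows the structure: sample the same mask twice with different tiling and offsets, then multiply so the original texture never reads clearly.

```python
import numpy as np

def crosspan_mask(sample, uv, time):
    """Illustrative cross-pan obfuscation (tile rates and speeds are arbitrary).

    Samples the same greyscale mask twice with different tile rates and
    scroll directions, then multiplies the two layers together.
    """
    # Layer A: tiled 2x, panning slowly along U.
    a = sample((uv * 2.0 + np.array([0.05, 0.0]) * time) % 1.0)
    # Layer B: tiled 3x, panning faster along V in the other direction.
    b = sample((uv * 3.0 + np.array([0.0, -0.13]) * time) % 1.0)
    return a * b   # only areas bright in both layers survive

# Toy usage: random noise stands in for a 64x64 mask texture.
noise = np.random.default_rng(0).random((64, 64))
sample = lambda uv: noise[int(uv[1] * 63), int(uv[0] * 63)]
print(crosspan_mask(sample, np.array([0.4, 0.7]), time=2.0))
```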

Almost every shader VFX I make reads from the depth buffer, either to mask out at intersections or to create a hot intersection line. Using the depth buffer in a smart way also lets designers more easily use the asset in many placements. If your shader is smart about auto-masking itself, then it can work in a larger variety of conditions.
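A conceptual sketch of that depth-buffer intersection mask, written as plain Python for clarity; the fade distance and the idea of driving a hot edge with the result are assumptions for illustration, not the Destiny shader itself.

```python
def intersection_mask(scene_depth, pixel_depth, fade_distance=0.5):
    """Illustrative soft-intersection term from the depth buffer (values are examples).

    scene_depth: depth already in the buffer at this pixel (the opaque world).
    pixel_depth: depth of the VFX surface being shaded.
    Returns 1.0 right at the intersection, falling to 0.0 over fade_distance;
    this can drive a hot edge line or fade the effect before it visibly clips.
    """
    delta = scene_depth - pixel_depth                 # distance to the world behind the effect
    t = max(0.0, min(1.0, delta / fade_distance))     # clamp to [0, 1]
    return 1.0 - t                                    # 1 at contact, 0 far from any surface

# Toy usage: the effect surface sits 0.1 units in front of the ground here.
print(intersection_mask(scene_depth=10.0, pixel_depth=9.9))   # ~0.8, close to the edge
```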

Another go-to is using Fresnel to fade glancing angles from the camera, which helps soften shapes and keeps a hard line from forming. Just like hiding what mask you are using, it is always good to obscure what the underlying geometry looks like.
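A small sketch of that glancing-angle fade; here the fade term is just the dot product between the surface normal and the view direction raised to a power (the exponent is an arbitrary choice), which is the complement of the usual Fresnel rim and goes to zero where the surface is seen edge-on.

```python
def glancing_fade(normal, view_dir, power=2.0):
    """Illustrative Fresnel-style fade for glancing angles (exponent is arbitrary).

    normal and view_dir are unit vectors pointing away from the surface and
    toward the camera. The result approaches 0 where the surface is seen
    edge-on, so multiplying an effect's alpha by it softens hard silhouettes.
    """
    n_dot_v = max(0.0, sum(n * v for n, v in zip(normal, view_dir)))
    return n_dot_v ** power

# Toy usage: a face seen almost edge-on fades nearly to zero.
print(glancing_fade((0.0, 0.0, 1.0), (0.98, 0.0, 0.2)))   # ~0.04
```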

An often overlooked part of good shader creation is being able to make your own custom geometry shells and UV them. I try to keep my UVs in 0 – 1 and then lean on things like world space and the tile rate of the textures, instead of stretching my UVs out past 1.

Another system I lean on quite a bit is painting custom vertex colors on my mesh and reading them into the shader. You can use this to create custom masking without a new texture. Yes, this does add extra cost per vertex, but it's often cheaper in memory than a new texture. It's also crucial when you need to match UVs from an environment asset that go outside 0 – 1 but still want custom masking while matching the underlying environment textures.
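A minimal sketch of reading a painted vertex color as a mask; the channel choice and values are just examples, but it shows why the mask works even when the mesh's UVs leave the 0 – 1 range, since it rides on the vertices instead of a texture lookup.

```python
def masked_emissive(base_emissive, vertex_color, mask_channel=0):
    """Illustrative vertex-color mask (channel and colors are example values).

    vertex_color is the interpolated RGBA painted onto the mesh; one channel
    scales the emissive so no extra mask texture is needed, independent of UVs.
    """
    mask = vertex_color[mask_channel]
    return tuple(c * mask for c in base_emissive)

# Toy usage: a vertex painted 25% red dims the emissive to quarter strength.
print(masked_emissive((0.2, 1.0, 0.6), vertex_color=(0.25, 0.0, 0.0, 1.0)))
```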

Recommendations for Learners

80lv: Overall, could you recommend some cool resources that can help people start working with VFX? What’s a good place to start, what’s a good person to follow, books, tutorials? These would be amazing recommendations.

It's hard to find resources for VFX. It is a bit obscure, and only recently has it been taught in schools. I often browse ArtStation for art to look at, but even there, most of it isn't VFX-related. Another good website is realtimevfx.com. If you are interested in starting VFX, I would recommend cracking open Unreal and also searching YouTube for some basics. Most newer engines are going the node-based math route for creating VFX. I am a bit spoiled: when I started my VFX journey, I didn't have to open Notepad to define the shader math, it was all node-based for me from day one. I was also lucky enough to be able to learn this skill on the job from the people around me.

Mike Stavrides, Senior VFX Artist at Bungie

Interview conducted by Kirill Tokarev
