Battlesuit: Filmmaking in Unreal Engine 4

Hasraf ‘HaZ’ Dulull, an experienced filmmaker and VFX artist, talked about his career and the production of Battlesuit, an animated proof of concept for a pilot episode based on TPub Comics' graphic novel The Theory.


Introduction

Hi, my name is Hasraf ‘HaZ’ Dulull, and I am a Filmmaker.

I came from a VFX background before making the transition to directing and producing.  

I worked on films like The Dark Knight (doing postvis) and TV shows for History/Discovery, such as America: The Story of Us and the BBC's Planet Dinosaur, both of which earned me VES Award nominations for my VFX Supervisor/VFX Producer roles. My IMDb page has a much bigger list; you can check it here.

Filmography

I broke out with my debut sci-fi feature film THE BEYOND back in early 2018, which reached number 2 in the Apple iTunes charts next to Blade Runner 2049! It then trended on Netflix for several weeks and was a commercial hit in the indie sci-fi market. My second feature film, 2036 ORIGIN UNKNOWN, starred Katee Sackhoff (Battlestar Galactica, Another Life) and had a limited theatrical release in late 2018 before hitting Netflix, where it made the top 10 Netflix films in the UK.

That feature film landed me the opportunity to direct the pilot and several episodes for Disney Channel’s action-comedy mini-series FAST LAYNE (now available on Disney+).

Haz Film Company

With the success of my first feature film (I self-financed 80% of it), I was able to set up my own small production company, HAZ FILM, where we have had several projects in development. We also directed and produced the ending segment of the anthology sci-fi horror feature PORTALS (from the producers of the horror anthology V/H/S).

On all my projects, I tend to keep working with the same VFX artists, as relationship building is core to having a team that knows your sensibilities as a director. Because of that trust built over the years, I also like to give artists a lot of freedom on my projects. At the same time, I love finding new talent to join the team as projects scale up.

Battlesuit Project: Team

With the Battlesuit project, a fully animated proof of concept for a pilot episode, I enlisted my usual CG collaborator Andrea Tedeschi, who has worked on all my projects since one of my earliest short films, SYNC. Last year, I was giving a keynote at The Future Film Summit in London about the way I use Unreal Engine as a director, and after the session, several people came up to speak to me. One guy said: "We really need to work together. I am currently developing an animation pipeline for Unreal Engine which I think you will dig, let's meet for coffee." That guy was Ronen Eytan, the Technical Unreal Engine Artist on Battlesuit, who was instrumental in setting up the animation pipeline in Unreal Engine.

The next person I instantly brought onboard was Edward Patrick White, the music composer of the recently released Xbox game Gears Tactics. We had been wanting to work on a project together for a while, so I took this opportunity to make it happen.

For casting, we were fortunate to work with great, experienced actors such as Kosha Engler (who did voices for the games Battlefront 2 and Terminator: Resistance) and Nigel Barber (who was in films like Mission: Impossible – Rogue Nation and Spectre). We also worked with a new talent, Wes Dalton, who was just awesome; he came from a stand-up comedy background.

How the Project Started

Battlesuit came about late last year after I was approached via LinkedIn by the comic book company TPub Comics. The owner and comic book creator Neil Gibson wanted to meet me after seeing my previous work. What started as a general coffee meeting (yeah, I did all my meetings in coffee shops!) ended with him giving me a ton of graphic novels to read, one of them being THE THEORY, which I read and loved. A week or so later, we caught up, and he asked if I was up for doing a proof of concept on one of the stories in THE THEORY. I picked Battlesuit, mainly because I love big giant robots, but also because the story had a mix of things, from planetary exploration to character drama to some dark terror-tech moments and action.

Battlesuit had originally been set to be a live-action proof of concept, but with the very small budget allocated to the project and the tight turnaround, there was no way we would have been able to achieve that vision really well: we are talking big giant robots in warzones!

So, I looked at animation, specifically real-time animation, to keep costs down and allow me to be hands-on with every shot without the need to bring in huge teams.

I had already been using Unreal Engine for pitch-vis/previz on my next live-action sci-fi feature film LUNAR, which was in soft pre-production before COVID-19 hit. I always thought the previz I was doing looked really cool, as there was lighting, shadows, and all the CG elements, but in real-time. So, with a little more visual and poly count love, plus the use of ray tracing, we could do an animated film like this.

TPub was not convinced that animation would be the best approach and worried it wouldn't look good enough for the budget we had. So, I did a quick little test using free pre-existing assets (Paragon) from the Marketplace, not only to show the quality and set a benchmark (to manage their expectations!) but also to convince myself that a team of two other artists and I could do this.

This is the test clip I put together which gave TPub Comics confidence that we could do this:

TPub Comics were impressed with the test, and off we went to make this proof-of-concept pilot episode entirely inside Unreal Engine. We called it a pilot because the project evolved into an animated series, following the rise of adult animation like Love, Death & Robots, Castlevania, etc. I was also very inspired by those Netflix series, particularly LDR.

Goals

The goal of this project was to see if we could pull off an animated episode (the pilot) entirely inside Unreal Engine and create a pipeline model that would work for the entire series, with a small team, keeping it cost-effective (because in this business, it always boils down to creating content at an attractive price point; once it's a big hit, you then get to demand the bigger budgets).

Character Design

The character Linda, the main protagonist whom we follow throughout the story, was one of the first things we designed. We had lots to draw from in the graphic novel, and we knew red was a key color in her costume, but she also had to look cool as well as read as an explorer. In the graphic novel, she has long wavy hair; we knew hair dynamics in Unreal Engine were too risky, and we didn't have the budget to do any dynamic grooming, so I decided very early on that she would have short hair. We licensed a space suit asset and then reworked the texture and shaders whilst also adding more geo detail to certain areas to reflect the look portrayed in the graphic novel. Her face started off as a generic female head model, which Ronen then refined further by sculpting in ZBrush and creating various blendshapes in Maya before bringing it into Unreal Engine for final shaders.


Utilizing Pre-Made Materials

The only way this project was able to get done on its small budget was by using purchased/licensed assets to give us groundwork to build on top of. A lot of the assets, such as weapons, FX, environments, tech objects, etc., came from the Unreal Marketplace (we also purchased from TurboSquid). To be honest, a big portion of the budget went on purchasing assets and building further from them… this saved us a lot of time.

Andrea, who was responsible for building assets and environments, used those assets as a starting point to build the final versions I wanted. We also used KitBash3D assets we purchased to create the warzone environment. The cool thing about the kitbash approach was that I was able to set dress scenes myself based on where I was placing my camera, without needing to send stuff back to Andrea once he had created the environment. Most of the work we did was in the texturing and shading of the assets, as well as rigging for animation. Things like explosions were taken from FX packs from the Unreal Marketplace, and then we manipulated the Blueprint setups to make them work the way we wanted.

We also made great use of Megascans and Quixel Bridge to add all those details to the environments within Unreal Engine. This was a game-changer for us, as we never had to go back to 3ds Max or Maya to add detail. We kept refining and adding detail to the environments using Quixel Megascans and decals painted across the surfaces of objects and landscapes.

For example, I wanted the scene (spoiler alert!) with the soldier dying on the ground after the explosion to look a little gory, and Andrea was already mega busy setting up other assets, so I just went in, used the blood spatter decal from Quixel Bridge, and projected it onto the soldier's body and face; within minutes it was done.
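For a sense of what that kind of decal work looks like when scripted rather than dragged in by hand, here is a minimal sketch using Unreal's Python editor scripting; the material path is a hypothetical stand-in for a decal brought in through Quixel Bridge.

```python
# Minimal sketch: projecting a decal onto geometry from Python.
# The material path below is hypothetical.
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()
material = unreal.load_asset("/Game/Decals/M_BloodSpatter")  # hypothetical path

unreal.GameplayStatics.spawn_decal_at_location(
    world,                             # world context
    material,                          # decal material to project
    unreal.Vector(20.0, 50.0, 50.0),   # decal size (depth, width, height)
    unreal.Vector(0.0, 0.0, 10.0),     # world location to project at
    unreal.Rotator(-90.0, 0.0, 0.0),   # project straight down
    0.0)                               # life_span=0 -> never fades out
```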

General Pipeline

We had a unique, non-conventional pipeline to accommodate the way I work as a "hands-on" director (a nice way of saying I'm a control freak, hah hah!). So, Andrea would create the environments and key assets, and once that level was set up, Ronen would rig the characters and place them. They would then send the complete level over to me, and I would create all the shots from that level.

So, it was like I was in the middle: on my left, Ronen was sending in character stuff, and on my right, Andrea was feeding me assets and environments.

Once I had animated the camera, action, lighting, etc., I would render the final shots out of Unreal Engine (as final pixels) and bring them into DaVinci Resolve for editing and color correction.

Because we were all working remotely, we used Dropbox as our main server and Trello for tracking progress, notes, ideas, etc.

Working on the Mech Robot

Andrea used 3ds Max for modeling and rigging. We kitbashed various purchased assets to build the Mech robot and then did a lot of shading/texturing work to give it the look we wanted. Then, we used the Datasmith pipeline to move the Mech robot from 3ds Max to Unreal Engine, where we continued shader work and added things like emissive lighting parts.
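The Datasmith step can also be driven from Unreal's Python editor scripting. Here is a minimal sketch, assuming the Datasmith importer plugins are enabled; the file path and destination folder are hypothetical.

```python
# Minimal sketch: importing a .udatasmith export from 3ds Max into a project.
# Paths are hypothetical placeholders.
import unreal

scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    "C:/Exports/MechRobot.udatasmith")
if scene is None:
    raise RuntimeError("Could not open the Datasmith file")

# Pull the geometry, materials, and lights into the content folder.
scene.import_scene("/Game/Mech")

# Release the temporary import handle when done.
scene.destroy_scene()
```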


Animation

The animation for the Mech robot was done inside 3ds Max as well, and then Andrea exported various FBX files for me, which I could mix in Sequencer to create the various actions I needed per shot.

With the marines, we used tons of mocap data we purchased, as well as animation from Mixamo; then, in Unreal Engine's Sequencer, I could mix various actions, poses, speeds, weights, etc. to get the desired result. I also did some additional keyframing to push things further than the mocap data allowed.
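As a rough illustration of what layering clips like this involves under the hood, here is a minimal sketch in Unreal's Python editor scripting: two animation sections on one character binding, overlapping so they can be blended. The asset paths, frame ranges, and actor selection are hypothetical.

```python
# Minimal sketch: layering two mocap clips on one character in a Level Sequence.
# Asset paths and frame ranges are hypothetical.
import unreal

sequence = unreal.load_asset("/Game/Shots/Shot_010")               # a LevelSequence
marine = unreal.EditorLevelLibrary.get_selected_level_actors()[0]  # the character

binding = sequence.add_possessable(marine)
track = binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)

def add_clip(anim_path, row, start, end):
    """Add one animation section to the track. Struct properties come back
    as copies in Python, so the params struct is written back after editing."""
    section = track.add_section()
    section.set_row_index(row)
    section.set_range(start, end)
    params = section.get_editor_property("params")
    params.set_editor_property("animation", unreal.load_asset(anim_path))
    section.set_editor_property("params", params)
    return section

add_clip("/Game/Mocap/Run_Fwd", 0, 0, 48)     # base locomotion
add_clip("/Game/Mocap/Aim_Rifle", 1, 24, 96)  # overlapping layer to blend in
```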

The facial animation was the most challenging part, as we didn't have the budget for full performance capture. So, Ronen came up with a solution using his iPad Pro's depth camera piped into the Live Link feature in Unreal Engine.

On the day of recording the voiceover, we also used that session to simultaneously capture the actress’s facial performance. This is where real-time animation came in very useful as I was able to see her performance on the character as she was recording the VO and performing.

Optimization

This was the first time any of us on the team had done optimization in Unreal Engine, so there was a learning curve, but a fast one. For example, we used a Lightmass Importance Volume (a bounding box) to ensure Unreal only calculated/built lighting data for the areas the camera would see, since Unreal otherwise treats the level as a complete world. We also made good use of various LODs on objects depending on how close they were to the camera.
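On the LOD side, the per-object choice can even be forced from script when a shot's framing is known. A minimal sketch, assuming background actors are marked with a hypothetical actor tag:

```python
# Minimal sketch: forcing cheaper LODs on background meshes for a locked-off shot.
# The "background" tag is a hypothetical way of selecting distant actors.
import unreal

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if not actor.actor_has_tag("background"):
        continue
    for comp in actor.get_components_by_class(unreal.StaticMeshComponent):
        # ForcedLodModel: 0 = automatic selection, 1 = highest detail,
        # larger numbers = progressively cheaper LODs
        comp.set_forced_lod_model(3)
```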

I also had to tweak my mindset to treat the scenes more like a game level, but on a timeline of triggered events. Everything was controlled in Sequencer, so I would trigger key FX such as explosions, bullet hits, tracer fire, smoke, etc. with keyframes toggling them ON or OFF. When certain objects went off-camera, I would change their visibility to HIDDEN so that they wouldn't get calculated.
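Here is what that visibility toggling looks like as a Sequencer track, sketched in Python; the sequence path, actor selection, and frame numbers are hypothetical.

```python
# Minimal sketch: keying an actor hidden in Sequencer once it leaves frame,
# so it stops costing render time. Frame numbers are hypothetical.
import unreal

sequence = unreal.load_asset("/Game/Shots/Shot_020")
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]

binding = sequence.add_possessable(actor)
track = binding.add_track(unreal.MovieSceneVisibilityTrack)
section = track.add_section()
section.set_range(0, 240)

channel = section.get_all_channels()[0]          # the single bool channel
channel.add_key(unreal.FrameNumber(0), True)     # visible at the start
channel.add_key(unreal.FrameNumber(96), False)   # hidden once off-camera
```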

With the action scenes, there were so many elements to control, like soldiers, explosions, etc., that it was like herding cats!


One of the things that helped was the Nvidia GPU, specifically the Quadro RTX 5000 built into the laptop I was using. Because the engine wasn't relying so much on the CPU, I was able to move around the scenes while actions were happening in real-time; in a way, it was like a chaotic virtual film set.

Finalizing the Look

The look of the final piece was a combination of the lighting and ray-tracing settings in Unreal Engine and the final color grade in DaVinci Resolve.

I wanted a stylized look that would reflect my love for that punchy style of anime films, so the lighting had vibrant colors (acting like gels in real-world lighting) with strong shadows.

In the ray-tracing settings, it was a balancing act between the samples for Global Illumination and Ambient Occlusion. For the closeup shots of the main character, I used high settings, but for the action scenes, I reduced the samples, as there was lots of motion blur in the action.
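Those sample counts are exposed as console variables, so this kind of per-shot balancing can be scripted. A minimal sketch follows (cvar names as of UE 4.2x; the values are illustrative, not the production settings):

```python
# Minimal sketch: dialing ray-traced GI and AO samples up for close-ups
# and down for fast action shots. Values are illustrative.
import unreal

def set_rt_samples(gi_samples, ao_samples):
    world = unreal.EditorLevelLibrary.get_editor_world()
    for cmd in (
        "r.RayTracing.GlobalIllumination.SamplesPerPixel %d" % gi_samples,
        "r.RayTracing.AmbientOcclusion.SamplesPerPixel %d" % ao_samples,
    ):
        unreal.SystemLibrary.execute_console_command(world, cmd)

set_rt_samples(4, 2)   # close-up: spend samples where the camera lingers
# set_rt_samples(1, 1) # action: motion blur hides the extra noise
```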


A big part of the look also came from the Post-Process settings inside Unreal Engine, which meant that the renders coming out of Unreal Engine were final pixels. Things like motion blur, depth of field, chromatic aberration, bloom, exposure, fog, lens flares, etc. were all controlled inside the Post-Process settings.
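For reference, the same overrides can be set from Python on a Post Process Volume; each setting has a matching override flag that must be enabled. The values here are illustrative, not the grades used on the film.

```python
# Minimal sketch: overriding a few look controls on every PostProcessVolume.
# Values are illustrative. Structs come back as copies in Python, so the
# modified settings struct is written back to the actor at the end.
import unreal

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if not isinstance(actor, unreal.PostProcessVolume):
        continue
    s = actor.get_editor_property("settings")
    s.override_bloom_intensity = True
    s.bloom_intensity = 0.8
    s.override_motion_blur_amount = True
    s.motion_blur_amount = 0.5
    s.override_scene_fringe_intensity = True   # chromatic aberration
    s.scene_fringe_intensity = 1.0
    actor.set_editor_property("settings", s)   # write the copy back
```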

I rendered out all the shots as ProRes 4444 QuickTime files, with the exception of the desert planet scenes with the main human character, which were rendered out as linear EXRs.

Virtual Production with DragonFly

For the warzone scene, I wanted a more visceral, action-style camera similar to films like Saving Private Ryan or Black Hawk Down, and I knew that doing that with keyframes (like I had with the other shots) would take quite a while. So, I used DragonFly, a virtual camera solution from Glassbox Technologies, who approached me to try out their new tool after seeing some of the BTS images I was posting during the early production stages of Battlesuit.

DragonFly utilizes the iPad as a virtual production camera linked to the camera inside Unreal Engine. I was able to operate it like a DOP, adjusting things like lens and focus on the fly as I moved around the warzone scene and captured the action, which was playing as a loop in Sequencer.

We did a lot of the capturing at the Epic Games lab in London; Epic has been very supportive of indie filmmakers using virtual production.


The data would be recorded as a "Take", with the keyframes baked into the camera timeline. Because of how fast this was, I was able to grab as many takes as I wanted and then make my selects in Unreal Engine using the DragonFly plugin, which also allowed me to smooth and clean up the captured camera keyframes.

Using Razer Laptop for Production

The Razer laptop was instrumental in the creation of Battlesuit because all the shots were created on it. The story behind the laptop is that I don't often use desktop machines, mainly because I don't like taking up space in my flat in London, but also because I am always on the go, and I want to be able to travel without disrupting the project flow. Being a director in film and TV, I have to be at various location recces, shoots, and production meetings, so I like to continue working on edits and previz whilst on the move (bear in mind I used a MacBook Pro at that time).

When Battlesuit started, I knew I had to use an Nvidia GPU and a PC to make the best use of Unreal Engine, so I was looking around for the best laptop to do this on, and I was introduced to Razer by Glassbox Technologies. Razer really liked the project and how I was doing it using game engine technology, which they were familiar with, as their core audience is gamers. They also have a lot of creatives using their laptops, so when they heard me talking about a mobile animation production rig, they instantly jumped in and tech-sponsored the project with the Studio 15 laptop, which came with an Nvidia Quadro RTX 5000, a 4K screen, and a 1TB SSD.

At the time of early production, in January and February 2020, I was also committed to various speaking and keynote events in Paris, Edinburgh, and London. So during the day, I was giving keynotes about the production of my previous films and my career, and during the evening, I was in the hotel working away, reviewing asset builds from my team inside Unreal Engine and building out scenes.

The laptop approach also came in very useful during the voice recording and performance capture sessions. As I had the scenes already created, we were capturing the performance directly onto the characters in the scenes, and then I went back to my home studio and continued working. There was no process of conforming or moving data around.

What's Next?

This project has actually opened doors for me as a director in the animation world, allowing me to pitch project ideas that were deemed too insane or bold a few years ago. Now studios and networks are open to those ideas, knowing they can be done, and having a proof of concept like Battlesuit helps.

At the time of this interview, I can say that I have signed on to direct an animated feature film based on a video game IP, and we (the same team as on Battlesuit) are already in production. I can't disclose any details due to the NDA, but what I can say is that it will be made entirely inside Unreal Engine, and the project announcement will hit the trade press later in the year.

In the near future, I am looking to make a series of masterclasses for filmmakers who want to create narrative content inside Unreal Engine. You can check out my recent masterclass series on sci-fi filmmaking here.

Follow HAZ on social media:

Hasraf ‘HaZ’ Dulull, Filmmaker & VFX Artist

Interview conducted by Arti Sergeev
