
Rift: Animation continuum

Posted on Sep 9, 2022 by Samara Husbands

Bridging Unreal Engine and DaVinci Resolve, director Haz Dulull broke the mould with new feature film Rift. Its exciting workflow is more efficient and supremely collaborative – and carries pure transmedia potential

Words: Lee Renwick / Images: Hazimation

How does a big-budget, live-action script become a multiplatform animated offering, created by a crew of just a dozen? Introduce one global pandemic, let the creative mind of a director loose in a powerful game engine, add one editing platform utilised from end to end and you’re on the right track. This is the origin of Rift.

While it’s a shame to miss out on the multiverse-hopping thriller as it was originally intended, the final result is undeniably exciting. Not least for its innovative means of creation. Helmed by Haz Dulull – seasoned director and co-founder of Hazimation – the film’s reimagined workflow might be seen as a small-scale success story for larger things to come. Relying on cutting-edge tools and valuing creativity over cash, Rift is very much a sign of the times.

“When the pandemic hit and live action was put on hold, I asked myself how I could keep telling stories,” Dulull begins. “I called my producing partner, Paula Crickard, and said we should make an animated film. It takes years to develop a solid script, so we looked in our vault. Rift was always meant to be the kind of live-action movie you make when Hollywood throws a tower of money at you, but we realised it would make for an amazing animated feature, so we brought on Stavros Pamballis to polish it into the screenplay we needed.

“I was using Unreal Engine for pre-vis, but started pushing it so far that it began looking like a full-on film. I stopped and asked myself, why aren’t more people working this way? Bigger studios have a fairly rigid pipeline, but I was working on a laptop, getting incredible lighting, animation and camera movement in a real-time game engine. It was a complete light-bulb moment.”

Before long, Dulull and a handful of collaborators were in pre-production, analysing sections of the script and visualising ideas. Unusually, none of these steps took place in a physical space.

“I was pulling headers and scene descriptions into Resolve, for a paper edit. It took on a very editorial approach, because I could bring in the rough renders from Unreal,” Dulull continues. “My timeline made it seem we were much later in production; it was like some kind of animated comic. We could see the pacing and get a sense of it all, and that really helped. It’s funny, I remember Stavros seeing dailies for a script he was still working on and having his mind blown. But that’s the beauty of the game engine. We could take rough assets, block stuff around, start experimenting and finding the film’s language, all between Unreal and Resolve.”

As production evolved, the team discovered all the required building blocks were in place like never before. In a truly iterative process, pre-vis shots created for script development could be updated with more and more detail.

“It’s a very agile way of working. We didn’t have to wait for one thing to finish before moving on to the next, which is how traditional animated film works. You do a storyboard, animatic, layout and lighting – then go into animation. If you want to go back and change anything, it’s very expensive. In our case, the CG artist was constantly updating the characters. We achieved so much in those early stages. It wasn’t really pre-production at all – it felt like we were making the movie already.”

Never too late: In what would be an extremely costly move within a traditional pipeline, Rift’s final aesthetic was still undergoing changes more than a year into production

Reimagined traditions
“Our whole Resolve and Unreal combination really came into its own during the edit,” Dulull explains. “A lot of people working in Unreal Engine are game developers or CG artists, who don’t have much need for editorial. But when you’re finely combing through a 90-minute feature film, you can’t really do that in a game engine. You still need to rely on the traditional way of editing, with a linear pipeline.”

Even here, a more interwoven collaboration was being fostered. There was no need for Dulull, who edited the film himself, to wait for finalised material before cutting. This shift away from old paradigms was one he relished.

“Every week, we were delivering dailies, which were getting better and better over time. Not just shots, but full sequences. Next time around, I’ll hire a dedicated editor and have them involved right from the start,” the director muses. “That means they help shape the film while it’s happening in real time. They can ask for another shot with an alternate angle, to get the coverage needed, and I can give it to them. In live action, that’s impossible. You can’t even get away with it in conventional animation – once the lighting pass comes in, for example. An editor might see a lit scene, absorb the mood and realise another shot is needed. Working in Unreal, you can go in, move the camera and hit render – all at a very minimal cost.

“Traditional animation movies render in what are called passes: there might be as many as 20. Those get composited, then rendered out as a single shot for editing. In our case, everything in an Unreal Engine frame is final pixels. The minute I hit render, what I see on that screen goes into a 4K EXR file, then into Resolve. And it’s updated automatically: I never needed to do any transcoding on Rift. We didn’t have countless versions of each EXR file; we’d just version them up. It’s possible to go back and look at details from the previous edits that way. It all requires a shift in mindset, but once you’re in it, you can achieve so much.”
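To make the idea of ‘versioning up’ renders concrete, here is a minimal sketch in Python of how such a scheme might be organised on disk – rendered EXR sequences go into incrementing version folders, so an edit can always relink to the newest render while earlier passes stay available for comparison. The folder layout and shot name (renders/RIFT_sc010) are hypothetical assumptions for illustration, not Hazimation’s actual pipeline code.

```python
import re
from pathlib import Path


def next_version_dir(shot_root: Path, shot_name: str) -> Path:
    """Create and return the folder for the next render version of a shot,
    e.g. renders/RIFT_sc010/RIFT_sc010_v003 if v001 and v002 already exist.
    Layout and naming here are illustrative assumptions only."""
    shot_root.mkdir(parents=True, exist_ok=True)
    pattern = re.compile(rf"{re.escape(shot_name)}_v(\d{{3}})$")
    existing = [
        int(m.group(1))
        for d in shot_root.iterdir()
        if d.is_dir() and (m := pattern.match(d.name))
    ]
    next_v = max(existing, default=0) + 1
    version_dir = shot_root / f"{shot_name}_v{next_v:03d}"
    version_dir.mkdir()
    return version_dir


if __name__ == "__main__":
    # Hypothetical shot: each fresh render from the game engine would write its
    # 4K EXR frames into a new version folder, and the edit relinks to the
    # latest folder while older versions remain on disk for reference.
    out_dir = next_version_dir(Path("renders/RIFT_sc010"), "RIFT_sc010")
    print(f"Render 4K EXR frames into: {out_dir}")
```

In a scheme like this, nothing is overwritten: re-rendering a shot after a look change simply adds a new version folder, which is what makes revisiting earlier edits possible without transcoding or duplicating media.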

Beyond a much faster turnaround and what was undoubtedly a better film by virtue of a unified vision, one of the greatest draws of this new-found approach for Dulull was keeping the creative talents of his team focused on their respective areas of expertise.

“All these amazing artists were feeding work into a centralised hub – Unreal Engine. I could pull it all together and feed it back to them for even more refinement. In this workflow, if I wanted to experiment with small details, like the way the lens looked, I’d do that myself. That meant everyone else could focus on their artistry, not their director’s desire to move a tree three pixels to the left,” he laughs. “The crew was very small. We started with three, then went to five, then ramped up to 12. So the value of my team was high. If they’re an amazing character animator or an excellent lighting artist, I wanted them to focus on doing what they do best.”

Digital copy: To reap the greatest reward from the available budget, an Xsens inertial suit and gloves by Manus were used for all of Rift’s mocap work

Shaping a look
Perhaps the most impactful factor in animation is art style. For many viewers, it transcends voice performance, editing and narrative itself. As revolutionary as Rift’s creation was, some age-old factors never change. In fact, above and beyond all technical barriers, establishing the aesthetic proved Dulull’s greatest hurdle.

“When we started doing our first animated projects in Unreal, we found they looked very cool, but they felt like video game cinematics. That was the feedback when I showed early tests to friends within the industry, too,” he says. “Established looks are specific. There are newer productions with very lifelike CGI or classic 2D animations. When we spoke to distributors, they told us if we stylise and find a look that’s so unique it becomes its own thing, that’ll work.”

Rift’s timescale is rather surprising – but so is everything else about the film. Pre-production began in October 2020 and official production kicked off in January 2021, with the picture being finalised for distribution this month. In another interesting circumvention of the traditional pipeline, Dulull used Resolve’s capabilities to experiment with early grades and add unique visuals.

“We used the watercolour plug-in to see what kind of looks we could get, then simulated that elsewhere. We also did a lot of 2D effects on top of Unreal renders, once they’d been pulled into Resolve. Controlled details like bullet hits can be tricky in a real-time platform.”

To the envy of animators and directors alike, the ability to update completed content with ease gave the crew genuine freedom of choice. Decisions that would typically have to be locked in well ahead of schedule could be taken to the wire.

“We actually only nailed the art style 100% in April this year,” Dulull continues. “If this was a conventional movie, finalising the look 13 months into production would have been a big no-no. Obviously, we had to go back and re-render all the shots, but that was a very minimal time investment.

“In settling on something final, we looked to many reference points and employed a lot of trial and error. There were moments where I tried shaders, my head of CG tried some, then our other artists tried their own. Finally, it was a case of loving what we’d done with hair in one option, eyes in another and clothing in a third. We combined a lot of ideas – and finding that final style was a collaborative effort, for sure.”

Crossing over: A true transmedia offering, many assets were shared between feature film and video game

Player, ready?
Although not unheard of, completing more than one deliverable simultaneously is a rare feat. In addition to the upcoming feature, Rift is currently being developed as a third-person shooter. It seems only logical, considering Unreal Engine’s original application, but it was not a light creative decision for Dulull.

“We never intended to make a video game. The idea came when running film sequences as a game to capture elements we didn’t want to animate, like vehicle suspensions during a car chase,” the director explains. “We realised what was possible, but wanted to have a good reason for it. I went back to all the ideas thrown out during script development and realised there was so much potential in our multi-branch narrative that couldn’t fit within a 90-minute film. Essentially, all we did was take the assets, migrate them into our game project, then work some code. It became a pure transmedia project.”

Today, multiplatform media itself is not remotely uncommon. What remains far rarer is reusing the hard work of creative teams across these distinct offerings – and what a loss that is. Of all the benefits that may be taken from Hazimation’s innovative approach, this could be the most promising. As we hurtle towards a metaverse world in which mediums look likely to merge, how fantastical might content be if created in wholly collaborative ways? Not just between filmmaking departments, as is the case with Rift, but between experts from multiple industries.

Returning, for now, to the subject of film production, Dulull’s succinct summary speaks volumes.

“This changes the way movies are being made. But really, viewers don’t care how it’s done – they only want to see the best results.”

Originally published in the September 2022 issue of Definition.
