Posted on Apr 12, 2022 by Alex Fice
We speak to the creatives who sparked the digital gunpowder in some of Hollywood’s most explosive blockbusters – from visualisation to VFX refinement
Words Lee Renwick / Images Various
It’s odd to imagine that some of the most moving moments in film history never even happened at all. We’re not discussing the movie magic and subsequent suspension of disbelief that goes into every live-action production here. We’re talking about Mark Hamill wielding a lacklustre metal baton on-set; Sam Neill turning Laura Dern’s head towards an empty field; and the entire indescribable landscape of James Cameron’s Pandora beginning as a green backdrop. But, by the time these sequences hit the screen, don’t they just win us over?
Today, visual effects have become so advanced, they may not even be noticed at all. The experience of stepping into a cinema and watching some astonishing action unfold in an unreal virtual world is entirely commonplace. But that doesn’t mean it is any less extraordinary to behold.
As far as recent blockbuster visual romps go, there have been few bigger than Venom: Let There Be Carnage and Uncharted. So, what better case studies to explore the complex work of the creatives behind them?
Uncharted opens with a bang. Tom Holland’s Nathan Drake boards a plane, then disembarks in rather dramatic fashion. The heart-racing sequence sees him hang from a string of cargo crates, battle a guard in mid-air, then clamber his way back to safety – almost.
“It was an interesting challenge from everyone’s perspective,” says DNEG production VFX supervisor Chas Jarrett. “Working alongside the special effects team and second-unit director, Scott Rogers, we needed to figure out the best mechanism for shooting it, that would allow the visual effects component to be added seamlessly. I’m a firm believer that you can only make the shot as good as the raw material allows.
“The interesting solution we settled on was the use of enormous Kuka robotic arms. They’re built for production lines, so can manoeuvre large weights very accurately. We ended up building tiers of three arms, each higher than the last, up to 20ft, then attached lightweight versions of the crates from the daisy chain.”
With the boxes now moving reliably and repeatably, Holland was tasked with performing his on-screen counterpart’s death-defying sequence of leaps – albeit rather more safely.
“The performers really had to struggle – and their eyeline was always moving. Little nuances make a big difference to the success of the final shot,” continues Jarrett. To enhance this effect, his team tilted the crate-and-arm set-up to a steep vertical angle and shot it sideways, so it would appear horizontal – complete with a strong headwind – in the final cut.
“To film, we had the camera operator on the arm, right in front of Tom at times. We also used cranes, as well as a wirecam for one fly-by shot. I think that added even more authenticity to the sequence, in the sense that the cameras weren’t doing impossible moves.”
Filmed in the open air against a blue screen, the next step of Uncharted’s VFX process required placing the live action in an equally convincing environment. A helicopter was sent to Thailand to capture reference plates, which were subsequently given to DNEG’s VFX supervisor Sebastian von Overheidt, and his team, to rebuild virtually.
“Despite filming plenty of kinetic energy and movement in the plates, we wanted even more. So, we ended up cutting some of the crates and adding rotation,” Jarrett explains. “To heighten the sense of speed, we also inserted a lot of environmental detail, like thin wisps of cloud. If we put those close to the action, we could have a mid-layer of atmospherics further back, then more distinct clouds even further off in the distance to create parallax.”
Lighting was also manipulated in post, to a subtle, yet effective end.
“We were filming outdoors, so hung an enormous silk on a crane to diffuse the natural light – although we removed it often. There were also 20K lamps on top of the blue screen, which could be pushed in for a harder effect.
“In post, we wanted to continue that sense of fluctuating light. The team animated the light up and down in the compositing software, Nuke, to deliver the effect of flying in and out of cloud cover. It’s almost imperceptible,” Jarrett continues, “but really helps to keep the scene active.”
DNEG turned to Maya for the 3D work, digitally reproducing models of the plane and the entire daisy chain of crates.
“Those crates were designed to consist of many different items, so we modelled each one in each crate, wrapped them all in a digital net, and ran rigid-body and soft-body simulations on them. In the sequence, you can see every item in the crates jostling, sliding and pulling against the cargo netting.
“We also created digital doubles of Tom. There are even shots that begin as live-action footage of him, but then as we pull away, become digital. That was primarily because of physical limitations and how wide we could shoot on-set.
“The great thing was that the sequence could evolve in post-production,” Jarrett explains. “It’s all carefully planned and storyboarded, and pre-vis elements are done, but when you pull it all together, sometimes you have new ideas or take things in directions you didn’t plan for. When you have fully digital recreations of the actors and the surrounding world, you can add new action to make the sequence effective.”
Jarrett was impressed by the detailed work of his colleagues.
“One of DNEG’s strengths is beautifully detailed environments, with natural light. That was crucial, because although some of the events in Uncharted are fantastical, it’s not a fantastical world,” he says. “I remember the first renders we saw of the digital environment. I knew they’d cracked it immediately. We did have the aerial plates, but I believe 100% of what you see in the final sequence is a virtual rebuild. It meant we could control the camera movement precisely. But to look at them, you’d never know they weren’t real.”
When it came to Venom: Let There Be Carnage, The Third Floor’s visualisation supervisor Martin Chamney was tasked with bringing the titular monster to life in a variety of ways.
Taking inspiration from director Andy Serkis, Chamney’s team swung straight into a gripping San Quentin State Prison sequence, virtually blocking shots ahead of plate capture and later VFX creation.
“We quickly got into discussions about how Carnage would move through the space,” Chamney begins. “Not just walking, but using tentacled appendages to fire up onto balconies and lift himself off the ground. He could change shape and move very quickly.
“It was crucial to inject that into the scene and get the pace up and running. We started by building the digital environment in Maya and conducting a virtual camera shoot, to look at the space within the scene.
“There was a plan to physically build around nine prison cells on each side of the corridor, which would be extended digitally into the distance. And this was based on the real San Quentin near San Francisco, so the art department had designed the set-up to a specific scale.
“During the VCam session, we quickly discovered that, with our blocking, the narrow corridor made Carnage’s movements much too easy. He could almost step from one side to the other. We were looking for something much more extravagant, so we widened the space between cells and got what we were after.
“Pathfinder was also incredibly beneficial here,” Chamney continues. “It’s a virtual scouting tool, with a VR headset, and allows you to actually explore the virtual set. Our production designer, Oliver Scholl, could put this on and look at all his designs – not only in a pre-vis sense, but also with a real physical feel for the space.”
The next piece of the pre-shoot puzzle was a technical visualisation, in which diagrammatic movies were created to illustrate key details.
“Our tech-vis detailed elements like camera track, camera height and field of view. All these factors are extremely useful, if you think about moving CGI. The crew is essentially shooting an empty set, with no reference for movement. Our visuals identified exact panning speed, for example. It’s very important that these details are planned accurately.”
“Post-visualisation comes when the crew has actually shot the plates – the photo cinematography,” explains Chamney. “It’s essentially temporary VFX. Rather than having teams look at hours of green screen plates, they can now see a proxy – in this case, with the prison set extension and the symbiote in the shot. It helps these teams further understand the story, because without those elements, it is pretty tricky. On this project, a lot of the action came from things that Carnage’s tentacles were doing, so they need to be seen.”
Like final VFX work, post-vis is highly technical in nature. Physical effects, such as explosions or prison cell doors being torn off, are present in the plates, and these details are based around pre-vis. What remains is the process of bringing the as-yet-disconnected parts together convincingly.
“When plates are sent to us, the first thing we do is integrate them into our systems. We use 3DEqualizer for camera tracking, combined with advanced data like LiDAR scans. That captures all the information from set and transforms it into a 3D computer model – completely to scale – which provides an accurate basis for visualising CG elements. Actors and props can be scanned for digital doubles, too.
“There are actually already thousands of digital assets in place, which were built earlier in the process – environments and actors, vehicles, props, the symbiote aliens themselves in this case, and their tentacles – and those are brought into Maya to work with. The tentacles alone required a lot of complex character rigging,” Chamney tells us. “We had to work out how they would extrude and stretch, and what their tension would be. All these elements are put through our rigorous publishing system, to make sure they’re fit for use.”
Although a film’s main edit is handled by a distinct team, as on any non-VFX production, some editing of fully CGI shots must still be done just the same. This responsibility also fell to The Third Floor.
“We used Avid to edit pre-vis and post-vis content, to convey the unfolding story in shots that were often entirely virtual. Many moments saw the symbiote creatures extending off Tom Hardy or Woody Harrelson’s bodies, but others had CG characters fighting in CG environments.”
In his closing thoughts, Chamney takes a moment to consider the evolution of roles such as his, which have come hand-in-hand with the evolution of visual effects technology.
“The interesting thing about visualisation is that nobody except the filmmakers will see it,” he says. Still, there’s no doubt that it’s crucial to the success of many blockbusters – with scenes that are so complex, they couldn’t possibly be completed without it.
“Initially, visualisation work was about solving VFX problems,” Chamney concludes. “Now, it’s there for virtually every department and, ultimately, the benefit of the story.”