
Living in a virtual world

Posted on Apr 27, 2022 by Samara Husbands

As cutting-edge VP continues to develop, how are traditional workflows being reshaped to fit within the volume? The experts shed some light

Words Lee Renwick  |  Images Various

The revolutionary creative potential of the LED volume is well established, expanding on-set immersion in ways rear projection only hinted at. It offers an armful of benefits that chroma key and post-production VFX may never replicate. And while virtual production cannot be lauded as the singular future of filmmaking – likely, no one technique ever will be – it’s an exciting new tool in the arsenal.

But, as with any new addition to the production workflow, surrounding elements are being forced to adapt. This is by no means a negative: fresh collaborations are developing – perhaps even leading to a better, or at least more unified, end result.

Since the volume's walls rely on input imagery, the process actually begins with visual effects. On that front, few companies are more ubiquitous than DNEG. Steve Griffith, executive producer of virtual production, offers up his expertise.

Giving life to volumes

“In the traditional, post-production VFX world, we’re still building the environment well before we apply it to a scene. It’s just that the build can happen while the broader crew is shooting, or after. The biggest difference with virtual production in that sense is that creatives have to lock in ideas before we step on the set,” Griffith explains.

“It can be an uncomfortable process for some filmmakers, because they aren’t used to making those decisions so early in production. It’s our responsibility to work very closely with everyone, from directors to production designers, ensuring that goes as smoothly as possible.”

The additional time and money poured into the pre-production phase of virtual shoots is, naturally, a cost saved in otherwise more invasive post-production work. That's not to say additional VFX will never be added once volume content has been shot, but even then, time is saved.

“The other name we have for the LED volume technique is in-camera compositing, which follows a pretty traditional VFX approach,” Griffith explains. “The layers of an Unreal-built world need to render to give parallax. On the virtual production stage, you’re still composing through the view of a camera’s lens, which is exactly what you’d do in post. Now, you’re just compositing in the very moment you shoot, rather than afterwards.

“When shooting a green screen with a 50mm lens, I would take it into a visual effects platform, recreate the view of that lens, then create the backgrounds using that view. We still do that when working with a volume, just, again, ahead of time. 

“But in a traditional workflow, you’d spend a lot of time rendering the image. You’d have to emulate the camera and lens that were used; set up the digital environment; render it and hand it to a compositor; have it put through a compositing package; then render that out as an image. It goes through several different processes to get you to a final image. Instead, when you’ve done all that ahead of time and are rendering through Unreal, it just happens immediately. The image layering process is the same, but the time to a finished product is immediate.”
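
For the technically curious, that contrast can be sketched in a few lines of Python. The step timings below are purely illustrative assumptions, not DNEG figures, but they show why the time to a finished image collapses once every layer renders within a single frame interval.

```python
# Illustrative sketch: offline compositing pays a long, serial pipeline per
# shot, while in-camera compositing must finish each frame inside the
# capture interval. All timings are assumed for illustration only.

offline_steps_hours = {
    "emulate camera and lens": 2.0,
    "set up digital environment": 8.0,
    "render layers": 6.0,
    "composite layers": 3.0,
    "render final image": 1.0,
}

fps = 24
realtime_budget_ms = 1000 / fps  # Unreal's deadline for a finished frame

print(f"Offline: ~{sum(offline_steps_hours.values()):.0f} hours across "
      f"{len(offline_steps_hours)} hand-offs before a final image")
print(f"In-camera: a composited frame every {realtime_budget_ms:.1f} ms")
```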


Higher resolution

To wrap a shoot day with what may well be an all-but-finalised image is of great benefit, all the way down the production chain. The marketing department has footage to feed into a trailer or other promotional materials. Editors can work with a clear idea of the shot’s full composition. Talent can perform with a real reference on-screen, rather than in an empty green space. And directors see their vision brought to life immediately.

The pre-production VFX work itself begins with pre-visualisation, which can serve as a beneficial step in a traditional workflow. Here, however, it's essentially mandatory.

“The director’s vision is essential, to know what we’re going to create for the set,” Griffith says. “Pre-visualisation, tech-visualisation and virtual location scouting all assist greatly. If you’re doing these stages in Unreal, the other advantage of this whole process is that we’re just taking those assets, making them higher and higher resolution. We start with a low-resolution image or asset, then add more detail, and by the time we finally get on-set, it’s just an upgraded version of tech-vis and pre-vis work. If those assets need to be used in post-production VFX, we upgrade a little more and pass on to the next department.”

However, virtual production isn’t all about Unreal 3D builds. This exciting opportunity is widely publicised, due to its groundbreaking potential. But more photorealistic and slightly less demanding approaches may well offer the finest results.

“We can apply a 2D environment to the volume, which is useful for any static camera shot where parallax isn’t needed,” Griffith states. “Then, there’s another process called 2.5D, in which we can take existing plate photography, or simple still imagery and – through photogrammetry – project it onto rough geometry. As long as the camera move is not too big, that dimensionalised plate will have enough parallax on-set to make it look like a full 3D build, without actually doing the months of work Unreal demands.
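
How big is "too big" a camera move? A rough geometric sketch, with assumed depths and lens values rather than anything from DNEG's tooling, shows how the on-screen error of a dimensionalised plate grows with the size of the move and with the gap between the proxy geometry and the true depth of the photographed detail.

```python
import math

# Rough sketch of 2.5D parallax error: a plate projected onto proxy geometry
# at one depth, while the photographed detail really sat at another. A
# lateral camera move shifts the two by different angles on screen.

def parallax_error_px(move_m, proxy_depth_m, true_depth_m,
                      hfov_deg=40.0, width_px=4096):
    """Approximate on-screen error, in pixels, for a lateral camera move."""
    err_rad = move_m * abs(1 / proxy_depth_m - 1 / true_depth_m)
    return err_rad * width_px / math.radians(hfov_deg)

# Detail shot at 50m, projected onto proxy geometry at 45m (assumed values):
print(f"{parallax_error_px(0.5, 45, 50):.0f} px error on a 0.5m move")  # ~7 px
print(f"{parallax_error_px(5.0, 45, 50):.0f} px error on a 5m move")    # ~65 px
```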

“Resolution also needs to be considered. In some cases, we’ve had to send 24 4K outputs to an LED wall. Your normal visual effects shot is just one 4K or 8K scene, but we have to do that 24 times over and render immediately. The amount of computing power required is massive. It’s hundreds of times more than you’d normally need.”
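
Back-of-envelope arithmetic bears out the scale of that load, assuming UHD outputs and a 24fps capture rate. And unlike an offline render, every one of those pixels faces a hard real-time deadline.

```python
# Back-of-envelope: 24 synchronised 4K (UHD) outputs, each rendered in
# real time, versus one offline 4K frame with no per-frame deadline.

w, h = 3840, 2160          # one UHD output
outputs, fps = 24, 24      # feeds to the wall, frames per second

px_per_frame = w * h * outputs
px_per_second = px_per_frame * fps

print(f"{px_per_frame / 1e6:.0f} megapixels per frame "
      f"({outputs}x a single 4K shot)")
print(f"{px_per_second / 1e9:.1f} billion pixels rendered every second")
```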

As such, highly photorealistic VFX scenes we’d usually see in the post-production space are, for now, virtually off limits.

“Typical visual effects platforms can still render to a higher level of detail than we can achieve in Unreal and virtual production. Video-game engines weren’t designed to do photorealistic 3D rendering. That’s the goal, but the hardware and software isn’t quite there yet. We’re close, and depth-of-field helps, but 2.5D is based on photographic images, which is why many find it the most successful use of the LED volume.”


Shaping the shoot

Few can speak to the evolution of traditional on-set roles more accurately than Garden Studios virtual production supervisor, Mark Pilborough-Skinner. Among a host of broader production spaces, the studio became home to one of the UK's first permanent LED volumes as the technology rose to prominence.

Garden Studios’ volume itself comprises ROE Diamond panels, spread across a 12x4m back wall. The ceiling is built with Carbon 5 fixtures, trading a coarser pixel pitch for an increased brightness of 6,000 nits. That’s enough for around 80% of a typical shoot’s fill light.

“The physical and digital art departments have to be in close communication,” Pilborough-Skinner notes. “You need to make sure that your physical foreground props match the digital ones, otherwise there’ll be a big disconnect when you’re looking at the final pixel frame.

“It’s the same with flooring, although it’s now possible for us to perfectly blend physical floors into the background. When you’ve got different pools of light, it becomes harder, but there is this relatively new tool in Unreal Engine, called colour correct regions. We’re essentially colour grading, but instead of working globally, we can place adjustments in 3D space.

“By creating different depths with props on the stage, you also give the focus puller more planes of focus. Sometimes, if your talent is very sharp and the background is out of focus, your brain will wonder why there are only two different focal depths.”
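
The logic behind a colour correct region can be illustrated with a simplified stand-in (this is not Unreal Engine code): a colour gain whose strength depends on a point's position in 3D space, rather than its position in the frame.

```python
import numpy as np

# Simplified stand-in for a colour correct region: a colour gain that is
# fully applied inside a 3D box, fades out over a falloff distance, and
# leaves the rest of the scene untouched.

def region_weight(pos, box_min, box_max, falloff=0.5):
    """1.0 inside the box, fading to 0.0 over `falloff` metres outside."""
    outside = np.maximum(box_min - pos, 0) + np.maximum(pos - box_max, 0)
    return float(np.clip(1 - np.linalg.norm(outside) / falloff, 0, 1))

def grade(colour, pos, box_min, box_max, gain):
    """Blend the gain in, weighted by the point's membership of the region."""
    w = region_weight(pos, box_min, box_max)
    return colour * (1 + w * (gain - 1))

# Warm up only the pool of light at the floor seam (all values assumed).
lo, hi = np.array([0.0, 0.0, 0.0]), np.array([2.0, 2.0, 0.5])
print(grade(np.array([0.5, 0.5, 0.5]), np.array([1.0, 1.0, 0.2]),
            lo, hi, gain=np.array([1.2, 1.0, 0.8])))
```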

The most obvious adjustment for the camera operator and focus puller is maintaining the volume's immersion: framing off the wall, or bringing the panels into sharp focus, must be avoided.

“There’s also a little bit of latency between the camera and what we call the camera frustum, which is the projected view of the camera onto our wall – it takes some getting used to. Our wall is quite good, with a delay of about six frames. But if you want to do quick whip pans or erratic camerawork, we need to overscan the camera frustum,” the virtual production supervisor explains.

“We normally render what the camera can see at the highest resolution. Whatever the camera can’t see, we render the lowest quality, so you’ll still get all the lighting effects. The only caveat is if you’ve got lots of reflective props.”
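
The arithmetic behind that overscan decision is straightforward. The pan speed and field of view below are assumed for illustration, not Garden Studios' actual settings.

```python
# With roughly six frames of latency, a fast pan moves the real camera
# ahead of the frustum drawn on the wall; overscanning renders the frustum
# wider than the lens sees, so the lag never exposes the low-res region.

fps = 24
latency_frames = 6          # wall's delay behind the tracked camera
pan_deg_per_s = 60.0        # a brisk whip pan (assumed)
h_fov_deg = 40.0            # lens horizontal field of view (assumed)

lag_deg = pan_deg_per_s * latency_frames / fps
overscan = lag_deg / h_fov_deg

print(f"Camera leads the frustum by {lag_deg:.1f} degrees during the pan")
print(f"-> overscan the frustum by at least {overscan:.0%} on that side")
```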

With much of a scene’s lighting provided by the volume itself, the gaffer’s role has undoubtedly changed – but not for the worse, according to Pilborough-Skinner.

“Gaffers may think a lot of their normal remit has moved into the hands of the technical team behind the volume. But, actually, we say our LED ceiling is just another lighting source for them.

“We recently did a day-to-night time-lapse shot. Our ceiling panel was changing from warm to cold, but the gaffer and DOP didn’t think it was punchy enough. They asked us to create blue and red circles overhead, and animate them to move in time with the rear image. It created this soft spotlight effect. You can’t really get hard shadows off the top screen, but it was a punchy blob of colour that moved with the background. We walked away realising that we can actually augment the content with additional lighting elements.”

But virtual production isn’t a solve-all solution, as Pilborough-Skinner reiterates. “It doesn’t suit every production – especially ones that film in a fluid way, with script changes happening during the shoot, or those with huge action sequences. For those it does suit, I think one of the best soft advantages is the absolute demand for collaboration.”

 


Staying connected

Even as social distancing hopefully draws to an end, there are plenty of reasons for remote working within the virtual production sphere. Stages tend to be smaller, and creative experts may not be close at hand. Brands such as Nvidia are ahead of the curve with collaboration platform offerings, although some adaptation is required.

“Real-time collaboration between the applications creative teams are most comfortable with provides the freedom to make complex changes – directly into the shared virtual world being presented on-set,” explains Nvidia’s media, entertainment & broadcast industry lead, Jamie Allan. “Nvidia Omniverse enables this by building on Universal Scene Description (USD), a common language for describing 3D scenes, with a library of connectors that plug the most popular tools into the same scene.

“The general solution outline involves creating content in 3D applications, and powering real-time engines on the fastest Nvidia GPUs with our sync technologies, to drive large LED walls. Content can then be captured using Nvidia AI-powered camera and tracking systems, and distributed live using our IP video and data networking acceleration. All of these elements existed already, but the VP sector has found ways of bringing them together like never before.”

While the importance of preparation and pre-production work couldn't be clearer, in rare instances, unavoidable changes to the volume's virtual content will have to be made before shooting can resume.

“If a creature or piece of set design needs tweaking once it’s been seen on-set, but the artist or studio is not physically there, changes can be made remotely through Omniverse, and then reviewed in real time on-set,” Allan continues. “Creatives can stay in their native application, such as Maya or 3ds Max, while connecting to the Unreal Engine scene being displayed to the director on the stage.

“This means that any creative changes made during the shoot are saved back to the ‘source of truth’, or full-quality assets. The assets, along with the incorporated changes, can then be used during further post-production VFX, without having to replicate this work.”
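
As a minimal sketch of that round trip, here is how such a tweak might be written straight into the shared scene using Pixar's open-source USD Python API. The file name, prim path and offset are hypothetical, chosen purely for illustration.

```python
# Minimal sketch of editing the shared "source of truth": the scene file,
# prim path and offset here are hypothetical, but the API is Pixar's USD.
from pxr import Gf, Usd, UsdGeom

stage = Usd.Stage.Open("shared_scene.usda")        # hypothetical shared stage
creature = stage.GetPrimAtPath("/World/Set/Creature")

# Apply the tweak requested on-set via USD's common transform interface.
UsdGeom.XformCommonAPI(creature).SetTranslate(Gf.Vec3d(0.0, 0.0, 0.5))

stage.GetRootLayer().Save()  # the change persists for post-production, too
```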

Always innovative, those in the industry are harnessing existing platforms in new ways, to meet the extensive demands of virtual production. With time to develop, potential is perhaps boundless. For the time being, the creative connections garnered are, in themselves, a promising sign of things to come. As we watch LED volume scenes in more mainstream productions, the outstanding results are beginning to speak for themselves.

This feature originally appeared in the May 2022 edition of Definition Magazine.
