
Movie magic: virtual production

Posted on Sep 6, 2021 by Alex Fice

The democratisation of virtual production is driving an evolution in the technology. We look at how these new tools work

Words Chelsea Fearnley

As VFX becomes a greater part of the industry, virtual production attempts to bridge the growing divide between what filmmakers can see through the camera on set, and what they have to imagine will be added digitally many months later. But the technology isn’t new. It has been in use for a long time, and can be traced as far back as the 1999 production of The Lord of the Rings. However, because it’s expensive and still needs improvement, there’s been a monopoly on who gets to use it.

In the movie business, there’s only so much money for R&D, since so few people need it. 3D is one example of an emerging technology that could only go so far without proper funding, because it was such a pain to use. But the rise of consumer technology is making it possible to fix these problems. When you think about how much time people spend on their phones and laptops, you start to realise they’re living two lives: the physical one we’ve always known, and the digital one inside our devices.

Now, all the world’s biggest companies are trying to marry those two worlds – and it’s going to be enormous for VFX in the film industry. It means virtual production will finally become democratised.

What a wonderful world

In its simplest form, virtual production is the ability to mix live footage and computer graphics in real time, so filmmakers get immediate feedback and can make on-set decisions about VFX and animation. It gets a bit more complex with VR, because the content is almost entirely computer generated. You can pick up a tree and move it, grab the sun and change the light, become a character and give a different performance.

The benefits of virtual production are far-reaching. What it’s really good at – and what helped propel it into the mainstream – is reducing travel and lodging for the crew, because you can create any environment you want and have it all in one place: an invaluable perk during a worldwide lockdown. It also helps put the tools of storytelling back into the hands of filmmakers, rather than an army of technicians. Using a combination of LED walls (and ceilings) with camera tracking systems and game engines to render content not only in real time, but in dynamic synchronicity with the camera’s viewpoint, filmmakers can stage scenes with greater realism. They can finally shake off that “we’ll fix it in post” mentality. But before we start heralding it as the future, let’s first look at what’s been done already – and what needs to improve.
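To make that loop concrete, here is a minimal Python sketch of the per-frame cycle – tracked camera pose in, virtual camera updated, background rendered to the wall. It is purely illustrative: the tracker, engine and wall functions are hypothetical stand-ins for whatever the tracking system and game engine actually expose, and the 24fps frame budget is an assumption.

```python
import time
from dataclasses import dataclass

FRAME_BUDGET_S = 1 / 24  # assumed 24fps target; productions may run at other rates


@dataclass
class CameraPose:
    """Position (metres) and rotation (degrees) reported by the tracking system."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float


def read_tracker_pose(frame: int) -> CameraPose:
    # Stand-in for the camera-tracking feed (inside-out or outside-in).
    return CameraPose(x=0.0, y=1.5, z=frame * 0.01, pan=frame * 0.1, tilt=0.0, roll=0.0)


def update_virtual_camera(pose: CameraPose) -> None:
    # Stand-in for handing the tracked pose to the game engine's virtual camera.
    pass


def render_to_wall(frame: int) -> None:
    # Stand-in for drawing the camera-correct background onto the LED panels.
    pass


for frame in range(48):  # two seconds at 24fps
    start = time.perf_counter()
    pose = read_tracker_pose(frame)   # 1. where is the physical camera right now?
    update_virtual_camera(pose)       # 2. move the virtual camera to match
    render_to_wall(frame)             # 3. draw the background from that viewpoint
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET_S:
        print(f"frame {frame}: over budget by {(elapsed - FRAME_BUDGET_S) * 1000:.1f} ms")
    time.sleep(max(0.0, FRAME_BUDGET_S - elapsed))
```

The point of the sketch is simply that all three steps have to finish inside a single frame, which is why – as Geissler explains later – image quality has to be traded against immediacy.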

Lighting complexities

For HBO comedy-thriller Run, the production built two cars, outfitted to resemble an Amtrak carriage, on a sound stage in Toronto. These rested on airbags that could simulate movement; instead of LEDs, a series of 81in 4K TV monitors was mounted on a truss outside each train window, displaying footage pre-shot by Stargate from cameras fixed to a train travelling across the US. It was a smaller-scale, less expensive version of Lucasfilm’s production of The Mandalorian – which, for Season 1, used a virtual set 75ft in diameter and 21ft high, with a ceiling also composed of LEDs – but the principle was still the same. “It brings the location to production, rather than moving an entire cast and crew to often hard-to-access locations,” says DOP Matthew Clark.

Any light that played on the actors’ faces, or on surfaces in the train, had to be synchronised to the illumination outside the windows, otherwise the effect wouldn’t have worked. Clark says: “It was important to line up the picture so, when you’re standing in the car, your perspective of the train tracks and power lines is realistic and continuous. If the angle of the TV screen is off by just a few degrees, suddenly the wires of a telegraph pole are askew. When we needed to turn the car around to shoot from another angle, the grips could flip all the monitors around to the exact angle.”

Although LED lighting has come on in leaps and bounds in recent years, the image quality output by the panels still has some way to go. “Currently, the industry is using live event technology. It needs to transition to fit-for-purpose, cinematic LED technology,” explains Michael Geissler, CEO of Mo-Sys. Without cinematic LED technology, it’s not possible to light the actors and sets using panels alone – at least not well.

Arri Rental’s Andrew Prior recalls a scene from Marvel’s Loki, where the eponymous lead and another character find themselves stuck on a hazardous, purple planet, with meteors crashing into it from all directions. He affirms: “You could tell it was a volume behind them, but it was done very well. The average viewer wouldn’t have known. What was interesting, however, was a wide shot of them sitting on a rock that looked all wrong. The purple light reflecting on their faces made their skin tones jaundiced – so even though the colour reflections were accurate [and not green from chroma key], it’s a clear example of lighting quality not being quite where it needs to be.”

This is something manufacturers will have to figure out – and Mo-Sys’ Cinematic XR initiative is aimed at driving change in the image quality of LED panels. Purpose-built for cinematic and broadcast use, and designed by expert engineers, the solution improves final-pixel Unreal Engine image quality.

Geissler explains: “There are two extremes of Unreal graphics quality. In final-pixel LED volume shoots, you sacrifice Unreal image quality for immediacy. That is, you can’t turn the Unreal quality dials up without dropping below a real-time frame rate. At the other end of the scale, post-production compositing enables non-real-time rendering with all the Unreal quality dials at maximum, but at the expense of time and cost. Mo-Sys’ new NearTime rendering combines the immediacy of final pixels with graphics quality approaching offline compositing, stretching rendering time in a patented and automated workflow.”

Another thing manufacturers need to keep in mind is that bigger is no longer better. LED displays are measured by pixel pitch (the distance in millimetres from the centre of one pixel to the centre of the adjacent pixel), and the pitch needs to be narrow enough that the camera doesn’t resolve the individual pixels – or pick up moiré patterns – when the wall is photographed. In our industry, panels with a pixel pitch of less than 4mm are known as ‘fine-pitch displays’, and displays with a pitch of less than 2.5mm are called ‘super fine-pitch displays’. Fine-pitch displays – such as Video Screen Services’ Black Pearl – are indisputably the focal point of many vendors’ marketing, boasting vibrant colours, clear images and sufficient brightness.
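As a rough worked example of what those pitches mean in practice – the 10m wall width below is a hypothetical figure, not taken from any of the productions mentioned – the relationship between pitch and resolution is simple division:

```python
def pixels_across(wall_width_mm: float, pixel_pitch_mm: float) -> int:
    """Number of LED pixels across a wall of the given width at the given pitch."""
    return int(wall_width_mm / pixel_pitch_mm)


def classify(pitch_mm: float) -> str:
    # Thresholds as described above.
    if pitch_mm < 2.5:
        return "super fine-pitch"
    if pitch_mm < 4.0:
        return "fine-pitch"
    return "standard pitch"


WALL_WIDTH_MM = 10_000  # hypothetical 10m-wide wall

for pitch in (4.0, 2.8, 2.5, 1.5):
    print(f"{pitch}mm ({classify(pitch)}): "
          f"{pixels_across(WALL_WIDTH_MM, pitch)} pixels across")
```

Halving the pitch doubles the pixel count in each dimension – and each of those smaller pixels throws less light onto the set, which is the trade-off the next paragraph picks up.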

By contrast, On-Set Facilities, a rental company in the UK offering LED screens and monitors, advises that the bigger the pixel, the more light it outputs onto the actors. Therefore, very fine pixel pitches may not be optimum for filming – but until the image quality of these panels improves, a lower output of beauty lighting is probably for the best.

Camera tracking

Another essential component is the ability to have the LED volume tracked to the camera movement by a wireless sensor. This means that, as the DOP frames the shot, the content on the LED wall adjusts to the camera’s perspective – and that’s no small feat, as it requires minimal to zero latency to work. Most tracking systems rely on a series of cameras following infrared trackers, either inside-out (a sensor mounted on the camera viewing trackers around the set) or outside-in (sensors mounted on the set viewing trackers on the camera). There are benefits to both, with varying impacts on how you would build a set.
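To get a feel for why latency matters, the back-of-the-envelope sketch below converts end-to-end delay – tracking, engine render, LED processing – into how far the wall content lags behind a panning camera. The pan speed and latency figures are hypothetical, chosen only to illustrate the arithmetic.

```python
def background_lag_deg(pan_speed_deg_per_s: float, latency_ms: float) -> float:
    """Angular lag of the wall content behind the camera for a given end-to-end latency."""
    return pan_speed_deg_per_s * (latency_ms / 1000.0)


PAN_SPEED_DEG_PER_S = 30.0  # hypothetical: a moderate pan

for latency_ms in (8.3, 41.7, 100.0):  # hypothetical end-to-end latency budgets
    lag = background_lag_deg(PAN_SPEED_DEG_PER_S, latency_ms)
    print(f"{latency_ms:>5.1f} ms of latency -> background trails by {lag:.2f} degrees")
```

Even a single frame of delay at 24fps (roughly 42ms) leaves the background more than a degree behind the camera during that pan – enough to read as the environment sliding against the foreground – which is why tracking systems chase every last millisecond.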

Continue reading in our September issue of Definition magazine.
