
Mixing realities

Posted on Jan 14, 2022 by Alex Fice

Continuing our round table series where we discuss the latest buzz with industry specialists, this month we explore how virtual production is transforming the craft of filmmaking

Interview: Chelsea Fearnley

Part 1

What gets you most excited about virtual production? 

Mark Pilborough-Skinner: The most exciting thing is the creative possibilities it enables, allowing productions to go to locations that are inaccessible, dangerous to film at, or simply don’t exist. VP also encourages early collaboration between departments that, in a traditional production process, would normally work in silos. Having your DOP, art director and VFX artists engaged in ongoing conversations prior to shooting means interesting solutions, more creativity and collaboration can occur before anyone even gets on set.

Jonny Hunt: We’ve been putting LED screens in front of cameras for 15+ years, so now having the opportunity to use that experience to solve brand-new challenges every day is incredibly exciting! It has also taken us from being a primarily technical department to being right in the middle of the creative process, while working with some of the world’s leading DOPs and VFX supervisors – and gaining a real understanding of their vision. We feel very lucky to have been there right from the beginning.

Christian Kaestner: The answer for me is actually a non-technical one. While it’s extremely exciting to witness the advancements in technology, real-time rendering and low-latency synchronisation, it’s really the creative aspect of virtual production that excites me the most. Virtual production enables us to become an even bigger artistic partner, and it dramatically expands our involvement very early on in the filmmaking process. In-camera visual effects (ICVFX) require close collaboration between filmmaker, DOP, production designer and the visual effects department – as the shoot is being planned, scripts are being written and stories are being told. Becoming part of this process is super exciting, and requires a refreshing way of thinking about what we do.

Jeremy Hochman: Our team has been working with LEDs on-camera for close to 20 years, so to see this become mainstream is incredibly exciting. In the past, we’ve relied on fragile systems with custom software and hand-crafted LED fixture arrays to do these things. To now have an entire industry embracing this type of workflow will benefit moviemakers and VFX companies, ultimately leading to more (and better) content for consumers.

David Levy: The ability to have such a detailed level of control over your environment, without losing creative freedom. The types of shots you can achieve in-camera are amazing, and would otherwise be impossible on location.

Marina Prak: It’s the way it has shaken up film and, in its wake, the commercials and broadcast industries. With virtual production, there is no limit to creativity – anything you can think of can be put on screen. Furthermore, travelling the world to find locations that fit the scenario is no longer needed. You can project everything here and now, and go from sunrise in Japan to sunset in Norway in a second.

Dan Hamill: The seemingly endless creative possibilities it affords filmmakers. They are free from the normal limitations of shooting on location – such as time of day, weather, etc. There is no need to wait for the rain to stop, or until it’s dark to shoot a night scene; these natural restrictions can be ‘fixed’ extremely quickly. Another big plus is sustainability – sending large cast and crew units around the world on planes, emitting huge amounts of CO2, can be kept to a minimum, as long as more studios offering VP as a service keep being developed globally.

What advances are being made to speed up the process, and provide tighter synchronisation between camera tracking, rendering and LED playback?

Pilborough-Skinner: One of the main driving forces of virtual production adoption is the use of real-time game engines as our render medium. Using video plates and other techniques works well depending on the shot, but Unreal Engine – which is free to learn and run – has accelerated VP and democratised the process. This means it can be deployed across a range of productions, including short-form content and adverts – it’s no longer just for high-budget TV and film. 

Hunt: It’s great that Epic (and others) have seen the value of focusing on training and developing features geared solely towards virtual production. Having their team available to guide you through the process and help solve any problems is invaluable.  

Kaestner: Getting images rendered in real time and displayed within a camera frustum that is itself tracked in real time is quite a technical challenge, and latency will always exist. If you combine that with the difficulty of capturing these images with a physical camera, you have plenty of areas that are constantly improving. Camera tracking systems keep getting more accurate, and can provide tracking data to the game engine faster. The game engine benefits from more powerful render nodes with better GPUs and data handling. LED walls and their firmware are regularly upgraded to allow for new protocols, or to accommodate new camera software. This means refresh rates can continue to increase, keeping camera shutter and display phase in sync across render nodes, on a display several metres tall and perhaps more than 50 metres wide. Any tweak or technical advancement to these components in this delicate chain – running at 24Hz or more – demands constant feedback loops and optimisation. In the world of VP, all components evolve all the time – exponentially.
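
To put that chain into perspective, here is a minimal back-of-envelope sketch (in Python) of how per-stage delays add up to whole frames of latency. The stage names and millisecond figures are hypothetical placeholders, not measurements from any production; the point is simply that at 24fps the whole tracking-render-display round trip has to fit into multiples of a roughly 41.7ms frame period.

```python
import math

# Hypothetical latency budget for a tracking → render → LED display chain.
# All per-stage delays below are illustrative placeholders, not measured values.

FPS = 24
frame_ms = 1000 / FPS  # one frame period at 24 fps ≈ 41.67 ms

stages_ms = {
    "camera tracking": 4.0,   # tracker sample arriving at the game engine
    "engine render": 30.0,    # real-time render of the camera-frustum view
    "LED processing": 6.0,    # LED processor and panel scanout
}

total_ms = sum(stages_ms.values())
latency_frames = math.ceil(total_ms / frame_ms)  # delays round up to whole frames

print(f"frame period at {FPS} fps: {frame_ms:.2f} ms")
print(f"end-to-end delay: {total_ms:.1f} ms ≈ {latency_frames} frame(s) of latency")
```

With these illustrative numbers, the chain lands at one frame of latency – the figure quoted later in this discussion. Let any stage run long and the pipeline slips to two frames, which is why every component in the chain is optimised continually.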

Hochman: Wow, this is quite a short question with a lot of answers! We’ve synchronised LEDs forever, because that’s a requirement for making a display with thousands of tiles operate as a single entity. But camera tracking and rendering have always been separate systems, operated by different departments and involving entirely separate workflows. Now that all of these technologies are being brought together, we’re seeing enormously fast-paced advancement, with all the interconnected equipment becoming an ecosystem. We’re in the infancy of this, and Epic Games/Unreal is pushing to be a big part of it from the software/render side of things. Megapixel is working diligently to make Helios the central infrastructure needed for all of that rendered content to be distributed and displayed.

Levy: We have observed that all technology partners involved in defining workflows and providing hardware/software solutions (including Arri) are engaged in a very open dialogue with each other. This has allowed the industry to move much faster than before, and lets us develop in a more collaborative and efficient way.

Prak: With LED volumes, latency is now being brought down to one frame. At the moment, nobody is able to drop below that figure.

Hamill: Current developments in Unreal Engine are helping to produce more authentic, photorealistic environments, and will be increasingly tailored to the characteristics of camera sensors and lenses.

How important is an LED panel’s pixel pitch to the camera’s optics?

Pilborough-Skinner: Choosing the correct pixel pitch for your LED volume is essential to avoid visual artefacts and the moiré effect. Depending on the size of your volume and the required viewing distance, the tightness of the pixel pitch can vary.

Hunt: There are so many variables to consider. The choice of pixel pitch depends on the shots you’re looking to achieve, depth-of-field and countless other factors. The sweet spot of a mid-2mm pixel pitch works for a lot of scenarios, but there are plenty of times when a finer, sub-2mm pitch gives advantages – such as shooting subjects closer to the screen, which can also reduce screen size and therefore cost. We own large quantities of 1.5mm, 2.3mm, 2.8mm and 3.4mm panels, all of which get good use for in-camera work – it really depends on the project’s requirements.

Kaestner: I would say that every virtual production project currently in development is a prototype of some form. The technology is evolving rapidly and fulfilling ever-higher visual demands. Each project is unique, and there is no ‘standard’ per se for a virtual production methodology. Therefore, every project has to look very carefully at the combination of LED panels, camera chips, lenses and colour pipeline. The pixel pitch of an LED panel is one aspect that has a massive influence on the visibility of moiré or pixel patterns, when seen at certain distances, through specific lenses and camera chips. For the Netflix series 1899, DOP Nik Summerer had special anamorphic lenses crafted by Arri, not only to create a unique visual for the show, but also to complement the virtual production. Using the Alexa Mini LF in the Dark Bay volume in Babelsberg limited the amount of moiré and pixel patterns at the distances we needed to shoot our scenes.

Hochman: This is quite critical, and the pitch depends heavily on the type of stage that is set up, along with how the DOP chooses to shoot. A large volume with a 30m diameter is fine with a 2mm pitch, since the camera is quite far from the LEDs – and depth-of-field works in your favour. On the other hand, we are seeing certain VFX supervisors prefer resolutions as fine as 0.9mm, so the LED screen can be extremely close to the camera without moiré. And it’s not just pixel pitch, but also the quality of the pixel itself – that’s why we actually created our own LED chemistry to make Rec. 2020 colour with exceptional viewing angles and black levels. You’ll see more of this next year.

Levy: It is a crucial topic, and the choices are determined by the size of the LED walls relative to the talent performance area, camera position, sensor size and lens choice.

Prak: It’s important in relation to the type of content you’re shooting and the kind of result you require. Anything close-up requires a finer pixel pitch, whereas wide shots work perfectly with a larger one. At the moment, most LED volumes work with anything between 2.3 and 2.8mm. For closer in, a finer pixel pitch might be preferable – like a 1.5mm. It’s good to keep in mind that this choice has two sides: the camera optics, which might be easier to change, and the LED panel used in the wall or volume, which is harder. It’s a matter of taste, budget, etc. But it is most important to test your chosen products together intensively, and synchronise your camera and LED panels to get the best results.

Hamill: The pixel pitch of LED panels is critical in creating a high-resolution, realistic virtual background that is suitable for in-camera VFX. Aside from the camera optics (focal length, sensor size, resolution), there are other elements – the size of the LED volume and the distance from camera to LED screen – which all need to be considered to mitigate issues like pixelation and moiré. In theory, the finer the pixel pitch, the higher the definition. For reference, the pixel pitch of the Roe Visual Diamond panels that form the LED volume at our studios is 2.6mm, whereas The Mandalorian used 2.8mm – but in both cases, excellent cinematic visual outputs were delivered.
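
As a rough illustration of the pitch-versus-distance trade-off described above, the sketch below uses a thin-lens approximation to estimate how large one LED pixel appears on the camera sensor, and flags moiré risk once the projected pitch becomes resolvable (spanning roughly two photosites). Every value here – the 50mm lens, the 8µm photosite, the distances – is a hypothetical example, and in practice moiré also depends on defocus, optical low-pass filtering and the panel’s sub-pixel structure.

```python
# Rough moiré sanity check (illustrative only). Thin-lens magnification:
# m ≈ f / (d - f). If one LED pixel projects onto ~2 or more sensor
# photosites, the sensor can resolve the LED grid and moiré becomes a
# risk unless the wall is kept out of focus.

def projected_pitch_um(pitch_mm: float, focal_mm: float, distance_mm: float) -> float:
    """Size of one LED pixel as imaged on the sensor, in microns."""
    m = focal_mm / (distance_mm - focal_mm)  # thin-lens magnification
    return pitch_mm * m * 1000.0

PHOTOSITE_UM = 8.0                  # hypothetical large-format photosite pitch
RESOLVABLE_UM = 2 * PHOTOSITE_UM    # grid resolvable once it spans ~2 photosites

# (panel pitch in mm, camera-to-wall distance in metres), 50mm lens throughout
for pitch, dist_m in [(2.6, 12.0), (2.6, 4.0), (0.9, 4.0)]:
    p = projected_pitch_um(pitch, 50.0, dist_m * 1000.0)
    verdict = "moiré risk" if p > RESOLVABLE_UM else "likely safe"
    print(f"{pitch}mm panel at {dist_m}m: {p:.1f} µm on sensor → {verdict}")
```

Under these assumed values, a 2.6mm panel is comfortable at 12m but risks moiré at 4m, while a 0.9mm panel stays safe at 4m – the same logic behind pairing large volumes with coarser pitches and close-up work with finer ones.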

