Virtual takeover
Virtual production wants to use your cinematography skills in its virtual world. But how do you get involved in its inevitable success?
Words / Sam Scott-Smith
Virtual assets have been part of the fabric of filmmaking for a while. However, reductions in the time it takes to create those assets have turned virtual production into a real step change in how productions are made – one that increasingly draws on traditional filmmaking tools and their specific terminology.
Kerry Shea, head of Moving Picture Company (MPC) LA, explains what she means by the term: “Virtual production right now is such a generalised, catch-all term, but what it truly entails is film and television production that allows creatives to interact directly with CGI content in real time. It’s not one single process or product; it’s several different methodologies and several different tools where you can work in a very small or large scope.”
Traditional filmmaking tools and techniques are being modelled into game engines
Virtual production origins
In fact, virtual production has been with us since the early nineties. Of course, back then, we weren’t talking real time, but filmmakers started integrating motion capture into their workflow in order to work with virtual characters. “We wanted to see more and more unique characters; to allow actors to perform those characters and have their performance directly translated. It was directors who wanted to do animation, but didn’t come from an animation background, who pursued this new type of production,” explains Shea.
Those directors wanted to direct actors, but also wanted to use what they were familiar with. That is, they wanted to use familiar camera outputs. “Virtual production back then just meant you wanted to see the performance as a CGI asset – now you can see them live on the set,” says Shea. “Originally when you were doing motion capture, you could just see a skeleton walking around and then you could see a better version of the skeleton, but you had to use a lot of imagination. Now, with the addition of game engines, everything is instantaneous. A director can look on set and see an actor and somebody in a motion-capture suit, then look into their monitor and, suddenly, they aren’t looking at an actor in a suit – they’re seeing an actor and a CGI character interacting with each other.”
This immediate visualisation is a game changer for both the director and the performer – as well as everyone else on set. “For cinematographers, what has changed is that we are using traditional camera outputs like master wheels, pan and tilt heads – even the aesthetic of a handheld camera,” she explains. “We have those camera outputs, so we can walk around with the hardware, and yet what they see in the monitor is the CGI content. It’s not film, it’s not video; it’s the CGI content moving around.”
Thanks to game engines, it is now possible to adjust the lighting and switch between different lenses in real time – which is perfect for cinematographers, according to Shea.
“They can then plan out the entire shoot and have control – or more control – over the camerawork, as opposed to the VFX companies simulating that in the back end. Now, instead, what you’re shooting is what you’re going to get in VFX,” she says.
VR is now used on set to ‘walk through’ a virtual set, so you know what works and what doesn’t before building the environment
VFX meets cinematography
You could see cinematography’s relationship with VFX as being a case of bringing the two together kicking and screaming. But Shea sees it as a great opportunity for both disciplines, with each needing the other in a reciprocal relationship. “How do these different parts of the film process work together?” she asks. “When everything was changing and motion capture was being introduced into the film industry, actors were concerned they were going to be replaced, but as it turned out, instead of replacing them, they now realise they can actually perform as a virtual character. They see what the value is, and so it has become more readily adopted.”
Among directors of photography, there has been a concern that more and more VFX would encroach on their territory – but Shea insists that’s exactly what she doesn’t want to happen. “We want to be able to benefit from the talents of a skilled and experienced DOP and camera operator,” she stresses, “so we can put the tools they’re used to working with in their hands. They can actually drive the camerawork, as opposed to visual effects trying to interpret what they’ve done on set. That’s the biggest change – we’re trying to go back to traditional film production almost, not looking to replace it.”
Capture to CGI
When Definition featured Welcome to Marwen, directed by Robert Zemeckis (February 2019), we also talked with DOP C. Kim Miles about using the Unreal Engine to light his actors. He admits to a kind of disconnect between his world and the video game engine world he was working in. He describes the struggle: “I had a hard time articulating, saying for instance that I needed a hard light over here and a really diffused fill light over here. It took a bit of translating to get on the same page. So, what they graciously did was to talk with our electrical department and get photometric data from most of our lighting package, which they then input to their system.
“They then created an iPad app for me, so I could sit in the game engine room with them and turn dials in the app to move the position of the sun, lower the intensity and quality of it, including the colour – it was supremely helpful in communicating how we wanted it to look. In effect, they matched their library of lighting tools to our physical sources,” he recalls.
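To give a rough sense of what that translation involves, here is a minimal Python sketch that maps a photometric reading from a physical fixture onto generic real-time engine light parameters. The fixture names, fields and conversion are illustrative assumptions, not the actual Marwen iPad app or the Unreal Engine API.

```python
from dataclasses import dataclass

# Hypothetical example: converting measured photometric data from a physical
# fixture into generic engine light settings. Names and values are
# illustrative only, not drawn from the real production pipeline.

@dataclass
class FixtureReading:
    name: str               # fixture as listed by the electrical department
    illuminance_lux: float  # measured illuminance at a reference distance
    distance_m: float       # distance at which the reading was taken
    colour_temp_k: float    # correlated colour temperature in kelvin

@dataclass
class EngineLight:
    intensity_candela: float  # intensity value an engine point/spot light expects
    colour_temp_k: float
    source_radius_m: float    # larger radius = softer shadows (more diffusion)

def to_engine_light(reading: FixtureReading, diffusion: float = 0.0) -> EngineLight:
    """Convert a photometric reading into engine light settings.

    Uses the inverse-square law: candela = lux * distance^2.
    'diffusion' (0..1) is an illustrative stand-in for how much the fixture
    is softened with frames or bounce.
    """
    candela = reading.illuminance_lux * reading.distance_m ** 2
    return EngineLight(
        intensity_candela=candela,
        colour_temp_k=reading.colour_temp_k,
        source_radius_m=0.05 + diffusion * 1.0,  # hard source ~5cm, fully diffused ~1m
    )

# Example: a hard backlight and a heavily diffused fill
back = to_engine_light(FixtureReading("hard backlight", 2000.0, 3.0, 5600.0), diffusion=0.1)
fill = to_engine_light(FixtureReading("diffused fill", 400.0, 2.0, 4300.0), diffusion=0.9)
print(back, fill, sep="\n")
```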
Shea admits that the lighting encoding side is getting better. “It’s not 100% yet, but it’s getting closer and, right now, it’s still an excellent tool for a DOP,” she says.
Live VFX
If you speak to the people in the VFX world who know about these things, they say that, in ten years, VFX for high-end movies and TV will be a live event on set. Let that prediction sink in for a second… We’re talking about a pre-vis that is more of a ‘vis’. Shea puts this speculation in more real-world terms: “The holy grail is to be able to take the final effects assets and use them at the very beginning of the pipeline in pre-vis. That’s not only a workflow shift, it’s also a psychological shift for the industry as a whole. It’ll work for some filmmakers, but may not be preferable for others. What it essentially would mean is that, at the very beginning of the process, you build your characters, your sets and your props for all your CG work at final quality. It wouldn’t be a proxy.
“Right now, the way that filmmaking works is that it’s an evolution, where you start with a proxy and then look at it more evolved, more evolved and more evolved. That’s kind of the way people have been working now. But to have the ability to not build it as a rough, but to build it as a final effect at the beginning – that’s a huge shift. The challenge from a technical standpoint, which I believe will be resolved, is that in order for everything to run in engine in real time, it has to be lightweight.
“So that’s the challenge people are facing,” Shea continues. “By taking the final film-resolution assets and putting them in a game engine, it slows the engine down. But that’s not insurmountable. I think everyone’s working on that right now, trying to figure out with graphics cards and the hardware in general how much geometry you can actually put into an engine and run in real time.”
She adds: “The more you can put in, the faster you can work and the closer we are to this workflow revolution.”
So, although the CG characters seen in the monitors might be displayed at a lower resolution, because they have already been realised at full resolution you won’t have to rebuild them later – which is what happens now. But Shea is quick to manage expectations: “We’re not there yet. It hasn’t been fully achieved yet, but it is something that everyone within the industry is pursuing currently.”
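As a back-of-the-envelope illustration of that real-time constraint, the Python sketch below checks whether a set of assets fits within a per-frame time budget. The triangle counts and throughput figure are assumed placeholders, not MPC or engine benchmarks.

```python
# Illustrative only: rough per-frame budget arithmetic for running heavy
# assets in real time. The throughput figure is an assumed placeholder,
# not a measured GPU or engine benchmark.

FRAME_RATE = 24                         # cinema frame rate
FRAME_BUDGET_MS = 1000.0 / FRAME_RATE   # ~41.7 ms available per frame

# Assumed effective throughput of the whole render pipeline, in triangles
# processed per millisecond of frame time (hypothetical number).
TRIANGLES_PER_MS = 2_000_000

def fits_in_frame(asset_triangle_counts):
    """Return the frame-time cost of the listed assets and whether it fits."""
    total_tris = sum(asset_triangle_counts.values())
    cost_ms = total_tris / TRIANGLES_PER_MS
    return cost_ms, cost_ms <= FRAME_BUDGET_MS

scene = {
    "hero character (final quality)": 12_000_000,
    "set environment": 40_000_000,
    "props": 8_000_000,
}

cost, ok = fits_in_frame(scene)
print(f"Frame budget: {FRAME_BUDGET_MS:.1f} ms, scene cost: {cost:.1f} ms, real time: {ok}")
```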
MPC’s Genesis
Developed by Technicolor’s VFX studio, MPC, and now overseen by a dedicated virtual production team at Technicolor, the Genesis production platform is the culmination of a development programme to address the requirements of this new, real-time virtual production environment.
Genesis provides tools that give directors, production designers, lighting designers, directors of photography, VFX and post-production supervisors – among others – the ability to simultaneously integrate and manipulate live action and computer-generated assets.
“Genesis, for us, is a component that runs alongside a game engine,” explains Shea. “At the very beginning of the process, when you’re building lower resolution assets, they’re checked into Genesis. It’s a compounding effect, so every single thing you are then doing – including running cameras – is tracked in Genesis. If you’re moving around sets, that’s tracked. When you go on set and are now doing motion capture, that is being tracked into Genesis. It’s compounding. It’s tracking everything, which is huge because, from a data standpoint, as we’ve been making films, we’ve been getting data from all different places and trying to interpret it. What we’ve created is a centralised hub for all that information.”
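To make the idea of a centralised hub more concrete, here is a minimal, hypothetical Python sketch of the kind of per-take data such a system might track. The class names and fields are illustrative assumptions and do not reflect the real Genesis schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical data model for a centralised virtual production hub, in the
# spirit of what Shea describes. Illustrative only; not the Genesis platform.

@dataclass
class CameraEvent:
    timecode: str
    focal_length_mm: float
    position: tuple   # (x, y, z) in metres
    rotation: tuple   # (pan, tilt, roll) in degrees

@dataclass
class Take:
    take_id: str
    scene: str
    camera_track: List[CameraEvent] = field(default_factory=list)
    set_dressing_moves: List[str] = field(default_factory=list)
    mocap_clips: List[str] = field(default_factory=list)

class ProductionHub:
    """Single place where every department checks its data in."""
    def __init__(self):
        self.takes = {}

    def check_in(self, take: Take):
        self.takes[take.take_id] = take

    def layout_for_editorial(self, take_id: str):
        """Everything editorial and VFX need to rebuild or re-lens the take later."""
        t = self.takes[take_id]
        return {
            "scene": t.scene,
            "camera_track": t.camera_track,
            "set_dressing": t.set_dressing_moves,
            "mocap": t.mocap_clips,
        }

hub = ProductionHub()
take = Take("T042", "sc12_market")
take.camera_track.append(CameraEvent("01:02:03:04", 35.0, (1.2, 1.5, 4.0), (10.0, -2.0, 0.0)))
take.mocap_clips.append("hero_walk_v03")
hub.check_in(take)
print(hub.layout_for_editorial("T042"))
```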
Shea gives an example: “The way we now have been working is, for instance, take an animated scene where the director of photography is running cameras on the scene. All of that is tracked, and we’re moving around set dressing as well. We record a take and that goes into editorial, so editorial have QuickTimes. It also has layouts, which then go to VFX.
“For that initial animation stage in VFX, we’ve been allowing the director to change to a different angle if he wants to. That means taking that low-resolution animation back into the engine and re-lensing it, which is really amazing. We’re able to do that because we’re tracking everything through Genesis,” she says.
High-resolution screens are also part of the virtual production takeover
Minimising VP
Just when you thought virtual production was all about motion capture – with virtual characters and real actors mixing on big sets – there is now an alternative, which comes as a side effect of all this data tracking.
“You can minimise this operation,” explains Shea. “Imagine a couple of people in a conference room, for instance. They could bring the material back into the engine and re-lens a shot right there in the room. That’s a big deal, a big change. And for us to then be able to put multiple systems into a rack-mounted case and transport this technology to different locations is really cool as well.”
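Re-lensing is possible because the camera move is stored as data rather than baked into pixels, so swapping lenses is largely a matter of recomputing the field of view for the new focal length. The short Python sketch below shows that standard optics relationship, assuming a full-frame 36mm sensor width for illustration.

```python
import math

# Re-lensing a tracked shot: with the camera path already recorded, swapping
# lenses means recomputing the field of view for the new focal length.
# Sensor width below is a common full-frame assumption (36 mm).

SENSOR_WIDTH_MM = 36.0

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = SENSOR_WIDTH_MM) -> float:
    """Horizontal angle of view for a given focal length (thin-lens approximation)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# The same recorded camera move, rendered through different lenses:
for focal in (24, 35, 50, 85):
    print(f"{focal} mm lens -> {horizontal_fov_deg(focal):.1f} degrees horizontal FOV")
```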
The danger for acquisition professionals is retreating into what they already know and have experience in, but people in the VFX industry want to partner with associations like the ASC and BSC – and probably others, too. They want to know what the challenges are for cinematographers, not to interpret for them. They want collaboration.
“There are obviously different languages that describe different disciplines,” admits Shea. “You could have a coder who talks to someone whose entire career has been on set using traditional film terms. We need a scatteration. We’re at the point in time now where the best way to improve the communication is to use film terms across the board – whether we’re working in CGI or working in live action. It educates people in VFX who are working in final pixels, and also invites all the live action disciplines – the cinematographers, the ADs, production designers – to the table. I’m a huge fan of using the film terms,” she concludes.
Virtual production is a juggernaut of technology that cannot be uninvented or stopped. If you are (or are going to be) working with VFX, it’s a massive opportunity to get involved at ground-floor level.