Death of the green screen?
Posted on Jan 15, 2021
We examine the pros and cons of this bleeding-edge virtual production technology.
Words Chelsea Fearnley / pictures Eben Bolter
Virtual production has become a bit of a buzzword for various computer-aided production and visualisation filmmaking methods. But in its purest form, virtual production is an image that is created with real-time rendering engines and projected on to a live LED screen behind a physical set. Think of it like the Holodeck from Star Trek: within seconds, your actors and props can be transported into a whole new universe, just by using high-resolution LED screens instead of green or blue screens.
It’s a technique that has become synonymous with Disney’s The Mandalorian, which shot the majority – but not all – of its scenes in an immersive and massive 270° horseshoe LED wall and ceiling display. With a high-profile project such as this employing virtual production techniques, it’s no surprise to us that industry experts are heralding it as the death of the green screen. If we’ve remembered correctly, it was the Star Wars prequels, shot between 1997 and 2003, that leaned heavily on the process known as ‘chromakeying’, where actors are filmed against green screens and CGI backgrounds are added in afterwards. Now, green screens are everywhere – even in the unexpected, on drama productions like The Crown (page 4). So, we wonder, has the Star Wars franchise accelerated another VFX technique – and if so, is it at the stage where it can replace green screen?
Before we get into this, it’s important to first explain exactly how virtual production technology works. Everything is live tracked, which means the perspective and lighting projected on to the LED screen shift in relation to the camera’s movement. But the camera is only responsible for what it sees through its lens, so the surrounding image remains static. Cinematographer Eben Bolter, who recently got hands-on with the technology, explains it in the simplest terms: “Only the section of the screen that the camera is looking at is alive and in 3D; everything else is static, dead even.” He goes on to explain that this is achieved by attaching witness cameras to the main camera. “When the camera moves, the witness cameras tell the computer which part of the screen the camera is looking at. The computer then displays the 3D imagery accurately in the camera frame. It really is impressive how quick it is, especially when you think about how complicated it is.”
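For readers who like to see the geometry, Bolter’s description of the “live” patch of screen can be sketched in a few lines of code. This is a deliberately simplified illustration, not how production systems are built: real rigs (such as the Unreal Engine pipeline used on The Mandalorian) track the camera’s full 3D pose and project a rendering frustum on to curved walls and ceilings, whereas this toy model reduces everything to a flat wall and a horizontal field of view. The function name and all the numbers are hypothetical.

```python
import math

def live_span(cam_x, cam_d, yaw_deg, fov_deg, wall_width):
    """Return the (left, right) extent, in metres from the wall's left
    edge, of the 'live' region a tracked camera sees on a flat LED wall.

    Toy model: the wall lies along the x-axis, the camera sits cam_d
    metres in front of it at horizontal position cam_x, and yaw_deg = 0
    means the camera faces the wall head-on.
    """
    half = math.radians(fov_deg) / 2.0
    yaw = math.radians(yaw_deg)
    # Each edge of the camera's field of view is a ray; intersect it
    # with the wall to find where the live region starts and ends.
    left = cam_x + cam_d * math.tan(yaw - half)
    right = cam_x + cam_d * math.tan(yaw + half)
    # Clamp to the physical wall: everything outside stays static.
    return max(0.0, left), min(wall_width, right)

# A camera 4m from a 20m wall, centred and facing it with a 40-degree
# lens, only 'wakes up' roughly the middle 2.9m of the screen.
lo, hi = live_span(cam_x=10.0, cam_d=4.0, yaw_deg=0.0, fov_deg=40.0,
                   wall_width=20.0)
```

As the tracked camera pans or dollies, the tracking data feeds new pose values into this calculation every frame, which is why the live region appears to follow the lens, and why any latency in that loop becomes visible on fast moves.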
Bolter, who is the DOP behind Avenue 5, used the technology to shoot Percival, produced by Rebellion Studios. Percival is about a knight of the Round Table, who’s battle-scarred and close to death in a moonlit forest. Suddenly, time speeds up and some unknown force aids his recovery, transporting him to the ruins of a church, where he receives a vision that inspires a new quest. Although it’s only five minutes in length, Rebellion is claiming Percival as the world’s first “all virtual” production, with all the action playing out entirely in front of a halo of large, flat-screen displays.
Bolter explains the film was “an experiment of sorts, designed to show what a virtual production could offer over the use of green screen”. The first benefit, which is also the least obvious, is how LED screens enable better lighting. Jason Kingsley, who played the knight, wanted to wear his own armour, which Bolter describes as a red flag straight away. “Armour is like a mirror, so if Kingsley were to wear his in front of a green screen, with all the set lights on him, we would have been battling with green reflections, which are incredibly difficult to remove in post-production.”
There’s also more flexibility with lighting controls: Bolter discloses that on Avenue 5, he used 150 LED lights for one set, each with its own controls. That’s a big step up from workhorse, old-fashioned film lights, but it’s certainly bested by an LED screen, for which you have just one control. Even so, this way of working might not be suited to everyone. “It requires learning a whole new language,” says Bolter. “Because, instead of communicating with your gaffer, you’re asking the post-production team to adjust or change the colour of the lighting, which I appreciate some people would find quite strange.”
Another drawback of using green screen is that it dramatically reduces the gear options available to filmmakers. Bolter used vintage anamorphic lenses to shoot Percival, but he says, “VFX crews notoriously hate these lenses, because they have to artificially emulate all the out-of-focus artefacts that are unique to them.” As a consequence, effects-heavy shots look less real, so “VFX crews tend to want you to shoot clean, using modern lenses,” says Bolter. “Of course, this isn’t an issue when the effects are captured in-camera.”
Bolter adds that if he wanted to shoot Percival on location, in a moonlit forest, it would have been a tiresome process. “We’d be restricted to only being there between 9pm and 7am, and if we didn’t get what we wanted in that time frame, we’d have to go there again another night. This is where virtual production comes into its own. Because, when you’re not reliant on time or travel, you’re able to capture scenes more efficiently,” he says.
Although Percival is a fantasy production, Bolter believes this technology is best suited to shooting projects that are grounded in reality. “If I was doing an emotional sunset scene in a desert with five pages of dialogue between two actors, we’d have to fly everyone to the desert, rehearse in the desert and then start shooting just as the sun’s in the right spot. Then, twenty minutes later, the sun would set, and we’d have to go away and shoot the scene another day. Whereas, if I had a good virtual sunset environment, I’d be able to sit down with the actors and shoot for a 12-hour perfect sunset.” In doing so, the emphasis is put back on to the performance, with actors able to take their time.
But the biggest benefit for Bolter, a self-proclaimed “anti-green screen” cinematographer, is the early collaboration with post-production. “When using green screen, we’ll set it up, and light and frame it in a certain way. Then, several months down the line, someone in a VFX studio somewhere is trying to figure out what that green screen should be – and usually, all the initial ideas we had on-set get lost to time and are gradually filtered down,” he says. “The beauty of virtual production is that all decisions, whether it’s the design of a dragon or castle, can be made together in the present.”
Know your limitations
Nonetheless, virtual production is still very much a bleeding-edge technology and it’s not at the point where you could dump every project in front of it and expect success. Bolter says that when he moved the camera, he noticed a half-second delay before the screen caught up. “It’s only slight, but if you wanted to do a quick pan, it basically wouldn’t work. It also means that if you hold the camera handheld, with all those little micro movements, left, right, left, right, just as you’re naturally breathing, the screen will go out of sync with you.” Furthermore, the screen itself, due to current processing power, has to be low resolution. This means that it constantly has to be kept somewhat out of focus, so as not to break the illusion. “It’s not 4K perfect,” he explains. “It’s actually more like 720p to the naked eye, which is impressive, but not nearly good enough.”
Bolter says the technology needs to mature before it can become a viable tool for any number of films and TV shows. “Give it a year or two,” he says. “The graphics cards will improve, the processing power will improve and the pixels on the screens will become smaller, and it’s all just going to get better and better and better.” Looking to the future, Bolter remarks that if you wanted to create an office block for a film, you could go crazy and have a thousand cubicles, “or an infinitely ridiculous, Charlie Kaufman-esque environment, with just one real-build cubicle, where edges of the physical set were blocked to hide the problems.”
There are physical limitations, however, that can’t be solved by advancing tech, and they are the biggest downside of using virtual production over real-life environments. Bolter says: “If we go back to the desert analogy, in reality, we’d have to figure out how to seamlessly blend the studio floor with the screen. This can be done; in The Mandalorian, the join was disguised by sand and rocks. But still, there were certain angles that couldn’t be achieved. Looking up someone’s nose, for example: you’d see the seam between the wall and ceiling screens, because they’re not joined together. The ceiling screen is really just for lighting.”
And actors also don’t have the same freedom of movement on a virtual set, “because if they keep walking, they’ll eventually hit the screen,” he laughs.
Ultimately, it’s important to understand these limitations in order to use the technology effectively. Treat it as a tool to help you out, but don’t adopt it just because The Mandalorian used it (sorry). We anticipate it being revolutionary for our industry, but if you just want to shoot a scene on a park bench, then go to a park bench. “Don’t let it become a gimmick,” concludes Bolter.
Virtual production specialist Lux Machina partnered with ILM and Epic to develop the technologies deployed for The Mandalorian’s in-camera VFX, including LED, camera tracking and rendering. Since then, The Mandalorian has become something of a style request, like a Rachel Green haircut, with many producers asking for it because of its “cool backgrounds”, says Phil Galler, co-CEO at Lux Machina. However, style is not the reason why The Mandalorian producers used these virtual production technologies.
Had it been shot on green screen, the reflective nature of the eponymous Mandalorian’s armour would have been “nigh on impossible to fix in post without spending millions and millions of dollars”, explains Galler, adding: “If you’re going to spend that money, you may as well do it in pre-production, because at least then you know what you’re going to get.”
Still, Galler firmly believes that virtual production technology should be used by filmmakers as a tool to help them, and not as something that they conform their whole workflow to. “Unless you’re going to be constantly using a giant disco ball in your production, then you don’t need a full wraparound and ceiling screen. This is not a cheap, nor an easy solution. It requires a lot of different production professionals – such as visual art, tech-vis and pre-vis departments – to do it accurately and at scale.”
Despite this advice, the demand for virtual production technology has grown tremendously this year. Galler explains: “Due to the coronavirus pandemic, there’s been a massive upswing for this technology, because it offers an avenue to producing content that didn’t exist before, but we hadn’t envisioned it becoming properly accessible until 2023.”
With this in mind, Galler notes that this technology is “still very much in its science experiment stage” and that there are apples-and-oranges comparisons happening everywhere, because there is no standardisation. “We’re seeing a lot of virtual production providers popping up, some with no production experience, and all with different service offerings and technology, especially on the camera tracking side,” he says.
For Lux Machina at least, its biggest technological advancement in the last six months has been in the photo-real rendering quality of digital characters.
“I think we underestimate how much time is spent building character assets,” says Galler. “Now, we’re able to advertise the use of an asset we build once in virtual art – and, with it being available at the beginning of production, it could become a highly profitable marketing tool.”
He concludes: “Everyone’s focused on virtual production as the death of the green screen, but actually I think it’s going to be driven by the hype around assets from marketing companies.”
This feature originally appeared in the January 2021 issue of Definition.