Thar she glows!
Posted on Jul 19, 2022 by Samara Husbands
In capturing 60K footage and manning a 165ft LED volume for high-seas comedy Our Flag Means Death, Stargate Studios founder Sam Nicholson may just have landed his virtual production white whale
Words. Lee Renwick | Images. Warner Media & Sam Nicholson
If you’ve sat down to watch an episode of Taika Waititi-produced, swashbuckling comedy-romp Our Flag Means Death, you may have been fooled into thinking Rhys Darby and co truly set sail during filming. In fact, what you’re seeing is yet another milestone for in-camera VFX virtual production. Because only one crew took to the seas during the creation of this series, and it was that of Sam Nicholson.
“I had collaborated with Mark Costa from HBO on virtual production before,” he explains. “A few years ago, we did a show called Hooligan Squad. It was fully green screen – very ambitious – and we pulled it off. The pilot didn’t go, but it looked fantastic and, for the time, was futuristic. Next, we did Run for HBO, set mostly on a 250ft-long train. We shot plates all across the US, then remapped them onto 40 4K monitors on a set in Toronto. And that worked, too. Both were scary conquests of virtual space.
“So I’d worked with HBO, and did some very successful tests with Taika for Akira. It was all on LED and, although that hasn’t been made yet, the footage looked great. As Our Flag Means Death was set to begin, Mark called me and asked, ‘Can we shoot an entire pirate series without leaving a sound stage?’ I have a great team at Stargate that I bring unusual challenges to – and we figure them out.”
As planning began, ever-developing virtual production emerged as the leading option for this unique set of circumstances. Building and launching a sailable pirate ship is near impossible within the timescale and budget of even the largest blockbusters. Far more attainable was capturing live-action plate footage in a technology-driven way and replaying it on an LED volume.
A vast array
“We only had six weeks of pre-production when the show finally got green-lit,” Nicholson continues. “Rendering water at the scale we were going for was out of the question. David Van Dyke, the visual effects supervisor, and I quickly decided live-action plates were the right decision.
“A stabilised array of five Blackmagic URSA Mini Pro 12K cameras was chosen for three reasons: image quality, data handling and resolution. There’s a lot of risk involved in all production decisions. Hardware dependability and relationships with manufacturers are critical to success. We relied heavily on Blackmagic hardware, for instance – not only in image capture, but also the playback and distribution of assets.”
Puerto Rico was the primary location, so Nicholson and his small team of experts set out on a boat for an intensive few weeks of shooting. What began as a full, 360° array with eight cameras was reduced to five. This opened space to stabilise the rig, with plenty of resolution left over thanks to the 12K sensors at play.
“Imagine five cameras on top of a boat,” says the DOP, returning to one of his three deciding factors. “Access is difficult, you’re swinging around in the waves, weather is harsh and the sun is going down. You don’t want to have to change data. We found a great solution, putting 4TB SanDisk SSDs on each cam. You’ve got off-the-shelf storage to shoot on all day. Plus, it simplified the DIT lay-off when we got back to the hotel.”
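Some back-of-the-envelope arithmetic shows why one 4TB SSD per camera can cover a day of intermittent shooting. A minimal sketch, assuming an illustrative sustained data rate of 250 MB/s for compressed 12K raw at 24fps – an assumed figure, not a quoted Blackmagic specification:

```python
def recording_hours(capacity_tb: float, data_rate_mb_s: float) -> float:
    """Hours of footage a drive holds at a given sustained data rate.

    Drives are rated in decimal units, so 1 TB = 1,000,000 MB here.
    """
    seconds = capacity_tb * 1_000_000 / data_rate_mb_s
    return seconds / 3600

# Assumed ~250 MB/s -- illustrative only; real rates depend on codec
# and compression ratio.
print(round(recording_hours(4, 250), 2))  # ≈ 4.44 hours of continuous recording
```

Several hours of continuous record time per drive is ample for a plate day at sea, where the cameras roll in bursts rather than end to end.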
But why go so large scale at all? Well, a big ship demands a sizable volume.
“Our visual effects team at Stargate stitched the individual recordings together. So, we had a 60K original background, squeezed down into a 20K anamorphic signal, then chopped up through Blackmagic hardware and redistributed to a large 165ft LED wall. Full resolution was required across the entire volume, so the crew could point multiple cameras anywhere and still have a usable shot. We didn’t film in the stage based on a frustum philosophy, where only one camera needs to see a single section of the wall at a time. We wanted a playground for the cinematographers.
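The numbers Nicholson quotes hang together with simple arithmetic. A sketch, assuming a uniform overlap between neighbouring cameras in the stitch – the 2% overlap figure is illustrative, not from the production:

```python
FT_TO_MM = 304.8  # feet to millimetres

def stitched_width(cams: int, cam_width_px: int, overlap_frac: float) -> int:
    """Horizontal pixels after stitching, assuming uniform overlap between neighbours."""
    step = cam_width_px * (1 - overlap_frac)
    return round(cam_width_px + (cams - 1) * step)

def signal_pitch_mm(signal_width_px: int, wall_width_ft: float) -> float:
    """Physical spacing implied by mapping the signal 1:1 across the wall width."""
    return wall_width_ft * FT_TO_MM / signal_width_px

print(stitched_width(5, 12288, 0.02))          # ≈ 60,457 px -- the '60K original background'
print(round(signal_pitch_mm(20_000, 165), 2))  # ≈ 2.51 mm per pixel of the 20K signal
```

Five 12K sensors comfortably yield a 60K-class stitch, and a 20K signal spread over 165ft implies roughly a 2.5mm spacing per signal pixel – comfortably in the territory of modern LED volume panels.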
“The water we shot had to be dead still, so it was steady on the wall,” Nicholson continues. “Thankfully, the resolution and stabilised rig made sure this was possible. After that, we added motion with Unreal Engine. You need perfect material going in; only when it’s redistributed do you start to play with it.”
With so much content to handle, six synchronised DaVinci Resolve platforms were necessary to drive this particular volume – modified to let the crew make full colour and focus adjustments on multiple 8K channels.
“Resolve gave us all the control we wanted,” Nicholson adds. “You’re essentially compositing onto a gigantic display. The creatives will ask, ‘Can you make the sky a little darker, vignette the sun some more, bring a moon in and put it over the left-hand corner of the wall?’ You’re editing on multiple screens at the same time. Once it’s done and looks great, it has to be played at 24 or 48fps. That was definitely a challenge.”
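The playback challenge is easy to quantify: a single uncompressed 8K channel alone runs to double-digit gigabits per second, and the volume carried several in sync. The pixel format below is an assumed example, not the production’s actual signal spec:

```python
def stream_gbps(width: int, height: int, fps: float, bits_per_pixel: int) -> float:
    """Uncompressed bandwidth of one video channel, in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# One 8K UHD channel at 24fps, assuming 10-bit 4:2:2 (20 bits per pixel).
print(round(stream_gbps(7680, 4320, 24, 20), 2))  # ≈ 15.93 Gbps per channel
```

At 48fps the figure doubles, which is why sustained, synchronised playback across multiple graded channels was the hard part rather than the grading itself.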
As for the missing pieces of the typical puzzle, Resolve was run on custom-built computers, and 14 Nvidia A6000 graphics cards were needed to drive the NEP Sweetwater wall itself. Yet another piece of hardware was relied upon for efficiency and reliability.
“We couldn’t have any downtime on set, but we had a backup, which was a series of 8K HyperDecks. Once we settled on a sequence everyone loved, the camera would repeat the same movements take after take. That meant we could switch to a linear workflow of the perfect rehearsal. We’d record on the multiple HyperDecks and – in sync – these distributed playback over the 20K wall. If a computer crashed, or the ship’s rigging blocked the tracking cameras, it wouldn’t affect the live wall.”
Full sail ahead
As with any aspect of filmmaking, we’re entirely at the mercy of the technology at hand. The cutting edge has its appeal, but walking that razor-thin line within a budget takes skill.
“Production is interesting when you look at it as complex problem-solving,” Nicholson muses. “A lot of the mysterious technical arts, if you will, rely on engineering and hardware capability. If you stepped on-set and had rendered some beautiful water, but it only lasted ten seconds because that’s all your machines could put out, it’s no good. It doesn’t matter how appealing the solution is if it doesn’t fulfil the project’s needs.
“We approached this Taika Waititi comedy differently than a drama, for example. In a drama, you have shorter takes. Things are storyboarded, and there’s a lot less improv. When you choose LED volume, you have to take into account the type of content you are shooting. I always feel the tech should not drive the show; the show should drive the tech.”
Long hours at sea and complex shooting methods were a necessity in this case, but they paid off. They allowed Our Flag Means Death to be created at all – that it was done in spectacular fashion came down to the expertise of creatives like Nicholson. He concludes with a broader look at the future of LED volumes.
“One day, we may do ICVFX exclusively with rendered images, or a much more viable hybrid fusion. But it isn’t about a camera test, purist philosophy or science experiment. It’s about real production.
“Right now, you can order 2000 panels to your doorstep at a day’s notice, but who’s going to drive it? What’s the content? That’s the stumbling block for a lot of shows right now. All that work has to be done in pre-production. What was a green screen post-production workflow is now flipped around. Our assets not only had to be captured, but stitched, stabilised, de-grained, distributed on a massive scale, then rehearsed. You have to make sure the director, DOP and other key parties have seen the material – it has to be worked into the creative pipeline. You wouldn’t build a physical set on the day principal photography begins. We’re building something here, too – it’s just much more complex.”
Series one of Our Flag Means Death is on HBO Max in the US, coming to UK screens soon. A second series is in the pipeline.
First published in the July 2022 issue of Definition.