Beyond the stars: Our Universe
Posted on May 5, 2023
The new Netflix series Our Universe explores the fascinating relationship between Earth and the cosmos
WORDS Robert Shepherd | IMAGES Netflix & Stephen Cooter
When Morgan Freeman lends his voice to something, it’s sure to be a hit. Having played the character Red, who narrates The Shawshank Redemption (1994), he went on to become the voice behind several documentaries and TV series, including Cosmic Voyage (1996), Slavery and the Making of America (2004), March of the Penguins (2005) and Breaking the Taboo (2011). Freeman also hosted and narrated Through the Wormhole from 2010 to 2017.
His latest project is Our Universe, a six-part nature documentary made by BBC Studios and Netflix. So, with one of the best-known voices in the business secured, the team was charged with making it visually spectacular.
Paul Silcox, VFX director at Lux Aeterna and VFX supervisor for the series, explains that one of the most significant technological challenges was processing huge amounts of space data into the VFX pipeline. The team worked with Durham University’s Institute for Computational Cosmology, processing simulation data to show the formation of Earth’s moon on screen.
“It took four weeks to process a massive 30TB of information into the various formats needed for the show,” he describes. “With 100 million points per frame, 1.4 trillion data points were processed for just one scene. We had to change quite a few things about the way we work to manage that pipeline, including turning to the cloud and using AWS to scale up our rendering output. However, the result of upscaling our efforts means we now have a robust and powerful pipeline to work on even larger projects in the future.”
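Taken at face value, those figures give a sense of the render’s scale: 1.4 trillion points divided by 100 million points per frame works out at roughly 14,000 frames for that single sequence.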
To get the desired look, the team worked mostly in SideFX’s Houdini, alongside Maya and Nuke, for the 700 shots it produced for Our Universe. The VFX team also developed in-house tools to deal with project-specific challenges.
“ShotGrid and Deadline managed the render of all our VFX and have become a permanent part of our pipeline,” adds Rob Hifle, Lux Aeterna creative director. “It’s because of the power and flexibility of these tools, combined with the talent of our 25 artists, that we delivered the project on time and budget, keeping open lines of communication with the production companies to ensure everyone was satisfied with the final shots.”
Early in development, director and producer Stephen Cooter had extensive conversations with showrunner Mike Davis and Netflix to define the look.
“The show is pretty unique – combining natural history with VFX to tell animals’ stories in the context of both planet Earth and the wider universe itself,” Cooter explains. “We shot in a widescreen 2.39:1 ratio, and although we left the final decision to individual DOPs, our go-to package was Atlas Orion series anamorphic lenses with the Red Gemini to get the cinematic look that the epic nature of the storytelling required. The choice of anamorphic lenses tied the two strands of the show together – the intimate ‘character’ moments with the animals, and the huge cinematic space shots – to illustrate that everything you see in the show is connected. It is these cinematic aesthetics – taking references from the science-fiction films of Steven Spielberg and JJ Abrams – that demonstrate this wasn’t a standard space or wildlife documentary show.”
The team needed to meet Netflix’s delivery requirements: a minimum of true 4K delivery, shot with a 2x anamorphic squeeze. Dale Bremner, director of photography on Our Universe, explains that, given both the environmental and physical constraints, he knew multiple camera and lens combinations would be required to achieve this. The BBC’s original desire was to shoot on the Red Helium with Atlas Orion series lenses, though he opted for the Panavision G Series combined with a DXL2 and Sony Venice package.
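In broad terms, a 2x anamorphic lens squeezes the horizontal field of view by a factor of two onto the sensor; the footage is then stretched back out in post, so the show’s 2.39:1 delivery frame corresponds to a captured image of roughly 1.2:1 (2.39 ÷ 2 ≈ 1.2).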
“Some key technical hurdles that present themselves when shooting anamorphically are: vertical and horizontal resolution loss, horizon barrel distortion, smaller aperture range, inconsistent weight and front element dimensions, undesirable lens breathing and a lack of minimum focus lengths,” Bremner says. “Considering all these factors, I needed a base set of anamorphics that held up in low light, with minimal distortion, a uniform casing, a realistic minimum focus range and a balanced visual consistency. Personally, I find when shooting the ocean’s horizon line, any skew feels amplified, so any deep distortion needs to be minimised as well. It’s almost as if we’re subconsciously conditioned to see the ocean within a perfectly horizontal plane; anything else feels completely off.”
Bremner explains that Panavision’s G Series anamorphics ticked these boxes perfectly, so he paired them with the Sony Venice, chosen for its resolution, high-ISO capabilities and internal NDs, for the majority of the on-land sequences.
“When the G Series were pushed beyond their low-light and minimum focus limits, I supplemented them with Panavision Primo sphericals to match the G Series aesthetically,” he adds. “A Panavision Super Macro 90mm was used to capture hero shots of our baby turtles emerging for the first time, as dioptres can distort and amplify resolution loss.”
This article appears in the May 2023 issue of Definition.