
Nothing better than the real thing?

Posted on Sep 24, 2019

MPC’s animation aimed for the most realistic effect, but real filmmaking tools were needed to help this virtual production achieve greatness


Words Julian Mitchell / Pictures Disney

There was the movie Babe, with real animals talking via animated mouths. There was Avatar, with a few motion-captured aliens mixing it with real actors. Then Disney’s The Jungle Book turned the tables and reduced the human count to just one child in a photorealistic jungle world. Now, with The Lion King, we have everything as real as computer technology could muster circa 2017-2019, and the animals are still talking (and singing). So realistically, in fact, that one of the film’s VFX Supervisors mistook a render for a still from Kenya, where the crew spent some time referencing the real world before any virtual production started. Where do we go from here? It’s only going to get more realistic. At least when all the real lions have gone, we’ll have a memory of them singing and having fun.

For us, the interest is the element that’s been added to this new production of The Lion King: the virtual side, taking keyframe animation and handing it over to the acquisition experts to ‘real’ it up. The term ‘virtual production’ is a catch-all phrase, but this movie is perhaps closest to its true meaning: in striving for photoreality, the production knew that cinematography had to be a major part of the capture, and that people who shoot the real world for a living had to be involved.

ANIMATION

Once the camera shoot was completed and the voice performances recorded, the production shifted to the animation phase. For animation supervisor Andrew Jones, it was all about improving upon the past. “In terms of realism, I think this is a big step forward,” he says. “We achieved a certain level that I was quite happy with in The Jungle Book – but we wanted to push it even further in The Lion King. We wanted the animals more believable. We wanted to take a really beautiful story that everybody already loves and tell it in a new, unique way. It feels a bit more documentary style because you’re not anticipating everything the characters are going to do or possibly could do.”

Once character designs were approved, artists from MPC built each character within the computer, paying close attention to anatomy, proper proportions, fur or feathers – applying textures and colour, shading eyes and ensuring their movement was authentic to their real-life counterparts. New software tools were developed by MPC’s R&D teams, comprising more than 200 software engineers, to better simulate muscles, skin and fur.

March of the 600
Elliot Newman is a VFX Supervisor at MPC in London who had the job of animating The Lion King. After spending two years of his life working on the movie, he was in the mood for reflecting on this huge endeavour. “You do end up in a bubble and you can’t see it any more, but now is a good time to look back at how it was done. For instance, the virtual production aspect was something that was totally new to us, but the scale of it is impressive. There were no plates, it was all CG, so the closest thing to it was The Jungle Book – but it was still a different beast really. There were practices that we had refined since then, like how we broke down the sequences within the company, how review iterations worked and how we presented the work back to Disney.”

Fifteen years ago, MPC occupied just a couple of floors; for The Lion King, more than 600 people at the company touched the movie at some point. “There were plenty of people coming in and out while the movie was in production, depending on what skills they had. The movie was also delivered in IMAX and in stereo, so there were more dimensions to organise. It was creatively and technically challenging.”

The mainstream press has tried to encapsulate what the movie is, and photorealistic is the term that has become commonplace. But Newman thinks that striving for this level of reality opens new doors for the movie-making industry. “Everything we do is about how you mimic reality, photography and real light. We decided to have a film crew bring their knowledge of reality to the fore, if you like. Caleb Deschanel was our DOP and even though he was on a virtual stage with VR goggles and a virtual camera with a monitor on it being tracked in real time, he could easily have done near-impossible camera moves, because he had no constraints; he didn’t need to figure out how to make a crane big enough or a platform high enough to shoot where he wanted to.

“But he wouldn’t do it for the sake of it and always grounded himself in reality; as in, if you couldn’t have achieved that shot in real life, you shouldn’t do it in the virtual world. What we’re trying to achieve here is something that makes you think you’re looking at a real photograph. That’s often a pitfall with visual effects and computer animation; it’s too easy to do the impossible. Things become overworked very quickly. There’s a concept that ‘more is better’, but something that director Jon Favreau was often reminding us about is that when you look at real photography, sometimes it is boring, sometimes you haven’t got the perfect sky, sometimes the light isn’t ideal on the day but you just have to shoot. If you’re running out of light, perhaps you have to increase your ASA because you only have a window to capture something, especially when you’re on location.

“We would base the movie around the Arri Alexa 65 camera sensor so we knew we had to match it”

— ELLIOT NEWMAN, VFX SUPERVISOR MPC, LONDON

“In CG there’s this beautification pass that happens, and immediately you get taken out of the experience; it potentially breaks your reality. I think what’s special about this movie is that yes, there are a lot of design and compositional considerations on a shot-by-shot basis; we certainly made sure that when you’re looking at the images they look pleasing. We didn’t want people walking away thinking, ‘that was an ugly shot’. We made sure that there was a craft to the shots, not just scientifically going with values that made sense.

“We kept everything as grounded in reality as possible. For instance, if Simba is walking into a cave, behind him the savannah would probably be in bright sunlight and very blown out, so you’re struggling to expose for the interior of the cave. We could have thrown lots of lights into the cave, added extra bounce and kicker lights with maybe rim lights around everything, and exposed it all so it compensated for the background and didn’t blow out. But as soon as you do that you start entering the fantasy realm again, and I think that’s part of why the movie looks the way it looks. Yes, it’s pretty, but it’s not pushed; it doesn’t have that ‘overworked’ look.”
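The trade-off Newman describes can be put in rough numbers. Below is a minimal sketch, with invented scene values measured in stops relative to mid-grey and an assumed seven stops of highlight headroom (none of these figures come from the film): exposing for the dark cave interior inevitably pushes the sunlit background past the clip point.

```python
# Minimal sketch of the cave-exposure trade-off; all stop values are
# assumptions for illustration, not measurements from the production.

def exposed_value(scene_stops, exposure_offset, clip_stops=7.0):
    """Shift a scene value by an exposure offset (in stops) and clip
    anything beyond the sensor's assumed highlight headroom."""
    return min(scene_stops + exposure_offset, clip_stops)

cave_interior = -5.0    # stops below mid-grey (assumed)
sunlit_savannah = 6.0   # stops above mid-grey (assumed)

# Opening up five stops brings the cave interior to mid-grey...
print(exposed_value(cave_interior, 5.0))    # 0.0 -> interior readable
# ...but pushes the savannah to the clip point: blown out.
print(exposed_value(sunlit_savannah, 5.0))  # 7.0 -> highlights clipped
```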

It’s to the VFX world’s great credit that this reach for photorealism saw them also reaching out to the cinematography world. Deschanel actually shot the movie with his hands on the camera, with all the nuance his experience has given him. In fact, Newman thinks this marriage of CG and traditional acquisition might have created a new genre. “There is a new genre that potentially will come out of this, which is quite exciting for us. This is brand-new technology, and since The Lion King the technology we’re using for the virtual shoots has evolved even more. We’re already on a completely new revision of those tools, more and more filmmakers are interested in using them, and it’s definitely exciting. Even if you have a project that you want to shoot on a plate and add a visual effects character to, these tools will help you visualise those things, as opposed to visual effects just being considered a post process.”

Process too far?
Director Jon Favreau has said that The Lion King could have been completed without the input of a film crew; indeed, it would have been cheaper to go that way. But what would be the difference between an all-VFX crew and a traditional filmmaking crew being involved? Luckily, as the film sails past the half-a-billion-dollar mark in ticket sales, it’s not a pressing question.

“The virtual production helps introduce very experienced filmmakers to the process, rather than seeing all this in an animated movie,” Newman says. “You can bring in a crew that’s experienced at filmmaking, introduce them to a new set of tools and actually shoot the movie. It’s going to be pre-visualised in a real-time game engine, and you will be able to use a camera and put some lights in as if you’re there on location; you’re just using different tools. What’s special about the virtual production side is that you’re not taking the filmmaking process away from the filmmakers, it’s just using different techniques. It really only takes a day or two to adapt to these new tools, so you don’t have to know too much about VFX as you’re using familiar gear and terms.

VIRTUAL PRODUCTION GUIDE

The Virtual Production Field Guide has been published by Epic Games as a timely description of what virtual production is and what it could do for your production. Epic, of course, originally created the Unreal Engine to power its own game, Unreal. Now, as we know, game engines are being repurposed for the movie and television world thanks to their huge efficiency at manipulating polygons and triangles. But engines are just part of the virtual production world; the guide also describes how productions using new high-resolution LED screens are eschewing green screens in favour of live backgrounds, to speed up takes and give actors an idea of what they’re acting against. The guide also explains the collaborative aspect of VP, with high-quality imagery available at a much earlier stage thanks to the same game-engine efficiencies.

In the guide, you will learn new visualisation terms. Pitchvis is imagery created to help in-development projects earn a green light from a studio. Virtual scouting presents a completely digital version of a location or a proposed set that crew members can interact with. Techvis is the combination of virtual elements with real-world equipment for planning shots, as well as combining already-captured footage with virtual assets.

The Virtual Production Field Guide is available at www.unrealengine.com/en-US/feed as a free PDF download.

“Real-time engines will get more capable and what you’re seeing through your viewport will become more realistic,” he continues. “I know Deschanel adapted to it fairly quickly, mainly because the tools are designed around what you need. You can bring in your own gear and we will track it; if you’ve got a certain focus pull that you like to use, bring it in; if you’ve got a special fluid head, bring it in and we’ll encode it so we can capture it. A lot of filmmakers respond to that, because we’re here to offer a service and to help people make their films. We don’t want to be people who take the work away from you; it’s a collaboration. To do VFX properly it has to be involved from the start; it’s not just about people in dark rooms adding stuff at the end.”

Modelled cameras
As in most things in the professional media world, audio got there first. Decades ago, real instruments were modelled in samplers like the Fairlight; virtual instruments are now commonplace in the industry, even in apps on your smartphone. Virtual production is now doing the same for the real instruments of our industry. Look on IMDb and it will say that The Lion King was shot on the Arri Alexa 65; obviously it wasn’t, but the crew did take the cameras to Kenya to shoot reference footage they could emulate in the animation. “There was a location shoot in Kenya that Deschanel went on,” says Newman. “This gave us some Deschanel photography we could all look at, so we could learn his sensibilities and style – how does he like shooting things, what lens choices does he make? But it was also a great Alexa 65 reference for us, very rich and detailed, and it let us try to capture the essence of the plates we got from it.

“From that we said we would just base the movie on that camera, so the virtual camera Deschanel was operating was built around the same Alexa 65 sensor, with the same measurements; we knew we had to match the field of view the camera would have achieved with a particular lens. In post-production we would also simulate the correct depth-of-field: if it was a T2.8 shot, for instance, that would be the depth-of-field on the virtual lens.
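To illustrate what matching that sensor implies, here is a small sketch (ours, not anything MPC used) applying the standard optics formulas: the horizontal field of view a given lens produces on the Alexa 65’s 54.12mm-wide open-gate sensor, and the near/far limits of a T2.8 shot. The 0.05mm circle of confusion is an assumed value for illustration, not a production figure.

```python
import math

SENSOR_WIDTH_MM = 54.12  # Alexa 65 open-gate sensor width

def horizontal_fov_deg(focal_mm, sensor_width_mm=SENSOR_WIDTH_MM):
    """Horizontal field of view from the standard pinhole relation."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def dof_limits_mm(focal_mm, t_stop, focus_mm, coc_mm=0.05):
    """Near/far limits of acceptable sharpness via the hyperfocal formula;
    the circle of confusion (coc_mm) is an assumed value."""
    hyperfocal = focal_mm ** 2 / (t_stop * coc_mm) + focal_mm
    near = hyperfocal * focus_mm / (hyperfocal + (focus_mm - focal_mm))
    if focus_mm >= hyperfocal:
        return near, math.inf
    far = hyperfocal * focus_mm / (hyperfocal - (focus_mm - focal_mm))
    return near, far

print(f"50mm on the 65: {horizontal_fov_deg(50):.1f} deg horizontal FOV")  # ~56.8
near, far = dof_limits_mm(focal_mm=50, t_stop=2.8, focus_mm=3000)
print(f"T2.8, focus at 3m: sharp from {near/1000:.2f}m to {far/1000:.2f}m")  # ~2.58m to ~3.59m
```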

“We also did a calibration shoot with the Alexa 65, so we mapped its noise characteristics, dynamic range, the quality of the lenses, bokeh and so on. Even though the movie was rendered, we did our best to make it feel like it was shot on the 65, which was our reference camera.”
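As a hedged sketch of one way such calibration data gets used, the snippet below overlays signal-dependent sensor noise on a clean render so it feels photographed. The gain and read-noise values here are placeholders, not measurements from the production’s actual Alexa 65 calibration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def apply_sensor_noise(linear_rgb, gain=0.004, read_noise=0.002):
    """Shot-noise variance scales with signal level (Poisson-like); read
    noise is constant. Both parameters would be fitted from calibration
    charts; the defaults here are invented."""
    sigma = np.sqrt(np.clip(linear_rgb, 0.0, None) * gain + read_noise ** 2)
    return np.clip(linear_rgb + rng.normal(0.0, sigma), 0.0, None)

clean = np.full((4, 4, 3), 0.18)     # a flat mid-grey patch
noisy = apply_sensor_noise(clean)
print(round(float(noisy.std()), 4))  # small, signal-dependent grain
```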

The team also went to Disney’s Animal Kingdom in Florida, where director Favreau’s instruction was not to put any animals into a scanner or a photo booth. “He never wanted that to be a part of the making of this movie; everything we captured from real life we did from a distance with long lenses. We were able to take a slightly more technical approach at the Animal Kingdom because it’s obviously a park, where we had safe distances to set up multiple cameras with multiple vantage points to study the animals. It’s always better to work from real references and it’s always more successful, as there are so many subtle nuances available in real life: weight distribution, general animal behaviour, how much they flick their ears, how alive they seem in terms of each animal’s connection to the others, that kind of thing.”

Production starts
Following the team’s extensive research trip, Favreau set up production of The Lion King inside an unmarked, purpose-built facility in Playa Vista, California, an area that has been recently nicknamed Silicon Beach for its gaming and high-tech industry. The facility was large enough to house everything under one roof, including a virtual-reality volume.

With two state-of-the-art screening rooms, dubbed the Simba and Nala theatres, the Los Angeles team was able to interact in real time with the MPC Film team in London to collaborate on animation review and visual effects. Says Favreau, “On The Jungle Book, I was bouncing around to different facilities, and it was difficult. So, we concentrated everything and used the technology as a foundation to allow us the freedom to more efficiently use our time and be in closer contact with people that we collaborated with in other locations. That is also where we had our blackbox theatre to record our performances in the same room we used as our volume, where we scouted and shot the film. We had different VR systems and a dozen different VR stations around the bullpen. We wanted to make it feel more like a tech company than a movie studio, so we created a campus environment. We had food trucks pull up for the crew out front, or I would be cooking upstairs.”

OptiTrack cameras were used to help track the cinematographer’s moves

Producer Karen Gilchrist says that the production itself mirrored live-action filmmaking. “It very much felt like a traditional film,” she says. “We had a call sheet. We had an AD. We had a DOP who worked wheels. We had a dolly. We had a Steadicam. Even though the art and the production design were driven by a video-game engine, we had an art department and a script supervisor. We had video playback. Other than not having to wake up at five in the morning and drive to a new location or worry about the weather, it very much felt like a live-action set.”

Virtual Production
Everything that is being seen on screen was created in the computer, but it is anything but traditional animation. Favreau explains, “Where we departed from animation – beyond the photoreal look – was, at the point when you would normally operate the cameras in layout on a computer, we stopped the process and brought the entire film into VR and let our live-action crew actually set up real camera equipment.”

The virtual camera system that replicates camera moves in virtual space

VFX Supervisor Rob Legato says the unique approach is groundbreaking. “People are studying animal reference and the animators are breathing life into these digital rigs. So, we’re taking an antiseptic digital medium and telling one of the most emotional stories in our tradition using these tools. That dichotomy and underlying tension creates a lot of creative opportunities. This is as close to practical filmmaking as you get with an animated film.”

Filmmakers kicked off production with a pre-visualisation (pre-viz) phase commonly used in animated filmmaking. Animation supervisor Andrew Jones and the team of artists created simplified animated sequences so that they could run in real time in VR. These early versions of environments and characters were brought into the Unity game engine. Favreau says, “Instead of watching it play on the computer screen, we could go into the environment and stand next to an animated lion.”

According to the director, the virtual production employed in The Lion King is an extension of what they did on The Jungle Book. Favreau and his team were able to don VR headsets and walk around within the virtual set, setting up shots, choreographing movements, and adjusting lighting, characters and set pieces in real time before sending the version of each scene to editorial.

Favreau explains, “With The Lion King, we are literally putting filmmakers inside the monitor, using a set of proprietary tools interfaced with the HTC Vive virtual reality system and Unity game engine.”

Magnopus
Ben Grossman works with Magnopus, a company that helped bring technologies, hardware and software together to create a platform for game-engine-based virtual reality filmmaking: in effect, a multiplayer filmmaking game. “Since the advent of digital effects, filmmakers have struggled to bring those visuals to the stage to see the complete image in context,” says Grossman. “Avatar brought a small window to the stage, allowing the filmmakers to peek inside the world they were creating. The Lion King turns that on its head by putting the filmmakers – and the gear they have used for decades – completely inside the world they are building for the film.”

A world spanning hundreds of miles was constructed in the game engine. “Physical devices are custom built, and traditional cinema gear modified, to allow filmmakers to ‘touch’ their equipment – cameras, cranes, dollies – while in VR to let them use the skills they’ve built up for decades on live-action sets,” adds Grossman. “They don’t have to point at a computer monitor over an operator’s shoulder any more; the most sophisticated next-gen technology is approachable to any filmmaker who’s ever been on a traditional set.”

According to Favreau, the idea behind incorporating live-action language into the film was to convince audiences that what they’re seeing is authentic. “My generation – people who grew up with video games – is very sensitive to photography and shots that look like they’re entirely digital,” he says. “You can sense the difference between a visual effect that was added to a real live-action plate and one that was built entirely in a computer. How do you make it look like it was filmed? The way shots are designed when they’re digital is much more efficiently done. The camera move is planned ahead of time. The cut points, the edit points, the performance, the camera moves – all that stuff is meticulous and perfect. But that perfection leads to a feeling that it’s artificial. Not every generation of filmmaker is sensitive to this. I find my peer group has the same standard where we want it to feel like something that was photographed, so instead of designing a camera move as you would in pre-viz on a computer, we lay dolly track down in the virtual environment.

“And so, even though the sensor is the size of a hockey puck, we built it onto a real dolly and a real dolly track,” continues Favreau. “And we have a real dolly grip pushing it that is then interacting with Caleb Deschanel, our cinematographer, who is working real wheels that encode that data and move the camera in virtual space. There are a lot of little idiosyncrasies that occur that you would never have the wherewithal to include in a digital shot. That goes for the crane work, and it also goes for flying shots.”
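A minimal sketch of the idea Favreau describes, with invented encoder resolution and gearing (the production’s actual tracking and encoding pipeline is proprietary): per-frame tick deltas streamed from the wheels and dolly encoders accumulate into the virtual camera’s transform, so every human hesitation on the hardware survives into the data.

```python
from dataclasses import dataclass

TICKS_PER_REV = 4096       # assumed encoder resolution
PAN_DEG_PER_REV = 90.0     # assumed wheel-to-pan gearing
DOLLY_MM_PER_TICK = 0.1    # assumed dolly encoder scale

@dataclass
class VirtualCamera:
    pan_deg: float = 0.0
    tilt_deg: float = 0.0
    dolly_mm: float = 0.0

    def apply_encoder_deltas(self, pan_ticks, tilt_ticks, dolly_ticks):
        # Relative tick counts from the hardware become camera motion,
        # preserving the operator's hesitations and corrections.
        self.pan_deg += pan_ticks / TICKS_PER_REV * PAN_DEG_PER_REV
        self.tilt_deg += tilt_ticks / TICKS_PER_REV * PAN_DEG_PER_REV
        self.dolly_mm += dolly_ticks * DOLLY_MM_PER_TICK

cam = VirtualCamera()
cam.apply_encoder_deltas(pan_ticks=512, tilt_ticks=-128, dolly_ticks=2000)
print(cam)  # pan 11.25 deg, tilt -2.8125 deg, dolly 200.0mm
```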

Virtual helicopter
Favreau was the designated virtual helicopter operator on the crew. “We also developed new rigs for something that emulates a Steadicam and something that emulates a handheld by having the proper weighting and balance on this equipment,” he says.

Legato says, “In real photography, the cinematographer can tell which camera operator operated a shot when you get into the nuance of watching dailies. We want to inherit all those happy accidents, all those human idiosyncrasies. How do you infuse emotion and humanity? It comes from the humanity of the people operating the equipment.”

Although Deschanel had never shot a film created totally within the computer, his live-action experience was exactly what the project required. “My experience in photography is capturing images of real things happening,” he says. “In a way, my job is to preserve the reality of what normally goes on in front of the camera – to understand what light does and how the camera behaves.

“When you’re filming wild animals, obviously you have no idea what they’re going to do,” he continues. “In order to preserve that reality for the animals that we created within the computer, we wanted to create that feeling that the camera operator is surprised at what they’re doing. The performance is different than what might have been expected, and that creates a wonderful jolt of excitement and understanding of the character.”

According to Deschanel, the trip to Africa garnered footage that would later help artists create authentic characters, and it also helped guide camera movement that would mirror the real world. “There were times when I was following an animal and it would fool me. I’d make mistakes. Those elements later became part of the structure of how we made the movie.”

Favreau says, “Generally, with the higher tech films, they would use motion capture for the performances and then work the cameras with essentially digital tools because that gives you maximum freedom. But we didn’t capture the performances because it’s all animals and is key-framed. We captured the camera movement. We’re putting all of our work into capturing the camera data and showing that the virtual camera is being driven by humans while allowing the naturalism of the performances to come from the artistry of the animators.”

The data obtained during the virtual production was utilised by the animation team. Scenes and recordings were exported to editorial as video files, and to visual effects as data files that gave clear direction to the visual effects crews around the world who crafted the film’s photoreal aesthetic. Preserving the invisible hand of the filmmakers throughout maintained the film’s live-action style.
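To make that handoff concrete, here is a hedged sketch of what such a camera data file could look like; the field names, values and structure are purely illustrative, not the production’s actual format.

```python
import json

# Illustrative only: per-frame camera data captured on the virtual stage,
# serialised for downstream VFX teams. All names and values are invented.

take = {
    "scene": "hypothetical_scene",   # placeholder name
    "fps": 24,
    "virtual_sensor": "alexa65",
    "frames": [
        {"frame": 1001,
         "translate_mm": [0.0, 1200.0, -3500.0],
         "rotate_deg": [2.0, 14.5, 0.0],
         "focal_mm": 50.0,
         "t_stop": 2.8},
        # ...one record per frame of the captured move
    ],
}

with open("take_camera_data.json", "w") as f:
    json.dump(take, f, indent=2)
```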
