
Leaving the Valley

The Polar Express and A Christmas Carol paved the way for director Robert Zemeckis’ latest feature, in which life-like dolls bring the story to life.

Words: Julian Mitchell / Pictures: Universal Pictures

Robert Zemeckis’ track record speaks for itself: a whole host of films in which he has pushed technology to the limits. It’s safe to say he has become the doyen of movie motion capture with films such as The Polar Express, A Christmas Carol and Beowulf. And don’t forget classic and more culturally significant films like the Back to the Future series, Forrest Gump, Contact and Who Framed Roger Rabbit – all VFX-heavy movies, but not totally reliant on motion capture (mocap).

Mocap has taken us close to ‘re-skinning’ someone’s movements, applying the expression and personality of one person to someone or something else. But there’s a fly in the ointment – recreating the human face, something Robert was determined to nail for his new film Welcome To Marwen. The vacant, expressionless look on some motion-captured faces even has its own moniker: ‘uncanny valley’.

“What if we do it the other way around? What if we make digital dolls and glue the actors’ faces on those? So, how do you do that…?”

My Hero

Into this world of revolutionary motion capture came DOP C. Kim Miles, whose IMDb entry is littered with episodics but not so many features. Luckily for Kim, Robert Zemeckis’ children were avid watchers of one of those episodics, The Flash, and Robert soon arranged a meeting to talk about a possible feature film.

“I was doing The Flash in 2017 when my agent called and said that Robert Zemeckis’ office had called and wondered if I was available and interested in doing a movie. Bob’s been a hero of mine my whole career; I’ve looked up to his work and his shot design, and have always tried to create shots that told stories like his.

“My agent encouraged me not to count on anything, and said he would call back if he heard anything more. About 90 minutes later he calls me back and says, ‘They want you to look at a script tonight and also take a phone call tonight’. As I was on-set, we arranged a phone call for the following morning, when Steve Starkey, Bob’s longtime producer, called and basically pitched me the movie.

“I don’t think there was anything he could have pitched me that would have resulted in me saying ‘no’ – so I said ‘yes’, and Steve told me that Bob was anxious to meet me and show me some camera tests that had been done. I flew down to have lunch with Bob and talk about the movie.

“Before Steve left the conversation, I did ask how Bob found me and chose me for the movie. He said that Bob’s kids watched The Flash and he had noticed my work. From that he chased me down to see what I was up to.”

For Kim, this whole episode was a re-affirmation that the Hollywood dream is alive and well – all you have to do is keep working at it and someone will notice. “I nearly quit The Flash after the second season, but my agent advised me to keep going; ironically it was during the third season that Bob noticed my work.”

Robert Zemeckis pushes technological boundaries, breathing life into his inanimate creations.

Towards the Valley

Kim was told upfront that the movie would use a large element of motion capture, but not what the methodology would be. “We shot a load of camera tests to decide what that methodology would be. The first thing that VFX supervisor Kevin Baillie and Bob tried was the idea of just shooting actors in costume and then turning them into dolls, or cladding them as dolls, in visual effects. What they found from that was you couldn’t convincingly create plasticity out of a real human, because your body just doesn’t move the same way, with the same restrictions that a plastic doll has with the joints and the sockets.

“Bob didn’t want to do motion capture from the start, as he had been down that road with films like The Polar Express and Beowulf, when he just about invented the art. What they had found was that a certain portion of the audience had an emotional disconnect from the characters because they were fully computer generated, and the subtleties of human emotion created by the many muscles around your mouth and eyes are almost impossible to recreate. Bob wanted to avoid going down into the uncanny valley yet again.

“Then someone said, ‘What if we do it the other way around? What if we make digital dolls and glue the actors’ faces on to those?’ So, how do you do that? Do you do motion capture and then facial arrays, which would mean the actors acting the scenes while multiple cameras shot all the different angles? Or what if we did motion capture, and simultaneously lit and photographed their faces? That’s what we tested, and it seemed to work.”

“Bob and I would go into these virtual rooms and look at monitors while we moved cameras around the space.”

Framestore’s Marwen

Framestore delivered one of the film’s key sequences, with VFX supervisor Romain Arnoux overseeing close to 100 shots.

Kevin Baillie, the film’s overall VFX supervisor, engaged Framestore to create the film’s first scene, creating a dramatic plane crash and bringing Marwen’s inhabitants to life. “I was seduced, because I knew there was no pipeline that would do this. I had never seen a project like this before,” says Romain.

Framestore’s pipeline consisted of animation files going straight to lighting, where artists extrapolated from the setups used on the motion capture stage to create credible exterior lighting that matched the live action. Shots then went to compositing, where the team performed de-aging and completed the integration of the live-action and CG components. Framestore’s pipeline was also tweaked to allow back-and-forth between the tracking and compositing departments, ensuring high-quality tracking.

Framestore used advanced motion-capture technology to turn the film’s stars into realistic-looking dolls. At the beginning of the film, Zemeckis wants the audience to be tricked – he doesn’t want them to think the dolls are actually being filmed, or that the story is taking place in a doll-like world. To create the perfect hybrid, Framestore’s artists projected 75% of the actors’ faces onto their respective dolls. As soon as the plane crashes, the audience is introduced to a fully doll-like world, where only Steve Carell’s mouth, eyes and part of the chin are preserved to achieve the desired plasticised look.

“Because we only had a texture from one camera point-of-view to project on the doll, we had to be perfectly aligned,” Arnoux noted. “There was always some slight variation between the doll’s face and the actor’s face. We used custom tools to calculate the disparities and realign the face, but it wasn’t easy – most of the time, the tracking team had to do it by eye.”
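To make the alignment problem concrete: with the actor’s face photographed from a single camera, that texture has to be re-projected onto the doll geometry through the same camera, and any residual offset between where the doll’s face renders and where the actor’s face actually sits in frame has to be measured and corrected. The sketch below is only an illustration of that disparity calculation, using a simple pinhole projection and invented landmark data – not Framestore’s actual tools.

```python
import numpy as np

def project(points_3d, K, R, t):
    """Pinhole projection of Nx3 world-space points into 2D pixel coordinates."""
    cam = R @ points_3d.T + t.reshape(3, 1)   # world -> camera space
    uv = K @ cam                               # camera space -> image plane
    return (uv[:2] / uv[2]).T                  # perspective divide -> Nx2 pixels

# Assumed camera intrinsics and pose, plus hypothetical facial landmarks:
# 3D points from the doll asset and the matching 2D points tracked on the plate.
K = np.array([[4000.0, 0.0, 2048.0],
              [0.0, 4000.0, 1152.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
doll_landmarks_3d = np.random.rand(20, 3) + np.array([0.0, 0.0, 5.0])
actor_landmarks_2d = project(doll_landmarks_3d, K, R, t) + np.array([3.0, -1.5])

# Measure the disparity between where the doll's face lands in frame and where
# the actor's face was actually photographed, then shift the projection to match.
rendered_2d = project(doll_landmarks_3d, K, R, t)
disparity = (actor_landmarks_2d - rendered_2d).mean(axis=0)
realigned_2d = rendered_2d + disparity
print("mean 2D offset (pixels):", disparity)
```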

IMAGES: The build of a shot from Framestore.

Lighting

They tested two ways of lighting: flat lighting – so VFX could create whatever lighting direction they wanted – and more cinematic lighting, as though the characters were in situ. “The tests proved that lighting them as though they were in situ was the much better way to go; it was much less artificial. So fast-forwarding to the days when we shot the motion capture, we physically shot the actors’ faces with real cameras and created shots within the motion capture space, while the motion capture equipment was recording the movements.

“The way we got around the lighting situation on their faces was to very heavily prep the motion capture. Before we even started shooting the movie, Bob and I would spend weekends with the visual department to create all our doll environments in the Unreal video game engine. They had these 3D environments with 3D characters and 3D representations of the camera, which was mathematically matched to an ARRI Alexa 65 in terms of sensor size and focal lengths. Bob and I would go into these virtual rooms and look at monitors while we moved this virtual camera around the space and created the shots – or at least the general blocking – and figured out how we wanted to play each scene. What that did was give me enough information to say, if the scene was going to play out like this, then we would want to put our sun on this side of the set and put our fill on this side of the set.

“That gave us a way to plan for our lighting, so I lit the Unreal virtual sets, then used all of our lighting direction and lighting quality information from some of those pre-visualisations to go to our motion capture stage and recreate them with physical lights. So instead of the traditional way of us lighting something and then VFX matching it, VFX lit it with my guidance and then it was us matching the lighting on-set. It seems to have worked; we stayed true to everything that we planned, and with everything so extensively prepped we had extra parameters to work with, like shadows from door frames and the like. So we were able to create those elements within the motion capture stage and have the actors interact with their set, so to speak, even if there wasn’t any set.”
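Matching a game-engine camera “mathematically” to a physical one largely comes down to sensor geometry: given the sensor dimensions and a chosen focal length, the virtual camera’s field of view follows directly. The sketch below shows that calculation; the Alexa 65 open-gate sensor dimensions used here (roughly 54.12 x 25.58mm) are an assumption for illustration, not figures supplied by the production.

```python
import math

def fov_degrees(sensor_mm: float, focal_length_mm: float) -> float:
    """Angle of view for a given sensor dimension and focal length (pinhole model)."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_length_mm)))

# Approximate ARRI Alexa 65 open-gate sensor dimensions (assumed values).
SENSOR_W_MM, SENSOR_H_MM = 54.12, 25.58

for focal in (35, 50, 80, 100):  # example focal lengths in mm
    h = fov_degrees(SENSOR_W_MM, focal)
    v = fov_degrees(SENSOR_H_MM, focal)
    print(f"{focal}mm lens -> horizontal FOV {h:.1f} deg, vertical FOV {v:.1f} deg")
```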

Game Industry Disconnect

For the first couple of days of working like this, Kim admits to a kind of disconnect between his world and the video game world of Profile Studios (now Method Studios). “I had a hard time articulating, saying for instance that I needed a hard light over here and a really diffused fill light over here. It took a bit of translating to get on the same page, so what they very graciously did was to talk with our electrical department and get photometric data from most of our lighting package, which they then input into their system. Ryan, the lighting designer at Method, then created an iPad app for me so I could sit in the video game room with them and turn dials in the app to move the position of the sun, raise and lower it, and change its intensity and quality, including the colour.

“So everything was at my fingertips in terms of designing the lighting in each of the sets – it was supremely helpful in communicating how we wanted it to look. In effect, they matched their library of lighting tools to our physical sources. As in most things that Bob does, the whole process is kind of groundbreaking while being quite expensive at the same time.”
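Translating a fixture’s photometric data into a virtual light is, at its simplest, a matter of mapping a few measured numbers – output at a known distance, beam spread and colour temperature – onto the engine’s own parameters. The sketch below illustrates that idea with invented fixture values; it is not the Profile/Method tooling, and the unit handling is deliberately simplified.

```python
from dataclasses import dataclass

@dataclass
class Fixture:
    name: str
    lux_at_3m: float       # measured illuminance at 3 metres (photometric data)
    beam_angle_deg: float  # spread of the fixture
    cct_kelvin: float      # colour temperature

def to_virtual_light(fixture: Fixture) -> dict:
    """Convert measured photometrics into generic virtual-light parameters."""
    # Inverse-square law: intensity in candela = lux * distance^2.
    candela = fixture.lux_at_3m * 3.0 ** 2
    return {
        "name": fixture.name,
        "intensity_cd": candela,
        "cone_angle_deg": fixture.beam_angle_deg,
        "colour_temperature_K": fixture.cct_kelvin,
    }

# Invented example values for a hypothetical key light in the package.
key = Fixture(name="key_18K_fresnel", lux_at_3m=12000.0,
              beam_angle_deg=55.0, cct_kelvin=5600.0)
print(to_virtual_light(key))
```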

Kim talks about how symbiotic the relationship was between the physical set and the VFX department during the production of the film, which in his experience isn’t usual. “The disconnect between the two departments is entirely unnecessary as everyone is marching towards the same goal. But on this movie, Kevin Baillie, the VFX supervisor who is now a really good friend of mine, was such a great motivator and has such a great understanding of the physical filmmaking side of things that our conversations were much easier than they usually are between myself and VFX supervisors.”

Infrared Block

Kim had never experienced motion capture before, but this was no normal capture: “In a way it was like we invaded the sanctity of the motion capture space – they had a space that was roughly a 60 by 40 by 30 feet cuboid, which was rigged with 30 or 40 motion capture cameras. We then surrounded that volume with bluescreen and blacks, so we could use blue when we needed to and black when we needed to. The first thing we discovered we had to adapt to was that the motion capture cameras were infrared sensitive, which is how they get their information from the tracking markers. But anytime we turned on too much lighting trying to physically light the actors, our lights would emit so much infrared that it would overwhelm the motion capture cameras.

“So, in many ways we had to adapt our lighting methodology to reduce the amount of infrared and still achieve the design we were hoping for. It took a little doing, but with a little bit of give and take we got it done. The other big difference was that we wanted to shoot the faces with our cameras how we normally would with a Technocrane, for instance, so the art department gave us some rudimentary wireframe set decoration shapes for doorframes and stuff like that for our actors to act with. So we were able to put our cameras into the set.

“The motion capture guys were able to equip our cameras with what we called little ‘sputnik’ devices, which were motion capture tracking markers that we put on the top and bottom of our cameras, so their motion capture cameras could keep track of where our physical cameras were going. They could then use that information to position their virtual cameras in relation to the virtual actors. We tried to give them as much information as possible, as early as possible.”
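Conceptually, once the mocap system can see markers on the physical camera body, it can solve a position and orientation for that camera every frame, and the same rigid transform then drives the virtual camera in the doll world. The sketch below is a bare-bones illustration of that hand-off, using made-up marker positions and a deliberately crude pose solve rather than any real tracking solver.

```python
import numpy as np

def rigid_transform_from_markers(markers_xyz: np.ndarray):
    """Very rough pose estimate: centroid as position, first two markers define a forward axis."""
    position = markers_xyz.mean(axis=0)
    forward = markers_xyz[1] - markers_xyz[0]
    forward /= np.linalg.norm(forward)
    up = np.array([0.0, 0.0, 1.0])
    right = np.cross(up, forward)
    right /= np.linalg.norm(right)
    true_up = np.cross(forward, right)
    rotation = np.column_stack([right, true_up, forward])  # 3x3 orientation matrix
    return position, rotation

# Made-up 'sputnik' marker positions captured for one frame (metres, stage space).
frame_markers = np.array([[1.00, 2.00, 1.50],
                          [1.10, 2.05, 1.52],
                          [0.95, 1.95, 1.62]])

pos, rot = rigid_transform_from_markers(frame_markers)
# The same transform would be applied to the virtual camera in the CG scene each frame.
print("virtual camera position:", pos)
print("virtual camera orientation:\n", rot)
```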

Steve Carell and his miniature co-stars spend time in a world where fantasy and reality collide…

46 – the number of minutes of doll VFX in the movie

60 – the number of days the doll alter egos were in the mocap studios

Robert Zemeckis directs with two ARRI Alexa 65 cameras for Welcome To Marwen.

Choosing the Alexa 65

It was clear from the outset that digital capture was going to be the only choice for principal photography, and Robert Zemeckis had previously used the Red Weapon cameras on several of his movies.

“At our first meeting, Bob asked me what my feeling was about camera choice. I had been used to using ARRI digital cameras, but at the time the new Panavision DXL system had just come out, so we decided to test that, and we also decided to test the Alexa 65.

“The reason we went with the larger-sensor Alexa 65 – apart from the pixel density, which gives an abundance of information for visual effects – was because we were creating this world in exactly the reverse way we shoot traditional miniatures, and by that I mean you’re doing everything you can to make the small object look bigger. We had the opposite problem: we had to shoot full-scale actors in full-scale scenarios in a way that looked like we were photographing miniatures. So the biggest thing for me was depth-of-field.

“The more depth-of-field you have, the less illusion you have that you’re photographing miniatures. So we picked the largest sensor we could find to at least give us a starting point for visual effects, to artificially shallow the depth-of-field – the 65 is such a superior sensor to the DXL, both in physical size and the quality of the picture, not to mention the simpler Codex workflow that it works with. We ended up shooting with the Prime 65 and Vintage 65 lenses from ARRI.

“The Alexa 65 is such a beautiful system and it’s why people shoot images in medium format; there’s a lustre to it that you don’t get with a smaller sensor. It’s also so forgiving on cast members’ features, and there’s just enough fall-off in the depth-of-field that the complexions even out.”
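The depth-of-field reasoning is easy to put numbers on: for the same framing and f-stop, a larger sensor needs a longer focal length, which shrinks the depth of field – and a shallow depth of field is exactly what makes full-scale photography read as miniature. The comparison below uses the standard depth-of-field approximation with assumed circle-of-confusion and focal-length values, purely as an illustration of the principle.

```python
def approx_dof_mm(f_number: float, coc_mm: float, focus_dist_mm: float, focal_mm: float) -> float:
    """Approximate total depth of field (valid when the focus distance is well inside hyperfocal)."""
    return 2 * f_number * coc_mm * focus_dist_mm ** 2 / focal_mm ** 2

FOCUS_DIST_MM = 3000.0  # subject at 3 metres
F_NUMBER = 2.8

# Assumed circles of confusion and focal lengths for roughly equivalent framing:
# a Super 35-sized sensor with a 50mm lens vs a 65-format sensor needing
# around a 100mm lens for a similar field of view.
super35 = approx_dof_mm(F_NUMBER, coc_mm=0.025, focus_dist_mm=FOCUS_DIST_MM, focal_mm=50.0)
large65 = approx_dof_mm(F_NUMBER, coc_mm=0.050, focus_dist_mm=FOCUS_DIST_MM, focal_mm=100.0)

print(f"Super 35-ish setup: ~{super35:.0f} mm of depth of field")
print(f"65-format setup:    ~{large65:.0f} mm of depth of field (noticeably shallower)")
```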

“We created a hybrid asset. The geometry is based on the one-sixth scale plane, but we used a lot of full-scale textures.”


Digital P-40 

A key moment in the movie was Hogie’s dramatic plane crash, which saw Framestore develop a digital P-40 warplane (based on photos of Creation Consultants’ miniature) and build the plane’s cockpit from scratch. “We created a hybrid asset,” says Arnoux. “The geometry is based on the one-sixth scale plane, but we used a lot of full-scale textures. The animators recreated the plane as if it were a full-size aircraft.”

Flying at 250mph, the plane covered a lot of ground during the film’s aerial opening sequence, where roughly 20 miles of northern European landscape was laid out in full CG. The effects team filled the sky with scattered debris emanating from the plane, which was also simulated at full scale. When the aircraft crashes into the ground, the crumpling foliage, churned-up mud and flames were all brought to life in what was an almost entirely digital shot.

“Our goal was for the audience to immerse themselves in a world that’s imaginary but rooted in reality. Being able to bring these two sides of Hogancamp’s life together in such a unique way was absolutely amazing.” – Romain Arnoux, VFX supervisor

The film is shortlisted for this year’s Academy Award for Best Visual Effects.

WELCOME TO MARWEN went on general release in the UK on 1 January 2019.
