
Giant Steps

Posted on Apr 16, 2013 by Alex Fice

Jack The Giant Slayer was Thomas Sigel’s first 3D movie.

Having worked on movies and documentaries since the early eighties, Newton Thomas Sigel has seen and done it all. But Jack The Giant Slayer was his first stab at another dimension. How did he prepare for it and what has he learnt?

Newton Thomas Sigel is a lauded DoP and has been lensing for the equally lauded director Bryan Singer since The Usual Suspects back in 1995. Jack The Giant Slayer was his first attempt at a 3D shoot (the second will be the new X-Men movie Days Of Future Past, being shot on the Arri ALEXA M cameras) – so how did he prepare, and which cameras made his shortlist?

“Jack was my first foray into 3D.  Even before we began prep, I tried to see all the 3D movies I could and was fortunate enough to attend an intensive workshop sponsored by Sony. I visited the 3D vendors and observed 3D shoots in process. When we began prep, I worked closely with our stereographer Chris Parks to make certain we were on the same page as to how to approach the 3D.

“The choice of cameras was driven by two priorities.  Firstly, to have the smallest, most nimble rigs possible. The second was the desire of VFX to have the highest pixel count possible. For that reason, the logical choice was the RED Epic cameras on the 3Ality TS-5s, which gave us a rig of about 55 pounds with Ultra Prime Lenses.

“The entire movie was done in Native 3D on the rigs I mentioned, with the exception of some underwater shots which I did in 2D on the Arri Alexa.  At the time I could not get a 3D underwater housing for the stereo rigs.  Those shots were post-dimensionalized, and cut in quite well.”

But how did Thomas and the director, Bryan Singer, see the extent of the 3D? Were there scenes where the effect would be maximised, or was the 3D to be present throughout?

“In conjunction with Chris Parks, we created a 3D stereo depth chart that plotted our use of 3D throughout the story. Because we begin in the human world and journey into the land of the giants, there was a wonderful opportunity to play with our depth budget as the story evolves. The stereo in the land of the giants increases proportionally as Jack ventures deeper and deeper into this over-sized world. Another wonderful motif we employed was to miniaturise the shots which were done from the giants’ P.O.V. Considering that a typical inter-ocular distance for human eyes is about 2.5in, our giants’ eyes would be over 10in apart! We created a rig which put our lenses almost a foot apart and created this fabulous effect of making the humans look tiny.”
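The arithmetic behind that trick is easy to sketch in a few lines of Python. The 2.5in figure and the roughly foot-wide rig come from Sigel’s description; the giant scale factor is an assumption added purely for illustration:

HUMAN_INTEROCULAR_IN = 2.5   # typical human eye separation, per the interview
GIANT_SCALE = 4.0            # assumed: giants roughly four times human size

giant_interocular = HUMAN_INTEROCULAR_IN * GIANT_SCALE
print(f"A giant's 'eyes' would sit about {giant_interocular:.1f}in apart")  # ~10in

# Shooting with an interaxial much wider than the human baseline (hyperstereo)
# makes the scene read smaller by roughly the ratio of the two baselines --
# the miniaturisation effect Sigel describes.
RIG_INTERAXIAL_IN = 12.0     # lenses "almost a foot apart" on the special rig
print(f"Humans read at roughly 1/{RIG_INTERAXIAL_IN / HUMAN_INTEROCULAR_IN:.1f} scale")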

How do you compensate for shooting heavy CGI movies, as far as the performance of the camera and perhaps skin tones are concerned?

“When you make a movie in 3D, with non-existent CGI characters, one has to accept that there will be a lot of other hands in creating the final image – VFX, the stereo processing, the DI. I think that makes it all the more imperative that you are delivering to these other collaborators an image that is as close as possible to what you want the final picture to look like. When you are shooting digital, it is even more important, because the image is so easy to manipulate. In the digital world, I find choosing the camera is almost like when, in the days of film, we would choose what raw stock to use. They all have different dynamic ranges, colorimetry and so forth. Unfortunately with Jack, the priorities of rig size and VFX concerns took precedence.”

Thomas also used his usual box of tricks, some of which couldn’t be used with the 3D – like the specialist lens, the Frazier.

“The Frazier is an amazing lens, although unusable in 3D. I employed it constantly in Three Kings, to great effect.  Not only the way you can get it into tight corners, but also the cool things it does to perspective when you tilt the camera.  It is like a miniature crane!”

With digital photography comes the digital pipeline, which for Jack was handled by UK company Fluent Image (see below). But are digital pipelines and workflows beneficial to the DoP as far as seeing, and then adjusting, what you’ve shot?

“A lot of pipeline concerns are about studio security and archiving. What matters for the DP is that the image he or she created stays unaltered as it moves through the pipeline. Unfortunately, dailies are never done at full res, so there is always some distortion there. Nonetheless, I think the DP should strive to put whatever colour LUTs on the image are necessary to demonstrate the final look, and then make certain that stays with the material through post.”
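As a rough illustration of what “putting a LUT on the image” involves downstream, here is a minimal Python sketch that parses a .cube-style 3D LUT and applies it with a nearest-neighbour lookup. Production tools use trilinear or tetrahedral interpolation, and the file name below is hypothetical:

import numpy as np

def load_cube_lut(path):
    # Parse a minimal .cube 3D LUT: a LUT_3D_SIZE line followed by RGB rows.
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] == "-":
                rows.append([float(v) for v in line.split()[:3]])
    table = np.array(rows).reshape(size, size, size, 3)  # .cube order: red varies fastest
    return size, table

def apply_lut_nearest(image, size, table):
    # Nearest-neighbour lookup; image is float RGB in [0, 1].
    idx = np.clip(np.rint(image * (size - 1)).astype(int), 0, size - 1)
    return table[idx[..., 2], idx[..., 1], idx[..., 0]]  # index as B, G, R planes

# Hypothetical usage on a dailies frame:
# size, table = load_cube_lut("show_look_v01.cube")
# graded = apply_lut_nearest(frame_rgb, size, table)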

Do you mostly shoot digitally now? Are the new cameras outperforming film, or is film still number one as far as latitude is concerned?

“I shoot film or digital, whatever is right for the particular project. There is no denying that digital cameras are progressing by leaps and bounds. As far as dynamic range and sensitivity go, there are some digital cameras which can actually outperform film, but the look is not exactly the same. The digital image is clean and does not have the same texture that you get from grain in film. Is one better than the other? Depends what you are looking for.”

What is next for Newton Thomas Sigel?

“The next installment in the X-Men franchise is called Days Of Future Past. We will be shooting 3D again, using the new Arri Alexa-M cameras with Leica Primes and Fujinon Zooms. The story brings all the actors from the original X-Men movies into an adventure with the actors from X-Men First Class. I’ll let you speculate on how that happens!”

UK-based Fluent Image are credited with being the lab for the camera files of the film. What was the extent of their involvement, and how did they get involved?

Kate Morrison-Lyons from Fluent Image: “We managed the entire digital image for the movie, essentially starting at the camera capture point all the way through to final DI.

“How we got the job, well… The Fluent team have been involved in this kind of work for many years, really starting with our involvement in the film Speed Racer, shot on Sony F23s recorded to Codex (although only the HDCAM SR was used as the primary neg). Over the years we’ve worked on Sony and Warner films and then in 2009 we worked in great depth with Walden Media and Fox on Narnia: Voyage of the Dawn Treader. On that show we ran an entire file-based post production workflow from LTO4 and helped them design an in-house DI suite; lots of firsts on that show.

“Then there’s the core Fluent team: Jon Ferguy, who is a very well-established and respected industry engineer (he built Sohonet’s core infrastructure and was CEO/Director there for 11 years, and before that he was at Framestore, geeking out), and myself, who spent years working with cameras, then in camera facilities supplying camera equipment to feature film productions in the UK and Europe until the D20 came about. But essentially the Studios know us, the Producers like our bottom line and we have a reputation for delivering.

“As Fluent we like to be involved with every part of the digital image management from the camera hand-off point. On Jack The Giant Slayer we took delivery of the camera RAW and all associated metadata from set, twice a day. We processed coloured rushes for Editorial, we delivered Studio dailies and we delivered two redundant copies of the day’s material to the production every day. That’s just the day-to-day shooting schedule. During this we designed a VFX image turnover package with the production and the key vendors, MPC and Digital Domain. We decided to take the RED RAW and design an entire stereo EXR pipeline for the VFX vendors, giving them consistent high-resolution images every time they needed them. These turnovers began before the show moved from London to LA and then the whole VFX pipeline ramped up considerably for Hollywood.
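Fluent’s actual tools are proprietary, but a stereo turnover like the one Kate describes broadly means pairing left- and right-eye clips and laying down a consistent per-eye, per-frame EXR path for each vendor. The Python sketch below assumes an invented naming convention (A001_C003_L.R3D) and leaves the RED SDK de-Bayer step as a stub:

from pathlib import Path

# Assumed naming convention for the example: A001_C003_L.R3D / A001_C003_R.R3D
def pair_stereo_clips(raw_dir):
    # Group left/right R3D files by clip name so each VFX plate gets both eyes.
    pairs = {}
    for clip in Path(raw_dir).glob("*.R3D"):
        stem, eye = clip.stem.rsplit("_", 1)      # e.g. ("A001_C003", "L")
        pairs.setdefault(stem, {})[eye.upper()] = clip
    return {k: v for k, v in pairs.items() if {"L", "R"} <= v.keys()}

def exr_frame_path(turnover_root, clip, eye, frame):
    # Consistent per-eye, per-frame EXR path for the vendor package.
    return Path(turnover_root) / clip / eye / f"{clip}_{eye}.{frame:07d}.exr"

# for clip, eyes in pair_stereo_clips("camera_raw/day_042").items():
#     for eye in ("L", "R"):
#         ...de-Bayer eyes[eye] with the manufacturer SDK, write exr_frame_path(...)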

“We built a 320TB storage solution for ‘JTGS’ that ran Fluent software. Our software is always tailored to each show, and this time it was a stereo RED Epic show, with a stereo EXR VFX turnover and submission pipeline and a final DPX delivery to DI. The servers we built became the ‘lab’; it was built in the same location as the production and they were able to access what they needed at any point. All the deliveries to vendors could be tracked and checked. All the VFX submissions were equally accessible to the production once they came in, and we managed the whole rendering process from stereo EXR to DPX in order for the DI house to start grading (or indeed reviewing).

“Oh, and we dismantled it in the UK after shooting and moved it all to the US.”
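Kate mentions that deliveries to vendors could be tracked and checked; one common way to do that is a simple database keyed on clip, eye and vendor. The schema below is an invented sketch, not Fluent’s system:

import sqlite3

# Minimal delivery-tracking table; the schema and fields are assumptions.
con = sqlite3.connect("jtgs_deliveries.db")
con.execute("""CREATE TABLE IF NOT EXISTS deliveries (
    clip TEXT, eye TEXT, vendor TEXT, frames INTEGER,
    checksum TEXT, delivered_on TEXT)""")

def log_delivery(clip, eye, vendor, frames, checksum, when):
    con.execute("INSERT INTO deliveries VALUES (?, ?, ?, ?, ?, ?)",
                (clip, eye, vendor, frames, checksum, when))
    con.commit()

# log_delivery("A001_C003", "L", "MPC", 1024, "9f3c...", "2011-06-14")
# con.execute("SELECT vendor, COUNT(*) FROM deliveries GROUP BY vendor")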

Did you need to be on location with the movie? How did the files get to you, where did you set up your lab, and how were the DoP and director involved with you?

“We are very much the near-set lab. Sets are usually heavily unionised and frankly we perform much more of a support role for the DIT and camera team, rather than impinging on their territory.

“Files were dropped off twice a day and pulled onto the server (lab) using workstations that we put together. This was all set up in the same location as Editorial.

“Our job was to preserve all the work done on set, so whatever the DoP wanted to do with his images, our role was to maintain that through to DI. The most important role we played was to feed information to set regarding any image problems we discovered. The DoP needed to be kept well informed of the results of his images and indeed we needed to make sure he was making the best choices on set. Both Tom and Bryan (Singer) were happy to review in Editorial from their monitors, something we were happy about because all the colour information for those images was created at the point we created the rushes – essentially information that we could pass along throughout the pipeline.”
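Colour decisions created at the rushes stage are commonly carried through a pipeline as ASC CDL values (slope, offset, power, saturation) alongside any LUT. The interview does not say Jack used CDLs specifically, so treat this Python sketch as a generic illustration of how such a grade travels as numbers:

import numpy as np

def apply_asc_cdl(rgb, slope, offset, power, saturation=1.0):
    # Standard ASC CDL transfer: out = clamp(in*slope + offset)^power, then saturation.
    out = np.clip(rgb * np.array(slope) + np.array(offset), 0.0, None) ** np.array(power)
    luma = (out * np.array([0.2126, 0.7152, 0.0722])).sum(axis=-1, keepdims=True)
    return luma + saturation * (out - luma)

# Hypothetical on-set grade for one shot:
# graded = apply_asc_cdl(frame, slope=[1.05, 1.0, 0.98],
#                        offset=[0.01, 0.0, -0.01], power=[0.95, 1.0, 1.0])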

How many different versions did you work on, including the 3D? How did you work out the method of working through the files?

Interesting question. If you mean ‘versions’ as in 2D or 3D, it wasn’t as sophisticated as Prometheus, where we actually pulled a totally separate 2D film once the 3D film was locked. Meaning, the stereo department on Prometheus actually preferred some left-eye shots to some right-eye shots, put a pull list together, and we pulled that instead of just dropping an eye off the 3D film.

We automate as much as we can. The only way to do this is to be on the ball from the start and slowly start to refine and gather an understanding of how the show is developing over its lifetime. We create huge databases and we have our own code that recognises certain traits in file formats and naming conventions (to name just a couple).
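A toy example of the kind of naming-convention recognition described, written in Python against an invented file pattern:

import re

# Hypothetical convention: <reel>_<clip>_<MMDD>_<eye>.R3D, e.g. A001_C003_0815_L.R3D
CLIP_PATTERN = re.compile(
    r"^(?P<reel>[A-Z]\d{3})_(?P<clip>C\d{3})_(?P<date>\d{4})_(?P<eye>[LR])\.R3D$"
)

def classify(filename):
    # Return the recognised fields, or None if the file breaks the convention.
    match = CLIP_PATTERN.match(filename)
    return match.groupdict() if match else None

print(classify("A001_C003_0815_L.R3D"))  # {'reel': 'A001', 'clip': 'C003', ...}
print(classify("misnamed_file.mov"))     # None -> flag back to set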

Technically what was the set up of your lab, what about archiving, what about files to editorial?

To confirm, we created and delivered all the dailies to Editorial’s storage and wrote two copies of the day’s material to LTO data tapes every day. The ‘labs’ we put together are incredibly fast, incredibly big and include LTO tape libraries for quick, semi-automated backups. We design them and put them together either in data centres or in broom cupboards – it doesn’t make much of a difference to us.
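Behind “two copies of the day’s material” there is normally a checksum-verified copy and a manifest. A minimal Python sketch, with the hashing choice and paths as assumptions:

import hashlib, shutil
from pathlib import Path

def md5sum(path, chunk=1 << 20):
    # Hash a file in chunks so large camera files do not need to fit in memory.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verified_copy(src, dst_root):
    # Copy one camera file and confirm the copy matches the source checksum.
    src = Path(src)
    dst = Path(dst_root) / src.name
    before = md5sum(src)
    shutil.copy2(src, dst)
    if md5sum(dst) != before:
        raise IOError(f"Checksum mismatch on {src.name}")
    return src.name, before   # one line of the day's manifest

# manifest = [verified_copy(f, "/lab/day_042/copy_A") for f in day_files]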

How was it dealing with RED files as opposed to other RAW formats and what were the details of working in the 3D?

RED was actually great, but I think you’re asking about integrating with the RED SDK, right? The RAW is like any other, in the sense that it’s a camera manufacturer’s interpretation of a Bayer image – we use their SDKs to develop our code to de-Bayer into the film format of choice. The conversion into an EXR pipeline was really interesting because EXR has such broad parameters.

The information that carries with the R3D files is great, you can get a lot out of that and indeed the RMDs – all in all a nice package that gave the show a lot of freedom.

How closely did you work with the DoP? 

They watched dailies pretty much every day, mostly Avid material. Hoyt Yeatman, the VFX Supervisor, probably spent more time in the screening room because it was such a technical film.
