The Synthetic Shutter
Posted on Dec 23, 2016 by Julian Mitchell
An accidental meeting between Tony Davis of Tessive and Sony’s VP of Technology introduced synthetic shutter technology to Ang Lee’s 120fps movie, Billy Lynn’s Long Halftime Walk, just when the studio was wondering how it was ever going to show the film.
Words Julian Mitchell
If you have read our previous articles on Ang Lee’s new movie, Billy Lynn’s Long Halftime Walk, you’ll know that it was shot at 120fps in 4K and 3D, and was at one point considered unplayable to the general public without a double projection system. The studio’s problem was presenting the cinema chains with a version they could play – that meant taking 120fps and turning it into 60fps and then 24fps. They were considering their options when an accidental meeting happened with Tessive’s Tony Davis, who had come up with the technology thinking that only student filmmakers would use it.
“After my meeting with the VP of Sony Technology, I was put in touch with the technical group who were actually doing the post system and at that time they really didn’t know how they were going to do the post,” says Tony. “Post usually charges by the frame so this was going to be very expensive. They gave me a fairly long list of feature requirements as they were really driving the thing and I’d sit and make notes in terms of aesthetics and then go back and make new looks.”
Tony can’t talk about what Sony are specifically doing but can talk about what they and other people are trying to accomplish. “A lot of filmmakers have a goal of moving to a higher frame rate as there’s been this kind of general discontent with 24 frames-per-second. At the same time, we love it and it’s beautiful, but we have these very mixed feelings. However, the desire to go to something that can render reality more accurately has not gone well so far. A lot of test footage has been shot at 60fps; the consensus was that it looked more real but more ‘video’ and didn’t capture the feel of a movie. That’s when I showed up! My little company has been saying that what’s going on with movie cameras is that they haven’t been accurately representing the real world.
“On the camera side, the framing is a bit of a mathematical problem and that opens up possibilities. The maths doesn’t tell us what will look cinematic, but it does tell us that there is a fairly large world we can explore. I make a tool that helps reduce judder, which is a camera-side thing. There are two parts: judder and strobing. Judder is what the camera does – artifacts or problems that happen at the camera end in terms of motion; strobing is artifacts and problems that happen at the projector end. I don’t worry about projector issues, and strobing is almost completely fixed by high frame rate. People thought it would fix judder too, this jerkiness, this choppiness to the motion, but it just doesn’t. I raised my hand and said it’s because of maths; that’s the reason it doesn’t work.”
Tessive’s first solution was a liquid crystal shutter that would fade in and out per frame, but now they do it all in software. The idea is pretty simple: you shoot at a high frame rate – very high, 120fps or more – and then you have enough data to put together softer frames in a way you choose as you go down to 60fps, 48 or 24. That’s where the real magic comes in.
“People ask what do you mean by ‘the way you choose’. The answer is, the mathematics will give you families of possible options that go from crisper to softer, to less judder or more judder – actually a very wide range of things; it’s not a single axis of control. There are lots of different adjustments you can make to how that happens.
“The simple idea is we’re taking some number of frames and we’re weighting each of them with some percentage and adding them all together to make each output frame. The simplest case is if I took every two frames, weighted them 50% each and added them together; that would make a single output frame and take you from 120 to 60fps, which would work very nicely. But what if I took 11 input frames to make each output frame and had a long list of weighting values to do it? The list starts with very little weight for frames far from the centre, weights more as they get closer, then less again. That could make a very wide, kind of smeary function, and it might not look very good, but it would be different.
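The weighted-frame idea Tony describes can be sketched in a few lines of code. This is an illustrative reconstruction, not Time Shaper’s actual implementation (the real maths and kernels are not public); frames are represented as flat lists of pixel values, and the function and parameter names are invented for the example.

```python
def synthesize_frame(frames, weights):
    """Blend several input frames into one output frame using per-frame weights."""
    total = sum(weights)
    norm = [w / total for w in weights]  # normalise so overall brightness is preserved
    out = [0.0] * len(frames[0])
    for frame, w in zip(frames, norm):
        for i, px in enumerate(frame):
            out[i] += w * px
    return out

def resample(frames, weights, stride):
    """Slide an n-tap weight kernel over the input stream, stepping `stride`
    input frames per output frame: 120fps -> 60fps is stride 2, 120 -> 24 is stride 5."""
    n = len(weights)
    out = []
    for start in range(0, len(frames) - n + 1, stride):
        out.append(synthesize_frame(frames[start:start + n], weights))
    return out
```

With a two-tap 50/50 kernel and stride 2 this reproduces the simple 120-to-60 case Tony gives; swapping in an 11-tap kernel with stride 5 gives the wider, smearier 24fps result he describes.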
“You can then imagine a camera with the ability to expose multiple frames at the same time, so as one frame was fading out its exposure, the next frame was fading in. The incoming image was being transferred smoothly from one frame to the next, the next and so on. So you had multiple frames exposing at the same time. We can do that digitally but can’t do it in a real camera. So the next question is, we can do all these things but should we? Does it look good? That’s obviously an aesthetic question and there’s no mathematical answer to an aesthetic question, so that’s when I hand the tool over to the artist and they use it in different ways and ask test audiences what they think. We’ve been doing this with 24fps shuttering for years; this is not a new idea. But now we’ve got this whole new adjustment. I think the results are going to be fantastic.
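The “multiple frames exposing at once” idea amounts to a soft kernel that is wider than the frame step, so that as one output frame’s weights fade out, the next one’s fade in, and adjacent output frames share input frames. A minimal sketch, assuming a triangular (“tent”) weight shape purely for illustration – the shapes Tessive actually uses are not described in the article:

```python
def tent_weights(n):
    """Symmetric triangular ('tent') kernel of n taps, normalised to sum to 1.
    Weights rise linearly to the centre tap and fall off again."""
    half = (n - 1) / 2.0
    raw = [half + 1 - abs(i - half) for i in range(n)]
    s = sum(raw)
    return [w / s for w in raw]

# An 11-tap tent with a stride of 5 (120fps -> 24fps) means each input frame
# contributes to two or three consecutive output frames: the digital
# equivalent of overlapping exposures, which a physical shutter cannot do.
```

Because the kernel is 11 frames wide while the output advances only 5 input frames at a time, consecutive output windows overlap by 6 frames, giving the smooth hand-off from one exposure to the next that Tony describes.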
“Temporal adjustments work on an emotional level; you don’t just point at it and say ‘I see that there’. We aren’t wired that way. So when these things happen at a temporal level it can affect the emotion of a scene, and that is a tool that artists have to grab on to. I don’t think we yet know what that’s going to do to the moviegoing experience. When I saw footage from this movie at 120fps, I was prepared to sit there and watch all the technical aspects, but I missed all that because I was so wrapped up in what they were showing – I had never experienced an emotion like it. It was really amazing; the immersion was astonishing. I had not expected it and I do this for a living.”
Time Shaper Software
New builds of Tessive’s Time Shaper software and new shutters are landing at Ang Lee’s production office all the time. “It’s not real time, mainly because of how long it takes to process; we can do around five frames-per-second output. It is a single-pass process and it runs on OS X right now. The looks come with new builds of the software, and there aren’t that many parameters in it because I couldn’t think of many to present. Maybe in the future there will be more access to the toolbox, but right now those tools are a little bit too crude.
“From a marketing standpoint I identify the shutters, so there are different names for each. There are square wave ones, which are named in the traditional way, 180˚ or 360˚ or whatever, and they mimic what the camera can already do, so you always have the fallback of the very traditional, conservative look that cameras have always done. Then I have what I call the synthesised or ‘soft’ shutters and those – for lack of anything better – I name after elements. There’s no single metric; there are so many different parameters. What I do is generally rank them by their hardness, by how crisp they look. I do try to order them to give some idea of what they’re going to do to the look, whether it’s going to be a softer look or a crisper look, that kind of thing.”