
Point, Shoot, Compute!

Posted on Jun 10, 2016 by Julian Mitchell

The recent blockbuster ‘The Jungle Book’ pushed the envelope of virtual production. As Lytro launch their Cinema camera, we look at other camera systems that are already working in 3D space.

Anything VFX-heavy is perfect for these new virtual production camera systems. Filmmakers and studios are starting to get really interested in this: think of the biggest VFX movies coming up and a system like ncam will probably be on them. These movies are very much getting on board with ‘virtual production’, as the new term has it.

So you’ve got your real-life location and you’ve got your real actors, but the creatures, the characters and whatever other VFX elements are just not there to see. With these systems you can start putting those in in real time with pre-canned animation of characters and creatures. You can put that back through the lens so you can understand eye lines better. You’d obviously record those takes, so the actors can then see if that performance is good and sign off on it there and then.

Nic Hatch from ncam explains the process on-set. “You have ‘action points’, so if you have animated objects, say helicopters flying over, or a certain action point where something explodes or something happens, the AD can start to call action on ‘virtual elements’ if you like, which is really helpful for shooting; otherwise you’re just not seeing any of that. A lot of it is to do with set extension as well, so if you’ve got a green screen or a blue screen and it’s going to be VFX at some point, it’s very important to have that in real time so everyone is watching that. And it’s for the cinematographer to be able to frame up: typically if you’re on a green screen you’ll have the actors but that’s it. Your framing will be completely different, you may choose a completely different lens, you may be able to frame and compose a shot in a completely different manner to how you would if you were really there, if it really existed. That’s what we’re trying to achieve, trying to shoot that plate as if it really existed, like we used to.”

LYTRO CINEMA CAMERA

Launched with much fanfare at this year’s NAB Convention in Las Vegas, this gargantuan camera has some pretty big claims too.

Just part of the gargantuan Lytro Cinema Camera, the rest is out of shot.

It uses Light Field technology, capturing all rays of light from a scene. In a post process, every pixel in that scene can be changed; every parameter, like focus, colour, depth and placement in space, can be altered. The promise is huge and seductive: getting rid of blue- and green-screen work is a good start, but really exciting for visual effects artists are the tools that allow you to isolate objects by depth. VFX Supervisor Adam Valdez does want Lytro to keep their feet on the ground though: “They need to take their big beautiful prototype and think about today’s world, today’s movies with a new version. Right now even moving to 4K, acquiring 4K is a really big load, working 4K in post is a really big load. There are very few projectors that project 4K, consumers can’t see the difference between 2K and 4K at home and economically it’s a big strain for us in post.” Let’s hope Lytro Cinema 2 will be smaller.
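To make the refocusing idea concrete, the classic approach to post-capture focus in light field imaging is ‘shift-and-add’: each sub-aperture view is shifted in proportion to its offset from the centre of the lens, and the shifted views are averaged, which brings one chosen depth plane into sharp focus. The Python sketch below shows the principle only; it is not Lytro’s implementation, and the (U, V, H, W) array layout and integer-pixel shifts are simplifying assumptions.

```python
import numpy as np

def refocus(light_field: np.ndarray, alpha: float) -> np.ndarray:
    """Shift-and-add synthetic refocus of a 4D light field.

    light_field: array of shape (U, V, H, W) -- a U x V grid of
    sub-aperture views, each an H x W image (one channel here).
    alpha: relative focal depth; 1.0 keeps the captured focal
    plane, values above/below refocus nearer or further.
    """
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Each view is shifted in proportion to its offset from
            # the aperture centre; averaging the shifted views brings
            # a single depth plane into focus and blurs the rest.
            dy = (u - cu) * (1.0 - 1.0 / alpha)
            dx = (v - cv) * (1.0 - 1.0 / alpha)
            out += np.roll(light_field[u, v],
                           (int(round(dy)), int(round(dx))),
                           axis=(0, 1))
    return out / (U * V)
```

Real refocusing pipelines use sub-pixel interpolation or Fourier-domain methods rather than whole-pixel rolls, but the averaging of shifted views is the core trick.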

lytro.com/cinema

Jungle Book VFX Supervisor Adam Valdez sees a huge advantage in just shooting stereo with Lytro. “A single camera that is able to give you two subtly different points of view to generate stereo, that’s really huge. Typically our problem with using two cameras and two lens packages is there’s a lot of manual clean-up work and alignment problems with those techniques.”

Adam also thinks that the Lytro Cinema Camera might be offering too much. “What I would like to see them do is make a smaller form factor, movie-dedicated version of it. Smaller physically and lighter, and a little more geared to 2K. I know it probably can shoot 2K and it probably can shoot 24 fps; you don’t really need all the incredibly high data resolutions that they are offering, like 300 fps and this huge sensor and everything.”

zLENSE CAMERA SYSTEM

zLense launched a depth-mapping camera solution in 2014 that captures 3D data and scenery in real time and adds a 3D layer. The technology processes spatial information, making new and genuinely three-dimensional compositing methods possible, helping production teams to create 3D effects and use CGI in live TV or pre-recorded transmissions – with no special studio set-up.
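A per-pixel depth layer is what lets a compositor merge CG with the live image correctly, so virtual objects can pass behind real ones rather than just being stacked on top. Here is a minimal sketch of that depth-compare composite in Python; the array shapes and names are assumptions for illustration, not zLense’s actual pipeline.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Merge a CG render into a live plate per pixel by depth.

    Wherever the CG element is nearer to camera than the live
    scene, the CG pixel wins; otherwise the live pixel shows,
    so a virtual creature can walk behind a real actor.
    Inputs: HxWx3 arrays (rgb) and HxW arrays (depth, metres).
    """
    cg_in_front = (cg_depth < live_depth)[..., None]  # HxWx1 mask
    return np.where(cg_in_front, cg_rgb, live_rgb)
```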

zLense says directors can produce simulated and augmented-reality worlds, generating and combining virtual reality (VR) and augmented reality (AR) effects in live studio or outside broadcast transmissions. The depth-sensing technology allows for 360˚ freedom of camera movement.

zLense uses a structured light system, a bit like the HoloLens or the Kinect. The issue with those types of systems is that you can’t really use them outdoors to their full extent, and they are limited in terms of resolution and therefore range.

zlense.com

Nic Hatch points out that ncam is not designed as a post process; they are all about real time. “We don’t use light field technology; our technology is based around real-time stereo capture, real-time stereo sensors, that’s what our entire IP is based around. For NAB we were showing the first version of our real-time depth reconstruction; that’s what Lytro is doing, but we’re getting a lot more. Because we’re stereo, and we have a certain amount of distance between our stereo sensors, we capture a bit more than most systems.”
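The geometry behind what Hatch describes is standard: for a rectified stereo pair, depth falls out of disparity as depth = focal length × baseline / disparity, so a wider baseline between the sensors resolves depth better at distance. A minimal sketch using OpenCV’s semi-global block matcher follows; the calibration numbers are made up for illustration and this is not ncam’s actual pipeline.

```python
import cv2
import numpy as np

# Hypothetical calibration values, for illustration only.
FOCAL_PX = 1000.0   # focal length in pixels
BASELINE_M = 0.12   # distance between the two sensors, in metres

matcher = cv2.StereoSGBM_create(minDisparity=0,
                                numDisparities=128,  # must be /16
                                blockSize=5)

def stereo_depth(left_gray, right_gray):
    """Per-pixel depth (metres) from a rectified 8-bit stereo pair.

    StereoSGBM returns disparity in 1/16-pixel fixed point; depth
    then follows from depth = focal * baseline / disparity. Note
    how a larger baseline stretches disparity and so improves
    depth resolution at range -- the point Hatch makes above.
    """
    disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.full(disp.shape, np.inf, dtype=np.float32)
    valid = disp > 0
    depth[valid] = FOCAL_PX * BASELINE_M / disp[valid]
    return depth
```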

Nic promises that they are working on achieving keying without green screen, as Lytro claim to do already. VFX guys like Adam are keen to get their hands on Lytro’s output so they can judge how good the keying is. “Blue screen isn’t just about isolating depth, it’s also about finely separating blurry and transparent things; I’m not totally clear about how the keyer works. The Lytro keyer, I’m assuming, is effectively giving you a multi-plane image, which is like saying I’m going to give you a photograph but it’s effectively a composite.”
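For comparison, a naive depth key is just a ramp over the depth map: fully opaque nearer than one distance, fully transparent beyond another. The sketch below (an illustrative assumption, not Lytro’s keyer) shows why Adam’s caution is fair: a single depth value per pixel gives a clean spatial cut but says nothing about hair, motion blur or transparency, which is where chroma keys earn their keep.

```python
import numpy as np

def depth_key(depth, near, far):
    """Naive depth matte: alpha is 1.0 for pixels nearer than
    `near`, 0.0 beyond `far`, ramping linearly in between.

    depth: HxW array of per-pixel distances (metres).
    Returns an HxW alpha matte in [0, 1].
    """
    alpha = (far - depth) / (far - near)
    return np.clip(alpha, 0.0, 1.0)
```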

As a cinematographer or videographer you must expect to meet these new capture tools and be expected to work in their data-heavy world. The processing will only get smarter, and the kit that can get smaller will get smaller. The world of computational cinematography is here and is only going to get more important.

ncam

ncam have had a big year. Their system started out as a pre-vis tool for movies; they still do that, but have now moved on to working with video game makers and sports broadcasters.

CEO Nic Hatch explains their strengths. “We’re all about the data collection and understanding 3-dimensional space. So that could be camera tracking; the lens data, distortions, focus, iris and zoom; the position of the camera; it could be the lighting conditions, as we’re starting to understand light now, and the depth as well. If we’re recording all of that data, or at least sensing it all in real time, then we’re looking to send that data to any other device. So if you are watching the Olympics through your VR goggles or headset, you will receive that data for services like interactive television, or just interactive graphics for immersive, personal digital advertising. You’ll be able to choose what 3D graphics or stats you want. This kind of data use has only just started.”
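To give a sense of what “all of that data” might look like per frame, here is a hypothetical packet schema in Python: camera pose, lens state and distortion, serialised for streaming to a renderer or a viewer’s device. The field names and units are illustrative assumptions, not ncam’s actual wire format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CameraTrackFrame:
    """One frame of tracking metadata, per Nic Hatch's list:
    pose, lens state and timing. Field names are illustrative,
    not ncam's real format."""
    timecode: str            # e.g. "10:04:23:12"
    position: tuple          # (x, y, z) in metres, studio space
    rotation: tuple          # (pan, tilt, roll) in degrees
    focal_length_mm: float   # zoom
    focus_distance_m: float  # focus
    aperture: float          # iris, as an f-stop
    k1: float                # first radial distortion coefficient
    k2: float                # second radial distortion coefficient

# Build one frame and serialise it for streaming downstream.
frame = CameraTrackFrame("10:04:23:12", (1.2, 1.6, -3.4),
                         (12.5, -3.0, 0.1), 35.0, 4.2, 2.8,
                         -0.05, 0.01)
packet = json.dumps(asdict(frame))
```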

ncam-tech.com

