Posted on Jul 18, 2018
by Julian Mitchell
He’s behind you! With Real Depth, the presenter can walk around the digital golf pro, including behind him.
In the second part of this new series, we introduce bleeding-edge technology from an award-winning British company that is blending the real world with the virtual one.
Words Julian Mitchell
Last month we looked at volumetric capture, where multiple cameras capture the ‘volume’ of a space in a 360° environment. The advantages are hugely realistic capture and fantastic representation of clothing and faces, mostly avoiding the ‘uncanny valley’ effect that we have seen in other CGI capture. Having captured in 360°, you can then change your views at will, moving the camera where you want it within the volume of space.
The downsides are that you are stuck with the lighting you capture, which is usually deliberately flat. You have to capture in a dedicated studio, which is expensive to build, maintain and commercialise. Converting the huge amount of capture data also takes a great deal of processing power and time. But it’s early days. We’ll follow the technology and what early adopters like Intel and Microsoft do with it. Intel has recently volumetrically captured a full scene from a Western film, horses and buildings included – to what end, we don’t yet know.
A company that has been on the edge of new capture technology for a while is Ncam Technologies. Its only piece of hardware is a camera sensor bar that attaches to professional broadcast cameras to capture spatial data, including depth and camera controls. Really a software company, Ncam has moved on from the depth-tracking camera device alone to offer the highest quality camera tracking, pre-visualisation and augmented reality services. The real challenge is to bring the world of VFX into a real-time environment, and they’re further down that road than you might think.
It’ll light up your world – Real Light can capture real light and use it to light computer graphics.
Sensing depth was obviously something that Ncam offered from day one, but one of its new products, first seen at NAB in 2016 as a prototype, is now mightily impressive. Interestingly, the demo at the show also included volumetric capture as an asset, but the real party trick was to see a presenter immersed in and interacting with a virtual world. You may have seen this before on other virtual set demos, but Ncam’s Real Depth allows the presenter to walk around digital assets – exact depth information is computed in real time.
Real Depth provides a unique automated technique for sensing depth. By extracting depth data in real time, subjects are able to interact seamlessly with their virtual surroundings for the most realistic and synergetic visual engagement.
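Ncam has not published how Real Depth composites its layers, but the general idea of real-time depth-based occlusion can be sketched as a per-pixel depth test: at each pixel, show whichever layer – live video or CG – is nearer the camera. All names and array shapes below are illustrative assumptions, not Ncam’s API.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth, cg_alpha):
    """Per-pixel occlusion: show whichever layer is nearer the camera.

    live_rgb / cg_rgb    : (H, W, 3) colour images
    live_depth / cg_depth: (H, W) distance from camera, smaller = nearer
    cg_alpha             : (H, W) coverage of the CG layer (0 where empty)
    """
    # The CG layer wins a pixel only where it exists AND is nearer
    # than the live scene; elsewhere the live video shows through.
    cg_in_front = (cg_alpha > 0) & (cg_depth < live_depth)
    mask = cg_in_front[..., None].astype(live_rgb.dtype)
    return cg_rgb * mask + live_rgb * (1.0 - mask)
```

Because the test runs independently per pixel, the presenter can pass both in front of and behind a virtual asset in the same frame – exactly the behaviour the golf demo shows.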
It is this real-time aspect which pops up a lot in Ncam’s product description. CEO Nic Hatch explains Real Depth within the Reality camera tracking world: “Ncam Reality is our camera tracking product range and Real Depth is a new option within that. Typically, broadcast graphics would always be behind you or in front of you but usually you can’t interact in terms of behind and in front. But that ability is what we’ve come up with.”
For the NAB demo Ncam had a presenter on a virtual golf course, then brought in the volumetrically captured golf pro, who went on to explain the shot with inserted moving graphics. Using Ncam’s Unreal Engine plug-in, they can bring in the golf pro asset and then add the graphics on top. The golf pro talks through the shot (audio was captured with the volume) as the graphics illustrate club speed, launch angle and other supplemental information. The presenter walks around the course and the golf pro, both behind and in front.
Nic Hatch, CEO of Ncam.
Nic continues: “The cloth on the volume capture is really good. His feet are almost perfect, the cloth on the trousers is great, the camera can go around it and do whatever you want as it’s three-dimensional – with Real Depth you can go behind the virtual asset – usually broadcasters have to be careful not to overlap the virtual asset.”
An impressive enough demo would have had a presenter walk around a digital asset, but Ncam threw in a volumetrically captured asset to interact with a human presenter – again, it’s very impressive when you see it live.
“The graphic overlays are all in our Unreal Engine plug-in. The content is done by Nvizible, the VFX company in Soho. They do mainly visual effects work for films, like Eddie The Eagle, Kingsman, Bond. But they are moving more into real time, especially as they do a lot of pre-vis (through their Nvizage brand), for Fantastic Beasts among others, and act as a rental partner for Ncam on films such as Solo: A Star Wars Story.”
Real and virtual light
Ncam’s Real Light is designed to solve the common challenge of making augmented graphics look like they are part of the real-world scene. Real Light captures real-world lighting, and renders it onto augmented graphics in real time, adapting to each and every lighting change.
That’s a simple explanation, but this technology magically blends real and virtual light in ways you can only imagine. Real Light is about capturing the real light and using it to light the computer graphics, which on its own is pretty mind-blowing. For the NAB demo we saw a virtual sphere lit only by its own virtual light. It didn’t change when other light was introduced into its location, so it wasn’t realistic: it wasn’t being picked up by any other light, it didn’t feel like part of the environment, and it was very clearly out of place. Add Ncam’s Real Light into that situation and you get a reflection from the real world. This was the jaw-dropping moment. You can also add different effects such as shadows and real-time reaction to real-world coloured light. The technology is ultimately moving towards the realisation of real-time VFX, meaning “we can start to do more realistic VFX in real time,” says Nic.
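Ncam hasn’t disclosed Real Light’s internals, but lighting CG from captured real-world light is classically done with image-based lighting: the environment is sampled into a set of light directions and colours, and each surface point accumulates their contributions. A minimal diffuse version can be sketched as follows; the function name and sampling scheme are illustrative assumptions, not Ncam’s implementation.

```python
import numpy as np

def diffuse_ibl(normal, light_dirs, light_colors):
    """Approximate diffuse shading of a CG surface from sampled real light.

    normal       : (3,) surface normal
    light_dirs   : (N, 3) unit directions toward each captured light sample
    light_colors : (N, 3) RGB radiance of each sample
    """
    n = normal / np.linalg.norm(normal)
    # Lambert's cosine law: each sample contributes proportionally to
    # how directly it faces the surface; back-facing samples contribute 0.
    cos = np.clip(light_dirs @ n, 0.0, None)
    return (light_colors * cos[:, None]).mean(axis=0)
```

Update the light samples every frame from a live capture and the CG asset reacts to lighting changes – a red stage light swung onto the set tints the virtual sphere in real time, which is the behaviour the NAB demo showed.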
Extreme was Ncam’s third new product. This new option for its camera tracking products provides enhanced tracking in severe lighting conditions, especially stage lighting with strobing effects. Ncam Extreme is for difficult lighting: Ncam Reality works in low light and bright light as well, but with something like a pulsating or strobe light it gets harder to track. At NAB, Ncam showed that even under heavy strobing they can still track, because the system uses infrared light; working in the infrared spectrum, it bypasses any hazards from the visible stage lighting. You can see how useful that would be for concerts and live events; talent shows and game shows would definitely use it too.
Ncam’s wide appeal
Ncam is marketing its products on three distinct fronts. Speak to some Hollywood directors and you will learn that Ncam’s pre-visualisation provides unprecedented help to large productions that use massive blue- or green-screen sets, allowing the CGI sets to appear on screen while the actors are performing. For broadcast and events, Ncam’s AR technology allows sophisticated additional live graphics to augment the action or product launches. Then there’s the magic of Real Light.
Nic Hatch looks to the future: “Our scope’s quite wide because we are about pre-visualisation, we are about broadcast, which is real-time VFX, real-time graphics – it’s just the quality is lower, we need to get it higher. We think Real Light will make it more creative, make it easier so people start to adopt it. As this progresses it’s going to be quite interesting because we’ll get more and more accurate lighting.
“So all that data you begin to collate and use you can then reuse for really intriguing things in the future for live VFX in television and TV episodics, where your level of VFX isn’t too high. That’s going to be our sweet spot in the next few years and then you’ll see film doing some live VFX. We already are to some extent, for instance if you remove the green screen and put up an LED monitor wall with pre-shot plates, you can imagine an actor in front of a New York street scene on the video wall and then you get all the interaction of the lighting which can look really good. Nobody wants green screen so it’ll be interesting to see where it goes.
“Ultimately, we’ll be able to go out on a location and record all the light data from, say, a city street or a forest dynamically – as lighting does change, no matter what people think. We can bring it back into the studio to drive any video walls and lights to light the talent perfectly. It’s just data capture, then copying that real-world data into CG data and blending the two worlds. There are so many things you can drive with that data.”
For Ncam this is just the beginning of where this new virtual capture technology will take them. You can also tell by the people who visited the NAB stand that the commercial world is accelerating the demand.