
Arraiy for Hollywood

At NAB Show 2019, we interviewed Mark Tobin, the new CEO of Arraiy, a Silicon Valley start-up looking to revolutionise virtual capture for movies – and that’s just the start

Questions / Andy Brogden

“It’s a through-the-lens monocular tracking solution, 100% software based”

— Mark Tobin

Definition: Is your background based in development?

MARK TOBIN: I’ve worked in post-production creative services for 30 years. I started as an editor on Avid in New York in 1989. I then worked for companies like PBS and HBO, before moving to LA 20 years ago where I started producing with a little company called Method Studios, which is now owned by Deluxe. Then, I went to run a company called A52, which is part of the Rock Paper Scissors group of companies. I then opened up the MPC office in North America and ran Psyop, an animation company in commercial advertising.

Arraiy’s DeepTrack in progress

Def: So your next step was Arraiy? What did you bring to the company?

MT: I was asking myself, ‘What am I going to do next?’ I felt like real-time content was really a growing area, and I wondered how that was going to be enabled and whether AI was going to help. Was it going to stretch me intellectually and allow me to stay in the space I’ve been in, with the same relationships? I looked around and got introduced to Arraiy through a mutual friend, but when I found them they were a hardware company with a three-camera rig attached to a hero camera. It was focussed on providing visual effects elements to the MPCs and Framestores of the world – rotoscoping and tracking and the like. I gently said to them that this probably was not going to work. You’re going to have to get that rig on to cameras and, ultimately, you’re going to have a DOP who will say that he or she doesn’t want it on their camera. How are you then going to manage all that data and get it to all those places? On some level, even if you can do it at a cheaper rate, it’s a solved problem. So I spent some time just consulting with them to narrow in on what they were doing and where the real value proposition was. The question was: how can we leverage their AI ideas for real-time content creation?

Def: So you changed your focus away from hardware development?

MT: Yes, we made a shift, still using the same underlying technology, but then looked at how we could enable real time and really empower content creators. So the first thing we did was to focus on tracking, and now we have a tracking solution called DeepTrack – that’s what we’re releasing [during NAB Show]. It’s a through-the-lens monocular tracking solution, 100% software based, just understanding the features and the textures of a scene. We calibrate the camera, which takes about 20 seconds, and we do some deep learning on the environment, whether that’s in the studio or in a sports environment outdoors. We can create a model over three to four hours, depending on what the scene is, and then you have a known geometry of every scene. You can then enable any camera in that scene to camera track and to object track. So if NBC wants to shoot a football game or an athletics event and wants to use 100 cameras, we can enable all 100 cameras to do tracking, which enables them to do real-time graphics. That’s the differentiator: we don’t need hardware, we don’t need stickers on the ceiling, we create a neural network that we can leverage across any cameras or scenes.
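To make the idea concrete, here is a minimal sketch of what through-the-lens monocular camera tracking against a known scene model can look like. It uses classical OpenCV feature matching and PnP pose estimation rather than Arraiy’s learned neural scene model, and the names in it (the scene_map.npz file, the track_frame helper) are hypothetical illustrations, not Arraiy’s API.

```python
# Minimal sketch: estimate a camera's pose from a single frame against a
# pre-built map of the scene (3D points plus feature descriptors). This stands
# in for the idea of "known geometry of every scene"; Arraiy's DeepTrack uses a
# learned neural scene model, not this classical pipeline.
import cv2
import numpy as np

# Hypothetical precomputed scene map: Nx3 world points and their ORB descriptors,
# plus camera intrinsics from a one-off calibration (the "20 second" step).
data = np.load("scene_map.npz")
map_points = data["points_3d"].astype(np.float32)        # (N, 3) world coordinates
map_descriptors = data["descriptors"].astype(np.uint8)   # (N, 32) ORB descriptors
K = data["camera_matrix"].astype(np.float32)              # (3, 3) intrinsics
dist = data["dist_coeffs"].astype(np.float32)             # lens distortion

orb = cv2.ORB_create(nfeatures=2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def track_frame(frame_bgr):
    """Return (rotation_vector, translation_vector) of the camera, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return None

    # Match live-frame features against the scene map.
    matches = matcher.match(descriptors, map_descriptors)
    if len(matches) < 6:
        return None

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    world_pts = np.float32([map_points[m.trainIdx] for m in matches])

    # Solve the perspective-n-point problem with RANSAC to reject bad matches.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(world_pts, image_pts, K, dist)
    return (rvec, tvec) if ok else None
```

Because everything comes through the lens, any camera that sees enough of the mapped scene can be tracked the same way, which is the property Tobin points at with the 100-camera example.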

Arraiy’s development team showing keying without green screen

Def: What does Arraiy offer the customer?

MT: We basically allow anyone to develop their own neural network for their own uses. At NAB Show 2019, we partnered with The Future Group to show our tracking solution, which many people are interested in. They have a great rendering graphics engine, so you can use MoSys or any other system. But if a customer wanted a fully integrated, software-based solution, they can buy the Pixotope virtual reality system with the Arraiy tracking embedded. We’d then license our tracking system into Pixotope.

Def: What’s in development and how else can AI help in this field?

MT: The next thing we’re developing is a segmentation rotoscoping solution, which essentially allows you to do Ultimatte-like green screen keying, but without a green screen – against any type of flat field or brick wall or anything – with depth matting as well. We’re looking to release that by the end of the year. Next year will be soft-object tracking, so we can potentially do motion capture without all of the sensors and all of the hardware around that.
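As a rough illustration of what keying without a green screen means in practice, the sketch below pulls a soft matte with an off-the-shelf semantic segmentation network and composites the foreground over a new background. It assumes torchvision’s pretrained DeepLabV3 and a person foreground; it is not Arraiy’s segmentation or rotoscoping model.

```python
# Rough illustration of "keying without a green screen": use a semantic
# segmentation network to pull a soft matte for a person and composite them
# over a new background. Off-the-shelf model for illustration only; this is
# not Arraiy's segmentation/rotoscoping solution.
import cv2
import numpy as np
import torch
from torchvision import models, transforms

model = models.segmentation.deeplabv3_resnet50(weights="DEFAULT").eval()
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

PERSON_CLASS = 15  # Pascal VOC class index for "person"

def composite(frame_bgr, background_bgr):
    """Key the person out of frame_bgr and place them over background_bgr."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        logits = model(preprocess(rgb).unsqueeze(0))["out"][0]  # (21, H, W)
    # Soft matte: probability that each pixel belongs to the person class.
    matte = torch.softmax(logits, dim=0)[PERSON_CLASS].numpy()
    h, w = frame_bgr.shape[:2]
    matte = cv2.resize(matte, (w, h))[..., None]
    bg = cv2.resize(background_bgr, (w, h))
    return (matte * frame_bgr + (1.0 - matte) * bg).astype(np.uint8)
```

The soft (probabilistic) matte is what makes this closer to Ultimatte-style keying than a hard segmentation mask: edge pixels blend between foreground and background rather than cutting out abruptly.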

“So if NBC wants to shoot a football game with 100 cameras, we can enable all 100 cameras to do tracking”

— Mark Tobin

Def: Tell me a little about Arraiy. What kind of capital have you raised?

MT: The company is based in Mountain View, so it’s really a Silicon Valley company – it’s a venture-backed company. Last year, we announced a $10 million Series A round of funding led by Lux Capital and SoftBank Ventures, with participation from Dentsu Ventures and Cherry Tree Investments, and continued participation from IDG Capital and CRCM Ventures.

Def: Who has shown an initial interest in these products?

MT: We’ve been talking with MPC and The Mill, who are both Technicolor companies, and working on how we can integrate into their virtual production. We don’t want to reinvent the whole pipeline – they have pipelines and embedded solutions – so we just want to be able to offer them a licence for our software. We’ve also been having conversations with companies like Avid. All these companies have great products and services, and we just want to enable them. I say that we are really bringing the physical world to the digital world; some people are doing this with hardware-based solutions, and we’re doing it 100% in software.
