Future Post Part 1
Posted on Nov 5, 2019
All the best technologists question the status quo, and in post-production, things have to evolve to manage the current torrent of content
Words Julian Mitchell / Caricatures Bruce Richardson
This future of post-production article is an exercise in crystal-ball gazing; everyone does it, and the answers you come up with usually depend on who you ask. We hope this article will be the start of a series looking at the future of different areas of the business. For this article we’re very happy to put questions to two experts in their fields: Lee Danskin, CTO of Escape Technology, and Zak Tucker, CEO of US-based post group HARBOR.
Definition: What service or technology will have the most impact on post-production in the next five years and why?
Lee Danskin: It is the cloud, to a degree. From a technology standpoint cloud and AI are the two things that will have the most influence over the next five years. I don’t see post houses as being machine learners in the first instance, I think they’ll be consumers of the learning outcomes. Most of the manufacturers are working on some sort of AI solution; there’s AI in Flame now and there are a lot of things coming from the likes of Adobe with their research that are all super interesting for the future.
Obviously, that requires a lot of compute, it requires a lot of ‘tin’, something that everybody out there is trying to move away from for all the right reasons. The technology is changing at such a pace now that you don’t really want to be using hundreds of thousands of pounds’ worth of whatever that is under your desk to read your email on. The more VFX-like work, the more Maya, Houdini-type workflows you have, the more you can specialise the compute resources that you need and then just consume them as and when you want.
If it’s just a traditional editing, broadcast kind of place, then their technology requirements are very different from a simulation VFX kind of house. Each one has its benefits and disadvantages, but within the next five years we’ll probably see the death of the spinning disk. We’re beginning to see that now with, for instance, 15TB SSDs and bigger capacities coming in the solid-state arena. That starts to offer up other potential in terms of central storage; a lot of companies are still working around single-gigabit networking to their machines and still use compressed file formats like ProRes and DNxHD rather than fully uncompressed video. A lot of people have been skirting around the cost requirements, but with Adobe Premiere now able to play native file formats straight off the camera, the only thing stopping this growth is technology and hardware cost. It’s super expensive right now but that will come down considerably; the days of hugely expensive HDR monitors, for instance, are coming to an end. There’s still going to be a requirement for a grade 1 monitor without a doubt, but soon you’ll be able to sit at a workstation and do an end-to-end production without having to dive in and out of other suites. So the editors of the world will become more than just editors, which will affect the broadcast world a lot more with the 4K agenda being pushed by the likes of Netflix, Amazon, Disney and Apple with their new subscription services.
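Danskin’s point about compressed delivery formats versus fully uncompressed video comes down to simple arithmetic. As a rough, illustrative sketch (the resolution, bit depth and frame rate below are just example values, not figures from the interview):

```python
# Back-of-the-envelope data rate for uncompressed UHD video.
# Illustrative only; real signals add blanking, audio and metadata overhead.

def uncompressed_rate_bytes_per_sec(width, height, bits_per_pixel, fps):
    """Raw video data rate in bytes per second."""
    return width * height * bits_per_pixel / 8 * fps

# UHD (3840x2160), 10-bit 4:2:2 (~20 bits per pixel), 25 fps
rate = uncompressed_rate_bytes_per_sec(3840, 2160, 20, 25)
print(f"{rate / 1e6:.1f} MB/s")      # ~518.4 MB/s
print(f"{rate * 8 / 1e9:.2f} Gb/s")  # ~4.15 Gb/s -- several times a gigabit link
```

Even this modest example saturates gigabit networking several times over, which is why uncompressed workflows push facilities towards faster storage fabrics or cloud-side processing.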
The cloud will be one of the tools in the arsenal required to address the scale of demand for content production and distribution
You will also see the legitimising of mobile screen technology with new panels from companies like Apple and its HDR-compatible screen tech. The excuse of it just going on mobile is going to go away very quickly especially with 5G on its way, so delivery to mobile devices is going to change the way you view programming and there won’t be any scaling up to achieve broadcast-standard resolutions.
There are also more pipeline technologies like Universal Scene Description (USD) from Pixar, an open-source scene format. That has been adopted by Apple with USDZ, so you’ve got a crossover there where film-quality assets can start to be used in AR and VR-type scenarios. You can then have multiple teams working not only on a pipeline within a building but, because of the way USDZ has been structured, via the cloud as well.
Zak Tucker: The sheer scale of demand for premium content creation is mandating really intuitive automation for appropriate post-production processes, coupled with quite high-touch, bespoke human QC, intervention and customisation. That’s why we don’t see it as a particular service or a particular technology but more of an approach that couples automation, where it’s appropriate, with humans overseeing it. You still have custom workflows, so not everything is a cookie-cutter result. AI will be part of the toolset, but I’m also talking about all sorts of other coding and scripting that can take processes we normally do by hand on a daily basis and automate them. The demand for premium content creation has gone through the roof and there are literally not enough people and not enough expertise to achieve it at that level if you don’t automate. The speed of turnaround will not be possible.
One example would be around VFX pulls, for instance, and retrieval automations. Especially on larger VFX features and episodics you can automate the pull process, which has been done to some extent already, but also automate the delivery back to the post house from the VFX vendors and the off-line suites, maintaining colour pipelines and metadata integrity so you can iterate and reiterate and deliver on time.
We have already implemented parts of this with our own coders and R&D teams. We can already automate the pull process with our own system for delivery to the vendors, and we’re really focussed now on retrieval from them. We’re also auto-conforming and auto-updating the offline suites, along with exposing the pull system to our clients so that we can get out of the way as well. Our clients can then do 24/7 pulls and retrievals without human intervention, except for the skilled human QC elements, troubleshooting and customisation.
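The core of the pull process Tucker describes is mechanical: take each shot’s cut range from editorial and expand it by handle frames before requesting media. As a rough sketch only (the shot names, eight-frame handle count and record layout are invented for illustration; HARBOR’s actual system is proprietary, and a real pipeline would parse an EDL/AAF and talk to asset management rather than a hard-coded list):

```python
# Hypothetical VFX pull-list generator: expand editorial cut ranges by
# handles so vendors receive extra frames either side of each cut.

HANDLES = 8  # extra frames per side; a common, but project-specific, convention

def build_pull_list(cut_list):
    """Turn (shot, first_frame, last_frame) cuts into pull requests."""
    pulls = []
    for shot, first, last in cut_list:
        pulls.append({
            "shot": shot,
            "pull_in": first - HANDLES,
            "pull_out": last + HANDLES,
            "frames": (last + HANDLES) - (first - HANDLES) + 1,
        })
    return pulls

cut_list = [("ep101_0010", 1001, 1056), ("ep101_0020", 1001, 1120)]
for pull in build_pull_list(cut_list):
    print(pull)
```

Exposing a function like this behind a client-facing service is what enables the 24/7, no-human-in-the-loop pulls Tucker mentions; the skilled work shifts to QC of the results rather than the retrieval itself.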
Zak Tucker is the co-founder and CEO of HARBOR. For more than 20 years, he has been strategically disrupting the industry to create one of the first end-to-end, cross-genre, independent production studios of its kind. Zak started as an editor for commercials and documentaries, but soon progressed to directing. He launched Swete Post in 2000 and in 2012 launched HARBOR. Zak built HARBOR from a staff of four and a footprint of 400 square feet, offering two services, into a staff of 100 and a campus of 70,000 square feet offering the full range of production and post-production services.
DEF: How will the cloud’s growth reflect on post-production’s performance?
Tucker: Going forward, the cloud will be one of the tools in the arsenal required to address the scale of demand for premium content production and distribution, and for post partners, archiving and retrieval. Some of the current barriers to full cloud adoption include egress costs and bandwidth costs, especially for enabling real-time, high-resolution, low-latency work. But it’s going to see more and more adoption, and it will allow for more remote collaboration and delivery of even higher-end processes.
Lee started his career at Alias as a senior applications analyst. He helped build and launch Maya 1.0 before moving to The Film Factory as VFX supervisor. He then joined Smoke & Mirrors as head of 3D before becoming deputy head of 3D commercials/broadcast for MPC. Lee moved to Escape Studios as training development director in 2006, where he redefined its training programme by making it more industry relevant. A year later he was made a Maya Master. Lee is now CTO of Escape.
DEF: Is the traditional post-production model still relevant especially when post companies are increasingly active in pre-production?
Danskin: We’ve visited quite a few post houses in the last six months to a year, and if you look at what a post house once was, especially the heavy VFX-type houses that once upon a time only did commercials or promos, now we’ve got people doing AR, VR, real-time game engines and so on. The whole convergence of being able to stream games, with services like Apple’s Arcade, is changing the traditional concentration on just video. That’s having a massive effect on the back-end pipelines and workflows; we once knew what a post house did, maybe commercials or promos or features. Now they are seen as more of a collaborative creative resource. Advertising campaigns are now much more rounded and fully fledged, and most of the post houses we deal with have had to evolve to reflect that, including diversifying to make up for the downturn in commercials work.
Post houses are also getting closer to the camera with the new virtual capture technologies appearing on-set. They now have a mixture of revenue avenues, including the rise of previs companies that have been using the viewports of certain 3D software and are now trying to use game engines to raise that bar even further. Then there is the whole on-set virtual production world, which is coming the other way, and the two are meeting in the middle nicely. The post houses have upped their game to achieve more real-time workflows because, from a business standpoint, they need to diversify. It has also allowed them to do more on-set.
Post houses have upped their game to achieve more real-time workflows because of the way they’re working from a business standpoint
Then there is the rise of films like The Jungle Book and The Lion King, where 90% of the film, or 100% in The Lion King’s case, is virtual. That type of process isn’t going to go away; it’ll increasingly be the route we go down. The art of what is possible has shifted quite considerably.
The classic cloud scenario is allowing post companies to not have to be based in London but perhaps have showroom houses there. You could have just review and editorial in town and take advantage of cheaper premises in many other places.
There is a huge amount of content to be made and in many ways longform TV is outstripping some of the quality levels required for film. Film is still done in 2K in most instances and content for Netflix is required at 4K. Moving the data around Europe for instance is not feasible so you’ve got to move the pixels, whether that’s remote workflows through the cloud or dark fibre between sites. That’s going to allow us to get the talent pool big enough to actually feed this huge demand for content.
Tucker: The traditional model for post-production houses is still relevant; there’s too much riding on workflows that have been very successful over time to throw them out. Post companies are increasingly moving upstream, including previs, near-set dailies, near-set editorial and even further into pre-production, even scriptwriting consultations and line-producing budget consultations.
That said, for us the most intriguing innovations are around facilitating less linear post-production workflows, where filmmakers have non-linear access and interactions with all their post teams and divisions; dailies, editorial, VFX, colour, sound, you name it. They are then more easily able to iterate with a real-time feedback loop so creative options are maximised and easily riffed on and explored. If you think about it, the main thing we do in post is the enhancement of the vision of the filmmakers. We’re working relentlessly to put options at the fingertips of the filmmakers. So yes, the traditional model isn’t going to drop off a cliff, there’s too much at stake for everyone to throw out what we’ve already done. That said we’re looking at and wondering how to customise the combination of the traditional model with a non-linear model so that we can address all the filmmaker’s needs.
A good example would be rather than waiting until the end of the show or lock-to-picture to do final colour correct, we’re final correcting as soon as there’s a rough cut. So by the time we get to lock-to-picture, not only do we have essentially final colour but the editors and directors have been working with final colour updated in their AVID projects all along. So the guide track actually reflects what the final show is going to look like. Those for us are the really interesting places as it allows a director, for example, to really riff between sound, colour and visual effects editorial. Then you don’t get a situation where the director finishes one process and moves on to the next without being able to go back and iterate the edit if they see something in sound that they would like to iterate later on in the process. If you keep it all open and iterative throughout the process they can play off each other. Time and again we have seen creative possibilities happen that wouldn’t have happened in a truly siloed, linear traditional process.
DEF: As software and desktop workstations become more powerful, how will technology pricing change? Will this lead to a more service-oriented equipment supply industry?
The traditional model isn’t going to drop off a cliff; there’s too much at stake for everyone to throw out what we’ve already done
Danskin: If you’re planning on starting up a post service, you may be looking at a place where you go and sit and do the work, with all the trappings of somewhere like Soho for instance. But if your idea of a post house is to use people all over the planet to achieve the work, then you won’t want to set up a whole company’s worth of kit for a project, because at the end of that project, whatever it is, you might want to give it all back.
If you wanted to work on a single project, you might want to achieve it in the cloud with remote workstations: no kit, just software that is activated with a log-in. You can log in wherever you are, use the software and pay only for what you use. At that point buying a desktop workstation is kind of a crazy thing to do; if you’re only going to want it for six months, you’ll rent a machine or use a cloud-based workstation, especially with the remote workflows now available.
We’ve seen that already with people all over the country accessing central storage through Amazon. The days of using Google Drive or Dropbox, with their fair usage policies, are over at these data sizes; you can’t use those services to share VFX content. So you’ll have people just sitting at home spinning stuff up; the idea of anyone working on a local desktop workstation in five years is crazy. Having central storage and being able to dial up the power of the machine as and when you require it is where we’re going to be. For me, yes, we’ll still sell desktop workstations, but it really depends on how businesses move forward. Everyone will eventually move to an online licensing model for their software needs. How far out that is, who knows, but the data sets are going to be so large that you’ll need render farms and some sort of AI compute. You’re not going to put that on one machine under the desk; it’s going to be multiple machines. If you’ve got an AI that’s doing roto perfectly and following instructions like “just give me the mattes for all these people in this shot”, you’re not going to run that on a local workstation, you’re going to run it on something with massive GPUs. Even a freelancer sitting at home is going to be using the cloud in a different way than we are today.