
Cloud Busting: What The Cloud Could Mean For Pro Video

Posted on Jul 18, 2012 by Alex Fice

The Cloud is upon us and can’t be ignored. But what does it mean for the professional video world and what services are available at the moment? JULIAN MITCHELL dips his toe in the cloud to find out

The ‘Cloud’ isn’t new; we use it every day through Facebook, Netflix, Google Drive and countless other applications, and we Mac and iOS users even have our own one with an ‘i’ in front of it. But although cloud computing is here to stay, the searing speed of innovation means there are revolutionary applications on the way that promise to change our computing lives irrevocably. Eric Schmidt of Google has called cloud computing potentially a bigger change than the dawn of personal computing; set aside the agenda and the corporate spin and he might just be right.

Every business now needs a cloud strategy, and consumers will embrace the cloud, many without knowing they’re participating in it. In a few years’ time the term cloud computing won’t even be used, because it will simply be the normal way of doing things.

But what does it all mean for the professional video industry? Many productions are already knee deep in cloud business and have been for a while: reviewing and approving rushes and clips online, backing up to servers somewhere in the world. This has been common practice for over ten years. Most post companies probably don’t think they need the cloud just yet, and if they do use it they follow their original workflows, with the cloud duplicating part of the process.

Apart from workflow mimicry, the cloud promises to reduce your capital expenditure by halting another round of new storage purchases. New IT-based companies are springing up who want to massage your budget requirements away from buying new racks of storage; that fact alone will keep your accountant interested!

Futurists love the cloud because they can pig out in tech heaven and postulate what could happen without fear of ridicule, but that merely shows the depth of the potential. They describe the Cloud as the end of the pursuit of local processing power, the end of the personal computer no less. The thinking is that there will be so much processing power in the Cloud that any computer device, like the ‘Thin Client’ idea of the Eighties, will be able to harness it.

But that processing future hardly requires a crystal ball: as we know, Moore’s Law is inescapable and the cloud is just another beneficiary of it. There is a more subtle change afoot that will initiate the herding of processing power into digital’s version of the power station.

Cloud’s latest power server: Tilera’s new S2Q 2U server’s I/O enables it to provide up to sixteen 10GbE interfaces and sixteen 1GbE interfaces without adding the power and cost of additional chipsets and networking cards. Eight nodes each contain a 64-core TILEPro64 processor, so that’s 512 cores providing up to 1.3 trillion operations per second, and when racked the system can offer up to 10,000 cores. Each server node draws a maximum of 35-50 watts.

Intel introduced the single-chip cloud computer idea in 2009, with 48 processing cores on a postage-stamp-sized slab of silicon. This year a company called Tilera announced a 144-core processor in a 1U rack space. Devesh Garg, co-founder, president and chief executive officer of Tilera, commented on their recent processor family launch: “From 60 million packets per second to 40 channels of H.264 encoding on a Linux SMP system, this release further empowers developers with the benefits of many core processors.”

It’s also a sign of the times that Tilera quote ‘performance per watt’ stats quite high up on the feature list. Many-core processors will inevitably appear in personal computers, but the thinking is that environmental pressure over owning processor cores that are not being used 24 hours a day will eventually come to bear on where you do your processing. Personal computers almost never work at their full processing capacity, whereas cloud racks run flat out. The move to more cloud processing will drastically change the form factor of personal computers; have a look at the new Chromebox from Samsung, a stripped-down personal computer for the cloud computing age.

It is futurist Christopher Barnatt who thinks that Cloud computing data centres will become the power plants of the information age, “While very few people will ever go near a very-many-core cloud processor, most of us will nevertheless access one many times a day”.

But the increase in data centres is a serious environmental problem; depending on who you talk to, the estimate is that nearly 2% of the world’s CO2 emissions come from these centres. So surely that will stop their growth dead in its tracks? But no, the cloud is cleverer than that and has turned from an energy pariah into a potential green champion, leading the eco way for other industries. [http://www.greendatacenternews.org]

Only this year the British company Verne Global decided to create, in Iceland, the world’s first data centre powered entirely by geothermal and hydroelectric energy. Traditionally, global companies had sworn off exploiting Iceland’s natural cooling abilities for their server farms because of the lack of significant connectivity. Now the Icelandic government has seen an opportunity to enhance its world standing (and make up for the country’s disastrous banking collapse) and is busy laying sub-sea cables to attract more investment from cloud computing start-ups.

The Verne data centre sucks in cool Icelandic air to chill its servers, which are powered by local geothermal and hydroelectric sources. Google are also considering parking their servers in the ocean to cool them and run them off wave power.

But these data centres are not infallible, as witnessed recently by customers of NatWest bank in the UK when a data centre in India failed, and by Netflix, Instagram and Pinterest, amongst others, when a storm in Northern Virginia struck an Amazon Web Services data centre and caused extended downtime. The data centre in Ashburn, Virginia that hosts the US-East-1 region lost power for about 30 minutes, but customers were affected for longer as Amazon worked to recover virtual machine instances. A multi-vendor approach for your data centre needs seems to be the safest way forward.

If you believe the many-core data centre argument, you can then lose yourself in imagining what all that processing could do for you. High resolution video editing is a given, heavyweight rendering is an obvious idea, grading and so on. Once you have the processing you just give it things to do, 24 hours a day. But think beyond the processor churn and look to augmented reality, another by-product of the ‘always on’ cloud. Visual recognition by cameras will produce a ton of information for the production. We already have lens data and GPS metadata; add to that real-time data for the DoP, who might be wearing contact lenses linked to the cloud (don’t laugh, they’re already here). The data would be overlaid as he designs his shots, perhaps offering references to similar shots he has done before or instructions from the post house.

The augmented reality future is so intense that the phrase ‘diminished reality’ has already been coined, using the same processing to block out things you don’t want to see: advertising as you walk past shops, or people’s Facebook and Twitter icons as you walk down the street.

The first instances of visual search are already appearing with software like Google Goggles for Android smartphones. Take a picture and the cloud will return information about it, either identifying the subject or giving you more detail. Google has also experimented with physical goggles connected to the Internet and the cloud, but they just look weird.

There are also less processing-heavy cloud adventures like crowdsourcing, or ‘asking hundreds of people what you should put in your camera’. Companies like Digital Bolex used a mixture of crowdsourcing and crowdfunding to launch their product. But is it a sensible approach, so your customers know exactly the product they are getting and the feature set they have paid for, or is it just cheap R&D?

The Cloud as a Utility

If Christopher Barnatt is right when he says that cloud computing is the next kind of power plant for the information age, then like power, cloud computing should be seen as a utility and treated accordingly as far as price and availability are concerned. Price structures for simple storage solutions like Amazon’s S3 servers are far too simple for an infrastructure that ramps up and down as you need processing. Cloud utility is instead priced in what services like Amazon call Instances: chunks, or snapshots, of server storage, processing and performance. For example, a Large and a Cluster Compute Eight Extra Large instance on Amazon’s elastic computing platform are described as:

Large Instance

7.5 GB memory

4 EC2 Compute Units (2 virtual cores with 2 EC2 Compute Units each)

850 GB instance storage

64-bit platform

I/O Performance: High

API name: m1.large

 

Cluster Compute Eight Extra Large Instance

60.5 GB of memory

88 EC2 Compute Units (2 x Intel Xeon E5-2670, eight-core “Sandy Bridge” architecture)

3370 GB of instance storage

64-bit platform

I/O Performance: Very High (10 Gigabit Ethernet)

API name: cc2.8xlarge

Each instance provides a predictable amount of dedicated compute capacity and is charged per instance-hour consumed.

In this way companies like Amazon, with their Elastic Compute Cloud (EC2), reflect the utility approach. It’s elastic because you only pay for what you use, as is normal with electricity and gas, and the pricing reflects this: “You can increase or decrease your compute capacity depending on the demands of your application and only pay the specified hourly rate for the instances you use”, says Amazon. In effect you can ramp your server ‘instances’ up and down in minutes, and Amazon also provides a calculator to work out your spend (see over page). There are choices of CPUs for larger computational requirements, an option aimed at the moment at scientific research such as computer-aided engineering, molecular modelling, genome analysis and numerical modelling. But it is just as easy to see this cluster processing working for 2K video and beyond.
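To make the instance-hour model concrete, here is a minimal Python sketch of the kind of sum Amazon’s calculator performs; the hourly rates below are hypothetical placeholders for illustration, not Amazon’s published tariff.

# Hypothetical illustration of instance-hour billing; the rates below are
# placeholders for illustration, not Amazon's actual prices.
HOURLY_RATES = {
    "m1.large": 0.32,      # assumed $/hour for a Large instance
    "cc2.8xlarge": 2.40,   # assumed $/hour for a Cluster Compute Eight Extra Large
}

def estimate_spend(instance_type, instance_count, hours):
    # Total cost of running instance_count machines of this type for the given hours each.
    return HOURLY_RATES[instance_type] * instance_count * hours

# A render job ramped up for a weekend: 20 cluster instances for 48 hours each.
print(f"${estimate_spend('cc2.8xlarge', 20, 48):,.2f}")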

Samsung’s new Chromebox Series 3 personal computer runs the Chrome OS and integrates with Google’s cloud services for Apps; storage is only a 16GB SSD.

At NAB 2012 Microsoft announced new media services to enable content providers and customers to realise the power of cloud computing. Microsoft’s new Azure service has a similar price structure to the other cloud companies, again charging for what you use and for the power of the processing you’re after. All processing has been broken down into server instances as the currency of the new cloud utility world. The Azure platform is quite late to the cloud but has the advantage that programmers will find the development environment familiar and can hopefully build their applications without much conversion time. The NAB announcement also included some media partners: high-speed transfers from Aspera Inc., content encoding from Digital Rapids Corp., ATEME and Dolby Laboratories Inc., content security from BuyDRM and Civolution, and video-on-demand streaming from Wowza Media Systems Inc.

Dedicated Services

The NAB Cloud Computing Conference this year had eight times as many attendees as last year, proof enough that the cloud is already making an impact on the professional video industry. At the same show Microsoft announced their Azure service for developing and running Windows applications and storing data in the cloud. NAB also saw the launch of a dedicated cloud service for the industry by a company called Aframe. Aframe’s founder David Pleto explained the reason for the launch: “I was a producer and founded a post production company in London called Unit. I tried to build Aframe from that point of view by asking some basic questions: what would I want to know before I trusted a company with my content? I would want to know that the people who built it understand the day-to-day ins and outs of shooting video; I don’t want to be putting my content with just a technology company that doesn’t understand my needs. I would also want to know that their sole focus day in and day out was the security of my content. I would also want to know that they have the spread to enable me to work internationally.”

Aframe obviously tick all those boxes and are in the middle of setting up re-sellers and ‘upload partners’ in a race to be the first choice when productions see the cloud as an asset. Interestingly, Aframe don’t rent space from platforms like Amazon’s EC2; they actually own their own servers, which gives them some leeway and some elasticity of their own.

A basic workflow popular with some of Aframe’s clients is one where the high resolution footage is uploaded to their servers, where proxy files are created and then used for the offline edit. The AVID project file is then used to identify the high-res files to download from Aframe.

If you ask an editor or production company whether they need a service like Aframe’s, they will probably mention that they already have access to FTP and have already rented gigabytes of storage for their back-ups. They will also ask why they would want to upload their high resolution footage to Aframe just to download a lot of it again when they have perfectly adequate local storage.

But it’s early days as David points out, “You don’t have to use that workflow, plenty of people are dipping their toes in and just using us for review and approval of either the raw assets or storing the high res files with us adding their timecode-based comments and everything else and then using us like a paper log sheet as usual but with the knowledge that all their media is backed up in the cloud.”

David explains this workflow as a bit like doing the equivalent of a tape conform for you: they read the EDL and give you the high-res sections, with handles, that you need to re-link in the AVID, and then you finish locally. A full conform in the cloud wouldn’t be practical because you’d have to have all the assets, plug-ins, fonts and graphics available, and that would be hugely complex. David sees a time when you would be using plug-ins in the cloud; in fact he has already had talks with some of the developers. But before that he sees a time when you would have your editors in the cloud – maybe an agreement with Adobe is beckoning.

“If you saw Adobe’s technical demo of their new Creative Cloud at NAB you would think that the thing was nuts! It’s a full Premiere Pro running in the cloud and you’re editing off high res files all the way through and then you are doing your conform in the cloud! I can see us working with Adobe like that handling all the media and transcoding in the background, that’s a really exciting future.”
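Going back to the EDL-and-handles pull David describes above, a minimal Python sketch (not Aframe’s actual implementation) shows the basic logic: widen each EDL event’s source in and out points by a handle length before fetching the high-res media for re-linking.

# Illustrative only: pad each EDL event with handles before pulling high-res media.
# The event fields and the 10-frame handle length are assumptions for the example.
HANDLE_FRAMES = 10

def pull_ranges(edl_events, handle=HANDLE_FRAMES):
    # Return (clip_name, first_frame, last_frame) for each event, widened by handles.
    pulls = []
    for event in edl_events:
        start = max(0, event["src_in"] - handle)
        end = event["src_out"] + handle  # a real system would clamp this to the clip length
        pulls.append((event["clip"], start, end))
    return pulls

events = [
    {"clip": "A001_C003", "src_in": 1440, "src_out": 1680},
    {"clip": "A002_C011", "src_in": 96, "src_out": 250},
]
for clip, first, last in pull_ranges(events):
    print(clip, first, last)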

Aframe have recently launched into the States and are currently concentrating on finding these ‘upload partners’. Signed up are companies like AbelCine, with locations in New York and Burbank, DigiNovations in Acton, MA, Postworks in New York, Hula Post Production in Burbank and Rule Broadcast Systems in Boston. These centres will have gigabit lines for uploads and will hopefully become beacons for those production companies who want to use cloud computing for their files.

Aframe have a great opportunity, through these relationships with upload partners, to become the de facto standard for working in the cloud for professional video. They are also soon announcing a relationship with a camera manufacturer that would theoretically push the name out to a wider audience. David wasn’t saying what the agreement was or with whom, but presumably you would buy a camera and with the purchase get a few months of Aframe cloud computing. That idea would make sense given the huge publicity drive that Aframe are currently undertaking.

Infrastructure First

Most heavyweight IT companies should by now have their cloud computing strategy sorted out and have started targeting the digital media and entertainment industry as a growth area; after all, data is data. But for professional video you are probably better off looking at a solution that has sympathies with our industry. C4L (Connexions for London) has started a company called MiMiC, which is looking to use C4L’s existing infrastructure as a private media cloud for digital media. They brought in consultant Ben Dair, who used to work for Autodesk and AVID, to identify what the market needed and what they could offer it in return. Ben explained MiMiC’s offering: “We identified that there was high connectivity and high capacity storage for tier-2 and tier-3 applications. What we have been doing since we started is deriving services from that. Everything associated with MiMiC is effectively a dedicated private secure service designed for the media and entertainment industry.” One of their primary goals is to offer an alternative to pricey local storage expenditure. Instead of spending hundreds of thousands of pounds on new storage, you effectively rent connectivity and ‘mass scale out’ storage through them, again as part of a utility model. So it’s not capital expenditure but operating expenditure.

MiMiC’s philosophy initially is to mimic people’s workflows, hence the name. Their research showed that one of the services not usually costed by post studios is archiving, and they realised that with the huge increase in digital content coming through the door, studios would soon be handing archived material back to the content owners or charging significant amounts to keep it archived. This is a definite potential market for them and one that seems easy to convert to.

MiMiC’s credentials are impressive and their offer equally so, with connectivity up to 10Gbit/s, which in practice is effectively a 350MB/s transfer rate, where a 1Gbit/s circuit would offer you something like 80MB/s. So you can see for yourself how the system would work if you wanted to deal with digital rushes you need to store, transcoded media, or simply to move assets around. With those kinds of speeds you could theoretically transfer up to 30TB a day (7TB with the 1Gbit line). In a real-world situation this kind of transfer rate starts to make sense, especially if you are archiving to LTO as many facilities do. The interesting fact is that an LTO transfer, although local, will take longer than a transfer to MiMiC’s cloud. Plus you get all the added benefits like file system metadata search and collaboration tools.
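Those transfer figures are easy to sanity-check; here is a quick back-of-the-envelope calculation in Python, assuming the quoted rates can be sustained for a full 24 hours.

# Rough throughput check: sustained MB/s over 24 hours, expressed in decimal terabytes.
def tb_per_day(mb_per_second):
    return mb_per_second * 60 * 60 * 24 / 1_000_000  # 1TB = 1,000,000MB

print(tb_per_day(350))  # roughly 30TB a day at 350MB/s on the 10Gbit/s line
print(tb_per_day(80))   # roughly 7TB a day at 80MB/s on a 1Gbit/s circuit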

Top Heavy Post

The move to digital capture has turned what was a ‘final mastering’-heavy post process into one that is now very front-heavy, with a multitude of digital rushes to sort, including metadata handling and transcoding. Some say the handling of digital rushes is still in the stone age, both in the practice of dealing with them and in the way that process is costed. This is where these IT-based companies are looking to take the load off. But there is a difference between companies like Aframe and MiMiC. MiMiC are very much a low-level infrastructure company looking to partner with more Software as a Service (SaaS) type applications. They still offer access to the metadata, which will become one of the main challenges once you have online media, and they are also able to leverage processing power through C4L.

Amazon’s EC2 online calculator helps you calculate your storage, processing and transit needs: http://aws.amazon.com/ec2/pricing/ and http://aws.amazon.com/s3/pricing/

If the use of the cloud has whetted your appetite, hold fast, because MiMiC’s Ben Dair has some notes of caution as far as pricing is concerned. “Amazon are providing elastic computing with their EC2 and S3 platforms and their whole charging model is very granular – you can get stung, especially on storage. You’ve got the transit costs, storage costs and then you’ve got the return. So there’s about three different ways of being charged. We give you a 10Gbit link and tell you how much it costs a month and you can use it as much as you need to. Here’s the storage, say 100TB, and you use it as much as you need to. We don’t charge for transit, so keep the pricing model very simple. Traditionally a lot of telecom and IT providers make it really complex.”
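To see how those ‘three different ways of being charged’ add up against a flat tariff, here is a minimal, hypothetical cost model in Python; every rate below is an invented placeholder, not any provider’s actual pricing.

# Hypothetical comparison of granular per-GB charging against a flat monthly fee.
# All rates here are invented placeholders, not real provider prices.
def granular_monthly_cost(stored_gb, uploaded_gb, downloaded_gb,
                          storage_rate=0.10, in_rate=0.00, out_rate=0.12):
    # Storage + transit in + transit out, each billed separately per GB.
    return stored_gb * storage_rate + uploaded_gb * in_rate + downloaded_gb * out_rate

def flat_monthly_cost(link_fee=3000.0, storage_fee=2000.0):
    # One price for the link, one for the storage pool, no transit charges.
    return link_fee + storage_fee

# A heavy month of rushes traffic: 40TB stored, 30TB uploaded, 20TB pulled back out.
print(granular_monthly_cost(40_000, 30_000, 20_000))
print(flat_monthly_cost())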

MiMiC aren’t pretending to offer the whole package and are actively looking for partners who might want to run applications in the cloud but need the ‘grunt’ of a proven infrastructure that can scale with added virtual machines. Transcoding companies would be a good fit initially.

So should you join the cloud as part of your company’s IT strategy? Yes, definitely. You can start by just using a service for review and approval, loading up rushes with a company like Aspera On Demand. You could then go further and transcode in the cloud using a company like Zencoder. “Video encoding belongs in the cloud. Our customers want to take advantage of the parallel encoding that we offer through Amazon Web Services,” said Jon Dahl, CEO of Zencoder. “Using Aspera, they can get their large media files into Amazon S3 fast, and the resulting transcoding workflows are 8-10x faster than the time it typically takes to prepare new titles for worldwide distribution.”

Our advice is to pick your supplier with care and treat these companies as you would a traditional utility supplier, whose tariffs are often deliberately and incredibly complex. But at the very least, dip your toe in that cloud.

 

World’s Toughest Truckers ended up ingesting 26TB of rushes, which were then available globally.

How Aframe’s Cloud Helped The Truckers

World’s Toughest Truckers pitted eight of the world’s most talented truck drivers against each other in some of the harshest places on Earth, but it was also one of Aframe’s toughest projects due to the sheer amount and variety of footage produced. Dragonfly’s production team were concerned about transcoding, handling the massive amount of media and making the footage discoverable when they needed to organise the rushes.

Dragonfly chose Aframe to handle the job based on their ability to handle huge amounts of multi-format footage whilst making it searchable via their tagging and logging service.

Aframe ingested the raw rushes files then transcoded them to a common web proxy format so the whole team could access the media online, wherever they were located.

To ensure they got maximum value from Aframe’s tagging service, Dragonfly worked closely with the tagging team to create a ‘Truckers logging bible’ that detailed the precise way the team needed the footage to be logged. It listed naming conventions to be used when describing the main characters, what kind of logging detail was required for each camera type, time of day, whether the scene was interior or exterior and why the sequence was good.

For example, a typical log might be: Daytime. Australia. Red Truck. Interior. Cab. XF305. Discussion. Shane “We’re running really low on fuel”. Zola “I hope we can make it to the next gas station” GOOD TENSION.
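A minimal sketch (not Aframe’s actual data model) of how entries from such a logging bible might be structured so the production team can filter footage instantly; the field names are assumptions for illustration.

# Illustrative only: one way to structure logging-bible entries for instant filtering.
# Field names are assumptions, not Aframe's schema.
from dataclasses import dataclass

@dataclass
class LogEntry:
    clip_id: str
    time_of_day: str
    location: str
    camera: str
    interior: bool
    characters: list
    note: str
    rating: str

log = [
    LogEntry("A014_C002", "Daytime", "Australia", "XF305", True,
             ["Shane", "Zola"], "Running low on fuel", "GOOD TENSION"),
]

# Find all interior XF305 clips featuring Shane.
hits = [e for e in log if e.camera == "XF305" and e.interior and "Shane" in e.characters]
print(hits)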

Dragonfly’s Head of Production Janet Smyth: “The ability for the production team to start working on the footage right away was amazing because we could start organising the footage as it was ingested at Aframe’s ingest point in London. Once this footage has been combined with the logging, we could find everything we needed in an instant. The time and budget we saved during this pre-edit phase was massive.”

The production was a large multi-camera shoot with multiple format transcode. Ingest was via hard drives and cards with about 4-5TB of rushes per episode. Camera formats were Canon XF305, GoPro, Sony Minicam and Sony XDCam.

Each episode’s rushes were ingested at once. Eventual total storage was 30,000 clips, 26TB, 4,000 hours, mixed camera formats, full resolution and proxies.

Full transcription tagging was on all XDCam and XF305 footage, with lower density tagging on Minicam and GoPro footage.

The footage was stored in London and shared with production team members around the world.

Already In The Cloud

EDITING

Novacut: Novacut is a professional video editor with an open development structure. The video editing software will be free of charge; Novacut’s business model is based on its distribution platform, not its editing software. Along with the video editor, they’re creating a central hub where artists can distribute their full-length narrative features, web serials, animated shorts, documentaries, etc., while simultaneously crowdfunding their work. Novacut will take a cut of the revenue artists make to pay developers, maintain their cloud, etc.

http://blog.novacut.com/

WeVideo: WeVideo is a basic browser-only video editor for the YouTube generation, or perhaps a journalist’s tool for throwing some clips together at a moment’s notice. The app is part of Google Drive and you use your Google storage to feed the editor. There are basic transitions and some effects, with export to YouTube and other sharing sites. There is a paid-for version for higher resolution work.

http://www.wevideo.com/

Forscene: Provides professional editing tools in a SaaS (Software as a Service) package for all video industries, including broadcast post production, news and corporate professional environments.

http://www.forbidden.co.uk

QTube: Quantel’s QTube uses the cloud to enable editing in remote locations and actual finishing of programming in a collaborative manner. QTube has been designed primarily for Quantel’s broadcast customers and is based on their sQ server, the most prominent sector of Quantel’s market. But this is only a start, and soon Quantel will be talking about QTube for other sectors and other storage devices.

http://www.quantel.com/qtube

CLOUD HOSTING AND PROCESSING

ADOBE Creative Cloud: Adobe’s big NAB announcement this year was the start of their cloud service. You pay on a subscription model and then have access to all of the CS6 desktop apps, which you have to download to work with. You also get 20GB of storage for your CS files (and any other files) and are able to sync your local files with the cloud. Adobe did show some processing with Premiere Pro CS6 at NAB, but this was just a tech concept demo. It did, however, show the way forward for potential remote working for editors.

http://www.adobe.com/products/creativecloud.html

Aframe: Aframe launched in the US this year and is a fully fledged cloud partner to the professional video industry. Is helping many productions initially with their ingest and transcoding of digital rushes but will be able to offer more sophisticated processing as time goes on. Signing up ‘upload partners’ and re-sellers globally to try and steal a march on its competitors, which are usually major IT companies like Amazon and Microsoft. Has a camera manufacturer tie-up in the wings to further the cause.

http://www.aframe.com

MiMiC: A start-up infrastructure company offering private cloud services for the media and entertainment industry. MiMiC’s USP is its massive connectivity abilities and straightforward price structure where you don’t get hidden charges for transit costs. Looking to partner with SaaS companies who may have cloud apps but not the back office grunt. Able to move about 30TB of footage around per day with their top line 10Gbit/s lines.

http://www.thisismimic.com/
