
5G, 6G and the future of media production

Posted on May 3, 2025 by Admin

Adrian Pennington explores what 5G and 6G might mean for the future of media production

A couple of years ago, a short film was made by shooting scenes in real time with actors who were 280 miles apart.

The experiment – a first for the UK – demonstrated the possibilities of virtual production over 5G, and those behind the project are now advancing to the next stage, which involves 5G and cloud-based remote collaboration.

“We’re interested in how cloud tech and telecoms can support functionality of studios and interoperability across a wider network,” explains Abertay University’s Professor Gregor White, lead of CoSTAR Realtime Lab.

CoSTAR is a £75.6 million government-funded programme of five interlinked UK labs, fusing the film and TV industry and the latest VP equipment with academic research into AI, video gaming, robotics and immersive sound. Four labs have already opened, with one in Pinewood launching in 2026.

“Our focus is on how real-time technologies can be built into production processes and pipelines,” reveals White. “There’s still quite a lot of soldering and sellotape involved in making volume stage work. We’re looking to plug that gap with software, so you don’t need an army of programmers to be able to implement your creative vision.”

The Realtime Lab, based in Dundee, was behind the original VP-over-5G proof of concept. It draws on Scotland’s world-leading expertise in gaming (Minecraft mobile developer 4J Studios is based in Dundee), linking motion-capture studios at Abertay University with studios at Edinburgh University, Water’s Edge Studio, Scottish Enterprise and more.

Now it’s working with Amazon Web Services (AWS) to build the Cloud Lab, “which will be a virtual doppelgänger of the physical studio,” White explains. “We could then extend across the CoSTAR network (of studios in Yorkshire, Belfast and London) and facilitate people from anywhere in the world to come in and collaborate virtually.”

The proof of concept in 2023 demonstrated that 5G could reduce latency between locations, so that actors could perform scenes ‘naturally’.

“That was the penny-drop moment for me, in terms of what CoSTAR Realtime Lab could do. We proved that we could send camera telemetry back and forth over the network, but it was also a very static scene. Now we’re asking how we can extend that functionality so that somebody in LA could be acting with someone in London for a virtual scene in a way that looks naturalistic in terms of performance, as well as coherent in terms of colour, lighting values and dynamic lighting effects.”

There are knock-on benefits in terms of energy use and carbon footprint, “particularly if you’re not flying people around the world for 20 seconds of work.”

The CoSTAR National Lab at Royal Holloway is a partner in the concept, along with AWS and AIX Live, which is based at Media City. “One of the challenges of managing distributed live events is that video moves slower than data. We are working with them to build a technology that allows us to synchronise video and data streams.”
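The matching problem here is essentially one of timestamp alignment: the telemetry describing a frame tends to arrive before the compressed video of that frame. The article doesn’t detail how AIX Live or CoSTAR approach it, but a minimal sketch in Python, assuming both streams are stamped against a shared clock reference such as PTP, could look like this (all names are illustrative):

from collections import deque

class StreamAligner:
    """Buffers telemetry samples and pairs each video frame with the
    sample closest to its capture timestamp."""

    def __init__(self, max_buffer: int = 1000):
        # Each entry is (timestamp_s, payload); telemetry usually lands
        # ahead of the encoded video it describes, so it is buffered here.
        self.telemetry = deque(maxlen=max_buffer)

    def push_telemetry(self, timestamp_s: float, payload: dict) -> None:
        self.telemetry.append((timestamp_s, payload))

    def match_frame(self, frame_timestamp_s: float):
        # Return the buffered sample nearest to the frame's capture time,
        # assuming both clocks are locked to a common reference.
        if not self.telemetry:
            return None
        return min(self.telemetry, key=lambda t: abs(t[0] - frame_timestamp_s))

In practice, the matched telemetry would drive graphics, camera tracking or lighting state at the moment its frame is displayed, rather than whenever it happens to arrive.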

BT is another partner; it’s putting in a 5G network at each CoSTAR site and advising on the next generation: 6G.  “Once you get into moving large volumes of data around at the resolution required for high-end production, it’s going to be a challenge for network technology,” White says. “We’re working to increase capacity and connectivity between the labs. The laws of physics won’t change with the next generation of telecoms (the speed of light will remain the same) so there are some immovable objects in the system, but the richness of data we’ll be able to transfer and quality of content created will be enhanced by the increased capacity and functionality of 6G.”
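White’s point about immovable physics can be put into rough numbers. The following back-of-the-envelope sketch, assuming an approximate great-circle distance between London and Los Angeles and light travelling at around two-thirds of its vacuum speed in optical fibre, shows the latency floor that no generation of telecoms can undercut:

GREAT_CIRCLE_LONDON_LA_KM = 8_750      # approximate great-circle distance (assumption)
SPEED_OF_LIGHT_KM_S = 299_792          # speed of light in vacuum
FIBRE_VELOCITY_FACTOR = 0.67           # light travels at roughly 2/3 c in glass fibre

one_way_s = GREAT_CIRCLE_LONDON_LA_KM / (SPEED_OF_LIGHT_KM_S * FIBRE_VELOCITY_FACTOR)
round_trip_ms = 2 * one_way_s * 1000
print(f"Theoretical minimum London-LA round trip over fibre: ~{round_trip_ms:.0f} ms")

That is roughly 87 ms of round trip before any radio access, switching or encoding delay is added, which is why the gains White anticipates from 6G centre on capacity and the richness of data transferred rather than on beating latency itself.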

Several high-profile live events have already demonstrated the power of 5G production workflows, including BBC and ITV coverage of the coronation of King Charles in 2023 and of the Olympic torch relay around France. Routing signals over bonded cellular transmitters enables remote camera control, a reduction in on-site staff and lower overall production costs.

CoSTAR Realtime Lab’s sites are researching the potential of live feeds in film. Image: Alex Holland/CoSTAR, University of York

5G for live and real-time collaboration today

Despite its potential, 5G adoption has been slow, largely due to a few key challenges. Public 5G networks are often optimised for downlink-heavy use cases (like video consumption), making them poorly suited to the high-uplink demands of live broadcasting.
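To put the uplink problem in perspective, here is an illustrative calculation; the bitrates are assumptions for the sake of the example, not figures from the article:

UHD_CONTRIBUTION_MBPS = 50            # assumed bitrate for one HEVC UHD contribution feed
PUBLIC_5G_CELL_UPLINK_MBPS = 100      # assumed shared uplink of a downlink-optimised cell

feeds_per_cell = PUBLIC_5G_CELL_UPLINK_MBPS // UHD_CONTRIBUTION_MBPS
print(f"A downlink-optimised cell might sustain only ~{feeds_per_cell} UHD camera feeds")

On those assumptions, a single public cell shared with every other user accommodates only a couple of broadcast-quality camera feeds, which is why private networks configured for uplink-heavy traffic are attractive.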

Traditional private 5G solutions also tend to be complex and costly, requiring specialised knowledge and infrastructure such as dark fibre connections, according to Jacky Voß, executive director of strategy & innovation at Riedel Communications. “These barriers have made it difficult for many organisations to adopt 5G for professional media production.”

Riedel says its private 5G solution, Easy5G, will make ‘enterprise-grade 5G’ available to media productions and live events. Other features include network slicing, SIM-based secure access and expandability with additional base stations. “Unlike traditional carrier-grade systems, Easy5G allows general IT personnel to deploy and operate the network via a simple web interface. It runs on standard IT networks – no dark fibre required – and is cost-effective,” explains Voß.

For music and performance arts, high-bandwidth, low-latency wireless telecoms enable untethered audio and video transmission.

“It’s ideal for interactive or remote collaborations. Artists who are based in different locations can perform together in real time,” continues Voß. “In narrative and virtual productions, 5G enables mobile and immersive camera systems, AR integrations and live audience interaction. It also allows for new kinds of immersive audience experiences, such as fancam feeds and real-time mobile content streaming within venues.”

While 6G is still a few years away, it’s expected to bring transformative capabilities for content creation and distribution. Expect even faster data speeds, ultra-low latency and enhanced integration with AI and edge computing.

“For creators, this could unlock near-instant cloud rendering and fully immersive virtual experiences,” Voß says. “Global real-time collaboration with almost no delay becomes possible.”

Riedel’s Easy5G is built on standardised technology, and it’s been designed to evolve alongside future 3GPP releases, meaning that, as 6G becomes viable, Easy5G can adapt, ‘minimising disruption and maximising long-term value’. CoSTAR also has its eye on how next-gen tech like AI-powered 6G will change how content is consumed.

“Can these kinds of immersive environments be distributed the same way TV is broadcast now?” White poses. “Experiences like ABBA Voyage are currently dependent on physical location, but will you soon be able to experience something similar in your living room? In terms of narrative and performance, the next gen of production, distribution and display tech promises richer, more emotional and compelling experiences than anything we’ve achieved up until now.” In short, 5G and 6G support more immersive, mobile and collaborative storytelling formats: bring it on.

This article appears in the April 2025 issue of Definition
