LED: Show your true colours
Posted on Sep 12, 2022 by Samara Husbands
Not all LEDs are equal, causing headaches for standardising output. We speak to the companies offering solutions to an industry-wide challenge
Words Phil Rhodes
Between low-energy lighting and virtual production, it’s fair to say that Isamu Akasaki, Hiroshi Amano and Shuji Nakamura’s late-eighties and early-nineties work on blue LEDs has brought a lot to film and TV. But as the proliferation of measurement standards suggests, it’s also provoked some complexity around how light and colour actually work.
It’s tempting to assume that an LED video wall might be all the light a scene needs, given that the interactive play of light on real-world foreground objects is a large part of the appeal. That approach was used heavily for space scenes in Gravity, long before LED volumes for in-camera visual effects became popular. As Jeremy Hochman of Megapixel VR suggests, though, light cast by the screen is best used for its interactive properties only.
Megapixel has built displays for high-end clients all over the world – both for conventional LED wall applications and some big-name, big-screen clients – using approaches that would become almost an industry standard. “In the virtual production space,” Hochman begins, “First Man was the first to use an LED volume. It deployed a tile based around our team’s engineering specification, along with our processing.” In the context of LED video walls, processing is the combination of rack-mount hardware and electronics in each panel, which bridges the device producing the video image and the LEDs themselves. Megapixel’s system, Helios, is designed with a lot of flexibility to synchronise timing and keep colour accurate.
Still, no matter how good that processing is, colour performance is limited by the performance of the LEDs themselves, which are best not relied upon as a principal way to illuminate things – particularly people. “Colour quality in LED video walls works for dynamic lighting effects, but it’s not the best as a proper illumination device. Great, successful uses are for driving shots with bright neon, traffic lights and things like that – but RGB LEDs can’t properly mimic sunlight, or broad-spectrum illumination,” Hochman confirms.
Megapixel has some new ideas in development around the concept of adding white emitters to LED display pixels. Emitters which use a blue LED to illuminate a phosphor, creating white or coloured light, generally produce a broader, smoother spectrum than those which don’t – though at a cost in efficiency and sheer output. Putting that technology in a video wall panel might let the resulting display put the principal cast in a suitably flattering light, although until that sort of technology is widespread – and probably even after that – virtual production is likely to require more conventional lighting as well.
It almost seems uncharitable to refer to state-of-the-art modern LED lighting as ‘conventional’. Even common designs are often expanding beyond red, green, blue and white in their keenness to achieve high colour quality. Rosco DMG’s original light engine involves six different emitters, including a phosphor-converted red and, as the company’s Charlie Verne says, “Cameras are more pleased by the phosphor LEDs, because they tend to be out of gamut a lot less. Much of my job is speaking to gaffers and board ops… and colourists, because they’re directly influenced by what we make.
“The engine has been in development since 2017, and we came out with the SL1 Mix in 2018,” Verne continues. “When it was released, it was fairly limited in terms of colour temperature range, and in software we’ve improved it quite a bit. For the DMG Dash, we managed to cram everything into a five-inch design.” That kind of engineering decision often involves a straightforward compromise of colorimetry and power. “You can sacrifice quality for output; you can create output with TLCI, CRI or TM30 that’s much worse – the better the colour rendering, the lower the efficiency. The best answer is found in the middle.”
With all this technology, Rosco is keenly aware of its relevance to virtual production, and particularly the idea of lights with individually controllable subpixels for integration with the video display. “We’re working on profiles for virtual production,” Verne confirms. “We’d love to do some pixel lights down the road.” Meanwhile, the practicalities of selling lights to rental houses never get any easier. “We focus a lot on colour, but also build quality. We’re looking at a design for a future product and the question was: should we choose an engine that’s better in this and that way, and only lasts 7000 hours, or this one that’s a compromise in a few ways, but lasts 20-30,000 hours? We went the route of choosing not to drive our LEDs too hard.”
Plugging the gaps
Manufacturer ETC, with a foothold in both live events and screen work, released its fos/4 and Source Four LED Series 3 just before the pandemic. The eight-primary design includes a deep red intended to, as ETC’s Declan Randall puts it, “get as close to being able to emulate the warmth and depth of tungsten as we have ever been with an LED source”.
“The problem with typical RGBW arrays is that the RGB portions are all narrow-band emitters, so you end up with gaps in the spectrum – colours simply not represented. Adding in a white helps to a certain extent, but not fully – there are still huge portions of the spectrum not represented,” Randall adds.
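The gaps Randall describes can be illustrated with a toy model: treat each narrow-band emitter as a Gaussian in wavelength and sum the three. The centre wavelengths and bandwidths below are typical textbook values for narrow-band LEDs, not figures from ETC or any real product.

```python
# Toy model of an RGB LED spectrum: each emitter is a narrow Gaussian.
# Centre wavelengths and FWHM bandwidths are illustrative only.
import math

def gaussian(wl, centre, fwhm):
    sigma = fwhm / 2.355  # convert FWHM to standard deviation
    return math.exp(-((wl - centre) ** 2) / (2 * sigma ** 2))

def rgb_spectrum(wl):
    # Typical narrow-band red, green and blue peaks (~20-30 nm FWHM)
    return (gaussian(wl, 630, 20)
            + gaussian(wl, 525, 30)
            + gaussian(wl, 465, 25))

# Crude text plot of the combined spectrum across the visible range
for wl in range(400, 701, 25):
    bar = "#" * int(20 * min(rgb_spectrum(wl), 1.0))
    print(f"{wl} nm {bar}")
```

Sampling the sum shows deep troughs between the peaks – around cyan (~490nm) and yellow (~575nm) – which is exactly where objects of those colours go unrepresented under a pure RGB source.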
One difficulty is that many of the standards used to measure and assess colorimetry were designed to represent the human eye, and cameras don’t always see colour the way the eye does – so better colour quality makes unexpected problems on camera less likely. “Not all camera sensors are created equal,” Randall explains. “Each manufacturer has its unique sensor recipe, which is what makes DOPs select a particular camera for a job. Having full-spectrum lights means – irrespective of camera – you can capture good light.”
It’s just the sort of technology that might find its way into one of the sensory extravaganzas created by Disguise, a company whose demo reel suggests that its technologies could conjure something interesting out of more or less any piece of production technology that emits light. Disguise is, therefore, pretty much a natural fit for virtual production – but also for live and broadcast events, which increasingly leverage some of the same technology, if not in entirely the same way.
The company draws together a huge number of disciplines from the large and small screens, as well as concert and theatrical work. It’s one of the few companies still talking about projected images; if the approach worked for Oblivion, it ought to work elsewhere too. While many of Disguise’s most spectacular set-ups have brought together a variety of techniques, virtual production is no less complicated in terms of integrating camera tracking, real-time 3D rendering and vast displays. The company’s portfolio includes a recent appearance at Cannes, simulating car chases for Anka, and a commercial for Shutterstock in which actors wander a world made from history’s prettiest stock footage.
In the end, both virtual production displays and lights are trying to solve the same problems, and their capabilities increasingly overlap. How good the results become depends not only on the technology, but also on how good it needs to be. With the complexity of modern designs visibly increasing by the month, it’ll be interesting to see just how clever all this ends up becoming, before the famously fastidious film industry decides to relax and enjoy the show.
Pictures for video walls can come from a variety of sources, although most will use either carefully prepared live-action material – often relying on at least some compositing work – or real-time 3D-rendered pictures. The latter can involve racks full of high-specification computers running something like Unreal Engine with careful synchronisation. In the end, though, the result is the same: one or more SDI or DVI connections which need connecting to a large array of LED video wall panels. The hardware that manages the division of the signal, colour processing, calibration and other key parts of the process is usually a rack-mounted processor that communicates with compatible hardware in each panel via fibre networking.
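At its simplest, the processor’s job of dividing one incoming frame across many panels is a tiling operation. The sketch below assumes made-up panel dimensions and a single UHD input; real processors also apply per-panel colour processing and calibration along the way.

```python
# Minimal sketch: divide one video frame into per-panel tiles.
# Panel resolution here is an invented example; real tiles vary by product.
FRAME_W, FRAME_H = 3840, 2160      # one incoming UHD feed
PANEL_W, PANEL_H = 192, 216        # pixels per LED panel (illustrative)

def tile_map(frame_w, frame_h, panel_w, panel_h):
    """Return the (x, y) origin of each panel's slice of the frame."""
    tiles = []
    for y in range(0, frame_h, panel_h):
        for x in range(0, frame_w, panel_w):
            tiles.append((x, y))
    return tiles

tiles = tile_map(FRAME_W, FRAME_H, PANEL_W, PANEL_H)
print(len(tiles))  # 20 columns x 10 rows = 200 panels for this wall
```

Each origin in the map tells the processor which rectangle of the source frame to route, over fibre, to which panel’s interface card.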
One of the world’s most prominent suppliers of both processors and the associated panel interface cards is Brompton Technology. Its Tessera series offers features that not only solve problems, but tell us a lot about the challenges involved in getting the best out of large, expensive arrays of LED video panels.
Calibration, for instance, is crucial to stop the inevitable manufacturing variations between panels showing up as visible rectangles in the display. Calibration can have a lowest-common-denominator effect, dimming the whole display to a level achievable by the least capable emitters, but Brompton’s dynamic calibration maintains enough awareness of each panel’s capability to squeeze as much performance out of the array as possible. Other proprietary tricks involve better handling of wide colour gamuts and HDR.
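The lowest-common-denominator approach can be sketched in a few lines: measure each panel’s peak output, then scale everything to the dimmest panel so the joins disappear. This is a deliberate simplification for illustration – the panel names and brightness figures are invented, and Brompton’s dynamic calibration is proprietary and considerably more sophisticated.

```python
# Naive brightness calibration: scale every panel to match the dimmest one,
# trading peak output for uniformity. All values are illustrative (nits).
measured_peaks = {"panel_a": 1500.0, "panel_b": 1420.0, "panel_c": 1480.0}

def calibration_gains(peaks):
    """Per-panel gain mapping each panel's peak down to the common floor."""
    floor = min(peaks.values())
    return {name: floor / peak for name, peak in peaks.items()}

gains = calibration_gains(measured_peaks)
# The dimmest panel runs at full drive; brighter panels are held back.
print(gains["panel_b"])  # 1.0
```

The cost is obvious: every panel except the weakest is running below its potential, which is exactly the headroom a capability-aware scheme tries to claw back.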
With the bigger virtual production facilities creating displays tens of thousands of pixels across, forming a seamless, high-performance display from such a tornado of pixels is no small task.
Originally published in the September 2022 issue of Definition.