Digital de-aging

Posted on Apr 17, 2025 by Admin

Forever young

We tackle the tech behind digital de-aging, a VFX process which freezes actors in time

Words Katie Kasperson

In Robert Zemeckis’ Here, we watch as teenage Tom Hanks and Robin Wright grow old. In James Mangold’s Indiana Jones and the Dial of Destiny, Harrison Ford appears as his younger self. This time-travelling technique isn’t unique to film; in Netflix’s Stranger Things, Eleven (Millie Bobby Brown) revisits her childhood, while in Apple TV+’s Lady in the Lake, Natalie Portman looks a lot like Padmé in the Star Wars prequel trilogy. It isn’t an illusion or exceptional prosthetics; it’s a process known as digital de-aging.

Often employed for flashback sequences, like in both Dial of Destiny and Stranger Things, digital de-aging uses proprietary technology to make actors appear younger – often much younger – on screen than they actually are. The technique debuted about two decades ago, with early uses in Click, The Departed, X-Men: The Last Stand and The Curious Case of Benjamin Button. In the last 20 years, it’s gained both momentum and scrutiny. The main criticism is that it eliminates jobs for younger actors who could otherwise play youthful versions of the leads, ‘by favouring more established but older names that can be de-aged to fit roles’, according to James Pollock, creative technologist at Lux Aeterna.

Above all, digital de-aging exists to serve stories. For instance, Disney released Indiana Jones and the Dial of Destiny, the franchise’s fifth and final instalment, in 2023, 42 years after Raiders of the Lost Ark first hit cinemas. Despite now being in his 80s, for roughly 25 minutes Harrison Ford appears as a young Indy – the same version as in the original eighties films. If it hadn’t been Ford, the scene would have lost some of its impact. Thanks to Industrial Light & Magic (ILM), this wasn’t the case.

To film Dial of Destiny, Ford acted out his flashback scenes wearing black tracking dots on his face to capture positional data. During production, ILM placed two infrared cameras on either side of the main camera to collect even more information about his movements. For the final cut, ILM drew on previous footage of the actor and combined it with the data captured during filming, creating a digital mask it could place over Ford’s face (and adjust according to the lighting).
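To picture the first step of that workflow – and this is only a rough illustration, not ILM’s actual pipeline – those black dots are typically found in each frame as small, dark, roughly circular blobs, giving 2D positions that can later be solved into facial motion. A minimal sketch using OpenCV’s blob detector, with the filename and all parameter values assumed:

```python
# Minimal sketch: detecting black facial tracking dots per frame with OpenCV.
# Illustrative only -- not ILM's pipeline; parameter values are assumptions.
import cv2

def detect_tracking_dots(frame_bgr):
    """Return (x, y) positions of dark circular markers in one video frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 0          # look for dark blobs (black dots)
    params.filterByArea = True
    params.minArea = 10           # tune to the dot size in pixels (assumed)
    params.maxArea = 200
    params.filterByCircularity = True
    params.minCircularity = 0.7   # markers are roughly circular

    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return [kp.pt for kp in keypoints]

# Usage: log dot positions across a shot, frame by frame.
cap = cv2.VideoCapture("flashback_plate.mov")  # hypothetical clip name
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(detect_tracking_dots(frame))
cap.release()
```

In a real pipeline, those per-frame 2D positions would be tracked over time and solved against a facial rig before any mask is composited.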

It’s not always enough to de-age an actor’s face; sometimes their body needs to look younger too. In Stranger Things, Eleven relives memories from when she was a small child. In this case, the creators cast a body double (Martie Blair) as young Eleven, but the face we see is still a digitally de-aged Brown.

Released late last year, Here proved a novel challenge; Zemeckis wanted to capture Hanks, Wright and their co-stars as if they were aging organically. Metaphysic handled the film’s VFX, using generative AI to digitally age and de-age the actors in real time (with a negligible two-frame delay), tidying up the effects in post. The actors could even watch this live feed while filming to get a more accurate sense of what audiences would see, allowing them to refine their performances accordingly.
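Metaphysic hasn’t published how its system works, but the ‘two-frame delay’ idea is easy to picture: each captured frame is queued while a model processes it, and the on-set monitor always shows output from a couple of frames behind the camera. A schematic sketch of that buffering, with the face-swap model entirely stubbed out:

```python
# Schematic sketch of a real-time de-aging monitor feed with a two-frame delay.
# The actual Metaphysic system is proprietary; swap_face() is a stand-in here.
from collections import deque

DELAY_FRAMES = 2  # output lags capture by two frames, as described in the article

def swap_face(frame):
    """Placeholder for a generative face-swap model (assumption, not a real API)."""
    return frame  # a real model would return the de-aged frame

def run_live_feed(camera_frames):
    """Yield de-aged frames to the on-set monitor, two frames behind capture."""
    buffer = deque()
    for frame in camera_frames:
        buffer.append(swap_face(frame))   # process the newly captured frame
        if len(buffer) > DELAY_FRAMES:
            yield buffer.popleft()        # show the frame captured two frames ago
    while buffer:                         # flush whatever remains at the cut
        yield buffer.popleft()

# Usage: feed a sequence of captured frames and display the delayed result.
for monitor_frame in run_live_feed(camera_frames=range(10)):
    pass  # send monitor_frame to the on-set display
```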

Going from de-aging primarily in post to doing so in real time is a natural step towards more efficient filmmaking. As with virtual production, VFX is becoming involved at earlier and earlier stages, with far more being captured in-camera – a useful tool for DOPs and directors, who appreciate the immediate feedback.

While ILM is internationally renowned for its artistry and attention to detail, Metaphysic is pushing the boundaries of what’s possible with GenAI. The company ethically trained its model by feeding it reference footage of Hanks and Wright when they were younger (after receiving their consent), so it could mimic not just their looks but also their unique body language, expressions and behaviours.

Metaphysic’s chief content officer and president of production Ed Ulbrich argues that the critique that AI equals job displacement is often misguided. He points to the example of CGI, suggesting that “it’s just learning new tools. As an industry, this created many new jobs that never existed previously,” and he believes the same is true of AI.

At Metaphysic, digital de-aging and digital aging often go hand in hand, and this requires an extra human touch. For instance, makeup artists aren’t being forced out: “We love working with great makeup artists,” Ulbrich says.

“Instead of putting a whole new face on an actor, we can work with makeup teams to do a prosthetic makeup treatment. You make them older, or into an alien or creature, and we do a data acquisition of them while wearing that makeup. Then they don’t need to wear it during production,” Ulbrich explains. “It’s the same as swapping their younger face onto them.”

When compositing a digitally enhanced face onto a real actor, films run the risk of entering the uncanny valley – where a person appears almost human, but not quite. The eyes have commonly been deemed the giveaway, particularly in the early days of computerised VFX. As technology evolves and digital de-aging becomes more convincing, this phenomenon diminishes; with a fully AI-generated, photoreal face, Ulbrich argues, it never existed to begin with. “We’ve eliminated the uncanny valley.”

While GenAI might seem like the next big thing, there are disadvantages too. “The main drawback of using this tech for de-aging is that you need a dataset of existing images,” Pollock explains. “If you don’t have access to that material, or it doesn’t exist, the AI-based approach might be impossible,” whereas with CGI, composites are created manually.

In the CGI versus AI debate, there’s no right or wrong; it mainly comes down to a filmmaker’s preference and budget. We’ve seen great examples of digital de-aging with vastly different techniques, and as technology evolves across the board we’ll likely be getting more of it.

This article appears in the March 2025 issue of Definition
