How Metaphysic’s AGT Entry Will Impact Entertainment

On June 6, 2022, Chris Umé and Thomas Graham, the founders of Metaphysic, joined Daniel Emmet on the America’s Got Talent stage. Using technology Metaphysic developed, Emmet performed “You’re the Inspiration” live on stage using AGT judge Simon Cowell’s likeness.

This performance stunned the judges and captured the crowd, resulting in four yesses. The group is moving forward to the next round of AGT — but what does their use of deepfake tech mean for the future of entertainment?


What Happened on the AGT Stage?

Chris and Thomas were the show’s official contestants. Introducing themselves, they said:

“We use artificial intelligence to create hyper-real content.”

They then brought Daniel Emmet on stage. When Simon asked how he had gotten involved with the duo, Emmet said:

“I’m a fan of what they do online, and they’re fans of AGT. And when they asked me to be a part of this wonderful, unique, original thing they’re gonna do, I couldn’t say no.”

So, with all the mystery surrounding the trio, Cowell wished them good luck, and they started the show.

As Emmet prepared to sing, a camera operator walked onto the stage to record him in profile, blocking the audience’s (and the judges’) view. When the music started, the camera’s feed went live on the stage’s screen, where Simon’s hyper-real avatar appeared superimposed onto Emmet, singing live on stage.

The crowd went wild when they saw that, but Simon was initially confused. As the performance continued, Cowell went from confused to embarrassed to entertained. Ultimately, he (and the rest of the audience) gave the act a standing ovation, and the group received a “yes” from all four judges.

How Deepfakes Impact Entertainment

While deepfakes may seem like a new development to most people, that isn’t the case, especially in Hollywood. Granted, this was probably the first time deepfake tech had been featured in a live setting, but movies have been using similar technology for years.

One popular example is the 2015 film Furious 7, whose star Paul Walker died in a car crash midway through filming. Instead of killing off his character, the producers completed his scenes using his brothers as stand-ins, then used CGI to superimpose Paul’s face onto them.

This technique also appeared in 2016’s Rogue One: A Star Wars Story, where it was applied to Grand Moff Tarkin and Princess Leia. Peter Cushing, the actor who played Tarkin, had unfortunately passed away in 1994. And while Carrie Fisher was still alive during filming, her appearance had changed significantly since 1977.

For this film, the production used stand-ins and technology similar to deepfakes to digitally recreate the original characters.

And if you think this is a recent development, you might be surprised to learn that 2000’s Gladiator used similar technology. When one of its primary supporting actors, Oliver Reed, died suddenly during filming, the production digitally recreated him, using a body double for the actual shoot.

However, this technology goes beyond resurrecting and de-aging celebrities. Captain America: The First Avenger used it to swap Chris Evans’s hulking frame for Leander Deeny’s smaller one. While the filmmakers also relied on various other techniques to create his slight appearance before the experiment that transformed him, deepfake-style face replacement was one of their tools.

How Deepfakes Can Revolutionize the Entertainment Industry

One thing all the examples above have in common is that the studios spent months and millions of dollars to achieve the effect they wanted. On the AGT stage, however, Metaphysic showed that it could recreate another celebrity’s likeness in high quality in real time.

While their output on the AGT stage wasn’t 100% realistic, it was close enough that a casual viewer wouldn’t notice. More importantly, it ran in real time. As the technology evolves, we may soon see real-time deepfake applications that can instantly produce realistic video.

This development could reduce studios’ dependence on advanced motion-tracking suits and extensive post-processing, streamlining their post-production workflows. And while creating seamless deepfake videos isn’t as easy as it sounds, you no longer need to be a multi-million-dollar production company to afford it.

The technology’s mainstream introduction would let smaller studios and even indie filmmakers afford it and use it in their creations. In fact, anyone with a smartphone can already try it through face-swap apps, though the results fall short of what Metaphysic showed on the AGT stage.
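At its core, any face swap ends with a compositing step: pasting a generated face into a target frame and blending the seam. The sketch below is a toy illustration of that final step only, using NumPy on synthetic images with a simple feathered mask. It is an assumption for illustration, not Metaphysic’s method; real systems like theirs use learned generative models to synthesize the face itself and far more sophisticated blending.

```python
import numpy as np

def overlay_face(target_frame, source_face, box):
    """Paste a source face crop into a bounding box of the target frame,
    feathering the edges so the seam is less visible. A toy stand-in for
    the compositing stage of a real face-swap pipeline."""
    x, y, w, h = box
    region = target_frame[y:y+h, x:x+w].astype(float)
    face = source_face[:h, :w].astype(float)
    # Feathered alpha mask: opaque in the center, fading to zero at edges.
    yy = np.minimum(np.arange(h), np.arange(h)[::-1])[:, None]
    xx = np.minimum(np.arange(w), np.arange(w)[::-1])[None, :]
    alpha = np.clip(np.minimum(yy, xx) / 10.0, 0.0, 1.0)[..., None]
    blended = alpha * face + (1 - alpha) * region
    out = target_frame.copy()
    out[y:y+h, x:x+w] = blended.astype(target_frame.dtype)
    return out

# Toy 100x100 RGB "frame" (black) and a 40x40 gray "face" crop.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
face = np.full((40, 40, 3), 200, dtype=np.uint8)
result = overlay_face(frame, face, (30, 30, 40, 40))
```

Doing this (plus the neural-network inference that generates the face) for every frame, dozens of times per second, is what makes a live performance like Metaphysic’s so demanding.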

The Future of Deepfakes in Movies and TV

As deepfake technology evolves, we may soon reach a point where studios can easily use the likenesses of past celebrities to bring their newest films to life. This would help maintain continuity, especially as productions (and moviegoers) favor long-running franchises, which face problems with aging or even dying actors.

However, this development has another facet: the technology can be used to recreate a person’s likeness without their consent. Worse, such a video can sow fear, tension, and confusion, especially if it depicts a political figure and isn’t recognizable as a deepfake.

As excited as we are about this technology in entertainment, we should also insist that it is used ethically and never to sow chaos.
