Artificial intelligence is not a new invention. In fact, it’s been used in the entertainment industry—and the acting industry—for years. Still, with all the new innovations coming from the world of generative AI, it’s more important than ever to keep up with how it’s being used. Is it possible to use it in your auditions? Or even in your performances? We look at how generative AI is affecting actors and what to expect for the future.
“Indiana Jones and the Dial of Destiny” Courtesy Lucasfilm
As an actor, you can use generative AI as creatively as you want. But, generally speaking, there are a few concentrated areas where AI is being used in the acting space right now:
AI still largely functions as a supplemental tool in the casting and project analysis processes.
Take Largo.ai, for example, a content-creation workflow AI that launched in 2018. The model is trained on a dataset of movies, TV shows, and talent to create an all-in-one suite that analyzes a project’s commercial potential. Largo.ai can also directly connect producers and casting directors with actors based on its analysis of scripts, characters, and an actor’s appearance.
Actors are already seeing companies similar to Largo.ai reaching out to them. “I got an email that went to my spam folder. And that said, ‘Our AI thinks you’d be a good fit for our feature film we’re doing.’ And I’m like, is this real?” recalls actor Noelle Gizzi (“Irreverend,” “Newsworthy”). “And it seems legit enough. But you know, there’s a lot of unanswered questions at the moment.”
There are still thorny questions to consider about using AI for the casting process itself. “Our policies around how to handle [AI] are still evolving,” says Luke Crowe, vice president of casting at Backstage. Currently, the platform is limiting access to AI-related projects out of “concern around the evolving contractual nature of how it’s going to work,” says Crowe. “If you have your image captured for a photo bank of this type, are you signing away your own image forever? … That’s kind of a frightening way to go—that eventually, talent is no longer needed for a lot of these types of projects at all.”
“On the other hand, you can’t just ignore it and say, ‘Hey, we can’t allow anything related to AI to ever cast,’ ” Crowe adds. “This is a reality of where a lot of this market is going. And there is money to be made for the talent as well. I think it’s just a matter of both sides needing to figure out…what is crossing the line when it comes to manipulating the image of somebody.”
As with most use cases, AI is currently most useful to actors as a tool during the audition process, not a replacement for the process itself. You can, for example, use Slatable’s AI as a dynamic reader to make self-tapes more easily, or TryItOAI for quick headshots. Alternatively, you can try tools like Descript to edit your demo reels and self-tapes and add a backdrop to make them look more professional.
Here’s how it works: An actor who bears some resemblance to the person being impersonated (in this case, Tom Cruise) has a digital mask applied to their face. Through AI, the mask moves in real time the way Cruise’s face would, creating the illusion. Remember that scene in “Blade Runner 2049” when Officer K’s (Ryan Gosling) holographic girlfriend, Joi (Ana de Armas), syncs up with a human? It’s pretty much like that, but even more accurate.
While there’s yet to be a fully AI-generated performance, studios and creators are already utilizing the technology to augment what you see onscreen. For “Indiana Jones and the Dial of Destiny,” for example, Lucasfilm utilized AI to comb through old Harrison Ford footage to de-age the actor in certain scenes.
What once took VFX artists hours of rendering and shot compositing can now be accomplished in a fraction of the time, and the possibilities of AI-powered VFX are nearly endless: replacing stunt doubles’ faces with those of the actors they double for, recreating long-dead people, and making actors look like robots or aliens.
However, the software is still in its relatively early days. “If you’re a filmmaker and you have a very specific vision in mind, I think it’s really, really, really hard to achieve that specific vision with generative AI right now,” says Tye Sheridan, an actor (“Ready Player One,” “X-Men: Apocalypse”) and co-founder/president of Wonder Dynamics, a tech start-up developing AI tools for the film industry.
“A lot of these tools don’t allow you to edit the process,” Sheridan says. “They’re trained on these models, and to get something different, you have to retrain the models entirely.”
Even though the facial alterations are convincing, they still need good physical actors to pilot them. “We’ll get certain actors that really understand [motion capture] acting,” says Nikola Todorovic, co-founder and CEO of Wonder Dynamics, “and they’ll do those exaggerated moves that look really good on certain characters. So that’s definitely an important skill to have.”
“We wanted to keep that movie magic to it. It’s only as good as your cinematographer and the shot you feed it, it's only as good as a performance you feed it, and it’s only as good as the CG character that some artists made,” he continues.
With apps like Flawless AI, filmmakers have been adjusting actors’ on-screen mouth movements to match dialogue in other languages. That means filmmakers can simply re-dub their movies in post without having to get additional footage from actors. Not only does this widen a film’s possible distribution and potential audience; it can also introduce you and your work to fans in other parts of the world.
By this point, you may be worried about losing opportunities to digital duplicates of yourself or others. We asked Google’s AI chatbot, Bard, whether that’s likely.
Bard’s assessment seems on the money, for now: we’re not yet at the point where an AI could whip up a realistic-looking clone of an actor out of thin air.
“To make digital doubles of people really lifelike and believable is really, really hard,” says Hanno Basse, the chief technology officer at Digital Domain, a VFX company whose work can be seen in the Marvel Cinematic Universe and the “Terminator” franchise. “It’s the hardest thing there is. It’s harder to make than, for example, a digital dog or a digital tiger…. Especially if you have a very well-known actor with a range of emotions and a range of work that you’ve seen before—to really replicate that with a machine, I think, is really hard.”
Leaders within the industry are already calling for regulations on the tech, looking to a future where it could advance to a level where human actors lose work to AI. As part of its recent strike, SAG-AFTRA—one of the largest unions for performers in the U.S.—made it clear that AI as it applies to consent and compensation is on the table during negotiations with the Alliance of Motion Picture and Television Producers (AMPTP). During the press conference announcing the union’s official strike on July 13, National Executive Director Duncan Crabtree-Ireland spoke on the AMPTP proposal SAG-AFTRA outright rejected: “They proposed that our background performers should be able to be scanned, get paid for one day’s pay, and their company should own that scan ... to be able to use it for the rest of eternity,” he said.
At this stage, AI is still contending with an “obvious quality metric,” says Crowe; but if the tech advances far enough without input from working actors, fully AI-generated actors aren’t out of the question. We may end up in a world where real actors work alongside AI ones. However, it’s equally possible that AI performers will never garner the same popularity as real ones. After all, actors aren’t valued just for their performance skills; they’re also appreciated for their backgrounds, experiences, and personalities. “There’s a wider context to the entertainment we consume,” says Crowe. “Right now, there is a lot of novelty around [AI], and aspects of that will die down.”
According to Sheridan, the reality is that whether AI’s impact on actors turns out to be small or large, you can’t bury your head in the sand.
“When [actors] get on the topic, they almost want to avoid it. I think that’s the wrong approach,” he says. “We have to acknowledge that it’s here and it’s gonna stay around, you know. We have to figure out how to adopt it into our process in a responsible way so it still protects us and protects the nature of what we do as performers and as artists.”
“The Irishman” Courtesy Netflix
While there are still many unknowns about how AI will change the acting industry (and entertainment in general), a few areas are likely to see the biggest impact in the near future.
Changes in post-production
According to Sheridan, the possibilities for generative AI use in post-production go beyond dubbing. It’ll also be invaluable for making tweaks to performances after principal photography wraps up.
“If you go in and you shoot a scene, and let’s say you like the performance capture, everything’s great, but you want to change something in the facial performance, the actor can then sit at their webcam and redo the facial performance,” he says. This practice could potentially save filmmakers a lot of money, so it stands to reason that you might be asked to alter your performance this way rather than through costly reshoots.
Sheridan argues that because AI would only be used to tweak performances and it would still be your own work, it wouldn’t affect the nature of the job. “You should look at AI as a real tool in that situation,” he says.
Crowe predicts that casting will become more seamless. Rather than setting up complex filters manually to discover talent, new applications will automatically sort actors by their characteristics—what they look like, where they’re from, and their skill sets, for example. “A lot of that stuff’s going to be very transformative in the future,” he says.
Crowe also speculates that actors may reach a point where they have an AI that uses their physical and vocal likeness to audition for them. “It’s going to sound like you, it’s going to look like you, you may even be able to do some emotional adjustments or tweaks to it; you’re going to have some control over it,” he says. “The better AI gets, the better that’s going to be, and now your kind of AI avatar can start handling those initial [auditions].”
Despite the opportunities it presents, relying so heavily on AI comes with potential pitfalls. For one, recommendation algorithms tend to centralize attention, which could cause the same few actors to be suggested over and over again. And if the industry lets generative AI drive every casting decision, there could be missed opportunities; after all, not every actor who is right for a part on paper is the best fit in practice.
Gizzi recalls a time when she was casting a part in a mini-series and was faced with an unexpected audition: “I was like, who is this guy? He doesn’t even look like this character. But I said eh, what’s one more audition? And then I loved him and he was, like, everything I wanted.”
“Sometimes the most unexpected choice is the most interesting in terms of casting,” she adds.
Renting your likeness
According to Crowe, actors may be able to “lend out” their image or their voice for a project while still retaining their rights to their likeness as well as their ability to get work.
“This could just be another avenue of revenue gain,” he says. “In some cases, you may even have continual gains from that if you’re able to own, say, royalty rights to your image. In the future, maybe it could just be like, ‘Hey, we’re gonna pay you, we’ve already got enough imagery here, we can just generate it,’ ” he continues.
If you’re a working actor, this might sound like a great way to earn some passive income. However, it may come with its own complications.
“I am a little leery of using generative AI in this context,” says Gizzi. “The thing that’s worrisome is, we’re in this industry because we love it, and if it replicates us, if it exceeds our abilities, then we’re not going to be able to do that.”
“I’m open to ideas, but at the end, I want to be the one in control,” she concludes.
“We’re making technology as a means to an end, as tools to tell stories,” says Todorovic. “The lower the price point of this content [becomes], we’re gonna see a lot more indie films that are made with CG and visual effects.”
Crowe speculates, “This makes opportunities for real actors that really come in and are able to participate in higher quality work, who might not have been able to in the past.”
Although the acting world is by no means past the earliest days of AI, Crowe predicts that the dust will eventually settle.
“We’re gonna have some really cheesy AI art, AI films and stuff come out of this, and then people are going to get over the novelty aspect of it,” he says. “[We’ll] get back to where it makes sense to have real human involvement. At the same time, we’re gonna become more specialized in it and… AI will just become another tool in the toolbox. It’s going to be more seamless. And you won’t even realize it’s happening there, rather than right now where it’s so in your face.”