Is AI Going to Take Your Acting or Writing Gigs?

Photo Source: Margaret Ruling/Ian Robinson/Solikin/Shutterstock

Generative AI technology is already making waves in the entertainment and communications industries, and reactions have been decidedly mixed. We take a look at the impact AI could have on the creative career path—both now and in the future.


Will AI take away artists’ and content creators’ jobs?

Photo Source: Vitor Miranda/Shutterstock

Although the applications of generative AI are seemingly endless, they’re still limited by the constraints of machine learning and a lack of human intuition and creativity. 

“You have to train models on real data,” says Chris LeDoux, the co-founder and senior visual effects supervisor of Crafty Apes, of his visual effects work. “And if we still have a hard time figuring out how to interpret what the client is after, how the hell is the model going to? We have to read between the lines a lot. Whatever you put into it is what you’re gonna get out of it. Eventually, maybe it learns to think.... But for right now, it’s a reflection of us.” 

There are infinite nuances in the way humans communicate with each other; some are obvious, while others are subtle cues that take intuition to comprehend. As it stands, AI can’t read between the lines—it can only respond to direct requests. 

“Right now, there are fairly limited ways through which, at least a really popular model, can be engaged with,” says Dr. Maya Ackerman, the CEO and co-founder of WaveAI, which specializes in creating AI-generated music. “I think that technology is incredible, but…I need to figure out how on earth to talk to the machine so that I can give it something of myself.” 

Joe Vanderhelm, the associate director of education for Rapid Fire Theatre, an improv hub based in Edmonton, Alberta, agrees. “That’s always where the humans will come in, for the…deeply poetic things that touch your heart,” he explains. “I don’t know if the bots will ever be able to get to that place.” 

“If a person wants to write a script about their childhood, then ‘click of a button’ doesn’t make any sense, because it’s about humans expressing themselves, right?” says Ackerman. “It’s about the person telling their story; but maybe they could still use a lot of help.”  

That being said, AI has still disrupted the lives of creators who make a living from their art. “It’s just like any other powerful thing—it can do good things or bad things. It’s very clear now that it can do bad things,” says Ackerman. “It’s already hurt the livelihoods of a whole bunch of visual artists, which is very devastating.” 

Whether or not generative AI content is aesthetically nuanced or ethically sound, some companies see it as an opportunity to cut costs by consolidating roles and commodifying employees.

“The consensus is… that there is a balance between potential good and evil. The optimistic view is that AI joins the long list of fabulous tools that creators use,” says Jason Squire, Professor Emeritus at the USC School of Cinematic Arts and host of The Movie Business Podcast. “The pessimist would say that this is going to be disruptive to livelihoods. The money stands to benefit. And who is most vulnerable? The talent.”

Right now, experts predict that white-collar jobs will be the first to experience a noticeable level of AI replacement. “[AI is] increasingly going into office-based work and customer service and sales,” Anu Madgavkar, the labor market research lead at the McKinsey Global Institute, told the Guardian. “They are the job categories that will have the highest rate of automation adoption and the biggest displacement. These workers will have to work with it or move into different skills.”

AI is finding a foothold in Hollywood, even if it’s a small and uncertain one. A report from Above the Line suggests that studios have, at the very least, looked into the possibility of using AI to generate scripts based on existing IP, with writers polishing those scripts—a practice that the WGA strongly opposes. 

While AI is still unable to entirely replicate the job of a human artist, there is cause for concern over volatility in the job market down the line. “You can easily see the job becoming polishing AI scripts. It fits neatly into what companies have been doing—turning everything they can into gig work,” writer Vinnie Wilhelm (“The Terror,” “The Right Stuff”) told the Hollywood Reporter.

As Squire notes, the rise of AI in Hollywood is happening “right in the middle of renegotiations of the union contracts in entertainment.” He’s referring to the separate but related efforts of the Writers Guild of America (WGA), the Directors Guild of America (DGA), and SAG-AFTRA to renew their collective bargaining agreements with the Alliance of Motion Picture and Television Producers (AMPTP). 

The WGA and SAG-AFTRA initiated strikes against the studios, and the DGA ratified a new contract in June. But for all three unions, AI is a sticking point:

  • Included in the DGA’s settlement is a guarantee that “AI is not a person and that generative AI cannot replace the duties performed by members.” 
  • In SAG-AFTRA’s initial call for a strike authorization, the union noted that “unregulated use of artificial intelligence threatens the very voices and likenesses that form the basis of professional acting careers.” 
  • Notably, the AMPTP rejected the WGA’s proposals for AI regulation, countering instead with “annual meetings to discuss advancements in technology.” 

“[Writers and actors] have the opportunity to negotiate AI into their three-year agreements with management, just as the DGA has done,” says Squire.

“The longer I’m in the space, the more I empathize with the artists and the musicians,” says Ackerman. “At first, everybody comes in kind of naive, like, ‘Oh, this is good; I see how to make good out of it.’ But just because I see how to do something good with it doesn’t mean that the whole industry is doing good things. [There are] lots of people with different incentives and different ways of understanding even what’s good and what’s bad. It’s complicated.”

Do AIs steal artists’ work?

Photo Source: Ascannio/Shutterstock

Technically, generative AIs aren’t “stealing” the content itself, but examining it for style and structure. This process, known as “data scraping,” happens without the consent of those who own the content. Advocates of AI have said that this is basically the same as a human creator being trained and influenced by others. That holds for visual art as well as for, say, a script written “in the style” of “Seinfeld.” “To create AI-generated art, artists use AI as a creative tool and work with algorithms to set up specific rules through which machines analyze thousands of [data sets] to comprehend a particular creation process, like a specific style or aesthetic,” explains Artland Magazine.

But there is an argument that this practice is still tantamount to plagiarism. In February, stock image hub Getty Images filed a lawsuit against Stability AI, the company behind Stable Diffusion. Getty is accusing the company of copying more than 12 million of its stock images to train the AI to create art—some of which, the lawsuit claims, contain a version of the Getty logo.

How can you tell whether something was made by an AI?

Photo Source: Tada Images/Shutterstock

The previous generation of artificial intelligence produced output with noticeable flaws, such as stiff, robotic writing. Updates to the technology have significantly improved results, making it harder to distinguish between human- and AI-generated art.

There are tools that can detect whether something was made with AI and prevent original content from being harvested for data sets. None of these options is perfect. But as of now, they’re the best we have. Here are a few examples: 

  • Glaze: This tool from the University of Chicago allows artists to add small, undetectable alterations to art—a process known as “cloaking”—that prevent AI from absorbing their work and replicating their style. (A rough sketch of the general idea follows this list.)

  • Originality.ai: Looking to check whether a piece of writing was penned by a bot? This tool will scan blocks of text to see how much of it is AI-generated. The software’s creators claim that it has a 96% accuracy rate.
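
As promised above, here is a very rough Python sketch of the general “cloaking” idea: nudging pixel values by amounts too small for a viewer to notice. To be clear, this is not Glaze’s actual algorithm (real cloaking tools compute targeted perturbations designed to mislead style-mimicry models), and the file names and noise strength below are illustrative assumptions only.

```python
# Toy illustration only: NOT Glaze's actual method, just the general idea of
# altering pixels by amounts too small for a human viewer to notice.
# Assumes Pillow and NumPy are installed; file names are hypothetical.
import numpy as np
from PIL import Image

def cloak(path_in: str, path_out: str, strength: int = 2) -> None:
    """Add a faint pixel-level perturbation to an image and save the result."""
    pixels = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Real cloaking tools compute targeted perturbations aimed at confusing
    # style-mimicry models; small random noise stands in for that here.
    noise = np.random.randint(-strength, strength + 1, size=pixels.shape)
    cloaked = np.clip(pixels + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(path_out)

cloak("artwork.png", "artwork_cloaked.png")
```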

When trying to decide on your own whether something is AI-created or not, keep your eyes peeled for anything uncanny and not quite human. Watch this AI-generated trailer, for example, and you’ll notice that the faces—most noticeably, the mouths—don’t move in lifelike ways. 

When it comes to writing, a good rule of thumb is to look out for an abundance of simple words like “the.” Generative text AI is built on language models that predict the most likely next word in a sentence, so it tends to overuse common short words like “the,” “it,” and “is.” When in doubt, remember that AI-generated writing is, ironically, often too “perfect” to be human-created. 
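
For the curious, here is a back-of-the-envelope Python sketch of that rule of thumb; the word list and the 40% threshold are assumptions chosen for illustration, not a calibrated detector.

```python
# A minimal sketch of the "too many common short words" heuristic described
# above. The word list and threshold are illustrative assumptions.
import re

COMMON_WORDS = {"the", "it", "is", "a", "an", "and", "of", "to", "in", "that"}

def common_word_ratio(text: str) -> float:
    """Return the share of words that come from a small set of very common words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in COMMON_WORDS for w in words) / len(words)

sample = "The cat is in the house, and it is clear that the house is warm."
ratio = common_word_ratio(sample)
print(f"{ratio:.0%} common words{' (worth a closer look)' if ratio > 0.4 else ''}")
```

Dedicated detectors rely on far more sophisticated signals, but the underlying intuition, how predictable the word choices are, is similar.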

“Language models very, very rarely make typos. They’re much better at generating perfect texts,” Daphne Ippolito, a senior research scientist at Google Brain, told the MIT Technology Review. “A typo in the text is actually a really good indicator that it was human-written.”

A recent ruling from the U.S. Copyright Office stated that artists who use AI must disclose that fact if they wish to copyright their work, creating a strong incentive for transparency.
