Unpacking SAG-AFTRA’s New AI Regulations: What Actors Should Know

Photo Source: Walter Cicchetti/Shutterstock

SAG-AFTRA ratified its latest TV/Theatrical Agreement with the Alliance of Motion Picture and Television Producers in November 2023, following almost four months of striking and intense negotiations. Chief among the pressing issues the union addressed was the impact that the rapid rise of generative artificial intelligence could have on actors’ livelihoods. The technology affects performers across the industry, from those just starting out to A-listers like Tom Hanks and Viola Davis, so it’s important to pay attention no matter what stage of your career you’re in.


Terms you need to know

One of the much-needed amendments in the TV/Theatrical Agreement codifies the use of generative AI. Here’s some key jargon to learn:

Digital alteration: This refers to the practice of changing an actor’s work “in photography or [a] soundtrack previously recorded by the performer.” Your explicit consent is required to make these alterations, unless they’re “substantially as scripted, performed, and/or recorded.” 

Your consent isn’t required if your lips, body, or voice are adjusted for redubbing or postproduction edits explicitly made “for purposes of cosmetics, wardrobe, noise reduction, timing or speed, continuity, pitch or tone, addition of visual/sound effects or filters, standards and practices, ratings, an adjustment in dialogue or narration, or other similar purposes.”

Employment-based digital replicas: This applies to likenesses of actors who are already under contract for a series or film; they’re generated “for the purpose of portraying the performer in photography or [a] soundtrack in which the performer did not actually perform.” Essentially, the replica is an extension of the work the actor was already hired to do.

Entertainment lawyer Tara Aaron-Stelluto cites Brandon Trost’s “An American Pickle” (2020) as an example of how employment-based digital replication works in practice. In the film, “Seth Rogen plays both the grandfather…and the grandson in modern times,” she explains. “So if one of those characters was a digital replica, but the actual actor was performing alongside or in the same scene, there’s no additional compensation there.” 

While you won’t make any additional money if an AI supplements your performance, you also won’t make any less. As far as compensation is concerned, if your digital likeness works, so do you: the SAG agreement ensures that whether it’s you or your digital double onscreen, you’ll get paid.

“Whether the producers use an AI-generated version of Nicole Kidman or Nicole Kidman [herself], they still have to pay Nicole Kidman,” says Aaron-Stelluto. “In some ways, she’s better off letting them use an AI version and getting paid for it so she can go do something else [instead of] actually having to show up and sit in the trailer for 17 hours a day.”

Independently created digital replicas: With your permission, producers can completely simulate your likeness through AI, which means you won’t have to show up to set at all. In fact, you don’t even have to be alive! An actor’s contract may include a stipulation giving studios permission to use their AI replica after their death. Even if the performer never granted that consent while alive, their estate (or the union, if they have no living representative) can give producers permission to use their likeness after they pass.

Synthetic performers: These are entirely AI-generated “actors” who are presented as real people; according to the agreement, they can’t be “recognizable as an identifiable natural performer.” Producers can derive certain body parts (e.g., eyes or mouths) from real actors to create a kind of Frankenstein’s monster, but that requires explicit consent from the people in question. 

The contract states that both producers and the union “acknowledge the importance of human performance in motion pictures and the potential impact on employment.” It also stipulates that the union be notified and given “an opportunity to bargain in good faith” if producers are considering casting an AI performer. 

What that “good faith” will look like remains to be seen. Aaron-Stelluto acknowledges that “there’s a legal issue here, and then there’s an ethical and moral issue.”

The question of consent

Requiring a performer’s permission at each step of the generative AI process is a cornerstone of the agreement. Producers must clearly stipulate how they plan on using AI in a contract, and they must obtain informed and explicit consent from an actor before using their digital replica. It’s up to you whether you want to allow a project to supplement your performance with AI.

Aaron-Stelluto says that the summary’s phrasing prevents studios from collecting a roster of digital replicas and using them in future projects without consent from the performers they’re based on. However, the summary includes a lot of vague language. The phrase “reasonably specific description of the intended use,” which pops up five times in the digital replication and digital alteration sections, particularly stood out to Aaron-Stelluto. 

“Is this opening the studios up to be allowed to completely change the use based on editorial discretion and completely change the purpose of using that AI character later on?” she posits. “There may be additional details. But that would have been something that probably would have alarmed me if I was counsel for SAG-AFTRA negotiations.” She also points out that the document only provides a general summary, and every individual contract will be different.

That’s all well and good if you’re in the union. But are you protected from AI replication if you’re a nonunion actor? It all depends on your agreement with the producers of a project. 

Aaron-Stelluto advises performers to read their contracts thoroughly and keep an eye out for AI-related provisions, particularly phrases like “[technology] now known or hereafter devised.” She adds that production companies are “pretty good at making sure that their rights are protected…regardless of what happens with the technology.” 

If you’re still early in your acting career, you may feel pressured to sign a big offer regardless. But you should always advocate for protections that safeguard your most valuable asset as a performer: your likeness.

Do SAG-AFTRA’s AI provisions go far enough?

That’s the billion-dollar question. Are the terms of the union’s agreement enough to protect performers from being taken advantage of by AI, or is it already too late? 

SAG member Justine Bateman takes a hardline approach when it comes to artificial intelligence. In an interview with MSNBC prior to the vote, she noted that the use of AI will hurt the union because SAG can’t represent a synthetic actor. 

“You might as well be negotiating for nonunion actors, because these synthetic performers, these AI objects, aren’t going to pay dues, they’re not going to pay pension and health, and they’re going to take a job from a human actor,” she said. 

She added that actors should only endorse the deal if “they don’t want to work anymore” and expressed concerns over how the issue will affect other areas of filmmaking. Synthetic performances could lead to “fewer [or] no sets, fewer [or] no crew, [and] fewer [or] no drivers. The list goes on.” 

SAG member Satu Runa echoed Bateman’s sentiment. “It’s very easy to replace us with this technology,” she told Prism. “We shouldn’t normalize it.” She also pointed out that, outside of big-name actors with bargaining power, the summary’s consent stipulations fall short of protecting performers. “Young actors, or desperate actors who want to work…they’re gonna feel the pressure to just check the box.”

According to Aaron-Stelluto, the agreement’s AI provisions aren’t perfect, but they do track “surprisingly closely to traditional right-of-publicity laws…. I don’t think there’s anything super surprising in there. It follows what I would have expected based on rights and publicity. I’m not sure what more [SAG-AFTRA] could have asked for, other than to avoid using [AI] altogether. But…everyone understood that that train had kind of left the station.”

Jason Squire, a professor emeritus at the USC School of Cinematic Arts and host of “The Movie Business Podcast,” feels that the agreement is worth celebrating. “[Actors’] loyalty, their bravery, really won out in the end,” he says. He adds that the provisions “achieved initial constructive definitions of AI…. It has to come alive with complaints and interpretation and implementation…. I think it’s going to be constructive. I like to think it’s going to be helpful and move AI down the road.”

Though the agreement is a solid step forward for SAG members, the narrative around AI in entertainment is still being written. “This is an example of creators and managers coming together to try to harness [AI] and maintain good relations,” Squire explains. “Otherwise, you will have more unrest between the two parties—and the industry can’t afford that.”