Cuomo Signs Law Protecting New York Actors From Unauthorized Image Use


Screen actors in New York are now protected from having their likenesses used without their permission. The New York State Legislature passed, and Governor Andrew Cuomo signed into law, a bill that protects actors, living and deceased, from unwanted and unauthorized commercial exploitation of their likenesses. The bill also bans “deepfake” pornographic videos, which superimpose the faces of actors into sexually explicit videos without their consent.

The move was commended by SAG-AFTRA, which championed the bill. “We are thrilled that Governor Cuomo signed this important and hard-fought bill that protects not only our members, but society as a whole,” said SAG-AFTRA President Gabrielle Carteris. The bill doesn’t cover only actors; it also includes dancers, singers, and musicians.

According to SAG-AFTRA, the bill (Assembly Bill A5605C, Senate Bill S5959D) ensures that “New York’s protection against the use of a living person’s image and voice, including their ‘digital avatar and digital voice,’ in advertising and trade, remains firmly intact, and will continue the trend of protecting against uses in expressive works unless the use is clearly permitted by the First Amendment. The bill, for the first time in 36 years, also prohibits the use of a deceased individual’s voice and image in advertising and for purposes of trade.” 

Under the bill, using a performer’s likeness for commercial purposes without their consent will entitle that performer (or their family, if the performer is deceased) to damages. Victims of “deepfake” face-swapping technology, often used in pornographic videos, are also covered by the bill and will be entitled to injunctive relief, punitive damages, compensatory damages, and reasonable court costs and attorney fees.

Last year, California Governor Gavin Newsom signed a similar ban on “deepfake” pornography, instituting penalties for individuals who create those videos and recourse for victims, including statutory damages and preliminary injunctive relief.

At the time, Carteris characterized those videos as “nonconsensual pornography,” saying in a statement, “Deepfake technology can be weaponized against any person. Every person deserves the basic human right to live free from image-based sexual abuse.”