SAG-AFTRA: Deepfakes “Pose a Potential Threat to Performers’ Livelihoods”

As the capabilities of AI-generated content, including what is commonly referred to as deepfakes, continue to evolve, it becomes increasingly difficult to distinguish the real from the synthetic.

This can offer creative opportunities for actors and filmmakers — allowing Mark Hamill to regain his youthful appearance as Luke Skywalker on The Book of Boba Fett, for example. But there are also nefarious uses, as Jordan Peele proved in a 2018 deepfake of President Barack Obama describing how the technology can facilitate disinformation.

SAG-AFTRA wants to help members maintain control of their likenesses in content, whether that be in a movie, TV series or video game — including after their death.

“Protection of a performer’s digital self is a critical issue for SAG-AFTRA and our members,” says national executive director Duncan Crabtree-Ireland. “These new technologies offer exciting opportunities but can also pose potential threats to performers’ livelihoods. It is crucial that performers control the exploitation of their digital self, that any use is only with fully informed consent, and that performers are fairly compensated for such use.”

The first step is for actors to know their rights, explains Danielle Van Lier, SAG-AFTRA’s senior assistant general counsel, contracts and compliance. “On top of that, one of the things we push for is making sure [the data] is protected from unauthorized access and use.”

Some of the guild’s collective bargaining agreements already prohibit the use of digitally generated simulations. A sample from the guild’s indie contracts: “Producer may not create digital reproductions of any Performer in connection with the Project without the union’s consent. Producer may not use any digital reproduction of any individual, living or deceased, as a character or in place of Performers in the Project without the union’s consent. The foregoing restriction includes any voice reproductions.”

It’s become common practice for VFX teams to scan actors in order to create digital stunt doubles. The guild maintains that the creation of digital doubles is also subject to collective bargaining and that contracts must clearly state that a scan created for a project can be used only for that project and the agreed-upon scenes, and nothing else.

SAG-AFTRA’s work in combating the unlawful application of AI has had ramifications beyond Hollywood, leading to reforms in the pornography industry, where some of the first deepfakes originated. “We were instrumental in drafting legislation that prohibits unauthorized creation of sexually explicit deepfakes and championing it through the legislature in California and New York,” notes Van Lier. The resulting California law took effect in January 2020, and a New York law followed in 2021. The guild has supported similar legislation in other jurisdictions.

“This area is in its infancy,” Van Lier says. “In a lot of ways your contract is your only protection.” Some non-SAG-AFTRA contracts from companies that create digital humans for entertainment, advertising and other media are “oppressive and egregious,” she says. “You are giving up your rights. And somebody early in their career could be precluding themselves from future work. That goes back to knowing what rights you are granting.

“It’s not to say that it’s all bad,” she continues. “There are a lot of exciting new forms of work, but it all comes down to making sure you maintain control and are fairly compensated.”

This story first appeared in the July 20 issue of The Hollywood Reporter magazine.