A bipartisan group of senators has introduced the NO FAKES Act to address the rising threat of AI-generated voice and video clones. This draft legislation aims to offer legal protection not only to celebrities but also to ordinary people whose voices and images are AI-replicated and used without permission.
The NO FAKES Act
The draft policy is known as the Nurture Originals, Foster Art, and Keep Entertainment Safe Act, or NO FAKES for short. It allows individuals who’ve had their likeness AI-replicated without permission to pursue legal action against the creators. Liability extends beyond the makers to any platform that knowingly hosts, publishes, or distributes the AI-generated material. This move is seen as an essential step in protecting the rights of individuals in an era where deepfakes and AI-generated content have become more prevalent.
“Creators around the nation are calling on Congress to lay out clear policies regulating the use and impact of generative AI,” said Sen. Chris Coons (D-Del.), one of the co-sponsors of the draft. “And Congress must strike the right balance to defend individual rights, abide by the First Amendment, and foster AI innovation and creativity.”
Still, note that the NO FAKES Act also includes First Amendment exceptions, allowing the use of digital clones in news reporting, sports broadcasts, documentaries, and works of commentary, criticism, scholarship, satire, or parody.
Some notable examples of AI celebrity clones
There have been some notable cases involving deepfakes of celebrities. The first one I remember is the deepfake Tom Cruise that was all over the internet a year or two ago. A more recent example is Tom Hanks, whose AI clone was used in a dental ad.
I recently saw a very “spicy” image of young actress Millie Bobby Brown on Facebook. It was one of those posts Meta pushes on you even though you don’t follow the page. I wasn’t lazy: I found the original image in just a few clicks. There was a watermark, so I looked up the image’s creator on DeviantArt and saw that all of their artwork is tagged “celebrityfake” and “celebfake.” But it looks like nearly a thousand people in the comments didn’t bother to fact-check. And that’s just one image in one post on one social media platform! Sigh…
AI-generated content has also hit music performers. “Heart on My Sleeve,” an AI-generated song that mimics Drake and The Weeknd, was recently submitted for Grammy consideration. According to Billboard, the controversial artist Ghostwriter has also released a new AI-generated track that sounds like Travis Scott and 21 Savage.
The connection with the SAG-AFTRA strike
The NO FAKES Act’s introduction comes at a time when concerns about deepfake technology and unauthorized AI clones have escalated. After all, this is what prompted labor groups like SAG-AFTRA to advocate for regulations in the entertainment industry.
SAG-AFTRA, a labor union representing actors and performers, has been on strike for a few months, partly over concerns about film and television studios using AI clones instead of real actors. Negotiations between the union and studios have been ongoing, with regulations surrounding AI-generated content at the forefront of discussions.
The NO FAKES Act has garnered support from both performers and industry groups. SAG-AFTRA sees it as a valuable tool for performers, providing recourse and protection against harmful AI-generated material. The Motion Picture Association also anticipates working with Congress to balance creative freedom and AI protections.
The draft legislation addresses the need to protect individuals’ names, images, and likenesses in an era where AI technology continues to advance and pose new challenges. Sen. Marsha Blackburn (R-Tenn.) praised the legislation as a positive safeguard for the creative community’s rights.
The draft policy serves as a framework for future legislation, with lawmakers planning to introduce a bill based on the draft language in the coming months.