Bipartisan bill protects artists’ and individuals’ likenesses from AI duplication

The popular recording artist The Weeknd, shown here performing in Mexico City in September 2023, was cited in a new bill that would protect artists' likenesses and work from unauthorized AI duplication. Ismael Rosas/ Eyepix Group/Future Publishing via Getty Images

The No AI FRAUD Act penalizes cyber actors who leverage generative artificial intelligence and machine learning to duplicate images and voices without a person’s consent.

Lawmakers are continuing to try to fill the legal gaps accompanying rapid innovations in artificial intelligence technologies with a new bill introduced Wednesday that targets fake content generated by machine learning models.

The No AI Fake Replicas and Unauthorized Duplications Act of 2024 — shortened to the No AI FRAUD Act — was introduced by Reps. Maria Salazar, R-Fla., and Madeleine Dean, D-Pa., and looks to protect U.S. citizens from predatory deepfakes and falsified images and content.

The bill’s text cites examples of generative AI technology being used to defraud individuals by mimicking or emulating their voices or likenesses, including music artists Drake and The Weeknd, as well as a group of female high school students in Westfield, New Jersey.

“It is time that the bad actors that utilize AI face the situation,” Salazar said in a statement. “This bill fills a space in the law and grants artists and American citizens the power to protect their rights, their creative work and their fundamental individuality online.”

The No AI FRAUD Act specifically grants individuals sole ownership of the rights to their likenesses and voices. These rights are transferable only by the individual, and they do not expire upon the individual’s death.

Unauthorized duplication, replication or falsification of someone’s likeness can result in a fine of up to $50,000. The bill also notes that a First Amendment defense against this restriction can be invoked, but it would be weighed against the measure’s intellectual property rights provisions.

In addition to the bipartisan duo introducing the legislation, Reps. Nathaniel Moran, R-Texas, Joe Morelle, D-N.Y., and Rob Wittman, R-Va., have signed on in support.

Following the bill’s introduction, advocacy groups endorsed its intent, with the Human Artistry Campaign calling it a “landmark” bill.

“Timely action is critical as irresponsible AI platforms are being used to launch deepfake and voice impersonation models depicting individuals doing and saying things they never have or would,” said Moiya McTier, the Human Artistry Campaign’s senior advisor. “This not only has the potential to harm these artists, their livelihoods and reputations, but also degrades societal trust. There has never been a more important time for our leaders to demand responsible and ethical AI that works for people — not against them.”

The Recording Industry Association of America also threw its support behind the bill. CEO Mitch Glazier said that “as decades of innovation have shown, when Congress establishes strong IP rights that foster market-led solutions, it results in both driving innovation and supporting human expression and partnerships that create American culture.”