Senate and House members reintroduced legislation Wednesday that would give individuals the right to control the use of their digital likenesses, part of an effort to limit AI deepfakes and voice clones.
The No Fakes Act was introduced last year but failed to move forward in Congress, despite bipartisan support and revisions intended to alleviate First Amendment concerns.
The bill gives individuals the right to authorize the use of their voice and likeness in digital replicas. Those digital replication rights do not expire upon a person's death and can be transferred and licensed by heirs, executors and others, though they terminate 70 years after the individual's death.
The bill is sponsored by Sens. Marsha Blackburn (R-TN), Chris Coons (D-DE), Thom Tillis (R-NC) and Amy Klobuchar (D-MN). In the House, the sponsors are Reps. Maria Elvira Salazar (R-FL), Madeleine Dean (D-PA), Nathaniel Moran (R-TX) and Becca Balint (D-VT).
Mary Davis and Randy Travis speak onstage during the Grammys on the Hill Awards Dinner in Washington, D.C. on Tuesday. (Photo by Paul Morigi/Getty Images for the Recording Academy)
Many of the sponsors appeared at an event on Capitol Hill today, along with singer Randy Travis and his wife, Mary, as well as Robert Kyncl, CEO of Warner Music Group. Other speakers included MPA Chairman Charles Rivkin, Recording Academy CEO Harvey Mason Jr., SAG-AFTRA President Fran Drescher and YouTube's Vivien Lewit.
“The days of AI living in the Wild West are over,” Drescher said. “Now is the time to define what is right and what is wrong.”
Mason said the bill “will have an impact beyond just creative communities. It has a lot of real-world uses that affect average people living their daily lives, such as social media users, people running for local office, national politicians and more.”
The bill is supported by tech companies including OpenAI, Google, Amazon, Adobe and IBM, according to the Recording Industry Association of America. Its terms allow platforms to avoid liability if they promptly remove unauthorized deepfakes.
At a hearing on the proposed law last year, the MPA warned that the bill would violate the First Amendment: its broad scope would have required filmmakers to obtain approval before depicting historical figures in films like Forrest Gump. The studios ultimately threw their support behind the legislation after the language was updated to carve out projects such as documentaries and biopics, as well as commentary, criticism and parody.
“It wasn’t quick or easy to get to this point, but taking the time to get it right was worth it,” Rivkin said.
Travis, who suffered a stroke in 2013, released “Where That Came From” last year, using artificial intelligence to create new music in his own voice.
As he stood beside her, his wife read a message from him: “In the past year, AI technology has enabled me to release new music in my own voice for the first time in over a decade. I am now able to record music again. And these recordings, produced as an extension of my artistry, are very different from someone stealing my voice and producing music I had no part in.”
The White House has not yet taken a position on the bill. Blackburn said she has spoken to Michael Kratsios, director of the White House Office of Science and Technology Policy. A hearing before a Senate Judiciary subcommittee is planned.
Klobuchar said first lady Melania Trump supports another bill, the Take It Down Act, for which Klobuchar and Sen. Ted Cruz (R-TX) are the lead Senate sponsors. Klobuchar added that in his first term, Trump signed legislation supporting live event venues during the pandemic.
“When everything is as intense as it is now, you look for these signs that you might bring people around,” Klobuchar said. “I don’t think we should give up on anyone who supports this bill, because it’s so important, with the help of Marsha over there.”
Under the bill, tech platforms are not liable for hosting fraudulent digital likenesses if, once notified, they remove them “as soon as is technically feasible.” Platforms must also establish a policy to terminate the accounts of repeat violators.
Those who post deepfakes face damages of at least $5,000 per work, or actual damages plus profits from the misuse. Plaintiffs may also seek punitive damages upon proof of malice, fraud, knowledge or willful avoidance.
Coons said the key to gaining industry-wide support for the No Fakes Act was “making sure that the First Amendment protections and carve-outs are stronger and that the liability protections and caps are clearer.” The bill sets a $75,000 liability cap per work for platforms whose employees do not make a “good faith effort” to comply. It also makes clear that platforms are not obligated to monitor third-party content for potential deepfakes.
More to come.