The House passed a bill Monday evening that would make it a crime to create nonconsensual deepfake pornography using a person’s likeness.
The Take It Down Act sailed through the chamber in a 402-2 vote, marking one of the first major pieces of legislation passed by Congress to address AI-generated deepfakes.
The bill would make it a federal crime to publish nonconsensual intimate images of others, whether real or AI-generated, and would require platforms to remove such images within 48 hours of receiving notice. It also empowers the Federal Trade Commission to investigate and enforce compliance.
The version of the Take It Down Act passed by the House has already cleared the Senate, leaving it just a presidential signature away from becoming law.
The Senate bill, sponsored by Sens. Amy Klobuchar, D-Minn., and Ted Cruz, R-Texas, drew an extraordinary mix of bipartisan support in today’s Washington, with every House Democrat voting in favor of a measure endorsed by both First Lady Melania Trump and former Biden administration disinformation official Nina Jankowicz.
The creation of nonconsensual deepfake pornography using the faces and bodies of real people is a scourge whose victims range from famous celebrities to anonymous high school students. Multiple studies have found that the vast majority of deepfakes on the internet are nonconsensual nude or pornographic images, many of which swap in the faces and likenesses of real people.
Last year, singer Taylor Swift cited the issue when she endorsed Democratic presidential candidate Kamala Harris, after Donald Trump and X owner Elon Musk shared AI-generated deepfakes of her.
Klobuchar, who helped lead the committee for Trump’s second inauguration, said last month that she used the inaugural luncheon, with Melania Trump in attendance, as an opportunity to drum up support for the Take It Down Act.
“Use the moments you can,” joked Klobuchar.
Melania Trump praised the bill’s passage and said she was honored to have lent it her support.
“The bipartisan passage of the Take It Down Act today is a powerful statement that we are united in protecting the dignity, privacy and safety of our children,” she said in a statement. “We are grateful for the members of Congress who voted to protect the well-being of our youth — both in the House and Senate.”
But despite the bill’s rapid passage through Congress, critics and some technology experts say the Take It Down Act is another example of a well-intentioned law that could lead to unintended consequences.
Becca Branum, deputy director of the Center for Democracy & Technology’s Free Expression Project, said the bill carries “dangerous” implications for online privacy and speech.
She expressed concern that the 48-hour takedown provision would overburden smaller platforms and pose an existential threat to end-to-end encrypted apps, which cannot technically access users’ content or messages.
Branum also highlighted the enforcement role given to the FTC, which has only three Republican commissioners after President Trump moved to fire its two Democrats, calling reliance on the administration’s FTC “a recipe for weaponized enforcement.”
“The Take It Down Act is well-intentioned, but it was written without the safeguards needed to prevent the mandated removal of content that is not a nonconsensual intimate image, leaving it vulnerable to constitutional challenges and abusive takedown requests,” Branum said. “Its vague text can also be read to create impossible requirements for end-to-end encrypted platforms to remove content they cannot access.”
Rep. Thomas Massie, R-Ky., one of the two House members who voted against the bill, explained his decision by saying the measure’s provisions are ripe for misuse by bad actors.
“I’m voting no because I feel this is a slippery slope, ripe for abuse, with unintended consequences,” Massie wrote on X before the vote.
Concerns about potential partisan abuse under the Trump administration are something even the bill’s supporters acknowledge. Jankowicz, now head of the American Sunlight Project, a nonprofit dedicated to fighting disinformation, has pushed for years for a law to protect young women and men from nonconsensual deepfakes, but she says the government and the private sector must be held accountable as the measure is put into practice.
“My hope is that this bill will be implemented to protect vulnerable people, not misused by those seeking to exploit the new rules,” Jankowicz said Tuesday on LinkedIn. “I hope platforms remember who this law is meant to protect and do not bend to anyone who would wield it for their own political purposes.”
Zephyr Teachout, a law professor at Fordham University and an advocate for the bill, said she expects opponents to bring legal challenges arguing that the law’s criminalization of deepfakes targets speech protected by the First Amendment. But unlike in previous content moderation cases involving TikTok and NetChoice, which raised similar free speech arguments, challengers will face a much higher bar in claiming that the activity the law prohibits deserves such protection.
“The core actions here don’t deserve First Amendment protection. I think it’s really important, and we’ll see challenges, but I think this is a much easier case than the past ones,” she said Monday.