Congress has overwhelmingly approved bipartisan legislation to enact stricter penalties for the distribution of nonconsensual intimate imagery, sometimes called “revenge porn.” The bill, known as the Take It Down Act, is headed to President Donald Trump’s desk for his signature.
The measure was introduced by Sen. Ted Cruz, a Republican from Texas, and Sen. Amy Klobuchar, a Democrat from Minnesota, and later gained the support of first lady Melania Trump. Critics of the measure, which addresses both real and artificial intelligence-generated imagery, say the language is too broad and could lead to censorship and free speech problems.
What is the Take It Down Act?
The bill makes it illegal to knowingly publish, or threaten to publish, intimate images without a person’s consent, including “deepfakes” created by AI. Additionally, websites and social media companies must remove such material within 48 hours of notice from a victim, and platforms must take steps to delete duplicate content. While many states have already banned the dissemination of sexually explicit deepfakes and revenge porn, the Take It Down Act is a rare example of federal regulators imposing requirements on internet companies.
Who supports it?
The Take It Down Act was championed by Melania Trump, who lobbied on Capitol Hill in March, and passed with strong bipartisan support. President Trump is expected to sign it into law.
Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated “deepfake” of the then-14-year-old.
Meta, which owns and operates Facebook and Instagram, supports the law.
“Having an intimate image, real or AI-generated, shared without consent can be devastating, and Meta developed and backs many efforts to help prevent it,” Meta spokesman Andy Stone said last month.
The Information Technology and Innovation Foundation, a think tank supported by the technology industry, said in a statement Monday that the passage of the bill is “an important step forward that will help people pursue justice if they are victims of nonconsensual intimate images, including deepfake images generated using AI.”
“Victims of online abuse deserve the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse,” Klobuchar said in a statement after the bill’s passage late Monday. “These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable.”
What are the censorship concerns?
Free speech advocates and digital rights groups say the bill is too broad and could lead to censorship of legitimate images, including legal pornography and LGBTQ content, as well as of government critics.
“While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy,” said the nonprofit Electronic Frontier Foundation, a digital rights advocacy group. “Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse.”
The bill’s takedown provisions, the group said, apply to “a much broader category of content, potentially any images involving intimate or sexual content,” than the narrower definitions of nonconsensual intimate imagery found elsewhere in the text.
“The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are infamously blunt tools,” EFF said. “They frequently flag legal content, from fair-use commentary to news reporting. The law’s tight time frame requires that apps and websites remove speech within 48 hours.”
As a result, the group said, online companies, especially smaller ones that lack the resources to wade through large amounts of content, “will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it.”
The measure, EFF said, also pressures platforms to “actively monitor speech, including speech that is presently encrypted,” to address the threat of liability.
The Cyber Civil Rights Initiative, a nonprofit organization that supports victims of online crimes and abuse, said it has “serious reservations” about the bill. It called the takedown provision unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.
For example, the group said, platforms could be required to remove a journalist’s photos of a topless protest on a public street, or photos of a subway flasher distributed by law enforcement to help locate the perpetrator.