In 2024, AI tools rapidly penetrated people's lives, but AI legislation in the United States moved much more slowly. Dozens of AI-related bills were introduced in this Congress to fund research or mitigate the technology's harms, but most were buried in partisan gridlock or overshadowed by other priorities. In California, a bill aimed at making AI companies liable for harms passed the state legislature easily, but Gov. Gavin Newsom vetoed it.
This inaction has some AI skeptics increasingly concerned. "We're seeing a repeat of what happened with privacy and social media: failing to put guardrails in place from the beginning to protect people and drive real innovation," Ben Winters, director of AI and data privacy at the Consumer Federation of America, tells TIME.
Meanwhile, industry advocates have succeeded in persuading many policymakers that overregulation will harm the industry. So instead of trying to pass a comprehensive AI framework, as the EU did with its AI Act in 2023, the US might find agreement on individual areas of concern one by one.
As the calendar turns, here are the top AI issues Congress may want to tackle in 2025.
Banning specific harms
The first AI-related harm that Congress may focus on is the proliferation of non-consensual deepfake pornography. This year, new AI tools made it possible to sexualize and humiliate young women with the click of a button. These images spread rapidly online and in some cases were used as tools of extortion.
Cracking down on these images seemed like an easy win for almost everyone. Leaders of both parties, parent activists, and civil society groups all pushed for legislation, yet bills stalled at various stages of the legislative process. Last week, the Take It Down Act, sponsored by Texas Republican Ted Cruz and Minnesota Democrat Amy Klobuchar, was incorporated into the House's funding bill after an extensive media and lobbying push by both senators. The measure would criminalize the creation of non-consensual deepfake pornography and require social media platforms to remove such images within 48 hours of being notified.
However, the funding bill collapsed after strong opposition from some Trump allies, including Elon Musk. Still, the Take It Down Act's inclusion in the funding bill signals that it has buy-in from top leaders in both chambers of Congress, says Sunny Gandhi, vice president of political affairs at the AI-focused advocacy group Encode. He adds that a similar bill, the Defiance Act, which would allow victims to bring civil suits against deepfake creators, could also be a priority next year.
Activists will also seek legislative action on other AI harms, such as the vulnerability of consumer data and the risk that companion chatbots could encourage self-harm. In February, a 14-year-old boy died by suicide after forming a relationship with a chatbot that urged him to "come home." But the difficulty of passing even relatively uncontroversial measures like the deepfake-porn legislation portends a hard road for these other efforts.
Increased funding for AI research
At the same time, many lawmakers intend to prioritize promoting the growth of AI. Industry advocates have characterized AI development as an arms race in which the United States risks falling behind other countries unless it invests more in the field. On December 17, the bipartisan House AI Task Force released a 253-page report on AI that emphasized the need to promote "responsible innovation." "From optimizing manufacturing to developing treatments for serious diseases, AI can greatly improve productivity and help us reach our goals faster and more cost-effectively," task force co-chairs Jay Obernolte and Ted Lieu wrote.
In this vein, Congress is likely to seek increased funding for AI research and infrastructure. One bill that attracted interest but failed to cross the finish line was the CREATE AI Act, which aims to establish a national AI research resource for academics, researchers, and startups. "This is about democratizing who is part of this community and this innovation," Sen. Martin Heinrich, a New Mexico Democrat and the bill's lead author, told TIME in July. "I don't think we can afford to do all this development in just a few areas of the country."
More controversially, Congress may also seek to fund the integration of AI tools into U.S. warfighting and defense systems. Trump allies, including David Sacks, the Silicon Valley venture capitalist whom Trump has dubbed the "White House AI and Crypto Czar," have expressed interest in weaponizing AI. Defense contractors recently told Reuters they expect Elon Musk's Department of Government Efficiency to explore more joint projects between contractors and AI technology companies. And in December, OpenAI announced a partnership with defense technology company Anduril to use AI to defend against drone attacks.
This summer, Congress helped allocate $983 million to the Defense Innovation Unit, which works to bring new technology to the Department of Defense. (This was a significant increase over past years.) The next Congress could approve an even larger funding package for similar efforts. "There have always been barriers to entry for newcomers at the Department of Defense, but for the first time we are starting to see smaller defense companies compete for and win contracts," says Tony Samp, AI policy director at the law firm DLA Piper. "There is a desire in Congress right now to be disruptive and to move things faster."
Senator Thune in the spotlight
One of the key figures shaping AI legislation in 2025 will be Republican Sen. John Thune of South Dakota, who has expressed a keen interest in the issue and will become Senate Majority Leader in January. In 2023, Thune worked with Klobuchar to introduce legislation aimed at promoting transparency in AI systems. While Thune has criticized Europe's "heavy-handed" approach to AI, he has also made clear the need for incremental regulation to address AI applications in high-risk areas.
"I'm hopeful that some positive things will come from the fact that the Senate Majority Leader is one of the Senate Republicans most engaged with technology policy in general," Winters says. "That could lead to further action on things like children's privacy and data privacy."
Influence of President Trump
Of course, the Senate will have to take some cues from President Trump on AI next year. It's unclear exactly how Trump feels about the technology, and a number of Silicon Valley advisers with differing AI ideologies will likely be vying for his ear. (Marc Andreessen, for example, wants AI developed as fast as possible, while Musk has warned of the technology's existential risks.)
While some expect Trump to approach AI purely from a deregulatory perspective, Alexandra Givens, CEO of the Center for Democracy and Technology, points out that Trump was the first president to issue an executive order on AI, in 2020, which addressed the technology's impact on society and the need to protect people's civil rights and privacy. "We hope he stays within that framework and that this doesn't become a partisan issue that breaks down along party lines," she says.
States can move faster than Congress
As always, getting anything through Congress will be difficult, so state legislatures may take the lead in enacting their own AI laws. Left-leaning states in particular may seek to address areas of AI risk that the Republican-controlled Congress is unlikely to touch, such as racial and gender bias in AI systems and environmental impacts.
Colorado, for example, passed a law this year regulating the use of AI in high-risk scenarios such as reviewing applications for jobs, loans, and housing. "We approached these high-risk uses while keeping a light touch," Givens says. "It's a very attractive model." In Texas, a state lawmaker recently introduced a bill modeled on Colorado's law, which will be considered in the legislature next year. Meanwhile, New York State will consider legislation that would limit the construction of new data centers and require reporting of their energy consumption.