WASHINGTON, DC – November 18: President Trump meets with Saudi leaders at the White House as the administration considers an executive order that could override state AI laws and reorganize the entities that set the rules for the technology. (Photo by Win McNamee/Getty Images)
A leaked draft executive order reveals that the Trump administration is preparing to challenge state AI laws as unconstitutional, make federal funding conditional on state compliance, and deploy the Justice Department to litigate noncompliant states.
The industry agrees on the need for federal policy to pre-empt a patchwork of state laws. But not everyone speaks with the same voice.
The draft order is a bold measure that would halt state action without putting a national framework in its place. Supporters say a unified federal approach is needed to protect America’s competitiveness, prevent compliance costs from ballooning across 50 jurisdictions, and keep pace with global rivals accelerating their own AI strategies.
Earlier this summer, a proposed moratorium on state AI legislation was stripped from the Republican budget reconciliation package, the One Big Beautiful Bill Act, after the Senate failed to win over prominent MAGA Republicans like Sen. Josh Hawley. He recently said, “As a Republican who believes in federalism, I think it’s a strange argument to have some Republicans in this building suddenly arguing that we should say to the states, ‘No, we shouldn’t actually do anything.’”
In July, the AI Action Plan set the tone for an accelerationist vision: the belief that AI development should proceed as quickly as possible while minimizing regulatory friction. It framed advances in AI as a geopolitical competition for economic and national security advantage, reflecting the preferences of Silicon Valley’s venture community while at odds with civil society calls for a more balanced approach.
Contents of the draft
The draft order aims to block state-level AI laws and create minimally intrusive national standards. The administration argues that state laws such as California’s Frontier Artificial Intelligence Transparency Act impose harmful compliance burdens and undermine America’s competitiveness. In effect, it revives elements of the failed moratorium proposal.
The order directs the creation of the Department of Justice AI Litigation Task Force to challenge state AI laws that regulate interstate commerce as unconstitutional. Its reach will extend far beyond California. Colorado, Illinois, and other states have already enacted AI laws covering employment decisions, consumer disclosures, and algorithmic accountability. If a Justice Department task force moves forward, these laws could face federal legal challenges.
The order would also commission a Commerce Department review to identify state AI laws inconsistent with the administration’s AI policies and refer them to the Justice Department task force. The goal is to flag laws that compel models to alter their outputs or that force disclosures the administration argues violate First Amendment protections.
The order further directs the Commerce Department to outline funding restrictions for noncompliant states. Federal agencies would review their discretionary grants and could condition them on states refraining from enacting or enforcing conflicting AI laws during the period they receive the funds.
The FCC would create national disclosure standards that preempt state laws, while the FTC would issue guidance under its unfair-and-deceptive-practices authority clarifying when state laws requiring changes to model outputs are preempted.
Finally, the order directs Special Advisor for AI and Crypto David Sacks and Director of Legislative Affairs James Braid to draft legislation that would establish a federal regulatory framework. Critics see the administration as bowing to the Silicon Valley lobby.
What the industry wants
Not everyone in the industry thinks like the Valley’s accelerationists.
Traditional enterprise technology companies strike a more careful balance. Companies like Microsoft and IBM support innovation but stress that trust and accountability are essential for widespread adoption. Microsoft emphasized support for strong standards in its AI Action Plan input; IBM emphasized clear roles for developers and deployers and called for controls on high-risk uses. These companies argue that federal standards can reduce fragmentation while building the trust on which long-term adoption depends.
Anthropic is one of the most vocal supporters of safety and national security regulations. The company favors uniform federal AI regulation, opposes a blanket moratorium on state laws, and supports state safety measures while federal action is delayed. Posting on X earlier this year, Sacks accused Anthropic of running a “sophisticated regulatory capture strategy based on fear-mongering.”
Future direction of AI policy
The administration and its allies are rapidly stepping into the policy vacuum left by Congress. Their approach offers real benefits: reducing fragmentation, clarifying industry incentives, and strengthening the culture of innovation that has fueled American entrepreneurship.
But sustainable growth requires more than that. Markets need confidence to avoid the boom-and-bust cycles that characterized past technology waves. Alongside federal action, governance alternatives such as dynamic oversight models and independent verifiers can help build that trust: rules that adapt quickly, with checks performed by credible third parties.
The administration’s willingness to override state AI laws should remind Congress that only laws grounded in democratic debate can provide a durable and stable framework. If signed, the order would mark the most aggressive federal intervention in state technology regulation in decades and set off an inevitable legal battle over the limits of executive power. Meanwhile, Congress has shown little appetite for action. The question is whether things change before states and the White House come to blows.