With only one week left in his term, President Biden signed an executive order setting aside federal land for the construction of artificial intelligence (AI) data centers. Construction is to be paid for entirely by the developer of the AI foundation (or frontier) model, such as OpenAI's GPT-4o. AI model developers must also ensure there is enough clean energy to power these data centers, as intense AI workloads are notoriously energy-intensive.
The latest order follows Biden's October 2023 executive order, which set guardrails for powerful frontier or foundation AI models. That order included ensuring that the government can evaluate AI systems before they are deployed in areas that pose cybersecurity and other national security risks.
Biden also pledged to develop labeling and content-provenance mechanisms to help consumers identify which content is generated by AI. President Trump issued his first executive order on AI in 2020, calling for the use of AI in the federal government. Various states (California and Texas, among others) also have their own AI rules.
UK and EU regulations
AI regulation in the US differs from that in the UK and Europe. The EU's AI Act is a more comprehensive law that assesses AI applications according to three risk levels: unacceptable risk (such as government scoring of individuals based on social behavior), high risk (such as résumé-scanning AI tools that rank job seekers), and applications that are neither prohibited nor considered high risk.
British Prime Minister Keir Starmer announced an action plan on Monday (13 January) to make the UK a leader in AI, including expanding data center capacity for AI workloads. Starmer said formal AI regulation was coming. His predecessor, Rishi Sunak, had announced an AI regulatory framework for existing regulators to follow.
AI chip export regulations
Biden also expanded export regulations on AI chips in 2022 and 2023, which were intended to prevent China and other adversaries from getting their hands on AI hardware. This week's new rules divide the world into haves and have-nots: 18 allies and partners will not be subject to any restrictions at all. Buyers ordering smaller quantities of chips, up to 1,700 advanced GPUs' worth of computing power, will also get the green light; these are usually universities and research institutions.
However, more than 120 other countries reportedly face new rules for setting up AI computing facilities. Trusted organizations include those based in close US allies and not headquartered in "countries of concern." Businesses not located in allied countries can still purchase up to 50,000 advanced GPUs per country. Biden also ordered that AI model weights be kept from untrusted entities, along with other security controls.
Nvidia criticizes chip limits
The rules are expected to impact Nvidia, whose GPU chips are the silicon of choice for training and running inference on AI models. The company holds a market share of 80% or more.
Nvidia positioned itself for the AI revolution starting in 2006. Its GPUs were originally developed for games and other graphics-intensive applications, but co-founder and CEO Jensen Huang staked the company's future on a pivot to AI, even as progress in the field slowed during the so-called "AI winter."
Ned Finkle, Nvidia's vice president of government affairs, criticized Biden's new rules. In a blog post, he wrote that AI advancement worldwide is "now at risk," and that Biden's "misguided" policies "threaten to stifle innovation and economic growth around the world."
Finkle criticized Biden's expansion of the export control rules as "over 200 pages of regulatory morass, drafted in secret and without proper legislative review." Such regulatory measures, he added, would "waste America's hard-won technological advantage."
Finkle praised the first Trump administration for fostering an environment for AI innovation and said Nvidia was "looking forward" to a return to Trump administration policies as the former president prepares to take the oath of office.
The Semiconductor Industry Association expressed a similar opinion in its own statement: "We are extremely disappointed that a policy change of this magnitude and impact was announced so suddenly, just days before the presidential transition, and without meaningful input from industry."
OpenAI’s plans for the US
As OpenAI CEO Sam Altman signaled plans to attend President Trump's inauguration, his AI startup released a blueprint for keeping America at the forefront of AI development.
"We believe America needs to act now to maximize AI's possibilities while minimizing its harms. AI is too powerful a technology to be led and shaped by autocrats, but that is the growing risk we face, while the economic opportunity AI presents is too compelling to forfeit," OpenAI said in its "AI in America" economic plan.
OpenAI’s vision is based on the following beliefs:
Chips, data, energy, and talent are the keys to winning the AI race. The US is certainly an attractive market for AI investment, with $175 billion in global capital waiting to be deployed; if it does not flow to American projects, it will go to China-backed ones instead.
Free markets that promote free and fair competition foster innovation. This includes the freedom for developers and users to work with AI tools while following clear, common-sense standards that keep AI safe for everyone. Governments must not use these tools to consolidate power and control their citizens.
To ensure the security of frontier models, the federal government should develop best practices to prevent their exploitation, restrict the export of frontier models to adversary nations, and develop a US-led alternative to the current patchwork of national and international regulations.
The federal government should share its expertise in protecting AI developers' intellectual property from threats and help companies gain access to secure infrastructure, such as sensitive computing clusters, to evaluate model risks and safeguards. It should also create a voluntary pathway for companies that develop large language models to work with government to define model evaluations, test models, and exchange information in support of those safeguards.
While the federal government should take the lead on AI issues related to national security, states can act to maximize AI's benefits within their own AI ecosystems, powering AI experimentation by supporting startups and smaller AI companies searching for ways to solve everyday problems.