The wave of agents flooding the AI industry shows little sign of letting up, with some predicting $1 trillion in capital investment over the next few years.
650 Group co-founder and analyst Alan Weckel called the agentic AI forecasts “absolutely insane numbers,” and he laid out his bullish financial outlook for the neocloud industry during a recent streaming roundtable discussion about the forces behind the emergence of a new class of vendors, such as CoreWeave, Nebius, and Lambda, that sell GPUs as a service.
Weckel’s outlook assumes similarly impressive growth in the network of AI data centers that neocloud providers and major generative AI vendors are building at a breakneck pace around the world, growth that is also fueling talk of an AI bubble.
“What this means is that there are hundreds of boxes coming into the data center every minute, which means that there are many racks being deployed every minute, and the inter-building connections required to connect them are in the terabits range,” Weckel continued.
Data center links
New AI data centers need to be connected, or networked, because as AI workloads grow, enterprises can no longer train huge foundation models measured in trillions of parameters within a single facility.
“We have to find additional power, so we end up distributing the GPUs across multiple facilities,” Weckel said.
AI vendors are also realizing that they need to build separate, dedicated facilities for training and for inference. “AI doesn’t live in a bubble. Once you start doing the inference side of things, it’s really important to bring in those datasets,” he continued. “And the data needed for inference isn’t always local.”
Beyond the many reasons AI data centers are being networked, proponents of the AI boom see ever-increasing demand for more computing power to handle large-scale production and agentic AI workloads, and Weckel sees a new wave of investment coming in response to physical AI, the world of AI-powered robots and autonomous transportation.
“The key message is that we are going to see trillions of dollars in spending towards the end of this decade that will propel us into the next decade,” Weckel continued.
Vendor’s view
The online event was hosted by Ciena, an optical networking vendor. Ciena uses the term neoscaler for a new group of AI cloud vendors that, like the traditional hyperscaler cloud giants, operate at scale, but in a more decentralized manner. The neoscalers’ strategy is to network together sometimes distant clusters of GPUs that are made available on demand to run AI workloads for enterprise clients, generative AI vendors, and even the hyperscalers themselves.
“As AI workloads emerge and the demand for AI traffic increases, these companies need to scale to make their business work,” Marc Bieberich, Ciena’s vice president of portfolio marketing, said at the Dec. 4 event. “We need to scale quickly, not just in terms of computing infrastructure, but also the network infrastructure that supports it.”
“Their business is really operating in a ‘pay as you grow’ mode, or on-demand mode…as I call it, pay-as-you-go GPU computing,” he added. “These companies are very well-capitalized and are building network infrastructure.”
Ciena focuses on spectrally efficient optical networking systems. Bieberich said the company is working with a group of about 24 neoscalers that have “already decided that they need to build their own optical network infrastructure to support their businesses.”
Bieberich added that Ciena sees “very sustained” demand for its services over the next five to 10 years. “There is really huge growth potential in the market,” he said.
Meanwhile, another sector driving significant growth in spending on AI data center construction is sovereign AI, Weckel said.
“The national piece is very important from a data sovereignty perspective. It’s going to be very difficult for AI to be located outside its country of origin for regulatory and national security reasons,” he said.
Powering the data centers
AI data centers place a huge demand on the existing power grid, so their builders are exploring different ways to generate the large amounts of electricity needed. “What’s happening now is that operators are looking to other municipalities, other sources of power,” Weckel said.
Weckel cited Meta as an example of the increasing demand for power from major AI vendors and users. The social media giant operates a gigawatt-scale data center in Ohio and is now planning its next data center, a 5-gigawatt facility in Louisiana. Meanwhile, Microsoft is building a multi-gigawatt center in Wisconsin, which the tech giant plans to connect to another center in Atlanta.
“So we’re going after very large power sources, mainly nuclear and hydropower,” Weckel said. “You start putting more and more pins on the map. You’re going to need a lot of new connections that didn’t exist before.”
Data center operators also use solar and wind power, but only as backup in hybrid configurations, Weckel said. In places like the Middle East, where water is scarce but capital is abundant, operators spend millions on desalination plants to produce the fresh water needed to cool the large numbers of GPUs stacked inside their facilities. In water-rich regions such as Canada and Northern Europe, hydropower is becoming a major source of electricity.

