Ant Group has entered the multi-trillion parameter AI modeling space with Ling-1T, a newly open-sourced language model that the Chinese fintech giant positions as a breakthrough in balancing computational efficiency with advanced inference capabilities.
The October 9 announcement marks a significant milestone for the Alipay operator, which is rapidly building out its artificial intelligence infrastructure across multiple model architectures.
The trillion-parameter AI model demonstrated competitive performance on complex mathematical reasoning tasks, achieving 70.42% accuracy on the 2025 American Invitational Mathematics Examination (AIME) benchmark, a standard used to evaluate the problem-solving ability of AI systems.
According to Ant Group’s technical specifications, Ling-1T maintains this performance level while consuming an average of more than 4,000 output tokens per problem, putting it on par with what the company calls “best-in-class AI models” in terms of quality of results.
A two-pronged approach to AI advancements
The release of the trillion-parameter AI model coincides with that of Ant Group’s dInfer, a specialized inference framework designed for diffusion language models. This parallel release strategy reflects the company’s bet on multiple technological approaches rather than a single architectural paradigm.
Diffusion language models represent a departure from the autoregressive systems that underpin widely used chatbots such as ChatGPT. Rather than generating text sequentially, one token at a time, diffusion models generate output in parallel. The approach is already popular in image and video generation tools, but remains less common in language processing.
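The contrast can be sketched in a few lines of toy code. This is a conceptual illustration only, not Ant Group's implementation: random choices stand in for model forward passes, and the point is simply that autoregressive decoding commits to one token per step while diffusion-style decoding refines many masked positions per step.

```python
# Conceptual sketch (not Ant Group's implementation) of the two decoding
# paradigms. Random choices stand in for actual model forward passes.
import random

random.seed(0)
VOCAB = ["the", "model", "writes", "tokens", "fast"]
MASK = "<mask>"

def autoregressive_decode(length):
    """Generate one token at a time; each step depends on the prefix so far."""
    out = []
    for _ in range(length):
        out.append(random.choice(VOCAB))  # one "forward pass" per token
    return out

def diffusion_decode(length, steps=3):
    """Start from an all-masked sequence and unmask positions over a few
    refinement steps; each step can update many positions at once."""
    seq = [MASK] * length
    for _ in range(steps):
        for i in range(length):
            if seq[i] == MASK and random.random() < 0.5:
                seq[i] = random.choice(VOCAB)
    # final pass fills any positions still masked
    return [t if t != MASK else random.choice(VOCAB) for t in seq]

print(autoregressive_decode(5))
print(diffusion_decode(5))
```

Because each refinement step touches the whole sequence, a diffusion decoder needs far fewer sequential steps than tokens generated, which is the source of the throughput claims discussed below.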
Ant Group’s dInfer performance metrics suggest significant efficiency gains. In tests with the company’s LLaDA-MoE diffusion model, dInfer generated 1,011 tokens per second on the HumanEval coding benchmark. By comparison, Nvidia’s Fast-dLLM framework delivered 91 tokens per second, and Alibaba’s Qwen-2.5-3B model running on vLLM infrastructure reached 294 tokens per second.
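The reported figures can be turned into relative speedups with a little arithmetic. The throughput numbers below come from the article; the comparison script itself is ours.

```python
# Throughput figures (tokens/second on HumanEval) as reported in the article.
dinfer = 1011    # dInfer running LLaDA-MoE
fast_dllm = 91   # Nvidia Fast-dLLM
vllm_qwen = 294  # Qwen-2.5-3B on vLLM

# Relative speedup of dInfer over each baseline.
print(f"dInfer vs Fast-dLLM:  {dinfer / fast_dllm:.1f}x")  # ~11.1x
print(f"dInfer vs vLLM/Qwen:  {dinfer / vllm_qwen:.1f}x")  # ~3.4x
```

In other words, by these numbers dInfer is roughly an order of magnitude faster than Nvidia's diffusion framework and several times faster than the autoregressive vLLM baseline, though the three systems run different models and are not strictly like-for-like.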
“We believe that dInfer provides both a practical toolkit and a standardized platform to accelerate research and development in the rapidly growing field of dLLM,” Ant Group researchers said in an accompanying technical document.
Expanding the ecosystem beyond language models
The Ling-1T trillion-parameter AI model is part of a broader family of AI systems that Ant Group has built over the past few months.

The company’s portfolio currently spans three major series: Ling non-thinking models for standard language tasks, Ring thinking models designed for complex reasoning (including the previously released Ring-1T preview), and Ming multimodal models that can process images, text, audio, and video.
This diverse approach extends to an experimental model called LLaDA-MoE, which employs a Mixture-of-Experts (MoE) architecture, a technique that activates only the parts of a large model relevant to a given task, theoretically improving efficiency.
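The general MoE idea can be sketched minimally. This is an illustration of the technique itself, not LLaDA-MoE's actual architecture: a router scores all experts for an input, but only the top-k experts are evaluated, so compute scales with k rather than with the total expert count.

```python
# Minimal Mixture-of-Experts routing sketch (generic technique, not
# LLaDA-MoE's real design). Toy scalar functions stand in for expert
# networks; only the top-k scoring experts actually run.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, router_scores, k=2):
    """Evaluate only the top-k experts and mix their outputs by router weight."""
    ranked = sorted(range(len(experts)),
                    key=lambda i: router_scores[i], reverse=True)[:k]
    weights = softmax([router_scores[i] for i in ranked])
    # only k expert functions are called; the rest stay idle
    return sum(w * experts[i](x) for w, i in zip(weights, ranked))

# toy experts and pretend router logits for one input
experts = [lambda x: 2 * x, lambda x: x + 1, lambda x: -x, lambda x: x * x]
scores = [0.1, 2.0, 0.5, 1.5]
print(moe_forward(3.0, experts, scores, k=2))
```

With k=2 out of four experts, only half the expert computation runs per input; in a trillion-parameter model the same routing trick means only a small fraction of parameters is active for any given token.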
He Zhengyu, Chief Technology Officer of Ant Group, clearly stated the company’s position regarding these releases. “At Ant Group, we believe that artificial general intelligence (AGI) should be a public good, a common milestone toward humanity’s intellectual future,” he said, adding that the open source releases of both the trillion-parameter AI model and the Ring-1T preview represent a step toward “open and collaborative progress.”
Competitive dynamics in a constrained environment
The timing and nature of Ant Group’s release reveals China’s strategic calculations in the AI space. With export controls restricting access to cutting-edge semiconductor technology, Chinese technology companies are increasingly focusing on algorithmic innovation and software optimization as competitive differentiators.
TikTok’s parent company ByteDance similarly introduced a diffusion language model called Seed Diffusion Preview in July, which it claims is 5x faster than comparable autoregressive architectures. These parallel efforts suggest industry-wide interest in alternative model paradigms that may offer efficiency benefits.
However, the trajectory of actual adoption of diffusion language models remains uncertain. Autoregressive systems continue to dominate commercial deployments due to their proven performance in natural language understanding and generation, a core requirement for customer-facing applications.
Open source strategy as market positioning
By making its trillion-parameter AI model publicly available with the dInfer framework, Ant Group is pursuing a collaborative development model that contrasts with the closed approach of some competitors.
This strategy has the potential to accelerate innovation while positioning Ant’s technology as foundational infrastructure for the broader AI community.
The company is also developing AWorld, a framework aimed at supporting continuous learning for autonomous AI agents: systems designed to complete tasks independently on a user’s behalf.
Whether these efforts combine to establish Ant Group as a significant force in global AI development will depend partly on independent verification of its performance claims and partly on adoption rates among developers seeking alternatives to established platforms.
The open-source nature of the trillion-parameter AI model could facilitate this validation process while building a community of users invested in the technology’s success.
For now, these releases indicate that China’s leading technology companies view the current AI landscape as fluid enough to accommodate new entrants willing to innovate on multiple dimensions simultaneously.
SEE ALSO: Ant Group uses domestic chips to train AI models and reduce costs

AI News is brought to you by TechForge Media. Learn about other upcoming enterprise technology events and webinars.

