Baidu unveils AI computing platform powered by Chinese chips to push a domestic tech stack

The Baige 5.0 infrastructure platform aims to raise the efficiency of DeepSeek’s open-source AI models

Baidu’s Baige 5.0 AI infrastructure platform delivers enhanced model training and inferencing capabilities. Photo: Shutterstock
Ben Jiang in Beijing
Baidu, one of China’s artificial intelligence champions, on Thursday unveiled its Baige 5.0 AI infrastructure platform – powered by a mix of semiconductors, including those designed by its Kunlunxin unit – to raise the efficiency of DeepSeek’s open-source models.
The Beijing-based AI and internet search giant’s upgraded platform delivers faster network connections, increased computing power, and enhanced model training and inferencing capabilities, executive vice-president Shen Dou, who also serves as president of Baidu AI Cloud Group, said at a corporate event in the Chinese capital on Thursday.
Baige’s inferencing system – backed by adaptive and smart resource allocation technologies, which speed up data throughput and lessen communications latency – improved the inferencing efficiency of DeepSeek’s R1 reasoning model by around 50 per cent, according to Shen. Inferencing is the process that a trained AI model uses to draw conclusions in response to human queries.

“That means, with the same time and cost … we could have the model ‘think’ 50 per cent more [or] work 50 per cent more,” Shen said.

The launch of Baige 5.0 reflects the growing efforts across the mainland’s AI and semiconductor sectors to push forward the development of a domestic AI technology stack, reducing the impact of US trade restrictions on China.
Baidu executive vice-president Shen Dou serves as the president of the company’s AI Cloud Group. Photo: Handout

Shen said the Kunlunxin Super Node, which supports hundreds of interconnected AI chips, had gone live on the Baige 5.0 platform, making it capable of deploying and running a trillion-parameter AI model within minutes. The number of parameters in an AI model indicates its size and complexity.
