
Supersized and scaling: China pushes 10,000-card computing clusters in AI race

Cities and tech giants are racing to build massive AI clusters, betting scale will drive faster model training, lower costs and wider adoption

A 10,000-card cluster functions as a supercomputer, integrating high-performance GPUs and advanced storage into a single system. Photo: Getty Images
Ann Cao in Shanghai

In China, computing facilities have emerged as a new form of infrastructure over the past two years, sparking an arms race among cities and technology companies to build 10,000-card computing clusters.

These clusters – which link 10,000 or more artificial intelligence accelerator chips – enable faster iteration of AI capabilities and significantly reduce model training times.
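The time savings follow from simple arithmetic: with a fixed compute budget for training a model, total training time falls roughly in proportion to the number of accelerators (ignoring communication overheads, which grow with cluster size). A minimal sketch, where the per-chip throughput, utilisation rate and training budget are all illustrative assumptions rather than figures from the article:

```python
# Back-of-envelope estimate of training time vs cluster size.
# All numbers here are hypothetical assumptions for illustration,
# not vendor or article figures.

def training_days(total_flops: float, chips: int,
                  flops_per_chip: float = 3e14,  # assumed per-chip throughput (FLOP/s)
                  utilization: float = 0.35) -> float:
    """Rough training time in days: total compute divided by
    effective cluster throughput (chips x per-chip FLOP/s x utilisation)."""
    seconds = total_flops / (chips * flops_per_chip * utilization)
    return seconds / 86_400  # seconds per day

# A hypothetical model requiring 1e24 training FLOPs:
print(training_days(1e24, 1_000))   # ~110 days on a 1,000-card cluster
print(training_days(1e24, 10_000))  # ~11 days on a 10,000-card cluster
```

In this idealised model a tenfold increase in cards cuts training time tenfold; in practice, networking and synchronisation overheads erode some of that gain, which is why integrating the chips and storage into a single coherent system matters.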

Domestic champions, from tech giants such as Huawei Technologies and Alibaba Group Holding to graphics processing unit (GPU) specialists like Moore Threads, are competing to position their chips at the centre of these systems.

Here’s how these clusters work and where the technology may be headed.

Who is building them?

A 10,000-card cluster functions as a supercomputer, integrating high-performance GPUs and advanced storage into a single system.
