World’s top AI brains debate if DeepSeek’s model is a game changer
Experts weigh in on how quickly China is catching up with the US in AI, as DeepSeek takes an open-source approach
![DeepSeek has been making waves in the global AI community. Photo: Reuters](https://cdn.i-scmp.com/sites/default/files/styles/1020x680/public/d8/images/canvas/2025/02/03/6aaeff14-a4f9-402b-a90a-e6bc40b06614_7eb51446.jpg?itok=RkKuLi2V&v=1738576723)
Industry heavyweights from OpenAI CEO Sam Altman to former Baidu and Google scientist Andrew Ng have praised the open-source approach of DeepSeek, following its release of two advanced AI models.
Based in Hangzhou, capital of eastern China’s Zhejiang province, DeepSeek stunned the global AI industry with its open-source reasoning model, R1. Released on January 20, the model showed capabilities comparable to closed-source models from ChatGPT creator OpenAI, but was said to have been developed at significantly lower training costs.
DeepSeek said its foundation large language model, V3, released a few weeks earlier, cost only US$5.5 million to train. That statement stoked concerns that tech companies had been overspending on graphics processing units for AI training, leading to a major sell-off of AI chip supplier Nvidia’s shares last week.
![OpenAI CEO Sam Altman. Photo: AFP](https://img.i-scmp.com/cdn-cgi/image/fit=contain,width=1024,format=auto/sites/default/files/d8/images/canvas/2025/02/03/e4afd981-4546-43a0-bd32-ae9492361924_0425409f.jpg)
OpenAI “has been on the wrong side of history here and needs to figure out a different open-source strategy”, Altman said last week in an “Ask Me Anything” session on internet forum Reddit. The US start-up has been taking a closed-source approach, keeping information such as the specific training methods and energy costs of its models tightly guarded.