DeepSeek boosts AI model with 10-fold context expansion as Zhipu AI unveils GLM-5
The upgrade will allow DeepSeek’s AI model to remember and process more information in a single conversation or task

Chinese artificial intelligence start-up DeepSeek has updated its flagship AI model, adding support for a large context window with more up-to-date knowledge and fuelling further anticipation over its next major release.
When asked, DeepSeek’s namesake chatbot confirmed in multiple responses that from Wednesday it had expanded its context window from 128,000 tokens to over 1 million – a nearly tenfold increase expected to improve AI systems’ handling of human queries.
A larger context window means an AI model can “remember” and process more information in a single conversation or task, allowing it to do more complex reasoning or work better with data and code.
The upgrade comes as another Chinese AI heavyweight, Zhipu AI, unveiled its next flagship model on the same day, a move that is set to further ignite competition in the AI race.
Zhipu AI’s GLM-5 comes with improved coding and agentic capabilities, thanks to a twofold increase in its parameter count and the adoption of DeepSeek Sparse Attention, a technique DeepSeek developed to strike a balance between model performance and efficiency.
