btw, watching structures built to collapse, and in serious numbers
scmp.com
China’s Moonshot claims to build models with fewer high-end AI chips than US rivals use
In a Reddit forum, a Moonshot AI executive says start-up is ‘outnumbered’ by US rivals in terms of ‘high-end GPUs’ used for training models

Vincent Chow
Published: 8:00pm, 11 Nov 2025 | Updated: 8:07pm, 11 Nov 2025
Chinese artificial intelligence firm Moonshot AI continues to develop AI models with fewer high-end graphics processing units (GPUs) than its US rivals use, according to executives at the Beijing-based start-up.
In a three-hour-long “ask me anything” session on Reddit on Monday evening, a Moonshot AI representative with the handle “ppwwyyxx” – the same moniker used by co-founder Wu Yuxin on X – said the company was “outnumbered” by rival US firms in terms of “high-end GPUs” used for AI model development.
He also confirmed that Kimi K2 Thinking, a new reasoning variant of its open-source Kimi K2 model, was trained on Nvidia’s older H800 GPUs, which were banned for export to China in late 2023.
That reflected how Chinese AI companies have been making the most of available resources on the mainland to create cutting-edge models, despite stringent US tech export restrictions.
With the release of Kimi K2 Thinking last week, Moonshot AI – a unicorn valued at US$3.3 billion and backed by Chinese tech giants Alibaba Group Holding and Tencent Holdings – ignited fresh debate about another “DeepSeek moment” in the global AI industry. It also raised questions about recent efforts by OpenAI and its CEO Sam Altman to secure more than US$1.4 trillion in infrastructure deals with the likes of Nvidia, Broadcom and Oracle.
Moonshot AI founder Yang Zhilin. Photo: Future Publishing via Getty Images
In the Reddit discussion, another Moonshot AI representative identified as “ComfortableAsk4494” – the online handle of founder Yang Zhilin – made a direct reference to OpenAI’s massive data centre buildout when asked when the Chinese firm planned to release its next-generation foundational model, the K3. Yang replied: “Before Sam’s trillion-dollar data centre is built.”
Asked why OpenAI was “burning so much money”, Moonshot AI representative “zxytim” – the same handle used by co-founder Zhou Xinyu on X – said: “Only Sam knows, we’ve got our own way and our own pace.”
Those comments again cast light on the cost-efficient strategy behind the development of Kimi K2 Thinking, which was trained at a cost of just US$4.6 million, according to a CNBC report.
Moonshot AI’s Yang, however, dismissed the reported figure. “This is not an official number,” he wrote on the Reddit forum. “It is hard to quantify the training cost because a major part was research and experiments.”
Even without factoring in its costs, the latest model has impressed the AI community. Thomas Wolf, co-founder of open-source developer platform Hugging Face, posted on X that Kimi K2 Thinking was another case of an open-source model achieving industry-leading performance.
Last month, Li Zixuan – Zhipu AI’s head of global operations – questioned Facebook owner Meta Platforms’ AI talent poaching spree that reportedly involved US$100 million in signing bonuses.
“We don’t believe it should cost you a hundred million dollars to hire these people,” Li said on the Manifold podcast.
Both OpenAI and Meta have committed to huge investments in data centres and advanced GPUs, which they said were needed to train powerful AI systems that could far surpass human intelligence. Meanwhile, Chinese AI start-ups remain restricted from accessing those chips under US export controls.