
China’s AI Race Heats Up: Alibaba to Spend $53 Billion on AI Infrastructure

Alibaba's $53 billion commitment is the largest investment ever by a Chinese private enterprise in cloud and AI hardware infrastructure

Chinese tech giant Alibaba on Monday announced that it will invest more than $53 billion in AI (artificial intelligence) infrastructure, such as data centers, over the next three years, according to a Bloomberg report. The e-commerce company, co-founded by Jack Ma, plans to ramp up spending on its AI and cloud computing network.


In its official blog, the company said it envisions becoming a key partner to businesses developing and applying AI in the real world as models evolve and require ever-greater amounts of computing power.

Citigroup analyst Alicia Yap said the $53 billion pledge sets a record as the largest investment ever by a Chinese private enterprise in cloud and AI hardware infrastructure.

Alibaba has been witnessing a rapid increase in demand for AI-powered solutions. Group CEO Eddie Wu revealed that since the Chinese New Year, nearly 70% of new demand has been for inference workloads, a key component of AI model deployment. The company's stock price has also risen more than 68% this year.

The announcement comes against the backdrop of tech giants such as Meta and Amazon pledging billions toward the data centers needed to train, develop and host AI services.


China’s Cheapest AI Model

Chinese AI start-up DeepSeek created a buzz in the tech world when it launched the large language models DeepSeek-V3 and DeepSeek-R1. Its chatbot app overtook ChatGPT as the most downloaded app on Apple's US App Store.

The most interesting part about DeepSeek is that it built its AI model at a fraction of the cost incurred by Meta and other companies. The start-up claims it created its low-cost model in just two months for less than $6 million, in stark contrast to the $100 million that OpenAI reportedly spent to train its GPT-4 model.

DeepSeek used older, cheaper Nvidia H800 GPUs, whereas AI companies in the US generally rely on the more powerful and expensive Nvidia H100. One reason for this is that U.S. export restrictions prevented Chinese companies like DeepSeek from accessing Nvidia's best AI chips. As a result, DeepSeek's engineers had to create smarter, more energy-efficient algorithms to make up for the lower computing power available to them.


However, the start-up has also sparked data security and privacy concerns, with some countries banning its use either completely or partially.
