New AI battle adopts old price war strategy as Chinese tech giants keep start-ups at bay behind the Great Firewall
“It’s like the bad money is driving out good money,” said You Yang, a computer science professor at the National University of Singapore (NUS).
According to You, the price war signals that these models are not competitive enough on their own merits to attract customers at previous market prices.
Under this pricing, 1 yuan could buy 1.25 million input tokens. By comparison, it would cost around US$37.50, or 272 yuan, to buy 1.25 million GPT-4 tokens.
In AI, a token is a fundamental unit of data that is processed by algorithms. For Chinese LLMs, a token typically equates to between 1 and 1.8 Chinese characters.
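For illustration only, the back-of-the-envelope sketch below shows how those figures line up, assuming a GPT-4 input price of about US$30 per million tokens and an exchange rate of roughly 7.25 yuan to the US dollar (neither figure is stated in this article).

```python
# Back-of-the-envelope check of the token prices cited above (illustrative only).
# Assumptions not stated in the article: a GPT-4 input price of about US$30 per
# million tokens and an exchange rate of roughly 7.25 yuan per US dollar.

TOKENS = 1_250_000                 # the 1.25 million input tokens used in the comparison

CNY_PER_MILLION_CHINESE = 0.8      # 1 yuan buys 1.25 million tokens => 0.8 yuan per million
USD_PER_MILLION_GPT4 = 30.0        # assumed GPT-4 input price, US$ per million tokens
CNY_PER_USD = 7.25                 # assumed exchange rate

chinese_cost_cny = CNY_PER_MILLION_CHINESE * TOKENS / 1_000_000
gpt4_cost_usd = USD_PER_MILLION_GPT4 * TOKENS / 1_000_000
gpt4_cost_cny = gpt4_cost_usd * CNY_PER_USD

print(f"Chinese pricing: {chinese_cost_cny:.2f} yuan for 1.25 million tokens")
print(f"GPT-4 pricing:   {gpt4_cost_cny:.0f} yuan (about US${gpt4_cost_usd:.2f})")
print(f"Price gap:       roughly {gpt4_cost_cny / chinese_cost_cny:.0f} times")
```

On those assumptions the calculation reproduces the figures above: about 1 yuan versus roughly 272 yuan (US$37.50) for the same 1.25 million input tokens, a gap of more than 270 times.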
Other Chinese tech heavyweights were quick to respond.
Several companies including Baidu, Tencent and iFlytek, an AI specialist known for its audio recognition technology, followed with even more drastic price cuts, some offering access to their less powerful LLMs for free.
Wang Sheng, an investor with Beijing-based InnoAngel Fund, said this kind of “vicious” price competition is hurting local AI start-ups.
“When it comes to developing LLMs, Big Tech companies are not necessarily better than start-ups,” Wang said. “But their practice of subsidisation to grab a bigger market share is detrimental to these firms.”
Alain Le Couedic, senior partner at AI investment house Artificial Intelligence Quartermaster (AIQ), suggested that the price competition will pay off for some over time.
“The race for dominance in the market is a sign that many players see attractive opportunities down the road, even if this requires some pain in the short- to mid-term,” he said.
LLMs are energy intensive and expensive to run, so the marginal cost of serving an additional user is likely higher than for other online services. That makes blitzscaling more complicated for AI services, although a race to make LLMs more efficient could eventually change the calculus.
“People are pursuing these improvements in efficiency … because it pays to do so,” Bill MacCartney, chief technology officer at the venture capital firm SignalFire and a computer science professor at Stanford University, told the Post at the UBS Asian Investment Conference this week. “It costs a growing amount of money to operate these models, and people have a powerful economic incentive to find ways to make it cheaper.”
MacCartney noted that resources are pouring into finding efficiency gains across multiple levels: “There’s improvements at the silicon level, there’s improvements in model architecture, there’s improvements in that kind of software that we layer above the models.”
“It’s not far-fetched to imagine that we’ll see over a three-year horizon … maybe a 10x improvement in inference efficiency broadly across AI,” he said.
Robin Li Yanhong, founder and CEO of Baidu, said in April that the training efficiency of its flagship Ernie LLM had improved 5.1 times within a year. The model's inference performance increased 105 times, reducing inference costs by 99 per cent; a 105-fold speed-up means each query costs roughly 1 per cent of what it once did, consistent with that figure.
ByteDance also said it cut prices because it is confident in being able to reduce its costs through technical improvements.
Wherever profits stand, tech companies have been quick to attribute rising revenues to the AI boom.
Alibaba Cloud attributed part of its 3 per cent revenue growth in the March quarter to accelerating AI-related income. Baidu Cloud reported 12 per cent revenue growth in the same quarter, with generative AI and foundation model services accounting for 6.9 per cent of total AI cloud revenue.
US tech giants Google and Microsoft have likewise reported robust demand for their cloud-based LLM services. In its third-quarter results, revenue at Microsoft's intelligent cloud unit grew 21 per cent from a year earlier, while Google's cloud unit saw first-quarter revenue expand 28 per cent year on year.
“We have witnessed Microsoft’s market capitalisation soar, and everyone is eager to capitalise on this opportunity,” said Ivan Lam, an analyst at market consultancy Counterpoint Research. “The Chinese market is particularly keen to establish quick links between [AI] applications and business [uses] in order to foster further advancements in its LLMs.”
Alibaba declined to provide data on the growth in usage of its LLMs after its recent price cut, but the company said that the number of calls to its Qwen application programming interface (API) from a top Chinese recruiting firm spiked 100-fold within a week.
For LLM customers such as that recruiting firm, at least, the current economics of AI services appear to be paying off.
Zhao Chong, founder and CEO of AI-powered graphic design service iSheji, was upbeat about cheaper LLMs in an interview with the Chinese news portal Sohu.com published on May 24.
“For start-ups like us building applications, [the price war] is a good thing,” Zhao told Sohu. “LLM costs used to account for between 5 and 10 per cent of our total costs, now it could be 1 per cent, boosting our profit margins.”
These consumer-facing service providers have in turn been reducing the prices of their own offerings, with many available for free.
Meanwhile, start-ups that are in a position to do so are trying to sit out the price war. Beijing-based Baichuan and 01.AI – a company established by Lee Kai-fu, a Taiwanese computer scientist who previously headed Google China – have dismissed the idea of cutting prices.
You, the NUS computer scientist, noted the benefits of low prices for app developers, but he cautioned that these applications might not perform well if built on subpar foundation models.
“[Reaching] the upper limit of a technology is less certain and requires more exploration,” You said in a fireside chat published by Chinese technology news site GeekPark on May 23. “Whereas there is always a way to cut prices.”
Counterpoint’s Lam said that price competition may be inevitable for companies looking to maintain dominance in AI services as the market becomes increasingly crowded.
Some of China’s tech giants would appear to have an edge when it comes to compute resources and money to burn. AIQ’s Le Couedic said it is too early to predict a potential winner of an AI price war, as the industry is still not mature. Business models and technical edge will both be key factors in determining the dominant players, he added.
“At the end of the day … companies with the best services and best technologies will win,” Le Couedic said.
Additional reporting by Matt Haldane.