Huge Artificial Intelligence (AI) News for 3 Tech Companies This Week — None of Which Are Named Nvidia
The AI wars are only getting more intense.
When people think of artificial intelligence (AI), they gravitate to graphics processing unit (GPU) chip leader Nvidia (NVDA -1.99%), and rightly so. Nvidia currently has a huge lead in making the chips needed for today's most advanced AI applications.
But AI will offer growth opportunities to lots of tech companies whose stocks haven't run as far and as fast as Nvidia's.
On that note, there was big AI news this week for three tech giants, including two that aim to compete with Nvidia head-on.
Arm says it will begin making its own AI chips
Arm Holdings (ARM -3.43%) went public last September in a long-awaited initial public offering (IPO) that brought to market the licensor of one of the two dominant CPU instruction-set architectures, the other being x86.
Today, Arm is thriving, as use cases for its low-power chip architecture proliferate. Revenue surged 47% last quarter on the back of strong licensing and royalty-revenue growth. That growth is no surprise: Most cloud giants are gravitating toward the Arm architecture as they design their own low-cost central processing unit (CPU) chips proprietary to their cloud platforms. Nvidia's Grace CPU, which pairs with its GPUs in the Grace Hopper and Grace Blackwell superchips, is Arm-based, and Alphabet (GOOG 1.06%) (GOOGL 1.08%) just introduced its own Arm-based data-center CPU called Axion.
However, Arm isn't getting to the meat of the AI revolution, because it doesn't license designs for the GPU accelerators at the heart of AI processing.
But management is looking to change that. According to a recent report from Nikkei Asia, Arm now plans to design its own AI accelerators, which it reportedly aims to introduce in 2025. Arm will invest heavily in initial development costs, which it should be able to bear with about $3 billion in cash on the balance sheet and a highly cash-generative core licensing business. The report says Arm will outsource production to a third-party foundry, perhaps Taiwan Semiconductor Manufacturing.
So, Nvidia will have some increased competition, which may be why Nvidia attempted to buy Arm years ago.
Still, this may be a case of Arm being a bit late to the game in a crowded field. Virtually every cloud giant is now designing its own accelerators. And while Nvidia has a strong lead in "neutral" AI GPUs (those sold on the open market rather than tied to one cloud), Advanced Micro Devices and Intel are each producing better and better competitors of their own.
Arm is 90% owned by SoftBank (SFTB.Y 0.70%), led by the ambitious tech executive Masayoshi Son. SoftBank overextended itself at the tail end of the tech-software bubble in 2021, so Arm investors must hope this isn't another example of "me too" thinking and a too-late entry into a red-hot tech sector. They'd be right to have mixed feelings about this week's news.
Google introduces new Trillium chips
Speaking of Google parent Alphabet, the search and cloud giant held its I/O developer conference earlier this week. Unsurprisingly, there were numerous announcements related to AI, including the introduction of the aforementioned Axion CPU. But perhaps most notable on the AI chipmaking front was the announcement of the newest version of Google's tensor processing unit (TPU) accelerator.
Google has been developing its own TPUs since 2013 but offers them only through its Google Cloud platform. Of course, Google also has a deep stock of Nvidia chips that customers can use, but every cloud giant is vying to offer the best low-cost alternative proprietary to its own platform.
That's why Google's announcement of Trillium, its sixth-generation TPU, was important this week. The new chip boasts 4.7 times the peak compute performance of the prior fifth-generation chip while being 67% more energy efficient. It also sports a new third-generation SparseCore accelerator, along with double the high-bandwidth memory capacity and double the interchip interconnect bandwidth, which will allow the new TPUs to process larger models. Alphabet also announced the ability to string 256 Trillium chips together into a single pod, forming a Google-powered supercomputer.
The new hardware will be behind a slew of new AI services Google introduced at the conference, including an update to its Gemini 1.5 Pro model, the large language model (LLM) that competes with OpenAI's ChatGPT and others. Google also introduced new specialized models for specific use cases, including Veo for high-definition video generation, Imagen 3 for text-to-image generation, and Music AI Sandbox, a music- and sound-generation tool.
Finally, Google Search is also getting an AI-powered upgrade launching this coming Monday. Complex queries will now display AI-generated summaries and multistep plans for relevant searches. The new summaries are an effort by Google to avoid being disrupted by aspiring search alternatives such as ChatGPT-powered Bing or start-ups like Perplexity.
And recent results suggest Google's AI capabilities may be gaining traction, with its cloud platform reaccelerating to 28.4% revenue growth last quarter and Search posting impressive 14.4% growth in its own right, even up against aspiring disruptors.
Oracle lands a big fish for its cloud
When it comes to cloud infrastructure, many think of the big-three cloud platforms. But investors may not want to ignore database giant Oracle (ORCL 1.10%), which has been working to fashion itself as a fourth cloud alternative. Oracle's infrastructure revenue amounted to only $1.8 billion last quarter, still far behind third-place Google at $9.6 billion. However, Oracle's infrastructure-as-a-service (IaaS) revenue also grew 49%, the highest rate among the cloud giants.
While it's easier to grow off a small base, those aspirations got a big boost earlier this week. According to The Information, Elon Musk's AI start-up xAI has been discussing a deal to rent Oracle's cloud servers for AI processing, worth up to $10 billion over "a number of years." Given that Oracle's cloud business is currently on a $7.2 billion annualized run rate, that would be a big deal, making xAI one of Oracle's largest cloud customers.
One may wonder how a mere start-up can afford such largesse, but Reuters also reports that Musk's cachet has led xAI into fundraising talks for up to $3 billion at an $18 billion valuation. xAI, whose chatbot Grok aims to take on Google's Gemini and OpenAI's ChatGPT, has a unique angle: access to data gleaned from X, formerly known as Twitter, as well as visual data from Tesla vehicles.
Elon Musk has built successful businesses like Tesla and SpaceX from scratch before, although his takeover of Twitter has been far more dubious. Still, if xAI turns into a big-time LLM player, it appears Oracle will reap a lot of the benefits.