Nvidia Seems Poised to Enter the Multibillion-Dollar Custom Artificial Intelligence (AI) Chip Market
Nvidia’s new custom chip unit will reportedly focus on custom chips for data center AI and other applications.
Nvidia (NVDA -0.90%) stock is on fire. Shares of the artificial intelligence (AI) chip leader have gained 81.5% so far in 2024, and are up 531% over the last three years through May 10. The S&P 500 index has returned 10% and 31.9%, respectively, over these periods.
The stock’s gains are mostly being driven by powerful demand for the company’s AI-enabling graphics processing units (GPUs) and related offerings for data centers. Nvidia has an estimated 80% share of the fast-growing market for data center AI chips.
But can Nvidia stock’s great performance continue? Some investors are concerned the company could be hurt by the increasing trend among large technology companies, nearly all of which are Nvidia customers, of developing their own custom AI chips. They are doing so to optimize the chips for their specific AI workloads, which can offer cost and power-efficiency advantages over general-purpose processors such as GPUs.
There’s some good news for Nvidia investors on this front.
Nvidia is reportedly forming a custom chip business unit
Nvidia is forming a new business unit focused on designing custom chips for cloud computing companies and others to use for data center AI and other applications, according to a Reuters exclusive published in February. The article named Dina McKinney, a former Advanced Micro Devices (AMD) and Marvell Technology executive, as the head of Nvidia’s new custom chip business. (Custom chips are also known as application-specific integrated circuits, or ASICs.)
The Reuters article cited “nine sources familiar with the company’s plans.” Moreover, it noted that McKinney’s LinkedIn title, which had reportedly identified her as the head of Nvidia’s custom chip business, was changed after Reuters contacted Nvidia for comment. That detail alone lends considerable credence to the reporting, in my view.
That said, I wanted to dig a bit before covering this topic. My perusal of Nvidia’s job openings supports the theory that it’s building a custom chip unit. Moreover, I found a LinkedIn post from McKinney in which she said she was excited to join Nvidia in her new role heading the custom silicon team.
What big tech companies have developed or are developing custom chips?
The three big tech companies that lead the cloud computing services market — Alphabet, Amazon, and Microsoft — now have or will soon have custom AI chips. The companies use (or plan to use) these chips for their internal workloads, and the chips are (or will soon be) available to customers of their cloud services.
Alphabet was the first mover in this space. In 2015, it began using its tensor processing units (TPUs) for internal purposes, and several years later it made them available to Google Cloud customers. In 2018, Amazon released the first version of its custom Graviton processor on Amazon Web Services (AWS), and it has since added AI-specific chips such as Inferentia and Trainium. And last November, Microsoft unveiled two custom-designed chips, Maia and Cobalt.
As you might expect, Facebook parent Meta Platforms has also designed its own custom AI chip, the Meta Training and Inference Accelerator (MTIA). Apple and Tesla have custom chips as well. Apple uses custom silicon across its consumer electronics products, while Tesla began using a custom-designed chip in its self-driving system several years ago.
What semiconductor companies are involved in the custom chip market?
Broadcom (AVGO 0.89%) is the biggest name in the custom AI chip business. There’s a great deal of confidentiality in this space, but it’s known that Broadcom’s custom AI chip customers include Alphabet and Meta Platforms. Moreover, at an investor event in March, Broadcom disclosed that it had gained a third big custom AI chip customer, but it did not name the company.
How big is the custom chip market?
Research firm 650 Group’s Alan Weckel estimated the data center custom chip market will be worth about $10 billion this year and surge to $25 billion in 2025, according to the Reuters exclusive. The article also cited a Needham analyst who covers the semiconductor space as pegging the broader custom chip market at approximately $30 billion in 2023.
How large is the custom chip market relative to Nvidia’s size?
On May 22, Nvidia is due to report its results for the fiscal first quarter, which ended in late April. Management has guided for quarterly revenue of $24 billion. If we use this number as an estimate and multiply it by four, Nvidia’s annual revenue run rate comes out to about $96 billion.
If we assume the data center custom chip market will be worth about $25 billion in 2025, as 650 Group estimates, and the broader custom chip market was worth roughly $30 billion last year, per the Needham analyst, it’s clear that the custom chip opportunity is substantial even for a company as massive as Nvidia.
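For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The dollar figures are simply the guidance and market estimates cited above, and the annualization uses the same rough multiply-by-four shortcut as the text; this is an illustration, not a forecast.

```python
# Back-of-the-envelope comparison using the figures cited above (in billions of dollars).
guided_quarterly_revenue = 24.0                  # Nvidia's fiscal Q1 revenue guidance
annual_run_rate = guided_quarterly_revenue * 4   # rough annualization: about $96 billion

data_center_custom_2025 = 25.0   # 650 Group estimate for the 2025 data center custom chip market
broader_custom_2023 = 30.0       # Needham analyst estimate for the broader custom chip market in 2023

print(f"Annual revenue run rate: ${annual_run_rate:.0f} billion")
print(f"2025 data center custom chip market vs. that run rate: "
      f"{data_center_custom_2025 / annual_run_rate:.0%}")
print(f"2023 broader custom chip market vs. that run rate: "
      f"{broader_custom_2023 / annual_run_rate:.0%}")
```

On those numbers, the 2025 data center custom chip opportunity alone equals roughly a quarter of Nvidia’s current revenue run rate, which helps explain why even a company of Nvidia’s size would pursue it.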
It makes good sense for Nvidia to leverage its expertise
It seems extremely likely that Nvidia will enter the custom chip market for AI and other applications. The company has expertise across GPUs, central processing units (CPUs), data processing units (DPUs), and high-performance networking technologies. It makes good sense for it to leverage its extensive intellectual property (IP) to capture some of the already large and fast-growing custom chip market.
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Beth McKenna has positions in Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Amazon, Apple, Meta Platforms, Microsoft, Nvidia, and Tesla. The Motley Fool recommends Broadcom and Marvell Technology and recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.