Nvidia Set to Dominate AI Chip Market, Consuming 77% of Wafers by 2025
From petaflops of computing power to gigawatts of energy consumption, there are many ways to measure the AI industry. Wafers are a new addition. Morgan Stanley reports (via @Jukanlosreve) that Nvidia will consume 77 percent of the world’s wafers for AI processors in 2025, up from 51 percent in 2024, keeping it firmly at the forefront of the AI chip race.
Nvidia’s Unmatched Scale
Nvidia is simply a production-scale monster and has consistently innovated ahead of its rivals. The company is projected to consume 535,000 300-mm wafers in 2025 for its latest GPUs, including the B200, H100, H200, and B300, all built on TSMC’s advanced 4nm-class process nodes. These chips have large compute dies, from 814 mm² to 850 mm², which explains the large wafer requirements. The B200 GPU alone will consume 220,000 wafers and generate $5.84 billion in revenue.
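Those die sizes explain the wafer appetite: only a few dozen ~850 mm² dies fit on a single 300-mm wafer. A rough sanity check using the classic die-per-wafer approximation (a back-of-envelope formula, not Nvidia's or TSMC's actual yield math; it ignores scribe lines and defect yield):

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Estimate gross dies per wafer: wafer area divided by die area,
    minus a standard edge-loss correction term."""
    d = wafer_diameter_mm
    a = die_area_mm2
    dies = (math.pi * (d / 2) ** 2) / a - (math.pi * d) / math.sqrt(2 * a)
    return int(dies)

# At the article's quoted die sizes, one 300-mm wafer yields roughly:
for area in (814, 850):
    print(area, gross_dies_per_wafer(area))  # 814 -> 63 dies, 850 -> 60 dies
```

At roughly 60 gross dies per wafer before yield losses, hundreds of thousands of wafers per year is what it takes to ship GPUs in data-center volumes.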
Nvidia’s success in capturing a large share of TSMC’s logic and CoWoS (Chip-on-Wafer-on-Substrate) packaging capacity is critical to its continued dominance. This strategy safeguards Nvidia against future supply shortages as AI processor demand rises, while its rivals are left to fight over the remaining wafer supply.
Competitors Struggle to Keep Up
While Nvidia is thriving, the rest of the AI chip contenders are struggling. AMD’s AI wafer share is projected to decline from 9 percent in 2024 to 3 percent in 2025. Even with the launch of its MI300, MI325, and MI355 GPUs, AMD has very modest allocations of between 5,000 and 25,000 wafers per product. Intel’s Gaudi 3 processors (inherited from its Habana Labs acquisition) will not fare much better, with a paltry estimated market share of 1 percent.
Google and AWS are faring somewhat better. Google’s in-house TPUs and AWS’s Trainium chips are growing in volume, but not at a scale comparable to Nvidia’s GPUs. Google’s wafer share is projected to fall from 19 percent to 10 percent, and AWS’s from 10 percent to 7 percent. Tesla and Microsoft play minor roles in AI computing with their Dojo, FSD, and Maia processors.
The Broader AI Market Landscape
By 2025, the entire AI wafer market is projected to be worth $14.57 billion across a total of 688,000 wafers. That estimate may be conservative. TSMC, the world’s largest semiconductor foundry, grossed $64.93 billion in 2024, of which about 51 percent, or roughly $32 billion, came from its high-performance computing (HPC) segment. HPC covers many products, but AI GPUs and data center CPUs are its main revenue drivers.
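The article's own figures imply an average value per AI wafer, and show the B200 commanding an above-average rate. A quick back-of-envelope check, using only the numbers quoted above:

```python
# Values taken from the article's projections.
ai_wafer_market_usd = 14.57e9   # projected 2025 AI wafer market value
ai_wafers = 688_000             # projected 2025 AI wafer volume
b200_revenue_usd = 5.84e9       # projected B200 revenue
b200_wafers = 220_000           # projected B200 wafer consumption

avg_per_wafer = ai_wafer_market_usd / ai_wafers
b200_per_wafer = b200_revenue_usd / b200_wafers

print(f"Market average: ~${avg_per_wafer:,.0f} per wafer")   # ~ $21,177
print(f"B200:           ~${b200_per_wafer:,.0f} per wafer")  # ~ $26,545
```

Both figures refer to wafer-level revenue as framed in the article, not the retail price of the finished GPUs built from those wafers.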
Nvidia is the primary wafer consumer because it pairs a technology advantage with the fastest-growing AI markets. As more sectors worldwide converge on AI solutions, demand for servers will climb further, keeping Nvidia in a position of strength.
What This Means for the Future
Nvidia’s near-monopoly on wafer consumption makes it pivotal to AI’s future. But what does that mean for market competition and supply chain dynamics? Even if a foundry is capable of manufacturing AI chips, competitors may struggle to secure production if wafer supply is locked up by Nvidia.
However, upcoming AI technologies may shift the power dynamics, with new players, applications, and chip designs challenging Nvidia’s throne. For now, Nvidia remains the king of the AI chip market, as its wafer consumption makes clear.