AI

By Decade’s End, AI Will Drive More Than Half Of All Chip Sales


As the year came to an end, we tore apart IDC's assessments of server spending, including the huge jump in accelerated supercomputers for running GenAI and more traditional machine learning workloads, and as this year got started, we did a forensic analysis and modeling based on the company's reckoning of Ethernet switching and routing revenues.

Today, we are going to dice and slice and augment Gartner’s annual worldwide semiconductor sales breakdown, paying particular attention to AI XPU, HBM memory, and networking revenues as they embiggen at an embiggening rate and comprise more and more of the chippery sold or commissioned to be manufactured every year.

The most important thing, perhaps, in the Gartner report on worldwide chip sales is the statement by Rajeev Rajput, senior principal analyst at the market researcher, that XPU and GPU processors, HBM stacked memory, and networking chips sold or created to be put into AI systems accounted for nearly a third of total chip sales in 2025. The overall market came to $793.4 billion in sales, up 21 percent year on year, and if you take a stab at where "nearly a third" lands – we picked 31.5 percent just to put a number on it to start making projections – then this works out to about $250 billion for those three kinds of chips (XPUs, HBM, switch ASICs).
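For those who want to check the math, here is the back-of-the-envelope calculation as a quick Python sketch. Remember that the 31.5 percent share is our guess to put a number on "nearly a third," not a Gartner figure:

```python
# Back-of-envelope check on the "nearly a third" hint from Gartner.
# The 31.5 percent share is our guess, not a Gartner number.
total_2025 = 793.4        # worldwide chip sales in 2025, in billions of dollars
ai_share_guess = 0.315    # our stand-in for "nearly a third"

ai_chips_2025 = total_2025 * ai_share_guess
print(f"Implied XPU + HBM + networking sales for AI in 2025: ${ai_chips_2025:.1f} billion")
# Prints about $249.9 billion, which we round to roughly $250 billion
```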

Playing its usual game of cat and mouse with the data, Gartner said further in its report, which you can see here, that HBM revenues exceeded $30 billion (accounting for 23 percent of overall DRAM sales worldwide in 2025), that AI "processors" exceeded $200 billion in sales in 2025, and that AI semiconductors in general would represent more than 50 percent of overall chip revenues by 2029.

Let's chew on all of that data and see how we can model it up as we look at chip revenues by vendor over the past baker's dozen years. We took the top ten vendor data supplied by Gartner and then did our best to reckon what STMicroelectronics and Texas Instruments did. Both of those companies were knocked out of the rankings by the rocketship rise of Nvidia and the growth of Apple (which designs a lot of its own chippery for the devices it sells and has it manufactured by others) and MediaTek.

Here is the monster revenue table for chip sales from 2013 through 2025 based on Gartner data:

As usual, numbers in bold red italics are our estimates to fill in gaps in the data.

AI system juggernaut Nvidia did not even enter the top ten rankings of chip sellers until 2019, and in five short years it blasted past a collapsing Intel and the other biggies in the market – Micron Technology, Broadcom, Qualcomm, SK Hynix, and Samsung – to be the unquestioned revenue leader in chip sales in 2024. And in 2025, the GenAI boom helped Nvidia grow its revenues by an amazing 63.9 percent to $125.7 billion, representing a 15.8 percent share of worldwide chip sales. This is around the peak revenue share that Intel had back in the mid-2010s, when AI was a much smaller opportunity, when CPUs still mattered a lot, and when AMD had not really gotten Intel backed into a corner and Intel’s foundries were only a little behind Taiwan Semiconductor Manufacturing Co.

Here is the same data graphically:

You can see the ups and downs of the memory and flash markets in the revenue streams for Samsung, SK Hynix, and Micron and the steady growth of Broadcom and AMD, which are obviously big players in the datacenter as well as in other kinds of devices.

The funny thing about that chart above is that it shows that Nvidia's rise is so much more dramatic than Intel's fall. And the latter company still does not have an AI play in the datacenter, while the former is dabbling in personal AI devices on several fronts. Broadcom and AMD are on slow, steady rises, benefitting from AI in their own ways. Being HBM memory suppliers for Nvidia, AMD, and the homegrown XPUs coming out of the hyperscalers and cloud builders has helped stabilize the big three memory and flash makers. We would love to see how Kioxia and Western Digital are doing (Solidigm is part of the SK Hynix numbers) to get a gauge on flash storage.

As we have pointed out in the past, Gartner does not provide granularity below the top ten chip makers, but clearly with all of the major hyperscalers and cloud builders creating their own CPUs and XPUs and paying for them to be made, there is a “revenue” stream that we cannot see from Google, Microsoft, Meta Platforms, Baidu, Alibaba, Tencent, and others that already represents close to half of CPU “revenues” and could do the same for AI XPUs by the end of the decade.

We presume these chips are accounted for in the Others segment of the Gartner report.

Two more things. First, it may feel like Nvidia is bending the curve for chip sales all by itself here in the GenAI era, but that is not true, as you can see from the chart below:

If you take Nvidia out of the worldwide chip revenues, the market is still growing smartly. To be fair, sales of HBM memory, regular DRAM, high-end CPUs, and chips for interconnect fabrics and scale-out networks are bending that curve up for sure. But taking Nvidia out is not really the point; the point is to take AI out so you can look at it separately and to see the rest of the chip market distinct from it.

Based on the hints that Gartner gave, which we talked about at the top of this story, we have a sense of AI versus non-AI chip sales.

We think AI chip sales are starting to bump up against the law of large numbers and that the growth rate of overall chippery sales is going to slow in the coming years – from 21 percent in 2025 to 8 percent in 2029, to be precise. If you plot that out, you get $1.2 trillion in chip sales in 2029, which is nearly double what we had worldwide in 2024. We know that AI chips accounted for less than a third of overall chip revenues in 2025 and will account for more than half in 2029, so we have two endpoints. If you draw a more or less straight line between those two endpoints, you get a reverse Zorro graph for AI versus non-AI chip sales that looks like this:

Remember, these are estimates by The Next Platform and this is most definitely not a forecast by Gartner. We think it is absolutely reasonable that everything outside of AI goes into a kind of maintenance mode between now and the end of the decade.
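To make the interpolation concrete, here is a minimal Python sketch of the kind of model we built. The 2025 base, the endpoint growth rates, and the two AI share hints come from Gartner; the intermediate-year growth rates and the shape of the AI share ramp are our assumptions, chosen to hit those endpoints:

```python
# Minimal sketch of the interpolation behind our AI versus non-AI chip projection.
# The 2025 base and the endpoints come from the Gartner hints above; the
# intermediate-year growth rates and AI shares are our guesses, not Gartner data.

total_2025 = 793.4  # worldwide chip sales in 2025, in billions of dollars

# Assumed glide path from 21 percent growth in 2025 down to 8 percent in 2029
growth = {2026: 0.14, 2027: 0.12, 2028: 0.10, 2029: 0.08}

# Assumed more or less straight-line ramp from "nearly a third" to "more than half"
ai_share = {2025: 0.315, 2026: 0.364, 2027: 0.413, 2028: 0.461, 2029: 0.510}

totals = {2025: total_2025}
for year, rate in growth.items():
    totals[year] = totals[year - 1] * (1 + rate)

for year in sorted(totals):
    ai = totals[year] * ai_share[year]
    print(f"{year}: total ${totals[year]:,.1f} B, AI ${ai:,.1f} B, non-AI ${totals[year] - ai:,.1f} B")

# With these assumptions, 2029 lands at roughly $1.2 trillion in total chip sales,
# with AI chips accounting for just over half of that pile.
```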

Take these for what you will.

Now, let's pick apart the AI portion of the pile of chips. We think that AI accelerators will still dominate the revenue streams, but that HBM memory and networking chips will represent a slightly larger part of the AI pie over time. Bandwidth and memory capacity drive performance more than raw compute in these distributed AI systems, so this stands to reason. But we also know that HPC and now AI shops do not want to pay a lot for storage and networking – a fact that has not changed in six decades.

In our model, AI XPU sales (including GPUs) grow by a factor of 2.3X between 2025 and 2029 (inclusive) to $465 billion, but HBM memory sales for AI systems grow by 4.1X to $124 billion. Networking grows by only 1.8X to $31 billion. We might be a little light on networking spend and a little heavy on HBM spend, but we are trying to compensate for what we think is a massive capacity and bandwidth imbalance between XPUs and their HBM stacks. The question becomes whether HBM memory makers can get better yields fast enough that their stacked memory can be generously added to XPUs. If they can, the cost of inference and training should come down.
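To sanity check those multipliers, here is a little Python that backs out the implied 2025 bases and compound annual growth rates from our 2029 model numbers. The 2029 dollar figures are our estimates, not Gartner data:

```python
# Work backwards from our 2029 model numbers and the stated growth multipliers
# to the implied 2025 bases and compound annual growth rates (CAGRs).
# All 2029 figures are our estimates, not Gartner data.

segments = {
    # segment: (2029 sales in billions of dollars, growth factor 2025 -> 2029)
    "AI XPUs (including GPUs)": (465.0, 2.3),
    "HBM memory for AI":        (124.0, 4.1),
    "AI networking chips":      (31.0,  1.8),
}

years = 4  # 2025 through 2029
for name, (sales_2029, factor) in segments.items():
    base_2025 = sales_2029 / factor
    cagr = factor ** (1 / years) - 1
    print(f"{name}: ${base_2025:.0f} B in 2025, ${sales_2029:.0f} B in 2029, {cagr:.1%} CAGR")
```

Note how the implied 2025 bases – about $202 billion for AI XPUs and about $30 billion for HBM – land right on top of the Gartner hints we started with, which is a sign the model at least hangs together.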

We are not saying this is inevitable. As usual, the only way to predict the future is to live it, but having predicted it, you can see if you are on the roadmap or not. We are not afraid to stick our neck out there to make a financial roadmap and see how well reality fits the curves.