
The world’s largest and, one might argue, most important chip foundry is telling Wall Street that its AI-related sales are running ahead of schedule. But the company is not saying specifically by how much, or whether the long-range picture out through 2029 has changed all that much.
As Europe and the United States were fast asleep last night, the chief brass at Taiwan Semiconductor Manufacturing Co were revealing how the top and bottom lines ended up for the third quarter ended in September. And as expected, the numbers were good, thanks in large part to the GenAI boom, which is driving up the transistor count in XPU compute engines as well as in switch and now router ASICs as AI clusters bust out of datacenter halls into entire datacenter regions. Model builders are chasing superintelligence (creating models that are better than humans at everything cognitive), and enterprises are trying to extend their applications with less impressive large language models whose inference workloads, in the aggregate, will nonetheless represent the bulk of AI computing at some point in the not too distant future.
Well, that’s the thinking from both people and AI models, all of which – we hesitate to say all of whom, because models are not yet people – are just taking their best guesses. The only way to really predict the future is to live it, and we sure are doing that, eh?
Anyway, the second largest profit pool in the world so far in the GenAI boom is at TSMC, which is etching Nvidia GPUs and all kinds of other compute and network engines that are used in AI clusters. Nvidia has the biggest profit pool in the GenAI market, and it has roughly 6X the revenues and profits of TSMC from AI training and inference based on our rough math on the back of an envelope. (That envelope, which is really an Excel spreadsheet on our desktop called Calc.xlsx, is crammed with all kinds of estimates. . . . )
But don’t feel bad for TSMC – it is doing just fine and it has the financial wherewithal to stay ahead of rival Intel indefinitely, barring a takeover by the US Departments of Defense and Treasury or a coalition of the willing (big banks and equity firms) that might be compelled by their patriotism and a call from President Trump some day hence to do some investing in indigenous chip manufacturing.
In the quarter ended in September, TSMC raked in $33.1 billion of revenues, up 40.8 percent year on year and up 10.1 percent sequentially. Net income grew faster than sales, which all companies love to see because that means their businesses are scaling well, and hit $15.1 billion, up 50.2 percent year on year and up 18 percent sequentially from the second quarter.
The TSMC business is growing in two ways. First, the content of the wafers is getting richer (more features on smaller transistors) and therefore TSMC can charge more money for an etched wafer.
And second, TSMC is finally kicking out more silicon platters after wafer starts collapsed in the fall of 2022 as the smartphone, PC, and traditional server markets all went into recession in the wake of excess spending on these items during the coronavirus pandemic. (We all updated everything.) In 12-inch wafer equivalents, TSMC broke through 4 million wafers in a quarter for the first time in its history – 4,085,000 good wafers, to be precise. Revenue per wafer was $8,102, which is 59.2 percent higher than it was back in September 2022.
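For those who like to check the envelope, the per-wafer math is just the two figures above divided against each other. Here is a minimal sketch in Python; the September 2022 figure is backed out from the stated 59.2 percent uplift rather than taken from TSMC’s own disclosures.

```python
# Back-of-the-envelope check on TSMC's Q3 2025 revenue per wafer,
# using only the figures cited above.
q3_2025_revenue = 33.1e9      # dollars
q3_2025_wafers = 4_085_000    # 12-inch equivalent wafers shipped in the quarter

rev_per_wafer = q3_2025_revenue / q3_2025_wafers
print(f"Q3 2025 revenue per wafer: ${rev_per_wafer:,.0f}")  # ~ $8,103, matching $8,102 to within rounding

# The stated 59.2 percent uplift implies a September 2022 figure of roughly:
implied_q3_2022 = rev_per_wafer / 1.592
print(f"Implied Q3 2022 revenue per wafer: ${implied_q3_2022:,.0f}")  # ~ $5,090
```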
Many chips that TSMC etches are based on older processes, which are mature and therefore have higher yields and higher profits per transistor (but not necessarily per wafer – don’t conflate the two). Chips made with 10 nanometer and larger transistor geometries accounted for $8.61 billion in the third quarter, and revenues for this legacy manufacturing rose by 18.1 percent year on year.
Sales of 7 nanometer chips actually grew slower, accounting for $4.63 billion in sales (up 16 percent). Revenues for 5 nanometer chips rose by 62.8 percent to $12.25 billion and represented 37 percent of all chip revenues. Newer 3 nanometer processes rose by 61.9 percent to $7.61 billion and comprised 23 percent of sales.
TSMC only hints at how much of its revenue comes from AI inference and training chips, except in June 2023, when it gave a hard figure, saying that such chips drove 6 percent of revenues. (Thanks for that.) The company does have an “HPC” category, which means fat chips used in PCs, servers, and other datacenter gear, in contrast to smartphone chips, which TSMC counts separately.
In the third quarter of 2025, this HPC category – and again, this is not high performance computing as we talk about it here at The Next Platform – drove $18.87 billion in sales, up 57.4 percent for the year and up 4.6 percent sequentially. Based on our model, AI inference and training chips accounted for 53.9 percent of HPC segment sales for TSMC in Q3 2025 and rose by a factor of 2.7X year on year to $10.16 billion.
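Here is the arithmetic behind that estimate, sketched in Python; the 53.9 percent AI share of the HPC bucket comes from our model, not from anything TSMC discloses.

```python
# Our estimated split of TSMC's Q3 2025 HPC segment revenue.
# The 53.9 percent AI share is our model's assumption, not TSMC-disclosed.
hpc_revenue_q3_2025 = 18.87e9   # dollars, from TSMC's segment reporting
ai_share = 0.539                # our estimate

ai_revenue_q3_2025 = hpc_revenue_q3_2025 * ai_share
print(f"Estimated AI chip revenue: ${ai_revenue_q3_2025 / 1e9:.2f} billion")  # ~ $10.17 billion

# Implied year-ago base, given the 2.7X growth factor in our model:
ai_revenue_q3_2024 = ai_revenue_q3_2025 / 2.7
print(f"Implied Q3 2024 AI chip revenue: ${ai_revenue_q3_2024 / 1e9:.2f} billion")  # ~ $3.77 billion
```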
CC Wei, TSMC’s chief executive officer, did not update the forecast that the company unveiled back in June. As we previously reported and as Wei reminded Wall Street today, TSMC forecast that revenues driven by etching and packaging AI accelerators would grow at a compound annual growth rate in the mid-40 percent range between 2024 and 2029 inclusive.
“The AI demand actually continues to be very strong,” Wei explained on the call. “It’s stronger than we thought three months ago, okay? In today’s situation, we have talked to customers and then we talk to the customers’ customers. So the CAGR we previously announced is about mid-40s. It’s a little bit better than that. We will update you probably in beginning of next year so we have a more clear picture.”
Back in July, when the June quarter numbers came out, we estimated that the company’s AI-related chip sales were $9.1 billion in 2024, and at a CAGR of 45 percent, that puts them at $85.5 billion in 2029, a factor of 9.4X growth. Assuming that TSMC’s overall revenues grow 25 percent in 2025 and then a more modest 20 percent a year after that, you get 2.6X growth over those six years to $233.5 billion, with AI chips at around 36.6 percent of the total.
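If you want to follow along with our spreadsheet, here is the same projection as a Python sketch. The roughly $90 billion 2024 revenue base and the treatment of 2024 through 2029 as six compounding steps for the AI CAGR are our assumptions; they reproduce the figures above give or take some rounding.

```python
# A sketch of the projection math above. Treating 2024 through 2029 as six
# compounding steps for the AI CAGR, and starting from a ~$90 billion 2024
# revenue base, are our assumptions; they reproduce the cited figures to
# within rounding.
ai_2024 = 9.1e9     # our estimate of 2024 AI-related chip sales
ai_2029 = 85.5e9    # projected 2029 AI-related chip sales at a mid-40s CAGR

print(f"AI growth multiple: {ai_2029 / ai_2024:.1f}X")                            # ~ 9.4X
print(f"Implied CAGR over six steps: {(ai_2029 / ai_2024) ** (1 / 6) - 1:.1%}")   # ~ 45.3%

# Overall revenue: 25 percent growth in 2025, then 20 percent a year through 2029.
total_2024 = 90.0e9
total_2029 = total_2024 * 1.25 * 1.20 ** 4
print(f"Projected 2029 revenue: ${total_2029 / 1e9:.1f} billion")  # ~ $233.3 billion
print(f"Overall growth multiple: {total_2029 / total_2024:.1f}X")  # ~ 2.6X
print(f"AI share of 2029 revenue: {ai_2029 / total_2029:.1%}")     # ~ 36.7%
```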
A lot of things can happen between now and 2029, which is why Wei & Co are not changing their forecast. But we think that if TSMC can pull off another 15 percent sequential growth in Q4 2025, then it will have sold $36.63 billion in AI gear for all of 2025, which is significantly higher than the $30.3 billion we were projecting only three months ago.
It would be nice to get some hard figures and to get that new forecast in January as TSMC closes out the books for 2025. Our TSMC model shows AI chip revenue growth rates slowing over time, but Nvidia co-founder and chief executive officer Jensen Huang thinks Nvidia can hold its datacenter growth at 50 percent between now and 2030. The Nvidia and TSMC forecasts do not mesh, which is why we think eventually Nvidia will get into other parts of the datacenter stack and maybe even build datacenters. Instead of buying Arm, maybe it will buy Supermicro and Vast Data.
In the meantime, TSMC will stick to its etching. And that means focusing on advancing the state of the art in chip making and praying that China does not invade and that this AI bubble holds up. And if the AI bubble bursts, guess what? People will still need to buy chips, and TSMC will still make a profit making them.