The Next Platform

Dell Says It Can Finally Make Some Big Money On GenAI

It is one thing for the market researchers of the world to make prognostications about hardware, software, and services spending relating to the GenAI boom. It is quite another thing when public companies start putting stakes in the ground because they face consequences if their auguries turn out to be wrong.

Dell Technologies, which we still call Dell just like we call what is technically Alphabet "Google" because vernacular matters, has been a bit tepid about predicting AI spending by its customers, but reasonably forthcoming about telling Wall Street how much dough it has been making selling AI iron. At its securities analyst meeting in New York City today, the company put some big stakes in the ground. But, as you might expect, Wall Street is still concerned about what kind of profits Dell can extract from the AI business it does with the likes of xAI and CoreWeave.

Before we get into where Dell is going over the long haul with AI infrastructure, let’s talk about where it has been. Arthur Lewis, who came to Dell in July 2007 through its Alienware gaming PC acquisition and who did some product management and sales jobs before being put in charge of the Infrastructure Solutions Group in October 2019, walked Wall Street through some numbers.

First off, you will note the AI server sales in dark blue, which grew by 6.1X from fiscal 2024’s $1.6 billion to fiscal 2025’s $9.8 billion, and mostly because Nvidia had enough GPU accelerators to start giving all of the major OEMs some allocations instead of just the hyperscalers and cloud builders who manage their own supply chains and who contract the manufacturing of their iron out to ODMs. And in fiscal 2026, which ends in late January 2026, Dell is projecting that it will more than double AI server sales to $20 billion.

So ISG is back to growing after a pretty anemic and bumpy time, excepting fiscal 2023. That server recession in F2020, F2021, and F2024 was due to the coronavirus pandemic and X86 server CPU product cycles. (You can cram a lot more cores into a server in 2025 than you could in 2018, and it has affected the revenue streams of all sellers of servers into the enterprise.) But you have to peel these numbers apart. If you do the math, sales of traditional, non-AI servers only grew by 4.6 percent to $33.8 billion in fiscal 2025 (from $32.8 billion in fiscal 2024), and based on the projections Lewis made, will only grow by 4.4 percent to $35.3 billion in fiscal 2026. This is roughly twice as much as global gross domestic product growth, and as you know from reading The Next Platform, we think 2X GDP is the normal growth rate for mature IT gear serving a mature subset of the market. In other words, 2X GDP is about as much growth as you can expect, and it ain't much but that's how it is. AI server sales are clearly booming.
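If you want to check our math on those growth rates, here is a quick back-of-the-envelope sketch. The revenue figures are the ones cited above (in billions of dollars); the fiscal 2026 traditional server number is Dell's projection, not a result.

```python
# Traditional server growth, fiscal 2025 to fiscal 2026 (Dell's projection).
f2025_traditional = 33.8   # $ billions, fiscal 2025
f2026_traditional = 35.3   # $ billions, projected for fiscal 2026

growth_pct = (f2026_traditional / f2025_traditional - 1) * 100
print(f"F2026 traditional server growth: {growth_pct:.1f} percent")

# AI server growth, fiscal 2024 to fiscal 2025, as a multiple.
f2024_ai, f2025_ai = 1.6, 9.8   # $ billions
print(f"F2025 AI server growth multiple: {f2025_ai / f2024_ai:.2f}X")
```

The first number lands at 4.4 percent, and the second at 6.12X, which rounds to the 6.1X we cited.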

The second thing you will notice – and this has to be intentional – is that Dell calculated the operating income rate for all of the years in the chart except for its fiscal 2026 projection, even though it gave the operating income amount and the revenue so we can calculate it. The answer, which we have shown in yellow, is 11.2 percent, which tells you that operating profits are under pressure in fiscal 2026. We think that is because some of the big deals that Dell has done with xAI and CoreWeave, among others, are dilutive to profits even as they add profit dollars to the company.

This is the nature of the HPC business, as Dell and Hewlett Packard Enterprise in its many components (HP, Convex, Compaq/DEC, SGI, and Cray) know full well. You either want to build machines, or you don’t. And if you do, you try to make it up in software and services.

Here is the chart that caught Wall Street’s attention at the analyst meeting, which was presented by Jeff Clarke, the company’s chief operating officer and vice chairman who has also taken over running its PC business:


The expectations for AI were absolutely enormous two years ago, as you can see, but now on some vectors they have doubled or tripled as model complexity has risen faster than mainstream adoption. The way we see it, this complexity and the compute, networking, and storage it drives is giving enterprises more time to get their AI acts together, which they will do in the long run and which will be a more profitable business for Dell because lower volumes and more hand-holding are how the OEMs have always made the numbers work.

The other bit of new data that company founder Michael Dell shared in his presentation, which fits into this thesis that we have always had about the mainstreaming of AI, is that enterprise customers are not going to be doing AI on the cloud once they can get their hands on their own iron because of data sovereignty, model sovereignty, security, and latency issues that come from doing this in the cloud. Take a look at this:

The first and last data points come from Dell's own Enterprise AI 2025 survey, and they may be shocking to a lot of people, but not to us because we know enterprises that are not hyperscalers and cloud builders and model builders. We know their conservative natures, and they will return to them in full force when it comes to agentic AI, which will be either just a layer wrapped around the ERP, SCM, and CRM estates running in the back office, or a complete rewriting of those functions based on agents, or a mix of the two.

We think it will be the mix, with new functions being done purely in agents and old functions getting wrapped up in agentic overlays, much as transaction processing and other systems were wrapped in Web interfaces and augmented with new Web databases and application servers in the Dot Com era. There are still 6,000 mainframe shops in the world, and they still drive a $15 billion business for IBM. Power Systems Unix servers still drive probably half that, all in. In the long run, old apps will gradually be replaced by agentic apps, but that will take at least a decade and maybe more.

Underpinning all of that ERP, SCM, CRM, and other back office functions, and a good percentage of data-driven front office functions for that matter, is the traditional server, and the outlook here is good because the server consolidation opportunities are huge:

Dell launched its 15th generation PowerEdge servers in October 2021, and over 70 percent of the machines in its PowerEdge installed base are on 14th generation or earlier iron. With the latest 17th gen machines, which rolled out using Intel and AMD processors last year, the servers have somewhere between 4X and 5X more cores per box and somewhere between 175 percent and 235 percent better power efficiency per box.

What this means is that for the same power budget, you can have 3X the servers (more or less) and around 4.5X the cores. Or, you can take your old crap out of your datacenter and replace it with one third the number of servers (provided applications are virtualized), still have 1.5X more cores, and free up space, power, and budget to actually install AI servers to run GenAI workloads.
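The consolidation arithmetic above works out like this, using the midpoints of the ranges Dell quoted. The fleet size is an illustrative assumption of ours, not a Dell figure.

```python
# Consolidation sketch: midpoint of Dell's 4X-5X cores-per-box range,
# with three old servers replaced by one new one.
cores_per_box_multiple = 4.5   # 17th gen vs 14th gen and earlier, midpoint
consolidation_ratio = 3        # three old boxes out, one new box in

old_servers = 300              # illustrative fleet size, our assumption
new_servers = old_servers / consolidation_ratio
core_multiple = cores_per_box_multiple / consolidation_ratio

print(f"{old_servers} old servers -> {new_servers:.0f} new servers")
print(f"Total cores versus the old fleet: {core_multiple:.1f}X")
```

A 300-box fleet shrinks to 100 boxes and still ends up with 1.5X the cores, which is where the freed-up space, power, and budget for AI iron comes from.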

Clearly, we think the latter is going to happen. And so does Dell, we reckon.

Looking ahead, the number of customers buying AI gear from Dell is growing, and the pipeline is also growing. Dell put some numbers on it, saying that it has sold more than 3,000 customers AI servers with GPUs or, in rare cases, other kinds of XPUs since it started pushing this iron in fiscal 2024. Importantly, looking out five quarters ahead, Dell has a line of sight on another 6,700 unique companies who are potentially going to be customers of its PowerEdge accelerated systems.

A lot of customers are still in the proof of concept phase, Lewis admitted, but AI inference in full production will be a much larger – and a much more profitable – business. (The jury is still out on that latter point, but that is the idea.)

The inference market could easily consume 4X, 5X, or maybe even 10X the compute of the AI training market, possibly at lower price points and lower margins for Nvidia and AMD, but maybe not for Dell, HPE, Lenovo, Cisco Systems, Supermicro, and the others who peddle boxes for a living.

The real reason why Dell is doing deals with xAI and CoreWeave now is to get the experience it will need in building rackscale systems based on Nvidia and AMD compute engines so it can perfect them and roll them out to thousands of new customers every year – customers that operate at a smaller scale that is inherently more profitable and that also need more services and possibly financing as well, which adds to profits that much more.

This is the game that Dell is playing, as are all of the OEMs. And Dell – both the company and the man – has the advantage of being the biggest OEM and the favorite of none other than Jensen Huang, the co-founder and chief executive officer of Nvidia. (It’s a billionaire thing.)

And so, over the long haul, Dell thinks it can grow ISG revenues by between 11 percent and 14 percent per year and that its PC business, Client Solutions Group, can grow by 2 percent to 3 percent per year. That works out to 7 percent to 9 percent growth for all of Dell per year, on average. And because of this growth, and improving profit trends, Dell will be able to boost its dividend by at least 10 percent each year. The annual dividend is expected to be $2.10 in fiscal 2026, and compounding at 10 percent per year it will reach $3.38 by 2030.
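The dividend math checks out if you compound the fiscal 2026 payout through five annual raises:

```python
# Compound Dell's expected $2.10 fiscal 2026 dividend at 10 percent per year.
dividend = 2.10
for year in range(5):        # five annual 10 percent raises
    dividend *= 1.10

print(f"Dividend after five raises: ${dividend:.2f}")  # $3.38
```

That lands on $3.38, matching the figure Dell gave; at the "at least" framing, every raise above 10 percent only pushes it higher.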

Here is what that looks like when you take the midpoints of the forecasts:

Items marked in bold red italics are our estimates, as usual.

By the end of the forecast period, Dell will be making more revenue from AI servers than from traditional servers if the current consolidation wave and normal acquisition of compute capacity drive traditional server sales at about 4.5 percent growth per year. (That is our model, not Dell’s.) If traditional servers grow slower or go flat, Dell will have to do more deals to reach its overall growth targets for ISG. What we can see is that at the midpoints, ISG will be growing faster than the baseline growth for dividends.
