If you want to know the state of the art in GenAI model development, you watch what the Super 8 hyperscalers and cloud builders are doing and you also keep an eye on the major model builders outside of these companies – mainly, OpenAI, Anthropic, and xAI as well as a few players in China like DeepSeek.
But if you want to understand how enterprises are adopting GenAI for real work, you might better look to how Big Blue is doing peddling its hard, soft, and people wares to the Global 10,000. This is where GenAI has to take off if it is to become a sustainable, new wave in data processing.
Thus far, enterprises have been enthusiastic about the potential of GenAI, but they have not lost their minds the way many companies did, out of fear of being left behind, during the Dot Com Boom. There are no blank checks, but there are no blank stares, either.
Arvind Krishna, IBM’s chief executive officer, has a nuanced view of how AI – and particularly AI inference – might play out in the real world, not in the rarefied air of the hyperscalers, cloud builders, and model builders. This is not a new theory, but a strategy that IBM has had for nearly a decade, since it started to work on tensor math engines for its Power10 and z16 compute engines, which are used in the most mission-critical back office systems of record on the planet. The Power11 and z17 have improved tensor cores as well as the vector units that have been part of IBM’s compute engines for decades now, and sales of these processors have been ramping over the past year.
“I do think that there is going to be a lot of concern around the nature of what are the models learning from answering these questions, and do we really want to share that with everybody else or not,” Krishna explained on a call with Wall Street analysts going over the company’s financials for the fourth quarter of 2025. “There are going to be issues around sovereignty – on the uses of these models – and there is going to be questions around just basic privacy. If I look out three to five years, 50 percent of the enterprise usage of AI is going to be in either a private cloud or in their own datacenters, and the other 50 percent is going to be usage of public models. Now there’s also an efficiency question. So if what’s being used on premise is smaller models, then actually it could be that 80 percent to 90 percent of all the inferencing is really in private/on premise, and 10 percent of the inferencing is on a public cloud, but that 10 percent could be at 5X to 10X the price and hence the dollars sort of even out.”
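Krishna’s arithmetic is easy to sanity check. Here is a back-of-the-envelope sketch using illustrative numbers drawn from the ranges he cited (the 90/10 volume split and the 10X price multiple are his hypotheticals, not IBM disclosures):

```python
# Illustrative check of Krishna's scenario: 90 percent of inference
# queries run on premise at a base unit price of 1, and 10 percent run
# on a public cloud at 10X that unit price.
on_prem_share = 0.90        # share of inference volume done on premise
public_share = 0.10         # share of inference volume on a public cloud
public_price_multiple = 10  # public cloud unit price vs on-prem unit price

on_prem_dollars = on_prem_share * 1.0
public_dollars = public_share * public_price_multiple
total_dollars = on_prem_dollars + public_dollars

# Public cloud carries 10 percent of the volume but roughly half the spend
print(round(public_dollars / total_dollars, 2))  # prints 0.53
```

At the low end of his range – a 5X price multiple – the public cloud’s dollar share falls to about a third, which is the neighborhood of the “dollars sort of even out” outcome Krishna described.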
IBM is very much focused on this, which is why it has created its own models as well as packaged up open source and closed source ones in its WatsonX tools, and it is also why it has created several different code assistants to help customers with Power or System z servers to modernize their code using GenAI. It is not clear how many enterprise customers are using these code assistant tools, with Project Bob (which uses IBM models as well as those from Anthropic) being the latest iteration, but there is a skills shortage for these vintage IBM platforms and code assistants are going to fill in the gaps.
It is not clear how much business at IBM in the final quarter of 2025 was driven by AI, but we do know one thing for sure: It is going to be harder to figure it out going forward.
Since the third quarter of 2023, Big Blue has been providing cumulative bookings for software and consulting relating to GenAI. A few quarters ago, it started giving out only the combined number for bookings. And, according to IBM chief financial officer Jim Kavanaugh, this will be the last time Big Blue provides any stats on AI bookings because, as he put it, these numbers do not accurately reflect the totality of AI revenues from IBM’s enterprise customers.
Take a last look:
Krishna said that cumulative AI consulting bookings since Q3 2023 were more than $10.5 billion and cumulative AI software bookings were more than $2 billion, for a combined total of more than $12.5 billion. We filled in some of the gaps in what IBM said to give you a better model in the table above. (Estimates are in bold red italics, as usual.)
To put these numbers in a little perspective, over that same span from Q3 2023 to Q4 2025, IBM had $100.8 billion in revenues, and the backlog of AI stuff might stretch out over several years into the future. These bookings, therefore, represent a fairly small share of revenues to date – about 12.4 percent. We also think IBM does not want people to see how little bookings – and therefore revenues – it is getting selling its own GenAI models and tools in a world where OpenAI did around $20 billion in business in 2025 and is projecting $30 billion in 2026, and Anthropic did maybe $6 billion to $7 billion in 2025 with an annualized run rate of $9 billion as it exited the year and will do maybe $18 billion in 2026.
Here is the big table showing how the IBM groups and divisions did over the past two years:
So, as IBM awaits its GenAI fortunes and learns how to deploy GenAI internally to cut costs and drive revenues so it can sell that knowledge and those products, Big Blue’s core systems business is doing well – not least because its server platforms are AI-ready and, in the case of the System z17 mainframes and their “Telum-II” processors, are having a bit of a boom.
In the quarter, IBM’s Infrastructure group, which sells servers, storage, switches, and systems software, had sales of $5.13 billion, up 20.6 percent year on year, and a pre-tax income of $1.6 billion. Sales of hardware and systems software were up by 29 percent to $3.85 billion in our model, and tech support for infrastructure was up 1 percent to $1.29 billion.
IBM does not report server revenues uniquely, but did say that System z sales were up 67 percent while sales of Power Systems and storage products together – what it calls “distributed infrastructure” even though most of it is enterprise-class back office iron – were up 3 percent.
IBM’s Software group had sales of just a tad over $9 billion, up 14 percent, with pre-tax income of $3.4 billion. Red Hat accounted for about $2.27 billion of that in our model, up 10 percent, and transaction processing systems for mainframes accounted for $2.59 billion, up 8 percent. The rest of the software group was development tools, AI tools, databases, and software-defined storage, what IBM used to call Hybrid Platforms & Solutions. (Our charts keep the old names for now.)
The Consulting group drove $5.35 billion in sales, up 3.4 percent, with Strategy and Technology up 2 percent to $2.9 billion and Intelligent Operations (which is really application hosting) up 5 percent to $2.4 billion. Krishna said, by the way, that GenAI represented 25 percent of its current $32 billion revenue backlog and around 15 percent of consulting revenue. The annualized run rate for GenAI consulting was $3.6 billion as Q4 2025 came to an end.
Red Hat’s OpenShift Kubernetes platform, which has AI variations with models and frameworks built in, is driving north of $2 billion in revenues a year and growing at 30 percent a year.
And finally, in the quarter, IBM’s “real” systems business – the hardware, software, services, and financing of the basic systems in the System z and Power Systems families but not including development tools, databases, security, and application software – had $9.42 billion in sales in our model, with a pre-tax income of $5.18 billion, or 55 percent of revenues.
This is one of the largest and most profitable systems businesses in the world, and it is important to remember that as we look at the next wave of AI. No, IBM is no Nvidia. But then again, Nvidia is no IBM, either.