Just How Big Are Nvidia’s Server And Networking Businesses?

In the absence of hard numbers, we have never been shy about making estimates because educated guesses are sometimes all you have to make a decision or to derive some kind of insight. So when we got a call last week from someone asking us how big Nvidia’s server and networking businesses are, at a finer grain than the broad “Compute & Networking” and “Datacenter” categories that Nvidia talks about, we didn’t hesitate to load up our spreadsheet for Big Green and take a stab at it.

We figured you would find this conjecture interesting, and hence we are sharing it. And given that this is a group effort, we are happy for any input you might have on the subject.

Let’s start with what we know. In the trailing twelve months through January 2023, which is the closest thing to a proxy for 2022 annual sales that we have for Nvidia, the company posted $15.07 billion in revenues for its Compute & Networking group, up 36.4 percent year on year and including some products that do not end up in the datacenter. The Datacenter division, which includes some visualization products that are not technically compute, had just a tad over $15 billion in revenues, up 41.4 percent year on year. For the sake of argument, we are going to say that the systems business at Nvidia was $15.005 billion, which is the number for the Datacenter business.

Now, just as Mellanox was being acquired by Nvidia, in the trailing twelve months ended in Q1 2021 the entire Mellanox business was running at $1.45 billion and growing at 27.2 percent. We had three quarters in Nvidia’s fiscal 2021, which ended in January 2021, where the Mellanox business was broken out, and that amounted to $1.64 billion in sales. The odds are that it was around $2 billion for the full fiscal 2021 year for Nvidia, which averaged around 30 percent of Nvidia’s Datacenter division revenue. Given how integral InfiniBand is to AI training clusters and how Nvidia is ramping sales of its Spectrum Ethernet switches, we have no reason to believe the ratio between Mellanox sales and overall Datacenter sales was any different for fiscal 2022 and early fiscal 2023.

So that would put sales of switches, NICs, cables, and network operating systems at $4.5 billion for the trailing twelve months, leaving just a bit more than $10.5 billion for all datacenter compute.
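For those who want to check our arithmetic, here is a minimal sketch of that split, assuming the 30 percent networking ratio carried forward from the Mellanox breakouts above. The variable names are our own shorthand, not anything Nvidia reports:

```python
# Back-of-the-envelope split of Nvidia's Datacenter revenue, trailing twelve
# months through January 2023. The 30 percent networking share is our estimate
# carried forward from the Mellanox breakouts discussed above.
datacenter_ttm = 15.005e9        # Datacenter division revenue, TTM (dollars)
networking_share = 0.30          # assumed Mellanox/networking share of Datacenter

networking = datacenter_ttm * networking_share   # ~$4.5 billion
compute = datacenter_ttm - networking            # ~$10.5 billion

print(f"Networking: ${networking / 1e9:.2f} B, Compute: ${compute / 1e9:.2f} B")
```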

Now we have to start making some more assumptions to drill down into this deeper:

After chatting with some people, we came to a consensus that around 40 percent of the Datacenter compute revenues came from PCI-Express GPU cards of various kinds, dominated by the T4, A40, L4, and L40 cards, which are not available as NVLink-capable SXM modules, and the PCI-Express versions of the A100 and H100 GPU accelerators, which are also available as SXM4 and SXM5 modules with their NVLink ports and NVSwitch interconnects. So we have $4.2 billion for these PCI-Express accelerators, with perhaps an average selling price of around $5,000 a pop, representing what we think is more than 840,000 units. That leaves the remaining $6.3 billion for whole DGX servers and HGX components.
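Again, a minimal sketch of that math, assuming the 40 percent share and the $5,000 average selling price above are roughly right:

```python
# Rough math for the PCI-Express accelerator slice: 40 percent of datacenter
# compute revenue at an assumed $5,000 blended average selling price. Both
# figures are estimates from the text, not disclosed numbers.
compute = 10.5e9                 # datacenter compute revenue estimate (dollars)
pcie_share = 0.40                # assumed share going to PCI-Express cards
pcie_asp = 5_000                 # assumed blended ASP per PCI-Express card

pcie_revenue = compute * pcie_share        # ~$4.2 billion
pcie_units = pcie_revenue / pcie_asp       # ~840,000 cards
dgx_hgx_revenue = compute - pcie_revenue   # ~$6.3 billion left for DGX/HGX

print(f"PCIe: ${pcie_revenue / 1e9:.1f} B, ~{pcie_units:,.0f} cards, "
      f"${dgx_hgx_revenue / 1e9:.1f} B left for DGX/HGX")
```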

So, how many of the HGX system boards end up in DGX systems that Nvidia and its resellers sell, and how many end up in systems created by ODMs and OEMs? Our guess is that the DGX machines represent 20 percent of the revenues, and we think if it was any higher than that, OEMs and ODMs would be screaming. If you work the numbers backwards off of the $1.26 billion we think was sold by Nvidia and its partners in the trailing twelve months ended in January 2023, that is somewhere around 7,800 machines and somewhere north of 62,000 GPUs – and given that the H100s were only just now shipping in volume, most of those machines – but not all – were based on the A100 GPU accelerators.
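Working that backwards in the same spirit, assuming eight GPUs per DGX system and the 20 percent revenue share above (the per-machine price is implied by our estimates, not disclosed):

```python
# DGX slice: 20 percent of the $6.3 billion DGX/HGX pool, with eight GPUs per
# DGX system assumed. Machine count and GPU count are implied, not reported.
dgx_hgx_revenue = 6.3e9
dgx_share = 0.20                 # assumed DGX share of the DGX/HGX pool
gpus_per_system = 8

dgx_revenue = dgx_hgx_revenue * dgx_share      # ~$1.26 billion
dgx_machines = 7_800                           # implied machine count from the text
dgx_asp = dgx_revenue / dgx_machines           # ~$161,500 per DGX system
dgx_gpus = dgx_machines * gpus_per_system      # ~62,400 GPUs

print(f"DGX: ${dgx_revenue / 1e9:.2f} B, ~{dgx_machines:,} machines at "
      f"${dgx_asp:,.0f} each, ~{dgx_gpus:,} GPUs")
```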

That leaves just north of $5 billion in HGX boards that go out to ODMs and OEMs, comprising around 36,600 machines and around 293,600 GPUs. If you further assume these HGX boards represent around 75 percent of the value of the systems, and that Nvidia is not discounting very much for any customers given that AI compute demand is far outstripping supply, then the third party analogs of the DGX systems probably drove somewhere around $6.7 billion in revenues. Add it all up, and the average selling price of an Nvidia GPU accelerated system, no matter where it came from, was just under $180,000, and the average SXM-style, NVLink-capable server GPU sold for just over $19,000 (assuming the GPUs represented around 85 percent of the cost of the machine). And if you do the math further, the average datacenter GPU of any type sold by Nvidia went for just shy of $9,200.
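And here is the final blend, assuming the 75 percent board-to-system and 85 percent GPU-to-system ratios above; all of the counts and ratios are our estimates, not reported figures:

```python
# HGX slice and blended averages. The 75 percent board-to-system ratio and the
# 85 percent GPU-to-system cost ratio are the assumptions stated in the text.
hgx_revenue = 6.3e9 - 1.26e9     # ~$5.04 billion in HGX boards to ODMs/OEMs
hgx_machines = 36_600            # implied machine count from the text
hgx_gpus = 293_600               # implied GPU count from the text
board_share_of_system = 0.75     # assumed HGX board share of system value

oem_system_revenue = hgx_revenue / board_share_of_system   # ~$6.7 billion

# Blend in the DGX figures from the previous step.
dgx_revenue, dgx_machines, dgx_gpus = 1.26e9, 7_800, 62_400
total_system_revenue = dgx_revenue + oem_system_revenue
total_machines = dgx_machines + hgx_machines
total_sxm_gpus = dgx_gpus + hgx_gpus

system_asp = total_system_revenue / total_machines         # just under $180,000
gpu_share_of_system = 0.85       # assumed GPU share of system cost
sxm_gpu_asp = total_system_revenue * gpu_share_of_system / total_sxm_gpus  # ~$19,000

# Fold the PCI-Express cards back in for a blended datacenter GPU price.
pcie_revenue, pcie_units = 4.2e9, 840_000
all_gpu_asp = (sxm_gpu_asp * total_sxm_gpus + pcie_revenue) / (total_sxm_gpus + pcie_units)

print(f"System ASP ${system_asp:,.0f}, SXM GPU ASP ${sxm_gpu_asp:,.0f}, "
      f"blended datacenter GPU ASP ${all_gpu_asp:,.0f}")
```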

Based on what Nvidia has said in the past, we believe the hyperscalers might have accounted for 40 percent of Datacenter division revenues, but we would not be surprised if it is more than half, either.

Discuss.


1 Comment

  1. I have Nvidia DC dGPU 2022 revenue at $11.95 B, or $2,389 of revenue per card off a $4,779 average design install price through consulting systems integration (albeit Nvidia can be that too), which works out to 5 M units a year. On TPM backing out less than 60% for other than PCI dGPU cards, that is 2 M to 2.5 M units at 40%, or 2x my price respectively, which is 10% of the so-said ‘server’ volume, known to be understated on Intel supply data. PCI dGPU population in an HPC / ML sled or tray = 2, 4, 8? Noted your thoughts, and the curiosity continues.

    Specific "our guess is DGX machines represent 20 percent of the revenues, and we think if it was any higher than that, OEMs and ODMs would be screaming". Nvidia is definitively an Intel first tier OEM and possibly the leading business of compute provider beyond Intel itself "walking that thin blue line".

    Mike Bruzzone, Camp Marketing
