With Celestial AI Buy, Marvell Scales Up The Datacenter And Itself

It was only a matter of time before Marvell made another silicon photonics acquisition, and the $2.5 billion sale of its automotive Ethernet business to Infineon this past summer netted the company about three-quarters of the $3.25 billion that it is shelling out to get its hands on Celestial AI, one of several upstarts that hope to hook compute engines, memory, and switches together using on-chip optical engines and light pipes.

The Celestial AI deal had been rumored for a few weeks, and happened as Marvell was getting ready to report its financial results for its third quarter of fiscal 2026, which ended on November 1. The company hit its numbers and, even without any effect from the now-pending acquisition, raised its revenue guidance for the coming two fiscal years. The additional revenue stream that Celestial is expected to bring to Marvell will be icing on and within the multi-layer cake.

We strongly suspect that this icing will be thicker than Marvell’s top brass is letting on right now – provided that the AI bubble doesn’t pop in the coming years, of course. Hence the hedge. And even if it does pop, the same limits on electrical signaling still apply – copper wires get halved in reach every time bandwidth doubles because the signal-to-noise ratio degrades, and you cannot change the laws of physics. And therefore the costs of the transition to silicon photonics will still have to be borne, albeit across a possibly diminished revenue and profit stream for the datacenter sector.
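That reach-versus-bandwidth tradeoff can be sketched as a toy model. This is a rule-of-thumb inverse relationship only – the baseline rate and reach below are illustrative assumptions, not vendor specs:

```python
# Rule of thumb: passive copper reach roughly halves each time the
# per-lane signaling rate doubles. Baseline figures are assumptions
# for illustration, not measured cable specs.
BASE_RATE_GBPS = 25   # assumed per-lane baseline signaling rate
BASE_REACH_M = 4.0    # assumed passive DAC reach at that rate

def copper_reach_m(rate_gbps: float) -> float:
    """Estimate passive copper reach at a given per-lane rate."""
    return BASE_REACH_M * (BASE_RATE_GBPS / rate_gbps)

for rate in (25, 50, 100, 200, 400):
    print(f"{rate:>4} Gb/s/lane -> ~{copper_reach_m(rate):.2f} m")
```

By the time per-lane rates reach 400 Gb/sec, this toy model leaves only a fraction of a meter of passive reach – not enough to cross a rack, which is the wall the article is describing.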

In this sense, the AI boom is the killer app for silicon photonics. It is helpful to remember that Intel was showing off SiPho interconnects running at 1.6 Tb/sec for rackscale CPU systems way back in September 2013 – more than a dozen years ago. The idea is not new. But the need is truly pressing this time, and eventually – somewhere around 2028 or 2029, it looks like – even copper backplanes are going to hit a wall if Celestial AI and competitors like Lightmatter, Ayar Labs, Avicena, and Astera Labs are right. (Taiwan Semiconductor Manufacturing Co is working on COUPE to integrate optical engines into 3D stacked chip complexes, and you can bet Nvidia will be ready when COUPE is ready for primetime, with AMD hot on its heels.)

With the US penny being canceled and the days of copper cables numbered, now is not a good time to invest in copper futures. . . .

Wall Street didn’t “get” the Celestial AI deal after the market closed on Tuesday, then figured it out a bit. On the following day, there were reports that Microsoft was having trouble charging $30 per user per month for copilots in its Office365 suite, and a little bit of air was let out of the tech bubble, including Marvell, which obviously has a huge AI play, with more than three-quarters of the company’s datacenter revenues coming from AI-related products at this point.

In the November quarter, Marvell had $2.08 billion in sales, up 36.8 percent year on year. The company had an operating profit of $358 million, which is a whole lot better than the $703 million loss the company had in Q3 F2025. Thanks to a $1.8 billion gain from the sale of the automotive Ethernet biz plus actual net income from operations, Marvell posted a net income of just a tad over $1.9 billion – almost as large as revenues for the quarter.

But don’t get used to that cash, or the $4.47 billion hoard Marvell has in the bank after the quarter finished. At least $3.25 billion is going to the shareholders of Celestial AI, and if what will become the photonics interconnect division of the company racks up $2 billion in cumulative revenues by the end of fiscal 2029, then they get an additional $2.25 billion for their shares. (To be precise, the initial phase of the deal is $1 billion in cash and $2.25 billion in Marvell stock.) If the Celestial AI division only reaches $500 million in cumulative sales by the end of fiscal 2029, Celestial AI shareholders only get a $750 million bump.
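The earnout schedule just described can be expressed as a simple tiered function. Only the two disclosed tiers are modeled here; how the payout behaves between or beyond those thresholds has not been spelled out, so this is just a sketch:

```python
def earnout_usd_b(cumulative_rev_b: float) -> float:
    """Earnout owed to Celestial AI holders, in $B, per the two
    disclosed tiers. Behavior between the tiers is an assumption."""
    if cumulative_rev_b >= 2.0:   # $2B cumulative by end of fiscal 2029
        return 2.25
    if cumulative_rev_b >= 0.5:   # $500M cumulative by end of fiscal 2029
        return 0.75
    return 0.0

UPFRONT_B = 3.25  # $1B in cash plus $2.25B in Marvell stock
for rev in (0.4, 0.5, 1.3, 2.0):
    total = UPFRONT_B + earnout_usd_b(rev)
    print(f"${rev:.1f}B cumulative revenue -> ${total:.2f}B total consideration")
```

At the full $2 billion tier, total consideration reaches $5.5 billion – a useful number to keep in mind when sizing up how badly Marvell wants this business to ramp.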

Matt Murphy, Marvell’s chief executive officer, told Wall Street that Celestial AI’s Photonic Fabric (PF) optical engines and related packaging could hit an annualized run rate of $500 million as fiscal 2028 comes to a close and an ARR of $1 billion as fiscal 2029 ends. If you play around with some numbers, that works out to about $1.3 billion in actual revenue. So sell an extra $700 million in stuff, and get an extra $2.25 billion in cash and Marvell shares.
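Here is one toy ramp consistent with the run rates Murphy cited – quarterly revenue growing linearly to $125 million per quarter (a $500 million ARR) exiting fiscal 2028 and to $250 million per quarter (a $1 billion ARR) exiting fiscal 2029. The per-quarter figures are our assumptions, not guidance; this ramp alone lands near $1.1 billion cumulative, and folding in some fiscal 2027 revenue gets into the neighborhood of the $1.3 billion figure:

```python
# Assumed quarterly revenue ramp, in $M. Exit rates match the cited
# ARRs ($125M/quarter = $500M ARR, $250M/quarter = $1B ARR); the
# intermediate quarters are a linear interpolation we made up.
fy28 = [31.25, 62.5, 93.75, 125.0]
fy29 = [156.25, 187.5, 218.75, 250.0]

cumulative_m = sum(fy28) + sum(fy29)
print(f"cumulative FY28-FY29 revenue: ${cumulative_m / 1000:.3f}B")

# Gap to the $2B cumulative threshold that unlocks the full earnout:
gap_m = 2000 - cumulative_m
print(f"extra revenue needed beyond this ramp: ${gap_m:.0f}M")
```

The point of the exercise is the same one the article makes: the full $2.25 billion earnout requires meaningfully outselling the run rates Murphy laid out, not just hitting them.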

In the quarter, the Datacenter group at Marvell generated $1.52 billion in sales, up 37.9 percent year on year but only up 1.8 percent sequentially. Enterprise networking, which includes storage and DPUs, among other things, had $237 million in sales, up 57.2 percent. Add it up, and Datacenter and Enterprise Networking comprise 84.6 percent of overall revenues for Marvell.

Drilling down into the Datacenter group, AI revenues were up 1.3X to $1.24 billion in Q3. Within this, $418 million was for custom XPUs – helping with designs and shepherding them through the TSMC foundry and packaging partners – up 83.4 percent, and $819 million was for electro-optics used in various parts of the scale-out networks for AI systems, up 2.6X. Sales of other datacenter products fell by 39.2 percent to $402 million in Q3 F2026.

Looking ahead to Q4 F2026, which ends in early February, Marvell expects $2.2 billion in sales, with about $1.64 billion of that coming from datacenter stuff. (That’s 74.5 percent of sales.)

And based on what Murphy said on the call with Wall Street, datacenter sales should hit around $7.67 billion in fiscal 2027 and $10.74 billion in fiscal 2028 – and that is without any revenues added from Celestial AI, which, by our math, will need to add about $370 million in fiscal 2027 and around $760 million in fiscal 2028 to hit those target run rates. Remember, in fiscal 2020, Marvell’s datacenter business was under $1 billion in sales. . . .

If you tear apart the numbers a bit for fiscal 2027, you get $1.67 billion in AI XPU sales, up 20 percent, and AI electro-optics sales of $3.95 billion, up 35.5 percent, for overall AI sales of $5.62 billion, up 32.1 percent. Other datacenter products should drive above $2 billion in sales for fiscal 2027, up 15 percent.
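Those components can be cross-checked in a couple of lines. Taking the “above $2 billion” other-datacenter figure at its floor, the sum lands within about $50 million of the roughly $7.67 billion datacenter projection for fiscal 2027:

```python
# Sanity-check the fiscal 2027 arithmetic from the guidance above ($B).
ai_xpu_b = 1.67      # custom AI XPU sales, up 20 percent
ai_optics_b = 3.95   # AI electro-optics sales, up 35.5 percent
ai_total_b = ai_xpu_b + ai_optics_b
print(f"total AI sales: ${ai_total_b:.2f}B")

other_dc_b = 2.0     # floor of "above $2 billion" in other datacenter sales
dc_total_b = ai_total_b + other_dc_b
print(f"implied datacenter total: ${dc_total_b:.2f}B")
```

The AI pieces do sum to the $5.62 billion cited, and the implied $7.62 billion-plus datacenter total squares with the $7.67 billion projection once "other" runs a bit above its $2 billion floor.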

And that brings us to the rationale behind the Celestial AI acquisition and Marvell’s enthusiasm for the Photonics Platform that the company has created. Murphy explained that the 3D approach to photonics integration frees up chip beachfront around the perimeter of the XPU package and makes the package more thermally viable. But perhaps more importantly, the two companies appear to share a common hyperscaler (and we think cloud builder) customer.

“Celestial AI is deeply engaged with multiple hyperscalers and ecosystem partners who recognize the disruptive potential of this technology,” Murphy explained on the call. “Notably, Celestial AI has already secured a major design win with one of the world’s largest hyperscalers, who plans to use Celestial AI’s PF chiplets in its next-generation scale-up architecture. These PF chiplets will be co-packaged into both the hyperscaler’s custom XPUs and the scale-up switches providing connectivity. This is expected to be the industry’s first large-scale commercial deployment of optical interconnects for scale-up connectivity.”

So maybe we need to add that to the Trainium4 projection we just made. This is how you might take four Trainium4 chiplets with six NeuronCore-v5 cores each and glue them together into what looks like a single chip to the software, as well as hook out to pooled disaggregated memory:

These would be very interesting additions, indeed, to the Trainium4 XPU from AWS. Or, for that matter, any XPU or CPU.