
Did AMD Use ChatGPT To Come Up With Its OpenAI Partnership Deal?

In early 2024, we were all wondering how OpenAI was going to pay for the $100 billion Stargate datacenter infrastructure project it was rumored to be planning with Microsoft. This was the biggest single AI infrastructure project to date, comprising about 1 million AI accelerator engines.

And then we raised our eyebrows a little higher in January 2025, when Stargate’s budget was expanded to $500 billion with much fanfare at the White House, and again just a few weeks ago with a $100 billion commitment from Nvidia, for which the world’s largest IT supplier gets OpenAI stock as well as its share of the $350 billion in hardware sales, if the rumors are right, that are tied to this particular Stargate deal.

That’s NumberWang!

And by that, we mean that Nvidia is getting OpenAI stock as it delivers 10 gigawatts of capacity, and the value of those OpenAI stock holdings will also grow, giving Nvidia back a presumably large portion of its $100 billion bet as the AI model maker’s stock continues to swell. With each gigawatt of capacity costing around $50 billion, those 10 gigawatts of Stargate come to $500 billion, and Nvidia will get a big chunk of that. It looks like roughly 70 percent of that value, or $350 billion, will be spent on system iron, and depending on how you want to cut it and the assumptions you make about compute and network choices, about 80 percent of that 70 percent will go to Nvidia. Call it $280 billion in Nvidia revenues over five years, which is equivalent to two years of datacenter revenues at Nvidia’s 2024 level.
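
For those keeping score at home, here is that back-of-the-envelope arithmetic as a minimal Python sketch. The $50 billion per gigawatt cost, the 70 percent iron share, and the 80 percent Nvidia cut are the rough assumptions discussed above, not reported figures.

```python
# Back-of-the-envelope Stargate arithmetic, using the rough assumptions above.
gigawatts = 10
cost_per_gw = 50e9                        # roughly $50 billion per gigawatt of capacity
stargate_total = gigawatts * cost_per_gw  # $500 billion

iron_share = 0.70                         # ~70 percent of the value goes to system iron
nvidia_share_of_iron = 0.80               # ~80 percent of that iron comes from Nvidia

nvidia_take = stargate_total * iron_share * nvidia_share_of_iron
print(f"Stargate total: ${stargate_total / 1e9:.0f} billion")             # $500 billion
print(f"Nvidia take over five years: ${nvidia_take / 1e9:.0f} billion")   # ~$280 billion
```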

Two weeks ago, OpenAI was talking about $850 billion in datacenter buildouts, encompassing 17 gigawatts of capacity for powering and cooling AI clusters, and at a dinner hosted a few weeks before that, chief executive officer Sam Altman talked about “trillions” in investments “in the not very distant future.”

After the past several weeks, our left eyebrow had a cramp, and we can’t raise the right one except in total surprise (two eyebrows up) rather than the Spock-ish dubiousness that the single raised eyebrow always signifies.

This morning, after a few days off from the news cycle, the left eyebrow worked again, which is good because AMD has done a kind of inverse partnership with OpenAI – That’s WangerNum! – that shows a clever way to hook OpenAI into AMD’s share price. That price will rise as AMD does OpenAI deals for CPUs, GPUs, and DPUs for its future systems, which in turn will give OpenAI at least some of the capital it needs over the next five years to build out its massive global AI hardware footprint as it sells those AMD shares on to other Wall Street investors.

Here’s the deal between AMD and OpenAI.

OpenAI has committed to buying 6 gigawatts of datacenter compute capacity over the next five years, starting in the second half of 2026 with the “Altair+” Instinct MI450 GPUs and the “Helios” rackscale systems that will debut with the MI400 series. The deal runs through October 2030, and OpenAI has until that time to buy and presumably install the gear based on AMD components. This is important because vendors can only book revenue upon delivery and acceptance, but if a third party like Oracle is the prime contractor, AMD will be able to book the revenue once it delivers chips and other components to Oracle and Oracle accepts them.

In exchange for that commitment, AMD will give OpenAI a warrant for an aggregate of 160 million shares, with an exercise price of $0.01 per share – yes, a mere penny – which works out to up to a nearly 10 percent stake in AMD. (AMD had 1.61 billion basic shares and 1.62 billion diluted shares outstanding at the end of the June quarter. The difference is largely stock options that have not yet been exercised.) This deal looks like it will dilute current AMD shareholders, but it really doesn’t matter because as we write this, AMD shares are up 28.8 percent to $212 a pop, raising the company’s market capitalization to $267.2 billion, a nearly $60 billion increase. That 28.8 percent pop is three times the potential dilution over the next five to six years, depending on when OpenAI exercises the warrants.
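
Here is a quick sanity check on that stake and dilution arithmetic, as a sketch using the share count and price move cited above rather than anything out of AMD’s filings.

```python
# Stake and dilution sanity check, using the figures cited above.
warrant_shares = 160e6      # the warrant covers up to 160 million AMD shares
basic_shares = 1.61e9       # AMD basic shares outstanding at the end of the June quarter
pop = 0.288                 # the 28.8 percent one-day jump in AMD stock

stake = warrant_shares / basic_shares                         # ~9.9 percent of today's share count
dilution = warrant_shares / (basic_shares + warrant_shares)   # ~9.0 percent of the enlarged count
print(f"Stake: {stake:.1%}  Dilution: {dilution:.1%}  Pop versus dilution: {pop / dilution:.1f}X")
```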

AMD is not just giving OpenAI shares. It is making OpenAI work for them. The first tranche of the AMD stock vests after that first gigawatt of MI450 racks is installed in the second half of next year, and the faster that OpenAI deploys iron, the faster it gets all of the tranches. But there is another caveat baked into the deal. AMD stock has to hit $600 a pop – a nearly 3X multiple – as OpenAI completes the installation of its 6 gigawatts of capacity, according to AMD’s 8-K filing on the deal, and OpenAI also has to meet unspecified technical and commercial conditions as each tranche comes up for delivery by AMD. These warrants are not transferable, but once they have been converted to stock, OpenAI can do what it wants with the shares. This is something that AMD chief executive officer Lisa Su danced around when Wall Street asked if the stock was going to be sold to fund OpenAI’s growing hardware habit.

“I wouldn’t call it that,” Su explained to the Wall Street crowd on a call before the market opened this morning. “This is really a way for us to align incentives and think about it as a win, win, win. You know, it’s a win for AMD shareholders. It’s a large deployment for us, very significant – tens of billions of dollars of revenue over the next number of years. The deal is structured so that the OpenAI warrants vest as OpenAI deploys at scale with AMD. It’s highly accretive to our shareholders. I think it’s also an opportunity for OpenAI to share some of that upside if we’re both as successful as we plan to be. And I think it’s up to them what they do. But my view of this is: This is a very nice structure for us to be incredibly aligned between our strategic objectives and OpenAI’s strategic objectives, and frankly, it’s a big win for our shareholders.”

Clearly, this is all true, and clearly, OpenAI will be selling its shares as they vest so it can buy some iron.

It will be funny if, because of timing, some of the $100 billion that OpenAI is getting from Nvidia is used to pay for AMD iron and some of the money that OpenAI gets from selling AMD stock is used to buy Nvidia iron. But don’t get the wrong impression. This stock deal might not drive as much revenue as you think. Let’s do some math.

If OpenAI got some stock yesterday just for signing the deal and had a 10 percent stake already, it would have made $6 billion this morning. Which would have come in handy. But that is not how AMD did the deal.

If we assume linearity for the sake of argument, we can divide the 160 million shares into five equal tranches and sell 32 million shares in each year from 2026 through 2030, inclusive. If we assume roughly linear growth in the stock, with the tiniest of exponents, as we go out in time from $212 this afternoon to $600 in the second half of 2030, then the stock would rise to $281 in 2H 2026, $350 in 2H 2027, $425 in 2H 2028, $513 in 2H 2029, and $600 in 2H 2030. AMD would have 1.77 billion basic shares outstanding and a market capitalization of $1.06 trillion if that all comes to pass. Assuming that schedule of salable shares and share prices, OpenAI would reap $9 billion in AMD shares in 2026, $11.2 billion in 2027, $13.6 billion in 2028, $16.4 billion in 2029, and $19.2 billion in 2030. That’s $69 billion in total. If you assume that OpenAI might hold onto some of the shares, call it $50 billion that it will sell, which covers the cost of construction of 1 gigawatt of datacenter capacity and which gives it cash flow every year from the Magical Pachinko machine that is Wall Street. (Hey, we all have 401(k) retirement plans to think about, right?)
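
Here is that hypothetical selling schedule as a sketch. The per-year share prices are the assumed trajectory laid out above, from $212 today to the $600 milestone in 2030, and are not part of the 8-K.

```python
# Hypothetical vesting-and-selling schedule: five equal tranches of 32 million
# shares sold in the second half of each year at the assumed prices above.
tranche_shares = 160e6 / 5                 # 32 million shares per year
assumed_prices = {2026: 281, 2027: 350, 2028: 425, 2029: 513, 2030: 600}

total = 0.0
for year, price in assumed_prices.items():
    proceeds = tranche_shares * price
    total += proceeds
    print(f"2H {year}: ${proceeds / 1e9:.1f} billion")
print(f"Total: ${total / 1e9:.1f} billion")    # roughly $69 billion
```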

Like we said: That’s WangerNum!

But the pesky question is how OpenAI is going to come up with the cash to build the rest of the infrastructure, because even Nvidia is not doling out its investments in OpenAI until OpenAI actually buys the gigawatts of capacity.

Speaking on Bloomberg this morning after the announcement, Greg Brockman, president of OpenAI, joined Su in an interview and had this to say about revenue: “The way I would look at this is that AI revenue is growing faster than I think almost any product in history. And that ultimately, at the end of the day, the reason this compute power is so important, is so worthwhile for everyone to build, is because the revenue ultimately will be there.”

We are back to Field of GPUs . . . which we already talked about.

Brockman went on to say that OpenAI is looking at all kinds of ways of financing this massive AI buildout, including selling equity and taking on debt, and he also intimated that this deal with AMD would be mostly about AI inference, not AI training, where a lot of the work has already been done on Nvidia GPUs. Basically, Brockman said that OpenAI has a diversity of workloads, and it will require a diversity of chips as well as a plethora of chips to satisfy the company’s compute needs.

For AMD, Su says this deal with OpenAI represents “tens of billions of dollars” in product sales in 2027 and that it has the capability to drive “well over $100 billion in revenues over the next few years.” When asked if this included networking, Su danced a bit, talking about DPUs and the open ecosystem of partners, and that made us believe that OpenAI is not pressuring AMD to be a vendor of UALink or Broadcom SUE scale-up interconnects for lashing CPUs and GPUs together by their main memories. Su added that each gigawatt of capacity would generate “double digit billions of revenue” for AMD. Which is a lot better than the $5 billion AMD did last year in GPU sales and the maybe $6.2 billion it might do in GPU sales here in 2025.
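
If you take that “double digit billions” per gigawatt literally, the arithmetic lands in the neighborhood of Su’s “well over $100 billion” figure. The per-gigawatt range below is our assumption for illustration, not an AMD disclosure.

```python
# Rough revenue range for the 6 gigawatt commitment, assuming "double digit
# billions" per gigawatt means somewhere between $15 billion and $25 billion.
gigawatts = 6
low_per_gw, high_per_gw = 15e9, 25e9       # assumed range, not an AMD figure

print(f"Low end:  ${gigawatts * low_per_gw / 1e9:.0f} billion")    # $90 billion
print(f"High end: ${gigawatts * high_per_gw / 1e9:.0f} billion")   # $150 billion
```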

With the deal, Nvidia is behind 10 gigawatts of capacity, and AMD is behind 6 gigawatts of capacity, which is a 62.5 percent to 37.5 percent split in share across the two. If there is another 1 gigawatt of other kinds of iron, for a total of 17 gigawatts as Altman was talking about recently, then Nvidia iron has a 58.8 percent share of capacity, AMD has a 35.3 percent share, and others (possibly including a homegrown OpenAI “Titan” XPU) have a 5.9 percent share. Or maybe the whole pie is a lot bigger and the AMD and Nvidia shares of OpenAI infrastructure will ultimately be smaller.
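
Here is that capacity-share arithmetic in one place, with the extra gigawatt of “other” iron simply being whatever is left over within the 17 gigawatt figure.

```python
# Capacity share math for the OpenAI buildout as described above.
nvidia_gw, amd_gw = 10, 6

two_way = nvidia_gw + amd_gw
print(f"Two-way split: Nvidia {nvidia_gw / two_way:.1%}, AMD {amd_gw / two_way:.1%}")  # 62.5% / 37.5%

total_gw = 17                                 # the 17 gigawatts Altman has been talking about
other_gw = total_gw - nvidia_gw - amd_gw      # 1 gigawatt of other iron
print(f"Of 17 GW: Nvidia {nvidia_gw / total_gw:.1%}, AMD {amd_gw / total_gw:.1%}, "
      f"other {other_gw / total_gw:.1%}")     # 58.8% / 35.3% / 5.9%
```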

It remains to be seen just how profitable this will be for AMD, but as we all know the dance by now, this deal will be accretive to profits in a dollar sense for sure, but not necessarily accretive to profits in a percentage of revenue sense. Which is what we all expect from anything that looks and feels like HPC, as the GenAI boom certainly does.

Nvidia has almost all of the profit in the GenAI boom, and as long as Nvidia controls its entire stack and is the preferred vendor of an AI stack, it will keep extracting massive profits even if revenue growth and profit growth slow. Competition will only remove the profits from the market, not spread them around, because you can only take business away from Nvidia by over-teching and under-pricing – and you need to do both because of that massive CUDA-X software moat. This is what happened with the open systems revolution, when Unix systems in the datacenter knocked out most (but not all) proprietary systems, and then with the X86 revolution in the wake of the Dot Com boom, which knocked out most (but not all) Unix systems.

Given all of this, getting AMD’s share price to $600 could be a challenge, but not as big a one as OpenAI coming up with even $500 billion to buy stuff without relying on the assistance of national governments and sovereign wealth funds in the Middle East that have their own national security issues from the point of view of the United States.

What we really want to know is if this new twist on round tripping money between a hardware vendor and OpenAI – Microsoft did it, Nvidia did it, and now AMD is doing it – was the brainchild of ChatGPT. Wouldn’t that be funny? For the sake of humanity, we hope not. We’d like to think people are still clever.
