
People – and when we say “people” we mean “Wall Street” as well as individual investors – sometimes have unreasonable expectations. It is happening right now with what we still call Google, because Alphabet is a holding company that, for the most part, is just holding Google.
Google reported its financial results for the fourth quarter of 2024 after the market closed last night, alongside AMD, and Wall Street is just not in the mood for perfectly reasonable growth and profitability from either company. (You can see our coverage of AMD’s numbers and our projections for the growth in its Instinct GPU business for 2025 here.)
Not all financial curves look like those that Nvidia has enjoyed in the past year and a half, even if companies are strong partners of Nvidia, as Google certainly is. (With its own TPU accelerators and its own optical switches and its own CPUs, Google is also a competitor to Nvidia.) As a chronicler of datacenter platforms, we couldn’t care less about the YouTube or search engine businesses at Google except inasmuch as these businesses use Google infrastructure and also can help prop up Google Cloud and let the Chocolate Factory create its rendition of hyperscale cloud infrastructure while also keeping some heat on Amazon Web Services and Microsoft Azure. We also care about the AI models that Google creates to keep the heat on Anthropic and OpenAI – which are tightly coupled to AWS and Azure, respectively – and about the financial heft it takes to create these models.
We would argue – really, it is just an obvious observation – that only those companies that have a big business on top of IT infrastructure, either renting capacity or peddling a service that runs on it, can afford to build the massive GPU accelerated systems that are in turn used to train the largest AI models, which are asymptotically approaching artificial general intelligence. Not a single government on Earth can afford to make the kinds of investments that the hyperscalers and cloud builders have been doing for the past several years. AI is a very expensive research project, and it is not at all clear how the global economy will absorb it, use it, and pay for it.
But for now, GenAI hype is running high and demand for GPU compute is off the charts. And therefore Google Cloud is running fast and furious to build datacenters stuffed with GPUs and TPUs as fast as it can.
“We do see and have been seeing very strong demand for AI products in the fourth quarter in 2024,” explained Anat Ashkenazi, chief financial officer at Google, on the call with Wall Street going over the Q4 2024 numbers. “We exited the year with more demand than we had available capacity. So we are in a tight supply demand situation, working very hard to bring more capacity online.”
In the quarter, the entirety of Google posted $96.47 billion in sales, up 11.8 percent, with operating income of $30.97 billion, up 30.7 percent, and net income of $26.54 billion, up 28.3 percent. Whenever you have net income growing much faster than revenues, you are winning. Google ended the quarter with $95.67 billion in cash, down 13.8 percent because of enormous investments in datacenters and the IT gear inside of them, and a revenue backlog (mostly associated with its cloud contracts) of $93.2 billion, up 25.8 percent.
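If you want to see why net income outgrowing revenue means you are winning, here is a quick back-of-the-envelope check in Python using the figures quoted above. The year-ago values are implied from the growth rates, so treat them as approximations rather than Alphabet’s reported numbers.

```python
# Quick illustration of why net income growing faster than revenue means
# margins are expanding. Dollar figures are in billions, from the quarter
# above; the year-ago values are implied from the growth rates, so they
# are approximations, not reported numbers.

revenue_q4_2024 = 96.47
net_income_q4_2024 = 26.54

revenue_q4_2023 = revenue_q4_2024 / 1.118        # implied from 11.8% growth
net_income_q4_2023 = net_income_q4_2024 / 1.283  # implied from 28.3% growth

margin_2023 = 100 * net_income_q4_2023 / revenue_q4_2023   # ~24.0 percent
margin_2024 = 100 * net_income_q4_2024 / revenue_q4_2024   # ~27.5 percent

print(f"Q4 2023 net margin (implied): {margin_2023:.1f}%")
print(f"Q4 2024 net margin:           {margin_2024:.1f}%")
```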
The Google Cloud business is much stronger than the overall Google business in terms of revenue growth and increasing profitability, but Wall Street was expecting more.
In the December quarter, Google Cloud had $11.96 billion in sales, up 30.1 percent, and its operating income rose by 2.4X to $2.09 billion. This operating income represented 17.5 percent of Google Cloud revenues, which is similar to the profit margin in Q3 2024 and way better than the anemic profitability Google Cloud had in 2023 and the operating losses it had each quarter for years before that. Say what you will, but Google has had to learn how to make money selling a cloud, which was not an easy thing for a company with a search engine advertising monopoly to learn.
For the full year, Google Cloud had $43.23 billion in sales, up 30.6 percent; operating income was $6.11 billion, up by a factor of 3.6X and representing 14.1 percent of sales. We agree that Google can and should build a bigger – and more profitable – cloud business, but this has been a pretty good start.
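For those who want to check the margin math, here is a minimal sketch in Python using the Google Cloud figures cited above; the helper function and variable names are our own shorthand, not anything from Google’s filings.

```python
# Back-of-the-envelope check on the Google Cloud figures cited above.
# All dollar figures are in billions, taken from the article.

def margin(operating_income, revenue):
    """Operating income as a percentage of revenue."""
    return 100.0 * operating_income / revenue

# Q4 2024: $11.96 billion in sales, $2.09 billion in operating income
q4_margin = margin(2.09, 11.96)        # ~17.5 percent

# Full year 2024: $43.23 billion in sales, $6.11 billion in operating income
fy_margin = margin(6.11, 43.23)        # ~14.1 percent

# Full-year revenue was up 30.6 percent, which implies the 2023 base
fy_2023_revenue = 43.23 / 1.306        # ~$33.1 billion

print(f"Q4 2024 operating margin: {q4_margin:.1f}%")
print(f"FY 2024 operating margin: {fy_margin:.1f}%")
print(f"Implied FY 2023 revenue:  ${fy_2023_revenue:.2f} billion")
```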
Part of the problem is capacity, and Google is going to work on that, says Sundar Pichai, Google chief executive officer.
“In 2024, we broke ground on eleven new cloud regions and datacenter campuses in places like South Carolina, Indiana, Missouri, and around the world,” Pichai explained. “We also announced plans for seven new subsea cable projects, strengthening global connectivity. Our leading infrastructure is also among the world’s most efficient. Google datacenters deliver nearly 4X more computing power per unit of electricity compared to just five years ago. These efficiencies, coupled with the scalability, cost, and performance we offer, are why organizations increasingly choose Google Cloud’s platform. In fact, today, cloud customers consume more than 8X the compute capacity for training and inferencing compared to 18 months ago.”
Last year, Google spent $52.5 billion in capital expenses, which was mostly for datacenters and IT gear, and most of that IT gear was for GPU and TPU systems. This was a 1.63X increase over the $32.3 billion in capital expenses spent in 2023, the first full year of the GenAI boom, and above the baseline of $24.6 billion in capex shelled out by Google in 2021, which had some AI components to be sure but which was before the GenAI boom that exploded on the scene in late 2022.
For 2025, Google is expecting to spend $16 billion to $18 billion in the first quarter alone on datacenters and gear, and will dole out around $75 billion for capital expenses in the full 2025 year. That is a 42.9 percent increase in the capital budget against what probably will not be the same growth in cloud revenues – at least not immediately. But you have to remember that these investments in infrastructure are long term. You spend $1 building the datacenter and $1 building the AI systems, and you get something on the order of $10 back over the course of four years and $22 over the course of a decade. That is an 11X return on investment, and that is why so many GPU-powered “neoclouds” are popping up like mushrooms in the forest in the spring, fueled by private equity money. (See The Gigabucks Going Into Datacenter Gigawatts for more on the enormous amounts of money flowing here.)
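To make the capex arithmetic concrete, here is a minimal sketch using the spending figures above and the illustrative $2-in, $22-back-over-a-decade payback described in the text; those payback numbers are our illustration of the economics, not Google guidance.

```python
# Sketch of the capex growth and return-on-investment arithmetic above.
# Dollar figures are in billions; the payback assumptions ($10 back over
# four years, $22 over a decade on $2 spent) are the illustrative numbers
# from the text, not anything Google has disclosed.

capex_2023 = 32.3
capex_2024 = 52.5
capex_2025 = 75.0   # planned

growth_2024 = capex_2024 / capex_2023              # ~1.63X
growth_2025 = (capex_2025 / capex_2024 - 1) * 100  # ~42.9 percent

spend = 1.0 + 1.0           # $1 for the datacenter, $1 for the AI systems
back_four_years = 10.0      # rough return over four years
back_decade = 22.0          # rough return over ten years
roi_decade = back_decade / spend   # 11X over a decade

print(f"2023 -> 2024 capex growth: {growth_2024:.2f}X")
print(f"2024 -> 2025 capex growth: {growth_2025:.1f}%")
print(f"Illustrative ten-year ROI: {roi_decade:.0f}X")
```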
Google is using AI to automate parts of its own business, and we would love to learn more about this. It is debatable how well it is commercializing the Gemini foundation models in the software and services that it offers. We are skeptical about many of these tools and prefer to do our own writing and analysis, thank you very little.
But we do think it is important not to confuse skepticism about AI applications with exuberance over the capacity to run AI training and AI inference. Demand is still off the charts for the latter despite the tepid results of the former. If that gap persists, Google will just keep iron in the field longer to serve the AI compute demands of Google Cloud customers, milking that iron and perhaps doing so more profitably than if it refreshed its massive GPU and TPU fleet with new capacity every year.
We think Wall Street is over-reacting to the cloud business at Google, which is growing faster than the cloud at large, and at the same time is not being skeptical enough about AI in general. It remains to be seen how useful AI will really be, but with so many of the brightest minds on Earth trying to make this work, it is hard to bet against it – and easier to bet on the sassy, crazy apes that are actually inventing usable and effective AI.