Profiting From The GenAI Boom Is Tough Even If It Democratizes
Wall Street might have unreasonable expectations about how OEMs and ODMs can profit from the GenAI boom by selling GPU-laden systems. …
Rated horsepower for a compute engine is an interesting intellectual exercise, but it is where the rubber hits the road that really matters. …
Hardware is always the star of Nvidia’s GPU Technology Conference, and this year we got previews of the “Blackwell” datacenter GPUs, the cornerstone of a 2025 platform that includes “Grace” CPUs, the NVLink Switch 5 chip, the BlueField-3 DPU, and other components, all of which Nvidia is talking about again this week at the Hot Chips 2024 conference. …
Like many other suppliers of hardware and systems software, Cisco Systems is trying to figure out how to make money on the AI revolution. …
Intel’s second quarter was pretty much a carbon copy of the first three months of 2024 when it comes to revenues across its newly constituted groups, but with an operating loss twice as big. …
With all of the hyperscalers and major cloud builders designing their own CPUs and AI accelerators, the heat is on those who sell compute engines to these companies. …
For Mark Zuckerberg, the decision by Meta Platforms – going back to when it was still known as Facebook – to open up much of its technology, including server and storage designs, datacenter designs, and most recently its Llama AI large language models, came about because the company often found itself trailing competitors when it came to deploying advanced technologies. …
A scant three months ago, when Meta Platforms released the Llama 3 AI model in 8B and 70B versions, which denote the billions of parameters in each model, we asked the question we have asked of every open source tool or platform since the dawn of Linux: Who’s going to profit from it and how are they going to do it? …
Everybody knows that companies, particularly hyperscalers and cloud builders but now increasingly enterprises hoping to leverage generative AI, are spending giant round bales of money on AI accelerators and related chips to create AI training and inference clusters. …
When you have to compile the revenue streams of thousands of major IT hardware, software, and services suppliers selling into the datacenter, it takes a bit of time to get that data, get it right, and then aggregate it for a datacenter market analysis. …