
Picking Apart AMD’s AI Accelerator Forecasts For Fun And Budgets
Two endpoints and the compound annual growth rate between them over a specific amount of time are not as useful as they seem. …
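To make that concrete with a quick sketch (the dollar figures below are invented for illustration and are not AMD's forecast numbers), the same pair of endpoints and the same compound annual growth rate can describe very different spending paths in between:

```python
# Invented figures for illustration only; not AMD's forecast numbers.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1.0 / years) - 1.0

start, end, years = 45.0, 500.0, 4   # hypothetical market size, $ billions
rate = cagr(start, end, years)
print(f"CAGR: {rate:.1%}")           # roughly 82.6 percent per year

# Two paths with the same endpoints, and therefore the same CAGR,
# but very different budgets along the way.
smooth = [round(start * (1 + rate) ** n, 1) for n in range(years + 1)]
lumpy = [45.0, 60.0, 90.0, 200.0, 500.0]
print(smooth)   # [45.0, 82.2, 150.0, 273.9, 500.0]
print(lumpy)
```

The endpoint-to-endpoint math is identical for both paths, which is the sense in which a single growth figure is less useful than it seems.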
There is a bit of AI spending one-upmanship going on among the hyperscalers and cloud builders – and now the foundation model builders, who are partnering with their new sugar daddies so they can afford to build vast AI accelerator estates to push the state of the art in model capabilities and intelligence. …
It is hard to bet against the GenAI boom, and thus far it is also hard for anyone other than Nvidia to profit from it. …
Every time Lisa Su, chief executive officer at AMD, announces a new Instinct GPU accelerator, the addressable market for AI acceleration in the datacenter seems to expand. …
With all of the hyperscalers and major cloud builders designing their own CPUs and AI accelerators, the heat is on those who sell compute engines to these companies. …
If high bandwidth memory had been widely available and we had had cheap and reliable fusion power, there never would have been a move to use GPUs and other compute engines as vector and matrix math offload engines. …
Everyone is in a big hurry to get the latest and greatest GPU accelerators to build generative AI platforms. …
The world has gone nuts for generative AI, and it is going to get a whole lot crazier. …
For very sound technical and economic reasons, processors of all kinds have been overprovisioned on compute and underprovisioned on memory bandwidth – and sometimes memory capacity, depending on the device and the workload – for decades. …
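As a rough way to see that imbalance (round, illustrative numbers below, not quoted from any particular datasheet), compare how many bytes of memory traffic a high-end accelerator can supply per unit of peak math throughput:

```python
# Round, illustrative numbers; not taken from any vendor datasheet.
peak_flops = 1.0e15          # ~1,000 teraflops of dense low-precision math
memory_bandwidth = 3.3e12    # ~3.3 TB/s of HBM bandwidth

bytes_per_flop = memory_bandwidth / peak_flops
print(f"{bytes_per_flop:.4f} bytes of memory traffic per flop")  # 0.0033

# Any kernel that needs more data movement than that per operation is
# bandwidth bound, with the math units idling while they wait on memory.
```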
It would be hard to find something that is growing faster than the Nvidia datacenter business, but there is one contender: OpenAI. …