
Getting To Zettascale Without Needing Multiple Nuclear Power Plants
There’s no resting on your laurels in the HPC world, no time to sit back and bask in a hard-won accomplishment that was years in the making. …
Large language models, also known as AI foundation models and part of a broader category of AI transformer models, have been growing at an exponential pace in terms of the number of parameters they encode and the compute and memory bandwidth they require. …
DataStax, the driving force behind the ongoing development and commercialization of the open source NoSQL Apache Cassandra database, had been in business for nine years in 2019 when it made a hard shift to the cloud. …
In many industries, embracing AI in the application software stack is not just a matter of training some large language models or recommender systems against general and then specific datasets and plugging them in. …
The top hyperscalers and clouds are rich enough to build out infrastructure on a global scale and create just about any kind of platform they feel like. …
Predicting the future is hard, even with supercomputers. And perhaps especially when you are talking about predicting the future of supercomputers. …
AI is arguably the most important kind of HPC in the world right now in terms of providing immediate results for immediate problems, particularly for enterprises with lots of data and a desire to make money in a new economy that does not fit the models and forecasts that predate the coronavirus pandemic. …
How expensive and difficult does hyperscale-class AI training have to be for a maker of self-driving electric cars to take a side excursion, spend hundreds of millions of dollars, and create its own AI supercomputer from scratch? …
After its acquisition of ATI in 2006, the maturation of its discrete GPUs with the Instinct line over the past few years, and the acquisitions of Xilinx and Pensando in 2022, AMD is not just a second source of X86 processors. …
For the last few years, Graphcore has primarily been focused on slinging its IPU chips for training and inference systems of varying sizes, but that is changing now as the six-year-old British chip designer is joining the conversation about the convergence of AI and high-performance computing. …