AI

Ampere Computing Buys An AI Inference Performance Leap

Machine learning inference models have been running on X86 server processors from the very beginning of the latest – and by far the most successful – AI revolution, and the techies who know both hardware and software down to the minutest detail at the hyperscalers, cloud builders, and semiconductor manufacturers have been able to tune the software, jack the hardware, and retune for more than a decade.

Store

Object Storage Makes A Push Into HPC

Four years ago, Cloudian was a six-year-old startup in an object storage space that, while the technology had been around for more than a decade, was seeing a surge of interest from cloud providers desperate for a storage architecture that could scale to meet the demands of their rapidly growing datacenters, handle the massive amounts of data being generated, and make it easier to move that data between core on-premises datacenters, multiple cloud environments and, in the coming years, the edge.