
Nvidia Weaves Silicon Photonics Into InfiniBand And Ethernet
When it comes to networking, the rules around here at The Next Platform are simple. …
With the months-long manufacturing blip that delayed the “Blackwell” B100 and B200 generations of GPUs now in the rear view mirror, and with nerves calmer now that the potential threat posed by the techniques used in the AI models of Chinese startup DeepSeek is better understood, Nvidia’s final quarter of fiscal 2025 and its projections for continued sequential growth in fiscal 2026 will bring joy to Wall Street. …
UPDATED Networking giant Cisco Systems and AI platform provider Nvidia have hammered out a deal to mix and match each other’s technologies to create a broader set of AI networking options for their respective and – importantly, prospective – customers. …
Server makers Dell, Hewlett Packard Enterprise, and Lenovo, which are the three largest original manufacturers of systems in the world, ranked in that order, are adding to the spectrum of interconnects they offer to their enterprise customers. …
It was a fortuitous coincidence that Nvidia was already building massively parallel GPU compute engines for HPC simulations and models when the machine learning tipping point arrived, and it was just as fortunate for InfiniBand that, at that same moment, it offered high bandwidth, low latency, and remote direct memory access across GPUs. …