Dell Wants To Help You Build Your AI Factory

No surprises here: Reviewing first quarter earnings calls of S&P 500 companies, London-based analytics firm GlobalData found that generative AI was a key point of discussion among a growing number of public companies.

Business fundamentals analyst Misa Singh said that “companies are looking at GenAI tools for better productivity, increased sales, brand awareness, and an enhanced customer experience. They are investing, collaborating, and leveraging to make use of this new and emerging opportunity.”

And IT companies are driving hard to the hoop to grab what they can of that opportunity. Vendors have made sharp turns with their infrastructure, software, and services offerings to address an explosive AI market fueled by accelerating innovation and adoption since late November 2022, when OpenAI introduced its ChatGPT generative AI chatbot. Tech vendors and cloud providers like Nvidia, Intel, Microsoft, and Arm have built their portfolios and their user and developer conferences around generative AI, and Dell is no exception.

At this week’s Dell Technologies World 2024 show in Las Vegas, the IT giant is unveiling a broad array of AI-focused products, services, and partnerships touching on everything from servers, storage, networking, and AI PCs that stretch from the datacenter and out to the cloud – including Amazon Web Services and Microsoft Azure – to the edge. Like such competitors as Hewlett Packard Enterprise and Cisco Systems, Dell is aiming to give enterprises every incentive to embrace its infrastructure and related offerings as their AI IT platforms.

At last year’s Dell show, the company partnered with Nvidia on Project Helix, an initiative wrapping Nvidia GPU-powered Dell systems and software together to create a program that enterprises can use to create complete on-premises AI environments. Dell AI Factory with Nvidia was announced two months ago at Nvidia’s GTC 2024 conference and Dell this week is expanding the offerings to include new hardware.

“There’s a lot of enterprise interest and curiosity about AI,” Dell founder and chief executive officer Michael Dell said during a Q&A session with journalists and analysts. “The reason for that is pretty simple: they’ve all seen that there’s just an enormous opportunity there to make a difference in the productivity and efficiency and reimagine what they can do, so we came up with this concept of the AI factory. The idea here is that you start with data, because if you don’t have any data you don’t have any AI. And data is something we know something about because data is essentially storage. We do all that. Then you have compute. There are incredible innovations going on in GPUs and NPUs, and all these fantastic partners. Then you’ve got compute, storage, memory, high-bandwidth memory, networking. Put it all together. Then you’ve got some services.”

Central to this is the new PowerEdge XE9680L server, a version of the XE9680, a 6U system that can hold up to eight Nvidia H100 Tensor Core GPUs and which Dell has called its flagship accelerated box for training generative AI models, customizing models, and running large-scale AI workloads. The new server comes in a 4U form factor and can house up to eight of Nvidia’s “Blackwell” B100 and B200 GPUs – powerful AI chips rolled out at GTC in March – bringing a 33 percent increase in GPU-per-node density. The density is helped by direct liquid cooling (DLC) for the system’s CPUs and GPUs.

There will be variants of the XE9680L, including air-cooled designs that can support 64 GPUs in a single rack, while the liquid-cooled version can pack up to 72 Blackwell GPUs per rack. The new server will be available in the second half of the year.

The vendor also launched its own Dell AI Factory that, like the Dell AI Factory with Nvidia, is aimed at helping enterprises more easily build on-premises infrastructure and ease the adoption of AI technologies through repeatable processes. “While everybody’s data is unique, the problems they’re trying to solve and most of the use cases tend to be pretty common across certain industries,” Michael Dell said. “We’re seeing a lot of repeatable things.”

That repeatable nature of the AI factory idea will be important as more enterprises not only adopt generative AI in their businesses, but likely will need more than one such factory in their datacenter environments, according to Jeff Clarke, Dell vice chairman and chief operating officer. There is no set SKU for Dell AI Factories; each can be adapted to the particular needs of an enterprise.

“You can imagine a world where there’s a big factory in the corporate datacenter [and] there are AI factories in factories, in hospitals, running specific workloads, ultimately taking AI to where the data is created,” Clarke said. “Ultimately, AI is going to where the data is being created so we can triage and take advantage of it there to drive real-time insights to help customers extract that value from it.”

Dell is stretching its AI factories across datacenters, clouds, and the edge, and has added the PowerScale F910 all-flash file storage, a dense and high-performing storage system that includes DDR5 memory, PCI-Express 5.0 interconnects, and 24 NVM-Express SSDs in a 2U chassis with up to 1.47 PB of capacity in a single node. There also is Project Lightning, which will bring a parallel file system for unstructured data to the PowerScale portfolio to help organizations accelerate model training for the largest AI workloads.

The PowerSwitch Z9864F-ON is based on Broadcom’s Tomahawk 5 chipset to support 400 Gb/sec and 800 Gb/sec switching, while the PowerEdge XE9680 supports Broadcom’s PCI-Express Gen 5.0 400 Gb/sec Ethernet adapters.

In another nod to the effort to push AI capabilities out to where the data is generated, Dell is rolling out five PCs and workstations in its Inspiron, XPS, and Latitude lines that will include Qualcomm’s latest Snapdragon X-Series Plus NPUs (neural processing units) to complement the CPUs and GPUs they already run. With the latest silicon, some of the systems will be able to run 13 billion parameter models at 45 TOPS, according to Sam Grocott, senior vice president of product marketing at Dell.

The systems also are part of Microsoft’s new Copilot+ PC designs, AI-powered systems that the enterprise software maker introduced this week. Dell is among a number of PC makers – along with HP, Lenovo, Samsung, Acer, and Asus – rolling out these PCs.

Dell’s AI PC strategy includes an expanding partner ecosystem. It includes Hugging Face, the top repository of open large language models and AI code, which now hosts the Dell Enterprise Hub, as well as Meta with its Llama 3 model, and an AI project with Microsoft Azure to accelerate such AI services as speech transcription and translation.
