Dell’s Advice To Enterprises: Buy AI, Don’t Try To Build It

Unsurprisingly, the main topic of conversation at the recent Dell Technologies World 2025 event in Las Vegas was AI, and a central theme that wove through many of the messages we heard there was that adopting the emerging technology is much easier now than it was even a year ago.

During the conference, Dell introduced new capabilities – such as Red Hat OpenShift and Mistral AI integrations – to its year-old Dell AI Factory, an all-in-one infrastructure-plus-software-plus-security-plus-services offering that comes in Nvidia, Intel, and AMD flavors and is designed to let enterprises move quickly and easily into the AI era. There were new servers and storage systems with the compute power and power efficiency necessary to run AI workloads, the Dell Data Platform, and a new Dell Private Cloud made for AI, with a disaggregated datacenter design and the flexibility for organizations to run whatever software – like Broadcom’s VMware, Nutanix, and Red Hat – suits them.

The underlying pieces are being put in place so that enterprises that last year struggled to pull together their own on-premises AI environments can now embrace offerings that are ready to go and bring the flexibility to let them choose what software and data to put into them.

That’s a message that is crucial for companies to understand and take to heart, according to John Roese, Dell’s global chief technology officer and chief AI officer.

“It’s not totally turnkey,” Roese told The Next Platform. “Every AI project eventually is hyper-personalized. But the level of effort to do it, from last year to this year, has just dropped dramatically. These things are almost off-the-shelf, which means that if you were sitting back going, ‘I don’t have enough resources, I don’t have smart people, I don’t know if I can do this,’ that was true a year ago, because you would have had a big engineering team. Today, many of these are tools that you can just buy and implement. We already know how to buy and implement them; maybe you can just consume them from us. The result is, your ability to actually get into production, technically, is significantly easier than it was a year ago in a material way.”

Paving The Path To AI

To be sure, Dell isn’t the only one moving quickly in this direction. Hewlett Packard Enterprise announced enhancements to its own HPE Private Cloud AI – co-developed with Nvidia – and expanded its integrations with, and support for, Nvidia’s Enterprise AI portfolio and Enterprise AI validated design.

Enterprise tech services player Kyndryl in April unveiled a collection of AI private cloud consulting services to help enterprises do everything from identifying industry use cases to pursue, to designing and building AI prototypes, to moving into production environments.

The list goes on, but the trend is tilting heavily toward vendors and service providers taking on the burden of putting the tools in place to get organizations up and running with AI more quickly. Enterprises need to move, Roese said, because their competitors are, and lagging behind isn’t a good option in what he calls year three of the generative AI era.

Year Three Of Generative AI

The first year came with OpenAI’s introduction of ChatGPT in November 2022, which didn’t do much for enterprises but did stimulate them to start thinking about what generative AI could do for them, according to Roese, who was given the chief AI officer position last year.

“Year two, which was last year, was the do-it-yourself year,” he said. “If you wanted to do something, there weren’t enough tools and systems. It wasn’t easy to do, so you had to invest a tremendous amount of personal technical effort into building your own stuff. We did that at Dell, and it was a very significant investment. It was not enjoyable. Because of that, what you saw in year two is very few enterprises actually got into production. They were still doing proofs of concept and kicking the tires. It was very hard to get into production, because even if you knew what to do, you didn’t really have the systems to make it easy. You had tremendous technical boundaries.”

Key to year three is that the tools enterprises need to use AI are increasingly off the shelf, as illustrated by Dell’s AI Factory, which helps the infrastructure challenges that come with building out an AI environment fade away.

“You don’t have to figure out direct liquid cooling and your network topology and your storage,” he said. “We’ve done that and we have lots of choice. It shifted that burden to us, and if you want to stand up an AI cluster, it is actually extremely easy to do with us now because we just know how to do it and it’s been standardized to a point where it’s no longer a do-it-yourself project.”

Coding Assistants And RAG-Based Chatbots

Similarly, the tools that organizations may want to run on the AI Factory also are becoming more readily available, Roese said, using coding assistants for developers as an example. A year ago, enterprises likely had to build them themselves. That’s changed, with an abundance of off-the-shelf coding assistants available, both on premises and in the cloud, he said.

Enterprises should use them. Developers are gravitating quickly to such AI tools, though to varying degrees. In a March report, developer skills company HackerRank found that 97 percent of developers – deep adopters and casual users alike – are using the technology. In addition, market research firm Statista said code creation is the top use of AI by developers, with 82 percent saying they’re doing it now and 9.2 percent saying they’re interested.

“If you have software developers and you don’t have a coding assistant deployed, that is not a hard problem to solve, and you should go do that, because it is almost guaranteed improvement in your productivity,” Roese said. “We see 20, 30, 40 percent improvement on our engineering productivity. The ramp-up times are faster. It is just now a no-brainer and it’s no longer a do-it-yourself project.”

Retrieval-augmented generation (RAG)-based chatbots, which pull proprietary data into AI workflows, are another kind of tool that enterprises can easily adopt. That matters because the use of such corporate and sensitive data – along with the cost of moving data to the cloud and the demand to bring AI to the data rather than the other way around – is driving the push for on-prem AI.

“You have a bunch of proprietary services information [and] today it sits in a bunch of traditional tools,” Roese said. “Nobody knows how to find anything. You take all that information, you vectorize it, you push it through a chatbot, suddenly you have one source of truth and everything changes. Suddenly you can find the answers. You can express your services capability as a chatbot instead of a person. Again, two years ago, you would have had to build it. Today, you can just buy it.”
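To make the “vectorize it, push it through a chatbot” flow concrete, here is a minimal sketch of a RAG pipeline. It is not Dell’s implementation: it uses TF-IDF retrieval from scikit-learn as a stand-in for a production embedding model and vector store, the document snippets are invented for illustration, and the hypothetical generate() call stands in for whatever LLM endpoint an enterprise actually runs.

```python
# Minimal RAG sketch: vectorize a small corpus of internal documents, retrieve
# the passages most similar to a user question, and hand them to a language
# model as grounding context. TF-IDF and the sample documents are stand-ins.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Proprietary services information that would normally sit in "traditional tools".
documents = [
    "To reset a customer gateway, power-cycle the unit and re-run provisioning.",
    "Warranty claims for servers must include the service tag and purchase date.",
    "Firmware updates are staged through the support portal every quarter.",
]

# 1. Vectorize the corpus once, up front.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Return the top_k documents most similar to the question."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, doc_vectors)[0]
    ranked = scores.argsort()[::-1][:top_k]
    return [documents[i] for i in ranked]

def answer(question: str) -> str:
    """Build a grounded prompt; generate() is a placeholder for a real LLM call."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return prompt  # in practice: return generate(prompt)

if __name__ == "__main__":
    print(answer("What do I need to file a warranty claim?"))
```

In a production deployment the TF-IDF step would typically be replaced by a neural embedding model and a vector database, but the shape of the workflow – index the proprietary data, retrieve the relevant pieces, ground the model’s answer in them – stays the same.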

Following The Leaders

The shift from build to buy was one key change this year that should grease the skids for enterprises wanting to get into production with AI. Another is understanding that other enterprises already are in production. Onstage with chief executive officer Michael Dell during his keynote were executives from JPMorgan Chase and Lowe’s, both of whom outlined their companies’ use of AI and Dell tools. Michael Dell and other executives also talked about the vendor’s own approach to adopting the technology, from mapping out why it wanted to use AI (to grow profits, be more productive, and reduce risk), to where to use it (the sales, services, and engineering organizations and the supply chain), to which processes to focus on (giving salespeople more time with customers was one).

Such companies stood as proof points that enterprises are using AI now, which hopefully will reassure the many CIOs who are reluctant to be on the bleeding edge of technology shifts.

“A year ago, two years ago, you might have been the first person to do this,” Roese said. “Today, that is absolutely not true. There are a ton of companies that are standing up and telling their story [so] you’re not at the bleeding edge. … We’ve got all the battle scars, we figured out the problems, you can shamelessly take everything we learned and so not only is it easier to do, but it’s also lower risk because you’re not the first one to do it.”

The Rest Of 2025

All this leads to what Roese expects to happen through the rest of the year. One expectation is that enterprises – armed with these new tools and the knowledge that others are already on their way with AI – will accelerate their own use of the technology to improve their operations, in large part because it’s just easier to do now.

The next seven months also will see a surge in agentic AI, with AI agents able to solve problems with high levels of autonomy and little if any human intervention. They promise to significantly change how businesses operate, though enterprises need to take the first steps into AI before thinking about AI agents.

“They’re real, they’re very valuable, you will eventually deal with them,” Roese said. “But if you haven’t even put the foundation in place to get your chatbots up and running, get your data organized, do that. There’s a lot of value there [and] it’s way easier to do than it was a year ago. Agents are new, but agents are moving incredibly fast. Don’t worry about agents for a while.”
