For years, Oracle has found itself solidly in the second tier of cloud providers, well behind the top three of Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform, which combined account for more than 60 percent of the global cloud infrastructure services market. That said, it’s a market that continues to grow by as much as 25 percent a year, and in the second quarter of this year alone pulled in more than $99 billion in revenue.
Being in the other 40 percent has not been bad for Oracle, which in its most recent quarter pulled in $7.2 billion in revenue from its Oracle Cloud Infrastructure business, a 28 percent year-over-year jump. And as we at The Next Platform have noted, the vendor may be set up well for the rapidly expanding AI era, what with not only its cloud infrastructure but also its huge enterprise software business and an aggressive effort to pour AI into all parts of its operations.
It is a message that Oracle co-founder and chief technology officer Larry Ellison has been delivering for well over a year, and it’s an argument he made again this week during his 90-plus-minute keynote at the Oracle AI World 2025 conference in Las Vegas – a show that was Oracle CloudWorld from 2022 through 2024, Oracle OpenWorld between 1997 and 2021, and an independent Oracle user group event before that.
OCI not only provides the infrastructure for AI operations, it also provides the applications that allow organizations to run their AI and agentic AI workloads, securely bring proprietary data to a broad array of AI models, and link entire ecosystems through AI.
“Oracle Cloud is very unusual. In the simplest sense, Oracle does infrastructure and applications,” Ellison said. “We do scaled enterprise applications and we do scaled AI infrastructure. We’re the only cloud that does that. The other big clouds – Microsoft, Amazon, and Google – really do not do healthcare applications, enterprise applications, big financial applications. They don’t do that. In other words, they may or may not develop AI technology – Google does, the other two don’t – but they’re not building large-scale applications, where they’re trying to automate industries or automate ecosystems using this technology. Our goals are different than those other clouds. We’re a participant in creating AI technology, and we’re also a participant in using that technology to solve problems in different ecosystems, in different industries.”
The dubious claims against AWS and Microsoft aside — AWS and Microsoft obviously make their own AI XPU compute engines as well as custom CPUs, as well as their own AI models and frameworks — Ellison argued OCI’s case at each of these points. He pointed to the massive AI cluster that the company is building for OpenAI in Abilene, Texas, which when completed will include more than 450,000 Nvidia GB200 GPUs spread across eight buildings covering more than 1,000 acres, with the infrastructure in the buildings interconnected so they can support a single workload if needed.

The site will include liquid cooling technologies (like, super-obviously), with electricity coming not only from the public power grid but also from natural gas turbines built on the site. The cluster will add to what OCI already does now, Ellison said.
“We’re training the very first version of Grok, and we’re training a number of other multimodal AI models,” he said, referring to Elon Musk’s xAI business. “Almost all of these AI models are in the Oracle Cloud. We are certainly involved in training more multimodal AI models than any other company. These are enormous engineering projects, each and every one of them. What we’re trying to build are these multimodal neural networks trained on all types of data – textual data, image data, audio, video, every publicly available piece of data, plus synthetic.”
Ellison put a focus on the need to feed the AI beast with data, particularly private and proprietary data. AI models already are trained on vast amounts of publicly available data from the internet, but their real value comes when private and corporate data are added, with Ellison noting that most of the world’s private data already resides in Oracle databases. Oracle’s new AI Data Platform lets organizations run whatever model they want on OCI, with xAI’s Grok, OpenAI’s ChatGPT, Google’s Gemini, and Meta Platforms’ Llama among them.
“The AI Data Platform is what lets you add private data to the models,” he said. “You basically take a bunch of data that the model has not been trained on and you put that information in a database the model can access – you put your private data in an Oracle database. The new Oracle database is called an AI database, and not just because AI is fashionable. It has the ability to take any of the data in the Oracle database and make it accessible to the AI model by vectorizing it. Since a lot of your data is in an Oracle database already, you simply have to ask the Oracle database to put that data in a format the model will understand.”
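Ellison doesn’t spell out the mechanics, but “vectorize the data, then let the model access it” is the familiar retrieval-augmented generation (RAG) pattern: embed documents as vectors, find the ones nearest to a question, and feed them to the model as context. Here is a minimal sketch in plain Python – the `embed` function is a toy bag-of-words stand-in for a real embedding model, and the in-memory list stands in for the vector store in the database he describes:

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for "vectorizing": bag-of-words term counts. A real
    # system would use a learned embedding model producing dense vectors.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Private data" the model was never trained on, indexed with its vectors.
documents = [
    "Q3 revenue for the widget division was 4.2 million dollars",
    "the cafeteria menu on friday is fish and chips",
    "employee badge policy requires renewal every two years",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(question: str, k: int = 1) -> list[str]:
    # Vectorize the question and return the k most similar private
    # documents; these then go into the model's prompt as context.
    q = embed(question)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve("what was q3 revenue for the widget division?"))
# → ['Q3 revenue for the widget division was 4.2 million dollars']
```

The point of doing the retrieval step in the database, as Ellison claims the AI database does, is that the data never has to be exported or retrained into the model – the model only ever sees the handful of rows relevant to the question.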
Ellison talked about using AI to create application suites to tie together entire ecosystems within industries, pointing to the healthcare sector as an example. Not only can AI be used to automate processes and workflows in hospitals, but also everything else having to do with healthcare, from banking and financing to pharmacies and government agencies, and automating the communications between them.
As part of this effort, over a three-year period Oracle used AI to rewrite the entire codebase for its Cerner software, which Oracle inherited in 2022 when it bought the healthcare technology company for more than $28 billion.
He also pointed to Oracle’s new APEX tool, which uses AI to automatically generate code and folds in new trends like vibe coding. It now includes a declarative AI generation language, though it will continue to work for those who still use English when creating code. It feeds into what the company already does.
“A lot of the code that Oracle is writing, Oracle isn’t writing,” Ellison said. “Our AI models are writing it. We just tell the model what we want the program to do and then the AI comes up with a step-by-step process to actually do it. We don’t write the procedure. We declare our intent, but the model writes a step-by-step procedure.”
Ellison sees a not-so-distant future where Oracle can leverage its history in databases and big enterprise applications, its growing strength in the cloud, and its broad embrace of AI both within its business and in its cloud offerings to challenge the big three hyperscalers in a cloud space that is on an AI-fed growth spurt. He sees what the company is rolling out at Oracle AI World as big steps in that journey.