While there are plenty of companies that have built a core competency in IT infrastructure acquisition and management, and think they can provide themselves a competitive edge based on those skills, it looks like there is a growing and potentially much larger cadre of companies who simply do not want to be messing around with basic infrastructure anymore.
This is not the first wave of IT outsourcing we are experiencing, but it could very well end up being the last. Datacenter consolidation and outsourcing of systems or applications was all the rage for a certain class of IT shops in the 1980s and 1990s, and during the dot-com boom lots of companies started out in co-location facilities and never left. With the public cloud, the utility-style pricing and usage models are much finer grained than the outsourcing contracts of days gone by, and capacity can be consumed instantly and turned off just as fast. So it is no surprise that the public cloud has become the dominant driver of spending on servers, storage, and switching in the IT sector, and private clouds that mimic it are also a big component of spending now, too.
“The pace of adoption of cloud-based platforms will not abate for quite some time, resulting in cloud IT infrastructure expansion continuing to outpace the growth of the overall IT infrastructure market for the foreseeable future,” explained Kuba Stolarski, research manager for server, virtualization, and workload research at IDC. “As the market evolves into deploying 3rd Platform solutions and developing next-gen software, organizations of all types and sizes will discover that traditional approaches to IT management will increasingly fall short of the simplicity, flexibility, and extensibility requirements that form the core of cloud solutions.”
The growth rates for public and private cloud spending across the core IT infrastructure stand in marked contrast to other spending in the datacenter, but it is important not to get the wrong impression. Non-cloudy infrastructure still dominates IT spending, at least for the moment. But within the next five years, the balance will tip the other way and cloudy infrastructure will start to dominate. This is nothing radical, actually, but just another example of the long cycles of change we have seen in data processing over the past fifty years. Cloudy infrastructure, with various kinds of provisioning and orchestration for bare metal and virtualized servers, networks, and storage, will simply be the norm in the enterprise by the end of the decade, excepting silos of legacy computing that take decades to retire.
Here’s a table we built from the publicly available IDC data that shows the transition:
We don’t have the overall datacenter IT spending number for 2013 as we go to press, so we cannot see how much faster cloud spending was growing last year compared to the market overall. But IDC did give out its forecasts for 2015 and 2019, and from those you can see that non-cloudy datacenter spending is gradually shrinking over time while cloud IT spending is growing at a pretty healthy clip. Over the span of the forecast period from 2015 through 2019, IDC reckons that public and private cloud spending will grow at a compound annual growth rate of 14 percent. By the end of the forecast period, cloudy IT will represent 45 percent of total datacenter spending.
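For those who want to check the math, compound annual growth rate is straightforward to compute. The sketch below uses an indexed starting value of 100 rather than IDC's actual dollar figures, which are not reproduced here; the 14 percent rate and the 2015-to-2019 window (four annual compounding steps) come from the forecast above.

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values over a number of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Hypothetical index: 2015 cloud IT spending set to 100, grown at the
# forecast 14 percent CAGR over the four annual steps to 2019.
spend_2015 = 100.0
spend_2019 = spend_2015 * (1.14 ** 4)  # roughly 168.9 on the same index

print(round(cagr(spend_2015, spend_2019, 4), 4))  # 0.14
```

The takeaway is that a 14 percent compound rate sustained over four years multiplies spending by roughly 1.69 times, which is how a minority share of datacenter spending can climb toward 45 percent of the total by 2019 while the non-cloudy side stays flat or shrinks.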
In an interesting aside, spending on cloudy IT is forecast to grow fastest in Western Europe, at 32 percent this year, followed by Latin America with 23 percent growth, Japan with 22 percent growth, and the United States with 21 percent growth. Companies in the United States have been on the front end of the cloud transformation, so it is no surprise that growth is higher in regions where virtualized and orchestrated infrastructure has been less prevalent.
The wonder is why there is so much infrastructure being sold that is not somehow pinned to clouds. For one thing, HPC centers do not run virtualized infrastructure, and plenty of data analytics workloads and parallel applications run on bare metal. Moreover, even if mainframes and big Unix machines are virtualized, they do not meet the strict definition of being cloudy infrastructure. But we could make the case either way if we wanted to. Over time, it is hard to imagine that all workloads won’t be running in some kind of container or virtual machine – and very likely a mixture of the two for lots of reasons. Moore’s Law won’t run out of gas between now and 2020 – after that is a bit of a problem, perhaps – so systems will have extra capacity to run layers of virtualization as the system architects dictate without adversely impacting application performance too much. If anything, IDC’s forecast might be too optimistic about growth rates for non-cloudy IT spending in the datacenter. We’ll know by 2020 for sure.