Dealing With Density In The Datacenter And Beyond

It wasn’t so long ago that only supercomputing centers had to resort to fancy cooling technology to keep their systems running smoothly and at peak performance. But over the last decade, computational densities have soared, with 300 watt GPUs and 200 watt CPUs working their way into HPC, cloud, and enterprise datacenters. The advent of machine learning, with its reliance on GPU-laden computing gear, has accelerated this process.

There is no shortage of solutions out there to keep all this gear from overheating, from rack door chilling, to direct liquid cooling of server chips and other components on the motherboard, to the more exotic liquid immersion method. Cabinet-based cooling is another option, but not one you hear very much about. Encasing a server rack in a cabinet to cool it more efficiently seems somewhat counterintuitive, since the extra layer of metal would tend to trap heat. But it also enables the creation of a climate-controlled space, along with a number of other advantages, which we will get to in a moment.

One company specializing in such solutions is ScaleMatrix, which has developed something called Dynamic Density Control (DDC), a technology that uses a hybrid air-water cooling system for controlling the temperatures of servers housed in a 45U cabinet. Chris Orlando, co-founder and chief executive officer of ScaleMatrix, says it was designed to “let customers deploy any hardware, any manufacturer, at any density, and put it just about anywhere.”

The product came to market in an unusual way. When ScaleMatrix launched in 2011, it was strictly a colocation and cloud management business, providing hosting and other kinds of cloud services for its customers. The company developed the DDC technology for its own colo datacenters, which were aimed at customers with high density server racks but without the facilities to host them. According to Orlando, the technology exceeded expectations, and in 2018 ScaleMatrix decided to productize it and sell it to customers for on-premises use. That business now resides in DDC Cabinet Technology, a subsidiary that ScaleMatrix has since spun off.

Essentially, DDC is an enclosed air-cooled system that circulates a small amount of water to draw heat away from the surrounding air. The water temperature can be anywhere from the low 40s to the mid-60s Celsius. Some customers are running it off of warm return water from other systems, but it can also operate in a closed loop without an external supply. The air is blown through the front of the cabinet in a pressurized manner, which provides even operating temperatures throughout the rack, usually within a 2 degree Celsius delta.
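To get a feel for why a relatively small amount of water suffices, here is a back-of-the-envelope sketch using the standard sensible-heat relation; the 50 kilowatt load and 10 degree water temperature rise are our own assumptions for illustration, not ScaleMatrix figures:

```python
# Back-of-the-envelope check (our assumptions, not ScaleMatrix's figures):
# how much water flow it takes to carry a full cabinet's heat load, using
# the standard sensible-heat relation Q = m_dot * c_p * delta_T.

HEAT_LOAD_W = 50_000          # assumed fully loaded cabinet: 50 kW
WATER_CP_J_PER_KG_K = 4186    # specific heat of water
WATER_RISE_K = 10.0           # assumed temperature rise of the water across the coil

mass_flow_kg_s = HEAT_LOAD_W / (WATER_CP_J_PER_KG_K * WATER_RISE_K)
liters_per_minute = mass_flow_kg_s * 60   # water is roughly 1 kg per liter

print(f"{mass_flow_kg_s:.2f} kg/s, about {liters_per_minute:.0f} liters per minute")
# Roughly 1.2 kg/s (about 72 L/min) carries away 50 kW -- a garden-hose-scale flow.
```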

The system is dynamic because it continuously monitors the environment in the cabinet and can vary the cooling capacity in real time based on the heat load from the hardware, the temperature of the circulating water, and the customer’s settings (that is, the thermal set point and the air flow rate). Orlando says that whether you are running 2 kilowatts or 50 kilowatts per cabinet, you get the same energy efficiency, which according to him is about 30 percent better than a typical air-cooled setup. The whole thing is automated, so no manual fiddling is required by mere mortals.
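The details of DDC’s control software are not public, but the feedback loop Orlando describes can be sketched in a few lines; every interface, threshold, and constant below is hypothetical and purely illustrative:

```python
# Hypothetical sketch of the kind of closed-loop control described above.
# The sensor readings, thresholds, and gain are invented for illustration;
# this is not DDC's actual control software.
from dataclasses import dataclass

@dataclass
class Settings:
    set_point_c: float = 24.0    # customer-chosen thermal set point
    gain: float = 0.05           # proportional gain for fan adjustment
    warm_water_c: float = 45.0   # assumed threshold for "warm" supply water

def control_step(inlet_temps_c, water_supply_c, fan_speed, s):
    """Return an updated fan speed (0.1 to 1.0) from the current cabinet readings."""
    error = max(inlet_temps_c) - s.set_point_c   # positive means too warm
    fan_speed += s.gain * error
    # Warmer supply water removes less heat per pass, so lean harder on airflow.
    if water_supply_c > s.warm_water_c:
        fan_speed += 0.05
    return min(1.0, max(0.1, fan_speed))

# Example: two polling cycles with stand-in sensor data.
fan, s = 0.5, Settings()
for readings in ([23.5, 24.1, 26.2], [23.8, 24.0, 24.6]):
    fan = control_step(readings, water_supply_c=44.0, fan_speed=fan, s=s)
    print(f"fan speed -> {fan:.2f}")
```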

Some of the energy efficiency can be attributed to the fact that the system only has to deal with the small amount of space inside the cabinet, rather than an entire machine room. In essence, ScaleMatrix has shrink-wrapped the datacenter. “We only worry about cooling the front edge of the cabinet – a few cubic feet of air – but we do it extremely precisely,” Orlando explained.

The ability to adapt to different heat loads also means DDC is able to operate across a range of application environments. Orlando told us that it has enterprise clients at 15 kilowatts to 25 kilowatts per rack, customers running GPU-heavy AI workloads at 15 kilowatts and sometimes 20 kilowatts or 25 kilowatts, and more traditional HPC users at 25 kilowatts to 50 kilowatts. The technology is currently able to handle anything up to 52 kilowatts in a standard rack, which probably covers more than 95 percent of all installations out there. The Summit supercomputer, for comparison, can burn 59 kilowatts per rack, but that’s exceptional; in that system, each node is outfitted with two IBM Power9 processors and six Nvidia Tesla V100 GPU accelerators.

Besides the efficient cooling, the cabinet also protects the rack equipment from dust, which can build up inside servers over time and degrade their operation. Orlando says they’ve opened up cabinets after five years and seen no build-up at all. Some of ScaleMatrix’s colo customers have told them that they’re experiencing fewer hardware failures, which can probably be attributed to both the even temperature control and the clean-room environment. Orlando thinks DDC is probably adding another year or so to the lifetime of the equipment.

Another advantage of the enclosed design is that it becomes much easier to limit access to the hardware, either for security reasons or to enforce data compliance regulations. That makes it particularly attractive to customers, such as those in the defense and healthcare industries, who have strict controls on who can and cannot see their data.

Business has been brisk. In its first year of production, ScaleMatrix racked up $2 million in DDC revenue. And according to Orlando, it now has five to eight times that amount in its sales pipeline. At that rate, it may soon reach parity with the company’s datacenter hosting and cloud service business, which is delivering about $20 million per year in revenue. That side of the company is growing as well, said Orlando, thanks primarily to the increase in deployments of AI infrastructure. “We see density in the datacenter growing fairly significantly now,” he noted. “It was slow for a number of years.”

Last year ScaleMatrix had only two datacenters, with an aggregate capacity of 2 megawatts of power. It now has five (San Diego, Seattle, Dallas, Jacksonville, and Charlotte), totaling 57 megawatts. The new Charlotte, North Carolina facility was initially anchored by a very large AI company that approached ScaleMatrix because it had an immediate need for 1 megawatt of computing that had to be up and running in 30 days. Orlando says all the large cloud computing companies competed for the business but couldn’t supply the latest GPU hardware in time. So the customer built the racks itself and co-located with ScaleMatrix.

“We had 30 cabinets, running above 30 kilowatts apiece, deployed in just under two weeks,” Orlando said. “That was one of the fastest deployments we’ve ever done. We just threw a lot of bodies at it.”

The company chalked up another significant win with HPE’s Center of Excellence, which will deliver HPC as a service using the DDC cabinets to house Apollo server racks, some of which are drawing more than 50 kilowatts of power.

ScaleMatrix has identified edge computing as another growth area for selling containerized products. To move into that space, the company has purchased Instant Data Centers (IDC), a provider of ruggedized and mobile cabinets and micro-cabinets for edge environments. Although this market is still nascent, CB Insights has it pegged as a $6.7 billion opportunity by 2022.

Unlike the DDC platform, IDC’s cabinets can be set up outdoors and in other challenging environments. Current deployments range from a South American mine 9,000 feet underground to an outdoor installation in Colorado that is exposed to snow and rain. Other locales include airport terminals, military command centers, autonomous vehicle test centers, and hospital wards. There are a handful of different form factors (ranging from 12U to 42U) and cooling systems depending on the circumstances, including one that is mounted on a motorized cart.

The company currently operates only in the United States, but is planning to take its hosting service and cabinet product businesses global, using both direct and partner channels.
