Paid Feature
Datacenters are notorious consumers of power, but there is a revolution brewing.
The most progressive centers, along with their systems and facilities partners, are pushing past carbon neutrality to become carbon negative, a state in which more CO2 is removed than released, which in turn allows reinvestment in even more green datacenter innovation.
Even if that sounds far-fetched given the wattages involved, there are some surprisingly simple first steps down the path to carbon neutrality, and the carbon and consumption savings they deliver can then roll into broader green computing efforts.
Nearly every datacenter today draws at least part of its energy from fossil fuel sources. Considering that most datacenters consume more than 50X the power per square foot of a typical office, and that power draw is growing 12% each year, now is the moment to start rethinking operations: everything from how facilities are built and cooled to how individual servers contribute to the carbon dioxide (CO2) load.
What this means is that the vast majority of datacenters are carbon positive, reliant on fossil fuels and wedded to less efficient power and cooling technologies. Walk into almost any datacenter facility and, beyond the noise of inefficient fans and facility-scale air cooling units, the heat is pervasive. In most facilities every watt ends up circulating in the room, with still more wattage spent trying to reject or dispose of it.
But there is a better way: concentrate that heat and turn it into a zero-emission energy source that can be recycled for other uses, while putting the energy vampires (air handling units, massive fans, and the like) out of work.
If you have not yet set a baseline for energy consumption in your facility, consider the following, based on a typical 1U server. With all the standard elements (DIMMs, CPUs, and so on), the draw is around 700 watts, with a fair amount of overhead going to this single server’s fans. While those fans keep components from overheating, they account for about 10% of the power draw.
Take all of this to rack scale and the consumption problems mount: in a 25 kilowatt rack, 2.2 kW goes to fans alone. Zoom out further to the facility level and the impact of inefficient IT operations is even more profound, because the datacenter as a whole has to deal with that heat by expelling or cooling it.
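To make that overhead concrete, here is a minimal back-of-the-envelope sketch; the 700 watt server, roughly 10% fan share, and 25 kW rack with 2.2 kW of fan draw are the example figures cited above, not measurements from any specific system.

```python
# Rough fan-overhead arithmetic for the example above.
# All constants are the illustrative figures cited in the text.

SERVER_WATTS = 700            # typical fully populated 1U server
SERVER_FAN_FRACTION = 0.10    # about 10% of the draw goes to the server's own fans
RACK_WATTS = 25_000           # example rack
RACK_FAN_WATTS = 2_200        # fan share of that rack, per the example

server_fan_watts = SERVER_WATTS * SERVER_FAN_FRACTION
rack_fan_share = RACK_FAN_WATTS / RACK_WATTS

print(f"Single server: ~{server_fan_watts:.0f} W of {SERVER_WATTS} W goes to fans")
print(f"Rack: {RACK_FAN_WATTS / 1000:.1f} kW of {RACK_WATTS / 1000:.0f} kW "
      f"({rack_fan_share:.0%}) goes to fans")
```

None of this yet accounts for facility-level cooling, which is exactly the point: the same arithmetic repeats at a larger scale once the building has to remove the heat those racks produce.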
The point is this: the time is now to rethink everything from power source inputs to datacenter designs. That process begins with establishing a baseline of the full scope of power consumption and the resulting CO2 emissions. That foundation paves the way for an efficiency improvement plan in which every watt of power saved in IT operations is directly linked to a CO2 reduction.
“Fewer watts consumed means less electricity generated, which means less CO2. If we can drive significant efficiency gains, the next step is to take some of those power cost savings and reinvest them into green power or carbon offset credits to further reduce the remaining energy consumption,” says Scott Tease, Vice President and General Manager of Lenovo’s HPC and AI business. “These double benefits accelerate the value of each watt of power we save.”
One of the reasons CO2 reductions have been difficult for companies to tackle is that they require change, but also that carbon emissions do not (at least from the outside) carry an obvious, immediate business benefit or a rapid ROI. With a baseline of consumption established, however, the monetary value of these shifts becomes immediately clear: working toward carbon neutrality or better has a clear financial benefit.
Take that baseline consumption number, multiply it by the number of hours per year a system runs, and put that in the context of what the power company charges; for this example we will use 10 cents per kilowatt-hour. A CO2 number can be arrived at in the same way by determining the power source (coal is highest, natural gas a bit lower, hydro or solar almost zero). Using power combined from all sources, this example emits 0.92 pounds of CO2 per kilowatt-hour.
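As a minimal worked sketch, the annual figures for the 700 watt baseline server above fall out directly; the 10 cents per kilowatt-hour rate and 0.92 pounds of CO2 per kilowatt-hour come from this example, while around-the-clock operation (8,760 hours per year) is an assumption made for illustration.

```python
# Annual energy, cost, and CO2 estimate for the 700 W baseline server.
# Rate and emissions factor are the example figures from the text;
# continuous, year-round operation is an assumption.

SERVER_WATTS = 700
HOURS_PER_YEAR = 24 * 365        # assume the server runs around the clock
PRICE_PER_KWH = 0.10             # dollars per kilowatt-hour
LBS_CO2_PER_KWH = 0.92           # blended figure across power sources

kwh_per_year = SERVER_WATTS / 1000 * HOURS_PER_YEAR
annual_cost = kwh_per_year * PRICE_PER_KWH
annual_co2_lbs = kwh_per_year * LBS_CO2_PER_KWH

print(f"Energy: {kwh_per_year:,.0f} kWh per year")
print(f"Cost:   ${annual_cost:,.0f} per year")
print(f"CO2:    {annual_co2_lbs:,.0f} lbs (~{annual_co2_lbs / 2204.6:.1f} metric tons) per year")
```

For a single server running nonstop, that works out to roughly 6,100 kWh, around $600 in electricity, and a little over 2.5 metric tons of CO2 per year.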
Around one-third of that spend is just for cooling and fans. “The monetary side is easy to understand, but what a ton of CO2 actually means is a bit more nuanced,” Tease explains. “These molecules have mass. The amount emitted just for the operation of fans is the same weight as two elephants, and at the datacenter cooling level, another eight elephants. That is ten elephants’ worth of CO2 required to run these systems. Overall, this is the carbon emissions equivalent of ten cars driving 110,000 miles per year. It is significant.”
We have spelled out the problem and some of the benefits, but what does a truly efficient, carbon negative datacenter look like in practice? Look no further than the Massachusetts Green HPC Center, a joint venture between five universities with efficient supercomputing and sustainability at the core of its mission.
The center’s executive director, John Goodhue, and his team created a 15 MW datacenter that leverages inexpensive hydroelectric power and worked with the municipal electric company to integrate other renewables. They established an early baseline of power consumption and looked for the areas where the most reduction would be needed (in their case, summer cooling), but were able to find free cooling 70% of the year.
The liquid cooling piece is worth mentioning in particular. Beyond its focus on helping customers build truly efficient, sustainable systems, Lenovo provides technology platforms such as its Neptune™ liquid cooling innovations as the foundation of those systems.
“To go from carbon neutral to carbon negative we need two things,” Tease says. “First, start with green power in. After all, zero-emission electrical sources mean no emissions, and through efficiency gains and a move to CO2 offsets, that can be possible for many. Second, we need to rethink the heat coming out. Today it is treated like waste, but imagine what is possible when we can harness and concentrate it to be recycled. The key to this is deploying a platform like Neptune.”
Lenovo Neptune™ is a warm water cooling technology that allows heat from the datacenter to be recycled. Unchilled water flows in to cool the system, passes through the server, and comes out hotter; that captured heat can have immense value when integrated into building infrastructure, or even put to something creative, like heating a swimming pool.
Using carbon neutral energy sources and green, recycled energy can push operations closer to carbon negative supercomputing. The resulting savings can be rolled back into a host of other green technology innovations, or even into research on next-generation materials and approaches for reducing carbon emissions in other areas.
Lenovo’s insights on the future of sustainable supercomputing might sound lofty, but the company practices what it preaches, having cut its carbon emissions by 92% since 2010, with a goal of cutting an additional 50% by 2030. Lenovo has also been recognized by the EPA with a ranking in the Top 30 for green energy innovation.
Sponsored by Lenovo