
Climate Simulation Screams On The Frontier Exascale Supercomputer

We will always complain about the weather. It is part of the human condition. But someday, perhaps in five years but hopefully in well under ten years, thanks to parallel advancements in HPC simulation and AI training and inference, we will have no cause to complain about the weather forecast because it will be continuous, hyperlocal, and absolutely accurate.

That day, when it comes, will be the culmination of one of the most important endeavors that human beings have ever engaged in across the related fields of weather forecasting and climate modeling. And without a doubt, the world will be a better place when we can more accurately predict what is going to happen with our immediate weather and our long-term climate, from the scale of individuals all the way out to the scale of the entire planet and even to the edges of the space through which it moves.

Whether or not we can change what is already happening, much less reverse it, is a subject for intense debate, but it is clear to everyone that we need to know what is going to happen with the weather and the climate on short and long timescales so we can manage our lives and our world.

We are at a unique time in the history of weather forecasting and climate modeling in that new technologies and techniques are coming together at just the moment when we can make the exascale-class supercomputers that are necessary to employ them. And we can get those exascale-class supercomputers at a reasonable price – reasonable given the immense performance they contain, that is – and at the exact moment that we need them given the accelerating pace of climate change and the adverse effects this is having on all of the biological, economic, and political systems on Earth.

Doing so requires breaking through the 1 km grid resolution barrier in weather and climate models, which is a substantial shrink in grid spacing compared to what is used in the field on a global scale today. For instance, the Global Forecast System (GFS) at the US National Oceanic and Atmospheric Administration (NOAA) has a 28 km grid resolution, and the European Centre for Medium-Range Weather Forecasts (ECMWF) global forecasting simulator has a 20 km resolution. You can run at higher resolutions with these and other tools – something around 10 km resolution is what a lot of our national and local forecasts run at – but you can only cover a proportionately smaller part of the Earth in the same amount of computational time. And if you want to get a quick forecast, you need to tighten the coverage area in quite a bit or have a lot more computing power than a lot of weather centers have.
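To get a feel for why every step down in grid spacing is so costly, here is a rough back-of-the-envelope sketch – our own illustration, not the method any of these forecasting centers actually use – of how the horizontal cell count of a global model balloons as the grid gets finer:

```python
# Rough illustration (our own, not any forecasting center's method) of how the
# horizontal cell count of a global model grows as grid spacing shrinks.
# Assumes a simple uniform grid over the Earth's ~510 million km^2 of surface.

EARTH_SURFACE_KM2 = 5.1e8

def horizontal_cells(grid_spacing_km: float) -> float:
    """Approximate number of horizontal grid cells at a given spacing."""
    return EARTH_SURFACE_KM2 / (grid_spacing_km ** 2)

for spacing in (28, 20, 10, 3.25, 1):
    print(f"{spacing:>5} km grid: ~{horizontal_cells(spacing):,.0f} cells")

# Output (roughly):
#    28 km grid: ~650,510 cells
#    20 km grid: ~1,275,000 cells
#    10 km grid: ~5,100,000 cells
#  3.25 km grid: ~48,284,024 cells
#     1 km grid: ~510,000,000 cells
# And that is only the horizontal dimension; finer grids also force shorter
# timesteps and more vertical levels, which compounds the cost further.
```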

Weather forecasting was done, somewhat crudely as you might expect, on computers powered by vacuum tubes back in the 1950s, but with proper systems in the late 1960s, weather models could run at 200 km resolutions, which meant you could simulate hurricanes and cyclones a bit. It took five decades of Moore’s Law improvements in hardware and better algorithms in the software to drive that down to 20 km, where you can simulate storms and eddies pretty faithfully. But even at these resolutions, features in the air and water systems – most importantly clouds – exist at much finer scales than the 10 km grids we have today can resolve. Which is why weather and climate forecasters want to push down below 1 km resolution. Thomas Schulthess of ETH Zurich and Bjorn Stevens of the Max Planck Institute for Meteorology explained it well in this chart:

This sentence from the chart above explains why exascale computers like the “Frontier” supercomputer installed at Oak Ridge National Laboratory, the world’s first bona fide exascale-class machine as gauged by 64-bit floating point processing, are so important: “We move from crude parametric presentations to an explicit, physics based, description of essential processes.”

It is with all of this in mind that we ponder the work being done at Sandia National Laboratories on a variant of the Energy Exascale Earth System Model, also known as E3SM, that pushed the grid size of a full-scale Earth climate model down to 3.25 kilometer resolution. That variant, known as the Simple Cloud-Resolving E3SM Atmosphere Model, or SCREAM, just looks at clouds, and in a ten day window of access to more than 9,000 nodes on the Frontier system, the model was able to run at greater than one simulated year per wall clock day, or SYPD in weather-speak.
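For readers not steeped in climate modeling benchmarks, SYPD is just a throughput ratio – how much model time you chew through per day of wall clock time. Here is a tiny sketch with made-up numbers, purely to illustrate the metric, not figures from the E3SM runs:

```python
# Simulated Years Per wall clock Day (SYPD) is a throughput ratio: how much
# model time elapses per day of real time. The numbers below are hypothetical,
# chosen only to illustrate the metric, not results from the E3SM runs.

def sypd(simulated_days: float, wallclock_hours: float) -> float:
    """Simulated years per wall clock day."""
    simulated_years = simulated_days / 365.0
    wallclock_days = wallclock_hours / 24.0
    return simulated_years / wallclock_days

# A hypothetical run that chews through 400 simulated days in 24 wall clock hours:
print(f"{sypd(400, 24):.2f} SYPD")   # ~1.10 SYPD, i.e. more than a model year per real day
```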

In the image above, the simulation of the Earth’s clouds shows a tropical cyclone off the coast of Australia and then zooms in on a section of that cyclone, showing a vertical cross-section of where ice and liquid water are swirling around.

“Traditional Earth system models struggle to represent clouds accurately,” Mark Taylor, chief computational scientist of the E3SM project, said in a statement accompanying the performance results. “This is because they cannot simulate the small overturning circulation in the atmosphere responsible for cloud formation and instead rely on complex approximations of these processes. This next-generation program has the potential to substantially reduce major systematic errors in precipitation found in current models because of its more realistic and explicit treatment of convective storms and the atmospheric motions responsible for cloud formation.”

The E3SM project spans eight of the labs that are part of the Department of Energy’s Office of Science, and it develops advanced climate models not just for direct use in climate research, but also for national security reasons.

This is a first step in pushing down the model resolution, but we do shudder to think of what it might take to add in ocean models and to push the grid resolution down below 1 km. Adding other simulation layers and driving that resolution down by more than 3X is going to take a lot more oomph.

When we talked to NOAA, which runs the National Weather Service, last June about its pair of 12.1 petaflops weather forecasting systems, which run at 13 km resolution for a global model, we did some back-of-the-envelope math and reckoned it would take 9.3 exaflops to do a 1 km model. (Doubling the grid resolution takes 8X the compute, according to Brian Gross, director of NOAA’s Environmental Modeling Center.) That math is simple, but such a machine – or rather a pair of machines, because operational weather forecasting requires a hot spare – would be plenty expensive, and getting the money to build it is another matter.
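That 8X rule of thumb compounds quickly, as a crude sketch of our own makes plain. This is just the rule applied repeatedly to NOAA’s current baseline, not NOAA’s actual costing; real estimates, like the 9.3 exaflops figure above, bake in their own assumptions about timesteps, vertical levels, and model physics:

```python
# Crude compounding of the "doubling resolution takes 8X the compute" rule of
# thumb quoted above. Our own illustration: 2X cells in each horizontal
# dimension plus a roughly 2X shorter timestep is where the 8X comes from.

flops = 12.1e15        # NOAA's current global system, ~12.1 petaflops
spacing_km = 13.0      # grid spacing of the current global model

while spacing_km > 2.0:            # halve the grid spacing a few times
    spacing_km /= 2.0
    flops *= 8.0
    print(f"{spacing_km:5.2f} km -> ~{flops / 1e15:8.1f} petaflops")

# Output (roughly):
#  6.50 km -> ~    96.8 petaflops
#  3.25 km -> ~   774.4 petaflops
#  1.62 km -> ~  6195.2 petaflops
# Pushing on down to 1 km compounds that further still, which is why the
# estimates land squarely in exaflops territory.
```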

But the work on E3SM and tuning it up on Frontier is an important step in the right direction. We would personally advocate for spending a hell of a lot more money on weather and climate forecasting systems than we currently do so we can get the better models that we know are possible. And then we can complain about the weather, and not the forecasting.

We look forward to seeing if this SCREAM variant of E3SM comes up for a Gordon Bell Prize this year, and we hope it does so we can get some more details on it.
