Nvidia Takes The Open Road In AI Weather Forecasting

Amid the myriad discussions about AI – from the astounding amount of money being spent by vendors and enterprises and the debate about actual ROI those businesses are getting to the technology’s effect on cybersecurity, jobs, and the fear of disinformation and resulting distrust – it’s easy to forget its usefulness in particular industries.

Those data-heavy sectors include healthcare and life sciences, manufacturing, financial services, and retail and e-commerce. They also include weather forecasting, where the rapidly evolving technology – which now includes AI agents – is making significant inroads and is driving a global market that some observers say could grow from $165.7 million two years ago to $926.3 million by 2033.
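That projection implies a steep compound annual growth rate. As a rough sanity check – assuming the $165.7 million figure corresponds to 2023 and the projection spans ten years to 2033, which the forecast cited above does not pin down explicitly – the implied rate works out to just under 19 percent per year:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end_value / start_value) ** (1 / years) - 1

# Market figures cited above: $165.7M (assumed 2023) to $926.3M (2033).
rate = cagr(165.7, 926.3, 10)
print(f"Implied CAGR: {rate:.1%}")  # roughly 19% per year
```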

The current way of forecasting – a three-step approach called “numerical weather prediction” – involves meteorologists collecting data from a range of sources, such as weather stations, weather balloons, ships, and aircraft; running it through a complex computational atmospheric model; and then post-processing the output, for example by increasing spatial granularity and incorporating input from human forecasters.

“This pipeline requires huge supercomputers, complex software and large support teams,” researchers at The Alan Turing Institute wrote last year. It’s expensive and consumes huge amounts of energy and time.

A wide range of companies is applying AI to weather forecasting, from tech giants like Google – whose DeepMind unit built the GenCast and WeatherNext 2 families of models – and Microsoft with its Aurora model, to specialized organizations like The Weather Company with its GRAF model, to smaller AI players such as The Alan Turing Institute with its Aardvark weather prediction technology.

Using ever-improving AI models and tools, weather forecasters who long relied on massive supercomputers and huge teams of experts can now run predictions on systems as small as laptops, driving down the costs and allowing countries that didn’t have the means before to create their own local forecasts.

“The stakes can’t be higher in weather,” Mike Pritchard, director of climate simulation research at Nvidia and a professor with the Department of Earth System Science at the University of California, Irvine, told journalists during a media briefing. “Worsening extreme weather driven by climate change is having impacts on all of us, nearly every aspect of modern life. Forecasting affects us all. It can drive improvements to agriculture, energy, aviation, and emergency response. But the science of forecasting is changing. AI has sparked a scientific revolution in weather forecasting, unlocking more skill than had previously been thought possible.”

Trillions Of Dollars Lost

The costs of severe weather incidents are immense. The total cost of 27 such events in the United States in 2024 was $182.7 billion, with the total costs between 2015 and 2024 reaching more than $1.4 trillion.

Unsurprisingly, Nvidia has been active in applying AI to weather research and forecasting. In 2024, the vendor rolled out its Earth-2 climate digital twin platform, created to significantly scale the simulating and visualizing of weather and climate around the world, and it “includes models for forecasting weather, downscaling coarse-resolution predictions into higher-resolution, actionable predictions, and tools for diagnosing skill and accuracy,” Pritchard said.

Those outside models come from sources such as the European Centre for Medium-Range Weather Forecasts and Microsoft.

Among those are the FourCastNet and CorrDiff models, along with models built by other vendors that are hosted on Nvidia’s Earth2Studio, a Python-based toolkit developers can use to build models and applications for Earth-2.

Open Forecasting Models, New Architectures

At the American Meteorological Society’s annual meeting today in Houston, Nvidia unveiled an open set of AI models and tools – such as pretrained models, frameworks, and inference libraries – to give developers more options for working with Earth-2 and for developing tools for their own infrastructures. The aim is to create technology that can speed up weather forecasting and predictions while driving down costs, enabling more countries and regions to do such work.

“The goal here is not to force people into a particular way of working,” Pritchard said of the open nature of the new offerings. “It is to develop the tools that let people work in a way that works best for them and gives them the most control over their systems, because for some users, it makes sense to subscribe to an enterprise centralized weather forecasting system. But for others, like countries, sovereignty matters.”

Nvidia introduced three new forecasting models, including Earth-2 Medium Range, which is based on a new model architecture called “Atlas” and is aimed at medium-range forecasts, up to 15 days in advance. It ingests and processes more than 70 weather variables, such as temperature, pressure, wind, and humidity. Pritchard said the new model outperforms Google’s open GenCast mid-range model across those variables.

“Philosophically, scientifically, it’s a return to simplicity,” he said of Atlas. “We’re moving away from hand-tailored niche AI architectures and leaning into the future of simple, scalable transformer architectures, [which] are having transformative results in drug discovery, self-driving, and robotics, and showing that these methods, which enjoy a critical mass of performance and engineering tooling, can produce state-of-the-art results in weather forecasting.”

Nvidia’s new Earth-2 Nowcasting model, built using another new architecture called “StormScope,” is tighter in scope, delivering kilometer-scale resolution and zero- to six-hour predictions of local storms and other weather systems, all within minutes. The model creates what Pritchard called “country scale” forecasts using generative AI and transformers, adding that the six hours is a “critical window for decision-making about local weather for agencies who issue warnings and emergency responders who need to direct resources.”

The model simulates storm dynamics directly from observations, which allows it to outperform modern physics-based weather prediction methods for short-term precipitation forecasting.

“Importantly, because this model is trained directly on globally available geostationary satellite observations rather than region-specific physics model outputs, Nowcasting’s approach can be adapted anywhere on the planet with good satellite coverage, enabling any nation to build sovereign, high-resolution, severe weather forecasting capabilities without dependency on expensive local data archives,” he said.

Data Assimilation Is Key

Earth-2 Global Data Assimilation is based on another new model architecture, this one called “HealDA,” that produces initial conditions that are used to launch weather forecasts. It essentially creates snapshots of the current atmosphere, such as temperature, wind speed, humidity, and air pressure, at thousands of locations around the world, including areas in between where there are direct measurements from satellites. Because it runs on GPUs, it can do all of this in minutes rather than the hours needed for supercomputers, he said.
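The core gridding problem that data assimilation solves – estimating atmospheric state at points between sparse observation sites – can be illustrated with a deliberately simple toy: inverse-distance-weighted interpolation of a few station readings. This is only an analogy for the state-estimation task, not Nvidia’s HealDA architecture, and the station coordinates and values below are invented for illustration:

```python
import math

def idw_estimate(obs, x, y, power=2.0):
    """Estimate a field value at (x, y) from sparse (xi, yi, value)
    observations using inverse-distance weighting."""
    num = den = 0.0
    for xi, yi, v in obs:
        d = math.hypot(x - xi, y - yi)
        if d == 0:
            return v  # exact hit on an observation site
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Three hypothetical temperature readings (deg C) at station coordinates:
stations = [(0.0, 0.0, 10.0), (10.0, 0.0, 14.0), (5.0, 8.0, 12.0)]
print(round(idw_estimate(stations, 5.0, 3.0), 2))  # prints 12.0
```

Real data assimilation goes far beyond this kind of spatial interpolation – it must reconcile physically inconsistent measurements across many variables at once – which is part of why, as Pritchard notes below, it consumes so much of a traditional forecasting center’s compute.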

It’s an important step, Pritchard added.

“While the AI community and the research community have focused a lot on the prediction models over the past five years, this data assimilation task – this state estimation task – has remained largely unsolved by AI, yet it consumes roughly 50 percent of the total supercomputing loads of traditional weather,” he said.

The new AI forecasting models and tools join others on Earth-2 that are used in both commercial and non-commercial settings and are available on both GitHub and Hugging Face. Nvidia also noted a range of organizations that are using or testing the Earth-2 offerings, including the Israel Meteorological Service, which is using CorrDiff; The Weather Company, which is evaluating Nowcasting; and financial services firms like S&P Global Energy, which is using CorrDiff for risk assessments.