Aurora Exascale System To Advance Dark Matter Research

Scientists have unlocked many atomic secrets through physics research that studies the interactions of particles such as quarks, gluons, protons, and neutrons within the nucleus of an atom. However, little is known about a mysterious form of matter known as dark matter.

This is one of the research challenges being pursued by scientists preparing for the upcoming Aurora exascale supercomputer from Intel and Hewlett Packard Enterprise, which will be housed at the US Department of Energy’s Argonne National Laboratory. Supported by the Argonne Leadership Computing Facility’s (ALCF) Aurora Early Science Program (ESP), a team based at the Massachusetts Institute of Technology (MIT) is using advanced machine learning (ML) and state-of-the-art physics simulations to help unravel the mysteries of dark matter while also providing insights into fundamental particle physics. The ESP research team includes co-principal investigators William Detmold and Phiala Shanahan of MIT, along with researchers at New York University and collaborators in Europe and at Argonne National Laboratory. The hope is that by moving from current petascale systems to the Aurora exascale system, the researchers will be able to run simulations of an entire atomic nucleus, not just a single proton.

“One of our biggest challenges in the dark matter research is a computational challenge. Our ML and Lattice Quantum Chromodynamics (LQCD) calculations are computationally intensive and currently use approximately 100 TB of data per simulation run. Running on the Aurora exascale supercomputer will allow us to increase the data used in our calculations by a factor of ten. In addition, we will be able to perform research on the dark matter puzzle that is not currently possible on existing petascale supercomputers,” states Detmold.

Dark Matter Research Extends To Underground Mines

Detmold states: “Scientists currently understand that protons are made up of quarks and gluons that are the fundamental building blocks of the universe. Protons make up 99 percent of visible matter but only make up approximately 20 percent of the mass content of the universe. The term dark matter has been applied to the other unknown matter that is invisible to the naked eye and has not been detected by current instruments but is inferred from its gravitational effects.”

Many experiments relating to dark matter have been conducted using detectors built from materials such as sodium iodide and xenon. In one effort, the search for clues about dark matter is moving deep underground into a former gold mine in Lead, South Dakota, now home to the Sanford Underground Research Facility (SURF). Once construction is complete, researchers from around the world will work deep underground on the LUX-ZEPLIN (LZ) experiment. A large stainless steel tank will be filled with nested layers: water, acrylic tanks holding a gadolinium-loaded liquid scintillator (linear alkylbenzene), and an inner cylinder containing liquid xenon. Sensors are designed to detect the minute flashes of light that a dark matter particle colliding with a xenon nucleus may produce.

The ESP team’s dark matter research is based on the theory of quantum chromodynamics, which describes how quarks interact with one another inside the nucleus of an atom. The team uses ML and LQCD supercomputer simulations to determine the constituents of atomic nuclei and their potential interactions with dark matter. LQCD is a set of sophisticated numerical techniques that compute the effects of the strong interaction of particle physics, which binds quarks and gluons into protons and neutrons and, in turn, binds those into atomic nuclei.
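
In schematic terms (a standard textbook formulation rather than a detail taken from the team’s code), a lattice QCD calculation estimates the expectation value of an observable as a very high-dimensional integral over gluon field configurations U defined on a discrete spacetime grid, evaluated by Monte Carlo sampling:

\[
\langle \mathcal{O} \rangle \;=\; \frac{1}{Z} \int \mathcal{D}U \; \mathcal{O}[U]\, \prod_f \det D_f[U]\; e^{-S_g[U]},
\]

where S_g[U] is the gluon action, the determinants det D_f[U] encode the quark fields, and Z normalizes the distribution. Generating representative configurations U is typically the most expensive step, and it is the step the team’s machine learning sampler, described further below, is aimed at.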

Parts of the LQCD calculations can only run on large-scale supercomputers. Detmold states, “In our research, we can’t look at the entire universe. We take a small region looking at the nucleus of an atom for our calculations. The size of the region will influence the outcome. Our calculations look at a small box and increase the region to larger areas or boxes and extrapolate the results to the infinite box size limit.”
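
As a rough illustration of that extrapolation step, the sketch below fits results computed in boxes of increasing size to a form in which finite-volume effects fall off exponentially with the box size, a common assumption in lattice calculations. The numbers, error bars, and fit form are invented for illustration and are not the team’s data or procedure.

```python
# Illustrative infinite-volume ("infinite box") extrapolation sketch.
# Data points and the fit form are invented; not the ESP team's analysis.
import numpy as np
from scipy.optimize import curve_fit

# hypothetical results of the same quantity computed in boxes of increasing size L
L_values   = np.array([3.0, 4.0, 5.0, 6.0])
observable = np.array([0.972, 0.990, 0.996, 0.998])   # made-up numbers
errors     = np.array([0.004, 0.003, 0.003, 0.002])

def finite_volume(L, inf_value, c, m):
    """Infinite-volume value plus an exponentially suppressed correction."""
    return inf_value + c * np.exp(-m * L)

popt, pcov = curve_fit(finite_volume, L_values, observable,
                       p0=[1.0, -0.1, 1.0], sigma=errors, absolute_sigma=True)
err = np.sqrt(pcov[0, 0])
print(f"extrapolated value at infinite box size: {popt[0]:.4f} +/- {err:.4f}")
```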

[Image: Artist’s impression of a nucleus in a lattice QCD calculation. Numerical calculations use a spacetime lattice (grid) to determine the properties and interactions of nuclei, including their potential interactions with dark matter.]

With the power of the future Aurora supercomputer, the team wants to use its calculations to analyze nuclear matrix elements and understand how a nucleus reacts when it is hit by a particular type of dark matter. “Our goal is to be able to evaluate what is happening if a dark matter experiment shows an interaction and recoil of the nucleus. With our LQCD and ML software, we want to be able to look at a potential dark matter model, do calculations to predict the strength of dark matter-nucleus interactions, and then compare that number to an actual experiment,” states Detmold.
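
Schematically, and for the simple case of a spin-independent (scalar) coupling, the quantity the lattice calculation supplies is the nucleon matrix element of the quark operators appearing in a given dark matter model, which sets the predicted scattering strength that experiments like LZ can test:

\[
\sigma_{\chi N} \;\propto\; \Big| \sum_{q} c_q \, \langle N | \bar{q} q | N \rangle \Big|^{2},
\]

where the coefficients c_q come from the dark matter model and the matrix elements are what LQCD computes. This is a textbook-level schematic, not the team’s full analysis, which also involves nuclear (multi-nucleon) matrix elements.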

MIT Machine Learning Software Developed For Dark Matter Research

The MIT team developed its own ML software to tackle some of the most challenging computational tasks in the dark matter research. The software is designed to speed up HPC algorithms in certain parts of the LQCD calculation, and the team’s ML algorithms are built to take advantage of other software tools such as the USQCD libraries, TensorFlow, PyTorch, and HDF5.

The ML software uses a self-training approach in which the model generates samples of typical configurations of quarks and gluons and then learns from those samples to generate new configurations more accurately. These ML-generated configurations can be used for a variety of other physics calculations beyond the dark matter research that is the focus of the ESP project.
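
The sketch below gives a flavor of such a self-training loop on a deliberately tiny toy problem: a generative network proposes lattice field configurations, scores them against the theory’s action, and updates itself from its own samples. Everything here (a two-dimensional scalar field standing in for QCD, the single coupling layer, the hyperparameters) is an illustrative assumption, not the team’s actual software.

```python
# Toy self-training sampler for lattice field configurations (illustrative only).
# A 2D scalar phi^4 theory stands in for QCD; none of this is the ESP team's code.
import math
import torch
import torch.nn as nn

L = 8  # toy lattice extent (L x L sites)

def action(phi, m2=1.0, lam=1.0):
    """Euclidean lattice action: nearest-neighbor kinetic term plus potential."""
    kinetic = sum(((phi - torch.roll(phi, 1, dims=d)) ** 2).sum(dim=(1, 2))
                  for d in (1, 2))
    potential = (m2 * phi ** 2 + lam * phi ** 4).sum(dim=(1, 2))
    return 0.5 * kinetic + potential

class CouplingSampler(nn.Module):
    """One affine coupling layer on a checkerboard mask: half the sites are
    transformed conditioned on the other (frozen) half, so the Jacobian is known."""
    def __init__(self):
        super().__init__()
        idx = torch.arange(L)
        self.register_buffer("mask", ((idx[:, None] + idx[None, :]) % 2).float())
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1, padding_mode="circular"), nn.Tanh(),
            nn.Conv2d(16, 2, 3, padding=1, padding_mode="circular"))

    def forward(self, z):
        """Map Gaussian noise z to a field phi; also return log q(phi)."""
        log_q = (-0.5 * z ** 2 - 0.5 * math.log(2 * math.pi)).sum(dim=(1, 2))
        frozen = z * self.mask
        s, t = self.net(frozen.unsqueeze(1)).chunk(2, dim=1)
        s = s.squeeze(1) * (1 - self.mask)
        t = t.squeeze(1) * (1 - self.mask)
        phi = frozen + (1 - self.mask) * (z * torch.exp(s) + t)
        return phi, log_q - s.sum(dim=(1, 2))   # subtract log|det Jacobian|

model = CouplingSampler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):                    # self-training: learn from own samples
    z = torch.randn(64, L, L)
    phi, log_q = model(z)
    loss = (log_q + action(phi)).mean()    # reverse KL to exp(-S)/Z, up to a constant
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice, samplers of this kind are typically paired with an accept/reject or reweighting step so that results remain exact, in line with the caution Detmold describes below.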

The researchers use convolutional models, which are ubiquitous in ML but typically come as two-dimensional convolutions built for image classification and generation tasks. LQCD calculations involve three dimensions of space plus time, so the team needs convolutions that work in four dimensions. Convolutions are not typically optimized to work in four dimensions, so the team is working to verify that its ML software will work on the future Aurora supercomputer.
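
As one hedged illustration of the issue, common deep learning frameworks such as PyTorch ship two- and three-dimensional convolution layers but no four-dimensional one, so a 4D convolution has to be assembled by hand. The sketch below does this naively by decomposing the kernel along the time direction; it is illustrative only and certainly not the team’s optimized implementation.

```python
# Naive 4D convolution assembled from PyTorch Conv3d layers (illustrative only).
import torch
import torch.nn as nn

class NaiveConv4d(nn.Module):
    """Decompose a (kT, kX, kY, kZ) kernel into kT separate 3D kernels;
    output time-slice t sums 3D convolutions of input slices t .. t+kT-1.
    Stride 1, no padding, kept simple for clarity."""
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        kT, *k3 = kernel_size
        self.kT = kT
        self.taps = nn.ModuleList(
            [nn.Conv3d(in_channels, out_channels, tuple(k3), bias=False)
             for _ in range(kT)])

    def forward(self, x):                          # x: (N, C, T, X, Y, Z)
        T_out = x.shape[2] - self.kT + 1
        out = 0
        for i, conv in enumerate(self.taps):
            slab = x[:, :, i:i + T_out]            # slices this kernel tap touches
            out = out + torch.stack(
                [conv(slab[:, :, t]) for t in range(T_out)], dim=2)
        return out

# quick shape check on random data
y = NaiveConv4d(1, 4, (3, 3, 3, 3))(torch.randn(2, 1, 8, 8, 8, 8))
print(y.shape)    # torch.Size([2, 4, 6, 6, 6, 6])
```

A production version would also need periodic (toroidal) boundary handling and far better data movement, which is the kind of optimization question the team is working through for Aurora.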

Detmold notes that caution must be used when creating ML tools. “When doing calculations in fundamental science and physics, we must be careful that ML is not changing the nature of the problem that we are solving. We focus on implementing algorithms that are exact, as well as understanding the uncertainties that are introduced.”

Preparing For Aurora

While waiting for Aurora, the team typically works on the ALCF’s Theta supercomputer, but it is also using other supercomputers, including “Summit” at Oak Ridge National Laboratory, “Frontera” at the Texas Advanced Computing Center, and “Marconi” at Cineca in Italy.

Aurora will incorporate new Intel compute engines, including the “Ponte Vecchio” Xe HPC GPU accelerator and the “Sapphire Rapids” Xeon SP CPU, as well as the DAOS storage Intel has taken a shining to. Detmold notes that the dark matter ML tools run best on GPUs, but different supercomputers currently use different GPUs, which requires time-consuming programming changes. Aurora will use the Data Parallel C++ (DPC++) programming language as part of the Intel-led, cross-industry oneAPI initiative designed to unify and simplify application development across diverse computing architectures. Detmold indicates that HPC researchers need a tool such as oneAPI to save time.

The future Aurora supercomputer architecture is designed to optimize deep learning, and its software stack is built to run at scale. ALCF postdoctoral researcher Denis Boyda is working on the Theta supercomputer to make sure that the MIT software scales correctly. “Our team’s large box calculations can become a bottleneck, and Aurora’s DAOS storage and architecture are designed to reduce the bottleneck issues,” indicates Detmold.

According to Detmold, “Going from petascale to exascale supercomputers will allow us to do research on a different scale and consider a different type of problem. For example, doing research on a petascale system allows us to do research on a single proton such as determining the proton mass. With an exascale system, the team can consider doing simulations of the nucleus of helium and eventually carbon. Even with an exascale supercomputer, researchers will not be able to do dark matter research on a larger nucleus such as xenon. Our team has developed specific machine learning software to accelerate the calculations to speed our dark matter research.”

The ALCF is a DOE Office of Science user facility.

Linda Barney is the founder and owner of Barney and Associates, a technical/marketing writing, training, and web design firm in Beaverton, Oregon.
