With the end of physical nuclear testing came an explosion in large-scale computing technology to simulate, in rich detail and at high resolution, the many scenarios and factors that influence a sizable nuclear weapons stockpile.
From physics simulations modeling weapon detonation to complex chemical reactions detailing the breakdown of nuclear material over time, to name just two examples, the need for very large-scale modeling and simulation resources grew rapidly in the wake of these demands, particularly following the test bans of the mid-1990s.
While supercomputing as we know it would likely have evolved naturally given the demands of a wide range of scientific and commercial applications, key enabling technologies on both the hardware and software fronts emerged from these supercomputing requirements. The same drivers, at least on a technology level, are being kick-started once again as the limitations of future computing become clearer. Even with exascale-class supercomputers to model, monitor, and maintain the nuclear stockpile in the U.S., the impetus to look at what comes next is as clear as ever, according to Mark Anderson, Deputy Program Director for Advanced Simulation and Computing at Los Alamos National Laboratory.
In his role, Anderson supports various scientific computing initiatives that fall under the National Nuclear Security Administration at Los Alamos and connected NNSA national labs. While this includes the oversight of existing and forthcoming supercomputers (more on how those are set to be used for future applications here), Anderson tells The Next Platform that teams at the NNSA labs are increasingly cognizant of the future limitations of Moore's Law and CMOS-based approaches as conventional technologies hit power, programmability, and other walls in the years ahead. Last November, he was tasked with starting an investigation into whether quantum computing might be a worthwhile investment for NNSA objectives. Following the initial meetings and a workshop with several other national security agencies in February, from which a whitepaper detailing the findings emerged, the group found that there might indeed be a host of potential applications for a quantum system. The end result of this investigation was the purchase of a D-Wave 2X system, which will arrive at the lab in early 2016.
What is noteworthy here, aside from the fact that this is one of the first such systems to be acquired by an organization other than a private company, is that it shows agencies with mission-critical national security priorities are taking this esoteric technology seriously enough to invest money and research resources in it. More subtly, it is interesting that the quantum system is being considered less as a potential replacement for advanced modeling and simulation for nuclear security requirements than as a potential accelerator, at least in the future. If this all sounds familiar, consider an article from earlier this week in which we spoke with a researcher at Berkeley National Lab about how the future of quantum computing would likely, at the beginning at least, work in tandem with modern supercomputers as a type of co-processor or accelerator for certain types of intractable scientific computing problems. In that conversation, he suggested that an offload model, wherein complex portions of scientific codes are handed over to quantum devices while other parts of the simulation run on classical supercomputers, is a probable path in the evolution of quantum computing.
This approach is similar to how Los Alamos is looking at the role of traditional supercomputers versus quantum computing. Rather than expecting the standard supers that run large-scale weapons simulations to disappear in favor of quantum approaches anytime soon, Anderson says there is co-processing potential they will explore with the machine over the next couple of years. “There are specific aspects of our applications where a quantum computer is a reasonable fit. If we can accelerate those applications this way, an interesting future challenge will be to pair, heterogeneously, conventional and quantum computing.”
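The heterogeneous pairing Anderson describes can be sketched in a few lines of Python. Everything below is illustrative, not a real D-Wave or Los Alamos interface: a classical driver loop carries the bulk of a simulation timestep, carves out a small optimization kernel, and hands it to a stand-in "accelerator" function where a real deployment would call out to a quantum annealer.

```python
# Sketch of the offload/co-processor model: a classical loop hands its
# optimization kernel to an accelerator and folds the result back in.
# The solver here is a purely classical stand-in for a quantum device.

def classical_step(state):
    # Placeholder for the bulk of a simulation timestep (classical HPC part).
    return [x * 0.9 for x in state]

def offload_optimize(costs):
    # Stand-in for the quantum-suited kernel: choose a 0/1 assignment per
    # variable that minimizes a separable cost (flip where cost is negative).
    return [0 if c >= 0 else 1 for c in costs]

def run(state, steps):
    for _ in range(steps):
        state = classical_step(state)    # runs on the classical supercomputer
        flips = offload_optimize(state)  # would be dispatched to the annealer
        state = [s if f == 0 else -s for s, f in zip(state, flips)]
    return state

print(run([1.0, -2.0, 3.0], 2))
```

The point of the pattern, rather than the toy arithmetic, is the division of labor: the loop structure stays on conventional hardware, and only the combinatorial kernel crosses the boundary.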
There are ways the NNSA could, in theory, make use of the quantum annealing device it has now, since such devices are best suited to a range of optimization problems. These crop up in engineering, physics, pattern recognition, and other areas, but the limitations, especially when it comes to simulating and studying a vast well of nuclear weapons reserves, are clear. “With the size of the quantum machines currently, which are on the order of 1,000 qubits maximum, the quantum technology is not as good as the HPC technology. However, once it evolves and hits closer to the 4,000 qubit range it could very well be more efficient than existing HPC.” He notes that all of this is still in its infancy, and while there are no plans to use the quantum system for mission-critical NNSA work in the near future, the investment is worthwhile for what it means for development and, interestingly, for the cultivation of national lab talent.
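To give a sense of the problem class a quantum annealer targets: such machines natively minimize quadratic functions over binary variables (QUBO problems). The tiny instance below, which is entirely illustrative, can be brute-forced in microseconds on a laptop, which is the regime Anderson alludes to when he says today's roughly thousand-qubit machines still trail conventional HPC.

```python
from itertools import product

# A QUBO asks for the binary vector x minimizing sum of Q[i,j] * x[i] * x[j].
# Quantum annealers express this class natively; for a handful of variables
# it is trivial to enumerate every assignment classically.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # linear (diagonal) terms
    (0, 1):  2.0, (1, 2):  2.0,                 # pairwise couplings
}

def energy(x):
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))  # → (1, 0, 1) -2.0
```

The couplings penalize selecting adjacent variables together, so the minimum picks the two non-adjacent ones. At a few thousand variables this exhaustive search becomes hopeless, which is where the speculative crossover against classical hardware would lie.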
With Google, Facebook, Amazon, and the like gathering the mindshare of young, bright developer talent, a challenge for the NNSA, not to mention national labs in general, is competing against that kind of appeal. Anderson said repeatedly during the conversation that one goal is to find sharp young minds interested in how quantum computers might solve the next generation of challenges. The hope is that having a quantum computer available for experimentation will be a compelling draw as young computer science researchers decide where their futures might lie.