The refrigerators are on order and the lists of scientific applications are taking shape as another new quantum architecture enters the quantum hardware space.
This one, of course, is not a commercial effort but rather a deeply funded push by Lawrence Berkeley National Laboratory to explore the best approaches to building superconducting qubits for science.
With $30 million over five years in hand to build the Advanced Quantum Testbed (AQT), the lab will focus on qubit connectivity, lengthening coherence times for quantum simulations, and exploring how to create reliable quantum chips that can tackle problems that future exascale systems might not be able to touch.
Problems in quantum chemistry, materials science, and high-energy physics are at the top of the target application list, but so too are other problem areas that fall inside the Department of Energy’s scope, according to Jonathan Carter, Co-PI on the AQT.
Carter tells The Next Platform that the AQT teams will focus on gate model quantum systems (as opposed to the quantum annealing approach of the commercial company D-Wave) and on how they might differentiate from IBM, Google, and others by exploring lower fidelity qubits and different ways of connecting qubits that lead to longer coherence times and better reliability.
“There are interesting questions that are still unresolved in several areas of scientific simulation that are going to be hard to crack with classical computers for the foreseeable future, even with exascale computing. For instance, there has been a lot of work over many years modeling processes that involve coupled electronic and nuclear motion, but getting close to this will take far more than anything we have on the classical horizon,” Carter explains.
The goal for AQT is to create an open architecture that provides both low- and high-level access to the gates, as well as a more general API that lets users reach the system through more layers of abstraction. The team will not get bogged down in building any more software than is necessary to interface with the future device, and will work to integrate Q# and other higher-level quantum languages when possible.
While having higher qubit counts and more fault tolerant quantum computing are critical issues for the larger ecosystem, Carter says they are far more interested in connectivity, extending coherence, and experimenting with fidelity for different quantum algorithms. “Topologies are a key area of experimentation. Right now, there are no arrays of qubits in any other quantum devices that have all-to-all connectivity. There are nearest neighbor, ring, and a few other ideas, but all of these choices affect how difficult it is to run an arbitrary sequence of gates—they all come with costs in terms of coherence.” The open question is what the ideal topology might be for scientific simulations in general—and like so many things in computing, it all depends on the application or algorithm.
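Carter's point about topology costs can be made concrete. On hardware without all-to-all connectivity, a two-qubit gate between non-adjacent qubits must be routed with SWAP gates, and each SWAP burns time against the coherence budget. The sketch below (illustrative only; the qubit counts and topologies are hypothetical, not AQT's design) counts the SWAP overhead for the same gate on a nearest-neighbor chain, a ring, and an all-to-all coupling graph:

```python
from collections import deque

def swap_overhead(edges, n, a, b):
    """SWAPs needed to make qubits a and b adjacent on a coupling graph.

    Each SWAP adds depth to the circuit, which costs coherence time,
    so the routing distance is a rough proxy for topology cost.
    """
    adj = {i: [] for i in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    # Breadth-first search from a to b gives the shortest routing path.
    dist = {a: 0}
    q = deque([a])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist[b] - 1  # already-adjacent qubits need 0 SWAPs

n = 6
line = [(i, i + 1) for i in range(n - 1)]                    # nearest-neighbor chain
ring = line + [(n - 1, 0)]                                   # ring
full = [(i, j) for i in range(n) for j in range(i + 1, n)]   # all-to-all

print(swap_overhead(line, n, 0, 5))  # 4 SWAPs on a chain
print(swap_overhead(ring, n, 0, 5))  # 0 on a ring (0 and 5 are adjacent)
print(swap_overhead(full, n, 0, 5))  # 0 with all-to-all connectivity
```

As the article notes, which topology is "ideal" depends on the algorithm: a circuit whose gates mostly touch neighboring qubits pays almost nothing on a chain, while one with long-range interactions pays the full routing cost.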
All of this explains why it is not useful for Berkeley to simply use quantum systems from the few companies making such devices. IBM and Rigetti have cloud-accessible qubits, with D-Wave set to do the same in coming months (which, by the way, is an important competitive move for the handful of quantum system makers). First, Carter says, the lab has years of expertise in designing and fabricating superconducting devices. The PI of AQT, Irfan Siddiqi, ran UC Berkeley's quantum nanoelectronics lab and worked with teams in physics and other areas to create hardware for experiments, taking advantage of existing packaging and fabrication facilities.
Second, the cloud access is limited to a few qubits, which is not enough for the grand science challenges the lab hopes to tackle. “IBM has to keep their device stable for many users at a time. We want to be able to dedicate a block of time and run pulse sequences, for example, that are more general than sequences of one- or two-qubit gates.”
Some of the real innovation in quantum technology relies on superconducting technologies, in which, again, Berkeley Lab has expertise. AQT is not starting from scratch on the superconducting materials side, but the team is looking at some of the subtle differences in how the various quantum hardware makers approach circuits and what might be best for scientific quantum computing.
“One of the things that we are focused on that is different is around the idea of fixed frequency qubits. We see it is possible to manufacture arrays of qubits where the resonant frequencies are fixed, but if you need a lot of those on a wafer, we have to make sure those frequencies are sufficiently distinguished and separated so there is not a lot of cross-talk when trying to alter the state of just one. This is a very hard problem—fabricating one chip with many qubits, each with precisely defined frequencies. Having tunable frequencies creates a lot more complexity but it does seem to be a viable path,” Carter says.
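The frequency-crowding problem Carter describes amounts to a spacing check: every pair of fixed-frequency qubits on the chip must be detuned from each other by enough margin that driving one does not disturb the others. A minimal sketch, with hypothetical numbers (the frequencies and guard band below are illustrative, not real device parameters):

```python
def min_detuning(freqs):
    """Smallest spacing between any pair of qubit frequencies (GHz)."""
    f = sorted(freqs)
    return min(b - a for a, b in zip(f, f[1:]))

def crosstalk_ok(freqs, guard=0.10):
    """True if every pair of frequencies is separated by at least
    `guard` GHz, so a drive tone aimed at one qubit is unlikely
    to excite its neighbors."""
    return min_detuning(freqs) >= guard

# Fabrication disorder scatters qubits away from their target frequencies:
target = [5.00, 5.15, 5.30, 5.45]  # evenly spaced as designed
actual = [5.00, 5.12, 5.14, 5.45]  # two qubits landed too close

print(crosstalk_ok(target))  # True
print(crosstalk_ok(actual))  # False: 5.12 and 5.14 GHz would cross-talk
```

This is why Carter calls it a hard fabrication problem: the check is trivial, but hitting "precisely defined frequencies" on a many-qubit wafer so that the check passes is not, which is what makes tunable-frequency designs attractive despite their extra complexity.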
“Because our team can design and fabricate quantum processors with considerable flexibility, we can tailor the design to meet our scientific needs,” Carter said. “We also have team members who are working on the classical control hardware and software needed to operate these chips, as well as the algorithms that will run on them.”
Over the last five years, Berkeley Lab researchers developed quantum chemistry and optimization algorithms for prototype superconducting quantum processors, work funded by Laboratory Directed Research and Development (LDRD) grants. They demonstrated the viability of this work by running the algorithms on a quantum processor comprising two superconducting transmon qubits developed in Siddiqi’s Quantum Nanoelectronics Lab at the University of California, Berkeley. The success of the LDRD work eventually paved the way for two DOE-funded projects to explore quantum computing for science.
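Quantum chemistry algorithms of the kind run on that two-qubit processor are typically variational: a parameterized quantum circuit prepares a trial state, the hardware measures its energy, and a classical loop adjusts the parameters to minimize it. The article does not specify the algorithm or Hamiltonian used, so the sketch below is a toy classical simulation of that loop with an invented two-qubit Hamiltonian and a one-parameter ansatz, not the lab's actual method:

```python
import numpy as np

# Pauli operators.
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

# Toy two-qubit Hamiltonian (coefficients chosen for illustration,
# not derived from a real molecule): H = Z(0) + 0.25 * X(0)X(1).
H = np.kron(Z, I2) + 0.25 * np.kron(X, X)

def trial_state(theta):
    """One-parameter ansatz: cos(theta)|00> + sin(theta)|11>."""
    v = np.zeros(4)
    v[0] = np.cos(theta)
    v[3] = np.sin(theta)
    return v

def energy(theta):
    """Expectation value <psi(theta)| H |psi(theta)>."""
    v = trial_state(theta)
    return float(v @ H @ v)

# Classical outer loop: scan the parameter, keep the lowest energy.
# On real hardware, each energy(theta) call would be a batch of
# circuit executions and measurements instead of linear algebra.
thetas = np.linspace(0.0, np.pi, 2001)
best = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H).min()
print(f"variational: {best:.4f}  exact ground state: {exact:.4f}")
# For this H, both come out to about -1.0308.
```

Even this toy version shows why such algorithms suited a two-qubit prototype: the quantum device only has to prepare and measure short circuits, while the optimization burden stays on classical hardware.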
The Advanced Quantum Testbed is the latest project for Berkeley Quantum, a partnership that harnesses the expertise and facilities of Berkeley Lab and the University of California, Berkeley to advance U.S. quantum capabilities by conducting basic research, fabricating and testing quantum-based devices and technologies, and educating the next generation of researchers. In addition to a collaboration with MIT Lincoln Laboratory (MIT-LL), AQT will tap the resources and expertise of a number of DOE user facilities, including the Molecular Foundry and the National Energy Research Scientific Computing Center, both located at Berkeley Lab.