The Future of Quantum Computing Will be Hybrid

Behold, the myth of the magical machine that, given all known quantities, will flawlessly arrive at the most perfect possible answer, all as a self-contained, limited (and limitless) unit.

Although there has been plenty of hype around the future potential of quantum computers for solving some of the world’s most complex computational problems, the fact remains that these systems are still limited in scope and applicability. The technology is promising, but not yet enough for us to believe that quantum computers alone will someday dominate the vast numbers of scientific calculations and simulations currently run on large supercomputers.

So why is it, in the general view of quantum computing’s future, that there seems to be an “all or nothing” zeal, as though we must either shift entirely to quantum devices or remain bound to classical architectures? And more pressing, who is to say it cannot be a blend of both, a weaving of the best of both computing worlds, where calculations best suited to CPU cores have their results meshed with those from quantum systems?

For those who already dwell in the supercomputing sphere, this all probably sounds rather familiar. After all, it is simply another (albeit more sci-fi sounding) variant of the offload or acceleration model: in existing systems that use accelerators to boost certain parts of the workload, each part of the system handles what it is best equipped for. A GPU or another coprocessor can take highly parallelizable chunks of code and process them quickly, while other parts of the application fall to the host CPU. This same offload idea is central to an emerging line of thinking that suggests, very simply, that calculations of dramatic scale (predictive materials analysis, for example) that cannot be tackled on even the largest of supercomputers could hand some of their heaviest lifting to quantum devices for results, then run the rest of the simulation without the time- and energy-intensive optimization work that goes into the core application.
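For readers who think in code, the offload pattern described above reduces to the familiar host/accelerator split. The sketch below is purely illustrative, with plain Python functions standing in for the host work and the accelerator kernel; none of these names correspond to a real API.

```python
# Minimal, purely illustrative sketch of the offload model described above.
# Plain Python stands in for a GPU kernel (or, eventually, a quantum subroutine);
# none of these names correspond to a real API.

def host_prepare(problem_size):
    # The host CPU handles setup and the serial bookkeeping it is best at.
    return list(range(problem_size))

def accelerator_kernel(chunk):
    # Stand-in for the highly parallelizable piece handed to a coprocessor.
    return [x * x for x in chunk]

def host_finish(partial_results):
    # The rest of the application continues on the host using the offloaded result.
    return sum(partial_results)

if __name__ == "__main__":
    data = host_prepare(1_000)
    partial = accelerator_kernel(data)   # the offload step
    print(host_finish(partial))          # host combines and carries on
```

In the hybrid vision, the `accelerator_kernel` stand-in would become a call out to a quantum device, but the shape of the program stays the same.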

There are some problems with this idea, of course. For instance, it is a rather sci-fi example of Amdahl’s Law in action, wherein the total time it takes to arrive at a result will always depend on the slowest part of the execution. That slowest link is likely not to be in the hardware but rather in the second big problem: the programming models and data representations are entirely different in classical versus quantum computing, meaning there is no seamless way to dynamically share and shuttle data quickly between the two worlds. These are problems that will emerge in the future, says one of the lead thinkers behind the potential hybrid quantum supercomputing approach, Jarrod McClean.

McClean, a 2015 Alvarez Fellow in Computing Sciences at Lawrence Berkeley National Laboratory, finished his PhD at Harvard with a focus on chemical physics, where he developed his understanding of quantum computing and chemistry in tandem, in part because the two were naturally interlinked in his mind: not two separate fields, but complementary ways of looking at quantum chemistry and complex materials science problems at scale.

The trouble in understanding how existing supercomputers and future quantum computers will work together is a theoretical problem of its own, but McClean says there is strong impetus now to consider how to create hybrid systems. “The big picture goal from our end can be seen in something like predictive materials properties analysis. So drug binding affinities, designing new catalysts from first principles, studying nitrogenases for nitrogen fixation, high-temperature superconductivity, or looking at bio-metallic enzymes we have trouble studying with classical approaches.”

“For these problems, they’re out of reach size-wise with classical methods, but with quantum computers we’ve identified some pretty clear resource requirements: computations that would in theory take hundreds of millions of years could take only a few seconds.”

“The models we are envisioning and testing in small prototypes show that you have your HPC or normal classical system and use that to do pre-computation, to define the problem. From that, there might be a set of specific subroutines that will run faster on the quantum device. We offload those, perform the calculation, then feed that and only that back to the classical system with information that would have taken a very long time to get to. The quantum computer is, in this sense, a co-processor of sorts.”
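Read as a program, the workflow McClean sketches is a three-stage driver: classical pre-computation defines the problem, a compact subroutine runs on the quantum device, and only its result flows back. The following is a minimal, hypothetical sketch of that shape; every function here is a placeholder for illustration, not McClean’s code or any real quantum SDK.

```python
# Hypothetical sketch of the coprocessor pattern described above; all names
# are placeholders rather than a real quantum programming interface.

def define_problem_classically(system_name):
    # Classical HPC pre-computation: build the problem specification
    # (e.g., an effective Hamiltonian) the quantum device will act on.
    return {"system": system_name, "terms": [1.0, -0.5, 0.25]}

def quantum_subroutine(problem_spec):
    # Stand-in for the piece that would run faster on the quantum device.
    # Only a compact result (here, a single number) comes back.
    return min(problem_spec["terms"])    # pretend "hard-to-get quantity"

def continue_classical_simulation(problem_spec, quantum_result):
    # The classical system folds the returned value into the rest of the analysis.
    return {"system": problem_spec["system"], "energy": quantum_result}

if __name__ == "__main__":
    spec = define_problem_classically("nitrogenase model")
    value = quantum_subroutine(spec)                     # offload, compute, return
    print(continue_classical_simulation(spec, value))    # classical side finishes
```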

“In terms of these particular domains, especially in materials research, we have an advantage because we’re not solving an abstract number theory problem. We’re making one quantum system look very much digitally like another and interrogating that digital copy of the system. When the problem we want to solve is a quantum one and the system is a quantum one, we translate a problem in its natural domain versus loading a quantum system into a classical system and hoping it does well,” McClean explains.

But to make the most practical use of this approach now, quantum computers cannot achieve the full host of results on their own. McClean says there is a common misconception that quantum systems can perform a computation on all possible inputs at once, which is not the case given the way such systems read out data. Defining the problem with classical methods first and using those results for further analysis on classical systems is the future.

On that tricky programming point, McClean describes a feedback loop between the two systems. One computation is offloaded to get a value, after which the classical computer must do a “complicated update step to adjust to the parameterization of the quantum device.” That still does not solve the problem that coding for quantum computers is a young science, but the ability to simulate an existing quantum state inside a quantum system should spur the development of more accessible programming models and, with luck, ease the cycle of moving results off quantum systems and onto classical supercomputers.
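That feedback loop is, in spirit, an optimization loop: the quantum device returns a value, and the classical machine adjusts the parameters that configure the device before the next offload. Below is a minimal, runnable caricature of that cycle, with an ordinary function faking the quantum evaluation; the update rule shown (a crude gradient step) is only an assumption for illustration, not the actual update McClean has in mind.

```python
# Hypothetical sketch of the classical-quantum feedback loop described above.
# The "quantum device" is faked with an ordinary function so the loop runs as-is.

def quantum_evaluate(theta):
    # Stand-in for running the parameterized quantum computation and reading
    # out a single expectation value (here, a simple quadratic).
    return (theta - 1.5) ** 2

def classical_update(theta, step=0.1, eps=1e-3):
    # The classical computer's "complicated update step": here, a crude
    # finite-difference gradient step on the device parameter.
    grad = (quantum_evaluate(theta + eps) - quantum_evaluate(theta - eps)) / (2 * eps)
    return theta - step * grad

if __name__ == "__main__":
    theta = 0.0
    for _ in range(100):                 # offload, read value, update, repeat
        theta = classical_update(theta)
    print(f"converged parameter: {theta:.3f}, value: {quantum_evaluate(theta):.6f}")
```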

This is not to say that quantum computers are not independently solving complex problems inside Lockheed Martin, Google, and the handful of other labs in possession of a D-Wave machine, but without a robust way for classical systems to interact with, interpret, and help define that data, they will remain rooted to a narrow set of potential algorithms. The time is right, McClean said, to look at ways classical and quantum computers can co-exist and to push more effort toward efficient programmatic approaches and new algorithms that can bring together the best of both worlds.

At the very least, as McClean reminds us (and will do so again at SC15 in Austin next week), quantum computing, for all of its promise, is not something that will happen in a vacuum, at least not in terms of computational systems. Supercomputing will continue to play a critical role in testing, developing, and ultimately defining the conditions and parameters for how quantum computers tackle the big questions at hand, and it will continue to play a role in how the resulting knowledge is further considered. To suggest that quantum computing will supplant supercomputing anytime soon, as mainstream representations have put forth, is misguided in that it discounts how quantum devices, at this point anyway, are offload engines: bleeding-edge coprocessors rather than replacements for all problems.
