In fact, a more salacious title might have said it will not be a threat within most of our career spans, if in our lifetimes at all.
For problems of high scientific value, some type of classical computer will be required unless there is a fundamental change in how algorithms are conceived from the ground up. And in traditional supercomputing, that is not going to happen either—again, at least not in the next decade or more. The practical HPC community is nose to the grindstone getting (mostly decades-old) codes prepared for exascale and perhaps an accelerator. Quantum readiness? It’s on the research horizon for codes, just like it’s on the evaluation agenda for supercomputer makers like Cray—but it’s too far out to see clearly.
Even the most optimistic estimates from those in the quantum commercialization trenches at D-Wave, IBM, and Intel put the arrival of reliable, large-scale quantum computers for challenging scientific and technical applications—and any meaningful market share—at least a decade away. While there will be successes for both universal (gate model) and annealing quantum machines, the problem set will be thin even if the performance and efficiencies blow classical methods away.
Though that clock is ticking, some of the world’s most well-known supercomputer makers are not worried that their business will come under pressure, even after the “quantum advantage” tipping point arrives. Some see the space as valuable to their core business by providing the platforms for quantum simulation and compilation. And after all, some of the highest-value problems in physics-heavy traditional HPC areas will always need some kind of classical underpinning, with quantum machines serving as powerful co-processors for narrowly defined parts of those workflows.
As CTO of supercomputer maker Cray, interconnect pioneer and famed architecture expert Steve Scott says his role requires looking ahead at all new technological options and evaluating what might be relevant to begin building into systems roughly five years before they mature. And while he believes that quantum computers will eventually find a valuable place, even as accelerators to traditional HPC applications, that roadmap is long and complicated.
In short, following a detailed evaluation, Cray is not pursuing any kind of quantum strategy at the moment.
“We are probably five-plus years from the first demonstration of quantum beating a classical computer on a contrived problem—one that highlights capability but not problems people are actually trying to solve. We are probably ten-plus years from practical quantum advantage where quantum is the most effective and cost-effective way to solve an actual problem. And we are at least 15-20 years away from having algorithms with strong advantage. There are a variety of algorithms that require reliable qubits and a large number of them, but we are a long way from having that,” Scott argues.
He is confident that some of the core engineering challenges will be solved over time (cooling and isolating quantum machines, keeping qubits coherent, getting signals in and out of the machine, etc.) but there are other problems that he thinks will stretch out the quantum roadmap before it reaches a larger number of real and challenging applications.
Even though we can see a future where there are one thousand reliable qubits, there is an interesting constraint that is not frequently discussed. The exponential power that comes with adding qubits is part of the appeal, but the catch is the specification, or input, of the problem. It must be encoded in a fixed number of qubits, which means, as Scott explains, that there needs to be a “very compact problem description but an exponential amount of computation followed by a very small output that can be checked with a classical computer.” This limits even further the problem space that could be tackled by quantum systems, and the code base issue complicates matters further.
“When it comes to something like factoring where you have input numbers and can run the algorithm, get output and check back easily, that is a perfect example; think Shor’s algorithm. But this is also the reason you can’t simulate the weather or virtually crash a car or design a jet engine; the description for those problems requires a lot of data and with a quantum computer, you have to be able to describe the problem in a certain number of qubits. There might be exponential acceleration possible but the input is constrained to a very tiny problem. There is an interesting niche of problem spaces you can contemplate once the engineering problems of reliable qubits and scaling them up are solved.”
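To make that asymmetry concrete, here is a minimal Python sketch (an illustration, not anything from Cray or Scott) of why factoring fits the pattern while a weather simulation does not: the factoring input and output are a handful of integers that a classical machine can verify with one multiplication, whereas the weather problem’s input is the entire atmospheric state.

```python
# Minimal illustration of the input/output asymmetry described above.
# Factoring: tiny input (one integer), tiny output (two integers), and the
# result can be verified classically with a single multiplication.

def verify_factoring(n: int, p: int, q: int) -> bool:
    """Classically check a result a quantum factoring run might return."""
    return p > 1 and q > 1 and p * q == n

# Example: suppose a (hypothetical) quantum run claimed 233 and 241 are the
# factors of 56153. Checking that claim is trivial:
print(verify_factoring(56153, 233, 241))  # True

# Contrast with simulating the weather: the input is the full atmospheric
# state, e.g. a 1000 x 1000 x 50 grid with ~10 variables per cell, which is
# hundreds of millions of numbers that would all have to be encoded in qubits.
weather_input_values = 1000 * 1000 * 50 * 10
print(f"{weather_input_values:,} values just to describe the weather problem")
```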
“There are only a small number of core algorithms, some have suggested there are six, that are amenable to quantum and because of this input problem, you have to concentrate on problems that have exponential, not just polynomial or quadratic advantage; it has to be exponential advantage to beat classical machines.”
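A quick back-of-the-envelope calculation shows why the bar is exponential advantage. The rates below are illustrative assumptions (an aggregate rate for a parallel supercomputer and a much slower error-corrected logical gate rate for a quantum machine), not measured figures, but they make the point: a quadratic speedup can be swallowed by the slower quantum clock, while an exponential one cannot.

```python
import math

# Back-of-the-envelope arithmetic behind "it has to be exponential advantage."
# Both rates are illustrative assumptions, not vendor figures.
classical_rate = 1e14   # assumed aggregate steps/sec for a parallel supercomputer
quantum_rate = 1e3      # assumed error-corrected logical ops/sec for a quantum machine

# Quadratic (Grover-style) advantage on a search a supercomputer finishes in a
# day: the square-root gain is eaten by the far slower quantum operation rate.
classical_steps = classical_rate * 86400        # ~8.6e18 steps in one day
quantum_ops = math.sqrt(classical_steps)        # ~2.9e9 logical operations
print(f"quadratic advantage: ~{quantum_ops / quantum_rate / 86400:.0f} days on the "
      f"quantum machine vs. 1 day classically")

# Exponential advantage at problem size n = 80: classical cost grows as 2**n,
# quantum cost only as (say) n**3, so no constant-factor slowdown closes the gap.
n = 80
classical_seconds = 2**n / classical_rate
quantum_seconds = n**3 / quantum_rate
print(f"exponential advantage: ~{classical_seconds / 3.15e7:.0f} classical years "
      f"vs. ~{quantum_seconds / 60:.0f} quantum minutes")
```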
For traditional supercomputing, the codes for quantum will have to be written from scratch, Scott contends. Consider that many of the HPC community’s most common codes run to a million-plus lines and have been works in progress since the 1970s. These are not going away anytime soon, and they cannot be translated to quantum. It took GPUs a long time and a lot of software footwork on the part of researchers and Nvidia to bring offload-model accelerators into the HPC fold, and that is still the exception rather than the norm a decade later.
“There will be entirely new applications for very specific fields, likely in quantum chemistry and materials science. There will be inroads and they will have new capabilities in their specific range that could have potential big impacts 10-20 years out but there is no chance to take large existing codes and translate those for quantum.”
In terms of how classical and quantum systems might interact, Scott says he cannot see a time in the medium range when there will not need to be a lot of special purpose software to interface to the quantum system. “Even with new codes specifically for quantum computers there will always be a coprocessor model back to a classical system that might do something like problem setup, a loop through many possibilities to explore, and then for each possibility there might be something akin to a subroutine call to the quantum coprocessor to get back results.” He says that in many cases supercomputer-class resources might not be required on the classical side, but in the near term many problems he can envision would have only small parts of a kernel passed over for quantum acceleration.
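A minimal sketch of that coprocessor pattern might look like the following, where `quantum_coprocessor_solve` is a hypothetical placeholder (stubbed out classically here), not any real vendor API: classical code sets up the problem, loops over possibilities, and makes a subroutine-style call out to the quantum device for each one.

```python
from typing import List, Optional

# Sketch of the hybrid workflow described above: classical problem setup, a
# loop over possibilities, and a subroutine-style call to a quantum coprocessor
# for each one. `quantum_coprocessor_solve` is a hypothetical placeholder; a
# real workflow would dispatch a small, qubit-encodable kernel to the device.

def quantum_coprocessor_solve(candidate: dict) -> float:
    """Stand-in for the narrowly-defined kernel a quantum machine would run."""
    return abs(sum(candidate["parameters"]) - 7.5)

def classical_setup() -> List[dict]:
    """Classical side: build the space of possibilities to explore."""
    return [{"id": i, "parameters": [i * 0.1, i * 0.2]} for i in range(100)]

def hybrid_workflow() -> Optional[dict]:
    best, best_score = None, float("inf")
    for candidate in classical_setup():                # classical outer loop
        score = quantum_coprocessor_solve(candidate)   # "subroutine call" to quantum
        if score < best_score:                         # classical check of the result
            best, best_score = candidate, score
    return best

print(hybrid_workflow())   # best candidate is id 25 (0.1*25 + 0.2*25 == 7.5)
```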
“As CTO of Cray, I have to think about what future technologies are coming along, keep an eye on what’s interesting, and start thinking seriously about how to incorporate those into machines when they are at that middle-range time horizon. With quantum computers we don’t think there will be any real impact on the HPC market for at least ten years and it just doesn’t make sense for us to spend resources and time here. It will likely have a very big impact on a narrow slice of our market but we’re focused on building systems that help people solve their biggest challenges. Quantum does not factor in the medium term.”
For more context on the co-processor concept, timelines for more HPC-like application readiness, and discussions of use cases beyond D-Wave-like optimization problem solving, as well as next-generation applications for gate model machines, peruse this article set.
Most people don’t realize that quantum computers are not an upgrade to classical computing. They’re Special Ops. Not backbone infantry, not air power, not navy. There are a few things they do with magical effectiveness, and we’ll continue to find more uses for them. But for the vast majority of computing applications, they’re no better than your laptop.
AI is much more likely to revolutionize computing.