Google Gives A Peek At What A Quantum Computer Can Do

Four years ago, Google engineers boasted of achieving “quantum supremacy” following experiments that showed the company’s 53-qubit Sycamore quantum system solving problems that classical supercomputers either cannot solve or would take a very long time to finish. At the time, Google was slapped around by rivals in the quantum space, with competitors like IBM arguing that the job undertaken wasn’t overly complex and could be solved by a traditional system, though it would take longer.

But the promise of quantum computing – to scientists and researchers, who could solve an array of currently intractable problems, and to hardware and software vendors, which stand to move into a highly promising and potentially lucrative growth area – ensures that the work will go on, even in the face of criticism and challenges.

So it is that Google is once again touting quantum supremacy following the publication of research showing that the next generation of the Sycamore system – this one with 70 qubits and a quantum processor that is 241 million times more powerful than the previous iteration – can outperform the most powerful supercomputer in the world, running calculations that would take the massive 1.68 exaflops (peak) “Frontier” system at Oak Ridge National Laboratory 47 years to complete.

That was a key goal of the research outlined in a paper posted to arXiv in April but first widely written about this month, with the Google researchers writing that “quantum computers hold the promise of executing tasks beyond the capability of classical computers” and adding that, with the experiment, “we estimate the computational cost against improved classical methods and demonstrate that our experiment is beyond the capabilities of existing classical supercomputers.”

Quantum computing has been talked about for decades, but the sector still has a way to go. It could take more than a decade for fully functioning, error-corrected quantum systems, with a million or more qubits, to be up and running. And that doesn’t take into account the various other challenges facing the field, from bringing the cost of the systems and computing time down to a financially worthwhile level, to developing the software to run on them, to finding enough skilled people to build and manage them.

There also are questions about how they’ll be used, at least initially. Will they be standalone systems? Run in the cloud? Part of a combined classical-quantum hybrid, taking on workloads that the traditional supercomputers just can’t? The industry also will likely have to settle on a few of the various modalities being used by the vendors to create qubits, the engines for driving quantum systems.

Still, there’s a lot of work being done, including among such IT titans as Google, IBM, Microsoft, AWS, and Intel, as well as numerous smaller companies and startups. Even organizations in other industries are doing work in this area: financial services firm Fidelity is researching quantum tech in its Center for Applied Technology. And while results like those from Google are incremental, they show that progress is being made and give a hint at the direction the work is going.

An example is joint research put out last month by IBM and UC Berkeley showing that, even in their still-experimental forms, quantum systems can outperform their traditional counterparts. Scientists from both organizations ran calculations for complex physical simulation workloads on IBM’s 127-qubit “Eagle” quantum processor and on classical systems – even without the fault-tolerant quantum circuits that are needed to tamp down the noise that can affect qubits (and that processors are not yet ready for).

“We report experiments on a noisy 127-qubit processor and demonstrate the measurement of accurate expectation values for circuit volumes at a scale beyond brute-force classical computation,” the researchers wrote in the research report. “We argue that this represents evidence for the utility of quantum computing in a pre-fault-tolerant era.”
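Pulling accurate expectation values out of a noisy processor relies on error mitigation rather than error correction. One widely used mitigation technique is zero-noise extrapolation (ZNE), which runs the same circuit at several deliberately amplified noise levels and extrapolates the measured observable back to the zero-noise limit. A minimal, generic sketch in Python – the measured values below are invented placeholders, not data from the IBM/Berkeley experiment:

```python
import numpy as np

# Zero-noise extrapolation (ZNE): measure an observable at several
# deliberately amplified noise levels, then extrapolate back to the
# zero-noise limit. The (scale, expectation) pairs below are invented
# placeholders, not data from the IBM/Berkeley experiment.
noise_scales = np.array([1.0, 1.5, 2.0, 3.0])      # noise amplification factors
expectations = np.array([0.71, 0.62, 0.55, 0.43])  # measured <O> at each scale

# Fit a low-degree polynomial in the noise scale and evaluate it at zero.
coeffs = np.polyfit(noise_scales, expectations, deg=2)
mitigated = np.polyval(coeffs, 0.0)

print(f"raw value at native noise (scale 1.0): {expectations[0]:.3f}")
print(f"ZNE estimate at zero noise:            {mitigated:.3f}")
```

The choice of extrapolation model – a polynomial here, though exponential fits are also common – is itself a source of error, which is one reason mitigation is seen as a stopgap until fault tolerance arrives.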

The intent of the Google research fell along similar lines, with the researchers writing about the “intensifying quantum-classical competition.”

“The interplay between computational complexity and noise is highlighted by recent RCS [random circuit sampling] experiments, starting with a 53-qubit Sycamore quantum processor in 2019,” they wrote. “Ever since, similar experiments with expanded system sizes and reduced noise have been reported, while classical algorithms have also advanced substantially.”

The research set out to address two primary questions, they wrote: “Does there exist well defined boundaries for the region where the exponentially large Hilbert space is, in fact, leveraged by a noisy quantum processor? More importantly, can we establish an experimental observable that directly probes these boundaries?”

The researchers ran RCS experiments on the second-generation Sycamore system, which included 70 qubits at 24 cycles, and identified what they said were “distinct phases” fueled by the interplay between the noise and the quantum dynamics. They determined the boundaries of the phases using what they called “finite-size studies with cross-entropy benchmarking [XEB].”
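Cross-entropy benchmarking scores the bitstrings a processor actually emits against the probabilities an ideal, noiseless simulation assigns to them: a perfect device scores near 1, a fully depolarized one near 0. A toy illustration of the linear XEB estimator, using a random state as a stand-in for a small ideal circuit (everything here is illustrative – real RCS experiments use circuits far beyond brute-force simulation):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10          # qubits -- small enough to simulate the ideal output exactly
dim = 2 ** n

# Stand-in for an ideal random circuit: a Haar-random state vector.
# (Real XEB simulates the specific random circuit run on the hardware.)
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
probs = np.abs(amps) ** 2
probs /= probs.sum()        # ideal output distribution p(x) over bitstrings x

def linear_xeb(samples, probs, dim):
    """Linear XEB fidelity: F = 2^n * <p(x_i)> - 1 over measured bitstrings x_i."""
    return dim * probs[samples].mean() - 1.0

# A noiseless device samples from the ideal distribution ...
good = rng.choice(dim, size=50_000, p=probs)
# ... while a fully depolarized (pure-noise) device samples uniformly.
noisy = rng.integers(dim, size=50_000)

print(f"ideal sampler:   F_XEB = {linear_xeb(good, probs, dim):+.3f}")   # ~ +1
print(f"uniform sampler: F_XEB = {linear_xeb(noisy, probs, dim):+.3f}")  # ~  0
```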

The result is a quantum system that Google says can significantly outperform Frontier, the world’s first certified exascale system. The massive system is based on Hewlett Packard Enterprise’s Cray EX235a nodes and powered by AMD’s custom “Trento” Epyc CPUs and “Aldebaran” Instinct MI250X GPU accelerators. It holds almost 8.7 million cores, includes HPE’s Slingshot-11 interconnect, and delivers 1.194 exaflops of sustained performance against the 1.68 exaflops peak cited above.

According to Google’s research, it would take Frontier 6.18 seconds to run the same calculation that the 53-qubit Sycamore system can do instantly. Against the 70-qubit next-gen Sycamore, that number stretches to 47.2 years. The experiments showed that even a noisy quantum system can run certain calculations significantly faster than even the most powerful classical supercomputer and provided “direct insights on how quantum dynamics interacts with noise,” the researchers wrote. “The observed phase boundaries lay out quantitative guidance to the regimes where noisy quantum devices can properly leverage their computational power.”
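Those two runtimes also appear to be the source of the “241 million times” comparison cited earlier in this story: 47.2 years expressed in seconds, divided by 6.18 seconds, comes out to roughly 241 million. A quick sanity check in Python:

```python
# Sanity check: the ratio of the two quoted Frontier runtimes matches
# the "241 million times" figure cited earlier in this story.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

task_53q = 6.18                       # seconds for Frontier, 53-qubit circuit
task_70q = 47.2 * SECONDS_PER_YEAR    # seconds for Frontier, 70-qubit circuit

print(f"{task_70q / task_53q:,.0f}")  # ~ 241,000,000
```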

That said, the quantum field is still running up against some of the challenges we spoke about above. “Looking forward, despite the successes of RCS achieved so far, finding practical applications for near-term noisy quantum processors still remains as an outstanding challenge,” they wrote.

It echoes the patient approach that Rick Stevens, associate laboratory director for computing, environment and life sciences at Argonne National Laboratory, said in May is needed when talking about anything in HPC – everything from zettascale systems to, in this case, quantum. Argonne this year is due to turn on “Aurora,” another exascale system designed by HPE, but this one based on Intel’s “Sapphire Rapids” Xeon SP CPUs and “Ponte Vecchio” Max Series GPUs.

It took a decade and a half to get Frontier online, and zettascale and quantum systems could take another 15 to 20 years, Stevens said during a webinar.

In HPC, the road is a long and winding one.

“This is a long-term game,” he said. “If you’re interested in what’s happening next year, HPC is not the game for you. If you want to think in terms of a decade or two decades, HPC is the game for you. . . . These are just early days in that. We’ve got a long way to go, so we have to be thinking about, what does high performance computing mean ten years from now? What does it mean twenty years from now?”

3 Comments

  1. I find quantum computing to be a fascinating prospect (inasmuch as it may be “fundamental” in the sense of physical reality, and may be needed where process lithography reaches limits related to the size of atoms — if that has anything to do with it), but should readily admit that I do not understand it well, or much, or maybe even at all.

    Nevertheless, for the one or two Scheme aficionados out there, I’d like to point to Choudhury, Agapiev, and Sabry’s 2022 “Scheme Pearl: Quantum Continuations,” where they apply delimited first-class continuations to the solution of “Simon’s problem” (backtracking search and non-determinism, for decision tree complexity, with exponential oracle separation). It suggests to me that quantum computing might implicitly embody (or seek to exploit) McCarthy’s computational non-determinism, in parallel, within the fluctuating quantum interactions of its qubits.

    Irrespective, it is great to see that IBM, Google, Universities, and Labs are competing to impress us with their wherewithal in this domain (and nice article)!

    • Interesting paper! One major issue is the I/O bottleneck in quantum machines, which disfavors large-data applications, and the other is the type of speedup: quadratic vs exponential. Computational problems that do not require a lot of data I/O, and for which algos provide exponential speedup, or that have a deep oracle (depth >> 1), should run faster on quantum machines: cryptanalysis using Shor’s algorithm, and quantum problems in chemistry and materials science. Also, Simon’s Problem shows exponential speedup, but Grover’s Algorithm is just quadratic.

      The Google paper linked in the “That was a key goal …” paragraph of this TNP article focused on cross-entropy benchmarking of noise-induced near-term quantum-decorrelation to identify phase boundaries of some sort (using Random Circuit Sampling — RCS). It seems to conclude that the very large speedup that they obtained (relative to Frontier) is applicable to “certified randomness generation”.

      I’m pretty sure that some of my students used this method to answer questions in their mid-term exams!
