Quantum Has Its Role, But In-Memory Is the Way To Go

A lot of money and time is being thrown at quantum computing by vendors, including IBM, Google, Microsoft, and Intel, and there is the usual competition among the United States, China, and Europe, as well as work in Japan. We are at the early stages of quantum computer development. D-Wave has been selling systems using its quantum annealing architecture and Atos is offering a quantum simulator. Meanwhile, other companies are making strides. IBM late last year rolled out a 20-qubit chip, a key part of the vendor’s Q Network initiative to build an ecosystem around its cloud-based Quantum Experience.

Meanwhile, Microsoft has its quantum simulator and Q# language, Intel has a 17-qubit processor and is partnering with QuTech in its quantum computing efforts, startup Rigetti Computing has a 19-qubit superconducting chip accessible via the cloud and Google is doing its own work in the area, having run tests on chips with six, nine and 20 qubits. IBM, Rigetti, Microsoft, and others also have software efforts underway.

The hope is that somewhere down the line, maybe a couple of decades from now, quantum computers will be able to run some portions of commercial workloads, which would be a significant step forward, given that such systems theoretically will be many times faster than current supercomputers at certain kinds of numerical work. The keys are the subatomic qubits – or quantum bits – which are to quantum computers what bits are to traditional systems. Qubits can hold the values 1 and 0 simultaneously and can be entangled, giving them even greater possibilities, but they are also famously fragile and run best at extremely cold temperatures – in the millikelvin range, colder even than deep space.
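The superposition and entanglement described above can be sketched numerically. This is a minimal, hypothetical illustration (not from the article) of the standard state-vector model: a qubit is a pair of complex amplitudes whose squared magnitudes give measurement probabilities.

```python
import math

# A classical bit is 0 or 1; a qubit is a pair of amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1. Measurement yields 0
# with probability |alpha|^2 and 1 with probability |beta|^2.

# Equal superposition: the qubit is "both" 0 and 1 until measured.
alpha = beta = 1 / math.sqrt(2)

p_zero = abs(alpha) ** 2
p_one = abs(beta) ** 2
print(round(p_zero, 3), round(p_one, 3))  # 0.5 0.5

# Two entangled qubits (a Bell pair) need four amplitudes, one per
# basis state |00>, |01>, |10>, |11>. Here only |00> and |11> carry
# weight, so the two qubits' measurement outcomes always agree.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
assert abs(sum(abs(a) ** 2 for a in bell) - 1.0) < 1e-12
```

Note the doubling: one qubit needs two amplitudes, two qubits need four, and n qubits need 2^n, which is where the scaling discussed later in the article comes from.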

We have double-digit-qubit technologies today; systems with commercial capabilities will need thousands of qubits. Given the amount of work, time, and sustained investment it will take to develop quantum computers – and the uncertainties and challenges surrounding the effort – some scientists and engineers in the field are trying to temper expectations, media excitement, and worries about quantum computing’s perceived threat to current cryptographic techniques. There is concern that such systems will enable bad guys to blow past computer security measures and leave all machines vulnerable to exploits – but not any time soon.

Jim Held, Intel Fellow and director of emerging technologies research at Intel Labs, earlier this year said that “quantum computers can compute problems we cannot do today. However, they will augment rather than replace existing computers,” adding that “quantum computing is a research area that is a decade away from running commercial applications.”

Count Ray Beausoleil among those urging the community to look at quantum computers in a practical way. Speaking at the recent HPE Discover 2018 conference, Beausoleil, Senior Fellow at HPE Labs and head of the organization’s Large Scale Integrated Photonics research group, said the best way to view a quantum computer is “as an accelerator to solve a certain type of problem” rather than a really fast and really powerful general-purpose system. This seems to be an evolving consensus view, with even Google conceding that a scale-out quantum computer will require a massive supercomputer to maintain its state and make use of its output.

“We have been looking for quantum computation to become a thing for 70 years,” Beausoleil said. “The types of problems that quantum computers are really suited for, that they’re perfect for, are questions that are very simple to state, that have very simple answers, but the intermediate computation is so complicated and so complex that you essentially have to wait forever for your X86 computer or even memory-driven computer using ordinary classical technology to finish. Eventually, what you want to be able to do is use a quantum computer in particular to simulate quantum systems.”

They might be best used to develop new medicines and drugs, design new materials or address highly complex theoretical physics problems. However, their role in enterprise environments will probably be negligible.

“Quantum computers are amazingly good at simulating other quantum systems – or we hope they are, they should be, there’s every reason to believe they will be – but they’re really terrible at the three Rs – reading, writing, and arithmetic,” he said. “We don’t actually know how to take a big database and code it in quantum bits and then work on it. At the current time, if we want to take a petabit of information under ideal circumstances, we could write that petabit onto only 50 qubits. That’s one of the things that’s amazing about the way the quantum complexity scales – every factor of 1,000 is just 10 qubits. Those are ten noise-free, perfect qubits. The problem is that, at this point in time, we have no idea really how to take a petabit of information and code it on 50 qubits. That’s an unsolved problem. And nature is twice cruel – there’s no such thing as a quantum hard drive, so you can’t take your petabit database and then write it in a quantum form and then keep reading that database ready to go. Every time you want to analyze your data with a quantum computer, you have to do that conversion over and over and over again.”
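Beausoleil's scaling numbers check out arithmetically. As a back-of-the-envelope sketch (my own, not from the article): n qubits index 2^n basis states, so each extra 10 qubits multiplies the state space by 2^10 = 1,024, roughly his "factor of 1,000," and 50 qubits already span about 10^15 states, a petabit's worth.

```python
# n qubits span a state space of 2**n amplitudes, so capacity grows
# by ~1,000x for every 10 extra qubits, and 50 ideal qubits already
# index ~10**15 basis states -- a petabit's worth of information.
def state_space(n_qubits: int) -> int:
    return 2 ** n_qubits

assert state_space(10) == 1024                    # one "factor of 1,000"
assert state_space(50) == 1_125_899_906_842_624   # ~1.13 * 10**15
assert state_space(60) // state_space(50) == 1024
```

The catch, as he notes, is that counting states says nothing about how to actually load a classical petabit into those 50 qubits, which remains unsolved.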

Another challenge comes at the output. He used a soufflé as an analogy, comparing a quantum computer to an oven. If everything is done right, the cook might get a soufflé, but it may be a chocolate soufflé rather than the cheese soufflé she wanted.

“This is the issue of quantum measurements,” Beausoleil said. “Ideally, if the quantum computer is well isolated from our part of the universe, the answer you want to get is very, very likely to come out of the computer, but there’s no guarantee because every measurement you make in the quantum system is probabilistic. You have a 90 percent probability you’ll get the answer that you want. The way we will know that quantum computing is a thing, a reality, is when we can use one to teach us something we didn’t already know. It should be an important question we’re trying to get answered.”
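The standard answer to probabilistic readout is repetition: rerun the computation and take a majority vote, which drives the failure rate down quickly. A toy simulation (mine, with Beausoleil's 90 percent figure as the assumed per-run success rate):

```python
import random

random.seed(42)

# Each idealized run returns the right answer with probability p;
# repeating the run and taking a majority vote makes the combined
# answer far more reliable than any single 90-percent shot.
def noisy_run(p_correct: float = 0.9) -> bool:
    return random.random() < p_correct

def majority_vote(runs: int = 11) -> bool:
    hits = sum(noisy_run() for _ in range(runs))
    return hits > runs // 2

trials = 10_000
wins = sum(majority_vote() for _ in range(trials))
print(wins / trials)  # very close to 1.0
```

With 11 repetitions of a 90-percent-reliable run, the majority vote fails only a few times in ten thousand, which is why probabilistic measurement is a cost rather than a showstopper.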

The need for quantum computing depends on the industry a particular business is in. Companies, he said, shouldn’t “regard quantum computing as a soup-to-nuts or end-to-end solution to our computing problems. It’s an accelerator for certain types of problems, so if you’re in an industry that needs that type of problem solved – if you’re in pharmaceuticals, if you’re in material science, if you’re in a fabrication facility – you would love to have access to a quantum computer. You don’t need a quantum computer to run Excel spreadsheets. You can best be ready by understanding which of the problems that you want to solve in your enterprise would benefit from quantum computing and then making sure that you’re working with your suppliers to provide you with a platform so you can add that kind of acceleration when it becomes available.”

Beausoleil said he had a quantum computing project underway but abandoned it some six years ago. It was difficult to scale, and running an accelerator that had to be kept near absolute zero would be inconvenient if you wanted it to run in a smartphone at the edge. Instead, HPE has focused on other efforts to address the challenges enterprises face in terms of the growth in data, distributed computing environments and emerging workloads like artificial intelligence (AI), machine learning and data analytics. Certainly the highest-profile project underway is The Machine, a memory-driven system that will leverage such technologies as memristors, silicon photonics, and optical networks to shift the focus in computing away from the compute and toward memory. The Machine is still years away from being commercialized, but technologies being developed for it will find their way into other high-end HPE systems. The company this month unveiled a Machine sandbox development environment housed in a Superdome Flex system.

“In the enterprise, our most pressing problem is trying to gather from hither, thither, and yon all of the data being generated at the edge, sometimes being stored in the cloud, sometimes being brought into the enterprise’s private datacenter,” he said. “We’ve reached the point where these datasets have become so large that our old approach to memory – which was memory attached to compute and put on a board and then swing those boards out of servers – just doesn’t work anymore, and it doesn’t work in enterprise datacenters or high performance computing centers. We moved to another paradigm, memory-driven compute, that we’re using as a platform that we can innovate on. There’s a lot of IP in that data and our customers are desperate to process it in a way that benefits their mission. Being able to put all that data into a very large memory pool and then making it easy for compute no matter where it is to access that data and use it is the next paradigm, the next generation of compute.”

The vendor also is working on developing accelerators to address modern workloads, including the Dot Product Engine (DPE) to accelerate deep-learning work at lower costs, and also is investing in neuromorphic computing, the idea of developing large-scale, many-processor systems that work in a fashion similar to the human brain and its synapses and neurons.

“Once you have everything in memory and you can access everything in memory without having to shuffle data all over the place, that’s step one,” Beausoleil said. “Then, with step two, we’re working in parallel on neuromorphic computing – using neuromorphic-inspired technologies – to do a better job at what computers right now have trouble doing very well in parallel, like multiplying a matrix by a vector. That sounds like something any computer should be amazingly good at, but in the case of a deep neural net, you do so much of it that it bogs down your ability to train the neural net and you end up having to do the training completely offline and it’s very hard to update with new data, so actually fielding a new version of voice recognition on your phone just seems to take forever.”
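The matrix-by-vector multiply Beausoleil mentions really is the inner loop of a dense neural-network layer, which is why an accelerator like the DPE targets it. A minimal pure-Python sketch (illustrative only; real training stacks use optimized linear-algebra libraries):

```python
# A dense layer computes y = W @ x (then adds a bias and applies a
# nonlinearity). Training repeats this multiply for every example and
# every layer, so its cost dominates -- the workload a dot-product
# accelerator is built to speed up.
def matvec(W, x):
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[1.0, 2.0],
     [3.0, 4.0]]
x = [10.0, 1.0]

print(matvec(W, x))  # [12.0, 34.0]
```

Each output element is one dot product of a weight row with the input vector; a network with millions of weights performs this multiply-accumulate pattern billions of times per training pass.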

What’s needed is a new datacenter architecture, he said. Not all enterprises need to solve quantum-level problems, but they do need memory-driven architectures and technologies like DPE.



  1. I agree with much of this article, especially the framing of quantum computing as an accelerator device. On the technical level, of course, this is spot on — QC needs a certain kind of input, is good at performing a certain kind of calculation, and provides a certain kind of output.

    What I do passionately disagree with is the statement that the role of quantum computers “in enterprise environments will probably be negligible”. To say so is to vastly underestimate the plasticity and adaptiveness of the economy. Essentially, it’s as if in the late 1800s someone had watched the first applications of electrical power and said “Yeah, it’s great for a niche market, but you fellas shouldn’t expect it to have an impact on normal businesses”. For sure it will.

    • Quantum computers in their current state are only useful for verifying NP-hard problems in polynomial time. In other words, that’s a very specific set of problems in computer science. This isn’t “electricity” that’s being invented here, it’s more like inventing the lithium ion battery. Sure, it’ll be useful and change things up quite a bit, but it’s not going to drastically change how computing is done. It just opens up another path to solve a certain class of problems.
