The Cloud is Where Quantum Competition Gets Real

At the dawn of the mainframe era, Thomas Watson is said to have remarked that perhaps only five such systems would ever be sold worldwide, and that perhaps just one machine would be needed to solve the most intractable problems.

Whether the remark was ever actually uttered is up for debate, but as historical context it reminds us how hardware value evolves for a company producing a new architecture that promises unique capability. For IBM, the services side of the business was where the investments paid off over time, but none of that could have developed without forward-thinking investments in physical systems, even the ones it thought very few customers would buy.

The other lesson is that markets value unique capability. With margins on classical computing devices shaved to almost nothing and not much variation in what system makers can offer, it is possible we are sitting at the same inflection point T.J. Watson occupied when he speculated about the limited market for unique hardware capability in 1943.

With the cloud, new hardware innovations can filter into the mainstream market faster than ever before, broadening both access and competition, especially if those systems provide an indisputable edge in one or more application areas.

There are plenty of new architectures emerging today, especially ones targeting machine learning, but those are spin-offs of the standard base of applications, processing technologies, and software ecosystems; the hardware is relatively simple to procure and build upon. For an emerging area like quantum computing, however, the situation is quite different.

Quantum annealing system maker D-Wave, for instance, has very few installations at customer sites, a number that has not grown significantly over the last couple of years despite the release of four major iterations of its quantum chips and the exponential capability boost that came with them. However, D-Wave’s business is far more complex than simply selling systems into customer datacenters, and it harkens back to the systems and services business model IBM paved the way for decades ago.

Quantum system sales appeared at first to be the primary way D-Wave would return on the incredible amount of investment the company has gathered since 1999, beginning with future technologies funds from Jeff Bezos, In-Q-Tel (funded by U.S. intelligence agencies), and others. However, as the quantum maker’s CEO, Vern Brownell, tells The Next Platform, the business is actually shaping up around a much more comprehensive strategy, one that relies on remote access and cloud vendors to increase the company’s share of key, high-value applications.

D-Wave has long sold time on its systems to organizations, including customers in the intelligence, security, and financial services spaces. Some of those customers may have their own datacenter installations as well, but the point is that the types of problems quantum systems are good for (Monte Carlo simulations, for instance) can only really be done efficiently at massive scale with an architecture that does something classical machines cannot touch.

Brownell, who once spearheaded technology at Goldman Sachs at a time when a million cores dedicated to Monte Carlo compute meant a $50 million annual power bill, can certainly speak to disruptive efficiency. Earlier, at the start of his career at DEC, he got an early sense of the power of the minicomputer and of scalable, shared systems. Both experiences are coming to bear at D-Wave, which has demonstrated efficiencies far beyond the reach of classical systems for Monte Carlo simulations in particular, with more applications being added to the list of 70 that can be accelerated by its quantum systems.
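To put that Monte Carlo arithmetic in rough perspective, here is a toy sketch in plain Python (not the risk models Brownell ran at Goldman Sachs, and no quantum hardware assumed). The statistical error of a classical Monte Carlo estimate shrinks only as one over the square root of the sample count, so each extra digit of accuracy costs roughly a hundred times more samples, which is how such workloads balloon to a million cores.

```python
import random

# Toy illustration of classical Monte Carlo scaling: the standard error
# of the estimate falls as 1/sqrt(N), so cutting the error by another
# factor of ten costs roughly one hundred times more samples.
def estimate_pi(num_samples: int) -> float:
    """Estimate pi by sampling random points in the unit square."""
    hits = sum(
        1
        for _ in range(num_samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4.0 * hits / num_samples

for n in (1_000, 100_000, 10_000_000):
    est = estimate_pi(n)
    print(f"N={n:>10,}  estimate={est:.5f}  error={abs(est - 3.1415926):.5f}")
```

That square-root wall is the backdrop for the efficiency claims D-Wave makes for Monte Carlo work.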

“Looking back ten years ago when I first joined D-Wave, no one in Silicon Valley was investing in hardware. Now that people are thinking about computing as a collection of compute resources it is the perfect time for quantum computing in general, whether it’s IBM, Google, us, or whoever. All computing will be hybrid in nature and with the cloud as the predominant way of delivering resources in many places, the space I used to be in—enterprise—is completely revolutionized. Cloud was the most dramatic thing to happen then but I believe what we are working in will be the most important thing to happen in decades to come.”

Brownell says that, given the expertise D-Wave’s teams have developed around quantum chip fabrication, the company aims to be the “Intel of quantum computing.” Philosophically, though, it will need to have just as much in common with Nvidia. GPU computing was born at the right place and time, and it continues to evolve with trends like machine learning. But the way Nvidia took an esoteric accelerator and turned it into a real mainstream product was by building a wide-reaching developer ecosystem and adding as many accelerated applications to its architecture as possible.

The biggest step D-Wave, and any other quantum hardware maker, will take is getting the major cloud vendors to install and run its systems as shared resources for a wide user base. That means having all the tooling on hand for both experts and beginners, something Amazon did first with GPUs for HPC applications in the 2010s and more recently with high-end FPGAs, which come with both high-level and low-level tools to get developers comfortable with FPGAs no matter where they reside.

Having a quantum computer installed in a major cloud vendor’s datacenter with public access would mean two things. First, it would provide a wider base for comparing real application performance. Second, it would lay the foundation for a first real conversation about price and about return on investment over traditional architectures. And as a side benefit, a clearer picture of ease of use would lead to a better sense of market reach in the coming years.

We at The Next Platform understand that D-Wave has some big cloud news coming soon. Although D-Wave has been offering remote access to its systems for years now, we are guessing that one of the big cloud players will be providing a quantum computer for users. If pressed to venture a guess, we would say Amazon Web Services, and not just because Bezos was an early investor, but because AWS tends to get cutting-edge hardware first and throw big support behind it from the get-go.

And this is how quantum computing goes mainstream. Well, as mainstream as it will ever be. Seventy applications for quantum annealing is no small feat, but they took a number of years to ready for the system. The other quantum vendors have all beaten D-Wave to the cloud punch, but those companies are approaching their systems differently and do not have the integrated systems (or even the same way of harnessing quantum mechanics) that D-Wave has. IBM, Rigetti, and Google are all pursuing gate-based quantum models, while Microsoft is out on the edge with a topological approach that aims to solve the error correction bottleneck in a way that frees it up architecturally. Intel is blending both ideas. But ultimately, real hardware in volume is only found at D-Wave.
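For readers keeping the approaches straight, the annealing model D-Wave builds does not execute gate sequences at all; it natively accepts optimization problems expressed as a QUBO (quadratic unconstrained binary optimization) over binary variables. Below is a minimal, purely classical sketch of what such a formulation looks like, brute-forcing a tiny max-cut instance in plain Python; no D-Wave tooling or API is assumed.

```python
from itertools import product

# Max-cut on a triangle graph, written as a QUBO: minimize
# E(x) = sum over (i, j) of Q[(i, j)] * x[i] * x[j], with x[i] in {0, 1}.
# Each cut edge lowers the energy by 1, so the minimum energy is -2.
edges = [(0, 1), (1, 2), (0, 2)]
Q = {}
for i, j in edges:
    Q[(i, i)] = Q.get((i, i), 0) - 1
    Q[(j, j)] = Q.get((j, j), 0) - 1
    Q[(i, j)] = Q.get((i, j), 0) + 2

def energy(x):
    """QUBO energy of a binary assignment x."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force all 2^3 assignments; an annealer searches the same energy
# landscape physically instead of enumerating it.
best = min(product((0, 1), repeat=3), key=energy)
print("best assignment:", best, "energy:", energy(best))
```

The gate-based machines from IBM, Rigetti, and Google take their programs in a very different form, as circuits of quantum gates, which is part of why head-to-head comparisons on real applications are so hard to come by.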

This article is not the place for detailed comparisons between quantum computing approaches. The vendors are all offering their devices remotely, but these offerings remain niche and insular. Being able to compare quantum technologies on actual applications and see their cost structure against traditional chips will be a game changer. The conversation will shift from who has the most qubits, whether the gate model or annealing is better or more truly “quantum,” and which error correction method is most efficient, to who can actually deliver real-world results. The results will speak for themselves, and it’s about time.

“We have been shipping systems to Lockheed, Google, NASA, and others as everyone knows but not everyone could access these. We had our own cloud offering in 2010, it wasn’t large scale but was done individually with important customers. We are about to do this in a big way and we will be able to support large numbers of customers and developers with a toolkit for users at all levels,” Brownell tells us.

Again, our bet is on AWS, but even if D-Wave simply stands up its own official cloud, as Rigetti did this week, the move is still important in terms of access to the only annealing-based approach backed by real hardware.

“Our business model moving forward is to keep selling systems to national labs, the intelligence and security communities and some select companies,” Brownell explains. “But ultimately, the future for us is in the cloud customers. Eventually, every cloud company is going to need this capability. Cloud is a hyper-competitive business where they all have the same offerings. So, just like with GPUs where every cloud vendor had to integrate those, the same will be true with quantum annealing. This is how we will operate our business.”

“The cloud for quantum computing, and that means all the companies doing this, is about providing real decision making power about the various offerings. We will no longer be selling against vaporware.”

Insinuating that some of the current quantum efforts amount to vaporware is incendiary, but so is the assertion that, even as all of the current quantum approaches claim ever larger qubit counts, we are still 10 to 15 years away from viable products for more mainstream use. The real test will come when a pricing structure and an application performance (and performance per watt) profile emerge from real-world use cases.
