The Winding Road to Quantum Computing ROI and Competition

With a timeline to early production sometime in the next decade (if not longer), the business model for quantum computing has been nebulous from the beginning.

This is true for startups like quantum annealing device maker D-Wave (we recently spoke with its CEO about this), and equally so for the big companies that have thrown their hats into the quantum product ring.

Even though companies like IBM, Intel, and Google might have softer cushions if the risk bubble pops before commercialization, the R&D investments are all sizable, especially for those who fab their own devices and hire legions of theoretical physicists to shoulder the burden of mapping old problems onto an entirely new substrate.

All of this was central to a conversation we had with Bob Sutor, a 36-year IBM veteran and current VP of the company’s quantum-focused Q Strategy and Ecosystem. As we have detailed in the past, IBM has two quantum businesses: one is built around a free-to-access 5-qubit quantum system that lets developers run early-stage applications and lay the groundwork for future quantum progress. On the commercial side is the Q Network, whose more than 30 partners span academia, research, and industry, with noteworthy members including JP Morgan Chase. In other words, these are the folks who are past the point of pushing a few qubits around and want to work with real systems at what, at this early stage in quantum, we can call real “scale”: 20 qubits.

That note about scale is not to diminish such progress. Getting error rates down and coherence times up is a massive undertaking, one that Sutor and his team at the Yorktown Heights, NY lab are well-equipped to handle. IBM Research has its own fabs at that location, along with the ranks of physicists needed to follow the new algorithmic directions users want to pursue and to work out how qubits can best be connected for those tasks.

Sutor makes no bones about the ROI of the quantum business, even for IBM Research, an organization that, within the more risk-averse larger company, is used to making long bets on new tech.

“There is no return on investment in this for at least three to five years,” Sutor says.

“What we are doing has concerned some of the top analyst firms because they are used to helping make buy decisions on relatively short-term strategic decisions. The thing to remember is that IBM has been around for over a hundred years, IBM Research since 1946. We are used to incubating things and the only real surprise is that we told the world in 2016 that we had a computer. It was very un-IBM-like.”

What is missing from the conversations about evaluating the eventual return on investment is a sense of how IBM’s unique approach might win out over others. That is, of course, assuming any of this materializes in a broad enough way to make quantum a business for any of the companies involved. And this is also assuming that any of the quantum hardware devices are sufficiently differentiated to create competition on anything other than which software stack is most usable.

Even with a viable quantum ecosystem still several years off, Sutor says there is a feeling among the small pool of quantum device makers that competition is beginning to heat up. The points of differentiation can look subtle from the outside and go well beyond mere qubit count to connectivity, coherence, and error rates, but given such nuanced metrics it is still difficult to compare the gate-model systems against one another (D-Wave stands apart with its annealing approach).

“Companies are getting more concerned about their relative positions and their commercial and investment prospects. Right now, you can go to Q Experience and look at the exact error rates for every single qubit on every machine we have. Where are those numbers for anyone else? That is something we publish daily and we would love for other quantum companies to publish similar statistics,” he tells The Next Platform.

IBM has made it a mission to attach metrics and standards to its own technology so users can begin to evaluate its relative merits. The Q Experience error rate reporting is one element; the forthcoming Quantum Volume metric is another. Quantum Volume will take a holistic view of a quantum system in terms of its balance between performance and capability, folding in error rates, qubit counts, coherence, and other factors. It is called “volume” because it is multidimensional, with the slew of new algorithms for these systems put through a wringer of IBM’s own devising.
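IBM had not yet published the metric’s formal definition at this writing, so the following is a rough illustration only: a toy scoring function of our own construction, with made-up numbers and a deliberately crude depth model, that captures the basic tradeoff any such metric has to balance. The widest circuit is worthless if errors cap its usable depth first.

```python
# Toy "volume"-style score (not IBM's definition): for each candidate width n,
# the reliable circuit depth shrinks as per-gate error grows, so the score
# rewards the largest balanced width-and-depth a machine can actually run.

def achievable_depth(n_qubits, gate_error):
    """Crude model: roughly how many layers fit before the accumulated
    error (~ n_qubits * gate_error per layer) approaches certainty."""
    return max(1, int(1.0 / (n_qubits * gate_error)))

def volume_score(max_qubits, gate_error):
    """Best min(width, depth) achievable over all widths up to max_qubits."""
    return max(min(n, achievable_depth(n, gate_error))
               for n in range(1, max_qubits + 1))

# A small, clean device outscores a huge, noisy one:
print(volume_score(max_qubits=20, gate_error=0.001))   # -> 20
print(volume_score(max_qubits=2000, gate_error=0.05))  # -> 4
```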

“At this point, it is better to have 20 or 50 great qubits over two thousand really lousy ones because of error rates and coherence. Progress is not simply about adding more qubits; the topology of the qubits is critical and so is the way they are connected. Right now, what the ultimate architecture and layout of the qubits will be as we scale is not clear because there are so many considerations.”
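The arithmetic behind that preference is simple exponential decay. Under a simplified model where each gate fails independently with some probability (our assumption here, not a full noise model), the chance a circuit finishes error-free collapses quickly as gates accumulate:

```python
# Simplified independent-error model: probability that a circuit of g gates
# completes without a single gate error, given per-gate error rate eps.
def success_probability(gates, eps):
    return (1.0 - eps) ** gates

# Qubits with 0.1% gate error survive a 1,000-gate circuit far better
# than qubits with 5% gate error survive even a 100-gate circuit.
print(round(success_probability(1000, 0.001), 3))  # 0.368
print(round(success_probability(100, 0.05), 4))    # 0.0059
```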

It is just this fact that makes it difficult to determine which technologies, if any, will make it through to commercial viability, let alone whether they will look anything like what IBM Research, Intel, Google, Rigetti, and others are pushing when the time is right from an applications perspective.

Metrics from IBM about its own hardware (or from any of the other quantum makers) will not win hearts and minds at this point. It is all useful information about performance and reliability for now, but we are still miles away from quantum standards to judge by. If there is a real ROI for IBM Research in its expanding quantum efforts, it could be much farther away than a few years, if for no other reason than that it will take the software piece at least that long to snap in easily and seamlessly enable the idealized hybrid quantum-classical applications. As with all things in quantum, that is far more difficult to make work than it may sound. The problems go beyond interface issues, down to the assembly level for some users and to the compiler and library ecosystem for others.

As already noted, it could simply be that the company that develops the best software stack will win, since the hardware products could end up looking very similar. The prize of first dominant vendor in universal quantum systems could well go to the one that lets the most users bring the most code in the most familiar format, and lets optimizing compilers and libraries do the heavy lifting of abstraction. Here is where IBM has really built a story with its AQUA and TERRA stacks, which we will describe in more detail in a future piece.

Much of the work that will drive an early quantum ecosystem will be done in hybrid algorithms. As we have established repeatedly, a quantum computer is not going to completely replace the classical machine; the two will work in concert, with quantum algorithms woven into classical loops and conditionals and the quantum part being a relatively small but very important piece.

“The trick in developing any applications in the next few years will be doing this. Even in five years, if you look at what we might call a quantum application, there’s a good chance that 95% of the code will still be classical for good reasons, not the least of which is the user interface. But that 5% is something that would take a classical computer an incredibly long time to do in simulation. The trick will be learning exactly where to use quantum and having that be something that is callable as a subroutine or as a library method.”
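In code, the pattern Sutor describes looks something like the variational loops common in hybrid algorithm proposals. The sketch below is ours, not IBM’s: it stubs out the quantum call with a noisy classical function rather than using any vendor SDK, and every name and number in it is illustrative only.

```python
import random

# Hypothetical sketch of the hybrid pattern: the outer loop, data handling,
# and optimization are all classical, and the quantum hardware shows up
# only as one small callable subroutine.

def quantum_expectation(params):
    """Stand-in for a call to quantum hardware or a simulator: prepare a
    parameterized circuit, measure, and return an expectation value.
    Faked here with a noisy classical function so the sketch runs anywhere."""
    return sum((p - 0.5) ** 2 for p in params) + random.gauss(0, 0.001)

def optimize(n_params, steps=100, lr=0.1, h=1e-2):
    """Classical outer loop: finite-difference gradient descent that treats
    the quantum subroutine as a black-box cost function."""
    params = [random.random() for _ in range(n_params)]
    for _ in range(steps):
        base = quantum_expectation(params)
        grad = []
        for i in range(n_params):
            shifted = list(params)
            shifted[i] += h
            grad.append((quantum_expectation(shifted) - base) / h)
        params = [p - lr * g for p, g in zip(params, grad)]
    return params

print(optimize(4))  # settles near [0.5, 0.5, 0.5, 0.5]
```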

“I’m very conservative; I always say quantum computing ‘might’, I never say it ‘will’ do something because there are a lot of things in between here and there and occasionally a miracle or two. Right now we may say there are five or six potential application areas but others will have dozens. And there will be variations on the same algorithms for different fields. Right now it’s about getting people, like those in HPC in particular, to start learning enough about all of this to investigate.”

Sutor says they have a hundred thousand users for the free 5- and 16-qubit Q Experience machines, with the Q Network growing to include household-name companies and institutions. “We have our own fabs, our own small clean room to build machines, mask designers, theoretical physicists—and we have expansion plans to keep growing. The Q Network has support programs for education to broaden reach and internally we are committed to a high level of performance and quality in this because every bit on the science side is important.”

He did address a concern we have heard about queue times for full access, noting that waits do happen from time to time, but that IBM will continue building out the Q Experience and Q Network to support further development.
