If you thought the up-front costs and risks were high for a silicon startup, consider the economics of building a full-stack quantum computing company from the ground up—and at a time when the applications are described in terms of their potential and the algorithms are still in their early stages.
Quantum computing company D-Wave managed to bootstrap its annealing-based approach and secure early big-name customers with a total of $200 million raised over the years. As we have seen with a range of use cases, it has been able to put at least some funds back in investors' pockets with system sales to Volkswagen, Lockheed Martin, NASA, Google, and others. As we have described in detail, there are similar efforts to kickstart specific quantum approaches at other companies, including Intel and IBM.
But for startups that were not there at the beginning of widespread quantum computing interest, like Rigetti Computing, the investment/payoff strategy is a bit more muddled—especially when giants able to absorb big risks (and potential failures), like IBM, are working on similar gate model quantum systems and software stacks ready for cloud delivery.
What Rigetti has done on a relatively shoestring budget of $70 million is at first glance far more than any silicon upstart could pull off with twice that amount. There are tricks to stretching the funds, to be sure, but when one considers that the 100-person company has used that amount to fund its own fab, integrate its systems, write the many layers of software the full stack requires, and set up cloud infrastructure to deliver the product, it is a skillful use of limited resources. And while it could take at least five years to be in the black (counting from the company's founding in 2013), according to the company's COO, Madhav Thattai, Rigetti is willing to hold onto its vision and see the long road through: a ten-year roadmap with more mature applications and verticals, where its software environment becomes another profit center as developers build on top of its Forest quantum framework.
Thattai says the $70 million has gone into building the core and experimental infrastructure (including several quantum systems in the company's own lab) as well as the fab itself. The quantum physicists and requisite mechanical, electrical, software, and fabrication engineers are another cost center, since a comprehensive engineering team is needed for a full-stack quantum effort.
He also says that for other startups following in their footsteps, startup capital (or the lack thereof) isn't really the constraint; indeed, it is likely there will be others. "Once the resources are in place to invest in infrastructure and people, that lets you build the chips. The tough part is integration. We can drive iteration cycles in a matter of a few weeks at a time, which is unusual in the semiconductor industry, where design changes happen a couple of times per year."
And that hits on a couple of important points. Although in some ways this might seem to parallel a startup in the silicon world, and some of the features are indeed similar, the entire process—from materials to investment philosophy—is different. Take, for example, the investment in a self-built fab.
In the wider world, building a fab is a multi-billion-dollar effort with an emphasis on scaling to smaller nodes. Since Rigetti's approach is based on superconducting circuits, the company can take advantage of existing work on fabrication and packaging of such technologies without reinventing the wheel, and without worrying about building a facility designed to churn out high volumes of chips or to scale to new expectations.
"An advantage of using superconducting circuits is that it gives you access to scalable technology that we know a lot about. There are many interesting ways to build gate model quantum computers, from ion traps and lasers to photonics, but our approach has momentum because it rests on this base of semiconductor manufacturing from the last 50-60 years, modified for what we need to build quantum integrated circuits."
While Thattai says the fab investment was still sizable, it was acceptable for the overall model. "We don't have to optimize for scale, which let us invest much less than one would think. Instead, we invested in what would let us iterate and drive process improvements. We don't have to buy the latest generation fab technology; our wafer sizes are a couple of generations older, our tools don't need to be foundry tools that pump out millions of chips, and those decisions end up driving orders-of-magnitude differences in the amount of money needed to do this. It is still a very sizable investment, but it is not the constraining factor."
Taking a full-stack approach to building a quantum computing company on the fly with limited investment also meant creating a supply chain where none existed. The piece many focus on with quantum devices is the cryogenic cooling system, but that is actually not difficult to acquire. The real challenge was integrating from the top down. "We tried to run virtual supply chains for the chip, but it quickly became clear we had to do it all ourselves. Besides, in analyzing our options, we saw that everything we did on the device, including communication, had ripple effects on the hardware and software and how we designed algorithms." He says that they will be able to standardize their processes eventually, but that is still 5-10 years out.
This has all come together in the form of the 19-qubit superconducting processor the company recently unveiled for use on the cloud alongside its Forest software environment, which is still in public beta. "The core of our innovation is the architecture of our integrated circuits. It is scalable, and our design choices allow us to scale the size of our chips consistently over time as well as improve performance and fidelity."
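To make the gate model concrete: the behavior of a small circuit like those running on such a chip can be sketched with a plain state-vector simulation. The snippet below is an illustrative toy, not Rigetti's Forest/Quil API; it applies a Hadamard and a CNOT to two qubits to produce an entangled Bell state, the "hello world" of gate model machines.

```python
import numpy as np

# Toy state-vector simulation of a two-qubit gate model circuit:
# a Hadamard on qubit 0 followed by a CNOT yields a Bell state.
# Illustrative only -- real systems run Quil/assembly on hardware.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                  # start in |00>

state = np.kron(H, I) @ state                   # H on qubit 0
state = CNOT @ state                            # entangle with qubit 1

probs = np.abs(state) ** 2                      # measurement probabilities
print(probs.round(3))                           # [0.5 0.  0.  0.5]
```

Measuring either qubit then yields 0 or 1 with equal probability, but the two outcomes are perfectly correlated—exactly the kind of behavior a classical circuit cannot reproduce gate for gate.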
These cost-cutting strategies and operational efficiencies are all clever, but even the best of them cannot change the fact that the market is still years away from mainstream deployments of quantum systems. While Rigetti is focused on the most approachable quantum algorithms—those that are "hybrid," or a blend of classical and quantum—five years to get into the black and ten years before there is a market large enough to warrant great expansion is still a long time to wait, and for an uncertain future. As we know, a lot can change in three years, never mind ten.
"Most efforts in quantum computing, even among the larger industrial players, are mostly still research efforts funded off balance sheets to see if it makes sense to keep building. We are not a research project; we are bringing this to bear, making it provide value," Thattai tells The Next Platform. "The big question we are all asking is what the path is to show quantum advantage. In other words, when and how can we prove what we are doing is better than classical approaches? That is important in the history of the technology, and we have to show, for a particular vertical and application, how it is better than what we have today."
"We think in the next five years there will be certain verticals and applications, not widespread, that are valuable, where people can use these systems to significant quantum advantage. We'll see gate model chips then with a few hundred qubits, and those will be large enough to perform calculations using these hybrid algorithms. In the next decade we'll be even closer to fault tolerance, and that will unlock new algorithms and performance, and by that point the application space will be widespread enough that multiple industries take part."
"For the longest time in the quantum industry, people have assumed that when you're building a general gate model quantum computer, you need systems with millions of qubits to get the right level of fault tolerance and do anything useful—all to solve quantum algorithms that were designed 20 years ago. And all of that was assumed even as the industry, until recently, struggled to build more than a few qubits," says Thattai. "People thought it would take an IBM or Google to keep plugging away to build bigger systems for this fault tolerance, but the paradigm changed." That paradigm shifted in favor of hybrid models that allow for more productivity out of smaller chips, with greater forgiveness built into experimental algorithms.
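The hybrid paradigm Thattai describes can be sketched in miniature: a classical optimizer tunes the parameters of a small quantum circuit, and the circuit's measured expectation value feeds back into the optimizer. In the toy VQE-style sketch below the quantum step is simulated classically with a state vector—the point is only the division of labor between the classical outer loop and the quantum inner evaluation, not any particular algorithm Rigetti ships.

```python
import numpy as np

# Hybrid quantum-classical loop, VQE style: a classical optimizer tunes a
# circuit parameter theta; the "quantum" expectation value is simulated here
# with a state vector but would be measured on hardware in a real system.

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])                 # observable to minimize

def energy(theta):
    """Expectation <psi|Z|psi> for psi = Ry(theta)|0>; equals cos(theta)."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    return psi @ Z @ psi

# Classical outer loop: gradient descent with a numerical gradient,
# mimicking an optimizer that only sees measured energies.
theta = 0.5
for _ in range(200):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= 0.1 * grad

print(round(energy(theta), 3))           # approaches -1.0, the minimum of <Z>
```

Because the classical side absorbs most of the iteration, the quantum chip only needs to prepare and measure short, shallow circuits—which is why this style of algorithm tolerates the small, noisy devices available today.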
Rigetti thinks this will be fertile ground to prove its quantum algorithms, devices, and delivery model, and to unlock new verticals over time. But back to the original question: when will a startup like this ever be profitable? Even if it is, do the economies of scale across the board (fabs, developers, and so on) at companies like IBM pose an ever-looming threat when progress on device capabilities is tit-for-tat? And further still, there is no consensus yet about whether quantum annealing or gate-based models will win out in certain verticals.
For Rigetti’s part, the question about being in the black in ten years seemed to miss the point. “We will just keep investing to stay ten years ahead of the next curve.”