What supercomputers will look like in the future, post-Moore’s Law, is still a bit hazy. As exascale computing comes into focus over the next several years, system vendors, universities and government agencies are all trying to gauge what will come after that. Moore’s Law, which has driven the development of computing systems for more than five decades, is coming to an end as making smaller chips loaded with more and more features becomes increasingly difficult.
While the rise of accelerators, like GPUs, FPGAs and customized ASICs, silicon photonics and faster interconnects will help drive performance to meet many of the demands of such emerging applications as artificial intelligence and machine learning, data analytics, autonomous vehicles and the Internet of Things, down the road new computing paradigms will have to be developed to address future workload challenges. Quantum computing is among the possibilities being developed as a possible solution as vendors look to map out their pathways into the future.
Intel, which more successfully than any other chip maker has driven Moore’s Law forward, is now turning some of its attention to the next step in computing. CEO Brian Krzanich last week during the company’s investor event said Intel is investing a lot of time, effort and money in both quantum computing and neuromorphic computing – developing systems that can mimic the human brain – and Mark Seager, Intel Fellow and CTO for the HPC ecosystem in the chip maker’s Scalable Datacenter Solutions Group, told The Next Platform that “at Intel, we are serious about other aspects of AI like cognitive computing and neuromorphic computing. … Our way of thinking about AI is more broad than just machine learning and deep learning, but having said that, the question is how the technologies required for these workloads are converging with HPC.”
Quantum computing has been talked about for decades, and there have been projects pushing the idea for nearly as long. It holds out the promise of systems that are dramatically faster than current supercomputers on certain classes of problems. At the core of quantum computers are qubits, which are to quantum systems what bits are to traditional computers.
IBM last year made its quantum computing capabilities available on the IBM Cloud to give the public access to the technology and to drive innovation and new applications for it. Big Blue has been working on quantum computing technology for more than three decades. D-Wave currently is the only company to offer commercial quantum computing systems, and last month introduced its latest version, the D-Wave 2000Q, which has 2,000 qubits – twice the number of its predecessor – and has its first customer in Temporal Defense Systems, which will use the system to address cybersecurity threats. The systems are expensive – reportedly in the $15 million range – and the number of applications that can run on them is small, though D-Wave officials told The Next Platform that the number of applications will grow over the next decade and that the company is working to encourage that growth.
Other organizations also are pushing to expand the capabilities of quantum computing. Researchers led by Prof. Winfried Hensinger, head of the Ion Quantum Technology Group at the University of Sussex in England, this month unveiled a blueprint for building a modular, large-scale and highly scalable quantum computer, and plan to build a prototype of the system at the university. The modular model and a unique way of moving qubits between the modules are at the center of what the researchers – who also come from the United States, Denmark, Japan and Germany – are developing. Qubits take advantage of what quantum mechanics calls “superposition” – the ability to hold the values 1 and 0 at the same time. That ability fuels much of the promise that quantum computers will be significantly faster than conventional systems.
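To make the idea of superposition concrete, the sketch below classically simulates a single qubit as a pair of amplitudes for the basis states |0⟩ and |1⟩. Applying a Hadamard gate to |0⟩ yields an equal superposition, and squaring the amplitudes (the Born rule) gives the measurement probabilities. This is a toy illustration only, not code for any real quantum hardware or vendor API.

```python
import math

# A classical bit is 0 or 1; a qubit's state is a pair of amplitudes
# (a0, a1) for the basis states |0> and |1>, with a0^2 + a1^2 = 1.
# This is a hand-rolled classical simulation, purely illustrative.

def hadamard(a0, a1):
    """Apply a Hadamard gate, which maps |0> into an equal superposition."""
    s = 1 / math.sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

a0, a1 = hadamard(1.0, 0.0)        # start in the definite state |0>
p0, p1 = a0 ** 2, a1 ** 2          # Born rule: measurement probabilities

print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- 0 and 1 are equally likely
```

Until the qubit is measured, both outcomes coexist in the state; it is this parallelism, scaled across many entangled qubits, that underlies the speedups quantum computing promises.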
“Quantum physics is a very strange theory, predicting things like an atom being in two different places at the same time; we’re harnessing these very strange effects in order to build a new type of computer,” Hensinger said. “These quantum computers will change all of our lives, revolutionizing science, medicine and commerce.”
The computer will be built from modules, each containing an electronics layer, a cooling layer using liquid nitrogen, and piezo actuators. Each module will be lowered into a steel frame, and the modules will be linked by electric fields that transmit ions from one module to the next. It is a departure from the fiber optic technologies many scientists are advocating for in quantum computers.
The researchers in Sussex argue that using electric fields to transport the charged atoms will offer connection speeds between the modules that are 100,000 times faster than current fiber technologies and, according to Hensinger, “will allow us to build a quantum computer of any size [and] allow us to achieve phenomenal processing powers.” Each module will hold about 2,500 qubits, enabling a complete system that can contain 2 billion or more qubits.
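A quick back-of-the-envelope check shows the physical scale those figures imply. The numbers below come straight from the article (about 2,500 qubits per module, a 2-billion-qubit target); the calculation itself is just arithmetic, not part of the Sussex blueprint.

```python
# Rough scale implied by the Sussex blueprint figures cited above:
# ~2,500 qubits per module, targeting 2 billion qubits in total.
qubits_per_module = 2_500
target_qubits = 2_000_000_000

modules_needed = target_qubits // qubits_per_module
print(modules_needed)  # 800000 -- on the order of 800,000 modules
```

In other words, reaching the 2-billion-qubit mark would take roughly 800,000 modules, which is why the fast inter-module connectivity and the modular steel-frame design are central to the proposal.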
The blueprint and prototype will be the latest step in what is sure to be an ongoing debate about what quantum computers will look like. However, creating a modular system that can scale quickly and offers very fast connectivity will help drive the discussion forward. Hensinger and his colleagues are making the blueprint public in hopes that other scientists will take in what they’re developing and build on it.