If there is any organization on the planet that has had a closer view of the coming demise of Moore’s Law, it is the Institute of Electrical and Electronics Engineers (IEEE). Since its inception in the 1960s, its wide range of industry professionals has been able to trace a steady trajectory for semiconductors, but given the limitations ahead, it is time to look to a new path—or several forks, to be more accurate.
This realization about the state of computing for the next decade and beyond has spurred action from a subgroup called “Rebooting Computing,” led by Georgia Tech professor Tom Conte and superconducting electronics researcher Elie Track. The group produces reports based on invite-only deep dives into a wide range of post-Moore’s Law technologies, many of which were cited here this week via Europe’s effort to pinpoint future post-exascale architectures. The Rebooting Computing effort is opening its doors next week for a wider-reaching, open forum in San Diego to bring together new ideas in novel architectures and modes of computing, as well as on the applications and algorithm development fronts.
According to Track, co-chair of the Rebooting Computing effort and a former Yale physicist who has turned his superconducting circuits work toward high-efficiency solar cells in his role at startup Nvizix, Moore’s Law is unquestionably dead. “There is no known technology that can keep packing more density and features into a given space, and further, the real issue is power dissipation. We just cannot keep reducing things further; a fresh perspective is needed.” The problem with gaining that perspective, however, is that for now it means taking a broad, sweeping look across many emerging areas, from quantum and neuromorphic devices to approximate computing and a wide range of other technologies. “It might seem frustrating that this is general, but there is no clear way forward yet. What we all agree on is that we need exponential growth in computing engines.”
“This end of CMOS scaling, coupled with the explosion of big data and the need for more compute power for both civilian and military applications, requires the kind of exponential growth we have enjoyed for so many years. But that is the challenge; this is a crisis. This is an inflection point. Incremental improvements are not the solution, but the improvements we need might come from other areas in computing.”
The IEEE Rebooting Computing initiative has three main pillars, as Track explains: new technology approaches that emphasize energy efficiency, security, and the human-application interfaces that make this all possible. Beneath these sits an “engine room” of a host of non-traditional architectures that break beyond existing silicon barriers.
It is worth taking note of which topics generate the most heat in the IEEE Rebooting Computing initiative, because this is where government will take at least some of its cues for future investment. We pressed Track to detail which of the many architectures and approaches being presented seem to have the most traction. While the augmentation of continued CMOS (pushing Moore’s Law to the very end) is always a hot topic (especially in the Department of Energy, with its exascale-class supercomputers still dependent on traditional scaling), the most controversy and diversity of ideas is happening around neuromorphic computing. “There are many ideas here; a much higher density that stretches across both devices and architectures.” Other topics with deeper pools of research to draw on are approximate computing and, on the software side, bolstering the human-computer interface.
Despite the lack of direction or emphasis on one “saving grace” technology among the many being explored, Track says the reboot work is valuable for government agencies as they decide where to invest. Among the many agencies downwind of the most recent presidential green light, the National Strategic Computing Initiative, is IARPA, which Track says is doing some of the most bleeding-edge, practical work to push new silicon alternatives from concept to reality. Programs at IARPA that explore novel approaches to computing run between three and five years. Track was associated with the C3 superconducting electronics group, which we will cover in a future article.
While IARPA is focused on exploring some entirely new modes of computing, other agencies are forced to stay on the current CMOS track for practical reasons, including the need to keep scientific progress humming along via the same software frameworks. This is the case for the Department of Energy (DoE), which will, at this point, continue to follow traditional system design choices into the exascale era, based for the most part on large-scale, heterogeneous platforms, all of which push closer to the 20-megawatt power ceiling proposed by exascale decision-makers in the government and DoE. That said, the DoE also has mission-critical workloads it wants to bolster using novel architectures, and given its emphasis on systems that go beyond HPC (embedded, for instance), its research efforts are more wide-ranging.
DARPA has also done pioneering work on novel approaches to computing, most notably with its efforts to support neuromorphic computing, which is on the radar for the IEEE Rebooting Computing program. The National Science Foundation (NSF) also funds various initiatives and efforts that serve the goals put forth in NSCI. Despite these various efforts, Track says it is critical for the NSCI program to start to center on new programs focused on a few of the technologies that will be presented this week and in the group’s reports. “Everyone is waiting on the next election to see what programs are defined and how the funding will work. There is a need for better-defined programs, but we are still watching this take shape.”
Rebooting computing as we know it is a strategic imperative for both research and enterprise reasons, but it will be a long road ahead, especially for those who cannot invest in untested technologies in the next decade. However, as Track notes, helping the government and broader industry understand well in advance which new approaches offer the most scalable, power-efficient, and mass-producible results is critical.