Memristor Research Highlights Neuromorphic Device Future

Much of the talk around artificial intelligence these days focuses on software efforts – various algorithms and neural networks – and on hardware such as custom ASICs built for those neural networks and chips like GPUs and FPGAs that support reprogrammable systems. A vast array of well-known names in the industry – from Google and Facebook to Nvidia, Intel, IBM and Qualcomm – is pushing hard in this direction, and those and other organizations are making significant gains thanks to new AI methods such as deep learning.

All of this development is happening at a time when the stakes appear higher than ever for future deep learning hardware. One of the forthcoming exascale machines is mandated to sport a novel architecture (although what that means exactly is still up for debate), and companies like Intel are suddenly talking with renewed vigor about their own internal efforts on neuromorphic processors.

The focus on such AI efforts has turned attention away from work that has been underway for years on developing neuromorphic processors – essentially tiny chips that work in a similar fashion to the human brain, complete with technologies that mimic synapses and neurons. As we’ve outlined at The Next Platform, there are myriad projects underway to develop such neuromorphic computing capabilities. IBM, Hewlett Packard Enterprise with its work on memristors, Qualcomm through its Brain Corporation venture, and other tech vendors are pushing in that direction, while national laboratories like Oak Ridge in Tennessee and universities like MIT and Stanford, with its NeuroGrid project, also have efforts underway. Such work also has the backing of federal programs, including DARPA’s SyNAPSE and UPSIDE (Unconventional Processing of Signals for Intelligent Data Exploitation) efforts and the National Science Foundation.

Another institution working on neuromorphic processor technology is the University of Michigan’s Electrical Engineering and Computer Science department, in an effort led by Professor Wei Lu. Lu’s group is focusing on the memristor – a two-terminal device that is essentially a resistor with memory, retaining its stored state even when the power is turned off – which can act like a synapse in computers built to work like the human brain and drive machine learning. We’ve talked about the growing interest in memristors for use in developing computer systems that can mimic the human brain.
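To make the “resistor with memory” idea concrete, here is a minimal sketch in Python of the linear ionic-drift memristor model popularized by HP’s 2008 work – an illustrative textbook model, not Lu’s actual device physics, and every parameter value below is an assumption chosen for readability. The point it demonstrates is that the resistance depends on the history of the applied voltage and stays put once the bias is removed.

```python
import numpy as np

# Illustrative parameters (assumptions, not measured device values).
R_ON, R_OFF = 100.0, 16_000.0   # low/high resistance states, in ohms
MU_V = 1e-14                    # dopant mobility, m^2 s^-1 V^-1
D = 10e-9                       # device thickness, m

def simulate(voltages, dt=1e-3, w0=0.1):
    """Integrate the internal state w (doped-region width) under a voltage waveform."""
    w = w0 * D
    resistances = []
    for v in voltages:
        R = R_ON * (w / D) + R_OFF * (1 - w / D)   # resistance set by the state
        i = v / R                                  # Ohm's law
        w += MU_V * (R_ON / D) * i * dt            # state drifts with current
        w = min(max(w, 0.0), D)                    # clamp to physical bounds
        resistances.append(R)
    return np.array(resistances)

# Drive the device with a positive pulse train, then remove the bias:
# the resistance programmed by the pulses is retained at zero volts,
# which is the "memory" in memristor.
v = np.concatenate([np.full(500, 1.0), np.zeros(500)])
R = simulate(v)
print(f"R after programming: {R[499]:.0f} ohm, R after sitting idle: {R[-1]:.0f} ohm")
```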

Lu’s group created a nanoscale memristor that mimics a synapse by using a mixture of silicon and silver housed between a pair of electrodes. Silver ions in the mixture are controlled by the voltage applied to the memristor, changing its conductance state, similar to how synaptic connections between neurons strengthen and weaken based on when the neurons fire off electrical pulses. (In the human brain, there are about 10 billion neurons, with each connected to other neurons via about 10,000 synapses.)
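That timing-dependent strengthening and weakening is usually formalized as spike-timing-dependent plasticity (STDP), one of the learning rules memristive synapses are typically designed to reproduce. The sketch below is a generic textbook STDP rule in Python, not data from the Ag/Si device; the amplitudes and time constants are illustrative assumptions.

```python
import math

# Illustrative STDP parameters (assumptions, not measured device values).
A_PLUS, A_MINUS = 0.05, 0.05       # maximum weight changes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants, in milliseconds

def stdp_dw(dt_ms):
    """Synaptic weight change for a spike-time difference dt = t_post - t_pre (ms)."""
    if dt_ms > 0:   # presynaptic neuron fired first: potentiate
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    else:           # postsynaptic neuron fired first: depress
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)

for dt in (2, 10, 50, -2, -10, -50):
    print(f"t_post - t_pre = {dt:+4d} ms -> weight change {stdp_dw(dt):+.4f}")
```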

Neuromorphic computing proponents like Lu believe that building such brain-like computers will be key to driving the development of systems that are smaller, faster and more efficient. During a talk last year at the International Conference for Advanced Neurotechnology, Lu acknowledged the accomplishment of Google’s AlphaGo program, but noted that it ran on a system powered by 1,202 CPUs and 176 GPUs. He also pointed out that it was designed for a specific task – to learn and master Go – and that doing so took three weeks of training and some 340 million training steps. Such large compute needs and narrow task orientation are among the weaknesses of driving AI in software, he said. AlphaGo’s win was an example of “brute force” – an inefficient computer using a lot of power (more than the human brain consumes) and designed for a specific job that necessitated a long period of training. He also pointed to IBM’s BlueGene/P supercomputer at Argonne National Lab that was used to simulate a cat’s brain. It used 147,456 CPUs and 144TB of memory to create a simulation that ran 83 times slower than a real cat’s brain.

“Once again, this is because they tried to emulate this system in software,” he said. “We don’t have the efficient hardware to emulate these biological systems. So the idea is that if we have the hardware, then we can also implement some of the rules … or features we learn in biology. Not only will we make computers faster, but you can also use it to link up with biological systems to enhance our brain functions.

“We’re not trying to do it in software. We’re actually trying to build as a fundamental device on hardware … a computer network very similar to the biological neuro-network.”

His group is doing that through the use of memristor “synapses” and CMOS components that work like neurons, built on what Lu described as a “crossbar” electrical circuit. The crossbar network operates in a way comparable to biological systems. One advantage such a system has over traditional computers is the synapse-like way the memristors operate: the same elements store the data and perform the computation, whereas traditional computers are limited by the separation between the CPU and memory, across which data must constantly be shuttled.
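Here is a minimal sketch of why the crossbar layout matters, under idealized assumptions (no wire resistance, no sneak paths, perfectly linear devices): drive input voltages onto the rows, store a conductance at every crosspoint, and the current collected on each column is the dot product of the inputs with that column’s weights. The vector-matrix multiply that dominates neural-network workloads thus happens where the weights are stored, with no trip through a separate memory. All values below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conductance at each crosspoint plays the role of a synaptic weight
# (illustrative values in siemens): 4 input rows x 3 output columns.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))

# Input voltages driven onto the rows (volts).
v = np.array([0.2, 0.0, 0.5, 0.1])

# Ideal crossbar readout: Ohm's law at each crosspoint, Kirchhoff's
# current law summing the currents flowing down each column.
column_currents = v @ G
print("output currents per column (amperes):", column_currents)
```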

Such a change could have a significant impact on a $6 billion memory industry that is looking at what comes after flash, he said. Lu’s team introduced its concept in 2010, and now he is a cofounder of Crossbar ReRAM, a company with $85 million in venture capital backing that was founded that same year and is working to commercialize what the University of Michigan team developed. He said in 2016 that the company already had developed some products for several customers. The company last month announced it is sampling embedded 40nm ReRAM manufactured by Semiconductor Manufacturing International Corp. (SMIC) with plans to come out with a 28nm version in the first half of the year.

2 Comments

  1. I’m not sure that digital circuits are the best way to emulate biological systems. There is a nice story from 1997 (www.newscientist.com/article/mg15621085-000-creatures-from-primordial-silicon-let-darwinism-loose-in-an-electronics-lab-and-just-watch-what-it-creates-a-lean-mean-machine-that-nobody-understands-clive-davidson-reports/) demonstrating that evolution quickly learns to exploit analog properties of digital circuits that are not there by design. I think it would make much more sense to see a resurrection of analog computing, maybe along the lines of what Optalysys is doing with light.

  2. I don’t know when folks will come out in unison and accept that the memristor was a hoax perpetrated by a group at HP. That group was divested last year. Yes, high device density is needed for realizing brain-like computing, but sticking with memristors in the hope of analog retention is a massive waste of time and resources. Until then, digital platforms for deep learning will dominate the space. A hoax can’t save our future!
