In a recent study published in Nature, an international research team has created a computer chip, called the NeuRRAM neuromorphic chip, that performs computational tasks directly in memory and is capable of running a multitude of artificial intelligence (AI) applications, all while using only a fraction of the energy traditionally consumed by AI computing platforms. This research holds the potential to let AI operate on a broad range of edge devices, disconnected from the cloud and without the need for a network connection to a centralized server.
"The conventional wisdom is that the higher efficiency of compute-in-memory is at the cost of versatility, but our NeuRRAM chip obtains efficiency while not sacrificing versatility," said Dr. Weier Wan, lead author of the study and a recent Ph.D. graduate of Stanford University who worked on the chip while at UC San Diego.
The NeuRRAM chip boasts twice the energy efficiency of traditional state-of-the-art "compute-in-memory" chips while delivering equally accurate results. It is also far less bulky and constrained than conventional AI platforms, and its greater versatility means it can be used for several different applications, including image recognition, image reconstruction, and voice recognition.
Because of its lower power consumption, the NeuRRAM chip could lead to more robust, smarter, and more accessible edge devices, as well as smarter manufacturing. Removing the need for the cloud could also improve data privacy.
"Compute-in-memory has been common practice in neuromorphic engineering since it was introduced more than 30 years ago," said Dr. Gert Cauwenberghs, who co-advised Wan in the Department of Bioengineering at UC San Diego and is a co-author on the study. "What is new with NeuRRAM is that the extreme efficiency now goes together with great flexibility for diverse AI applications with almost no loss in accuracy over standard digital general-purpose compute platforms."
As always, keep doing science & keep looking up!