Is Neuromorphic technology ready to take the next step?

Note: This blog was published via Tractica in 2019.

Neuromorphic technology has been around since the 1990s in some form or other. Early versions of neuromorphic chipsets include SpiNNaker from the University of Manchester and IBM’s TrueNorth; both won several accolades from the research community and were primarily R&D projects. A recent chip from Intel called Loihi is yet another attempt by a prominent chipmaker to revive interest in neuromorphic technology. With recent developments in software tools, the technology has made significant strides and is slowly starting to reach the maturity needed for commercial deployment.

Neuromorphic computing differs from today’s mainstream von Neumann paradigm. The von Neumann architecture separates compute and storage: data is stored in RAM, shuffled to a compute unit for operations such as convolutions, and the results are written back to RAM. The cost of moving data back and forth consumes substantial power and limits how far power consumption can be reduced. Neuromorphic computing instead tries to mimic the way the brain works, where compute and storage are co-located. One benefit of this co-location is that it cuts the cost of transferring data back and forth, yielding significant power savings.
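The data-movement argument can be made concrete with a rough back-of-envelope comparison. The energy figures below are commonly cited approximations for an older (roughly 45 nm) CMOS process and are assumptions for illustration only; exact numbers vary widely by process and design.

```python
# Illustrative per-operation energy costs (picojoules). These are
# approximate, commonly cited figures, not measurements -- the point
# is the ratio: a DRAM access costs orders of magnitude more energy
# than the arithmetic it feeds.
ENERGY_PJ = {
    "32-bit float multiply": 3.7,
    "32-bit on-chip SRAM read": 5.0,
    "32-bit off-chip DRAM read": 640.0,
}

multiply = ENERGY_PJ["32-bit float multiply"]
for op, pj in ENERGY_PJ.items():
    print(f"{op}: {pj:>6.1f} pJ ({pj / multiply:.0f}x a multiply)")
```

On these assumed numbers, fetching an operand from DRAM costs over 100x the multiply itself, which is why co-locating compute and storage can translate directly into power savings.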

Neuromorphic technology also uses what are called ‘spikes’ to compute. Today’s neural networks work on a frame-by-frame basis: an image is fed to a compute element and the output is the value of that compute operation, repeated until meaningful results are produced. Neuromorphic technology instead looks at the difference between two frames (the ‘spikes’). These spikes are fed to a compute element that itself outputs spikes, and the operation is repeated until meaningful results emerge. This fundamentally changes the way computing is done and requires a different kind of network called a Spiking Neural Network (SNN).
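To make the spiking idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, a common building block of SNNs. The threshold and leak constants are arbitrary illustrative values, not taken from any particular chip.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential
# leaks toward zero each step, integrates the input current, and
# emits a spike (1) when it crosses the threshold, then resets.
def simulate_lif(input_current, threshold=1.0, leak=0.9):
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A steady input of 0.3 produces a periodic spike train;
# zero input produces no spikes at all.
print(simulate_lif([0.3] * 10))  # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
print(simulate_lif([0.0] * 10))  # -> [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
```

Note how the neuron is event-driven: output activity occurs only when inputs accumulate past the threshold, which is the basis for the power savings claimed for spiking hardware.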

Neuromorphic computing thus boils down to a new hardware and software paradigm. In the last couple of years there have been many new developments in this field, driven by AI. Several neuromorphic chipset companies have entered the market and successfully raised capital, including Rain Neuromorphics (a memristor-based approach), Prophesee (a spiking sensor), and BrainChip (SNNs). Intel’s Loihi is another endorsement of the technology from a large corporate entity.

Neuromorphic chips can be fabricated using existing CMOS processes and rely on traditional semiconductor technologies. Software, however, is where the technology has fallen short of commercial viability in the past. Applied Brain Research, a Canada-based company specializing in neuromorphic software, has developed a software tool called Nengo and has published very promising results in recent years. The company’s paper demonstrates how Loihi can be used to recognize wake words at very low power consumption and provides a framework for mapping neural networks onto neuromorphic chipsets, with trade-offs in accuracy. The company has made its tools freely available for download to anyone interested in trying out the technology.

Neuromorphic technology has some prominent critics, though. Yann LeCun, one of the fathers of modern AI, has been vocal in criticizing the neuromorphic approach, and Intel has openly responded to the criticism. According to LeCun, neuromorphic compute does not scale well: accuracy drops as the size of the network increases. He believes the results are not practical for commercial deployment of larger networks.

Nevertheless, neuromorphic computing remains an active area of academic research, and efforts are ongoing to overcome its limitations. The start-ups that have raised capital plan to go after the edge market, worth $55 billion by 2025 by our own estimates. This presents a large opportunity for AI in a wide range of battery-powered consumer devices. Applications such as wake-word recognition are on the rise, and Alexa-like devices are starting to dominate households. A small SNN running on a neuromorphic chip that delivers a 10x power benefit over traditional compute might be just enough to win designs for neuromorphic companies and jumpstart the ramp-up.