We want chips that consume less power, and neuromorphic processors have already raised their hands. 

Neuromorphic computing

Neuromorphic computing, while not as well-known as quantum computing, is a field with immense potential as a complement to the traditional computers we are all familiar with. Indeed, not only are some of the world’s most prestigious academic institutions, such as MIT, contributing to its development; Intel, IBM, and HP are three of the firms leading the charge. 

The idea behind neuromorphic computing is to mimic the behaviour of the nervous system in animals in general and the brain in particular. Carver Mead, the American electrical engineer who developed this idea in the 1980s, took as his starting point treating transistors as analogue devices, like those found in nature, rather than as digital switches. 

This approach looked promising because transistor behaviour mirrors the way neurons communicate with one another via electrical impulses (a mechanism mediated by the synapse). Mead’s concept is novel and appealing, but putting it into practice requires a multidisciplinary effort in which physicists, biologists, mathematicians, computer scientists, and microelectronics engineers must work together. 
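To make the spike-based signalling concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the simplest common model of how a neuron accumulates input and fires an impulse. It is an illustrative toy, not Intel's or Mead's actual circuitry, and the threshold and leak values are arbitrary assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative sketch of
# spike-based signalling, not a model of any real neuromorphic chip.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Integrate input current each step; emit a spike (1) when the
    membrane potential crosses the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # spike travels to downstream neurons
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady weak input accumulates until the neuron fires periodically.
print(simulate_lif([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The key property the analogy rests on: information is carried by discrete impulses in time rather than by a continuously sampled digital value, which is why an analogue or event-driven circuit can implement it so cheaply.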

In any event, the ultimate goal of this field, which has advanced dramatically in the last 15 years, is to create electronic systems that can handle data more efficiently. In fact, they aim to be as efficient as a biological brain, a lofty goal that is both intriguing and difficult to achieve. 


An organic brain can do a lot of work with very little energy, and the way it processes information makes it extremely skilled in some situations but inefficient in others. This explains why, while a neuromorphic processor can tackle some tasks faster and with less energy than a traditional computer, it can also be far less efficient in others. 

A neuromorphic system can be up to sixteen times more energy efficient. 

Intel’s Loihi neuromorphic processor has 128 cores and a little over 130,000 artificial neurons, and it was manufactured on a 14 nm process. According to Intel, it was created for research initiatives and has capabilities akin to a miniature brain. 

These specifications are remarkable, but the most striking feature is that each of these artificial neurons can communicate with thousands of its neighbours, forming a complex network that mimics the neural networks of our own brains. This is where Loihi’s strength lies. 

According to Intel, the Kapoho Bay neuromorphic system has two Loihi chips with 262,000 neurons that enable it to recognise movements in real time and read Braille. 

Using this chip as a building block, Intel has gone on to construct more complex neuromorphic systems that integrate numerous Loihi units in order to handle substantially higher workloads and more demanding tasks. 

Pattern recognition, machine learning, selecting the best answer from a large number of possibilities, and developing algorithms that satisfy given requirements are just a few of the tasks at which neuromorphic systems excel. Researchers had previously demonstrated the effectiveness of neuromorphic chips and algorithms on these problems, but it was unclear whether they were also much more efficient in terms of energy usage. 

That is no longer the case. Researchers from Intel and the Institute of Theoretical Computer Science at Graz University of Technology in Austria recently published an article in the journal Nature Machine Intelligence claiming to have experimentally shown that a Nahuku board made up of 32 Loihi chips is up to sixteen times more energy efficient than conventional hardware of comparable power, even though GPUs still deliver higher raw performance.

The catch is that a farm of graphics processors can consume a great deal of energy, so the prospect of accomplishing the same task with up to sixteen times less energy is highly appealing. According to Intel, this is what its neuromorphic systems already offer, and that seems to us a strong reason to keep a careful eye on them.
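To put the sixteen-fold figure in perspective, a quick back-of-the-envelope calculation helps. The 10 kW draw assumed here for a small GPU farm is purely illustrative; it does not come from the article or from Intel.

```python
# Illustrative only: the 10 kW figure for a small GPU farm is an assumption,
# not a number reported by the article or by Intel.
gpu_power_kw = 10.0          # assumed continuous draw of a small GPU farm
efficiency_factor = 16       # claimed advantage of the 32-chip Nahuku board
neuromorphic_power_kw = gpu_power_kw / efficiency_factor
hours_per_year = 24 * 365    # one year of continuous operation
saved_kwh = (gpu_power_kw - neuromorphic_power_kw) * hours_per_year
print(f"{neuromorphic_power_kw:.3f} kW, ~{saved_kwh:.0f} kWh saved per year")
# → 0.625 kW, ~82125 kWh saved per year
```

Even under these made-up numbers, the saving over a year of continuous operation is substantial, which is what makes the efficiency claim so attractive for always-on workloads.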