Intel Jumps Into Brain-Like Computing With New Self-Learning Chip

The brain has long inspired the design of computers and their software. Now Intel has become the latest tech company to decide that mimicking the brain’s hardware could be the next stage in the evolution of computing.

On Monday the company unveiled an experimental “neuromorphic” chip called Loihi. Neuromorphic chips are microprocessors whose architecture is designed to mimic the biological brain’s network of neurons and the connections between them, called synapses.

While neural networks—the in vogue approach to artificial intelligence and machine learning—are also inspired by the brain and use layers of virtual neurons, they are still implemented on conventional silicon hardware such as CPUs and GPUs.

The main benefit of mimicking the architecture of the brain on a physical chip, say neuromorphic computing’s proponents, is energy efficiency—the human brain runs on roughly 20 watts. The “neurons” in neuromorphic chips act as both processor and memory, removing the need to shuttle data back and forth between separate units, as conventional chips must. Each neuron also only needs to be powered while it’s firing.
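To make the event-driven idea concrete, here is a minimal sketch in Python of a leaky integrate-and-fire neuron, a common abstraction in neuromorphic research. It is purely illustrative and is not Loihi’s actual circuitry or programming interface; the point is simply that each neuron keeps its own state and only produces output when that state crosses a firing threshold.

```python
# Illustrative only: a tiny leaky integrate-and-fire (LIF) neuron loop.
# This is NOT Loihi's hardware or API, just a sketch of event-driven
# computation: the neuron holds its own state (memory) and only "fires"
# when its membrane potential crosses a threshold.

def simulate_lif(input_spikes, weight=0.6, leak=0.9, threshold=1.0):
    """Run one neuron over a list of binary input spikes."""
    potential = 0.0          # state lives with the neuron, not in separate memory
    output_spikes = []
    for spike in input_spikes:
        potential = potential * leak + weight * spike   # integrate and leak
        if potential >= threshold:                      # event: neuron fires
            output_spikes.append(1)
            potential = 0.0                             # reset after firing
        else:
            output_spikes.append(0)                     # no event, no downstream work
    return output_spikes

print(simulate_lif([1, 0, 1, 1, 0, 0, 1, 1]))
```

In a sketch like this, downstream neurons only have work to do when a spike arrives, which is the intuition behind the energy-efficiency argument.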

At present, most machine learning is done in data centers due to the massive energy and computing requirements. Creating chips that capture some of nature’s efficiency could allow AI to be run directly on devices like smartphones, cars, and robots.

This is exactly the kind of application Michael Mayberry, managing director of Intel’s research arm, touts in a blog post announcing Loihi. He talks about CCTV cameras that can run image recognition to identify missing persons or traffic lights that can track traffic flow to optimize timing and keep vehicles moving.

There’s still a long way to go before that happens though. According to Wired, so far Intel has only been working with prototypes, and the first full-size version of the chip won’t be built until November.

Once complete, it will feature 130,000 neurons and 130 million synaptic connections split between 128 computing cores. The device will be 1,000 times more energy-efficient than standard approaches, according to Mayberry, but more impressive is the claim that the chip will be capable of continuous learning.

Intel’s newly launched self-learning neuromorphic chip.

Normally, deep learning works by training a neural network on giant datasets to create a model that can then be applied to new data. Loihi will combine training and inference on the same silicon, allowing it to learn on the fly, constantly updating its models and adapting to changing circumstances without having to be deliberately re-trained.
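As a rough software analogy (and only an analogy: Loihi’s on-chip learning rules are spike-based, not a perceptron), the difference looks like this in Python. Instead of freezing a model after an offline training phase, an online learner updates its weights immediately after each example it sees.

```python
# Illustrative only: contrasting "train offline, then deploy a frozen model"
# with on-the-fly (online) updating, the kind of continuous learning Intel
# describes for Loihi. The update rule below is a plain online perceptron,
# not Loihi's actual spike-based learning mechanism.

def predict(weights, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else 0

def online_step(weights, x, label, lr=0.1):
    """Update the model immediately after seeing a single example."""
    error = label - predict(weights, x)
    return [w + lr * error * xi for w, xi in zip(weights, x)]

# The model keeps adapting as the data stream arrives; there is no
# separate re-training phase.
weights = [0.0, 0.0]
stream = [([1, 0], 1), ([0, 1], 0), ([1, 1], 1), ([0, 1], 0)]
for x, label in stream:
    weights = online_step(weights, x, label)
print(weights)
```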

A select group of universities and research institutions will be the first to get their hands on the new chip in the first half of 2018, but Mayberry said it could be years before it’s commercially available. Whether commercialization happens at all may largely depend on whether early adopters can get the hardware to solve any practically useful problems.

So far neuromorphic computing has struggled to gain traction outside the research community. IBM released a neuromorphic chip called TrueNorth in 2014, but the device has yet to showcase any commercially useful applications.

Lee Gomes gives an excellent summary of the hurdles facing neuromorphic computing in IEEE Spectrum. One is that deep learning can run on very simple, low-precision hardware that can be optimized to use very little power, which suggests complicated new architectures may struggle to find purchase.

It’s also not easy to transfer deep learning approaches developed on conventional chips over to neuromorphic hardware, and even Intel Labs chief scientist Narayan Srinivasa admitted to Forbes that Loihi wouldn’t work well with some deep learning models.

Finally, there’s considerable competition in the quest to develop new computer architectures specialized for machine learning. GPU vendors Nvidia and AMD have pivoted to take advantage of this newfound market, and companies like Google and Microsoft are developing their own in-house solutions.

Intel, for its part, isn’t putting all its eggs in one basket. Last year it bought two companies building specialized machine learning chips, Movidius and Nervana, and followed up with the $15 billion purchase of Mobileye, which makes chips and cameras for self-driving cars.

And while the jury is still out on neuromorphic computing, it makes sense for a company eager to position itself as the AI chipmaker of the future to have its fingers in as many pies as possible. A growing number of voices suggest that, despite its undoubted power, deep learning alone will not allow us to imbue machines with the kind of adaptable, general intelligence humans possess.

It’s hard to predict which new approaches will get us there, but it’s entirely possible they will only work on hardware that closely mimics the one device we already know is capable of supporting this kind of intelligence: the human brain.

Image Credit: Intel

Edd Gent (http://www.eddgent.com/)
Edd is a freelance science and technology writer based in Bangalore, India. His main areas of interest are engineering, computing, and biology, with a particular focus on the intersections between the three.