Robots Will Be Able to Feel Touch With This Artificial Nerve

When the disembodied cockroach leg twitched, Yeongin Kim knew he had finally made it.

A graduate student at Stanford, Kim had been working with an international team of neuroengineers on a crazy project: an artificial nerve that acts like the real thing. Like sensory neurons embedded in our skin, the device—which kind of looks like a bendy Band-Aid—detects touch, processes the information, and sends it off to other nerves.

Yup, even if that downstream nerve is inside a cockroach leg.

Of course, the end goal of the project isn’t to fiddle with bugs for fun. Rather, the artificial nerve could soon provide prosthetics with a whole new set of sensations.

Touch is just the beginning: future versions could include a sense of temperature, feelings of movement, texture, and different types of pressure—everything that helps us navigate the environment.

The artificial nerve processes information fundamentally differently from current computer systems. Rather than dealing in 0s and 1s, the nerve “fires” like its biological counterpart. Because it uses the same language as a biological nerve, the device can communicate directly with the body—whether it be the leg of a cockroach or residual nerve endings from an amputated limb.

But prosthetics are only part of it. The artificial nerve can potentially combine with an artificial “brain”—for example, a neuromorphic computer chip that processes input somewhat like our brains—to interpret its output signals. The result is a simple but powerful multi-sensory artificial nervous system, ready to power our next generation of bio-robots.

“I think that would be really, really interesting,” said materials engineer Dr. Alec Talin at Sandia National Laboratories in California, who was not involved in the work. The team described their device in Science.

Feeling Good

Current prosthetic devices are already pretty remarkable. They can read a user’s brain activity and move accordingly. Some have sensors embedded, allowing the user to receive sparse feelings of touch or pressure. Newer experimental devices even incorporate a bio-hack that gives the wearer a sense of movement and position in space, so that the user can grab a cup of coffee or open a door without having to watch their prosthetic hand.

Yet our natural senses are far more complex, and even state-of-the-art prosthetics can generate a sense of “other,” often resulting in the device being abandoned. Replicating all the sensors in our skin has been a longtime goal of bioengineers, but one that’s hard to achieve without—here’s the kicker—actually replicating how our skin’s sensors work.

Embedded inside a sliver of our skin are thousands of receptors sensitive to pressure, temperature, pain, itchiness, and texture. When activated, these sensors shoot electrical signals down networks of sensory nerves, integrating at “nodes” along the way. Only if the signals are strong enough—if they reach a threshold—does the information get passed on to the next node, and eventually, to the spinal cord and brain for interpretation.

This “integrate-and-fire” mode of neuronal chatter is partly why our sensory system is so effective. It manages to ignore hundreds of insignificant, noisy inputs and only passes on information that is useful. Ask a conventional computer to process all of these data streams in parallel, even one running state-of-the-art deep learning algorithms, and it chokes.
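
To make the idea concrete, here is a minimal Python sketch of an integrate-and-fire unit. The threshold, leak, and input values are illustrative assumptions, not figures from the study; the point is simply that weak, noisy inputs die out while strong ones get passed along.

```python
# A minimal sketch of the "integrate-and-fire" idea described above.
# All numbers (threshold, leak, inputs) are illustrative, not from the paper.

def integrate_and_fire(inputs, threshold=1.0, leak=0.9):
    """Accumulate incoming signals; emit a spike only when the
    running total crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for signal in inputs:
        potential = potential * leak + signal  # leaky accumulation
        if potential >= threshold:
            spikes.append(1)   # fire: pass the information along
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # sub-threshold noise is ignored
    return spikes

# Weak, noisy inputs never fire; a burst of strong inputs does.
print(integrate_and_fire([0.1, 0.1, 0.1, 0.1]))       # [0, 0, 0, 0]
print(integrate_and_fire([0.6, 0.6, 0.1, 0.7, 0.6]))  # [0, 1, 0, 0, 1]
```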

Neuromorphic Code

One thing was clear to Kim and his colleagues: forget computers, it’s time to go neural.

Working with Dr. Zhenan Bao at Stanford University and Dr. Tae-Woo Lee at Seoul National University in South Korea, Kim set his sights on fabricating a flexible organic device that works like an artificial nerve.

The device contains three parts. The first is a series of sensitive touch sensors that can detect the slightest changes in pressure. Touching these sensors produces an electrical voltage, which is then picked up by the next component: a “ring oscillator.” This is just a fancy name for a circuit that transforms voltage into electrical pulses, much like a biological neuron.

The pulses then travel to the third component, a synaptic transistor. That’s the Grand Central Station of the device: it takes in the electrical pulses from all active sensors and integrates the signals. If the input is sufficiently strong, the transistor fires off a chain of electrical pulses of varying frequencies and magnitudes, similar to those produced by biological neurons.
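
Here is a rough Python sketch of that three-stage pipeline (sensor, ring oscillator, synaptic transistor). Every mapping and constant below is an assumption made for illustration; the real device is an analog organic circuit, not software.

```python
# A toy sketch of the three-stage pipeline described above: pressure sensors
# feed ring oscillators, whose pulses converge on a synaptic transistor.
# The mapping functions and constants are assumptions for illustration only.

def sensor_voltage(pressure):
    """Stage 1: a touch sensor converts pressure into a voltage."""
    return min(pressure * 0.5, 5.0)  # saturating response, capped at 5 V

def ring_oscillator(voltage):
    """Stage 2: a ring oscillator turns voltage into a pulse frequency (Hz).
    Higher voltage -> faster pulses, loosely like a biological neuron."""
    return 100.0 * voltage  # assumed linear voltage-to-frequency mapping

def synaptic_transistor(frequencies, threshold=400.0):
    """Stage 3: integrate pulse trains from all active sensors and
    fire an output only if the combined drive crosses a threshold."""
    total = sum(frequencies)
    return total if total >= threshold else 0.0  # sub-threshold input is dropped

# Two sensors pressed at once: their pulse trains integrate and fire.
pressures = [4.0, 6.0]                       # arbitrary pressure readings
freqs = [ring_oscillator(sensor_voltage(p)) for p in pressures]
print(synaptic_transistor(freqs))            # combined output pulse rate: 500.0
```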

In other words, the outputs of the artificial nerve are electrical patterns that the body can understand—the “neural code.”

“The neural code is at the same time rich and efficient, being an optimal choice to design artificial systems for sensing and perception,” explained Dr. Chiara Bartolozzi at the Italian Institute of Technology in Genova, who was not involved in the work.

Neural Magic

In a series of tests, the team proved her right.

In one experiment, they moved a small rod across the pressure sensor in different directions and found that the device could distinguish the direction of each movement and estimate the rod’s speed.
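
One plausible way to recover direction and speed from such a setup, sketched below, is to compare activation times at two sensing points a known distance apart. The spacing and timings here are hypothetical, chosen only to show the logic.

```python
# A simple sketch of how movement direction and speed could fall out of
# two pressure sensors a known distance apart: the order of activation gives
# direction, the time lag gives speed. All numbers are illustrative assumptions.

SENSOR_SPACING_MM = 5.0  # assumed distance between sensing points

def movement(t_sensor_a, t_sensor_b):
    """Infer direction and speed from two activation times (in seconds).
    Assumes the rod actually crosses both sensors at distinct times."""
    dt = t_sensor_b - t_sensor_a
    direction = "A -> B" if dt > 0 else "B -> A"
    speed = SENSOR_SPACING_MM / abs(dt)      # mm per second
    return direction, speed

print(movement(0.00, 0.10))  # ('A -> B', 50.0)
print(movement(0.25, 0.05))  # ('B -> A', 25.0)
```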

Another test showed that a more complicated artificial nerve could differentiate between various Braille letters. The team hooked up two sets of synaptic transistors with oscillators. When the device “felt” the Braille characters, the pressure signals combined, generating a specific electrical output pattern for each letter.
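
As a hypothetical illustration of how that might work, the sketch below encodes a few Braille letters as 2x3 dot grids, drives one channel per dot column, and shows that each letter yields a distinct combined output. The encoding and pulse rates are assumptions, not the paper’s actual circuit values.

```python
# A hypothetical sketch of the Braille test: each letter is a 2x3 dot grid,
# each column of dots drives one oscillator/transistor channel, and the pair
# of channel outputs forms a letter-specific signature.

BRAILLE = {              # dot patterns as (left column, right column) tuples
    "A": ((1, 0, 0), (0, 0, 0)),
    "B": ((1, 1, 0), (0, 0, 0)),
    "C": ((1, 0, 0), (1, 0, 0)),
}

def channel_output(column, rate=100):
    """One channel: each pressed dot adds pulses at an assumed fixed rate."""
    return sum(column) * rate

def signature(letter):
    """Combine the two channel outputs into one pattern per letter."""
    left, right = BRAILLE[letter]
    return (channel_output(left), channel_output(right))

for letter in BRAILLE:
    print(letter, signature(letter))
# A (100, 0), B (200, 0), C (100, 100): each letter maps to a distinct output
```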

“This approach mimics the process of tactile information processing in a biological somatosensory system,” the authors said, adding that raw inputs are partially processed at synapses before being delivered to the brain.

Then there was the cockroach experiment. Here, the team hooked the device up to a single, detached cockroach leg. They then applied pressure to the device in tiny increments; each signal was processed and passed on to the leg through the synaptic transistor. The leg’s nerves accepted the outputs as their own, and the leg twitched more or less vigorously depending on how much pressure was initially applied.
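
A loose Python sketch of that reflex arc: pressure sets the nerve’s output pulse rate, and a simulated “muscle” twitches harder as the rate climbs. Both response curves are invented for illustration, not measured from the cockroach experiment.

```python
# A loose sketch of the hybrid reflex arc: the artificial nerve's output
# pulse rate drives a downstream (here, simulated) muscle, so more pressure
# means a stronger twitch. All constants are illustrative assumptions.

def nerve_output(pressure, gain=50.0, threshold=100.0):
    """Artificial nerve: pressure -> output pulse rate (Hz), with a firing threshold."""
    rate = pressure * gain
    return rate if rate >= threshold else 0.0

def muscle_twitch(pulse_rate, max_force=1.0):
    """Simulated muscle: twitch force grows with pulse rate, then saturates."""
    return max_force * pulse_rate / (pulse_rate + 500.0)

for pressure in [1.0, 2.0, 4.0, 8.0]:        # increasing pressure steps
    force = muscle_twitch(nerve_output(pressure))
    print(f"pressure {pressure:>4}: twitch force {force:.2f}")
```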

The device forms a “hybrid bioelectronics reflex arc,” the authors explained, in that it can control biological muscles. Future artificial nerves could potentially act the same way, giving prosthetics and robots both touch sensations and reflexes.

The work is still in its infancy, but the team has high hopes for their strategy. Because organic electronics like the ones used here are small and cheap to make, bioengineers could potentially pack more sensors into smaller areas. This would allow multiple artificial nerves to transmit a wider array of sensations for future prosthetic wearers, transforming the robotic appendage into something that feels more natural and “self.”

Natural haptic feedback could help users with fine motor control in prosthetic hands, such as gently holding a ripe banana. When embedded in the feet of lower-limb prosthetics, the artificial nerves could help the user walk more naturally because of pressure feedback from the ground.

The team also dreams of covering entire robots with the stretchy device. Tactile information could help robots better interact with objects, or allow surgeons to more precisely control remote surgical robots that require finesse.

And perhaps one day, the artificial nerve could even be combined with a neuromorphic chip, resulting in a simple but powerful multi-sensory artificial nervous system for future robots.

“We take skin for granted but it’s a complex sensing, signaling, and decision-making system,” said study author Dr. Zhenan Bao at Stanford University. “This artificial sensory nerve system is a step toward making skin-like sensory neural networks for all sorts of applications.”

Image Credit: Willyam Bradberry / Shutterstock.com

Shelly Fan
https://neurofantastic.com/
Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the co-founder of Vantastic Media, a media venture that explores science stories through text and video, and runs the award-winning blog NeuroFantastic.com. Her first book, "Will AI Replace Us?" (Thames & Hudson) was published in 2019.