When Keven Walgamott reached out and grasped his wife’s hand, his face broke into an enormous smile.
For the first time in 14 years, he could feel her soft fingers pressing firmly into his prosthetic palm.
Keven was testing a new interface that seamlessly integrates movement and feeling. The goal? To replace the alien, “other” feeling of a robotic prosthetic with something intuitive and natural—like a part of oneself.
Tapped directly into the remaining nerves of Keven’s lost forearm, the interface translates his thoughts into electrical signals that propel a dexterous robotic hand into motion.
But that’s not all: it also takes information from sensors on the robotic hand and sends it back to the brain through two implanted high-density electrode arrays. There, the electrical pulses are translated into feelings of pressure, vibration, and movement—feelings that seem to come from the patient’s own missing limb.
It’s this closed loop from thought to movement to sensory feedback that makes the device special.
“People often think of touch as a single sense, but it’s actually sub-divided into other senses, such as pressure, vibration, temperature, pain, etc. The high resolution of our device allows us to activate these sub-classes of touch in isolation (i.e., pressure without vibration or pain) in a specific part of the hand,” says Jacob George, who helped develop the interface at the University of Utah, to Singularity Hub.
Unlike previous generations, which only spottily conveyed 20 or so sensations—participants could feel pressure, for example, but little else—the new system restores 100 unique sensations with exquisite resolution.
And this crucially sets the interface apart. George explains, “Our participants have controlled prosthetics with their minds before. They’ve also felt sensations from their missing hand before. But combine both, add a dash of visual feedback, and it’s a completely different experience.”
“Suddenly his hand becomes alive again,” he says, presenting the results at the Society for Neuroscience conference in Washington, DC.
Keven is one of the 1.6 million people in the US suffering from limb loss.
“Losing a limb is not just losing a physical part of yourself. It’s also losing a part of you emotionally,” says George.
Depression and anxiety are common. Some patients even experience stubborn phantom pain—a shooting, burning sensation that seemingly emanates from their amputated limb and is unresponsive to normal painkillers.
“Our participants have described losing their hand as losing a family member, except you’re reminded of it every day,” says George.
It’s this emotional beast that’s exceedingly hard to tame. Part of the reason is that current prostheses just don’t feel right. Scientists have already made strides toward flexible prostheses that restore some sense of normal function. These devices generally listen in on electrical signals coming from muscles above the site of amputation using surface electrodes—a non-invasive approach, but one less able to tease different nerve signals apart.
“Metaphorically, [this is] just as difficult as trying to have a private conversation with someone inside a football stadium if you are shouting from outside,” says George.
More recent prostheses are wired directly to the source: the patient’s motor cortex, or the remaining nerves in the arm. These nerves transmit the brain’s intention to move to the amputated limb, and their signals can be decoded to move the prosthetic hand accordingly.
Compared to the brain, the arm nerves are much safer implant targets. Their signals are also easier to understand—one neuron fires, one muscle twitches. Rather than trying to figure out the cacophonous, probabilistic chatter in the motor cortex, it makes far more sense to tap into its downstream wires.
Yet, despite the jaw-dropping tech that allows patients to control robotic arms with their minds, these devices still have issues.
Without the sensory feedback that we normally get, patients can’t fine-tune their grip. Rather than pinching a grape, they easily end up squeezing it into mush. It’s unintuitive, frustrating, and utterly foreign.
“As a result, nearly 50 percent of amputees abandon their prostheses,” says George. “We want to restore the full experience of using a hand,” he adds.
The Utah Solution
The new system works in two parts: first, it picks up the brain’s movement commands through two microelectrode arrays, each a fraction of the size of a penny, implanted into the patient’s residual nerves—the nerves that normally innervate the hand.
At the same time, a different assembly of electrodes picks up signals from the residual muscles.
“After implantation, we have to create a map of what sensations are possible, and these are patient-specific,” says George.
They meticulously stimulated each of the array’s electrodes—a stunning 192 in total—and asked Keven to report where he felt a sensation and what it felt like.

For example, electrode 64 was associated with a feeling of pressure on the tip of his thumb. The team then linked that electrode to the pressure sensor on the prosthetic’s thumb, so that the hand behaves as a biological one would.

Once the mapping was complete, the entire translation process happened under the hood. Keven could move the prosthetic hand intuitively, as if operating his own.

When the prosthesis makes contact, the system stimulates the corresponding electrode to provide feedback to the patient. For the tip of the thumb, it’s electrode 64. The stronger the pressure, the stronger the zap. In essence, the arrays replace the lost nerve endings that normally innervate the hand.
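The mapping-and-feedback loop described above can be sketched in a few lines of code. Everything here is illustrative: the article only mentions that electrode 64 maps to thumb-tip pressure, so the other electrode numbers, the sensor names, and the linear pressure-to-amplitude scaling are assumptions, not the Utah team’s actual parameters.

```python
# Hypothetical sketch of the closed-loop sensory feedback.
# Only the thumb_tip -> electrode 64 mapping comes from the article;
# the rest of the map and the scaling rule are illustrative assumptions.

# Patient-specific map built during the mapping session:
# prosthetic sensor location -> electrode that evokes a feeling there.
SENSATION_MAP = {
    "thumb_tip": 64,   # reported as pressure on the tip of the thumb
    "index_tip": 12,   # illustrative
    "palm": 101,       # illustrative
}

MAX_STIM_AMPLITUDE = 1.0  # normalized; a real device enforces safe current limits


def feedback_command(sensor_location: str, pressure: float) -> tuple:
    """Translate a prosthetic sensor reading into a stimulation command.

    The stronger the measured pressure, the stronger the stimulation
    delivered through the mapped electrode.
    """
    electrode = SENSATION_MAP[sensor_location]
    # Clamp the reading to [0, 1] before scaling.
    amplitude = min(max(pressure, 0.0), 1.0) * MAX_STIM_AMPLITUDE
    return electrode, amplitude


# A light squeeze on the prosthetic thumb triggers a weak zap on electrode 64.
electrode, amplitude = feedback_command("thumb_tip", 0.3)
print(electrode, amplitude)  # -> 64 0.3
```

In the real system this lookup runs continuously and invisibly, which is what lets the sensation feel like it comes from the hand itself rather than from a machine.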
In this way, when the prosthetic hand’s thumb touches a grape, Keven feels like his missing finger is being touched.
The functional improvements were enormous. In one test, where Keven was asked to move around a “mechanical egg” that senses pressure, he succeeded 21 percent more often without breaking the egg when sensory feedback was turned on.
A Whole Self
But arguably, the biggest benefit is emotional.
In one powerful video demonstration, a former participant was asked to touch a virtual door using a computer-generated prosthetic. The simulation was cartoonish and rough, but it didn’t matter to the patient.
Brushing his avatar hand against the door, he audibly gasped.
“Oh my god. I just felt that door,” he exclaimed breathlessly, “God! That is so cool!”
“It’s not that he felt the sensation on his missing hand; it’s that he felt the door, and it’s suddenly him interacting with the environment around him for the first time in 24 years,” explains George. This sense of embodiment is what was previously missing from prosthetics.
Keven’s experience was no less transformative. After a training session with a 3D-printed prosthetic hand, he was asked what he would like to grasp.
“I want to clasp my hands together,” he said, folding his remaining hand into a loose fist and gently, repeatedly pressing down on the prosthesis, as if massaging his knuckles after a long day.
“The prosthesis was originally just a tool to help him with activities in his daily life, but now…it’s something that is recognized as his own hand,” says George.
“It’s not just about improved dexterity or improved feeling, it’s about feeling whole again,” he adds.
The team is now working on a wireless take-home version of their system for patients to use in their daily lives. Without wires poking out, there should be less chance of infection and breakage. The benefits are also expected to amplify with more practice and continued use.
Having already tested their current interface in seven patients, the team is feeling confident. They say they should have a wireless version ready for human testing in about a year.
“Ultimately, like one of our patients said, it’ll be just like Luke Skywalker. Then everyone will want one,” concludes George with a wide grin.