Cathy Hutchinson hasn’t moved her limbs of her own volition for 15 years, but by imagining she was using her own hand, she controlled a robotic arm to pick up a thermos of coffee and take a sip. The technology is a neural interface system called BrainGate2, currently in clinical trials, which connects Cathy’s brain to a robot. The device is the result of more than 10 years of research at Brown University and extends the original BrainGate, which in 2006 allowed patients to control a computer cursor on a screen.
Cathy was one of two patients in the study, recently reported in Nature, who suffer from tetraplegia, a condition in which communication between the brain and the rest of the body is severed, either by a stroke or by damage to the spinal cord. Prof. John Donoghue, principal investigator on the BrainGate project, described their approach to Nature: “Our idea is to bypass that damaged nervous system and go directly from the brain to the outside world, so the brain signals cannot control muscles but machines and devices, like a computer or a robotic limb.” When Cathy controlled the arm with her thoughts to bring the coffee to her mouth, the team was amazed.
Check out the video to see the moment for yourself:
As we previously introduced, BrainGate2 has three components: a sensor, a decoder, and assistive technology. The sensor is an array of 96 hair-thin electrodes, about the size of a children’s aspirin, that is surgically implanted into the motor cortex, the part of the brain that controls body movement. Neural activity is relayed through a gold wire to a computer (the decoder), which interprets the signals and produces commands for the robotic arm. Two robotic arms have been tested in the study: the DEKA Arm System and the heavier DLR Light-Weight Robot III from the German Aerospace Centre.
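To make the sensor → decoder → device pipeline concrete, here is a minimal sketch in Python of how a decoder might turn one time-step of electrode firing rates into a velocity command for a robotic arm. This is purely illustrative: it assumes a simple linear decoder, whereas the actual BrainGate2 decoding algorithms are far more sophisticated, and every name and number below is a hypothetical stand-in rather than anything from the study.

```python
# Illustrative sketch of the sensor -> decoder -> device pipeline.
# The decoder here is a simple linear map from 96 electrode firing rates to a
# 3-D velocity command; the real BrainGate2 decoder is more sophisticated.

NUM_ELECTRODES = 96  # the implanted array carries 96 electrodes


def decode_velocity(firing_rates, weights):
    """Map one time-step of firing rates (length 96) to a (vx, vy, vz) command."""
    assert len(firing_rates) == NUM_ELECTRODES
    return tuple(
        sum(w * r for w, r in zip(axis_weights, firing_rates))
        for axis_weights in weights  # weights: 3 rows of 96 coefficients
    )


# Toy weights: each axis reads a single electrode, so the output is easy to check.
weights = [
    [1.0 if i == 0 else 0.0 for i in range(NUM_ELECTRODES)],  # x reads electrode 0
    [1.0 if i == 1 else 0.0 for i in range(NUM_ELECTRODES)],  # y reads electrode 1
    [0.0] * NUM_ELECTRODES,                                   # z held at zero
]

rates = [0.0] * NUM_ELECTRODES
rates[0], rates[1] = 0.5, -0.25  # pretend firing rates for the demo

print(decode_velocity(rates, weights))  # -> (0.5, -0.25, 0.0)
```

In a real closed-loop system, the weights would be fit during a calibration session in which the patient imagines movements while the decoder observes the neural activity, and the velocity command would be streamed continuously to the arm's controller.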
Cathy has had the BrainGate sensor implanted in her brain for the last five years, having taken part in earlier studies with the system. During testing that took place one year ago, she successfully raised the thermos and drank from it using BrainGate2 in four of six attempts. In another test of the system, the two patients had to reach out and grab a ball within a 30-second window; Cathy had a higher success rate with the DEKA arm (46 percent) than with the DLR arm (21 percent).
Prof. Donoghue explained to Nature that controlling the robotic arm is much more complicated than moving the cursor on a screen in the original BrainGate study: “To move from this type of two-dimensional movement to movements involving reaching out for an object, grasping it and then guiding it in three-dimensional space is a huge step for us. It seems like more than one additional dimension in complexity.” He emphasized that much work remains to improve the speed and accuracy of movement, as well as to refine the decoding algorithms for more complex motions.
The Brown researchers already have plans to make the sensor wireless and to improve the robotic arm to allow for more complicated tasks, such as brushing teeth. In the long term, an alternative approach is being considered in which the signals from the decoder are transmitted to the patient’s muscles, allowing patients to move their own limbs again.
This is a huge stride for the field of brain-computer interfaces, and it will undoubtedly inspire further surgical and nonsurgical approaches. Controlling objects with the mind makes for great science fiction, but people whose spinal cord damage prevents movement are on the cusp of regaining a part of themselves they thought was lost forever. As the programs that translate neural signals into instructions grow more sophisticated, similar technologies will open up even more possibilities for mind-controlled devices.
“All of us were standing in awe, more or less, because we’re watching her drinking the coffee,” Prof. Donoghue commented in the video. “It was really such a stunning scene.”