You know telekinesis, right – moving objects with your thoughts? Let’s talk techno-kinesis – moving machines with your thoughts. There’s a really great new example out of EPFL’s Center for Neuroprosthetics. Researchers have developed a wheelchair that responds to brain activity as monitored by an EEG cap. Operators simply imagine moving a hand in one of four directions (left, right, back, forward) and the wheelchair moves them as commanded. Pretty cool, right? To make things easier for the user, the mind-controlled system is augmented with artificial intelligence. The AI uses cameras mounted on the wheelchair to guide the device around obstacles as the user thinks of moving in a given direction. Head researcher Jose Millan calls the AI/EEG combination ‘shared control’. Watch him explain the system in the video below. Looks like a pretty smooth ride. This could mean a world of difference to paralyzed people, and it has some big implications for the expansion of techno-kinesis in the future.
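To get a feel for the command side of a system like this, here’s a minimal sketch of how a classified motor-imagery label might be turned into a wheelchair motion command. The label names, speeds, and confidence threshold are illustrative assumptions, not EPFL’s actual protocol:

```python
# Illustrative sketch only: map a classified motor-imagery label to a
# simple (linear m/s, angular rad/s) velocity command. Values are
# hypothetical, not taken from the EPFL system.

COMMANDS = {
    "left":    (0.0, 0.5),    # turn left
    "right":   (0.0, -0.5),   # turn right
    "forward": (0.3, 0.0),    # advance
    "back":    (-0.2, 0.0),   # reverse
}

def command_from_label(label, confidence, threshold=0.7):
    """Return a velocity command, or stop if the classifier is unsure."""
    if confidence < threshold or label not in COMMANDS:
        # Below threshold: hold still rather than act on a guess.
        return (0.0, 0.0)
    return COMMANDS[label]
```

The key design point is the confidence gate: an EEG classifier is noisy, so a real controller would rather stop than execute a low-confidence command.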
Wheelchair guidance is just the lowest-hanging fruit in the orchard of thought-controlled devices. It’s easy to demonstrate how such systems could assist certain groups (quadriplegics or those with locked-in syndrome) in the near future. I can’t watch this video, though, without thinking of some much bigger (and possibly far-fetched) long-term applications. We could control robots, maybe as surrogates for ourselves, in remote locations. In fact, we’ve already seen other EEG systems control bots. Or forget humanoid shapes – we could control cars, automated portions of our homes, or any motorized device with these brain-computer interfaces. Seeing how easily the EPFL wheelchair is used gives me hope that we’re on the path to developing more complex applications of this technology. There’s a lot of potential here, though most of it is many years away.
In terms of the actual wheelchair application, the EPFL system isn’t all that groundbreaking. Regular readers of Singularity Hub will recognize that this isn’t the first wheelchair run via a brain-computer interface. We’ve seen a whole fleet of them. Each, however, takes a slightly different approach to the problem. Braingate puts electrodes directly into the brain and uses motor neuron signals as command inputs. Audeo picks up on signals to muscles in the throat – you think about speaking and the chair moves. The Emotiv Epoc headset can do much the same with facial muscles. The University of Zaragoza system has an EEG cap, but a computer guides the chair to where a user is looking on an image of their surroundings – there are no thought-to-direction commands. EPFL’s approach is the first we’ve seen that uses an EEG cap but still lets users decide how to move as well as where to go.
So the EPFL chair grants a little more control than others, and isn’t as invasive. Points in its favor. The bigger innovation, however, is probably the cooperation between the brain-computer interface and the guidance artificial intelligence. It’s hard to tell from the video how much the AI is controlling the movement of the chair once it gets rolling, but the overall effect looks like a very smooth ride. Negotiating a good relationship between human and AI commands isn’t easy, so if EPFL can really make this ‘shared control’ work, it will be a strong step forward in human-machine interactions. Again, the possible applications extend well beyond wheelchairs. No matter where we look to use brain-computer interfaces, we’re going to rely on AI to take care of parameters that are too complex or too dynamic for our brains to control. Getting that cooperation down now is going to pay big dividends in the future. Millan and his crew will present their findings at the AAAS annual meeting in February of next year. Let’s hope they’ll have moved us even further down the path of techno-kinesis by then.
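One simple way to picture ‘shared control’ is as a weighted blend: the user’s intended turn command and the AI’s obstacle-avoidance command are mixed, with the AI’s weight growing as an obstacle gets closer. This blending rule and its parameters are purely illustrative assumptions, not the published EPFL controller:

```python
# Hedged sketch of one possible 'shared control' scheme: blend the user's
# turn command with an AI avoidance command, giving the AI more authority
# as the nearest obstacle gets closer. Parameters are hypothetical.

def shared_control(user_turn, avoid_turn, obstacle_dist, safe_dist=2.0):
    """Blend user and AI turn commands by obstacle proximity.

    alpha = 0 when the obstacle is at or beyond safe_dist (full user control);
    alpha = 1 when the obstacle is at zero distance (full AI control).
    """
    alpha = max(0.0, min(1.0, 1.0 - obstacle_dist / safe_dist))
    return (1.0 - alpha) * user_turn + alpha * avoid_turn
```

With an obstacle far away the user’s command passes through untouched; right up against an obstacle the AI’s evasive command wins; in between, the two are averaged. The appeal of a scheme like this is that the user never feels overridden until the machine genuinely needs to intervene.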
[image and video credits: EPFL News]
[source: EPFL News]