Patients Control Computer Using Only Their Minds (video)


Patients can control a computer with their thoughts using ECoG electrodes placed directly on the brain.

A temporary surgical implant enabled patients to “talk” to a computer. Just by saying words silently in their heads, they were able to control a cursor on a computer screen. The brain-computer interface (BCI) technology could one day help people who are unable to talk or who have other physical disabilities due to brain injury. Further down the road, it might even be used to read a person’s mind.

Published April 7 in the Journal of Neural Engineering, the study was carried out by scientists at the Center for Innovation in Neurosciences and Technology at Washington University in St. Louis. The team was led by Dr. Eric Leuthardt, a pioneer in the field who previously developed a BCI that enabled people to play video games with their thoughts. In the current study, a net of electrocorticographic (ECoG) electrodes was temporarily placed beneath the dura, a layer of connective tissue surrounding the brain. Rather than performing a craniotomy and placing electrodes on the brain just for an experiment–it might be hard to get approval for that–the researchers worked with electrodes whose original purpose was to map activity in patients with intractable epilepsy so that seizure-generating areas could be surgically removed. As often happens in human brain research, Dr. Leuthardt combined his clinical aims with experimental ones. The ECoG electrodes detect the activity of underlying neurons and transmit the signals to a computer, which then uses them to perform a task. In the current study, the patients’ brain activity was used to control a cursor on a computer screen. Remarkably, the patients learned to control the cursor accurately in as little as four minutes; the slowest took fifteen. The ease with which the patients performed the task is an encouraging sign that the technology could be applied to prosthetics control.

Other researchers have successfully used BCIs to interact with computers. What’s novel about Leuthardt’s study is the region of the brain his team recorded from. Building off work in monkeys, where a mathematical relationship was found between the activity of motor cortex neurons and the movements they produce, early work on neural interfaces for prosthetic control logically focused on using the motor cortex as the source of brain activity. Leuthardt’s group, however, took a different approach. They hypothesized that, instead of imagining an arm movement–from right to left, for example–the patient could control the cursor with sounds, either spoken aloud or imagined.

Instead of recording from the motor cortex, the researchers needed to record from the speech centers of the brain: Wernicke’s area in the temporal lobe and Broca’s area in the frontal lobe. The patients were asked to say or think one of four sounds: oo, ah, ee, and eh. The computer then learned the pattern of brain activity that represented each sound and tied a specific cursor movement to it. When the patient said or thought “ah,” for example, the cursor would move left.
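The decoding scheme described above can be sketched in a few lines of code. This is a hypothetical illustration, not the authors’ actual software: it assumes the ECoG signals have already been reduced to one feature vector per trial (e.g. high-gamma band power per electrode), uses a simple nearest-template classifier, and invents the sound-to-direction mapping for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

SOUNDS = ["oo", "ah", "ee", "eh"]
# Hypothetical mapping from decoded sound to cursor step (dx, dy);
# the study tied movements like "ah" -> left to each sound.
MOVES = {"oo": (0, 1), "ah": (-1, 0), "ee": (1, 0), "eh": (0, -1)}

# Calibration phase: average feature vector (toy synthetic data here)
# recorded while the patient speaks or imagines each sound.
n_channels = 16
templates = {s: rng.normal(loc=i, scale=0.1, size=n_channels)
             for i, s in enumerate(SOUNDS)}

def decode(features):
    """Classify one trial's ECoG features as the nearest sound template."""
    return min(SOUNDS, key=lambda s: np.linalg.norm(features - templates[s]))

# Online phase: a new burst of brain activity moves the cursor.
cursor = [0, 0]
trial = templates["ah"] + rng.normal(scale=0.05, size=n_channels)
sound = decode(trial)
dx, dy = MOVES[sound]
cursor[0] += dx
cursor[1] += dy
```

A real system would recompute features and update the cursor many times per second, and would use a classifier trained on many calibration trials rather than a single template per sound, but the closed loop–record, decode, act–is the same.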

Spoken or imagined sounds generate brain waves, which are recorded by ECoG electrodes and sent to a computer to control the movements of a cursor.

Using the brain’s speech centers instead of the motor area was a major achievement. Human speech has been studied extensively with brain imaging techniques such as positron emission tomography (PET) or functional magnetic resonance imaging (fMRI). Data from these experiments have revealed a great deal about how different parts of the speech network work together to produce and understand language. But prior to Leuthardt’s demonstration it was not known if speech network activity could be used in BCI control.

Will the computer understand us if we simply talk to it? This is important for neuroprosthetic devices of the future as it expands the repertoire of brain function that clinicians can potentially use to control a robotic limb.

Another way to phrase the above question: can the computer read our minds? Amazingly, the answer seems to be yes. But simple oos and ahs are one thing, articulated thoughts are quite another. When we talk–either to each other or internally to ourselves–our thoughts aren’t limited to the words we’re using. Our brain relates to the words in intuitive ways, as in all of the imagery and associations that pop up in our heads when we hear a simple word like “ninja.” BCIs are a long way off from extracting the tremendously more complex idea of ninja our brain conjures up, but understanding overt statements from the brain is a step in that direction. It’s fun to think that this technology might be used someday to record our thoughts in the same way tape recorders are used. Brain implants could enable us to “jot down” lecture notes in our thoughts and retrieve them from the computer later. You’ll definitely want to keep those notes heavily guarded, lest someone hacks in and realizes that your mind kept wandering to the cute girl in the row next to you.

The video below from Russia TV Today is a great summary of the state of BCI technology today. Instead of using surgically-implanted ECoG electrodes, the Russian scientists in the video use a much more user-friendly “shower cap” of EEG electrodes that can read brain waves from outside the head. The video nicely illustrates the technology, including the difficulties of calibrating BCIs. Check it out as users solve puzzles, drive a remote controlled car, and move a ball across the floor using only their thoughts.

Computers are already being used to read our minds–and companies are cashing in on the data. Neuromarketing was born when a neuroscientist performed the Pepsi Challenge while scanning people’s brain activity with fMRI. The study showed that a part of the brain called the medial prefrontal cortex (MPC) lights up when people really like a product. In blind tastings, Pepsi beat Coke, and when people drank Pepsi the MPC lit up. Why, then, if more people prefer Pepsi, does Coke dominate the market? The answer came when the researchers uncovered the labels. Once people knew what they were drinking, the MPC lit up for Coke, not Pepsi. The conclusion was that Coke’s advertising was far more effective than Pepsi’s: even though people preferred the taste of Pepsi, they thought they preferred Coke. Lighting up the MPC meant a refreshed and satisfied Coke drinker. Thus, a cottage industry was born. Companies began putting people in MRI machines, testing their slogans and ad campaigns, and watching to see if the MPC lit up. If it did, it meant the consumer was thinking, “I need that pair of shoes.”

The potential of combining mind and machine is limitless. The two are being brought ever closer as developments in BCI technology proceed in parallel with our increasing understanding of how the brain works. The future of BCIs will take us in even more exciting and unpredictable directions. Whether it improves the lives of disabled people, enhances our use of information, makes video games more fun, or makes companies money, only time will tell. Eventually, I have no doubt, it will be all of the above and more.

[image credits: Journal of Neural Engineering]

video: Russia TV Today
