Scientists at the University of Malta think touch screens are for suckers. Mind-controlled devices? Now, that’s where it’s at. Outfitted in an electrode-studded cap, users of the group’s specially designed music software are able to play a song, fast forward tracks, and adjust the volume by merely looking at the screen.

While we don’t yet understand our brains in great detail, we do get the broad strokes. The brain uses a combination of electrical and chemical signals to “compute,” and repetitive thoughts equate to repeating electrical brain patterns.

By feeding test subjects controlled stimuli and recording the resulting brain patterns, we can later reverse the process. Even if we can't observe what a subject is doing, when we see a familiar pattern, we can infer they're seeing, hearing, or even thinking the same thing they were in the original experiment.

The University of Malta researchers, for example, observed the electrical patterns made by subjects’ brains when they looked at flickering boxes on a screen. As the flickering’s frequency varied, so too did each brain’s electrical patterns.

The group recorded the various brain patterns, assigned each frequency an action (play/pause, fast forward, volume), and coded the software to take certain actions when particular patterns were detected. Voilà—mind-controlled Spotify. (Or...Spock-ify?)
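The idea can be sketched in a few lines. This is a toy illustration, not the Malta group's actual software: each on-screen box flickers at its own frequency, staring at one makes the EEG oscillate at roughly that frequency, and a simple FFT peak-pick maps the dominant frequency to a player action. The sampling rate, frequencies, and action table are all hypothetical.

```python
import numpy as np

FS = 250  # sampling rate in Hz (hypothetical headset spec)
ACTIONS = {8.0: "play/pause", 10.0: "fast forward", 12.0: "volume"}

def dominant_frequency(eeg, fs=FS):
    """Return the strongest frequency component in an EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg - eeg.mean()))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return freqs[spectrum.argmax()]

def decode_action(eeg, tolerance=0.5):
    """Map the dominant EEG frequency to the nearest known flicker frequency."""
    f = dominant_frequency(eeg)
    nearest = min(ACTIONS, key=lambda target: abs(target - f))
    return ACTIONS[nearest] if abs(nearest - f) <= tolerance else None

# Simulate two seconds of noisy EEG from a subject staring at the 10 Hz box.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1.0 / FS)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.3 * rng.standard_normal(len(t))
print(decode_action(eeg))  # → fast forward
```

Real systems are messier (artifact rejection, harmonics, per-user calibration), but this is the core loop: detect a known frequency signature, fire the action assigned to it.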

Similar principles have been used elsewhere to reconstruct images viewed by a person from their neural signals alone. And reportedly, electronics giant Samsung is (very) preliminarily investigating mind-controlled smartphones as the next step beyond touch, gesture, and eye-controlled interfaces.

Rajesh Rao (left) briefly takes control of Andrea Stocco's brain (right).

Last year, in one of the most jaw-dropping uses of the technology, University of Washington researchers demonstrated how one scientist, Rajesh Rao, could remotely take control of his colleague Andrea Stocco’s brain.

The two sat in separate rooms watching a simple target practice video game.

Outfitted with an electrode-studded cap, Rao played the game. Instead of clicking the mouse to fire, he thought about moving his finger without actually moving it. The machine read and recorded his brain’s electrical impulses and sent the signal to a room across campus.

In the other room, Stocco was hooked up to a transcranial magnetic stimulation (TMS) coil, a fancy name for a device that uses magnetic pulses to induce small electrical currents in the brain. Each time Rao thought about firing, Stocco's finger twitched, hit the space bar, and fired.
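The relay above reduces to a small detect-and-trigger loop. The sketch below is purely illustrative (the threshold, the band-power detector, and the function names are assumptions, not the UW team's pipeline): imagining a movement raises power in certain EEG bands, and when a window's power crosses a threshold, the sender side forwards a "fire" trigger to the receiver side, which stands in for the TMS coil.

```python
import numpy as np

THRESHOLD = 2.0  # hypothetical band-power threshold for a "fire" intent

def band_power(eeg):
    """Mean squared amplitude of an EEG window: a crude power estimate."""
    return float(np.mean(np.square(eeg)))

def detect_fire(eeg):
    """Sender side: did the player imagine moving his finger?"""
    return band_power(eeg) > THRESHOLD

def deliver_pulse(intent):
    """Receiver side: stand-in for the TMS coil triggering the finger twitch."""
    return "space bar pressed" if intent else "no action"

rest = 0.5 * np.ones(100)     # quiet baseline window
imagery = 2.0 * np.ones(100)  # elevated activity during motor imagery
print(deliver_pulse(detect_fire(rest)))     # → no action
print(deliver_pulse(detect_fire(imagery)))  # → space bar pressed
```

In the actual experiment the trigger traveled over the network between two buildings, and the "receiver" was a magnetic pulse to Stocco's motor cortex rather than a function call, but the control flow is the same.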

Before getting too carried away, however, it’s important to note we’re talking about some very simple stuff here. The brain-reading apparatus is cumbersome (if you think Google Glass unsightly, imagine Sergey Brin in an EEG cap), and the readings are still fairly low resolution. Greater control would require more detailed readings.

The larger concept, however, is viable. And for folks who've lost the ability to physically control their environment—quadriplegics or sufferers of ALS (Stephen Hawking, for example) and locked-in syndrome—such methods might offer a non-invasive way to regain some sense of control, freedom, and easier communication with the world.

Image Credit: University of Malta; University of Washington

Jason is managing editor of Singularity Hub. He cut his teeth doing research and writing about finance and economics before moving on to science, technology, and the future. He is curious about pretty much everything, and sad he'll only ever know a tiny fraction of it all.