Want to Decode the Human Brain? There’s a New System for That, and It’s Pretty Wild

Even for high-tech California, the man strolling around UCLA was a curious sight.

His motion capture suit, sensor-embedded gloves, and virtual reality eyewear were already enough to turn heads. But what stopped people in their tracks was the bizarre headgear: a swimming cap-like device embedded with circular electrode connectors, strapped tightly to his head. Several springy wires sprouted from the headgear—picture a portable hard drive hooked up to a police siren enclosure—and disappeared into a backpack. The half-cyborg look teetered between sci-fi futurism and hardware Mad Libs.

Meet Mo-DBRS, a setup that could fundamentally change how we decode the human brain.

The entire platform is a technological chimera that synchronizes brain recordings, biomarkers, motion capture, eye tracking, and AR/VR visuals. Most of the processing components are stuffed into a backpack, so that the wearer isn’t tethered to a “landline” computer. Instead, they can freely move around and explore—either in the real world or in VR—something not usually possible with brain scanning technology like MRI.

Movement may seem like a trivial addition to brain scanning, but it’s a game changer. Many of our treasured neural capabilities—memory, decision-making—are honed as we explore the world around us. Mo-DBRS provides a window into those brain processes in a natural setting, one where the person isn’t told to hold still while a giant magnet clicks and clangs around their head. Despite its non-conventional look, Mo-DBRS opens the door to analyzing brain signals in humans in environments close to the real world, while also having the ability to alter those brain signals wirelessly with a few taps on a tablet.

All custom software powering Mo-DBRS is open source, so neuroscientists can immediately play with and contribute to the platform. However, because the setup relies on volunteers with electrodes implanted in the brain, it has so far only been tested in a small number of people with epilepsy who already have neural implants to help diagnose and prevent their seizures.

The work, published in Neuron last week, drew a unanimous “Wow!” from the neuroscience community on Twitter. “Fantastic work,” wrote Dr. Michael Okun, Medical Director at the Parkinson’s Foundation. “Very impressive setup,” tweeted Dr. Klaus Gramann, a researcher in mobile brain-body imaging at Technische Universität Berlin.

“Dreamt since grad school of 1-day being able to record from deep brain regions (like hippocampus) in humans during spatial navigation & learning/memory in naturalistic experiences,” tweeted lead author Dr. Nanthia Suthana at UCLA. “My lab team has made that dream come true!”

So What?

Mo-DBRS isn’t as sleek as Neuralink’s brain implant. It’s also restricted to people with electrodes already in their brains. So what’s the big deal?

Everything. Those sci-fi dreams of restoring memory, reversing paralysis, battling depression, erasing fear, and solving consciousness? They all depend on capturing and understanding the human brain’s neural code—that is, how do electrical firings turn into memories, emotions, and behavior? Since the beginning of modern neuroscience, this has been done using electrodes implanted into mice or other experimental animals.

Take memory, a brain capability that lays the foundation of who you are.

Until now, memory research has mostly relied on rodents scurrying around mazes looking for tasty treats. Rough translation? Those experiments simulate us finding our cars in a parking lot, and identify the brain waves behind that spatial memory. By recording signals from the mouse hippocampus, a seahorse-shaped structure buried deep inside the brain, scientists have built a framework for how our memory works—how a single experience is tied to a time and space, and how a precious memory is linked to our emotions and reinforced.

The obvious problem? Humans aren’t mice.

For a brain function as intimate as memory, it’s incredibly difficult to extrapolate from rodent brain recordings. Traditional brain imaging methods for humans, such as functional MRI (fMRI) or magnetoencephalography (MEG), can paint a stationary image of the brain as it remembers a place—often shown on a video screen—but the setup is far from “normal” in that the person is completely immobile.

Meet Mo-DBRS

Mo-DBRS goes after a whole “wish list” of brain decoding needs: reading from and writing to the human brain in real time, wirelessly, while the person walks around, and combining neural recordings with heart rate, breathing, and other biomarker sensors.

The inspiration came from patients with epilepsy and other neurological disorders who already have electrodes implanted into their brains and go about their normal lives. “There are over 2,000 individuals with chronic sensing and stimulation devices…with the number expected to increase as additional invasive treatments are proven successful,” the team wrote. These devices are implanted into deep regions of the brain—those controlling memory, emotion, and movement. With careful planning to avoid interfering with their treatment, the authors reasoned, it’s possible to tap into these neural recordings to directly decode the human brain’s activity in a real-life setting, rather than relying on rodent studies or MRI-style immobile brain imaging.

The heart of Mo-DBRS’s brain recording and stimulation setup is a medical device called NeuroPace, which is implanted inside the skull to help epilepsy patients control their seizures. Think of NeuroPace as a pacemaker for the brain. It can both “read” the brain’s electrical signals and “write” into the brain, using short electrical pulses to head off the electrical storm of a seizure before it occurs. However, much like radio stations, many brain processes operate at particular frequencies. By skirting the frequencies used to control seizures, the team was able to listen in on and manipulate other brain processes, such as the electrical signals that form as people explore new environments. Data from the implanted device is wirelessly transferred to a custom-built “wand” (the weird hard drive-police siren-looking thing) strapped to the outside of the head.
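The idea of separating brain activity by frequency band can be sketched in a few lines of Python. This is a toy illustration, not the actual NeuroPace or Mo-DBRS signal processing: a naive discrete Fourier transform estimates how much power a recording carries in a chosen frequency band, which is how an analysis could focus on one band while steering clear of the frequencies reserved for seizure control.

```python
import math

def band_power(samples, rate_hz, lo_hz, hi_hz):
    """Estimate signal power in the band [lo_hz, hi_hz] with a
    naive discrete Fourier transform. Toy sketch only: real
    systems use optimized FFTs and windowing."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * rate_hz / n  # frequency of this DFT bin
        if lo_hz <= freq <= hi_hz:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power
```

Fed a pure 8 Hz sine wave sampled at 64 Hz, nearly all of the power lands in a 6–10 Hz band and essentially none in a 20–30 Hz band, which is the sense in which different "stations" of brain activity can be tuned in or ignored.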

Using a Raspberry Pi computer and a tablet—both stored inside a backpack—that connect to the wand, the team was able to wirelessly program the neural implant to deliver electrical pulses into the brain. At the same time, the team also added scalp EEG, which measures the brain’s electrical waves through electrodes embedded in a cap that’s worn like a swim cap. This technological tag-team provides an explosion of neural data, from both inside and outside the brain.

Moving beyond the brain, the team further equipped volunteers with a chest strap that senses heart rate, breathing, and sweating. These biomarkers capture the emotional responses around a specific memory, which could help explain why emotionally charged memories tend to stick around. To synchronize all the data streams, the team injected an artificial “marking signal”—a distinctive electrical pattern—into the brain recordings to denote the start of an experiment.
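The synchronization trick can be sketched in plain Python. This is a hypothetical illustration, not the published Mo-DBRS code: each stream is searched for the injected marker pattern with a sliding dot product, and the sample where the marker lands becomes time zero, so recordings made on different clocks and sampling rates can be lined up against each other.

```python
def find_sync_marker(signal, template):
    """Locate the injected marker pattern via a sliding dot
    product (a simple cross-correlation). Returns the index
    where the template best matches the signal."""
    n, m = len(signal), len(template)
    best_idx, best_score = 0, float("-inf")
    for k in range(n - m + 1):
        score = sum(signal[k + i] * template[i] for i in range(m))
        if score > best_score:
            best_idx, best_score = k, score
    return best_idx

def align_timestamps(num_samples, marker_idx, rate_hz):
    """Convert a stream's sample indices to seconds, with the
    marker at t = 0, so streams from different devices share
    one timeline."""
    return [(i - marker_idx) / rate_hz for i in range(num_samples)]
```

Running the same alignment on the implant data, scalp EEG, and chest-strap biomarkers would put every recording on one shared clock, which is the point of injecting a marker all the devices can see.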

The whole system weighs about nine pounds, with most of the processing components tucked inside a backpack. A lighter version, called “Mo-DBRS Lite,” weighs about a pound and is also ready to go, the team explained, but comes with the caveat of less precise synchronization and a higher delay when reading from the brain.

As a proof of concept, Mo-DBRS was tested on seven volunteers already implanted with the NeuroPace system. One person easily walked around a room to look at a wall-mounted sign while having his eyes, brain activity, and other biomarkers tracked without a hitch. Add in a VR component, and it’s entirely possible to recreate the classic memory experiment of navigating a maze—only this time, rather than rodents, scientists are recording directly from the human brain, with the potential to disrupt those signals and play with memory.

Although Mo-DBRS is built around NeuroPace, the platform can be integrated with other existing neural implants, the team said. The entire codebase is open source so researchers can collaborate and expand on it.

“There’s a lot of potential here with the platform to start asking questions that we haven’t been able to do before in neuroscience, because we’ve been limited by the immobility of our participants,” said Suthana. “We can start to explore novel therapies that involve neurostimulation and [understand] the neural mechanisms that are involved in these types of treatments.”

Image Credit: christitzeimaging.com / Shutterstock.com

Shelly Fan
Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the co-founder of Vantastic Media, a media venture that explores science stories through text and video, and runs the award-winning blog NeuroFantastic.com. Her first book, "Will AI Replace Us?" (Thames & Hudson) was published in 2019.