How Does Social Interaction Change Our Brains? Hyperscans Can Show Us

Brain scans, like social distancing, are inherently very lonely.

Regardless of the equipment, brain scans often rely on a single person performing a single task, often completely still, outside of their normal environment. It’s powerful, sure. Brain mapping projects have uncovered not only hidden anatomical highways between brain regions, but also how those regions transiently organize into neural networks to support sensations, memories, thoughts, decisions, life.

Yet as this year has brutally underscored, our brains don’t function in isolation. Social distancing, isolation, and loneliness are difficult to tolerate because our emotions, well-being, and high-level thought processes rely on other people. Even extreme introverts spend their lives forging and learning from social bonds. Our brains’ electrical waves literally synchronize when we listen to a story or watch a movie together. In a way, our brains are subconsciously fine-tuned to the people around us. In autism, depression, schizophrenia, and other conditions, these mechanisms break down.

It’s high time to map our social connectomes, argues a new paper in Neuron.

The past decade of neuroscience wizardry has mostly focused on mapping a single person’s connectome. Connectomes are often considered transient “cognitive maps” that underlie how we think and act. Yet what’s been critically missing is the impact of other people on our brains. With the rise of social robots like Pepper and our increasing interactions with autonomous cars, it’s even more critical to understand how our neurons engage during social interactions, be they with humans or machines.

“Our actions and decision-making in everyday life are heavily influenced by others,” but we know very little about how brains couple, said author Dr. Antonia F. de C. Hamilton at University College London. The good news? A “game-changing” new technique has the power to change that. That is, if we’re very careful about how to interpret what we find.

Collective Brainmaps

Meet hyperscanning, a marriage between neurotech, mind-reading, and some serious math.

For about two decades, a growing chorus of neuroscientists has argued that we should use non-invasive brain-reading technologies on multiple people simultaneously. Scientists can then watch all their brains in action as the volunteers either collaborate or compete on a single task. Add in a deluge of complicated math, and it’s possible to statistically tease apart what happens across multiple brains as people interact in real time. It’s conceptually similar to expanding a single person’s connectome into a group brainmap, though at a far lower resolution.
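At its core, the “complicated math” boils down to measuring statistical dependence between two people’s brain signals over time. Here is a minimal sketch with simulated signals and plain Pearson correlation; real hyperscanning studies favor richer measures such as wavelet coherence, and every number below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# 60 seconds of simulated activity, sampled at 10 Hz.
t = np.linspace(0, 60, 600)
shared_task = np.sin(2 * np.pi * 0.1 * t)  # slow activity driven by a shared task

# Two collaborators both track the task; a third person does something unrelated.
brain_a = shared_task + 0.5 * rng.standard_normal(t.size)
brain_b = shared_task + 0.5 * rng.standard_normal(t.size)
brain_c = 0.5 * rng.standard_normal(t.size)

def interbrain_sync(x, y):
    """Pearson correlation between two brain time series."""
    return float(np.corrcoef(x, y)[0, 1])

print(interbrain_sync(brain_a, brain_b))  # collaborators: well above zero
print(interbrain_sync(brain_a, brain_c))  # unrelated pair: near zero
```

The collaborators’ signals correlate strongly because a common driver shapes both, while the unrelated pair hovers near zero, which is exactly the ambiguity Hamilton raises later: synchrony alone can’t say *why* two brains co-vary.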

It’s a technique with a slight whiff of voodoo. Humans are, after all, independent creatures, and our interactions are rather unpredictable. But it’s been low-key successful for tackling social neuroscience, such as how our neurons encode social engagements. In one seminal study, a team from Stanford used lasers to measure brain activity from two people playing a cooperative game side by side.

The scanner, NIRS (near-infrared spectroscopy), uses light to detect how blood oxygen changes in a given part of the brain. Because neurons use up oxygen as they fire, blood oxygen levels provide a proxy for how much activity is happening in that part of the brain. The beauty of NIRS is that the setup is much simpler than a traditional brain scan: volunteers can literally wear the entire scanning rig like a swimming cap and walk around, or talk and work with other people. It was a game-changer for the study of human social interaction, said Hamilton, because it showed it’s possible to scan multiple people’s brains at once.
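The light-to-oxygen conversion rests on the modified Beer-Lambert law: a change in optical density at each wavelength is a weighted sum of oxy- and deoxyhemoglobin concentration changes. Here is a toy version; the extinction coefficients, path length, and pathlength factor are made-up illustrative numbers, not calibrated values from any real instrument.

```python
import numpy as np

# Rows: two measurement wavelengths; columns: [HbO, HbR].
# These extinction coefficients are illustrative, not real tabulated values.
extinction = np.array([[0.6, 1.5],
                       [1.2, 0.8]])
path_length = 3.0  # source-detector separation in cm (illustrative)
dpf = 6.0          # differential pathlength factor (illustrative)

def hemoglobin_change(delta_od):
    """Invert delta_OD = (extinction @ delta_conc) * path_length * dpf."""
    return np.linalg.solve(extinction * path_length * dpf, delta_od)

# A measured change in optical density at the two wavelengths...
delta_od = np.array([0.01, 0.02])
d_hbo, d_hbr = hemoglobin_change(delta_od)
print(d_hbo, d_hbr)  # hemoglobin concentration changes, arbitrary units
```

Measuring at two wavelengths is what lets the system separate oxygenated from deoxygenated hemoglobin: two equations, two unknowns.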

The study found that people who successfully partnered on a task tend to have the frontal parts of their brains singing in harmony. Since then, multiple studies have observed the same coherence in our neurons when we connect with other people: when making eye contact, holding conversations, or reaching collective decisions.

The problem, says Hamilton, is how much we should trust these brain scanning data alone.

Brain-to-Brain Coupling

NIRS is one way to easily scan multiple brains. But there are other options: EEG (electroencephalography), which measures the brain’s electrical activity through the scalp, or its sibling MEG (magnetoencephalography), which picks up the tiny magnetic fields that same activity produces. These setups, once limited to select labs, have gotten progressively more sensitive and accurate and are now entering the consumer sphere. Add in sci-fi setups with multiple sensors, and we seem to be at a turning point for being able to “mind-read” multiple brains at once, decoding how social interactions make our brains tick, or not.

Not so fast, said Hamilton. Claims that one person’s brain activity influences another’s feel like “telepathy,” disconnected from how we understand the brain, or even from anything currently plausible neurobiologically, she said.

Yes, our brains do sync up when we watch a movie together. But that syncing could reflect nothing more than a “common cognitive processing” mechanism (basically, the brain’s basic OS) that launches in each viewer and results in synced-up brains, rather than a specifically social mechanism per se.

“To move beyond the hype, we may need to do bigger, better experiments and interpret them within a stronger theoretical framework,” she said.

One way to get clearer answers is to add in other bodily inputs, such as what we see, feel, or sense. Think about your last interaction with a loved one, a friend, or a coworker: your body reacts as well. This sort of “embodied mutual prediction” is critical, says Hamilton, so it makes sense to gather complementary data, neural signals alongside heart rate or other biomarkers, to parse social interactions. For example, to gauge how well a group works together, scientists could record a team’s neural activity and bodily responses over a period of time, then use statistical analyses and modeling to see how well their brain activity coheres.

“It will be possible to understand how the coordination of social brains is embodied in the interaction of social bodies,” said Hamilton.

Loyalty Brain Scans?

Of course, this is all very preliminary. For now, both social neuroscience hyperscanning and social prediction algorithms are in their infancy. We don’t yet have a mathematical model, or an algorithm, that can approximate a person’s response to another person at the level of neurons or neural networks. To understand someone else, you need both to capture your own thoughts and to gauge the intentions of the other person, whether you’re cooking a meal together, playing a duet, or taking turns in conversation.

Neuroscientists are now working on a “mutual prediction theory” to map how our neurons may support these processes. The main idea is that everyone has two predictive engines: one that gauges and controls our own behavior, and one that predicts and maps the behavior of the people we’re interacting with. Behind both are powerful biological algorithms that can model both your own and your partner’s behavior. The key is to decode these algorithms in the brain while it’s engaged with another brain.

It might sound very vague, but it’s the type of study that could give more insight into how much people are willing to sacrifice during a lockdown, or how we respond to people with different mask-wearing values. Using an algorithm called cross-brain general linear model (xGLM), for example, scientists may be able to understand how people predict others’ responses, and how they respond in turn.
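The article doesn’t spell out the xGLM’s design, but the core move of any cross-brain general linear model, using one person’s signal as a regressor for their partner’s later signal, can be sketched as follows. The fixed lag, single regressor, and simulated signals are simplifying assumptions, not the published specification.

```python
import numpy as np

rng = np.random.default_rng(2)
n, lag = 500, 3

# The partner's (simulated) brain signal; person A "responds" 3 samples later.
partner = rng.standard_normal(n)
response = 0.8 * np.roll(partner, lag) + 0.3 * rng.standard_normal(n)

def cross_brain_beta(y, x, lag):
    """Fit y[t] = beta * x[t - lag] + intercept; return beta."""
    design = np.column_stack([x[:-lag], np.ones(x.size - lag)])
    beta, *_ = np.linalg.lstsq(design, y[lag:], rcond=None)
    return float(beta[0])

print(cross_brain_beta(response, partner, lag))  # close to the true coupling of 0.8
print(cross_brain_beta(response, partner, 10))   # wrong lag: close to zero
```

The fitted coefficient quantifies how strongly one brain’s past activity predicts the other brain’s present activity, and testing it at the wrong lag shows the dependence is directional in time rather than generic co-activation.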

Overall, argues Hamilton, any social interaction study should include both brain scans and measures of how our bodies respond. Hyperscanning may be the hottest kid on the social neuroscience block. But adding in our bodies’ reactions captures how we feel and react during social engagements, subconscious brain processes be damned. Combining the two drives us toward a powerful model of social interaction, allowing us to “move social neuroscience research into the real world.” How children respond to virtual learning. How therapists lead patients to happier and more satisfying lives. How isolation grinds you down. How Zoom calls change a corporate team’s work rhythm.

And if there was ever a year to better understand what our brains get from face-to-face interactions, this is it.

Image Credit: adike /

Shelly Fan
Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the co-founder of Vantastic Media, a media venture that explores science stories through text and video, and runs an award-winning blog. Her first book, "Will AI Replace Us?" (Thames & Hudson), was published in 2019.