This Is Where Empathy Lives in the Brain, and How It Works

Mind reading comes easily to most of us.

For all our divisions, humans are uncannily efficient at simulating another person’s thoughts and beliefs. It’s how you can “walk a mile in someone else’s shoes,” know “where they’re coming from,” and in turn, generate empathy or predict how your actions impact others. Most of the time, we can even do this when we fundamentally disagree with the other person’s point of view.

This mysterious ability to hop into someone else’s head, even just to recognize that they’re a conscious being with a mind of their own, is dubbed “theory of mind.” It’s simulation at its very best, allowing us to connect and interact with others not just based on our own thoughts and actions, but also on our understanding of theirs. It’s what lets you guess why your friend is upset on her birthday. It’s behind strategy games like chess and entire disciplines such as game theory. It’s what makes human society flourish or fail.

The problem? No one really knows how theory of mind works in our heads—but that’s set to change.

This week, in a study with over half a dozen people, a team from Harvard Medical School and MIT recorded directly from single neurons at the front of participants’ brains. For the first time, the scientists identified a special group of cells that lets us acknowledge and predict someone else’s hidden beliefs. Even crazier, these neurons faithfully encoded demonstrably false ideas that others may hold, and beliefs that the person being studied doesn’t necessarily agree with.

In other words, each of us has a smattering of brain cells dedicated to modeling another mind inside our own heads.

“Until now, it wasn’t clear whether or how neurons were able to perform these social cognitive computations,” said study author Mohsen Jamali.

The results propelled a centuries-old debate on the nature of self and other into a new, scientifically grounded era. But to senior author Dr. Ziv Williams, the work also builds a framework to better capture the intricacies of how we model minds, and when or why that modeling fails. Autism, for example, often involves a breakdown in the ability to gauge social cues. People with brain injuries due to trauma can also lose that predictive superpower. And outside our own species, a model of how we model each other’s minds could become a powerful tool to bolster AI, giving machines an artificial theory of mind and a lot more common sense when dealing with people.

Mirror, Mirror in the Brain

Debates over the theory of mind have roots going back to 17th-century philosophy. But modern excitement, especially in neuroscience, was sparked in the early 1990s, when neuroscientists captured the inner electrical dialogue of a very special type of neuron.

Recording from the motor regions of macaque monkeys’ brains, they found a bizarre population that fired not only when the monkey waved its arm around, say, to grab an apple or ring a bell, but also when it watched another monkey perform the same action. Even weirder, the same neurons sparked with electrical activity when the monkey heard someone else performing the task in another room. Unlike any other known type of brain cell at the time, these “mirror neurons” seemed to encode another being’s actions and goals, rather than those of their own host.

Mirror neurons exploded in popularity over the next few decades. Some believed they were the seat of empathy. Others thought they were central to human social abilities such as speech. One prominent pop-culture neuroscientist even went as far as saying that these cells shaped our civilization.

Yet as social neuroscience developed more sophisticated tools and techniques, people soon realized that mirror neurons weren’t the full explanation for empathy, language, or autism. Rather, using state-of-the-art brain imaging, scientists began homing in on a region at the front of the brain, sitting right behind the forehead, the prefrontal cortex, as the part of the brain that captures another person’s beliefs and thoughts.

Schooled by the overpromises of mirror neurons, however, few were willing to anoint the region as the seat of theory of mind. After all, brain imaging captures the aggregated, averaged activity of thousands of neurons, if not more, at once. The readout is also colored by activity in other brain regions, painting a murky picture.

One way to sharpen it? Record from single neurons.

The Empathetic Brain

The new study blew people away by doing just that. Rather than relying on social but non-human animals, the team went straight to the source: human volunteers with electrodes implanted in their brains. These participants had already undergone brain surgery in preparation for a Parkinson’s disease treatment, and bravely signed on for the study. This let the team record directly from single neurons in human brains, something generally out of reach for most theory of mind studies.

In all, they tapped into over 320 neurons in the frontal regions of the subjects’ brains. As the implanted microelectrodes silently recorded the brain cells’ electrical activity, the team asked the participants to listen to short stories.

Take this scenario: “You and Tom see a jar on the table. After Tom leaves, you move the jar to the cupboard.” The listener knows that the jar is in the cupboard. But Tom doesn’t. Because of theory of mind, we can reason that Tom will still think the jar is on the table.

The team then asked the listeners two seemingly simple questions. The first was “where is the jar?”, an objective question that taps the listener’s own knowledge. The second was more interesting: “where does Tom think the jar is?”, which probes the brain’s simulation of Tom’s mind.
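
To make the logic concrete, here’s a toy sketch in Python of how those two questions pull apart what the listener knows from what Tom believes. The scenario structure and the names in the code are illustrative only; they are not taken from the study.

```python
# Toy sketch of the false-belief scenario: the world changes after Tom
# leaves, so the listener's knowledge and Tom's belief come apart.
# All names here are illustrative, not from the study.

world = {"jar": "table"}          # where the jar actually starts
toms_belief = {"jar": "table"}    # Tom saw the jar on the table

# After Tom leaves, the listener moves the jar. The world updates,
# but Tom's belief does not, because he didn't see the move.
world["jar"] = "cupboard"

def where_is_the_jar():
    """Question 1: the listener's own, objective knowledge."""
    return world["jar"]

def where_does_tom_think_the_jar_is():
    """Question 2: a simulation of Tom's (now false) belief."""
    return toms_belief["jar"]

print(where_is_the_jar())                 # -> "cupboard"
print(where_does_tom_think_the_jar_is())  # -> "table"
```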

Right away, the team found a slew of neurons that, surprisingly, captured the distinction between the listener’s own beliefs and those of others. About 20 percent of recorded neurons reliably fired when the listeners predicted Tom’s belief. An even higher percentage sparked to life when Tom held a true belief, one that happened to match reality. In all, the electrical activity of these neurons predicted, nearly 80 percent of the time, whether the listener had accurately inferred Tom’s mental picture of the jar.
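
At its core, a claim like this rests on a decoding analysis: can a simple model guess something about the attributed belief from how strongly each neuron fires? The sketch below illustrates the general idea with scikit-learn’s logistic regression applied to synthetic spike counts. It is not the study’s actual analysis; the data, the signal strength, and the numbers are made up for illustration.

```python
# A minimal population-decoding sketch: predict whether a trial involved
# Tom holding a true or a false belief from per-neuron firing rates.
# The data here are synthetic; only the general idea mirrors the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 50

# Synthetic firing rates: a subset of "belief" neurons fires a bit more
# on false-belief trials, the rest are noise.
labels = rng.integers(0, 2, n_trials)          # 0 = true belief, 1 = false belief
rates = rng.poisson(5, size=(n_trials, n_neurons)).astype(float)
rates[:, :10] += 2.0 * labels[:, None]         # first 10 neurons carry the signal

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, rates, labels, cv=5).mean()
print(f"Decoding accuracy: {accuracy:.2f}")    # well above the 0.5 chance level
```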

To rephrase: we have neurons in our heads that encode someone else’s idea of reality, rather than what’s actually true or real. That has rather unnerving implications, in that these neurons reflect only the other person’s perspective; your own view, and the truth itself, don’t come into play.

Down the Rabbit Hole

If you’re thinking “oh well, these brain cells just respond to prediction,” the authors have answers here too. It gets weirder.

For one, the cells that encode Tom’s beliefs update along with his perception of reality. When the participants instead heard that “after Tom leaves, you move the jar to the cupboard as he watches you through the window,” the same cells that encoded Tom’s perspective shifted gears, leading to the answer that Tom now knows the jar is in the cupboard. Your brain cells, encoding someone else’s beliefs, update when that person’s beliefs change, not when yours do.

For another, the neurons also captured specific details about Tom’s beliefs. Using stories similar to the jar-and-cupboard scenario, the team found that these mind-reading neurons could encode an item’s identity (a jar versus a table or vegetables), its location, its color, and other characteristics. Compiling all the tests, the team built a model from these neurons’ activity that predicted the content of another person’s belief at nearly six times chance level, regardless of how difficult the reasoning was.

“Each neuron is encoding different bits of information,” said Jamali. “By combining the computations of all the neurons, you get a very detailed representation of the contents of another’s beliefs and an accurate prediction of whether they are true or false.”
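
One way to picture that pooling, purely as an illustration and not the study’s method: treat each group of neurons as a readout of one feature of the attributed belief, then stitch the readouts together into a single description. The class and function names below are hypothetical.

```python
# Illustrative only: pool single-feature "readouts" (stand-ins for what
# different neuron groups encode) into one description of Tom's belief.
from dataclasses import dataclass

@dataclass
class AttributedBelief:
    item: str               # what Tom thinks the object is
    location: str           # where Tom thinks it is
    matches_reality: bool   # whether his belief happens to be true

def combine_readouts(item: str, location: str, true_location: str) -> AttributedBelief:
    """Combine per-feature readouts into a full model of Tom's belief."""
    return AttributedBelief(item=item, location=location,
                            matches_reality=(location == true_location))

belief = combine_readouts("jar", "table", true_location="cupboard")
print(belief)  # AttributedBelief(item='jar', location='table', matches_reality=False)
```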

Are these predictive neurons just another mirror neuron story in the making? Many don’t think so. Dr. Uta Frith, an emeritus professor at the UCL Institute of Cognitive Neuroscience, commented, “Amazing that single cells in… [the prefrontal cortex]… show activity during mentalizing,” recapitulating findings from blunter tools for recording human brain activity, such as MRI. But mostly, the leap is in our methods for probing our own minds, even as those minds encode someone else’s. It’s “amazing that this type of recording can be done at all,” said Frith.

Image Credit: Gerd Altmann from Pixabay

Shelly Fan
Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the co-founder of Vantastic Media, a media venture that explores science stories through text and video, and runs the award-winning blog NeuroFantastic.com. Her first book, "Will AI Replace Us?" (Thames & Hudson) was published in 2019.