Rats Engineered to See Infrared Light, Use It to Seek Out Water

The brain is a great information processor, but one that doesn’t care about where information comes from.

Sight, scent, taste, sound, touch — all of our precious senses, once communicated to the brain, are transformed into simple electrical pulses. Although we consciously perceive the world through light rays and sound waves, the computing that supports those experiences is all one tone — electrical.

Simply put, all of our senses are the same to our brain.

It’s a strange notion that’s led to some even stranger “sensory substitution” experiments.

In 1969, the late neuroplasticity pioneer Dr. Paul Bach-y-Rita designed a vision replacement setup that looked straight out of the mind of 1950s-era sci-fi master Isaac Asimov.

Picture this: rows and rows of tiny vibrating needles, 400 in total, were mounted on the back of a menacing-looking dental chair. Blind subjects sat in the chair, exposing the sensitive skin on their backs to the vibration matrix.

Mounted close to the arm of the chair was an old-school video camera, which captured black-and-white images of objects placed in front of the chair. The image from the camera was converted into a 400-pixel “image” (a kind of pressure map) using the vibrating needles. Each camera pixel corresponded to a needle in the vibration matrix — black “pixels” produced a strong jab from a corresponding needle, whereas white pixels produced only a gentle touch.
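
To make that mapping concrete, here’s a minimal sketch in Python of the camera-to-needle translation, assuming a 20-by-20 needle grid and a simple block-averaging downsample. Everything beyond the 400-needle count is illustrative; the 1969 rig was, of course, built from analog hardware, not software.

```python
import numpy as np

GRID = 20  # 20 x 20 = 400 "pixels", matching the 400-needle matrix

def frame_to_needle_intensities(frame: np.ndarray) -> np.ndarray:
    """Map a grayscale camera frame to per-needle vibration strengths.

    frame: 2D array of pixel brightness values in [0, 255].
    Returns a GRID x GRID array in [0.0, 1.0], where 1.0 means the
    strongest jab (a black pixel) and 0.0 the gentlest touch (white).
    """
    h, w = frame.shape
    bh, bw = h // GRID, w // GRID
    # Average each block of camera pixels into one coarse "needle pixel".
    blocks = frame[: bh * GRID, : bw * GRID].reshape(GRID, bh, GRID, bw)
    coarse = blocks.mean(axis=(1, 3))
    # Invert brightness: dark areas drive strong vibration, light areas weak.
    return 1.0 - coarse / 255.0

# Example: a dark square on a white background shows up as a block of 1.0s.
frame = np.full((200, 200), 255.0)
frame[60:140, 60:140] = 0.0
print(frame_to_needle_intensities(frame).round(1))
```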

It was a big, clunky and slow setup — but it worked.

After training, blind subjects not only learned to discriminate between squiggles, shapes and faces, but could also analyze complex visual scenes — involving more than three people or partially concealed objects — with just their skin.

But here’s the real kicker: the vibration signals weren’t processed in the patients’ sensory cortex, the brain’s usual destination for touch; they were processed in the visual cortex.

Somehow, the patients’ defunct visual processing centers adopted the tactile information as their own. The end result? The patients “saw” with their skin.

Since then, sensory substitution has allowed the blind to see with music and read with sound, and has restored balance to motor-impaired patients by delivering the relevant information through their tongues.

Yet all these experiments were done in patients with one or more defective senses. This led Duke neuroengineers Dr. Eric Thomson and Dr. Miguel Nicolelis to ask: what if we did this to a healthy brain? Could we “program in” additional senses?

What the heck, thought Thomson, let’s give rats infrared vision.

Let there be (infrared) light

Thomson began his experiment by designing a small two-module implant only a few millimeters wide. The implant sent the output of a head-mounted infrared detector to an array of electrical microstimulators fitted onto a rat’s sensory cortex (specifically, the parts that respond to touch signals coming in from the whiskers).

He then trained water-deprived rats to discriminate among three ports in a circular arena. The ports lit up with visible light in random order; all the rats had to do was walk to the lit port to collect their water reward.

Once the rats learned the rules of the game, Thomson switched over to infrared.

Rat with head-mounted infrared detector. Image Credit: Eric Thomson/Duke University.

Different intensities of infrared light, captured by the detector mounted on top of each rat’s head, were converted into different electrical stimulation patterns. The patterns were then sent to the microstimulators, which delivered the corresponding current pulses to the sensory cortex in real time.

“We wanted the animals to process graded infrared intensities, not just binary on-or-off,” said Thomson. “After all, we don’t experience visible light as all-or-none.”
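
What might graded coding look like in practice? The study’s exact stimulation parameters aren’t spelled out here, but a hedged sketch of one plausible scheme, mapping normalized IR intensity onto stimulation pulse rate rather than a binary trigger, could look like this (the threshold, linear ramp, and maximum rate are all assumed values):

```python
def ir_to_pulse_rate(ir_intensity: float,
                     threshold: float = 0.05,
                     max_hz: float = 400.0) -> float:
    """Convert a normalized IR reading in [0, 1] to a stimulation
    pulse rate in Hz. The threshold, linear ramp, and maximum rate
    are illustrative assumptions, not the study's actual parameters.
    """
    if ir_intensity < threshold:
        return 0.0  # below the detection floor: no stimulation at all
    # Graded coding: stronger IR yields a proportionally higher pulse
    # rate, rather than a single binary on/off signal.
    scale = (ir_intensity - threshold) / (1.0 - threshold)
    return scale * max_hz

for reading in (0.02, 0.3, 0.9):
    print(f"{reading:.2f} -> {ir_to_pulse_rate(reading):.1f} Hz")
```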

At first, the rats seemed confused — in response to stimulation, instead of going to the infrared source, they sat and groomed their whiskers as if being touched by an external force (which in a sense they were, since their sensory cortex was being zapped).

After roughly a month of training, however, all six animals adapted to their infrared headgear, learning to forage with infrared.

“We could see that they were sweeping their heads side-to-side to better detect where the infrared light was coming from,” said Thomson. With this strategy, the rats correctly picked out the water-containing port over 70% of the time.

Additional tests confirmed the rats could still detect whisker “touch information” just fine — the new infrared “sense” didn’t boot out an existing capability.

“We have implemented, as far as we can tell, the first cortical neuroprosthesis capable of expanding a species’ perceptual repertoire to include the near infrared electromagnetic spectrum,” wrote Thomson in a 2013 report of the study published in Nature Communications.

Lightning-fast sensory integration

As cool as that study was, Thomson wasn’t satisfied.

For one, the rats had only a single infrared detector, which severely limited depth perception. For another, the rats were technically “feeling,” not “seeing,” infrared, since their sensory cortices were doing all the hard work.

In a new series of experiments, reported at the 2015 Society for Neuroscience annual meeting in Chicago, Thomson inserted three additional electrodes into the rats’ brains to give them 360-degree panoramic infrared perception.
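
One way to picture how multiple channels yield panoramic perception: imagine each detector facing a different direction, with each driving its own cortical stimulation site, so the pattern of activity across sites encodes the bearing of an IR source. The sketch below is purely illustrative; the detector headings, field of view, and cosine falloff are assumptions, not the study’s published configuration.

```python
import math

# Hypothetical layout: four detectors facing front, right, back, left,
# each wired to its own cortical stimulation channel.
DETECTOR_HEADINGS = [0.0, 90.0, 180.0, 270.0]  # degrees
FIELD_OF_VIEW = 120.0  # assumed angular sensitivity per detector

def channel_activations(source_bearing: float) -> list[float]:
    """Per-channel stimulation strength for an IR source at the given
    bearing (degrees, relative to the rat's head direction)."""
    activations = []
    for heading in DETECTOR_HEADINGS:
        # Smallest angular distance between detector axis and source.
        offset = abs((source_bearing - heading + 180.0) % 360.0 - 180.0)
        # Simple cosine falloff inside the field of view, zero outside.
        if offset <= FIELD_OF_VIEW / 2:
            activations.append(math.cos(math.radians(offset)))
        else:
            activations.append(0.0)
    return activations

# A source off the right shoulder lights up the "right" channel most.
print([round(a, 2) for a in channel_activations(70.0)])
```

The point the sketch illustrates is that no single channel has to encode direction by itself; the relative activity across channels does, which may be part of why the rats integrated the richer signal so much faster.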

The tweak boosted how fast the animals adopted infrared almost 10-fold: on the same water-seeking task, they learned in just four days, compared with 40 days with a single implant.

“Frankly, this was a surprise,” said Thomson to Science News. “I thought it would be really confusing for [the rats] to have so much stimulation all over their brain, rather than [at] one location.”

But the biggest “whoa” moment came when he re-implanted electrodes into the rats’ visual cortex: this time, it took only a single day for the animals to learn the water task.

Why would redirecting infrared traffic to the visual brain regions speed up learning? Thomson isn’t quite sure, but he thinks it has to do with the nature of infrared light.

After all, our visual cortex is optimized to process visible light, which is very close to infrared in terms of wavelength. Maybe the visual cortex is “primed” to process infrared in a way that the sensory cortex isn’t.

“Without digging deeper and looking at changes in plasticity at different levels of the visual system, however, we can’t tell for sure,” said Thomson. What we do know is that the visual cortex can do both jobs, processing visible light and infrared, simultaneously.

For now, this kind of sensory augmentation is limited to animals, although biohackers are hard at work extending human vision into the near infrared.

Thomson’s study suggests that it’s possible: if we can get “infrared eye” hardware working, our brains will likely adapt quickly.

“Frankly, I’m still amazed,” Thomson said. “The brain is always hungry for new sources of information, but it’s incredibly auspicious for the field of neuroprosthetics and augmentation that it can absorb this completely foreign type so quickly.”

“Our work suggests that sensory cortical prostheses, in addition to restoring normal neurological functions, may serve to expand natural perceptual capabilities in mammals,” he said.

“And that’s why I’m excited.”

Image Credit: Shutterstock.com; Eric Thomson/Duke University

Shelly Fan
https://neurofantastic.com/
Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the co-founder of Vantastic Media, a media venture that explores science stories through text and video, and runs the award-winning blog NeuroFantastic.com. Her first book, "Will AI Replace Us?" (Thames & Hudson) was published in 2019.