A new study measures how our tendency to use Internet search engines is changing the way our brains use information.

The debate about whether or not computers are making us dumber, largely speculative to this point, has recently gotten a healthy dose of scientific data. Last week scientists at Columbia University published a report showing that our dependency on Google for information has changed how we think.

The study involved a series of experiments that tested how much people rely on readily available information in lieu of committing it to memory. In the first experiment, students answered true-or-false questions and were then shown a series of words printed in different colors, with the task of naming each word’s color. Reaction times on this task are typically slower when a word relates to what the person is already thinking about: if the question involves an ostrich, naming the color of “bird” takes longer than naming the color of “ladder,” for example. Interestingly, the researchers found that words related to the Internet, such as “Google” or “Yahoo,” produced slower reaction times, indicating that the students were thinking about looking the answers up online. When presented with any factual question, to Google is now a human instinct.

The next set of experiments showed that the students’ memory suffered if they were told that whatever they learned would be saved on the computer, as if they resisted “saving” the data to their own memory when it was already being saved to the computer’s. But then something interesting happened. Even though they couldn’t recall the facts themselves, they recalled very well the specific computer folders into which the facts had been saved. Betsy Sparrow, lead author of the study, summed it up by saying that the students “were better at remembering where information was stored than the information itself.”

Roddy Roediger, a psychologist at Washington University in St. Louis, said in a commentary on the study, “…there is no doubt that our strategies are shifting in learning. Why remember something if I know I can look it up again? In some sense, with Google and other search engines, we can offload some of our memory demands onto machines.”

Offload…memory…machines. Sounds like science fiction, except it’s not fiction. It’s real. Today we live in a world where citizens have access to, and are exposed to, a staggering amount of information. Streams of it bombard us from all directions: Twitter, online news sites updated by the minute, televised news tickers scrolling around the clock. And whatever we miss we can catch on our smartphones over lunch. This unprecedented barrage of information presents several challenges: deciding what information is relevant, what information is reliable, and when enough is enough. Some pundits have pointed to the Internet as the beginning of the end of quality contemplation. Even before the Internet, novelist Michael Crichton referred to the Information Age as the dis-Information Age. More recently, and more emphatically, tech journalist Nicholas Carr seems to have dedicated his life to spreading the gospel of the evil Internet. His latest book, “The Shallows: What the Internet Is Doing to Our Brains,” was a Pulitzer Prize finalist.

If you don’t have the time to read his book in between Tweets, Carr’s thoughts are summarized in a nice, condensed, easily skimmable online essay from a 2008 issue of The Atlantic. He begins the essay, entitled “Is Google Making Us Stupid?”, by bemoaning the fact that he and his friends can’t read anymore. Well, not like they used to. Their thoughts are all scattered and, dammit, one of his friends has just given up reading books altogether. He says that the Internet made them do it. They’ve simply skimmed too many headlines, blogs and articles, and clicked on too many links that unscrupulously led to more headlines, blogs and articles.

Seriously?

Carr should hang out at my house. I’m online all the time, reading all sorts of stuff. Yes, I skim all the time, at least until I find an article that I’m interested in. Carr points to a study that looked at how people conducted research online. Predictably, the users skimmed, “bounced” to other sites, and rarely revisited a site they’d already looked at. Reminds me of when I was doing research. I’d type in the keywords “cortex” and “motor control” and get a list of titles that I would skim. If something caught my saccade-crazy eye I’d open the abstract and skim that. If that was interesting I’d skim the article. If I thought I missed something, I would read. It. More. Slowly. This past weekend I read the first of Isaac Asimov’s Foundation books. Was awesome. Unlike Carr, I wasn’t afflicted with a sudden skimming-induced inability to concentrate, left to wonder what was on my Twitter feed instead of what Hari Seldon had up his sleeve.


Other people are not so quick to label the Google generation the new MTV generation (remember how watching disjointed music videos was supposed to turn our kids’ brains to mush?). Last year Harvard cognitive scientist Steven Pinker wrote an article in the New York Times that addressed the rising panic over Internet-induced chronic ADD. His basic message was relax, we’ve heard this all before: “the printing press, newspapers, paperbacks and television were all once denounced as threats to their consumers’ brainpower and moral fiber.” A crucial point of discord between Pinker’s and Carr’s theses is the malleability of the brain. While Carr writes that “The human brain is almost infinitely malleable,” Pinker writes, almost in response: “Critics of new media sometimes use science itself to press their case, citing research that shows how ‘experience can change the brain.’ But cognitive neuroscientists roll their eyes at such talk. Yes, every time we learn a fact or skill the wiring of the brain changes; it’s not as if the information is stored in the pancreas. But the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience.” Suffice it to say that “experience does not revamp the basic information-processing capacities of the brain.”

It might surprise you to know that the average IQ of the human race has actually been increasing over time, a phenomenon known as the Flynn effect. We’re getting smarter despite the less-than-learned distractions of MTV, video games, Twitter, Facebook, and Googling everything from where to eat to who was the 1979 American League batting champion (Fred Lynn). Pinker does acknowledge, however, that with Twitter, Facebook, and the world through Google at our fingertips, an impulse is there that wasn’t before. Like the students who instinctively wanted to go online when asked about an ostrich, we take all our queries to the great all-knowing oracle named Google.

And that’s a bad thing?

I say certainly not. Yes, our cognitive strategies are a-changin’, but that’s a good thing. It’s clear how technology changes our physical lives; less clear how it changes our mental lives. When was the last time you memorized a phone number? The more cognitive work we can offload to computers, the better. It’s what neuroscientist Rodolfo Llinás meant when he spoke of human consciousness spreading to computers. Asimov predicted that computers would one day relieve us of all the mundane, phone number-like facts and leave our minds free to exercise their highest function: creativity.

It’ll be interesting to see what other technology-induced cognitive quirks are revealed through studies like the Columbia one. I, for one, will be watching with anticipation, not apprehension. Come to think of it, maybe something new has already come out. Gotta go check Twitter!

[image credits: chaobin.me and Prism Decision Systems]
[image 1: Homer; image 2: Kurzweil]

Peter Murray was born in Boston in 1973. He earned a PhD in neuroscience at the University of Maryland, Baltimore studying gene expression in the neocortex. Following his dissertation work he spent three years as a post-doctoral fellow at the same university studying brain mechanisms of pain and motor control. He completed a collection of short stories in 2010 and has been writing for Singularity Hub since March 2011.