The results are a bit scary. Not only have computers changed the way we think, they’ve also discovered what makes humans think—or think we're thinking. At least enough to predict and even influence it.
Here are the four things cognizant people should know about the decade when computers mastered our cognition.
GOOGLE MAKES US STUPID
Most of the news about how our brains have been affected by Internet use has been covered in the optimistic shades of Google. As Dr. Gary Small, a neurobiologist at UCLA's Semel Institute for Neuroscience & Human Behavior, discovered, an MRI of the brain of a person doing Web searches lights up in more regions than that of a person just reading a book. The interpretation that made it to the headlines? Google makes us smarter.
But when I recently challenged Dr. Small on this conclusion, he was quick to disassociate himself from it: "Well, you know, on a brain scan, big is not necessarily better." Quite the contrary, it turns out. The reading brain may be less lit up simply because it is working more efficiently. "I mean, just because their brain is more efficient doesn't mean it's behaving poorly."
In fact, as one of the scientists who has spent the most time observing the effects of Internet use on the brain, Dr. Small does not even sound convinced we are becoming empowered—least of all neurologically. Yes, the neuroplasticity of our brains does cause them to physically change as a result of their interaction with technology. But it's not necessarily for the better. "There's some question about whether Google might be making us stupid. That we're becoming less thoughtful. That there's sort of a staccato quality to our thinking—that we don't slow down and go into issues in depth. Instead, we're just moving along to whatever question we might have at any moment." And of course, as far as marketers are concerned, that which makes us stupid is good for business.
MULTITASKING AND DISTRACTION
Like most early enthusiasts, I always thought the way the Internet encouraged multitasking made users less vulnerable to manipulation, while simultaneously exploiting even more of our brain's capacity than before.
Apparently not. Cliff Nass, director of Stanford University's Communication Between Humans and Interactive Media Lab (known as CHIMe Lab), has been studying the best multitaskers on the face of the earth: college students. "How do they do it? Do their brains work differently?" He, too, was shocked by his own research. "It turns out, multitaskers are terrible at every aspect of multitasking. They're terrible at ignoring irrelevant information. They're terrible at keeping information in their heads nice and neatly organized, and they're terrible at switching from one task to the other. This shocks us."
Nass split his subjects into two groups—those who regularly do a lot of media multitasking, and those who don't. When they took simple tests comparing assortments of shapes, the multitaskers were more easily distracted by random images, and incapable of determining which data was relevant to the task at hand. And just because the multitaskers couldn't ignore irrelevant data didn't mean they were better at storing and organizing information. They scored worse on both sorting and memorizing information.
So what does it mean if we multitaskers are actually fooling ourselves into believing we're competent when we're not? "If multitasking is hurting their ability to do these fundamental tasks," Nass explained matter-of-factly, "life becomes difficult. Some of our studies show they are worse at analytic reasoning. We are mostly shocked. They think they are great at it." We're not just stupid and vulnerable online—we simultaneously think we're invincible. And that attitude, new brain research shows, has massive carryover into real life.
IMPLANTING FALSE MEMORY, TAKING AWAY REAL ONES
At Stanford's Virtual Human Interaction Lab, I visited a researcher named Jeremy Bailenson who has been studying the way virtual experiences are stored in the brain. He does stuff to people in virtual simulations like Second Life, and then observes how it affects the way they act back in the real world. He has discovered that the areas of the brain responsible for memory are really bad at tagging whether a particular incident happened in the real world or a virtual one.
In other words, just as we can wake up from a nightmare and stay angry all day at the person who wronged us in our dream, we tend to remember and act on our virtual experiences as if they had really happened.
On the one hand, this makes for terrific behavioral modification. I watched as Bailenson had a subject sit in a chair and experience a virtual-reality simulation of herself eating food. But as she ate, her avatar slowly got fatter—retraining her brain's understanding of the effects of her habits.
Of course, in theory, any of these techniques could be used for or against our best interests. In one study, Bailenson found that "having a 10 centimeter advantage in height causes you to be three times more likely to beat someone in a negotiation in virtual reality." But that's not the strangest part. Back in the real world, "regardless of your actual height you'll then beat me face to face when we have a negotiation. So, this stunned us. A small exposure in virtual reality carried over to their behavior face to face."
Weirdest of all, Bailenson gave young children a virtual-reality experience of swimming with whales, and then questioned them two weeks later. Fifty percent of them believed they actually went to Sea World and swam with whales.
We're talking implanted memories. Bailenson has discovered the Holy Grail for those seeking a dependable technology for mind control. I asked him if this freaked him out. "I just see it as where we're going."
Not surprisingly, the U.S. military is first in line for these findings, and has its own labs looking at how to apply them to the battlefield as well as to traumatized veterans. Virtual simulations allow post-traumatic stress disorder sufferers to re-experience the events that traumatized them, and then slowly desensitize themselves to their impact through repeated recreations involving not just sight and sound but even smell.
I tried one of these immersive sessions myself at a military-funded lab in Marina Del Rey, California—substituting the memory of a fatal car crash in my 20s for combat in Iraq—and the emotional vividness was chilling.
The military is also looking at how it might apply this technology before the fact, essentially inoculating soldiers' brains against the trauma of war in advance.
NEUROMARKETING
While the effort to exploit technology to train the human brain goes back far further than 2000, it wasn't until this decade that scientists—and the compliance professionals for whom they worked—had any hard data on how our brains were responding to their efforts. That's when the BrightHouse Institute came along and turned the MRI machines that Emory University Hospital had been using on stroke victims onto the lobes of consumer test subjects.
At the behest of early clients such as Coca-Cola, Kmart, and Home Depot, BrightHouse placed people in MRI machines while exposing them to advertisements, packaging, or even political candidates, and then measured the reaction of different parts of the brain in order to gauge the response. Although the science remains relatively crude, MRI monitoring does permit scientists and marketers to observe which parts of our brains are activated when we are exposed to their products and pitches. If it's the same part that lights up when we think about good sex, then it's considered a success.
The more data these folks accumulated, the more automatic our higher cognition began to appear to them. Countless books emerged on the new science of mind, arguing that the human decision-making process occurred in the "blink" of the unconscious, almost reptilian eye. Not only were they figuring out how to control our brains, it didn't really matter on any ethical level because, it turns out, our brains were hardly thinking, anyway.
Meanwhile, research descendants of the direct-marketing industry found in technology a new way to keep track of the millions of consumers in their databases. Instead of analyzing our preferences individually, companies like Acxiom and Claritas used their newfound processing power to find correlations across all that data. If left-handed, cat-owning Coke drinkers commuting more than 8 miles to work in a two-door car tended to respond better to commercials about beer than those driving four-door cars, marketers had a piece of information they could use.
Computers then help them crunch all the data to come up with those correlations, and our use of everything from Gmail to Facebook provides them with countless more terabytes of data to play with than they ever could before. Each of us is no longer a person, but one of many possible overlapping model types. Once they know your model, all they need to do is stick one of your fellow models in an MRI machine, figure out his responses, and then apply the ones that worked on him to you.
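To make the mechanics concrete, here is a minimal sketch in Python of that segment-matching pattern. The attribute names, records, and response figures are all hypothetical, invented purely for illustration; this is not Acxiom's or Claritas's actual methodology, just the general idea of scoring a prospect by the model type they fall into rather than as an individual.

```python
# A minimal, hypothetical sketch of segment-based targeting.
# All attributes, records, and response figures are invented for illustration.
from collections import defaultdict

# Each record: a few lifestyle attributes plus whether the person
# responded to a particular beer commercial.
consumers = [
    {"handedness": "left",  "pet": "cat", "car_doors": 2, "commute_miles": 12, "responded": True},
    {"handedness": "left",  "pet": "cat", "car_doors": 2, "commute_miles": 9,  "responded": True},
    {"handedness": "left",  "pet": "cat", "car_doors": 4, "commute_miles": 10, "responded": False},
    {"handedness": "right", "pet": "dog", "car_doors": 4, "commute_miles": 3,  "responded": False},
    {"handedness": "right", "pet": "cat", "car_doors": 2, "commute_miles": 15, "responded": True},
]

def segment(person):
    """Collapse a person into a coarse 'model type': a tuple of attribute buckets."""
    return (
        person["handedness"],
        person["pet"],
        "2-door" if person["car_doors"] == 2 else "4-door",
        "long commute" if person["commute_miles"] > 8 else "short commute",
    )

# Tally responses per segment rather than per individual.
hits = defaultdict(int)
totals = defaultdict(int)
for person in consumers:
    key = segment(person)
    totals[key] += 1
    hits[key] += person["responded"]

# Response rate per model type: the correlation marketers act on.
rates = {key: hits[key] / totals[key] for key in totals}

# A new prospect is scored not as an individual but by the segment they fall into.
prospect = {"handedness": "left", "pet": "cat", "car_doors": 2, "commute_miles": 11}
print(rates.get(segment(prospect), 0.0))  # predicted likelihood of responding
```

The point of the sketch is the one the paragraph above makes: the prediction attaches to the model type, not to you; whoever shares your segment stands in for you.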
While all these developments, both psychological and commercial, may make the Manchurian Candidate look like child's play, I'm not so sure the decade technology conquered the brain will necessarily be followed by one in which we successfully exploit all of these findings.
Brains are tricky and adaptable organs. For all the "neuroplasticity" allowing our brains to reconfigure themselves to the biases of our computers, we are just as neuroplastic in our ability to eventually recover and adapt. When people first saw motion pictures, they leaped from their seats for fear that the train on the screen would come crashing into the theater. Just a few years later, what had once felt genuinely life-threatening was recognized as a two-dimensional illusion. Our biology might prove more nimble than our software.
And if it doesn't, at least we probably won't know the difference.
Douglas Rushkoff, a professor of media studies at The New School University and producer and correspondent for the PBS Frontline Digital Nation project, is the author of numerous books, including Cyberia, ScreenAgers, Media Virus, and, most recently, Life Inc., from Random House.