Behold, the noble spectacle of the human mind contemplating itself-and coming up confused. Is what we call the "mind" totally encompassed in the jangling nerve ends of the brain? Or does it exist in some separate, insubstantial state, hovering over the brainworks like a cartoon bubble? In short, do we experience a consciousness that is more than the sum of the brain's biology?
The issue is not just academic; it hits many of us where we would like to live. For most of us, consciousness defines our specialness; it's what sets us apart from the lower orders. We prefer to believe we're more favored than garden slugs-and perhaps less impermanent. If we are essentially defined by neurons, what about the little matter of soul?
As recently as 20 years ago, the idea of consciousness looked hopelessly old hat; in neuroscience, brain chemistry was the thing. But the advent of new scanning technology, with the promise of actually "seeing" thoughts in action, has prompted a fresh examination of mind theories. In response to philosopher Daniel Dennett's "Consciousness Explained," a widely admired 1991 book that debunks the idea of a central "observer" in the brain, some dissenting sages brand the author an "eliminativist." He calls them "defeatist" or "obscurantist." They label him "reductionist." The, um, mind reels.
Properly speaking, Dennett, who heads the Center for Cognitive Studies at Tufts University, is a materialist. For centuries, the consciousness argument was waged mainly between materialists and dualists. Dualists believed, with their patron saint, French philosopher-mathematician René Descartes, that mind and body are separate entities. Materialists maintained the opposite: that the mind is no more than the interactions among the brain's numberless neurons. Not to put too fine a point on it, the materialists won. In his 1949 book, "The Concept of Mind," noted English philosopher Gilbert Ryle delivered the coup de grace by deriding Descartes's notion of the mind as a sort of "ghost in the machine." The phrase reverberated through academia, and the Cartesians were laughed out of court. Dualists nowadays are considered as obsolete as manual typewriters. The current academic battle, for the most part, is among materialists, many of whom don't quite agree with Dennett's basic statement that "the mind is somehow nothing but a physical phenomenon. In short, the mind is the brain."
Or is it? For ordinary people, the notion of a central, transcendent consciousness retains a powerful appeal-probably because that's the way we experience things. Normally, we hear a single, inner voice giving form to our thoughts, passing judgments, making choices. In his 1989 book "The Cerebral Symphony," neurobiologist William Calvin describes this as "a unity of conscious experience," a kind of "narrator" of our mental life. The rest of us may call it simply the mind, or the soul. It's what convinces us we exist as individuals rather than mere cellular functions. In other words, it is us, our personalized sense of self-the free-floating "observer" in the brain. It's also the idea that Dennett decries as "the most tenacious bad idea bedeviling our attempts to think about consciousness."
In its place, Dennett offers his own theory of what goes on in there, employing the metaphor of a "homunculus," the tiny dwarf supposedly created in an alchemist's test tube. Early popular representations often showed a homunculus pushing buttons and pulling levers in the skull. Dennett posits swarms of these figurative imps, a "Pandemonium of Homunculi," all clamoring and competing for attention, like traders on the floor of the stock exchange. Each of them specializes in different aspects of perception-shape, language, motion and so on. As they go about their tasks, they confer with each other and form coalitions, producing "collated, revised, enhanced" drafts of the raw data they take in. The process goes on ceaselessly: "Information entering the nervous system is under continuous 'editorial revision,' so that at any point in time there are multiple 'drafts' of narrative fragments at various stages of editing in various places in the brain." Ultimately, we experience this as a sort of "silent narrative," a single, coherent stream of consciousness-in the same way that our eyes seem to bring us a clear, steady image of the world although they jiggle around like handheld cameras.
Dennett's work dazzles and neatly tracks various neuroscientific theories. Yet there remains something incomplete about it. He never clearly explains how the edited drafts find their way into print-that is, into acts of cognition or behavior. At most, he says that they "yield, over the course of time, something rather like a narrative stream or sequence" that allows us to think what appear to be consecutive thoughts. No single part of this process can be pinpointed as central. Instead, consciousness, if that's what it is, is "smeared all over the brain."
If there is no central consciousness, can there be a self? That, too, says Dennett, is something of an illusion. All organisms, he says, have a built-in, functional sense of self, based, like so much else in evolution, on survival. It ranges from the rudimentary protective instinct of the lobster that prevents it from eating its own claws, all the way up to the "magnificent fictions" of self that humans spin out of their cumulative experience, like bowerbirds assembling nests from a melange of found objects. In short, it's just a story, a "representation" of a self, concludes Dennett.
Is nothing sacred to these fellows? They seem almost to delight in grounding our most cherished ideas about ourselves in sheer biology-or is it technology? "The brain," insists MIT's Marvin Minsky, is just "hundreds of different machines ... connected to each other by bundles of nerve fibers, but not everything is connected to everything else. There isn't any 'you'."
Minsky, then, could be called a machinist. So, at heart, are many materialists, including Dennett. The machinists' model of consciousness is the computer; they see the brain as a superior model, somewhat more versatile than the industrial-strength Cray supercomputer. If this were "Star Trek," the machinists would be Klingons and their foes might be called Mysterians. The Mysterians are materialists, too; they agree that the mind is a function of the brain. But beyond that, they're convinced there are things we simply can't know, because we're not equipped. "It's like monkeys trying to do physics," says philosopher Colin McGinn, author of "The Problem of Consciousness." "Which is not to say it's miraculous or has anything to do with God. It's just not available to us."
The Mysterians irk Dennett. First they seem to say that consciousness matters, then they place it off-limits to investigation. To him, that's not only "obscurantism" but "defeatism." In principle, he says, there's no reason we can't learn a lot about the subjective life of any creature, be it a bat or a Martian or "just a fellow human being with a very different background." All it takes is a little knowledge and a lot of hard work.
But materialists of Dennett's persuasion are accused of leaving out feelings, or what some philosophers call the subjective state of being. One illustration they like to use is to imagine identical twins, one of whom, from birth, wears contact lenses that change the color red to green. When the twins play with a red ball, both recognize it as red, although one twin is seeing green. His inner sense of red, some argue, must be different from his brother's. In essence, the philosophers ask, what does it feel like on the inside to see red-or to bite your tongue? "How it feels to have pain is utterly mysterious to me," says Christof Koch, a computational neuroscientist at Caltech, in Pasadena, Calif. "I mean, if you have a toothache, why does it feel bad? If you try to describe the mind from the outside, there are 10,000 ways to do it. We can never test that."
The problem could be stated as the difficulty of actually feeling what someone's feelings feel like. For example, the machinists speculate that one day there will be robots programmed to simulate feelings so convincingly that they would seem to have consciousness. Yes, say the Mysterians, but would they really experience the taste of berries, the enchantment of love? The machinists reply: if their mouths pucker, or if they sigh a lot, who's to know the difference?
In "The Emperor's New Mind," published in 1989, English physicist Roger Penrose makes a stab at outlining some sort of new "physics of mind," in which the subatomic world of quantum mechanics could explain how thought arises from the brain in something like quantum jumps. Penrose rejects the computer model of consciousness, professing instead some sympathy for the "mystical" point of view. "It may well be there is something else going on in the brain that we don't have an inkling of at the moment," he says.
Dennett, for his part, says he has no compunction at all about demystifying consciousness; rather, he feels "something more like exhilaration in science explaining something that was heretofore mysterious." In any case, he says, his explanation is by no means meant to be the last word; the multiple-drafts thesis is "deliberately open-ended and sketchy. I'm merely predicting the theories we end up with will fall in this space. I'm trying to put up some lifelines in the fog."
It's still pretty foggy in there. Rutgers philosophy professor Jerry Fodor illustrates the problem by invoking an analogy originally used by his colleague Thomas Nagel: imagine you come across a wiggly, crawly creature. You put it in a box and leave it awhile. When you open the box, a butterfly flies out, and you say, "Gee, that's got to be the wiggly, crawly creature." Nothing else went in, so it has to be the same; but you can't imagine how it could be.
That's the problem with consciousness, the butterfly that emerges almost magically from the brain. "And I think basically that's the situation with materialism," says Fodor. "We're sure something like that, that consciousness is a mechanism of the brain, has to be true. It's totally unclear how it could be."
But Dennett lobs back an analogy of his own. To him, it's all a game of intellectual tennis: "One side says 'You can't explain this,' the other says, 'I can explain this little bit.' 'Yes, but you can't explain the rest.' 'OK, I'll explain a little bit more.' The question is, is there always a residue left over that the other side is right about, that is simply unreachable by objective science? They say yes, I say no, and in a way it's a point of faith."
At the risk of trampling the analogy, the players seem locked at deuce. Time, then, to appeal to a linesman like the venerable philosopher John Searle, author of "Minds, Brains and Science." Searle is less than impartial about Dennett's book. "It's not consciousness explained," he says. "It's consciousness explained away." Yet he has no doubt the game must go on. "I think it is madness to suppose that everything is understandable by us," he says. "However, let me say we have to act as if everything were."