Women are bad drivers, Saddam plotted 9/11, Obama was not born in America, and Iraq had weapons of mass destruction: to believe any of these requires suspending some of our critical-thinking faculties and succumbing instead to the kind of irrationality that drives the logically minded crazy. It helps, for instance, to use confirmation bias (seeing and recalling only evidence that supports your beliefs, so you can recount examples of women driving 40 mph in the fast lane). It also helps not to test your beliefs against empirical data (where, exactly, are the WMD, after seven years of U.S. forces crawling all over Iraq?); not to subject beliefs to the plausibility test (faking Obama’s birth certificate would require how widespread a conspiracy?); and to be guided by emotion (the loss of thousands of American lives in Iraq feels more justified if we are avenging 9/11).
The fact that humans are subject to all these failures of rational thought seems to make no sense. Reason is supposed to be the highest achievement of the human mind, and the route to knowledge and wise decisions. But as psychologists have been documenting since the 1960s, humans are really, really bad at reasoning. It’s not just that we follow our emotions so often, in contexts from voting to ethics. No, even when we intend to deploy the full force of our rational faculties, we are often as ineffectual as eunuchs at an orgy.
An idea sweeping through the ranks of philosophers and cognitive scientists suggests why this is so. The reason we succumb to confirmation bias, are blind to counterexamples, and fall short of Cartesian logic in so many other ways is that these lapses have a purpose: they help us “devise and evaluate arguments that are intended to persuade other people,” says psychologist Hugo Mercier of the University of Pennsylvania. Failures of logic, he and cognitive scientist Dan Sperber of the Institut Jean Nicod in Paris propose, are in fact effective ploys to win arguments.
That puts poor reasoning in a completely different light. Arguing, after all, is less about seeking truth than about overcoming opposing views. So while confirmation bias, for instance, may mislead us about what’s true and real, by letting examples that support our view monopolize our memory and perception, it maximizes the artillery we wield when trying to convince someone that, say, he really is “late all the time.” Confirmation bias “has a straightforward explanation,” argues Mercier. “It contributes to effective argumentation.”
Another form of flawed reasoning shows up in logic puzzles. Consider the syllogism “No C are B; all B are A; therefore some A are not C.” Is it valid? Fewer than 10 percent of us figure out that it is, says Mercier. One reason is that evaluating its validity requires hunting for counterexamples (a scenario in which both premises hold yet every A is a C, for instance). But a knack for finding counterexamples can, in general, weaken our confidence in our own arguments. Forms of reasoning that are good for solving logic puzzles but bad for winning arguments lost out, over the course of evolution, to those that help us be persuasive but cause us to struggle with abstract syllogisms. Interestingly, syllogisms are easier to evaluate in the form “No flying things are penguins; all penguins are birds; so some birds are not fliers.” That’s because we are more likely to argue about animals than A, B, and C.
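For readers who want to see why the abstract syllogism holds, it can be checked mechanically. The sketch below (mine, not from the article or from Mercier and Sperber) brute-forces every way of sorting a small universe of elements into sets A, B, and C, and searches for a counterexample: a world where both premises hold but the conclusion fails. One assumption is worth flagging: the syllogism is valid only under the classical "existential import" reading, in which "all B are A" implies that at least one B exists.

```python
from itertools import product

def check_syllogism(universe_size=4, require_existential_import=True):
    """Search for a counterexample to: 'No C are B; all B are A;
    therefore some A are not C.' Returns True if no counterexample
    exists among all divisions of a small universe into A, B, C."""
    universe = range(universe_size)
    # Each element is independently in or out of A, B, and C: 8 combinations.
    for membership in product(range(8), repeat=universe_size):
        A = {x for x in universe if membership[x] & 1}
        B = {x for x in universe if membership[x] & 2}
        C = {x for x in universe if membership[x] & 4}
        premises = not (C & B) and B <= A      # no C are B; all B are A
        if require_existential_import:
            premises = premises and bool(B)    # classical reading: some B exists
        if premises and not (A - C):           # conclusion "some A are not C" fails
            return False                       # counterexample found: invalid
    return True

print(check_syllogism())  # → True: valid under the classical reading
```

Dropping the existential-import assumption (`require_existential_import=False`) turns up a counterexample in which B is empty, which is exactly why the modern and Aristotelian verdicts on this syllogism differ. Substituting penguins for B, birds for A, and flying things for C gives the concrete version quoted above.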
The sort of faulty thinking called motivated reasoning also impedes our search for truth but advances arguments. For instance, we tend to look harder for flaws in a study when we don’t agree with its conclusions and are more critical of evidence that undermines our point of view. So birthers dismiss evidence offered by Hawaiian officials that Obama’s birth certificate is real, and death-penalty foes are adept at finding flaws in studies that conclude capital punishment deters crime. While motivated reasoning may cloud our view of reality and keep us from objectively assessing evidence, Mercier says, by attuning us to flaws (real or not) in that evidence it prepares us to mount a scorched-earth strategy in arguments.
Even the sunk-cost fallacy, which has tripped up everyone from supporters of a losing war (“We’ve already lost so many lives, it would be a betrayal to withdraw”) to investors clinging to a losing stock (“I’ve held onto it this long”), reflects reasoning that turns its back on logic but wins arguments because the emotions it appeals to are universal. If Mercier and Sperber are right, the sunk-cost fallacy, confirmation bias, and the other forms of irrationality will be with us as long as humans like to argue. That is, forever.
Sharon Begley is NEWSWEEK’s science editor and author of Train Your Mind, Change Your Brain: How a New Science Reveals Our Extraordinary Potential to Transform Ourselves.