The Butchering of the Age of Reason
Our country is in a crisis that threatens to set us back 400 years.
I know this sounds dramatic, and I am not talking about politics; I’m talking about reversing the Age of Reason, which ushered in a period of unprecedented intellectual, economic, and social growth. The Enlightenment, as it is also called, drew a line in the sand between rumor and fact, between testable hypotheses and anecdote, and between demonstrable facts and nonsense. Prior to the Age of Reason, people who heard voices in their heads might have been drowned or burned at the stake as witches; by the 20th century we had identified a biological disease that causes this (schizophrenia) and developed drugs to treat it. The Age of Reason led us to the germ theory of disease, penicillin, and, although it took a while and is still not ubiquitous, women’s rights, child labor laws, and a reduction in racism.
And, until the last few months, it allowed us citizens to engage in constructive, fact-based discussion with elected officials about public policy matters. It has allowed a free and independent press, with trained investigative journalists, to help us understand what is true and what is not. If the current administration brands facts it finds inconvenient as “fake,” it undermines the entire political system. If we are going to throw out facts as a prerequisite to discussion, we are reversing centuries of cognitive progress.
Part and parcel of the Age of Reason is education. I have devoted the past 25 years of my life to being an educator, and education works, particularly science education that teaches evidence-based thinking. And fortunately, education knows no boundaries of economic class, race, or religion, and it helps people across a wide range of IQs. That’s right, education does not privilege people at the high end of the IQ spectrum. A new construct, the Rationality Quotient (RQ), is completely independent of IQ, and we see people at both ends of the IQ scale showing both high and low RQ.
Here’s an example: Steve Jobs was clearly a brilliant person. But he was low in RQ in at least one life-changing way. When he was diagnosed with pancreatic cancer, he ignored the advice of his doctors because he didn’t trust conventional Western medicine. Instead, he followed an unproven program by an alternative medicine practitioner. By the time Jobs realized that it wasn’t working, it was too late for conventional medicine to do anything for him.
Now I’ve been a bit sloppy with language. I referred to “conventional Western medicine” and “alternative medicine,” but let me be clear: The very phrase “alternative medicine” is nonsense. There are not two kinds of medicine. When a treatment has been shown to work, we call it “medicine.” When there is no evidence for it at all, we call it “alternative medicine.” As soon as there is substantive evidence that it works, it ceases to be “alternative medicine” and becomes just plain “medicine.” (This formulation is from the great British journalist Ben Goldacre.) The antidote to low rationality is science education, and an understanding that evidence-based thinking isn’t something you do once; it’s something we have to remind ourselves to practice over and over.
Why do so many of us want to trust our gut and avoid rational thought? First, it is hard. We evolved in a world where gut and observation served us well as hunter-gatherers; there was no one compiling statistical information for us; math was not a thing. Second, as victims of information overload, we are all worn out, inundated with data and pseudo-data. We throw up our hands and say we don’t have the time to think through every claim we encounter. Third, most of us lack the education to engage in evidence-based thinking, or the role models to show us how, or both.
The failure to use evidence-based thinking showed up in a recent experiment by Claudia Fritz at the University of Paris VI on the preferences that master musicians have for certain aged violins—violins that were, as it happens, built during the dawn of the Age of Reason. Twenty of the top violinists in the world were given 12 violins to play. Half were prized older instruments, Stradivari and Guarneri “del Gesu” violins, and half were modern. The violinists wore goggles so that they couldn’t see the violins, and the new ones were nicked up a bit so that the violinists couldn’t feel the difference either. The musicians were given two opportunities to play them, in a small salon and in a concert hall; they were allowed to bring a “golden ears” friend to act as a second judge. Their task was to rank-order the violins in terms of desirability and to label them as old or new. These highly trained and highly discerning musicians utterly failed. Now that’s not a failure of rationality—that’s a good application of the scientific method. The failure of rationality came when Fritz et al. shared their data with the musicians.
Even after having seen the results with their own eyes (and heard them with their own ears!), they remained unconvinced. Said one, “the one thing that you cannot put into a new violin is that it’s been played for 300 years—these instruments change and develop.” Said another, “I would absolutely buy a new instrument, but for a later generation. They need to be broken in.”
Why is it that musicians and scientists reach different conclusions when considering the same data? Perhaps for the same reason that voters reach different conclusions when considering the same statements, claims, and data presented by politicians. We engage in expectation-driven perception, as opposed to evidence-driven perception.
Expectations retune neurons and change the way our retinas, our eardrums, and our brains work. They cause firing patterns in our brain consistent with what we think we saw or heard rather than what we actually saw or heard.
Simply knowing that an instrument has a certain history could alter auditory pathways so that they actually sound better to us—that is, if we know which one we’re hearing. Simply knowing that a person whose political views usually align with ours is speaking may cause us to evaluate the information less critically. Lord Chesterfield understood, over two hundred years ago, that we form impressions of others based on what we see and what we think, and that the seeing tends to overpower the thinking simply because seeing is so much easier than thinking. But we would do well to remember the words of his friend Voltaire: Those who can make you believe in absurdities can make you commit atrocities.
When we listen to someone we like, such as an elected official whom we have supported, we tend to accept what they say more trustingly, even gullibly. We filter their remarks through a cognitive bias that they have our best interests at heart. We focus on aspects of the remarks that confirm our hopes, and we discount those that confirm our fears. We do the opposite with elected officials whom we oppose, thinking there can’t be anything of value in what they say. This has led to the current polarization of political parties, and obstructionism in Congress. Evidence-based thinking would have us evaluate each statement objectively, and avoid jumping to conclusions. But that’s hard to do. As Harvard psychologist Daniel Gilbert has shown, under conditions of cognitive overload, we are much less likely to be able to do this.
So it takes some work. We need role models in positions of authority and influence to show us how evidence-based thinking works. Fortunately we have three institutions to help us, institutions that are the foundation of a free and democratic society, institutions that need our support more than ever. They are the judiciary, the independent press, and the scientific method. When the panel of federal judges reviewed the Trump administration’s immigration ban, note the language in the ruling: “We find no evidence.” Journalists reported that there was “no evidence” of WMDs in Iraq. Science reports “no evidence” of a link between vaccinations and autism. Former Republican President George W. Bush said in a recent interview that an independent press is “indispensable to democracy.” Funding for education, the independence of judges, and the freedom of the press should never be made political issues; supporting them supports the power of reason.
Daniel J. Levitin is Founding Dean of Arts & Humanities at the Minerva Schools at KGI and Distinguished Faculty Fellow at the Haas School of Business, UC Berkeley. His newest book is Weaponized Lies: How to Think Critically in the Post-Truth Era.