Congress is now on the verge of passing the final version of legislation to tighten the regulation of banks and other financial firms. The original version of this bill, which had been in the works for over a year, passed the Senate in the wake of the Securities and Exchange Commission’s decision to sue Goldman Sachs. The SEC alleges that in 2007 Goldman failed to disclose important information to investors in one of the collateralized debt obligations (CDOs) that famously contributed to the real-estate bubble and the ensuing chaos in the financial system.
The lawsuit focuses on information disclosure, and the new bill addresses specific trading practices and regulatory standards. But by focusing on the need for more information and tighter controls, the legislation neglects what might be the biggest factor contributing to the financial crisis: the human tendency to act as though we know more than we really do. We all labor under this “illusion of knowledge”—the intuitive belief that we understand almost everything we deal with, from simple household devices to the world’s most complex financial transactions and markets, better than we really do.
In our new book The Invisible Gorilla, we discuss a deceptively simple experiment done by psychologist Rebecca Lawson. She asked people how well they understand how a bicycle works. Many subjects in Lawson’s study rated their knowledge of bicycles to be quite good, but when they were asked to add the pedals, frame, and chain to a schematic picture of a bicycle, half of them made mistakes that would have rendered the bicycle impossible to ride (for example, connecting the chain to both wheels, rather than just the back wheel).
Many of us think we know how a bicycle works, and we could probably figure out how it works if it were right in front of us, but when we must rely on memory and imagination, our “knowledge” suddenly evaporates. The illusion of knowledge occurs when people misinterpret their surface familiarity with a concept (here, a bicycle) as a deeper understanding of its mechanisms (how the gears and chain transmit power to the wheels). The way to fight this illusion in your own thinking is to literally test your knowledge, like a diligent student who actually takes those end-of-chapter quizzes in the textbook rather than just looking them over. Trying to draw a diagram of a bicycle will show the gap between what you actually know and what you just think you know.
Now imagine something much more complicated than a bicycle—say, a synthetic CDO like the one at issue in the SEC-Goldman case. This CDO was derived from 90 residential mortgage-backed securities, each of which was based on hundreds or thousands of individual mortgages that varied in location, terms, creditworthiness, and a host of other characteristics, all of which determine how likely the bondholders were to get paid. Can you even visualize such a sprawling, interlocking structure?
During the housing boom, even professional investors rarely went to the trouble of reading the full documentation of a CDO, let alone doing independent research and testing their own knowledge of its likely risks and returns. In place of actual understanding, many money managers substituted two dangerous things: illusory knowledge (e.g., a belief that mortgage defaults were uncorrelated, or that mortgage bonds would perform over the next 10 years just as they had over the past two), and the herd-following behavior of buying the newest and shiniest financial product. Yet they thought they knew enough to put billions of dollars at risk. We don’t know whether Goldman’s clients were defrauded in this case, but we are fairly certain that they were not forced at gunpoint to invest substantial sums of money in trendy CDOs rather than something they understood better. As Goldman CEO Lloyd Blankfein told Congress, “This is not a transaction that had to be done.”
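The danger of the "defaults are uncorrelated" belief can be made concrete with a back-of-the-envelope simulation. The numbers below are illustrative assumptions, not figures from the article or the Goldman case: a pool of 1,000 mortgages with a 5% average default rate. If defaults are independent, pool losses cluster tightly around 5%; if they share a common driver (say, a nationwide housing downturn), the same average produces occasional catastrophic years that wipe out the "safe" tranches.

```python
import random

random.seed(0)

# Hypothetical pool: 1,000 mortgages, 5% average default rate.
N_LOANS = 1000
P_DEFAULT = 0.05
TRIALS = 2000

def pool_loss(correlated: bool) -> float:
    """Fraction of the pool defaulting in one simulated year."""
    if correlated:
        # Toy one-factor model: a bad year (10% chance) pushes every
        # loan's default probability to 35%; normal years sit at ~1.7%,
        # keeping the long-run average near 5% (0.10*0.35 + 0.90*0.0167).
        p = 0.35 if random.random() < 0.10 else 0.0167
    else:
        p = P_DEFAULT  # independent defaults, same average rate
    return sum(random.random() < p for _ in range(N_LOANS)) / N_LOANS

for corr in (False, True):
    losses = [pool_loss(corr) for _ in range(TRIALS)]
    tail = sum(l > 0.10 for l in losses) / TRIALS  # P(losses exceed 10%)
    label = "correlated " if corr else "independent"
    print(f"{label}: mean loss {sum(losses) / TRIALS:.3f}, "
          f"P(loss > 10%) = {tail:.3f}")
```

Under independence, a loss above 10% is a many-sigma event and effectively never happens; under even this crude correlation model it occurs in roughly one year in ten. Both pools have the same "expected" default rate, which is exactly why surface familiarity with that one number felt like understanding.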
The illusion of knowledge is an overlooked factor in many poor investing decisions. Whenever you pick an individual stock, you are acting on a belief that your knowledge of its likely risks and returns is superior to the collective wisdom of the market. (You might not proclaim your confidence explicitly, but it is implicit in the action you are taking—buying or selling a particular stock rather than an investment vehicle that isn’t tied to the workings of a specific company or industry.) Such superior knowledge might actually be possessed by insiders trading illegally, and perhaps by a few financial geniuses (though the plethora of hedge-fund “blowups” over the last few years suggests otherwise), but it is rarely seen in ordinary investors—or in “experts” who peddle advice on what to buy and sell.
Although we overestimate our knowledge of simple, mechanical devices like bicycles, we are particularly prone to illusions when we try to predict how complex, abstract entities will behave in the future. The financial crisis has been blamed on many actors, among them self-interested legislators, greedy investment bankers, lax regulators, reckless mortgage issuers, and inept rating agencies. All surely contributed. But the illusion of knowledge was a crucial precondition that allowed the other ingredients to combine in a toxic mix.
When a piece of illusory knowledge spreads to epidemic proportions and turns into common knowledge, bubbles and panics can result. To take just three recent examples of financial overexuberance, once a large enough cadre of people “knew” that biotechnology would revolutionize medicine, that the Internet would rewrite the rules of business valuation, or that residential housing prices would never go down, bubbles were set to form. Regardless of what new strictures are imposed on Wall Street, making market participants aware of the limitations of their own knowledge will help stave off similar crises in the future.
Christopher Chabris and Daniel Simons are cognitive psychologists who have each received accolades for their research on a wide range of topics. Their “Gorillas in Our Midst” study reveals the dark side of our ability to pay attention and has quickly become one of the best-known experiments in all of psychology; it inspired a stage play and was even discussed by characters on C.S.I. Chabris, who received his Ph.D. from Harvard, is a psychology professor at Union College in New York. Simons, who received his Ph.D. from Cornell, is a psychology professor at the University of Illinois.