Memory isn’t what it used to be.
That much is clear from reading Joshua Foer’s new book Moonwalking With Einstein: The Art and Science of Remembering Everything. The title comes straight from his own stash of highly memorable images, developed in his quest to become the U.S. memory champion.
One year Foer, a journalist, wandered into the U.S. memory competition, and returned the next as a contender. In between, he trained in the intricate techniques for memorizing anything—a list of words, the order of a deck of cards, faces and names, random sets of numbers, the entire Iliad. These are the kinds of skills required of the “mental athletes” who compete on the memory circuit. If their real-world applications seem elusive, that’s because they mostly are.
The techniques that Foer describes are as effortful as they are old. They were invented by the ancient Greeks in the fifth century B.C., and they haven’t been significantly altered since. Foer’s training is mostly based on the “method of loci.” It requires you to visualize a location you know well—your childhood home, for instance—and mentally walk through it, depositing each item you need to remember in a specific place inside your “memory palace.”
You reflect on each item as you place it—the sight of it, the smell of it, the way it looks sitting there where you’ve put it. And later, when you need to retrieve these items, you simply walk through your memory palace, and there they are, right where you left them.
But beyond the obvious effort and concentration required to master such methods, anyone trying to apply them in day-to-day life faces a big-picture problem: what’s the point?
After all, isn’t there a fundamental difference between retaining a list of random digits and being able to remember experiences in their native richness, with all the emotions and asides that rippled through? This is one of the many questions Foer raises as he makes his way through the strange terrain of memory land.
The subject of memory has taken on a new urgency. The baby boomers are senior citizens, having senior moments. The rise of neuroscience in the last decade has made us aware of our brains in a way that we weren’t before. And, perhaps most significantly, technology is obviating the need to remember anything at all.
Welcome to the age of forgetting.
Before the advent of gigabytes, Wikipedia, and Watson, memory was more closely identified with intelligence. The shift is most evident in our schools, where critical thinking has replaced rote learning as the central goal of education. It also shows up in the dimming of quiz show glory. In the 1950s, contestants who used their turbocharged memories to win money on game shows became revered cultural figures, regarded as geniuses, and were even investigated by Congress.
Now, however, a successful appearance on Jeopardy is unlikely to yield anything more than a few thousand dollars. Today, savant-like information recall seems more like a parlor trick than a mark of real intelligence: cool, but not enviable. What’s the point of honing our internal stash of facts when all the facts are at our fingertips?
Maybe our downgraded respect for remembering reflects, in part, our discomfort at the thought that our machines have caught up with—even surpassed—us. We devalue the significance of memory in order to cope with the fact that our gadgets are now better at it than we are.
Foer’s book points to another moment in history when technology changed the role of memory. “If you were a medieval scholar reading a book, you knew that there was a reasonable likelihood you’d never see that particular text again, and so a high premium was placed on remembering what you read,” Foer writes. “You couldn’t just pull a book off the shelf to consult it for a quote or an idea.”
The invention of the Gutenberg press meant that books were no longer so rare that you had to imprint their contents onto your memory whenever you ran across them. Once they became retrievable, books changed the way people read. Now, information is even more easily tracked: every event easily documented, every opinion instantly available. Knowing this, do we know things differently? Do we engage more casually with our lives, on the theory that any experience can be reduplicated later?
The conversation can wait.
Didn’t go? Don’t worry, it’ll wind up on YouTube.
After all, if it’s not online, it didn’t occur.
One of the more intriguing ideas in Foer’s book: to know something, really, in the first place, is to memorize it.
“Memory training was considered a form of character building, a way of developing the cardinal virtue of prudence and, by extension, ethics,” he writes. “Only through memorizing, the thinking went, could ideas truly be incorporated into one’s psyche and their values absorbed.”
And here comes the shadow of doubt. The thousands of books we read in our lifetimes—where do they go? The revelatory hours spent in the grasp of a writer who amazes—but what did he say? We might take comfort in the idea that even if we can’t remember everything about these books—or sometimes anything—they still shaped our sensibility, and therefore they stay with us.
But is this, in fact, the case?
Foer’s book doesn’t break new ground on the topic of memory enhancement. His renderings are often too simplified to capture exactly those qualities that make memory such a special phenomenon, and such a personal one. Nevertheless, he has thought deeply about memory, and his effort yields questions that are well worth reflecting on—even if, by tomorrow, we no longer recall what they were.
Casey Schwartz is a graduate of Brown University and has a master’s degree in psychodynamic neuroscience from University College London. She has previously written for The New York Sun and ABC News. Currently, she’s working on a book about the brain world.