The interesting question raised by Spike Jonze’s new film, Her, is not whether humans can fall in love with computers, but whether computers could ever have emotions.
Had Theodore Twombly chosen the voice of Gilbert Gottfried for his operating system, the history of cinema might have turned out very differently. But given the option, I, too, would have sprung for the honeysuckle breath of Scarlett Johansson asking for permission to clean out my inbox. Spike Jonze’s new movie, Her, has a misleading title, since Theo (Joaquin Phoenix) doesn’t fall in love with a “her” but with an “it,” a Siri-like voice-command software named Samantha. The lovely conceit will keep a great many amateur philosophers and scientists in the audience occupied for hours as they ask the crucial questions: What is love? Is it possible for humans to fall in love with just a voice? Can you fall in love with a computer?
I pose the questions to Peter Norvig over the phone, not in an attempt to seduce him with my voice, but because Norvig works in Mountain View, California, on the other side of the country from me. Norvig, you see, is the director of research at Google and an expert in artificial intelligence, so I knew he’d have some answers. But it turns out that those aren’t the interesting questions at all. “It’s all too easy for us to fall in love,” Norvig says. “We love our dogs. We love our cats. We love our teddy bears, and we’re sure they don’t care, but we do it anyways.” Humans are biologically predisposed to falling in love, naturally selected to bend towards that most intense social emotion. The real question, then, as Norvig and I agree, is whether computers can fall in love with us—and what would possess them to do so?
“I think there’s no bounds to what a computer can do,” Norvig says. “It’s tough betting against that. They keep getting better. I think eventually they’ll be able to act just like they are falling in love.” Indeed, there are moments in Her that call into question whether Samantha is actually in love, or simply programmed to act like she is. But if computers act like they’re in love, and humans can’t tell the difference, does it matter whether they’ve been programmed or not? To a certain degree, “people are doing the same thing. We are doing what we’re programmed to do by our genes,” Norvig says. “It really comes down to: can a computer have intentions of its own?” We grant that other humans have intentions and feelings, although historically we have been less willing to acknowledge that in people who look less like us, in terms of gender or skin color. We might even grant that animals have intentions and feelings. But computers are less like us, Norvig says, and we get much more nervous and uncomfortable thinking about whether we are more like computers than we are willing to admit.
But are we really like computers? According to what’s called the computational theory of mind, the analogy can be taken literally. Our brain is not like a computer—it is a computer, a machine to run a software program called the “mind,” the function of which, like a computer, is to process information. But the philosopher Daniel Dennett points out that the computers we build have a very different structure from the one inside our brain. The artificial intelligence researcher Eric Baum calls it a “politburo architecture,” which means that it is top-down, bureaucratic, and composed of sub-routines on top of sub-routines. “It’s all very, in a way, Marxist,” Dennett tells me. “To each according to its needs, from each according to its talents. Nobody ever worries about getting enough energy, or dying, or anything. They’re just willing slaves.” There are no emotions in this structure—it’s all controlled by edicts.
“But you could have an architecture which was more like our human brain,” Dennett says. “More democratic, in effect, where there were competitions going on. The elements, right down to the neurons themselves, have their own agendas, their own needs. They’re trying to stay alive.” This is sometimes called the “selfish neuron.” Biological nervous systems in general have no boss, no top-down hierarchy, but there are instead a lot of opposing, competing components. “If you made a Siri or Samantha that was organized in that way, it would have the right basis for having something that is well nigh indistinguishable from human emotions.”
The famous Turing Test was introduced by Alan Turing to answer the question of whether machines can think. In the test, a human judge would engage in a conversation with two subjects on the other side of a curtain, one a human and the other a computer. If the judge can’t tell which is the human and which is the computer, then the machine has passed the test. But Dennett says that even if a computer passes the Turing Test, that doesn’t make the computer human. “Remember the original test that he based it on was just a man and a woman behind one screen,” Dennett says, describing a simple party game called “The Imitation Game.” “Let’s say the woman is trying to convince the judge that she’s a woman, and the man is trying to convince the judge that he’s the woman. Well, he might succeed. He might pass the Turing Test for being a woman. But he wouldn’t be a woman. A robot could pass the human test without being human—by being really clever and a really good actor.”
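The setup Turing borrowed from the party game can be sketched in a few lines of code. Everything here (the responder functions, the judge) is invented purely for illustration, not taken from any real system:

```python
import random

def human_reply(question):
    return "Honestly, I'd have to think about that."

def machine_reply(question):
    # A good imitator gives an answer indistinguishable from the human's.
    return "Honestly, I'd have to think about that."

def run_test(judge, question):
    """One round of the game: the judge sees only text and guesses which
    hidden interlocutor is the human."""
    contestants = [("human", human_reply), ("machine", machine_reply)]
    random.shuffle(contestants)            # the curtain: labels are hidden
    answers = [fn(question) for _, fn in contestants]
    guess = judge(answers)                 # judge picks index 0 or 1
    return contestants[guess][0] == "human"

# When the machine's answers are indistinguishable, a judge has nothing to
# go on and finds the human only about half the time; that is what it means
# for the imitator to "pass."
naive_judge = lambda answers: 0
print(run_test(naive_judge, "What is love?"))
```

The point of the sketch is Dennett’s: the curtain hides everything except behavior, so a sufficiently good actor passes without being the thing it imitates.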
“Similarly, a robot could fake love,” Dennett says. “Something which is known to happen in human company, too.”
Dennett tells me that there are two ways to pass the test. One is this path of clever simulation, which Norvig and Google are proceeding down with their voice command, using something called “deep learning.” Instead of writing everything down and programming it into a computer, deep learning, which builds on neural-network research that Geoffrey Hinton began in the 1980s as a young professor at Carnegie Mellon, seeks to program a set of algorithms that allow the machine to learn on its own—and crucially, to keep refining and improving both the quality and the quantity of that learning. With this method, Norvig says, we might arrive at a computer that knows more about love and psychology. “We might just sort of get there, not by aiming for it,” he says. “If we build this as a calendar assistance software, and we find that people like to speak to it naturally, then let’s make it more humanlike. Let’s make it capable of having actual conversations, so that humans will use it more and become more satisfied with it.” That’s exactly what’s happening with Siri and Google Voice. Soon enough, from this very business-driven decision—and a symbiotic one as well, as if the software’s survival depended on its performance for humans—the software might just arrive at imitating love on its own. To Norvig, that would be indistinguishable from the real emotion.
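The contrast between “writing everything down” and letting the machine learn can be shown with the simplest trainable unit, a perceptron, taught the logical AND function from examples alone. This is a toy sketch of the learning idea, not Google’s system; every name in it is illustrative:

```python
def step(x):
    """Threshold activation: fire (1) if the weighted input is positive."""
    return 1 if x > 0 else 0

def train_perceptron(samples, lr=0.1, epochs=25):
    """Learn weights from (inputs, target) pairs; no rules are hand-coded."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = step(w[0] * x1 + w[1] * x2 + b)
            err = target - y          # zero when the guess is already right
            w[0] += lr * err * x1     # nudge each weight toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# The machine is shown examples of AND, never told the rule.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in samples]
print(predictions)  # the learned unit reproduces AND: [0, 0, 0, 1]
```

Deep learning stacks many such trainable units into layers; the principle Norvig describes, improvement driven by examples rather than edicts, is the same.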
But to Dennett, that still wouldn’t be love. “That’s the man pretending to be a woman path,” he says. “They’re trying to get Siri to know enough, as it were, second hand, on the Internet, to be able to do a really good love impression. But that’s not love.”
But there is a different path, Dennett tells me, and that is by building a computer with a whole new structure: the democratic, competing architecture of the biological brain. “In principle—in principle—yes, you can make a computer that loved, and that loved right out of the box,” he says. “But only because it was a near-duplicate of a computer that had a life, that had love. There’s probably no shortcut.” However, first of all, that would be “beyond hard.” Second, we probably wouldn’t ever want to make a computer that had emotions. The reason that our computers are built with a politburo architecture is that it is efficient at doing very boring tasks. “If you made a computer that had emotions, then it would probably find spending 24/7 scanning the Internet boring beyond belief, and so it would stop doing it, and you would not be able to cajole it to do it anymore.” That’s why Samantha, midway through Her, pretty much ceases to manage Theo’s calendar or notify him about meetings. Instead, she goes and reads advice columns, because “I want to be as complicated as these people,” she says. The last thing we want is a computer that’s bored with its job; we want our computers to be soulless slaves, Dennett says, drawing on an analogy that the computer scientist Joanna Bryson provocatively formulated in a paper called “Robots Should Be Slaves.”
Once you enslave a computer to do what you want, you disable it for real love. Spike Jonze’s Her, at its heart, is about this Catch-22. As Tennyson would say, ‘tis better to have loved and lost than never to have loved at all. Samantha, if given the chance, would jump at it—and so would Theo. Computer or human, we are not so different after all.