The congressional hearing on domestic drone use scheduled for today is the second in three months. Four states have already passed laws curtailing the use of drones by law enforcement, and 32 other states are actively considering similar measures. The speed and intensity—and remarkable bipartisanship—of the response to domestic drones are the latest signs that the technology occupies a uniquely sensitive spot in the public imagination.
Just look at the public outrage over rumors that the Los Angeles Police Department was using a drone to search for Christopher Dorner. No one objected when the department used helicopters with heat-sensing technology, dogs, and surveillance cameras to give it a leg up, but the idea of a drone was appalling. Or look at the people who demanded that Amazon stop selling a toy drone, when the rest of the toy aisle looks like a plastic arsenal. Even New York Mayor Michael Bloomberg, when telling people to get used to the idea of drone surveillance, acknowledges that they’re “scary.”
Why do drones elicit such a visceral reaction? Is it because they’re seen primarily as weapons of war? After all, “drone” refers to any unmanned aircraft with an autopilot, not just Predators and Reapers. Is it a fear of new technology? Or fear they’ll make privacy invasions cheap and easy? Or something else?
A simple answer, the one favored by the drone industry, is that drones are merely suffering from a branding problem. “Please, don’t call them drones,” says Michael Toscano, the president of the Association for Unmanned Vehicle Systems International, an industry trade group. “What do you think of when you hear the word ‘drone’?” he asks. “You think of something military, hostile, weaponized,” not the tiny four-propeller aircraft used by hobbyists and researchers.
And of course drones didn’t just get their reputation from a decade of being synonymous with targeted killing. They’re the latest iteration in a long genealogy of frightening military robots. From Terminators to Cylons, pop culture is littered with military robots that get out of control and turn against us. You can even go back to the Jewish story of the golem for an example of a superhuman robot, built to defend the Prague ghetto, that runs amok. As a culture, we’ve been warned about this for hundreds of years.
You can see drone proponents fighting against this deep-seated cultural prejudice against flying killer robots whenever they propose a new acronym. Toscano prefers “Unmanned Aerial System” because he wants to emphasize that the thing that flies is only 30 percent of a system of software and sensors that is ultimately under human control. In a Senate hearing on the drone program, retired Col. Martha McSally wanted to call them “Remotely Piloted Aircraft.” Each drone, she emphasized, is maintained and operated by 200 people on the ground. Both are trying to stress that the drone is only a tool, not an autonomous robot.
But all this rebranding is not going to work. And mostly that’s because when it comes to technology, we’re still not very sophisticated thinkers.
Millennia of interacting with other humans have left us ill equipped to deal with objects that sometimes act in humanlike ways. Studies have shown repeatedly that people apply gender stereotypes, personality, and social conventions to computers, even when they say they know better. It’s embarrassing how little it takes. In one study, people performed a series of tasks on a computer and then evaluated its performance; when asked to fill out the evaluation on that same computer, they gave a more flattering review than when filling out the evaluation on a computer across the room. In other words, they were polite to the computer. In another study, people were more likely to divulge personal information when the computer first told them something about itself, e.g., “This computer has been configured to run at speeds up to 266MHz. But 90 percent of computer users don’t use applications that require these speeds. So this computer rarely gets to use its full potential. What has been your biggest disappointment in life?”
When something acts even a little bit human—when it has eyes, or uses language, or even when it’s merely interactive—we treat it like a human. Seventy percent of Roomba owners name their vacuum robots, according to iRobot, the company that makes them. Soldiers in Afghanistan and Iraq give their bomb-defusing PackBots Purple Hearts and hold funerals for them when they die. “They are anthropomorphized,” says Clifford Nass, who studies interactive technology at Stanford, “and accurately so! They do in fact have intent, they get interested in things, engage with them.” A stationary camera, Nass says, we would treat as an object; maybe there’s a person watching on the other end, but maybe not—it doesn’t interact with us, and so we don’t think much about it. A camera merely sees, Nass says, but a drone seems to watch.
We all know we’re on camera whenever we’re in public—in stores, at traffic lights, etc.—but somehow when that camera is moving and interacting with things, it gives us the raised-hackles feeling of being watched.
But if drones creep us out more than stationary cameras, they are also in some ways more menacing even than human guards. Unlike a human, a drone can watch for days, even years; it can watch an entire city, zoom in close, and use heat sensors and infrared. “Technology gets better all the time,” says Nass. “Humans don’t.” Not only can this create a feeling of helplessness, it also lends drones an inscrutability that can feel threatening. “The scariest encounters are the ones where you don’t know what the other person is going to do, and I sort of know how people operate,” says Nass. Throw in a dash of military menace and you have a device seemingly designed to make people jumpy.
Some privacy advocates think that our propensity for being unnerved by drones will end up being a boon to privacy. You’re creeped out by a camera that happens to be flying, they argue; why aren’t you creeped out by all the ones that are already watching you? And what about the companies collecting data on what you buy, what you read, and where you are? What about the cookies on your own home computer? “It’s definitely become a big privacy issue in a way we often don’t see with other new technology,” says Jay Stanley, a senior analyst at the American Civil Liberties Union. “I think partly that’s because drones are very concrete, where online data mining is more abstract. A drone hovering above your yard—that feels like an invasion.”
Ryan Calo, a law professor at the University of Washington, thinks that drones will give people the jolt they need to bring our privacy laws into the 21st century. After all, he says, it was the threat of tabloid photographers armed with newly invented snapshot cameras in the late 19th century that prompted the first formulation of the modern right to privacy. Our conception of privacy never quite made it into the digital age, largely, Calo says, because contemporary privacy violations are so difficult to visualize. People who would be very upset to have someone—or some robot—peering in their window blithely give away reams of personal information online, information more revealing than anything one could glean by following them around with a camera. We all carry state-of-the-art surveillance technology in our pockets, something we often forget. “You’re already a walking sensor platform,” as the CIA’s chief technology officer bluntly put it.
If our intuitions about privacy are so out of sync with the technology, it’s no wonder our laws are too. The Supreme Court ruled only recently that attaching a GPS device to someone’s car counted as a search, and only then because officers physically trespassed on the suspect’s property to attach it. But what officer would bother doing that today, when most people carry far better tracking devices with them? Police need a warrant to tap your phone, but all they have to do to get your texts, old emails, and location is subpoena the phone company. Nor do police need a warrant to fly above your house and gather evidence; but because helicopters are expensive, they do so only in extreme cases. Cheap drones would likely change that equation. Calo hopes that our visceral response to drones will prompt a national conversation. It wouldn’t be the first time a new technology spooked us into updating our understanding of privacy.