Science fiction seems obsessed with the idea of artificial intelligence—the singularity—taking its first metaphorical breath and immediately becoming evil. It’s understandable: it’s both a worrisome thought and an entertaining one.
It’s on everyone’s mind, too: from Isaac Asimov to Steven Spielberg to the folks at Google, people have been thinking seriously about ways to stop an artificial intelligence in just such a situation.
Perhaps it’s because of these fears that all of the robotics projects we’ve heard about recently are so focused on making a deferential machine that observes manners and rules alike. And as we begin to see more robots walking and driving around to test their functions, what we should really be excited about is their learning to be socially aware.
What are your main gripes in an average day? If you’re employed and in good health, the biggest social inconveniences are likely inconsiderate people around you: the guy who cuts you off in traffic or won’t put down his backpack to make more room in the subway; the people blocking an entire sidewalk by walking slowly; the woman who skips in line at checkout.
Robots, unlike some humans, are being not just told but taught how to avoid these sorts of faux pas, and the result is that they might be better behaved than most of us when they’re out and about.
Already Google and other companies are looking to make sure bots can participate in natural conversation with humans, and can make sense of idioms and proper nouns in the context of a conversation. This means smarter bots in customer service and other areas where a computer can do a lot of the work.
And they’ll do a lot of interacting in general over the next few years, both personally and professionally. IBM’s CEO even expects that computers will be at the boardroom table, involved in even the smallest business decisions. “There is no doubt in my mind that cognitive computing will impact every decision in five years,” Ginni Rometty said at Code Conference just a couple weeks ago.
All of that participatory work—being part of the social structure—means that they’ve got to observe not just the rules of polite conversation, nor just the rules of the road when they begin driving for us.
It’s also about the rules of the sidewalk, as they deliver packages or run errands. That’s why technologies like the sword-dodging drone are so important: drones need to be aware of the environment around them, and be able to react when, for instance, someone fails to hold a door for them, or an unaware construction worker swings a board into the drone’s way as it flies by on the sidewalk.
It’s also why researchers at Stanford are looking to teach sidewalk etiquette to robots—or rather to let them learn it for themselves.
The robot is programmed to collect and analyze data on how people move on a sidewalk. It’s meant to teach the robot how to read the signals of pedestrians around it: who’s trying to cross and head into a doorway, or who’s about to stop short and check their phone for an address.
That’s important stuff for small bots like Stanford’s Jackrabbot, so they don’t get kicked around. But it’s also important for larger robots, which would likely outweigh and out-balance humans upon accidental contact. No one likes the idea of being accidentally tackled by a 300-pound robot carrying groceries.
Stanford’s robot will (hopefully) eventually learn more subtle differences in social cues, like the difference between two people standing next to each other, and two people standing together in conversation. Which means that as this technology evolves, robots will be better at not interrupting private conversations, or walking between couples on the sidewalk, or crossing in front of people whose noses are buried in their phones.
We’re of course still years away from sharing the sidewalks with “I, Robot” types, but the idea that a greater share of pedestrian traffic would be polite is actually kind of nice to think about.
Of course, it does still make you wonder what would happen if, one day, a self-aware robot got tired of being cut off in traffic.
Maybe we should try and be polite to them, too.