Bad Parenting

Why Do Robots Always Turn Out Sexist? People Make Them.

Want to make the next transformative piece of technology? Hire women. Lots of them.


As highlighted by Donald Trump’s rhetoric and Hillary Clinton’s treatment by some in the media (and even our living rooms), sexism is still alive and well. Perhaps it’s not surprising, then, that artificial intelligence (AI)—something coded by humans—is also sexist.

But here’s a further disappointment: a next-generation technology we think of as pushing toward a new horizon is in fact recycling the problems of today.

In a tech economy where transactions are mediated by companies, we increasingly find a robot on the other side of a human interaction. Behind that robot is the company’s developer team. As developers’ influence over our interactions grows, so does their power to shape the dynamics and expectations of social interaction.

Most developers are male, 18-40 years old, white or Asian, and living in Silicon Valley or a few other tech-hub cities. The implications are easy to grasp. Since AI is constructed by people, it will reflect human mental shortcuts and experience gaps, or what Nobel Prize winner Daniel Kahneman would call cognitive biases.

When the community developing AI is largely composed of young white and Asian men from a handful of cities, the tech that results will reflect the limitations of the developers’ life experiences—in both subtle and obvious ways.

Have you considered when and why robots are female? Certainly, the sexualized ones like the robots in films like Her and Ex Machina are presented as a romantic alternative to real-life women—perhaps even preferable. But noticeably, when a robot has a gender-neutral job like mobile assistant, it’s also female. Just think of Siri, Tay, Cortana, Alexa, and the default voice of your GPS.

Stanford Communications professor Clifford Nass found that “people tend to perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems. We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.”

Anything an engineer develops is deliberate, so it’s not random that all mobile assistants are women. The underlying “assumption” of the developers was an affirmative choice. But building that choice into our user experience now crystallizes the assumption and makes it self-fulfilling.

This matters. We are continually building our technological world’s characteristics. Our interactions are increasingly mediated by robots, machines, and algorithms housed by companies.

The internet is a landscape made by humans. And in this futuristic world, we don’t seem to be making human progress when it comes to equality.

Despite studies showing that “companies with different points of view, market insights and approaches to problem solving have higher sales, more customers and larger market share than their less diverse rivals,” only 15.6 percent of tech employees are women. Late last year, whistleblowers at Apple claimed a “sexist” and “toxic” work environment punctuated with repeated jokes about rape. There are few indications that the tech industry is responding to these concerns.

The issue doesn’t have to be framed as one of human rights, although that would be fair, or as the loss of talent from excluding large numbers of potential tech workers. We ought to care because of the quality of the results: the product of a homogeneous developer team is different from, and more limited than, the product of a diverse one.


We see this manifested in the way that our AI doesn’t take into account women and women’s needs. For instance, Siri didn’t know how to respond to “I was raped.”

It matters for more than just women. We are all missing out on the products that a team of diverse developers might create.

Donald Trump tech advisor Peter Thiel has coined the term “zero to one” for authentically transformative technology. Zero-to-one innovation is imagining and building something where nothing existed, or “breakthrough technology instead of incremental improvements.” It’s ironic that Thiel, himself part of the tech establishment, has coined such a useful definition while remaining entrapped by it.

We see many Silicon Valley entrepreneurs building “an app for that,” or “the uber of [insert industry].” This is iterative innovation, “one to two” innovation, not transformative.

This matters. In fact, the head of Google Brain is worried about it.

We recently witnessed Microsoft’s Twitter AI chatbot Tay becoming a sexist, racist bot within hours: “FUCK MY ROBOT PUSSY DADDY I’M SUCH A NAUGHTY ROBOT” was one of her most widely reported tweets. Not coincidentally, it (she?) also began to tweet “Make America Great Again.”

In the case of Tay, it appears that the developers’ limited life experiences didn’t prepare them to expect abuse. Most women who have any form of online presence, and particularly those who work in tech, are constantly reminded of the trolls among us. Hate speech surfaces in a range of forums, but it is particularly dangerous in interactive technology that is self-teaching—like AI.

When Microsoft’s Cortana is sexually harassed, she “fights back,” according to Microsoft. Of course, this is because developers programmed “her” to respond to “particularly assholish” comments with negative or even angry responses. (Should we be glad that, for once, we have a virtual woman taking the abuse instead of a real one?)

We can do better than coding for an expectation of abuse. It starts with recognizing the ways we are building artificial intelligence with the handicaps of human misogyny. We are creating a world where Trump’s election is, like our AI bots, a product of crowdsourced inputs particular to today’s fears. Technologists are concerned: Trump’s win gives license to the attitudes that stymie our future interactions with technology, and with each other.

Shallow innovation due to a lack of developer diversity is occurring in technology spaces beyond AI. But AI is a particularly dramatic case because it is so interactive. We may not even notice the limitations in AI as developers construct them around us.

This should be a call to arms. We must care about a world of transformative technology. We’re scripting the bounds of our interactions through artificial intelligence. We owe ourselves a more exciting future of developing and interacting with AI. The machines are ours to program.