Chatbots and digital assistants are everywhere. Start typing questions in a chatbox on a website, and there’s a good chance you’re not conversing with a living, breathing human. And whom did you consult about the weather this morning? Did you tune into your local news station, or did you simply ask Alexa while standing in your closet, trying to decide how many layers you’d need?
The truth is, chatbots save companies money. Juniper Research predicts that bots will save banks roughly $7.3 billion in operating costs by 2023, and financial institutions are doing their research and constructing AI assistants accordingly. But the question must be asked: why are so many of them “female”?
Good, old-fashioned sexism is one explanation. As Monica Nickelsburg wrote for GeekWire, “Virtual assistants like Siri, Cortana and Alexa perform functions historically given to women. They schedule appointments, look up information and are generally designed for communication.”
The dark side of this, Vitor Shereiber Nogueira, a language specialist for the German language-learning app Babbel, told Forbes’ Parmy Olson, is that bots may perpetuate unrealistic expectations for women in the business world, much as Photoshopped images of women have warped beauty standards in recent years.
Plus, notes a 2018 article on Medium by integrate.ai, “There’s good reason to feel uncomfortable with the dominance of female characteristics in these products while women themselves continue to be deeply underrepresented as engineers.”
(Interesting side note: Many chatbots in the U.K., including the British version of Siri, are male. Why? In a 2018 interview with NPR’s Laura Sydell, Justine Cassell, member of the Equal AI initiative and associate dean of Carnegie Mellon’s School of Computer Science, attributed this to another stereotype, that of the English butler or valet.)
Darker still, there’s evidence that people routinely verbally abuse their digital assistants and chatbots, and Robert LoCascio, CEO of LivePerson, believes this has troubling implications for societal behavior as a whole. Also speaking to Sydell, he said, “If you talk derogatory to an Alexa, children pick this up. They go back to school, and they think this is the way you talk to someone, and this is maybe the way you talk to women.”
A second, less insidious explanation for the female bot is trust. In short, companies doing their research have found that people respond better to a female voice. Nickelsburg’s article notes that depictions of AI in pop culture tend to play out in two ways, “malevolent or subservient.” (Think Kubrick’s HAL 9000 vs. the Samantha bot in “Her” voiced by Scarlett Johansson.)
Quoted in GeekWire’s article, Michelle Habell-Pallan, associate professor in gender, women and sexuality studies at the University of Washington, describes reactions to female voices as, “… more warm, more welcoming, more nurturing, all those associations that are connected with women that are not necessarily essential qualities but are socially constructed.”
integrate.ai’s article likened chatbots to “sources of knowledge, modern oracles,” and reminded readers that it’s important to ensure that women are seen as able to provide authoritative information. The article suggested that rather than degrading interactions with women, bots programmed to respond in kind to polite conversation may actually improve societal treatment of women.
Still, there’s a third, largely unexplored avenue that could be worth taking, especially as the world begins to grapple with the idea of non-binary gender identities. Do chatbots, which are not living beings, need to be gendered at all? Sure, Hollywood tells us again and again that we must check the male or female box when creating our latest robot hero or villain, but must we? Speaking to Sydell, Cassell offered that with “a little pitch adjusting to the voice, it’s harder to tell its gender.”
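To see why a little pitch adjustment goes a long way, consider that typical adult male speech sits around 85–180 Hz and typical adult female speech around 165–255 Hz, so nudging a low fundamental toward the overlap makes it ambiguous. Below is a minimal, illustrative Python sketch of that idea using a pure tone as a stand-in for a voice. It is not how commercial assistants process speech; real systems use duration-preserving techniques such as phase vocoders, whereas this naive resampling also changes the clip’s length.

```python
import math

SAMPLE_RATE = 16_000  # samples per second

def sine_tone(freq_hz, duration_s=1.0):
    """Generate a pure tone (a toy stand-in for a voice's fundamental pitch)."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

def pitch_shift(samples, factor):
    """Naive pitch shift: resample with linear interpolation.

    Reading through the signal `factor` times faster raises the pitch by
    that factor -- and, unlike production methods, also shortens the clip.
    """
    out, pos = [], 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += factor
    return out

def estimate_pitch(samples):
    """Rough fundamental-frequency estimate via upward zero crossings."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings * SAMPLE_RATE / len(samples)

# 110 Hz sits squarely in a typical male pitch range; shifting up by 1.5x
# lands near 165 Hz, right at the edge where the ranges overlap.
low_voice = sine_tone(110)
ambiguous = pitch_shift(low_voice, 1.5)
print(estimate_pitch(low_voice), estimate_pitch(ambiguous))
```

The same one-line change of the `factor` argument would push a stereotypically high voice down into the ambiguous band, which is Cassell’s point: gender cues in a synthetic voice are a tuning parameter, not a necessity.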