Study suggests tech giants remove female voices as the default option
Smart assistants with female voices are contributing to harmful gender stereotypes, according to a new study from the United Nations.
The report highlights that digital helpers, usually given a woman’s voice by default, are portrayed as “obliging and eager to please”, reinforcing the idea that women are “subservient”. The findings, published in a paper titled “I’d blush if I could” – named after a Siri response to a provocative user prompt – also describe the assistants’ replies to insults as “deflecting, lacklustre or apologetic”.
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report says, while calling for companies to put an end to default female voices.
Of course, this isn’t the first time concerns about the genderization of voice assistants have surfaced. As reported last September, Amazon chose a female-sounding voice because market research suggested it came across as more “sympathetic”. However, the very fact the assistant is named Alexa – very much a female name – already hints at gender stereotyping.
It’s not much better with Microsoft’s Cortana, which is named after the scantily clad female AI character from the Halo series. At present, you can’t change Cortana’s voice to a male one, and there’s no indication you’ll be able to any time soon.
Siri? Well, Siri is a Scandinavian female name that in Norse translates roughly to “beautiful woman who leads you to victory”. Though, naturally, it carries different meanings in different languages.
The point here is that, as the report indicates, digital assistants are created by humans and therefore carry the same stereotypes we do. It’s something we’ve seen some companies attempt to move away from – Google, for example, now represents Assistant voices by color, spanning different accents and genders, with each user randomly assigned one of the eight color-coded options.
The solution, the UN says, would be to create gender-neutral assistants and to discourage insulting prompts from users. The study also warns against treating AI as a lesser, subservient being.
Whether the industry wakes up to the problems of genderizing voice assistants before such behaviors become too ingrained remains to be seen, but the responsibility lies squarely with its biggest companies to push towards a neutral AI that reinforces different, and better, attitudes.