AI assistants like Alexa and Siri reinforce gender stereotypes, UN report says

Study urges tech giants to stop making female voices the default option

Smart assistants with female voices are contributing to harmful gender stereotypes, according to a new study from the United Nations.

The report highlights that digital helpers, usually given a female voice by default, are portrayed as "obliging and eager to please", reinforcing the idea that women are "subservient". The findings, published in a paper titled I'd Blush if I Could - named after a Siri response to a provocative user prompt - also describe the assistants' replies to insults as "deflecting, lacklustre or apologetic".

"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the report says, while calling for companies to put an end to default female voices.

Of course, this isn't the first time the gendering of voice assistants has come under scrutiny. As we reported last September, Amazon chose a female-sounding voice because market research suggested it came across as more "sympathetic". However, the very fact the assistant is named Alexa - very much a female name - already hints at gender stereotyping.

It's not much better with Microsoft's Cortana, which is named after the scantily clad female AI character from the Halo games. At present, you can't change Cortana's voice to a male one, and there's no indication you'll be able to do so any time soon.

Siri? Well, Siri is a female Scandinavian name typically said to mean "beautiful woman who leads you to victory" in Norse. Though, naturally, it also carries different meanings in other languages.

The point here is that, as the report indicates, digital assistants are created by humans and therefore carry the same stereotypes we do. Some companies have attempted to move away from this - Google, for example, now represents Assistant's voices by color rather than gender, offering a range of accents and gender options, with each user assigned one of the eight choices at random.

The solution, the UN says, would be to create gender-neutral assistants and to discourage insulting prompts from users. The study also warns against conditioning users to treat AI as a lesser, subservient being.

Whether the industry wakes up to the problems attached to gendering voice assistants before behaviors become too ingrained remains to be seen, but the responsibility lies squarely on the shoulders of the industry's biggest companies to push towards a neutral AI that reinforces different, and better, attitudes.

