AI assistants like Alexa and Siri reinforce gender stereotypes, UN report says

Study suggests tech giants remove female voices as the default option


Smart assistants with female voices are contributing to harmful gender stereotypes, according to a new study from the United Nations.

The report highlights that digital helpers, usually given a female voice by default, are portrayed as "obliging and eager to please", reinforcing the idea that women are "subservient". The findings, published in a paper titled "I'd blush if I could" - named after a Siri response to a provocative user prompt - also describe the assistants' replies to insults as "deflecting, lacklustre or apologetic".


"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the report says, while calling for companies to put an end to default female voices.

Of course, this isn't the first time the gendering of voice assistants has come under scrutiny. As reported last September, Amazon chose a female-sounding voice because market research suggested it came across as more "sympathetic". However, the very fact the assistant is named Alexa - very much a female name - already hints at gender stereotyping.

It's not much better with Microsoft's Cortana, which is named after the scantily clad female AI character from the Halo games. At present, you can't change Cortana's voice to a male one, and there's no indication you'll be able to do so any time soon.

Siri? Well, Siri is a female Scandinavian name that typically translates to "beautiful woman who leads you to victory" in Norse, though, naturally, it also has different meanings in other languages.

The point here is that, as the report indicates, digital assistants are created by humans, so they inherit the same stereotypes we hold. It's something we've seen some companies attempt to move away from - Google, for example, now represents Assistant voices by color, with different accents and gender options; each user is randomly assigned one of the eight choices as their default.

The solution, the UN says, would be to create gender-neutral assistants and to program them to discourage insulting prompts from users. The study also cautions against treating AI as a lesser, subservient being.

Whether the industry wakes up to the problems attached to gendering voice assistants before behaviors become too ingrained remains to be seen, but the responsibility lies squarely on the shoulders of the industry's biggest companies to push towards a neutral AI that reinforces different, and better, attitudes.

