Female-voice AI reinforces bias, says UN report
AI-powered voice assistants with female voices are perpetuating harmful gender biases, according to a UN study.
These female helpers are portrayed as "obliging and eager to please", reinforcing the idea that women are "subservient", it finds.
Particularly worrying, it says, is how they often give "deflecting, lacklustre or apologetic responses" to insults.
The report calls for technology firms to stop making voice assistants female by default.
The study from Unesco (the United Nations Educational, Scientific and Cultural Organization) is entitled I'd Blush if I Could, a title borrowed from Siri's response to being called a sexually provocative term.
"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the report says.
"Because the speech of most voice assistants is female, it sends a signal that women are... docile helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility," the report says.
"In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."
Research firm Canalys estimates that approximately 100 million smart speakers - the hardware that allows users to interact with voice assistants - were sold globally in 2018.
And, according to research firm Gartner, by 2020 some people will have more conversations with voice assistants than with their spouses.
Voice assistants now manage an estimated one billion tasks per month, according to the report, and the vast majority - including those designed by Chinese tech giants - have obviously female voices.
Microsoft's Cortana was named after a synthetic intelligence in the video game Halo that projects itself as a sensuous unclothed woman, while Apple's Siri means "beautiful woman who leads you to victory" in Norse. While Google Assistant has a gender-neutral name, its default voice is female.
Apple did make a male Siri voice available in 2013, and it is the default voice for languages including British English, Arabic and French.
The report calls on developers to create a neutral machine gender for voice assistants, to programme them to discourage gender-based insults and to announce the technology as non-human at the outset of interactions with human users.
The report also highlights the digital skills gender gap, from lack of internet use among girls and women in sub-Saharan Africa and parts of South Asia, to the decline of ICT studies being taken up by girls in Europe.
According to the report, women make up just 12% of AI researchers.