Female-voice AI Reinforces Bias, Says UN Report
AI-powered voice assistants with female voices are perpetuating harmful gender biases, according to a UN study.
These female helpers are portrayed as "obliging and eager to please", reinforcing the idea that women are "subservient", it finds.
Particularly worrying, it says, is how they often give "deflecting, lacklustre or apologetic responses" to insults.
The report calls for technology firms to stop making voice assistants female by default.
The study from Unesco (the United Nations Educational, Scientific and Cultural Organization) is entitled I'd Blush if I Could, a title borrowed from Siri's response to being called a sexually provocative term.
"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the report says.
"Because the speech of most voice assistants is female, it sends a signal that women are... docile helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility," the report says.
"In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."
Research firm Canalys estimates that approximately 100 million smart speakers - the hardware that allows users to interact with voice assistants - were sold globally in 2018.
And, according to research firm Gartner, by 2020 some people will have more conversations with voice assistants than with their spouses.
Voice assistants now manage an estimated one billion tasks per month, according to the report, and the vast majority - including those designed by Chinese tech giants - have obviously female voices.
Microsoft's Cortana was named after a synthetic intelligence in the video game Halo that projects itself as a sensuous, unclothed woman, while Apple's Siri means "beautiful woman who leads you to victory" in Norse. Google Assistant has a gender-neutral name, but its default voice is female.
The report calls on developers to create a neutral machine gender for voice assistants, to programme them to discourage gender-based insults, and to announce the technology as non-human at the outset of interactions with human users.
The report also highlights the gender gap in digital skills, ranging from low rates of internet use among girls and women in sub-Saharan Africa and parts of South Asia to the declining uptake of ICT studies by girls in Europe.
According to the report, women make up just 12% of AI researchers.