By Hannah Dawson For The Daily Mail
Published: 00:08 BST, 22 May 2019 | Updated: 00:08 BST, 22 May 2019
Alexa is a common cry in most homes and the smart speaker has become a household staple.
But AI-powered voice assistants with female voices may not be the norm in the future, as they encourage harmful gender biases.
A UN study revealed that the voices used by smart speakers reinforce ideas that women are 'subservient' as they are portrayed as 'obliging and eager to please.'
It also criticised the way female AIs respond to gender-based insults with 'deflecting, lacklustre or apologetic responses.'
The Unesco report, called 'I'd blush if I could', calls for technology firms to stop making voice assistants female by default and to hire more women to work on them.
Prompts on how to use Amazon's Alexa personal assistant are seen in an Amazon 'experience centre' in Vallejo, California
The title is borrowed from a standard response from Siri, Apple's first mobile assistant, and is what the automated voice said in response to being called a 'b**ch'.
The 146-page report said: 'Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised