Amazon Alexa and Siri accused of sexism for 'thanking users for sexual harassment'

'The assistant holds no power of agency beyond what the commander asks of it'

Olivia Petter
Wednesday 22 May 2019 14:09 BST
File image of Amazon Alexa Echo Plus.

Digital assistants such as Apple’s Siri and Amazon’s Alexa are reinforcing damaging gender stereotypes, according to a report by the United Nations Educational, Scientific and Cultural Organisation (UNESCO).

The publication, titled “I’d blush if I could”, found that the majority of voice assistants have female voices and respond to queries with a limited set of replies that reinforce harmful tropes about women being “subservient and tolerant of poor treatment”.

The report takes its name from one of Siri’s responses when a user tells it: “Hey Siri, you’re a b****”. On other occasions, the reply was “Well, I never!” or “Now, now”.

The report also notes that when Amazon’s Alexa is told, “You’re a slut”, the voice assistant replies: “Well, thanks for the feedback”.

UNESCO said that female voice assistants have been programmed to “greet verbal abuse with catch-me-if-you-can flirtation” and found that in some cases, female assistants even “thanked users for sexual harassment”.

The report, which also highlights the digital skills gender gap, added that the way these assistants have been programmed “may help biases take hold and spread”.

The responses are the result of tech giants such as Apple and Amazon being staffed by “overwhelmingly male engineering teams”, the report states, noting that women make up just 12 per cent of AI researchers.

“Because the speech of most voice assistants is female, it sends a signal that women are docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the report continues.

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

To combat the issue, the report recommends that engineers program digital assistants to “announce” that they are not human at the beginning of their interactions. It also advises tech companies to stop making female voices the default for these products and to introduce a gender-neutral option that would eliminate opportunities for bias.

“As intelligent digital assistants become ubiquitous, a machine gender might help separate technologies from notions of gender ascribed to humans, and help children and others avoid anthropomorphising them,” it concludes.

The Independent has contacted Apple and Amazon for comment.
