UN: AI voice assistants fuel stereotype women are ‘subservient’

Ryan Daws is a senior editor at TechForge Media with over a decade of experience in crafting compelling narratives and making complex topics accessible. His articles and interviews with industry leaders have earned him recognition as a key influencer by organisations like Onalytica. Under his leadership, publications have been praised by analyst firms such as Forrester for their excellence and performance. Connect with him on X (@gadget_ry) or Mastodon (@gadgetry@techhub.social)

A report from the UN claims AI voice assistants like Alexa and Siri are fuelling the stereotype that women are ‘subservient’.

Published by UNESCO (the United Nations Educational, Scientific and Cultural Organization), the 146-page report titled “I’d blush if I could” highlights that the market is dominated by female-voiced assistants.

According to the researchers, the almost exclusive use of female voice assistants fuels stereotypes that women are “obliging, docile and eager-to-please helpers”.

The researchers also believe the lack of mannerisms required when speaking to current virtual assistants is problematic. They claim that because an assistant will respond to a request no matter how rudely it’s phrased, the idea that women are “subservient and tolerant of poor treatment” is reinforced in some communities.

Similarly, the fact that virtual assistants can be summoned with just a “touch of a button or with a blunt voice command like ‘hey’ or ‘OK’” makes it appear as if women are available on demand.

Most virtual assistants use female voices by default but offer a male option. Technology giants such as Amazon and Apple have in the past said consumers prefer female voices for their assistants, with an Amazon spokesperson recently describing these voices as more “sympathetic and pleasant”.

The report highlights that virtual assistants are predominantly created by male engineering teams. Some cases even found assistants “thanking users for sexual harassment”, and sexual advances from male users were tolerated more than those from female users.

Siri was found to respond provocatively to requests for sexual favours from male users, with phrases such as “I’d blush if I could” (hence the report’s title) and “Oooh!”, but would do so less towards women.

The inability of female voice assistants to defend themselves from sexist and hostile insults “may highlight her powerlessness,” claims the report. Such coding “projects a digitally encrypted ‘boys will be boys’ attitude” that “may help biases to take hold and spread”.

In a bid to help tackle the issue, the UN believes gender-neutral and non-human voices should be used. The researchers point to Stephen Hawking’s famous robotic voice as one such example.

Alexa, Google Assistant, and Cortana all use female voices by default. Siri uses a male voice in Arabic, British English, Dutch, and French.

