
Why Are Most Voice Assistants Female & How Do They Reinforce Gender Stereotypes?

The issue lies in the names of AI voice assistants like Alexa and Cortana and the tasks they are made to carry out.

Madhusree Goswami
Explainers

By default, these devices often associate female voices with domestic and ‘assistant’-type tasks.

(Photo: Aroop Mishra/The Quint)


Tech voice assistants such as Alexa, Siri, Cortana, and Google Assistant have undoubtedly become a part of our everyday life. From helping you remember birthdays to putting together your shopping list, they are here to make your life easier.

And they have been around for quite some time now: Siri was launched in 2011, and Alexa followed in 2014. Yet have you ever noticed that AI assistants are most often voiced by women?

Of course, one may argue that several voice-automated AI systems now let you opt for a male voice.

However, the fact remains that the default voice of most voice assistants is female, and most of them have female names.

But why is that an issue at all? Do voice assistants reinforce gender stereotypes? The Quint answers all these questions for you.

Why Are Feminine Voices the Default?

When AI voice assistants first hit the market around a decade ago, a majority of them were launched with a feminine voice and, in many cases, a feminine name. While many of the firms responsible have since created more masculine-sounding options, the default remains female.

One commonly cited reason for this feminine touch in virtual assistants lies in social science and history, though much of that reasoning is controversial and widely debated.

For one, companies spent decades acquiring more recordings of women's voices than of men's. As a result, many AI voice assistant developers turn to a feminine voice as a matter of convenience.

For example, when Google first launched its Google Assistant in 2016, it opted for a woman’s voice (but a noticeably gender-neutral name), citing a historical bias in its text-to-speech systems, which had been mostly trained on feminine voices.

The other theory for the heavy lean towards feminine voices in AI virtual assistants has to do with biology: many studies over the years have indicated that more people prefer listening to feminine voices.

Another belief (since debunked) held that women articulate vowel sounds more clearly, making their voices easier to hear over background noise on small speakers.

Largely, studies indicating these preferences have either been disputed or shown to be flat-out wrong.

What Do Studies Have To Say About Gender Bias in AI?

Gender bias is rife in AI systems. As a 2019 report by the United Nations Educational, Scientific and Cultural Organisation (UNESCO) noted, AI voice assistants projected as young women perpetuate harmful gender biases.

“Because Alexa, Cortana, Google Home and Siri are all female exclusively or female by default in most markets, women assume the role of digital attendant, checking the weather, changing the music, placing orders upon command and diligently coming to attention in response to curt greetings like ‘Wake up, Alexa,’” the report read.

The report pointed out that because the voice of most voice assistants is female, it sends a signal that women are obliging, docile, and eager-to-please helpers, that they are available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.

How Does Pop Culture Perpetuate This Stereotype?

The stereotyping extends into fiction as well. For example, JARVIS, the popular AI assistant to Tony Stark in Marvel’s Iron Man and Avengers movies (2008-2013), is more of a companion than a subservient, automated system. Voiced by actor Paul Bettany, JARVIS helps Stark save the world.

In comparison, Samantha, a female AI assistant voiced by actress Scarlett Johansson in the movie Her (2013), talks about relationships and plans dates for the protagonist played by actor Joaquin Phoenix.


The Problem With Typecasting Women

A growing number of people around the world use voice-automated software to help them go about their day. The number of voice assistants in use is expected to reach 8.4 billion by 2024, according to research and data firm Statista.

The issue with the 'default' setting in the devices lies in the female-gendered names of AI voice assistants like Alexa and Cortana and the tasks they are typically made to carry out.

While Microsoft's Cortana was named after a synthetic intelligence in the video game Halo that projects itself as a sensuous unclothed woman, Apple's Siri means "beautiful woman who leads you to victory" in Norse. 

Now, for instance, think about the tasks you assign these assistants (like creating your shopping list, remembering dates on your calendar, and so on). They aren't very complex, are they?

But what is actually happening is that these devices are helping associate female voices with domestic and ‘assistant’-type tasks.

One can argue that the settings on individual devices can be changed. But, let's be honest, how many of us actually play around with the settings anyway?

The conversation around this type of stereotyping is hitting social media as well.

“When Alexa can keep track of your shopping list, set your alarm, turn on the heating and lights, even keep your kids entertained with games, it’s not that we prefer women’s voices, it’s that we prefer women doing domestic tasks," feminist cartoonist Lily O’Farrell said in an Instagram post.

Meanwhile, the companies behind these virtual assistants have drawn criticism for casting them as female, possibly perpetuating stereotypes that women are submissive, obedient, and the ones who carry out domestic chores.

What started out as a hands-free, eyes-free form of human-computer interaction has sparked discussions around gender bias and the role of artificial intelligence (AI) in our lives.

Does the Bias Extend Beyond Smartphones?

Yes. From GPS instructions in vehicles to lifts in buildings, female voices are also used as the default in everyday applications.

Verena Rieser, a professor of computer science at Heriot-Watt University, told Cosmopolitan UK that “the feminisation of AI personal assistants can reinforce negative stereotypes of women as subservient, as these systems often produce responses which are submissive at best, and sexualised at worst.”

She added that there is a danger that the interactions "we have online with AI ‘spill over’ into the real world."

She highlighted that AI is not neutral: because it is created by humans, it can be informed by pre-existing societal bias. "AI is currently reflecting this bias and we need to address this," she explained.

(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)


