
Alexa, should voice assistants have a gender?

By Simon Osuji
January 17, 2025
in Artificial Intelligence


Alexa
Credit: Anete Lusina from Pexels

Studies have long shown that men are more likely to interrupt, particularly when speaking with women. New research by Johns Hopkins engineers reveals that this behavior also extends to AI-powered voice assistants like Alexa and Siri, with men interrupting them almost twice as often as women do. The findings are published in Proceedings of the ACM on Human-Computer Interaction.


These findings raise concerns about how voice assistant design—notably the use of stereotypically “feminine” traits like apologetic behavior and warmth—may reinforce gender biases, leading researchers to advocate for the design of more gender-neutral voiced tools.

“Conversational voice assistants are frequently feminized through their friendly intonation, gendered names, and submissive behavior. As they become increasingly ubiquitous in our lives, the way we interact with them—and the biases that may unconsciously affect these interactions—can shape not only human-technology relationships but also real-world social dynamics between people,” says study leader Amama Mahmood, a fifth-year Ph.D. student in the Whiting School’s Department of Computer Science.

Mahmood and adviser Chien-Ming Huang, an assistant professor of computer science and the director of the Intuitive Computing Laboratory, presented their findings on voice assistant gender and perception at the 27th ACM Conference on Computer-Supported Cooperative Work and Social Computing, held last fall in San José, Costa Rica.

In Mahmood and Huang’s in-person study, 40 participants—19 men and 21 women—used a voice assistant simulation to complete an online shopping task. Unbeknownst to them, the assistant was pre-programmed to make specific mistakes, allowing the researchers to observe the participants’ reactions.

Participants interacted with three voice types—feminine, masculine, and gender-neutral—and the voice assistant responded to its errors by offering either a simple apology or monetary compensation.

“We examined how users perceived these agents, focusing on attributes like perceived warmth, competence, and user satisfaction with the error recovery,” Mahmood says. “We also analyzed user behavior, observing their reactions, interruptions of the voice assistant, and if their gender played a role in how they responded.”

The researchers observed clear stereotypes in how users perceived and interacted with the AI voice assistants. For instance, users associated greater competence with feminine-voiced assistants, likely reflecting underlying biases that link certain “supportive” skills with traditionally feminine roles.

Users’ own gender also influenced their behavior—male users interrupted the voice assistant more often during errors and responded more socially (smiling and nodding) to the feminine assistant than to the masculine one, suggesting a preference for feminine voice support.

However, working with a gender-neutral voice assistant that apologized for its mistakes reduced impolite interactions and interruptions—even though that voice was perceived as less warm and more “robotic” than its gendered counterparts.

“This shows that designing virtual agents with neutral traits and carefully chosen error mitigation strategies—such as apologies—has the potential to foster more respectful and effective interactions,” Mahmood says.

Mahmood and Huang plan to explore designing voice assistants that can detect biased behaviors and adjust in real time to reduce them, fostering fairer interactions. They also aim to include more nonbinary individuals in their research, as this group was underrepresented in their initial study pool.

“Thoughtful design—especially in how these agents portray gender—is essential to ensure effective user support without the promotion of harmful stereotypes. Ultimately, addressing these biases in the field of voice assistance and AI will help us create a more equitable digital and social environment,” Mahmood says.

More information:
Amama Mahmood et al, Gender Biases in Error Mitigation by Voice Assistants, Proceedings of the ACM on Human-Computer Interaction (2024). DOI: 10.1145/3637337

Provided by
Johns Hopkins University

Citation:
Alexa, should voice assistants have a gender? (2025, January 17)
retrieved 17 January 2025
from https://techxplore.com/news/2025-01-alexa-voice-gender.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

