Researchers seek to reduce harm to multicultural users of voice assistants

By Simon Osuji
July 12, 2024
in Artificial Intelligence


[Image: Siri. Credit: Unsplash/CC0 Public Domain]

Users of voice assistants such as Siri, Alexa or Google Assistant know the frustration of being misunderstood by a machine.


But for people who may lack a standard American accent, such miscommunication can go beyond simply irritating to downright dangerous, according to researchers in the Human-Computer Interaction Institute (HCII) in Carnegie Mellon University’s School of Computer Science.

In a new study published in the Proceedings of the CHI Conference on Human Factors in Computing Systems, HCII Ph.D. student Kimi Wenzel and Associate Professor Geoff Kaufman identified six downstream harms caused by voice assistant errors and devised strategies to reduce them. Their work won a Best Paper award at CHI 2024, the Association for Computing Machinery's Conference on Human Factors in Computing Systems.

“This paper is part of a larger research project in our lab looking at documenting and understanding the impact of biases that are embedded in technology,” Kaufman said.

White Americans are overrepresented in most datasets used to train voice assistants, and studies have shown that these assistants are far more likely to misinterpret or misunderstand Black speakers and people whose accents or dialects differ from standard American English.

Earlier researchers tended to treat this problem as a technical issue to be overcome rather than a failure with repercussions for the user, Kaufman said. But having one's speech misunderstood, whether by a person or a machine, can be experienced as a microaggression.

“It can have effects on self-esteem or your sense of belonging,” Kaufman said.

In a controlled experiment last year, Kaufman and Wenzel studied the impact that error rates by a voice assistant had on white and Black volunteers. Black people who experienced high error rates had higher levels of self-consciousness, lower levels of self-esteem and a less favorable view of technology than Black people who experienced low error rates. White people didn’t have this reaction, regardless of error rate.

“We hypothesize that because Black people experience miscommunication more frequently, or have more everyday experience with racism, these experiences build up and they suffer more negative effects,” Wenzel said.

In the latest study, Wenzel and Kaufman interviewed 16 volunteers who experienced problems with voice assistants. They found six potential harms that can result from seemingly innocuous voice assistant errors. These included emotional harm as well as cultural or identity harm caused by microaggressions.

They also included relational harm, which is when an error leads to interpersonal conflict. A voice assistant, for instance, might make a calendar entry with the wrong time for a meeting or misdirect a call.

Other harms include paying the same price for a technology as other people even though it doesn’t work as well for you, as well as needing to exert extra effort—such as altering an accent—to make the technology work.

A sixth harm is physical endangerment.

“Voice technologies are not only used as a simple voice assistant in your smartphone,” Wenzel said. “Increasingly they are being used in more serious contexts, for example in medical transcription.”

Voice technologies also are used in conjunction with auto navigation systems, “and that has very high stakes,” Wenzel added.

One person interviewed for the study related their own hair-raising experiences with a voice-controlled navigation system. “Oftentimes, I feel like I’m pronouncing things very clearly and loudly, but it still can’t understand me. And I don’t know what’s going on. And I don’t know where I’m going. So, it’s just this, this frustrating experience and very dangerous and confusing.”

The ultimate solution is to eliminate bias in voice technologies, but creating datasets representative of the full range of human variation is a perplexing task, Wenzel said. So she and Kaufman talked to the participants about things voice assistants could say to their users to mitigate those harms.

One communication repair strategy they identified was blame redirection—not a simple apology, but an explanation describing the error that doesn’t put the blame on the user.
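As a rough illustration of what blame redirection might look like in practice, the sketch below maps recognition-error types to messages that place the fault on the system rather than the speaker. The error categories and wording here are illustrative assumptions for this example, not messages taken from the study.

```python
# A minimal sketch of the "blame redirection" repair strategy: each
# recognition-error type maps to a short message that attributes the
# failure to the system, never to the user's speech.

REPAIR_MESSAGES = {
    "no_match": (
        "I couldn't match that to anything I know. My vocabulary is "
        "limited, so this one is on me."
    ),
    "low_confidence": (
        "My speech model wasn't sure what it heard. That's a limitation "
        "of my training, not of how you spoke."
    ),
}

def repair_response(error_type: str) -> str:
    """Return a brief repair message that places blame on the system,
    with a generic system-blaming fallback for unknown error types."""
    return REPAIR_MESSAGES.get(
        error_type,
        "I hit a recognition error on my end. Could you try once more?",
    )

print(repair_response("low_confidence"))
```

Keeping each message to a sentence or two reflects the brevity constraint the researchers emphasize: users turn to voice interfaces for speed, so a repair message must shift blame in as few words as possible.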

Wenzel and Kaufman also suggest that voice technologies be more culturally sensitive. Addressing cultural harms is limited to some extent by the technology, but one simple yet profound step would be to expand the database of proper nouns.

“Misrecognition of non-Anglo names has been a persistent harm across many language technologies,” the researchers noted in the paper.
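One way to picture an expanded proper-noun database is as a post-processing step that snaps a misrecognized word to the closest entry in a custom name lexicon. The sketch below uses Python's standard-library `difflib` for fuzzy matching; the names and similarity threshold are assumptions for the example, not details from the paper.

```python
import difflib

# Illustrative sketch: correct a misrecognized token by matching it
# against a lexicon of proper nouns the recognizer should know.

NAME_LEXICON = ["Nguyen", "Adebayo", "Xiaolin", "Priya", "Okonkwo"]

def correct_name(heard: str, cutoff: float = 0.6) -> str:
    """Return the closest lexicon name to `heard`, or `heard` unchanged
    if nothing in the lexicon is similar enough."""
    matches = difflib.get_close_matches(heard, NAME_LEXICON, n=1, cutoff=cutoff)
    return matches[0] if matches else heard

print(correct_name("Adebaio"))  # -> Adebayo
print(correct_name("zzz"))      # no close match, returned unchanged
```

Production speech systems address the same gap differently, for example by biasing the recognizer toward a supplied phrase list, but the underlying idea is the same: the system, not the speaker, carries the knowledge of non-Anglo names.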

A wealth of social psychology research has shown that self-affirmation—a statement of an individual’s values or beliefs—can be protective when their identity is threatened, Kaufman said. He and Wenzel are looking for ways that voice assistants can include affirmations in their conversations with users, preferably in a way that isn’t obvious to the user. Wenzel is currently testing some of those affirmations in a follow-up study.

In all these conversational interventions, the need for brevity is paramount. People often use voice technologies, after all, in hopes of being more efficient or able to work hands-free. Adding messages into the conversation tends to work against that goal.

“This is a design challenge that we have: How can we emphasize that the blame is on the technology and not on the user at all? How can you make that emphasis as clear as possible in as few words as possible?” Wenzel said. “Right now, the technology says ‘sorry,’ but we think it should be more than that.”

More information:
Kimi Wenzel et al, Designing for Harm Reduction: Communication Repair for Multicultural Users’ Voice Interactions, Proceedings of the CHI Conference on Human Factors in Computing Systems (2024). DOI: 10.1145/3613904.3642900

Provided by
Carnegie Mellon University

Citation:
Researchers seek to reduce harm to multicultural users of voice assistants (2024, July 12)
retrieved 12 July 2024
from https://techxplore.com/news/2024-07-multicultural-users-voice.html

