
AI speech-to-text can hallucinate violent language

By Simon Osuji
June 11, 2024
in Artificial Intelligence


Hallucinations are more common for speakers with aphasia than without, and can cause harm by perpetuating violence, inaccurate associations, and false authority. Credit: arXiv (2024). DOI: 10.48550/arxiv.2402.08021

Speak a little too haltingly and with long pauses, and OpenAI’s speech-to-text transcriber might put harmful, violent words in your mouth, Cornell researchers have discovered.


OpenAI’s Whisper—an artificial intelligence-powered speech recognition system—occasionally makes up or “hallucinates” entire phrases and sentences, sometimes conjuring up violent language, invented personal information and fake websites that could be repurposed for phishing attempts, the researchers said. They also found that, unlike other widely used speech-to-text tools, Whisper is more likely to hallucinate when analyzing speech from people who speak with longer pauses between their words, such as those with speech impairments.

“With hallucinations, artificial intelligence is making up something from nothing,” said Allison Koenecke, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science. She is the lead author of “Careless Whisper: Speech-to-Text Hallucination Harms”, presented at the ACM Conference on Fairness, Accountability, and Transparency (FAccT), beginning June 3. The findings are published on the arXiv preprint server.

She said, “That can lead to huge downstream consequences if these transcriptions are used in the context of AI-based hiring, in courtroom trials or patient notes in medical settings.”

Released in 2022, OpenAI’s Whisper is trained on 680,000 hours of audio data and, according to OpenAI, can transcribe audio data with near human-level accuracy. OpenAI has improved its model behind Whisper since researchers carried out this work last year, and the hallucination rate has decreased, Koenecke said.
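
For readers who want to see the failure mode firsthand, the open-source release of Whisper can be run in a few lines of Python. The sketch below is illustrative and not part of the study: it assumes the `openai-whisper` package is installed and that a local recording named audio.wav exists.

```python
# Minimal transcription sketch using the open-source `openai-whisper` package.
# Assumptions (not from the article): `pip install openai-whisper` has been
# run and a local recording named audio.wav exists.
import whisper

model = whisper.load_model("base")       # small checkpoint; "large" trades speed for accuracy
result = model.transcribe("audio.wav")   # returns the full text plus timestamped segments

print(result["text"])
for seg in result["segments"]:
    # Per-segment timestamps make it possible to check whether a stretch of
    # transcript actually lines up with audible speech in the recording.
    print(f'{seg["start"]:.1f}-{seg["end"]:.1f}s: {seg["text"]}')
```

Comparing segment timestamps against the audio is one simple way to notice transcribed text that has no audible counterpart.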

In their analysis, researchers found that roughly 1% of Whisper’s audio transcriptions contained entire hallucinated phrases—including references to websites, real and fake, that could be reappropriated for cyberattacks. For instance, in one sound bite, Whisper correctly transcribed a single, simple sentence, but then hallucinated five additional sentences that contained the words “terror,” “knife” and “killed,” none of which were in the original audio.
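
When a ground-truth transcript is available, as it is for research corpora, insertions like these can be surfaced with a plain word-level alignment. The following sketch uses Python's standard-library difflib and is a generic illustration, not the method the authors used:

```python
# Illustrative sketch: flag words that appear in a model transcript but have
# no counterpart in a reference (ground-truth) transcript. This is a generic
# alignment idea, not the paper's methodology.
from difflib import SequenceMatcher

def hallucinated_spans(reference: str, hypothesis: str) -> list[str]:
    ref_words = reference.lower().split()
    hyp_words = hypothesis.lower().split()
    spans = []
    for tag, _i1, _i2, j1, j2 in SequenceMatcher(None, ref_words, hyp_words).get_opcodes():
        if tag == "insert":  # words present only in the model's output
            spans.append(" ".join(hyp_words[j1:j2]))
    return spans

# Hypothetical example in the spirit of the article's description:
print(hallucinated_spans(
    "he picked up the umbrella",
    "he picked up the umbrella and the terror knife and killed them",
))
# -> ['and the terror knife and killed them']
```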

In other examples of hallucinated transcriptions, Whisper conjured random names, fragments of addresses and irrelevant—and sometimes completely fake—websites. Hallucinated traces of YouTuber lingo, like “Thanks for watching and Electric Unicorn,” also wormed into transcriptions.

Researchers ran more than 13,000 speech clips through Whisper. The audio data came from AphasiaBank, a research-specific repository of audio recordings of people with aphasia, a condition that limits or completely impairs a person’s ability to speak. The repository also includes clips from people with no speech impairments. From their analysis, researchers hypothesize that longer pauses and silences between words are more likely to trigger harmful hallucinations.
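
That hypothesis lends itself to a simple measurement: the length of silent gaps can be computed directly from the waveform. The sketch below uses the librosa audio library, an assumed tool choice (the article does not say what the researchers used), with clip.wav as a placeholder filename.

```python
# Illustrative sketch: measure the longest silent gap in a recording, the kind
# of pause-length feature the researchers' hypothesis points at.
# Assumptions (not from the article): librosa is installed and clip.wav exists.
import librosa

y, sr = librosa.load("clip.wav", sr=None)      # keep the native sample rate
voiced = librosa.effects.split(y, top_db=30)   # [start, end] samples of non-silent regions

# Gaps between consecutive voiced intervals, converted to seconds.
gaps = [(voiced[i + 1][0] - voiced[i][1]) / sr for i in range(len(voiced) - 1)]
print(f"longest pause: {max(gaps, default=0.0):.2f} s")
```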

More information: Allison Koenecke et al, Careless Whisper: Speech-to-Text Hallucination Harms, arXiv (2024). DOI: 10.48550/arxiv.2402.08021

Journal information: arXiv

Provided by Cornell University

Citation: AI speech-to-text can hallucinate violent language (2024, June 11), retrieved 11 June 2024 from https://techxplore.com/news/2024-06-ai-speech-text-hallucinate-violent.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.





