Model collapse could be coming for LLMs, say researchers

By Simon Osuji
July 25, 2024
in Artificial Intelligence


Figure: The high-level description of the feedback mechanism in the learning process. Credit: Nature (2024). DOI: 10.1038/s41586-024-07566-y

Using AI-generated datasets to train future generations of machine learning models may pollute their output, a phenomenon known as model collapse, according to a new paper published in Nature. The research shows that within a few generations, original content is replaced by unrelated nonsense, demonstrating the importance of using reliable data to train AI models.

Generative AI tools such as large language models (LLMs) have grown in popularity and have been primarily trained using human-generated inputs. However, as these AI models continue to proliferate across the Internet, computer-generated content may be used to train other AI models—or themselves—in a recursive loop.

Ilia Shumailov and colleagues present mathematical models to illustrate how AI models may experience model collapse. The authors demonstrate that an AI may overlook certain outputs (for example, less common lines of text) in training data, causing it to train itself on only a portion of the dataset.
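To make that mechanism concrete, here is a minimal toy sketch in Python (not taken from the paper, and built on simplifying assumptions: a categorical distribution stands in for a language model, and each "generation" is simply the empirical distribution of the previous generation's samples). Rare categories occasionally draw zero samples; once a category's estimated probability hits zero it can never reappear, so diversity shrinks generation by generation.

    # Toy illustration only: repeatedly fit a categorical distribution to
    # samples drawn from the previous generation's fit. Rare categories
    # sometimes draw zero samples; once a category's probability reaches
    # zero it never returns, a minimal analogue of model collapse.
    import numpy as np

    rng = np.random.default_rng(0)
    num_categories = 20
    num_samples = 200  # size of each generation's "training set"

    # Generation 0: the "human-generated" data distribution.
    probs = rng.dirichlet(np.ones(num_categories))

    for gen in range(30):
        # Train the next model only on the previous model's outputs.
        samples = rng.choice(num_categories, size=num_samples, p=probs)
        counts = np.bincount(samples, minlength=num_categories)
        probs = counts / counts.sum()  # next model = empirical distribution
        surviving = int((probs > 0).sum())
        print(f"gen {gen:2d}: {surviving} of {num_categories} categories still generated")

Running this shows the number of categories still being generated falling over the generations, with the least common categories vanishing first, mirroring the way less common lines of text are overlooked.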

Shumailov and colleagues also investigated how AI models responded to a training dataset that was predominantly created with artificial intelligence. They found that feeding a model AI-generated data causes successive generations to degrade in their ability to learn, eventually leading to model collapse.

Nearly all of the recursively trained language models they tested tended to display repeating phrases. For example, in a test that used text about medieval architecture as the original input, the output by the ninth generation was a list of jackrabbits.

The authors propose that model collapse is an inevitable outcome of AI models that use training datasets created by previous generations. Shumailov and colleagues suggest that training a model on AI-generated data is not impossible, but the filtering of that data must be taken seriously.
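For intuition on why curation matters, one hedged extension of the toy sketch above (an illustration under assumed settings, not the authors' prescription, and with the mixing fraction chosen arbitrarily) is to keep a fixed share of the original human-generated data in every generation's training mix. The anchor lets rare categories reappear even after a bad generation, so diversity no longer drains away.

    # Hedged sketch: mix a fixed fraction of original "human" data into each
    # generation's training set. This is an illustrative assumption, not the
    # paper's procedure; human_fraction is a hypothetical value.
    import numpy as np

    rng = np.random.default_rng(0)
    num_categories = 20
    num_samples = 200
    human_fraction = 0.3  # hypothetical share of curated human data per generation

    human_probs = rng.dirichlet(np.ones(num_categories))  # fixed human distribution
    probs = human_probs.copy()

    for gen in range(30):
        n_human = int(human_fraction * num_samples)
        human_part = rng.choice(num_categories, size=n_human, p=human_probs)
        synthetic_part = rng.choice(num_categories, size=num_samples - n_human, p=probs)
        counts = np.bincount(np.concatenate([human_part, synthetic_part]),
                             minlength=num_categories)
        probs = counts / counts.sum()
        surviving = int((probs > 0).sum())
        print(f"gen {gen:2d}: {surviving} of {num_categories} categories still generated")

In this toy setting the surviving-category count stays near its starting value instead of decaying, which is the intuition behind taking the filtering and sourcing of training data seriously.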

At the same time, tech firms that rely on human-generated content may be able to train AI models that are more effective than their competitors'.

More information:
Ilia Shumailov et al, AI models collapse when trained on recursively generated data, Nature (2024). DOI: 10.1038/s41586-024-07566-y

Provided by
Nature Publishing Group

Citation:
Using AI to train AI: Model collapse could be coming for LLMs, say researchers (2024, July 25)
retrieved 25 July 2024
from https://techxplore.com/news/2024-07-ai-collapse-llms.html
