
Bilinear sequence regression model shows why AI excels at learning from word sequences

By Simon Osuji
June 20, 2025
in Artificial Intelligence


Image: ChatGPT. Credit: Unsplash/CC0 Public Domain

Researchers at EPFL have created a mathematical model that helps explain how breaking language into sequences makes modern AI-like chatbots so good at understanding and using words. The work is published in the journal Physical Review X.


There is no doubt that AI technology dominates our world today. Progress is moving in leaps and bounds, especially in large language models (LLMs) like ChatGPT.

But how do they work? LLMs are made up of neural networks that process long sequences of “tokens.” Each token is typically a word or part of a word and is represented by a list of hundreds or thousands of numbers—what researchers call a “high-dimensional vector.” This list captures the word’s meaning and how it’s used.

For example, the word “cat” might become a list like [0.15, -0.22, 0.47, …, 0.09], while “dog” is encoded in a similar way but with its own unique numbers. Words with similar meanings get similar lists, so the LLM can recognize that “cat” and “dog” are more alike than “cat” and “banana.”
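The idea of "similar words get similar lists" can be sketched with cosine similarity. The vectors below are made-up toy values in four dimensions, not real LLM embeddings (which have hundreds or thousands of dimensions):

```python
import math

# Toy 4-dimensional embeddings (illustrative values only).
embeddings = {
    "cat":    [0.15, -0.22, 0.47, 0.09],
    "dog":    [0.18, -0.19, 0.40, 0.12],
    "banana": [-0.50, 0.61, -0.02, 0.33],
}

def cosine_similarity(a, b):
    """Directional similarity of two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

cat_dog = cosine_similarity(embeddings["cat"], embeddings["dog"])
cat_banana = cosine_similarity(embeddings["cat"], embeddings["banana"])
# With these toy values, "cat" is measurably closer to "dog" than to "banana".
assert cat_dog > cat_banana
```

A model that compares words through such vectors can treat "cat" and "dog" as related even though the strings share no letters.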

A black box, even for experts

Processing language as sequences of these vectors is clearly effective, but, ironically, we don’t really understand why. Simple mathematical models for long sequences of these high-dimensional tokens are still mostly unexplored.

This leaves a gap in our understanding: Why does this approach work so well, and what makes it fundamentally different from older methods? Why is it better to present data to neural networks as sequences of high-dimensional tokens rather than as a single, long list of numbers? While today’s AI can write stories or answer questions impressively, the inner workings that make this possible are still a black box—even for experts.

Now, a team of scientists led by Lenka Zdeborová at EPFL has built the simplest possible mathematical model that still captures the heart of learning from tokens as LLMs do.

Their model, called bilinear sequence regression (BSR), strips away the complexity of real-world AI but keeps some of its essential structure and acts as a “theoretical playground” for studying how AI models learn from sequences.

How does BSR work? Imagine a sentence where you can turn each word into a list of numbers that captures its meaning—just like LLMs do. You line these lists up into a table, with one row per word. This table keeps track of the whole sequence and all the details packed into each word.

A clear mathematical benchmark

Instead of processing all the information at once like older AI models, BSR looks at the rows of the table in one way and at the columns in another. The model then uses this information to predict a single outcome, such as the sentiment of the sentence.
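In its simplest rank-one form, this "rows one way, columns another" idea can be sketched as a bilinear map: one weight vector acting on the sequence positions (rows) and another acting on the embedding dimensions (columns). The variable names and sizes below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

L, d = 6, 8                    # sequence length (rows), embedding dimension (columns)
X = rng.normal(size=(L, d))    # one sentence: a table with one embedding per row

# Illustrative rank-one bilinear predictor:
# u weights the rows (token positions), v weights the columns (embedding dims).
u = rng.normal(size=L)
v = rng.normal(size=d)

y_pred = float(u @ X @ v)      # a single scalar outcome, e.g. a sentiment score
```

The prediction depends on the sequence through the whole table `X`, yet the model has only `L + d` weights, which is what makes it simple enough to analyze exactly.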

The power of BSR is that it is simple enough to be fully solved with mathematics. This lets researchers see exactly when sequence-based learning starts to work, and how much data is needed for a model to reliably learn from patterns in sequences.

BSR sheds light on why we get better results using a sequence of embeddings rather than flattening all the data into one big vector. The model revealed sharp thresholds where learning jumps from useless to effective once it “sees” enough examples.
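One intuition for why flattening is worse (a back-of-the-envelope illustration, not a result stated in the article): a linear model on the flattened vector needs one weight per entry of the table, while a rank-one bilinear model needs only one weight per row plus one per column.

```python
# Hypothetical sizes for illustration.
L, d = 100, 1000

flattened_params = L * d   # linear model on one long flattened vector
bilinear_params = L + d    # rank-one bilinear model: row weights + column weights

# The bilinear parameterization is dramatically more compact.
assert bilinear_params < flattened_params
```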

This research offers a new lens for understanding the inner workings of large language models. By solving BSR exactly, the team provides a clear mathematical benchmark that takes a step toward a theory that can guide the design of future AI systems.

These insights could help scientists build models that are simpler, more efficient, and possibly more transparent.

More information:
Vittorio Erba et al, Bilinear Sequence Regression: A Model for Learning from Long Sequences of High-Dimensional Tokens, Physical Review X (2025). DOI: 10.1103/l4p2-vrxt

Provided by
École Polytechnique Fédérale de Lausanne

Citation:
Bilinear sequence regression model shows why AI excels at learning from word sequences (2025, June 20)
retrieved 20 June 2025
from https://techxplore.com/news/2025-06-bilinear-sequence-regression-ai-excels.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.








© 2026 LBNN – All rights reserved.