
New AI framework enhances emotion analysis

by Simon Osuji
June 27, 2024
in Artificial Intelligence


Social media enthusiasts tend to spice up their text posts with emojis, images, audio, or video to attract more attention. Simple as it is, this technique makes scientific sense: multimodal information is found to be more effective in conveying emotions, as different modalities interact and enhance one another.

To advance the understanding of these interactions and improve the analysis of emotions expressed through modality combinations, a Chinese research team introduced a novel two-stage framework for multimodal sentiment analysis, built on two stacked layers of transformers, the state-of-the-art architecture in AI. The study was published May 24 in Intelligent Computing.

Current research in multimodal sentiment analysis often focuses on either fusing different modalities or addressing intricate interactions or adaptations between different types of fused information. Either approach alone can lead to information loss. This team’s framework, on the other hand, fuses information in two stages to effectively capture information on both levels. It was tested on three open datasets—MOSI, MOSEI, and SIMS—and performed better than or as well as the benchmark models.

The general workflow of this framework includes feature extraction, two stages of information fusion, and emotion prediction. First, text, audio, and video signals taken from source video clips are processed through their corresponding feature extractors and then encoded with additional context information into context-aware representations.
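The workflow above can be sketched in miniature. All function names and feature values below are invented for illustration; in the paper, the extractors are neural networks producing high-dimensional sequences, not three-element lists.

```python
# Toy sketch of the pipeline: feature extraction, then context encoding.
# Names and values are hypothetical, not from the paper's code.

def extract_features(clip):
    """Stand-in per-modality feature extractors for one video clip."""
    return {
        "text":  [1.0, 0.5],   # e.g. token embeddings
        "audio": [0.2, 0.8],   # e.g. acoustic features
        "video": [0.6, 0.4],   # e.g. visual features
    }

def add_context(features):
    """Toy context encoding: append a mean-of-features context value,
    yielding a 'context-aware representation' per modality."""
    out = {}
    for name, vec in features.items():
        ctx = sum(vec) / len(vec)
        out[name] = vec + [ctx]
    return out

representations = add_context(extract_features("clip.mp4"))
```

The real framework encodes context with transformer layers rather than a mean, but the shape of the data flow, separate extractors followed by a shared context-encoding step, is the same.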

Next, the three types of representations fuse for the first time: the text representations interact with the audio and video representations, allowing each modality to adapt to the others during the process, and the results further integrate with the original text representations. The text-centered output from the first stage then fuses with the adapted non-text representations so that they can enhance each other before the final, enriched output is ready for the emotion prediction stage.

The core of the team’s framework is a stack of transformers consisting of bidirectional cross-modal transformers and a transformer encoder. These components correspond to two functional layers: the bidirectional interaction layer, where cross-modal interaction and the first-stage fusion occur, and the refine layer, which handles the more nuanced second-stage fusion.
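The two-layer data flow can be mimicked with toy stand-ins. The real layers use attention; plain averaging is substituted here purely to show how representations move through the two fusion stages, and every name below is hypothetical.

```python
# Toy sketch of the two fusion stages. Averaging stands in for the
# cross-modal transformers and transformer encoder of the actual model.

def bidirectional_interaction(text, audio, video):
    """First-stage fusion: text interacts with audio and video, each
    modality adapts to the others, and the result is re-integrated
    with the original text (a residual-style connection)."""
    text_enh = [(t + a + v) / 3 for t, a, v in zip(text, audio, video)]
    audio_ad = [(a + t) / 2 for a, t in zip(audio, text)]
    video_ad = [(v + t) / 2 for v, t in zip(video, text)]
    fused = [f + t for f, t in zip(text_enh, text)]  # integrate with original text
    return fused, audio_ad, video_ad

def refine(fused, audio_ad, video_ad):
    """Second-stage fusion: the text-centered output and the adapted
    non-text representations enhance each other."""
    return [(f + a + v) / 3 for f, a, v in zip(fused, audio_ad, video_ad)]

def predict_emotion(rep):
    """Toy prediction head: sign of the mean as a sentiment label."""
    return "positive" if sum(rep) / len(rep) > 0 else "negative"

fused, audio_ad, video_ad = bidirectional_interaction([1.0, -1.0], [0.5, 0.5], [0.0, 1.0])
label = predict_emotion(refine(fused, audio_ad, video_ad))
```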

To enhance the performance of the framework, the team implemented an attention weight accumulation mechanism that aggregates the attention weights of the text and non-text modalities during fusion to extract deeper shared information. Attention, a key concept in transformers, enables the model to identify and focus on the most relevant parts of the data. The team’s stacked transformers adopt two types of attention mechanism: the bidirectional cross-modal transformers use cross-attention, and the transformer encoder uses self-attention.
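The distinction between the two attention types reduces to where the queries come from. A minimal scaled dot-product attention over toy two-dimensional vectors makes this concrete; the vectors and the single attention head below are illustrative simplifications, not the paper's configuration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(queries, keys, values):
    """Scaled dot-product attention.
    Self-attention: queries, keys, and values come from the same modality.
    Cross-attention: queries come from one modality, keys/values from another."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        weights = softmax(scores)  # convex combination weights summing to 1
        out.append([sum(w * v[i] for w, v in zip(weights, values))
                    for i in range(len(values[0]))])
    return out

text  = [[1.0, 0.0], [0.0, 1.0]]
audio = [[0.5, 0.5], [0.2, 0.8]]

self_att  = attention(text, text, text)    # transformer encoder: self-attention
cross_att = attention(text, audio, audio)  # cross-modal transformer: cross-attention
```

Because the attention weights form a convex combination, each cross-attention output stays within the range of the audio values it attends over.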

The team’s future work will focus on integrating more advanced transformers to improve computational efficiency and mitigate the inherent challenges of the self-attention mechanism.

More information:
Guofeng Yi et al, A Two-Stage Stacked Transformer Framework for Multimodal Sentiment Analysis, Intelligent Computing (2024). DOI: 10.34133/icomputing.0081

Provided by Intelligent Computing

Citation:
New AI framework enhances emotion analysis (2024, June 26)
retrieved 26 June 2024
from https://techxplore.com/news/2024-06-ai-framework-emotion-analysis.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




