A new transformer architecture emulates imagination and higher-level human mental states

by Simon Osuji
May 29, 2025
in Artificial Intelligence


Co4 architecture: N denotes the number of input tokens, and each token has an embedding dimension of E. Q1, Q2,…,QL represent the latent query tokens input to the associated Q-TPNs. K1, K2,…,KN represent the Key tokens input to the associated K-TPNs. V1, V2,…,VN represent the Value tokens input to the associated V-TPNs. This configuration forms part of the “seeing” state (i.e., sensory processing). In the “seeing as” state (i.e., perceptual and interpretive state), triadic modulation loops among questions (Q), clues (keys, K), and hypotheses (values, V) are executed through distal (D) and universal (U) contexts. Proximal (P) context represents normalization via information from neighboring neurons in the same population, including the prior information from the same neuron. The TPNs associated with Q, K, and V are assumed to be analogous to three subtypes of pyramidal neurons, although their exact correspondence to neurobiologically distinguished subtypes is still under investigation. Through varying states of mind, high-level perceptual processing and wakeful thought, diverse, parallel reasoning chains are enabled. This mechanism incurs a computational cost of O(N · L), where L is a small fraction of the input length, making the overall cost approximately O(N). The triadic modulation loops, based on element-wise operations, add a nominal cost of L · N · E, which is significantly lower than that of the feedforward residual network used in standard Transformer blocks, a component Co4 does not require. Co4 can be viewed as a parallel, representation-level, silent yet deep form of Chain-of-Thought (CoT) reasoning [56] (a quiet mind), enabling multi-perspective inference without requiring sequential token-level generation, much like the brain’s cortico-thalamic modulation. Credit: Ahsan Adeel.

The advancement of artificial intelligence (AI) and the study of neurobiological processes are deeply interlinked, as a deeper understanding of one can yield valuable insights into the other, and vice versa. Recent neuroscience studies have found that mental state transitions, such as the transition from wakefulness to slow-wave sleep and then to rapid eye movement (REM) sleep, modulate temporary interactions in a class of neurons known as layer 5 pyramidal two-point neurons (TPNs), aligning them with a person’s mental states.


These are interactions between information originating from the external world, broadly referred to as the receptive field (RF), and inputs emerging from internal states, referred to as the contextual field (CF). Past findings suggest that RF and CF inputs are processed at two distinct sites within these neurons, known as the basal site and the apical site, respectively.

Current AI algorithms employing attention mechanisms, such as transformers and the Perceiver and Flamingo models, are inspired by the capabilities of the human brain. In their current form, however, they do not reliably emulate the high-level perceptual processing and imaginative states experienced by humans.

Ahsan Adeel, an Associate Professor at the University of Stirling, recently carried out a study exploring the possibility of developing AI models that can reproduce these higher mental states, which could in turn speed up their learning and reduce their computational load.

His paper, published on the arXiv preprint server, introduces Co4, a brain-inspired cooperative context-sensitive cognitive computation mechanism specifically designed to replicate the dual-input state-dependent mechanism uncovered in pyramidal TPNs in layer 5 of the human neocortex.

“Attending to what is relevant is fundamental to both the mammalian brain and modern machine learning models such as transformers,” wrote Adeel in his paper.

“Yet determining relevance remains a core challenge, traditionally offloaded to learning algorithms like backpropagation. Inspired by recent cellular neurobiological evidence linking neocortical pyramidal cells to distinct mental states, this work shows how models (e.g., transformers) can emulate high-level perceptual processing and awake thought (imagination) states to pre-select relevant information before applying attention.”

As part of his recent study, Adeel developed a new transformer model that can emulate human perceptual reasoning and imaginative states. The model works by pre-selecting relevant information and identifying its most salient parts before applying full attention to it.

The model connects ideas following a particular reasoning pattern, which focuses on the question (i.e., what is being asked); clues (i.e., pieces of information that could help answer the question); and values or hypotheses (i.e., possible answers to the questions). This reasoning “loop” emulates the ways in which humans try to solve problems, adapting their thinking processes over time.
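The paper describes this mechanism only at a high level, but the core cost structure — a small set of question-like latent queries attending over all N input tokens — can be sketched in NumPy. This is an illustrative assumption, not Adeel's Co4 implementation; the function name `latent_query_attention` and the identity key/value projections are simplifications introduced here:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def latent_query_attention(tokens, latent_queries):
    """Cross-attention from L latent queries to N input tokens.

    The score matrix is L x N, so the cost scales as O(N * L)
    rather than the O(N^2) of full token-to-token self-attention.
    Keys and values are identity projections here for simplicity.
    """
    K = tokens           # (N, E) keys
    V = tokens           # (N, E) values
    Q = latent_queries   # (L, E) question-like latent queries
    E = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(E)          # (L, N)
    return softmax(scores, axis=-1) @ V    # (L, E)

rng = np.random.default_rng(0)
N, L, E = 256, 8, 32
out = latent_query_attention(rng.normal(size=(N, E)),
                             rng.normal(size=(L, E)))
print(out.shape)  # (8, 32)
```

In this sketch, each latent query plays the role of one "question" summarizing the relevant evidence from all N tokens; the paper's triadic Q–K–V modulation loops would then iteratively refine these representations, a step omitted here.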

“Triadic neuronal-level modulation loops among questions (Q), clues (keys, K), and hypotheses (values, V) enable diverse, deep, parallel reasoning chains at the representation level and allow a rapid shift from initial biases to refined understanding,” wrote Adeel.

“This leads to orders-of-magnitude faster learning with significantly reduced computational demand (e.g., fewer heads, layers, and tokens), at an approximate cost of O(N), where N is the number of input tokens. Results span reinforcement learning (e.g., CarRacing in a high-dimensional visual setup), computer vision, and natural language question answering.”
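Back-of-envelope arithmetic makes the claimed saving concrete. Taking the costs stated above (an L × N score matrix plus element-wise modulation of cost L · N · E, in place of an N × N score matrix), a hypothetical configuration gives:

```python
# Illustrative numbers only; these N, L, E values are assumptions,
# not settings from the paper.
N, L, E = 1024, 16, 64          # tokens, latent queries, embedding dim

full_attention = N * N * E       # standard token-to-token score computation
latent_scores = N * L * E        # L latent queries against N keys
modulation = L * N * E           # element-wise triadic modulation loops

speedup = full_attention / (latent_scores + modulation)
print(speedup)  # 32.0
```

Because L is fixed and small relative to N, the ratio grows linearly with sequence length, which is what makes the overall cost approximately O(N).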

Adeel evaluated his adapted transformer architecture on a series of reinforcement learning, computer vision and language processing tasks. The results of these tests were highly encouraging, highlighting the potential of his newly developed mechanism to advance the reasoning skills of AI models, potentially bringing them even closer to those observed in humans.

“The initial evidence presented here is one of many reasons to believe that emulating the cellular foundations of higher mental states, ranging from high-level perceptual processing to deep, deliberate imaginative reasoning, could be a step toward cognitively meaningful machine intelligence,” concluded Adeel.

“This approach opens the door not only to implementing large numbers of lightweight, inference-efficient AI modules, but also to moving these systems beyond mere information processing toward contextual reasoning, shifting from raw efficiency to real understanding.”

More information:
Ahsan Adeel, Beyond Attention: Toward Machines with Intrinsic Higher Mental States, arXiv (2025). DOI: 10.48550/arxiv.2505.06257

Journal information:
arXiv

© 2025 Science X Network

Citation:
A new transformer architecture emulates imagination and higher-level human mental states (2025, May 29)
retrieved 29 May 2025
from https://techxplore.com/news/2025-05-architecture-emulates-higher-human-mental.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




