Brain-inspired AI could cut energy use and boost performance

By Simon Osuji
October 31, 2025
in Artificial Intelligence
A convergence unit in retinal topography, from Ref. [65]. The numbers are chosen to illustrate the converging nature of the topography: on average, 50 photoreceptors connect to a single bipolar cell. Credit: Neurocomputing (2025). DOI: 10.1016/j.neucom.2025.131740

Artificial intelligence (AI) could soon become more energy-efficient and faster, thanks to a new approach developed at the University of Surrey that takes direct inspiration from the biological neural networks of the human brain.


In a study published in Neurocomputing, researchers from Surrey’s Nature-Inspired Computation and Engineering (NICE) group have shown that mimicking the brain’s sparse and structured neural wiring can significantly improve the performance of artificial neural networks (ANNs)—used in generative AI and other modern AI models such as ChatGPT—without sacrificing accuracy.

The method, called Topographical Sparse Mapping (TSM), rethinks how AI systems are wired at their most fundamental level. Conventional deep-learning models, such as those used for image recognition and language processing, connect every neuron in one layer to every neuron in the next, wasting energy on redundant links. TSM instead connects each neuron only to nearby or related ones, much as the brain’s visual system organizes information efficiently. This natural design eliminates vast numbers of unnecessary connections and computations.
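The wiring idea can be sketched in a few lines of code. This is a minimal illustration assuming a one-dimensional layer layout and a fixed fan-in per neuron, not the paper’s actual mapping; the 50:1 ratio merely echoes the retinal convergence in the figure above:

```python
def topographic_mask(n_in, n_out, fan_in):
    """Connect each output neuron to a contiguous window of
    `fan_in` nearby inputs, instead of to all n_in inputs as a
    dense layer would."""
    mask = []
    for j in range(n_out):
        # Spread the output neurons evenly along the input axis,
        # then clamp each window so it stays inside [0, n_in).
        center = round(j * (n_in - 1) / max(1, n_out - 1))
        lo = min(max(0, center - fan_in // 2), n_in - fan_in)
        mask.append(set(range(lo, lo + fan_in)))
    return mask

# 1,000 inputs converging onto 20 outputs, 50 inputs each --
# echoing the ~50:1 photoreceptor-to-bipolar-cell ratio.
m = topographic_mask(1000, 20, 50)
print(sum(len(row) for row in m), "of", 1000 * 20, "connections")
# -> 1000 of 20000 connections
```

Applying such a mask zeroes out most of a layer’s weight matrix before training even begins, which is where the savings in connections and computation come from.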

An enhanced version, called Enhanced Topographical Sparse Mapping (ETSM), goes a step further by introducing a biologically inspired “pruning” process during training—similar to how the brain gradually refines its neural connections as it learns. Together, these approaches allow AI systems to achieve equal or even greater accuracy while using only a fraction of the parameters and energy required by conventional models.
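A toy version of such a pruning schedule can be written as follows, using simple magnitude pruning as a stand-in for the paper’s biologically inspired criterion; the real training loop and pruning rule will differ:

```python
import random

def prune_step(weights, frac=0.1):
    """Illustrative refinement pass: permanently drop the weakest
    `frac` of the surviving connections by magnitude (a common
    proxy; the paper's exact pruning criterion may differ)."""
    return sorted(weights, key=abs)[int(len(weights) * frac):]

random.seed(0)
w = [random.gauss(0, 1) for _ in range(2000)]  # initial connections
for _ in range(5):        # prune a little after each training epoch
    # ... gradient updates on the surviving weights would go here ...
    w = prune_step(w)
print(len(w))             # -> 1182, about 59% of the original 2,000
```

Because connections are removed gradually rather than all at once, the network can keep adapting around each round of pruning, mirroring how the brain refines its wiring as it learns.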

Dr. Roman Bauer, Senior Lecturer at the University of Surrey’s School of Computer Science and Electronic Engineering, and project supervisor, said, “Training many of today’s popular large AI models can consume over a million kilowatt-hours of electricity, which is equivalent to the annual use of more than a hundred US homes, and cost tens of millions of dollars. That simply isn’t sustainable at the rate AI continues to grow. Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.”

Surrey’s enhanced model achieved up to 99% sparsity—meaning it could remove almost all of the usual neural connections—but still matched or exceeded the accuracy of standard networks on benchmark datasets. Because it avoids the constant fine-tuning and rewiring used by other approaches, it trains faster, uses less memory and consumes less than one percent of the energy of a conventional AI system.
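As a back-of-envelope illustration of what 99% sparsity means for a single fully connected layer (the layer sizes here are hypothetical, not taken from the paper):

```python
n_in, n_out = 4096, 4096
dense = n_in * n_out                # weights in a fully connected layer
sparse = dense - int(dense * 0.99)  # weights surviving 99% sparsity
print(f"{dense:,} -> {sparse:,} weights")  # 16,777,216 -> 167,773
```

Since the cost of a forward pass scales with the number of active weights, cutting 99% of the connections removes roughly 99% of the multiply-accumulate operations for that layer.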

Mohsen Kamelian Rad, a Ph.D. student at the University of Surrey and lead author of the study, said, “The brain achieves remarkable efficiency through its structure, with each neuron forming connections that are spatially well-organized. When we mirror this topographical design, we can train AI systems that learn faster, use less energy and perform just as accurately. It’s a new way of thinking about neural networks, built on the same biological principles that make natural intelligence so effective.”

While the current framework applies the brain-inspired mapping to an AI model’s input layer, extending it to deeper layers could make networks even leaner and more efficient. The research team is also exploring how the approach could be used in other applications, such as more realistic neuromorphic computers, where the efficiency gains could have an even greater impact.

More information:
Mohsen Kamelian Rad et al., Topographical sparse mapping: A neuro-inspired sparse training framework for deep learning models, Neurocomputing (2025). DOI: 10.1016/j.neucom.2025.131740

Provided by
University of Surrey

Citation:
Brain-inspired AI could cut energy use and boost performance (2025, October 30)
retrieved 30 October 2025
from https://techxplore.com/news/2025-10-brain-ai-energy-boost.html





