Researchers develop new, more energy-efficient way for AI algorithms to process data

By Simon Osuji
June 21, 2024 | Artificial Intelligence


Can AI learn like us?
A schematic comparing typical machine-learning models (A) with Daruwalla’s new design (B). Row A shows input or data having to travel all the way through every layer of the neural network before the AI model receives feedback, which takes more time and energy. In contrast, row B shows the new design that allows feedback to be generated and incorporated at each network layer. Credit: Kyle Daruwalla/Cold Spring Harbor Laboratory

It reads. It talks. It collates mountains of data and recommends business decisions. Today’s artificial intelligence might seem more human than ever. However, AI still has several critical shortcomings.

“As impressive as ChatGPT and all these current AI technologies are, in terms of interacting with the physical world, they’re still very limited. Even in things they do, like solve math problems and write essays, they take billions and billions of training examples before they can do them well,” explains Cold Spring Harbor Laboratory (CSHL) NeuroAI Scholar Kyle Daruwalla.

Daruwalla has been searching for new, unconventional ways to design AI that can overcome such computational obstacles. And he might have just found one.

The key was data movement. Most of modern computing's energy consumption comes from shuttling data around. In artificial neural networks, which can comprise billions of connections, data may have a very long way to travel.

So, to find a solution, Daruwalla looked for inspiration in one of the most computationally powerful and energy-efficient machines in existence—the human brain.

Daruwalla designed a new way for AI algorithms to move and process data much more efficiently, based on how our brains take in new information. The design allows individual AI “neurons” to receive feedback and adjust on the fly rather than wait for a whole circuit to update simultaneously. This way, data doesn’t have to travel as far and gets processed in real time.
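The contrast described above can be sketched in code. This is a deliberately simplified toy, not the paper's actual learning rule: each "layer" is a single scalar weight, the end-to-end scheme propagates the output error back through every layer before any weight changes, and the layer-local scheme updates each weight immediately from a hypothetical local target signal so no global backward pass is needed.

```python
# Toy contrast: end-to-end feedback (scheme A) vs. layer-local feedback
# (scheme B). Scalar layers only; the local target is an assumption made
# for illustration, not the rule from the paper.

def forward(weights, x):
    """Pass the input through every layer, recording each activation."""
    acts = [x]
    for w in weights:
        acts.append(w * acts[-1])
    return acts

def global_update(weights, x, target, lr=0.01):
    """Scheme A: feedback arrives only after the full forward pass.
    The output error travels back through every layer before any
    weight changes, so data crosses the whole network twice."""
    acts = forward(weights, x)
    grad = acts[-1] - target
    for i in reversed(range(len(weights))):
        delta = grad * acts[i]       # error signal for this weight
        grad = grad * weights[i]     # propagate the error backwards
        weights[i] -= lr * delta
    return weights

def local_update(weights, x, target, lr=0.01):
    """Scheme B: each layer adjusts on the fly from a locally available
    signal, so updates happen in real time with no backward pass."""
    a = x
    n = len(weights)
    for i, w in enumerate(weights):
        local_target = target ** ((i + 1) / n)  # hypothetical local signal
        out = w * a
        weights[i] = w - lr * (out - local_target) * a
        a = weights[i] * a
    return weights
```

In scheme B, each weight's update depends only on quantities available at that layer at that moment, which is the property the new design exploits to cut data movement.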

“In our brains, our connections are changing and adjusting all the time,” Daruwalla says. “It’s not like you pause everything, adjust, and then resume being you.”

The findings are published in the journal Frontiers in Computational Neuroscience.

Credit: Cold Spring Harbor Laboratory

The new machine-learning model provides evidence for an as-yet-unproven theory linking working memory with learning and academic performance. Working memory is the cognitive system that enables us to stay on task while recalling stored knowledge and experiences.

“There have been theories in neuroscience of how working memory circuits could help facilitate learning. But there isn’t something as concrete as our rule that actually ties these two together. And so that was one of the nice things we stumbled into here. The theory led out to a rule where adjusting each synapse individually necessitated this working memory sitting alongside it,” says Daruwalla.
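The idea of a working-memory store sitting alongside each synapse can be sketched as a Hebbian update driven by a per-synapse memory trace. This is a minimal, hypothetical illustration: the decay constant and update form are assumptions, not the information-bottleneck rule from the paper.

```python
# Hypothetical sketch: a Hebbian weight update paired with a per-synapse
# "working memory" trace. The trace accumulates recent pre/post
# co-activity, and the weight change follows the trace, so the update
# uses only information stored alongside the synapse.

def hebbian_step(w, memory, pre, post, lr=0.1, decay=0.9):
    """Update one synapse from its local memory trace."""
    # Exponentially decaying record of recent co-activity.
    memory = decay * memory + (1 - decay) * (pre * post)
    # Weight follows the memory trace rather than instantaneous activity.
    w = w + lr * memory
    return w, memory
```

Under repeated co-activation of the pre- and post-synaptic neurons, the trace builds up and the weight strengthens gradually; when activity stops, the trace decays. The point of the sketch is that both quantities live at the synapse, so no global feedback signal is required.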

Daruwalla’s design may help pioneer a new generation of AI that learns like we do. That would not only make AI more efficient and accessible—it would also be somewhat of a full-circle moment for neuroAI. Neuroscience has been feeding AI valuable data since long before ChatGPT uttered its first digital syllable. Soon, it seems, AI may return the favor.

More information:
Kyle Daruwalla et al, Information bottleneck-based Hebbian learning rule naturally ties working memory and synaptic updates, Frontiers in Computational Neuroscience (2024). DOI: 10.3389/fncom.2024.1240348

Provided by
Cold Spring Harbor Laboratory

Citation:
Researchers develop new, more energy-efficient way for AI algorithms to process data (2024, June 20)
retrieved 20 June 2024
from https://techxplore.com/news/2024-06-energy-efficient-ai-algorithms.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.

