Flexible multi-task computation in recurrent neural networks relies on dynamical motifs, study shows

by Simon Osuji
August 16, 2024
in Artificial Intelligence


Figure: Dynamical motifs were reused for fast learning of novel tasks with familiar computational elements. Credit: Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01668-6

Cognitive flexibility, the ability to rapidly switch between different thoughts and mental concepts, is a highly advantageous human capability. This salient capability supports multi-tasking, the rapid acquisition of new skills and the adaptation to new situations.


While artificial intelligence (AI) systems have become increasingly advanced over the past few decades, they currently do not exhibit the same flexibility as humans in learning new skills and switching between tasks. A better understanding of how biological neural circuits support cognitive flexibility, particularly how they support multi-tasking, could inform future efforts aimed at developing more flexible AI.

Recently, some computer scientists and neuroscientists have been studying neural computations using artificial neural networks. Most of these networks, however, were trained to tackle specific tasks individually rather than multiple tasks.

In 2019, a research group at New York University, Columbia University and Stanford University trained a single neural network to perform 20 related tasks.

In a new paper published in Nature Neuroscience, a team at Stanford set out to investigate what allows this neural network to perform modular computations, thus tackling several different tasks.
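The architecture at the heart of this line of work can be sketched in a few lines. The toy below is a hedged illustration in Python/NumPy, not the paper's trained model: the sizes, weight scales, and random (untrained) weights are all chosen for demonstration. The key idea it shows is that a single recurrent network receives, alongside its stimulus, a one-hot "rule" input indicating which task to perform, so one set of weights can support several computations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and random, untrained weights; the study's
# networks are trained end to end on a battery of tasks.
N_UNITS, N_STIM, N_TASKS = 64, 4, 3
W_rec = rng.normal(0.0, 1.0 / np.sqrt(N_UNITS), (N_UNITS, N_UNITS))
W_stim = rng.normal(0.0, 1.0, (N_UNITS, N_STIM))
W_rule = rng.normal(0.0, 1.0, (N_UNITS, N_TASKS))

def rnn_step(h, stim, rule, dt=0.1):
    """One Euler step of a rate RNN. The one-hot rule input tells the
    same fixed weights which task to perform."""
    drive = W_rec @ h + W_stim @ stim + W_rule @ rule
    # ReLU keeps firing rates non-negative, as in rate models with
    # positive activation functions.
    return (1.0 - dt) * h + dt * np.maximum(0.0, drive)

h = np.zeros(N_UNITS)
stim = rng.normal(0.0, 1.0, N_STIM)
rule = np.eye(N_TASKS)[1]  # select the second task
for _ in range(50):
    h = rnn_step(h, stim, rule)
```

Switching tasks amounts to changing only the `rule` vector while the weights stay fixed, which is what makes contextual reconfiguration of one network possible.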

“Flexible computation is a hallmark of intelligent behavior,” Laura N. Driscoll, Krishna Shenoy and David Sussillo wrote in their paper. “However, little is known about how neural networks contextually reconfigure for different computations. In the present work, we identified an algorithmic neural substrate for modular computation through the study of multitasking artificial recurrent neural networks.”

The key objective of the recent study by Driscoll, Shenoy and Sussillo was to investigate the mechanisms that underlie the computations of recurrently connected artificial neural networks. Their efforts allowed the researchers to identify a computational substrate of these networks that enables modular computations, a substrate that they describe with the term “dynamical motifs.”

“Dynamical systems analyses revealed learned computational strategies mirroring the modular subtask structure of the training task set,” wrote Driscoll, Shenoy and Sussillo. “Dynamical motifs, which are recurring patterns of neural activity that implement specific computations through dynamics, such as attractors, decision boundaries and rotations, were reused across tasks. For example, tasks requiring memory of a continuous circular variable repurposed the same ring attractor.”
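The ring attractor mentioned in the quote can be illustrated with a minimal planar system. This is a textbook toy, not the learned dynamics of the networks in the study: every point on the unit circle is stable, so the angle of the state persists over time and acts as a memory of a continuous circular variable such as a remembered direction.

```python
import numpy as np

def ring_attractor_step(x, dt=0.05):
    """Euler step of dx/dt = x * (1 - |x|^2). The unit circle is an
    attracting set: the radius flows toward 1 while the angle of x
    is left unchanged, storing a circular variable."""
    r2 = np.dot(x, x)
    return x + dt * x * (1.0 - r2)

# Start off the ring at angle 40 degrees, radius 0.3.
theta0 = np.deg2rad(40.0)
x = 0.3 * np.array([np.cos(theta0), np.sin(theta0)])
for _ in range(400):
    x = ring_attractor_step(x)

radius = np.linalg.norm(x)       # converges toward 1 (onto the ring)
theta = np.arctan2(x[1], x[0])   # the stored angle is preserved
```

Because the dynamics only rescale the state radially, any initial angle is remembered indefinitely, which is exactly the property that makes a ring attractor reusable across tasks that need memory of the same kind of variable.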

The researchers ran a series of analyses, which revealed that in recurrent neural networks, so-called dynamical motifs are implemented by clusters of units when the unit activation function is restricted to being positive. Moreover, lesions to these units were found to adversely impact the ability of the networks to perform modular computations.
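A lesion experiment of the kind described can be sketched as follows. Again this is a hedged toy, not the study's procedure: the network is small, random, and untrained, and the "cluster" is an arbitrary block of units. The point is only the mechanics: clamping a group of units to zero at every step and comparing the resulting activity with the intact run.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 32
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))

def run(h0, steps=20, lesion=None):
    """Run a small ReLU rate network. `lesion` is an optional array of
    unit indices whose activity is clamped to zero after each step,
    mimicking a cluster lesion."""
    h = h0.copy()
    for _ in range(steps):
        h = 0.9 * h + 0.1 * np.maximum(0.0, W @ h)
        if lesion is not None:
            h[lesion] = 0.0
    return h

h0 = np.abs(rng.normal(0.0, 1.0, N))       # positive initial rates
intact = run(h0)
lesioned = run(h0, lesion=np.arange(8))    # silence a hypothetical 8-unit cluster
```

In the study, the analogue of comparing `intact` and `lesioned` is measuring task performance with and without the lesion, which is how specific clusters were linked to specific computations.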

“Motifs were reconfigured for fast transfer learning after an initial phase of learning,” wrote Driscoll, Shenoy and Sussillo. “This work establishes dynamical motifs as a fundamental unit of compositional computation, intermediate between neuron and network. As whole-brain studies simultaneously record activity from multiple specialized systems, the dynamical motif framework will guide questions about specialization and generalization.”

Overall, the recent study by this team of researchers pinpoints a substrate of recurrent neural networks that significantly contributes to their ability to tackle multiple tasks effectively. In the future, the findings of this work could inform both neuroscience and computer science research, potentially leading to an improved understanding of the neural processes underpinning cognitive flexibility and to new strategies that emulate these processes in artificial neural networks.

More information:
Laura N. Driscoll et al, Flexible multitask computation in recurrent networks utilizes shared dynamical motifs, Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01668-6

© 2024 Science X Network

Citation:
Flexible multi-task computation in recurrent neural networks relies on dynamical motifs, study shows (2024, August 16)
retrieved 16 August 2024
from https://techxplore.com/news/2024-08-flexible-multi-task-recurrent-neural.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




