Deep-learning system teaches soft, bio-inspired robots to move using only a single camera

By Simon Osuji
July 9, 2025, in Artificial Intelligence


a, Reconstruction of the visuomotor Jacobian field and motion prediction. From a single image, a machine-learning model infers a 3D representation of the robot in the scene, which we name the visuomotor Jacobian field. It encodes the robot’s geometry and kinematics, enabling us to predict the 3D motions of robot surface points under all possible commands. Colors indicate the sensitivity of that point to individual command channels. b, Closed-loop control from vision. Given desired motion trajectories in pixel space or in 3D, we use the visuomotor Jacobian field to optimize for the robot command that would generate the prescribed motion at an interactive speed of approximately 12 Hz. Executing the robot command in the real world confirms that the desired motions have been achieved. Credit: Nature (2025). DOI: 10.1038/s41586-025-09170-0
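The caption describes the core control idea: the visuomotor Jacobian field predicts how each 3D surface point of the robot moves per unit of each command channel, so finding the command that produces a desired motion reduces to a linear least-squares problem. A minimal numpy sketch of that optimization step (the function name, data layout, and ridge regularizer are illustrative, not from the paper):

```python
import numpy as np

def solve_command(jacobian_field, desired_motion, reg=1e-3):
    """Solve for the command u minimizing ||J u - dx||^2 + reg * ||u||^2.

    jacobian_field: (P, 3, C) per-point Jacobians (P surface points,
                    C command channels), as inferred from one image.
    desired_motion: (P, 3) target 3D displacements of those points.
    """
    P, _, C = jacobian_field.shape
    J = jacobian_field.reshape(P * 3, C)   # stack all points into one system
    dx = desired_motion.reshape(P * 3)
    # Ridge-regularized normal equations keep the solve well-conditioned
    # even when some command channels barely move the observed points.
    return np.linalg.solve(J.T @ J + reg * np.eye(C), J.T @ dx)

# Toy check on a known linear "robot": the recovered command should
# reproduce the command that generated the motion.
rng = np.random.default_rng(0)
Jf = rng.normal(size=(50, 3, 4))
u_true = np.array([0.5, -0.2, 0.1, 0.3])
motion = np.einsum('pdc,c->pd', Jf, u_true)
u_est = solve_command(Jf, motion, reg=1e-8)
```

A solve of this size is microseconds of work, which is consistent with the interactive rates (around 12 Hz) reported in the caption; the expensive part is the network inference that produces the Jacobian field itself.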

Conventional robots, like those used in industry and hazardous environments, are easy to model and control, but are too rigid to operate in confined spaces and uneven terrain. Soft, bio-inspired robots are far better at adapting to their environments and maneuvering in otherwise inaccessible places.

Related posts

SteelSeries Rival 3 Gen 2 Review: Good Budget Gaming Mice

SteelSeries Rival 3 Gen 2 Review: Good Budget Gaming Mice

July 26, 2025
Join Our Next Livestream: Inside Katie Drummond’s Viral Interview With Bryan Johnson

Join Our Next Livestream: Inside Katie Drummond’s Viral Interview With Bryan Johnson

July 25, 2025

These more flexible capabilities, however, would normally require an array of on-board sensors and spatial models uniquely tailored for each individual robot design.

Taking a new and less resource-demanding approach, a team of researchers at MIT has developed a far simpler deep-learning control system that teaches soft, bio-inspired robots to move and follow commands from just a single image.

Their results are published in the journal Nature.

The researchers trained a deep neural network on two to three hours of multi-view video of various robots executing random commands, enabling the network to reconstruct both the shape and range of motion of a robot from just a single image.
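The training signal here is self-supervised: the robot executes random commands while cameras record how its surface points actually move, and the network's predicted motion (Jacobian field times command) is compared against that observed motion. A minimal sketch of that supervision objective, assuming point motions have already been extracted from the video (the function name and tensor layout are illustrative; the paper's actual loss and tracking pipeline may differ):

```python
import numpy as np

def jacobian_training_loss(pred_jacobians, commands, observed_motion):
    """Mean squared error between predicted and observed point motion.

    pred_jacobians:  (B, P, 3, C) network output per training frame
    commands:        (B, C) random commands executed during data collection
    observed_motion: (B, P, 3) 3D displacement of each tracked surface
                     point, measured from the multi-view video
    """
    # Predicted motion of every point under the executed command: J @ u.
    predicted = np.einsum('bpdc,bc->bpd', pred_jacobians, commands)
    return np.mean((predicted - observed_motion) ** 2)
```

No joint encoders or motion-capture markers appear in this objective: the only ground truth is what the cameras see, which is what makes the approach applicable to robots with no embedded sensing.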

Previous machine-learning control designs required expert customization and expensive motion-capture systems. This lack of a general-purpose control system limited their applications and made rapid prototyping far less practical.

“Our method unshackles the hardware design of robots from our ability to model them manually, which in the past has dictated precision manufacturing, costly materials, extensive sensing capabilities and reliance on conventional, rigid building blocks,” the researchers note in their paper.

The new single-camera machine-learning approach enabled high-precision control in tests on a variety of robotic systems, including a 3D-printed pneumatic hand, a soft auxetic wrist, a 16-DOF Allegro hand, and a low-cost Poppy robot arm.

These tests succeeded in achieving less than three degrees of error in joint motion and less than 4 millimeters (about 0.15 inches) of error in fingertip control. The system was also able to compensate for the robot’s motion and changes to the surrounding environment.
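That compensation comes from running the controller in a closed loop: each cycle, the camera re-observes where the robot's points actually are, and the command is re-solved against the remaining error, so disturbances and modeling error are absorbed rather than accumulated. A hypothetical sketch of one such cycle (gain, regularizer, and function name are illustrative assumptions, not values from the paper):

```python
import numpy as np

def closed_loop_step(current_pts, target_pts, jacobian_field,
                     gain=0.8, reg=1e-6):
    """One control cycle: drive observed points toward their targets.

    current_pts, target_pts: (P, 3) observed and desired point positions
    jacobian_field:          (P, 3, C) per-point Jacobians from vision
    Returns the command u for this cycle.
    """
    error = target_pts - current_pts            # re-measured every cycle
    P, _, C = jacobian_field.shape
    J = jacobian_field.reshape(P * 3, C)
    e = (gain * error).reshape(P * 3)
    # Least-squares command for a fraction of the remaining error.
    return np.linalg.solve(J.T @ J + reg * np.eye(C), J.T @ e)
```

Because the error is re-measured from vision at every step, a perturbation mid-trajectory simply shows up as a larger residual in the next cycle and gets corrected, with no explicit disturbance model.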

“This work points to a shift from programming robots to teaching robots,” notes Ph.D. student Sizhe Lester Li in an MIT web feature.

“Today, many robotics tasks require extensive engineering and coding. In the future, we envision showing a robot what to do and letting it learn how to achieve the goal autonomously.”

Since this system relies on vision alone, it may not be suitable for more nimble tasks requiring contact sensing and tactile manipulation. Its performance may also degrade in cases where visual cues are insufficient.

The researchers suggest that the addition of tactile and other sensors could enable the robots to perform more complex tasks. There is also the potential to automate the control of a wider range of robots, including those with minimal or no embedded sensors.

Written by Charles Blue, edited by Sadie Harley, and fact-checked and reviewed by Robert Egan.

More information:
Sizhe Lester Li et al., Controlling diverse robots by inferring Jacobian fields with deep networks, Nature (2025). DOI: 10.1038/s41586-025-09170-0

© 2025 Science X Network

Citation:
Deep-learning system teaches soft, bio-inspired robots to move using only a single camera (2025, July 9)
retrieved 9 July 2025
from https://techxplore.com/news/2025-07-deep-soft-bio-robots-camera.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




