
AI Insights Improve Autonomous Vehicles’ Decisions

By Simon Osuji
November 23, 2025
in Artificial Intelligence
This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Autonomous vehicles are under intense pressure to perform flawlessly: each mistake they make erodes public trust and pushes the industry to further improve safety. What will help autonomous vehicles overcome these challenges?

In a study published in the October issue of IEEE Transactions on Intelligent Transportation Systems, researchers outlined numerous ways in which explainable AI—in which questions are posed to an AI model to understand its decision-making process—can be used to pinpoint exactly where in that process the models overseeing autonomous vehicles make mistakes. This approach could help passengers know when they need to take control of the vehicle, build public trust in autonomous vehicles, and help industry experts develop safer ones.

Shahin Atakishiyev is a deep learning researcher who conducted the study as part of his postdoctoral work with the University of Alberta, in Canada. He notes that autonomous driving architecture is generally a black box. “Ordinary people, such as passengers and bystanders, do not know how an autonomous vehicle makes real-time driving decisions,” he says.

But with rapidly advancing AI, it’s now possible to ask the models why they make the decisions they do. This opens up a wide range of possibilities for digging deeper into the model’s inner workings. For example, what aspects of its visual sensory data was it focusing on when it decided to brake suddenly? To what extent did time constraints affect its decision-making?

Real-Time Feedback in Autonomous Vehicles

In their paper, Atakishiyev and his colleagues provide an example of how real-time feedback could help passengers detect faulty decision-making by autonomous vehicles. They point to a case study in which another research group slightly altered a 35-mile-per-hour (56 kilometers per hour) speed limit sign by adding a sticker that elongated the middle part of the “3,” then tested a Tesla Model S to see how the vehicle’s heads-up display read the altered sign. The vehicle read the 35-mph (56 kph) sign as an 85-mph (137-kph) sign and accelerated as it approached and passed it.

In such a case, Atakishiyev’s team notes that if the car provides a rationale for its decision on a dashboard or user interface—for example by saying “The speed limit is 85 mph, accelerating”—in real time while approaching the speed sign, a passenger onboard could intervene and ensure the car adheres to the true speed limit.
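One way such a rationale could be surfaced is by cross-checking the perceived speed limit against an independent source, such as map data, before acting on it. The sketch below is purely illustrative—the function, threshold, and message wording are assumptions, not the paper's or Tesla's implementation:

```python
# Hypothetical sanity check for a perceived speed limit, inspired by the
# altered-sign case study. All names and the 20-mph threshold are illustrative.
def explain_speed_decision(perceived_limit_mph, map_limit_mph, max_jump_mph=20):
    # If the sign reading disagrees sharply with map data, explain the
    # conflict instead of silently accelerating.
    if map_limit_mph is not None and abs(perceived_limit_mph - map_limit_mph) > max_jump_mph:
        return (f"Sign read as {perceived_limit_mph} mph disagrees with map data "
                f"({map_limit_mph} mph); holding current speed, please verify.")
    return f"The speed limit is {perceived_limit_mph} mph, adjusting speed."

print(explain_speed_decision(85, 35))  # the altered-sign scenario
print(explain_speed_decision(35, 35))  # normal operation
```

A message like the first one would give a passenger a concrete reason to intervene before the vehicle accelerates.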

A challenge here, Atakishiyev says, is what level of information to provide to passengers, each of whom will have different preferences. “Explanations can be delivered via audio, visualization, text, or vibration, and people may choose different modes depending on their technical knowledge, cognitive abilities, and age,” he says.

Whereas real-time feedback to users could help prevent disasters from happening in the moment, analyzing the decision-making process of an autonomous vehicle after it makes a mistake could also help scientists produce safer vehicles, Atakishiyev says.

In their study, Atakishiyev’s team conducted different simulations in which a deep learning model for autonomous vehicles made various decisions while driving, and the researchers asked the driving model questions about its decisions. They made sure to ask the model trick questions, which revealed instances when the model lacked the ability to explain its actions. This approach can help identify key gaps in the explanation module that need to be addressed.
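The spirit of that probing can be sketched with a toy explanation module. The stub below is a stand-in, not the authors' model; a well-behaved explainer should decline to justify actions involving objects it never detected, and a trick question checks exactly that:

```python
# Illustrative probe of an explanation module with a trick question.
# answer_question is a hypothetical stub, not the study's actual model.
def answer_question(question, scene_objects):
    # Only "explain" decisions about objects actually present in the scene.
    for obj in scene_objects:
        if obj in question:
            return f"I slowed down because a {obj} was ahead."
    return "I cannot explain that: no such object was detected."

scene = ["cyclist", "stop sign"]
questions = [
    "Why did you slow for the cyclist?",        # grounded question
    "Why did you swerve around the elephant?",  # trick question
]
for q in questions:
    print(q, "->", answer_question(q, scene))
```

An explanation module that confidently answered the elephant question would be exposing exactly the kind of gap the researchers were hunting for.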

They also point to an existing machine learning analysis technique, called SHapley Additive exPlanations (SHAP), that researchers can use to assess AV decision-making. After an autonomous vehicle completes a drive, a SHAP analysis scores every feature the vehicle used in its decision-making, revealing which features were influential for driving decisions and which were not. “This analysis helps to discard less influential features and pay more attention to the most salient ones,” Atakishiyev says.
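The idea behind Shapley-based attribution can be shown exactly on a tiny model. The sketch below uses made-up feature names and a toy additive scoring function (not the paper's pipeline, and not the `shap` library) to compute each feature's average marginal contribution to a braking decision:

```python
from itertools import combinations
from math import factorial

# Toy "driving model": scores braking urgency from three hypothetical
# features. The feature names and weights are illustrative only.
def brake_score(features):
    weights = {"pedestrian_detected": 0.6, "speed_over_limit": 0.3, "wet_road": 0.1}
    return sum(weights[f] for f in features)

def shapley_values(all_features, value_fn):
    """Exact Shapley values: each feature's marginal contribution,
    averaged over all coalitions of the other features."""
    n = len(all_features)
    values = {}
    for f in all_features:
        others = [x for x in all_features if x != f]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                # Shapley weight for a coalition of size k.
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += w * (value_fn(set(subset) | {f}) - value_fn(set(subset)))
        values[f] = total
    return values

features = ["pedestrian_detected", "speed_over_limit", "wet_road"]
print(shapley_values(features, brake_score))
```

For this additive toy model the Shapley values recover the weights exactly, making it easy to see which features drive the braking score; real SHAP analyses approximate the same quantity for far larger models.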

The researchers also discuss how explanations could help tease apart the legalities of when an autonomous vehicle hits a pedestrian. Key questions here include: Was the vehicle following the rules of the road? Once the accident occurred, did the vehicle “understand” that it hit a person and come to a full stop, as it should? Did it activate emergency functions (for example, reporting the accident to authorities and emergency services) immediately? Such questions help scientists identify faults in a model that need correcting.

These tactics to understand the decision-making processes of deep learning models are gaining traction in the field of autonomous vehicles, and will presumably lead to safer roads.

“I would say explanations are becoming an integral component of AV technology,” Atakishiyev says, emphasizing that explanations can help assess the operational safety of the driving by debugging the existing systems.
