LBNN
Study presents new method for explainable AI

by Simon Osuji
October 5, 2023
in Artificial Intelligence

[Figure] CRP method. Credit: Fraunhofer HHI

Artificial intelligence is already in widespread use, yet it is still difficult to understand how an AI system reaches its decisions. Scientists at the Fraunhofer Heinrich-Hertz-Institut (HHI) and the Berlin Institute for the Foundations of Learning and Data (BIFOLD) at TU Berlin have collaborated for many years to make AI explainable. Now the scientists led by Prof. Thomas Wiegand (Fraunhofer HHI, BIFOLD), Prof. Wojciech Samek (Fraunhofer HHI, BIFOLD) and Dr. Sebastian Lapuschkin (Fraunhofer HHI) have achieved another milestone.


In their paper “From attribution maps to human-understandable explanations through concept relevance propagation,” the researchers present concept relevance propagation (CRP), a new method that can explain individual AI decisions as concepts understandable to humans. The paper has now been published in Nature Machine Intelligence.

AI systems are largely black boxes: it is usually incomprehensible to humans how an AI arrives at a particular decision. CRP is a state-of-the-art explanatory method for deep neural networks that complements and deepens existing explanatory models. CRP reveals not only which characteristics of the input are relevant to the decision made, but also the concepts the AI used, where those concepts are represented in the input, and which parts of the neural network are responsible for them.

Thus, CRP can explain individual decisions made by an AI using concepts that are understandable to humans, setting a new standard for evaluating and interacting with AI.

For the first time, this approach to explainability examines the entire prediction process of an AI, all the way from input to output. In recent years, the research team had already developed various methods that use so-called heat maps to explain how AI algorithms reach their decisions.

The heat maps highlight specific areas in an image that are particularly relevant to the decision made. This method has become known as layer-wise relevance propagation (LRP). The importance of this type of explainability is enormous, as it allows us to understand whether an AI is actually making decisions based on sound reasoning or whether it has merely learned shortcut strategies and is thus cheating.
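The relevance-redistribution idea behind LRP can be illustrated with a minimal sketch. The tiny network, its random weights, and the use of the LRP-epsilon rule here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Minimal illustration of the LRP-epsilon rule on a tiny two-layer ReLU
# network with random (hypothetical) weights -- not the authors' code.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # 4 input features -> 3 hidden units
W2 = rng.normal(size=(3, 2))   # 3 hidden units -> 2 class scores

x = np.array([1.0, 0.5, -0.3, 2.0])
a1 = np.maximum(0.0, x @ W1)   # forward pass, ReLU hidden layer
out = a1 @ W2                  # class scores

def lrp_step(a, W, R, eps=1e-6):
    """Redistribute output relevance R onto a layer's inputs in
    proportion to each input's contribution (LRP-epsilon rule)."""
    z = a @ W                                        # pre-activations
    s = R / (z + eps * np.where(z >= 0, 1.0, -1.0))  # stabilized ratio
    return a * (W @ s)                               # input relevance

# Explain the score of class 0: propagate it back layer by layer.
R_out = np.zeros_like(out)
R_out[0] = out[0]
R_hidden = lrp_step(a1, W2, R_out)
R_input = lrp_step(x, W1, R_hidden)   # per-feature relevance scores

# LRP approximately conserves relevance: the per-feature scores
# sum back to the explained output score.
print(R_input, R_input.sum(), out[0])
```

For an image classifier, `R_input` would be reshaped to the image grid and rendered as the heat map described above.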

The new CRP method draws on layer-wise relevance propagation. “AI image recognition is a good example of this,” says Prof. Wojciech Samek, head of the Artificial Intelligence department at Fraunhofer HHI, professor of Machine Learning and Communications at TU Berlin, and BIFOLD Fellow. “On the input level, CRP labels which pixels within an image are most relevant for the AI decision process. This is an important step in understanding an AI’s decisions, but it doesn’t explain the underlying concept of why the AI considers those exact pixels.”

For comparison: when humans see a black-and-white striped surface, they don't automatically recognize a zebra. To do so, they also need information such as four legs, hooves, a tail, and so on. Ultimately, they combine the pixel-level information (the black-and-white stripes) with the concept of an animal.

“CRP transfers the explanation from the input space, where the image with all its pixels is located, to the semantically enriched concept space formed by higher layers of the neural network,” states Dr. Sebastian Lapuschkin, head of the research group Explainable Artificial Intelligence at Fraunhofer HHI, elaborating on the new method.

“CRP is the next step in AI explainability and offers entirely new possibilities in terms of investigating, testing and improving the functionality of AI models. We are already very excited to apply our new method to large language models like ChatGPT.”
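The concept-conditional backward pass Lapuschkin describes can be sketched by restricting the relevance flow to a single hidden unit before continuing toward the input. Treating one hidden unit as a "concept" and reusing the LRP-epsilon rule are simplifying assumptions for illustration; in real networks CRP conditions on learned concepts in higher layers:

```python
import numpy as np

# Illustrative sketch of a CRP-style conditional backward pass on a tiny
# ReLU network. Random weights and the "one hidden unit = one concept"
# simplification are assumptions made for illustration only.
rng = np.random.default_rng(1)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(3, 2))

x = np.array([0.8, -0.2, 1.5, 0.4])
a1 = np.maximum(0.0, x @ W1)   # hidden "concept" activations
out = a1 @ W2

def lrp_step(a, W, R, eps=1e-6):
    """One LRP-epsilon backward step: distribute relevance R to inputs."""
    z = a @ W
    s = R / (z + eps * np.where(z >= 0, 1.0, -1.0))
    return a * (W @ s)

# Ordinary LRP: relevance of class 1 across all hidden units.
R_out = np.zeros_like(out)
R_out[1] = out[1]
R_hidden = lrp_step(a1, W2, R_out)

# CRP-style condition: keep only the relevance flowing through "concept"
# unit 0, zero the rest, then continue the backward pass to the input.
mask = np.zeros_like(R_hidden)
mask[0] = 1.0
R_input_concept = lrp_step(x, W1, R_hidden * mask)

# R_input_concept localizes WHERE in the input this single concept
# contributes to the decision.
print(R_input_concept)
```

Repeating the masked pass for each concept unit decomposes the heat map into per-concept maps, which is the kind of decomposition the method makes human-interpretable.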

More information:
Reduan Achtibat et al, From attribution maps to human-understandable explanations through concept relevance propagation, Nature Machine Intelligence (2023). DOI: 10.1038/s42256-023-00711-8

Provided by
Fraunhofer-Gesellschaft

Citation:
Study presents new method for explainable AI (2023, October 4)
retrieved 4 October 2023
from https://techxplore.com/news/2023-10-method-ai.html




© 2023 LBNN - All rights reserved.
