
What influences trust when conversing with chatbots?

by Simon Osuji
January 9, 2025
in Artificial Intelligence


Credit: Pixabay/CC0 Public Domain

Whether on your bank’s website or your telephone provider’s help line, interactions between humans and chatbots have become part of our daily lives. But do we trust them? And what factors influence our trust? Researchers at the University of Basel recently examined these questions.

“Hello ChatGPT, can you help me?”—”Of course, how can I help you?” Exchanges between users and chatbots, which are based on artificial intelligence (AI), quickly come to feel like conversations with another person.

Dr. Fanny Lalot and Anna-Marie Bertram from the Faculty of Psychology at the University of Basel wanted to know how much people trust AI chatbots and what this trust depends on. They focused on text-based systems—that is, platforms like ChatGPT rather than voice assistants such as Siri or Alexa.

Test subjects were shown examples of interactions between users and Conversea, a fictitious chatbot created specifically for the study, and were then asked to imagine interacting with Conversea themselves. The results are published in the Journal of Experimental Psychology: General.

The chatbot as an independent entity

Our level of trust in other people depends on a variety of factors: our own personality, the other person’s behavior and the specific situation all play a role. “Impressions from childhood influence how much we are able to trust others, but a certain openness is also needed in order to want to trust,” explains social psychologist Lalot. Characteristics that promote trust include integrity, competence and benevolence.

The new study shows that what applies to relationships between humans also applies to AI systems. Competence and integrity in particular are important criteria that lead humans to perceive an AI chatbot as reliable. Benevolence, on the other hand, is less important, as long as the other two dimensions are present.

“Our study demonstrates that the participants attribute these characteristics to the AI directly, not just to the company behind it. They think of the AI as if it were an independent entity,” according to Lalot.

Additionally, there are differences between personalized and impersonal chatbots. If a chatbot addressed them by name and referred to previous conversations, for example, study participants assessed it as especially benevolent and competent.

“They anthropomorphize the personalized chatbot. This does increase willingness to use the tool and share personal information with it,” according to Lalot. However, the test subjects did not attribute significantly more integrity to the personalized chatbot, and overall trust was not significantly higher than for the impersonal chatbot.

Integrity is more important than benevolence

According to the study, integrity is a more important factor for trust than benevolence, so it is important to develop the technology with integrity as the top priority. Designers should also take into account that personalized AI is perceived as more benevolent, more competent and more human, in order to ensure proper use of these tools. Other research has demonstrated that lonely, vulnerable people in particular run the risk of becoming dependent on AI-based friendship apps.

“Our study makes no statements about whether it is good or bad to trust a chatbot,” Lalot emphasizes. She sees the AI chatbot as a tool that we have to learn to navigate, much like the opportunities and risks of social media.

However, there are some recommendations that can be derived from their results. “We project more onto AI systems than is actually there,” says Lalot. This makes it even more important that AI systems be reliable. A chatbot should neither lie to us nor endorse everything we say unconditionally.

If an AI chatbot is too uncritical and simply agrees with everything a user says, it fails to provide reality checks and runs the risk of creating an echo chamber that, in the worst case, can isolate people from their social environment. “A [human] friend would hopefully intervene at some point if someone developed ideas that are too crazy or immoral,” Lalot says.

Betrayed by AI?

In human relationships, broken trust can have serious consequences for future interactions. Might this also be the case with chatbots? “That is an exciting question. Further research would be needed to answer it,” says Dr. Lalot. “I can certainly imagine that someone might feel betrayed if advice from an AI chatbot has negative consequences.”

There need to be laws that hold the developers responsible. For example, an AI platform could show how it comes to a conclusion by openly revealing the sources it used, and it could say when it doesn’t know something rather than inventing an answer.

More information:
Fanny Lalot et al., When the bot walks the talk: Investigating the foundations of trust in an artificial intelligence (AI) chatbot, Journal of Experimental Psychology: General (2024). DOI: 10.1037/xge0001696

Provided by
University of Basel

Citation:
What influences trust when conversing with chatbots? (2025, January 9)
retrieved 9 January 2025
from https://techxplore.com/news/2025-01-conversing-chatbots.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




