
Tech companies claim AI can recognize human emotions. But the science doesn’t stack up

By Simon Osuji
December 14, 2024
in Artificial Intelligence


Credit: Pixabay/CC0 Public Domain

Can artificial intelligence (AI) tell whether you’re happy, sad, angry or frustrated?


According to technology companies that offer AI-enabled emotion recognition software, the answer to this question is yes.

But this claim does not stack up against mounting scientific evidence.

What’s more, emotion recognition technology poses a range of legal and societal risks—especially when deployed in the workplace.

For these reasons, the European Union's AI Act, which came into force in August, bans AI systems that infer a person's emotions in the workplace, except for "medical" or "safety" reasons.

In Australia, however, there is not yet specific regulation of these systems. As I argued in my submission to the Australian government in its most recent round of consultations about high-risk AI systems, this urgently needs to change.

A new and growing wave

The global market for AI-based emotion recognition systems is growing. It was valued at US$34 billion in 2022 and is expected to reach US$62 billion by 2027.

These technologies work by making predictions about a person’s emotional state from biometric data, such as their heart rate, skin moisture, voice tone, gestures or facial expressions.
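To make this concrete, here is a minimal, hypothetical sketch of the kind of mapping such a system performs: biometric readings in, emotion label out. The feature names, labels and centroid values are invented for illustration and do not come from any real product.

```python
import math

# Invented "training" centroids: mean (heart_rate_bpm, skin_conductance_uS)
# per emotion label, as such a system might learn from labelled data.
CENTROIDS = {
    "calm":  (65.0, 2.0),
    "happy": (80.0, 5.0),
    "angry": (95.0, 8.0),
}

def predict_emotion(heart_rate: float, skin_conductance: float) -> str:
    """Return the emotion label whose centroid is nearest to the reading."""
    def dist(centroid):
        hr, sc = centroid
        return math.hypot(heart_rate - hr, skin_conductance - sc)
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))

print(predict_emotion(96.0, 8.0))  # nearest centroid is "angry"
```

Note that a classifier like this encodes an assumption, not a fact: it only works if a given physiological reading reliably corresponds to a given emotion, which is exactly the premise the evidence below calls into question.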

Next year, Australian tech startup inTruth Technologies plans to launch a wrist-worn device that it claims can track a wearer’s emotions in real time via their heart rate and other physiological metrics.

inTruth Technologies founder Nicole Gibson has said this technology can be used by employers to monitor a team’s “performance and energy” or their mental health to predict issues such as post-traumatic stress disorder.

She has also said inTruth can be an “AI emotion coach that knows everything about you, including what you’re feeling and why you’re feeling it.”

Emotion recognition technologies in Australian workplaces

There is little data about the deployment of emotion recognition technologies in Australian workplaces.

However, we do know some Australian companies used a video interviewing system offered by a US-based company called HireVue that incorporated face-based emotion analysis.

This system used facial movements and expressions to assess the suitability of job applicants. For example, applicants were assessed on whether they expressed excitement or how they responded to an angry customer.

HireVue removed emotion analysis from its systems in 2021 following a formal complaint in the United States.

Emotion recognition may be on the rise again as Australian employers embrace artificial intelligence-driven workplace surveillance technologies.

Lack of scientific validity

Companies such as inTruth claim emotion recognition systems are objective and rooted in scientific methods.

However, scholars have raised concerns that these systems mark a return to the discredited fields of phrenology and physiognomy: the use of a person's physical or behavioral characteristics to determine their abilities and character.

Emotion recognition technologies rely heavily on theories that claim inner emotions are measurable and universally expressed.

However, recent evidence shows that how people communicate emotions varies widely across cultures, contexts and individuals.

In 2019, for example, a group of experts concluded there are “no objective measures, either singly or as a pattern, that reliably, uniquely, and replicably” identify emotional categories. For example, someone’s skin moisture might go up, down or stay the same when they are angry.

In a statement to The Conversation, inTruth Technologies founder Nicole Gibson said “it is true that emotion recognition technologies faced significant challenges in the past,” but that “the landscape has changed significantly in recent years.”

Infringement of fundamental rights

Emotion recognition technologies also endanger fundamental rights without proper justification.

They have been found to discriminate on the basis of race, gender and disability.

In one case, an emotion recognition system read black faces as angrier than white faces, even when both were smiling to the same degree. These technologies may also be less accurate for people from demographic groups not represented in the training data.

Gibson acknowledged concerns about bias in emotion recognition technologies. But she added that “bias is not inherent to the technology itself but rather to the data sets used to train these systems.” She said inTruth is “committed to addressing these biases” by using “diverse, inclusive data sets.”
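One way researchers check for the kind of disparity described above is a bias audit: comparing how often a system wrongly labels people from different groups as "angry". The sketch below is a hypothetical illustration of that metric; the group names and records are invented, not real data.

```python
from collections import defaultdict

def false_anger_rate(records):
    """records: iterable of (group, predicted_label, true_label).

    Returns, per group, the fraction of truly non-angry faces that the
    system nonetheless labelled "angry" (a false-positive rate).
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [false_angry, non_angry_total]
    for group, predicted, true in records:
        if true != "angry":
            counts[group][1] += 1
            if predicted == "angry":
                counts[group][0] += 1
    return {g: fa / total for g, (fa, total) in counts.items() if total}

# Invented example mirroring the disparity described above:
records = [
    ("group_a", "angry", "happy"), ("group_a", "happy", "happy"),
    ("group_b", "happy", "happy"), ("group_b", "happy", "happy"),
]
print(false_anger_rate(records))  # group_a is flagged far more often
```

A gap between the groups' rates on smiling, non-angry faces is precisely the failure reported in the study above, and auditing for it requires demographically labelled test data that many vendors do not publish.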

As a surveillance tool, emotion recognition systems in the workplace pose serious threats to privacy rights. Such rights may be violated if sensitive information is collected without an employee’s knowledge.

Privacy rights will also be violated if such data is collected in a way that is not "reasonably necessary" or not by "fair means."

Workers’ views

A survey published earlier this year found that only 12.9% of Australian adults support face-based emotion recognition technologies in the workplace. The researchers concluded that respondents viewed facial analysis as invasive. Respondents also viewed the technology as unethical and highly prone to error and bias.

In a US study also published this year, workers expressed concern that emotion recognition systems would harm their well-being and impact work performance.

They were fearful that inaccuracies could create false impressions about them. In turn, these false impressions might prevent promotions and pay rises or even lead to dismissal.

As one participant stated:

“I just cannot see how this could actually be anything but destructive to minorities in the workplace.”

Provided by
The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
Tech companies claim AI can recognize human emotions. But the science doesn’t stack up (2024, December 14)
retrieved 14 December 2024
from https://techxplore.com/news/2024-12-tech-companies-ai-human-emotions.html





