Transgender, nonbinary and disabled people more likely to view AI negatively, study shows

by Simon Osuji
August 7, 2025
in Artificial Intelligence


[Image] Datasets used to train AI algorithms may underrepresent older people. Credit: Pixabay/CC0 Public Domain

AI seems to be well on its way to becoming pervasive. You hear rumbles of AI being used, somewhere behind the scenes, at your doctor’s office. You suspect it may have played a role in hiring decisions during your last job search. Sometimes—maybe even often—you use it yourself.


And yet, while AI now influences high-stakes decisions such as what kinds of medical care people receive, who gets hired and what news people see, these decisions are not always made equitably. Research has shown that algorithmic bias often harms marginalized groups. Facial recognition systems often misclassify transgender and nonbinary people, AI used in law enforcement can lead to the unwarranted arrest of Black people at disproportionately high rates, and algorithmic diagnostic systems can prevent disabled people from accessing necessary health care.

These inequalities raise a question: Do gender and racial minorities and disabled people have more negative attitudes toward AI than the general U.S. population?

I’m a social computing scholar who studies how marginalized people and communities use social technologies. In a new study, my colleagues Samuel Reiji Mayworm, Alexis Shore Ingber, Nazanin Andalibi and I surveyed over 700 people in the U.S., including a nationally representative sample and an intentional oversample of trans, nonbinary, disabled and racial minority individuals. We asked participants about their general attitudes toward AI: whether they believed it would improve their lives or work, whether they viewed it positively, and whether they expected to use it themselves in the future.

The results reveal a striking divide. Transgender, nonbinary and disabled participants reported, on average, significantly more negative attitudes toward AI than their cisgender and nondisabled counterparts. These results indicate that when gender minorities and disabled people are required to use AI systems, such as in workplace or health care settings, they may be doing so while harboring serious concerns or hesitations. These findings challenge the prevailing tech industry narrative that AI systems are inevitable and will benefit everyone.

Public perception plays a powerful role in shaping how AI is developed, adopted and regulated. The vision of AI as a social good falls apart if it mostly benefits those who already hold power. When people are required to use AI while simultaneously disliking or distrusting it, it can limit participation, erode trust and compound inequities.

Gender, disability and AI attitudes

Nonbinary people in our study had the most negative AI attitudes. Transgender people overall, including trans men and trans women, also expressed significantly negative AI attitudes. Among cisgender people—those whose gender identity matches the sex they were assigned at birth—women reported more negative attitudes than men, echoing previous research. Our study adds an important dimension by examining nonbinary and trans people's attitudes as well.

Disabled participants also had significantly more negative views of AI than nondisabled participants, particularly those who are neurodivergent or have mental health conditions.

These findings are consistent with a growing body of research showing how AI systems often misclassify, perpetuate discrimination toward or otherwise harm trans and disabled people. In particular, identities that defy categorization clash with AI systems that are inherently designed to reduce complexity into rigid categories. In doing so, AI systems simplify identities and can replicate and reinforce bias and discrimination—and people notice.

A more complex picture for race

In contrast to our findings about gender and disability, we found that people of color, and Black participants in particular, held more positive views toward AI than white participants. This is a surprising and complex finding, considering that prior research has extensively documented racial bias in AI systems, from discriminatory hiring algorithms to disproportionate surveillance.

Our results do not suggest that AI is working well for Black communities. Rather, they may reflect a pragmatic or hopeful openness to technology’s potential, even in the face of harm. Future research might qualitatively examine Black individuals’ ambivalent balance of critique and optimism around AI.

Policy and technology implications

If marginalized people don’t trust AI—and for good reason—what can policymakers and technology developers do?

First, provide an option for meaningful consent. This would give everyone the opportunity to decide whether and how AI is used in their lives. Meaningful consent would require employers, health care providers and other institutions to disclose when and how they are using AI and provide people with real opportunities to opt out without penalty.

Next, provide data transparency and privacy protections. These protections would help people understand where the data that informs AI systems comes from, what will happen with their data after an AI system collects it, and how their data will be protected. Data privacy is especially critical for marginalized people who have already experienced algorithmic surveillance and data misuse.

Further, when building AI systems, developers can take extra steps to test and assess impacts on marginalized groups. This may involve participatory approaches involving affected communities in AI system design. If a community says no to AI, developers should be willing to listen.

Finally, I believe it’s important to recognize what negative AI attitudes among marginalized groups tell us. When people at high risk of algorithmic harm, such as trans people and disabled people, are also those most wary of AI, that’s an indication for AI designers, developers and policymakers to reassess their efforts. I believe that a future built on AI should account for the people the technology puts at risk.

More information:
Oliver L. Haimson et al, AI Attitudes Among Marginalized Populations in the U.S.: Nonbinary, Transgender, and Disabled Individuals Report More Negative AI Attitudes, Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (2025). DOI: 10.1145/3715275.3732081

Provided by
The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
Transgender, nonbinary and disabled people more likely to view AI negatively, study shows (2025, August 7)
retrieved 7 August 2025
from https://techxplore.com/news/2025-08-transgender-nonbinary-disabled-people-view.html





